WorldWideScience

Sample records for advanced computer studies

  1. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron Correlation in Molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysi...

  2. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an active area of research, and several approaches have been proposed to date. However, current state-of-the-art methods face limitations: the function of enhancers is established, but their mechanism of action is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks, presented in separate chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we comprehensively survey over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze the advantages and disadvantages of existing solutions and report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers' content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  3. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer-assisted virtual environment, or CAVE™, that allows scientists and engineers to literally walk into their data...

  4. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    Energy Technology Data Exchange (ETDEWEB)

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory / university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for the technology development of these concepts and their safety systems.

  5. Advances in Computer Entertainment.

    NARCIS (Netherlands)

    Nijholt, Antinus; Romão, T.; Reidsma, Dennis

    2012-01-01

    These are the proceedings of the 9th International Conference on Advances in Computer Entertainment (ACE 2012). ACE has become the leading scientific forum for dissemination of cutting-edge research results in the area of entertainment computing. Interactive entertainment is one of the most vibrant

  6. Advanced computers and simulation

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1993-01-01

    Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980s, desktop workstations performed fewer than one million floating point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price that is measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPPs) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators.

  7. NATO Advanced Study Institute on Advances in the Computer Simulations of Liquid Crystals

    CERN Document Server

    Zannoni, Claudio

    2000-01-01

    Computer simulations provide an essential set of tools for understanding the macroscopic properties of liquid crystals and of their phase transitions in terms of molecular models. While simulations of liquid crystals are based on the same general Monte Carlo and molecular dynamics techniques as are used for other fluids, they present a number of specific problems and peculiarities connected to the intrinsic properties of these mesophases. The field of computer simulations of anisotropic fluids is interdisciplinary and is evolving very rapidly. The present volume covers a variety of techniques and model systems, from lattices to hard particle and Gay-Berne to atomistic, for thermotropics, lyotropics, and some biologically interesting liquid crystals. Contributions are written by an excellent panel of international lecturers and provide a timely account of the techniques and problems in the field.

  8. Advances in computer applications in radioactive tracer studies of the circulation

    International Nuclear Information System (INIS)

    Wagner, H.N. Jr.; Klingensmith, W.C. III; Knowles, L.G.; Lotter, M.G.; Natarajan, T.K.

    1977-01-01

    Advances in computer technology since the last IAEA symposium on medical radionuclide imaging have now made possible a new approach to the study of physiological processes that promises to improve greatly our perception of body functions and structures. We have developed procedures, called ''compressed time imaging'' (CTI), that display serial images obtained over periods of minutes and hours at framing rates of approximately 16 to 60 per minute. At other times, ''real'' or ''expanded time imaging'' is used, depending on the process under study. Designed initially to study the beating heart, such multidimensional time studies are now being extended to the cerebral and other regional circulations, as well as to other organ systems. The improved imaging methods provide a new approach to space and time in the study of physiology and are supplemented by quantitative analysis of data displayed on the television screen of the computer. (author)

  9. Computational technologies advanced topics

    CERN Document Server

    Vabishchevich, Petr N

    2015-01-01

    This book discusses questions of numerical solutions of applied problems on parallel computing systems. Nowadays, engineering and scientific computations are carried out on parallel computing systems, which provide parallel data processing on a few computing nodes. In constructing computational algorithms, mathematical problems are separated into relatively independent subproblems in order to solve them on a single computing node.

  10. International Conference on Advanced Computing

    CERN Document Server

    Patnaik, Srikanta

    2014-01-01

    This book is composed of the Proceedings of the International Conference on Advanced Computing, Networking, and Informatics (ICACNI 2013), held at Central Institute of Technology, Raipur, Chhattisgarh, India during June 14–16, 2013. The book records current research articles in the domain of computing, networking, and informatics. The book presents original research articles, case-studies, as well as review articles in the said field of study with emphasis on their implementation and practical application. Researchers, academicians, practitioners, and industry policy makers around the globe have contributed towards formation of this book with their valuable research submissions.

  11. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents the theoretical foundations of future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is an encyclopedia, the first ever complete autho...

  12. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  13. Radical decomposition of 2,4-dinitrotoluene (DNT) under conditions of advanced oxidation: a computational study

    Directory of Open Access Journals (Sweden)

    Liudmyla K. Sviatenko

    2016-12-01

    At the present time one of the main remediation technologies for the environmental pollutant 2,4-dinitrotoluene (DNT) is advanced oxidation processes (AOPs). Since the hydroxyl radical is the most common active species in AOPs, in particular in Fenton oxidation, this study modeled the mechanism of interaction between DNT and the hydroxyl radical at the SMD(Pauling)/M06-2X/6-31+G(d,p) level. The computed results suggest the most energetically favourable pathway for the process. DNT decomposition consists of sequential hydrogen abstractions and hydroxyl attachments passing through 2,4-dinitrobenzyl alcohol, 2,4-dinitrobenzaldehyde, and 2,4-dinitrobenzoic acid. Further replacement of the nitro and carboxyl groups by hydroxyl leads to 2,4-dihydroxybenzoic acid and 2,4-dinitrophenol, respectively. The reaction intermediates and products are experimentally confirmed. Most of the reaction steps have low energy barriers; some steps are diffusion controlled. The whole process is highly exothermic.

  14. Learning by Computer Simulation Does Not Lead to Better Test Performance on Advanced Cardiac Life Support Than Textbook Study.

    Science.gov (United States)

    Kim, Jong Hoon; Kim, Won Oak; Min, Kyeong Tae; Yang, Jong Yoon; Nam, Yong Taek

    2002-01-01

    For an effective acquisition and the practical application of rapidly increasing amounts of information, computer-based learning has already been introduced in medical education. However, there have been few studies that compare this innovative method to traditional learning methods in studying advanced cardiac life support (ACLS). Senior medical students were randomized to either computer simulation or textbook study. Each group studied ACLS for 150 minutes. Tests were done one week before, immediately after, and one week after the study period. Testing consisted of 20 questions. All questions were formulated in such a way that there was a single best answer. Each student also completed a questionnaire designed to assess computer skills as well as satisfaction with and benefit from the study materials. Test scores improved after both textbook study and computer simulation study in both groups, but the improvement in scores was significantly higher for the textbook group only immediately after the study. There was no significant difference between groups in their computer skill and satisfaction with the study materials. The textbook group reported greater benefit from study materials than did the computer simulation group. Studying ACLS with a hard-copy textbook may be more effective than computer simulation for the acquisition of simple information during a brief period. However, the difference in effectiveness is likely transient.

  15. Recent advances in computational aerodynamics

    Science.gov (United States)

    Agarwal, Ramesh K.; Desse, Jerry E.

    1991-04-01

    The current state of the art in computational aerodynamics is described. Recent advances in the discretization of surface geometry, grid generation, and flow simulation algorithms have led to flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics is emerging as a crucial enabling technology for the development and design of flight vehicles. Examples illustrating the current capability for the prediction of aircraft, launch vehicle and helicopter flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  16. Recent advances in computational optimization

    CERN Document Server

    2013-01-01

    Optimization is part of our everyday life. We try to organize our work in a better way, and optimization occurs in minimizing time and cost or maximizing profit, quality and efficiency. Many real world problems arising in engineering, economics, medicine and other domains can also be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization. This book presents recent advances in computational optimization. The volume includes important real world problems like parameter settings for controlling processes in a bioreactor, robot skin wiring, strip packing, project scheduling, tuning of PID controllers and so on. Some of them can be solved by applying traditional numerical methods, but others require a huge amount of computational resources. For these it is shown that it is appropriate to develop algorithms based on metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming etc...

  17. Advances in embedded computer vision

    CERN Document Server

    Kisacanin, Branislav

    2014-01-01

    This illuminating collection offers a fresh look at the very latest advances in the field of embedded computer vision. Emerging areas covered by this comprehensive text/reference include the embedded realization of 3D vision technologies for a variety of applications, such as stereo cameras on mobile devices. Recent trends towards the development of small unmanned aerial vehicles (UAVs) with embedded image and video processing algorithms are also examined. The authoritative insights range from historical perspectives to future developments, reviewing embedded implementation, tools, technolog

  18. Study of metastatic lymph nodes in advanced gastric cancer with spiral computed tomography

    International Nuclear Information System (INIS)

    Su Yijuan

    2008-01-01

    Objective: To study the characteristics of spiral computed tomography (SCT) in the diagnosis of lymph node metastases in gastric cancer. Methods: The SCT characteristics of metastatic lymph nodes in 35 gastric cancer patients were analyzed and compared with operative and pathological findings. Results: A total of 379 lymph nodes (173 positive, 206 negative) were detected by SCT and confirmed by pathology in metastasis-positive or metastasis-negative patients. The positive rate for lymph nodes with diameter ≥ 10 mm was 62.7%. The positive rates for lymph nodes with irregular shape and uneven enhancement were 96.3% and 89.4%, respectively. If attenuation values of at least 25 HU in the plain scan, 70 HU in the arterial phase or 80 HU in the venous phase were used as thresholds to detect metastasis-positive lymph nodes, the positive rates were 55.7%, 56.3% and 67.8%, respectively. Conclusion: SCT is valuable in judging metastasis in gastric cancer. The criterion of diameter ≥ 10 mm, combined with shape and attenuation values, can dramatically improve the diagnosis of lymph node metastasis in gastric cancer. (authors)
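    The diagnostic criteria in this abstract amount to a simple rule-based classifier. Below is a minimal sketch of that rule: the thresholds (diameter ≥ 10 mm; attenuation ≥ 25, 70 and 80 HU in the plain, arterial and venous phases) are the ones reported above, but the function name and record layout are invented for illustration and are not from the study.

```python
# Hypothetical encoding of the SCT criteria summarized above; the thresholds
# come from the abstract, but the record structure and names are invented.
THRESHOLDS_HU = {"plain": 25, "arterial": 70, "venous": 80}

def suspicious_node(diameter_mm, shape, enhancement, hu):
    """Flag a lymph node as likely metastatic if any reported criterion holds."""
    if diameter_mm >= 10:
        return True
    if shape == "irregular" or enhancement == "uneven":
        return True
    # Attenuation criterion: any phase at or above its threshold.
    return any(hu.get(phase, 0) >= t for phase, t in THRESHOLDS_HU.items())

node = {"diameter_mm": 8, "shape": "regular", "enhancement": "even",
        "hu": {"plain": 30, "arterial": 60, "venous": 75}}
print(suspicious_node(**node))  # True: plain-scan attenuation of 30 HU >= 25 HU
```

    In practice such criteria would be combined statistically rather than as a hard OR, which is why the abstract reports a separate positive rate per criterion.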

  19. Advances in Computer Science and Engineering

    CERN Document Server

    Second International Conference on Advances in Computer Science and Engineering (CES 2012)

    2012-01-01

    This book includes the proceedings of the second International Conference on Advances in Computer Science and Engineering (CES 2012), which was held during January 13-14, 2012 in Sanya, China. The papers in these proceedings of CES 2012 focus on the researchers' advanced work in their fields of Computer Science and Engineering, mainly organized into four topics: (1) Software Engineering, (2) Intelligent Computing, (3) Computer Networks, and (4) Artificial Intelligence Software.

  20. Advanced computing in electron microscopy

    CERN Document Server

    Kirkland, Earl J

    2010-01-01

    This book features numerical computation of electron microscopy images as well as multislice methods. High-resolution CTEM and STEM image interpretation are included in the text. This newly updated second edition will bring the reader up to date on new developments in the field since the 1990s. It is the only book that specifically addresses computer simulation methods in electron microscopy.

  1. Advances in photonic reservoir computing

    Directory of Open Access Journals (Sweden)

    Van der Sande Guy

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir’s complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.

  2. Advances in photonic reservoir computing

    Science.gov (United States)

    Van der Sande, Guy; Brunner, Daniel; Soriano, Miguel C.

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir's complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.
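    The reservoir idea described above — a fixed, untrained dynamical system whose transient response feeds a trained linear readout — can be sketched in software with a classical echo state network. This is only an illustrative software analogue of the photonic schemes the article reviews; all sizes, parameters and the toy task below are assumptions, not from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; the article gives no concrete parameters.
n_res, n_in, washout = 200, 1, 50

# Fixed random reservoir: only the linear readout is trained,
# which is what makes the scheme attractive for optical hardware.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # keep spectral radius below 1

def run_reservoir(u):
    """Collect the reservoir's transient states for input sequence u."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next sample of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)[washout:]   # discard the initial transient
Y = y[washout:]

# Ridge-regression readout: the only trained component.
ridge = 1e-8
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
pred = X @ W_out
print(float(np.mean((pred - Y) ** 2)))  # mean-squared error; small on this toy task
```

    In the photonic implementations reviewed here, the tanh network is replaced by the physical response of optical nodes or a single nonlinear device with delayed feedback, while the trained readout stays linear.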

  3. Advances in computational complexity theory

    CERN Document Server

    Cai, Jin-Yi

    1993-01-01

    This collection of recent papers on computational complexity theory grew out of activities during a special year at DIMACS. With contributions by some of the leading experts in the field, this book is of lasting value in this fast-moving field, providing expositions not found elsewhere. Although aimed primarily at researchers in complexity theory and graduate students in mathematics or computer science, the book is accessible to anyone with an undergraduate education in mathematics or computer science. By touching on some of the major topics in complexity theory, this book sheds light on this burgeoning area of research.

  4. Advances in Computer Science and its Applications

    CERN Document Server

    Yen, Neil; Park, James; CSA 2013

    2014-01-01

    The theme of CSA focuses on the various aspects of computer science and its applications, and the conference provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area. This book accordingly includes various theories and practical applications in computer science and its applications.

  5. International Conference on Advanced Computing for Innovation

    CERN Document Server

    Angelova, Galia; Agre, Gennady

    2016-01-01

    This volume is a selected collection of papers presented and discussed at the International Conference “Advanced Computing for Innovation” (AComIn 2015). The Conference was held on 10–11 November 2015 in Sofia, Bulgaria, and aimed at providing a forum for international scientific exchange between Central/Eastern Europe and the rest of the world on several fundamental topics of computational intelligence. The papers report innovative approaches and solutions to hot topics of computational intelligence – advanced computing, language and semantic technologies, signal and image processing, as well as optimization and intelligent control.

  6. Bringing Advanced Computational Techniques to Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  7. Soft computing in advanced robotics

    CERN Document Server

    Kobayashi, Ichiro; Kim, Euntai

    2014-01-01

    Intelligent systems and robotics are inevitably bound up: intelligent robots embody system integration by using intelligent systems. Intelligent systems are to robots what cells are to the body, and the two technologies have progressed in step. Leveraging robotics and intelligent systems, applications cover a boundless range, from our daily life to the space station: manufacturing, healthcare, environment, energy, education, personal assistance and logistics. This book aims at presenting research results relevant to intelligent robotics technology. We propose to researchers and practitioners some methods to advance intelligent systems and apply them to advanced robotics technology. This book consists of 10 contributions that feature mobile robots, robot emotion, electric power steering, multi-agent systems, fuzzy visual navigation, adaptive network-based fuzzy inference systems, swarm EKF localization and inspection robots. Th...

  8. Computer networks and advanced communications

    International Nuclear Information System (INIS)

    Koederitz, W.L.; Macon, B.S.

    1992-01-01

    One of the major methods for getting the most productivity and benefits from computer usage is networking. However, for those who are contemplating a change from stand-alone computers to a network system, the investigation of actual networks in use presents a paradox: network systems can be highly productive and beneficial; at the same time, these networks can create many complex, frustrating problems. The issue becomes a question of whether the benefits of networking are worth the extra effort and cost. In response to this issue, the authors review in this paper the implementation and management of an actual network in the LSU Petroleum Engineering Department. The network, which has been in operation for four years, is large and diverse (50 computers, 2 sites, PCs, UNIX RISC workstations, etc.). The benefits, costs, and method of operation of this network will be described, and an effort will be made to objectively weigh these elements from the point of view of the average computer user

  9. Advanced Materials for Quantum Computing

    Science.gov (United States)

    2010-04-28

    Project Name: Quantum Computing with Magnons. Co-PI: Leszek Malkinski, with postdoc Dr. Seong-Gi Min. 1. Brief Narrative: Quanta of spin waves called magnons can be used to exchange quantum information between solid-state qubits. The project was driven by the concept of a spinwave bus

  10. Preface (to: Advances in Computer Entertainment)

    NARCIS (Netherlands)

    Romão, Teresa; Nijholt, Antinus; Reidsma, Dennis

    2012-01-01

    These are the proceedings of the 9th International Conference on Advances in Computer Entertainment (ACE 2012). ACE has become the leading scientific forum for dissemination of cutting-edge research results in the area of entertainment computing. Interactive entertainment is one of the most vibrant

  11. Advanced laptop and small personal computer technology

    Science.gov (United States)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computer and mobile workstation technology are covered: background, applications, high-end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  12. Advance Trends in Soft Computing

    CERN Document Server

    Kreinovich, Vladik; Kacprzyk, Janusz; WCSC 2013

    2014-01-01

    This book is the proceedings of the 3rd World Conference on Soft Computing (WCSC), which was held in San Antonio, TX, USA, on December 16-18, 2013. It presents state-of-the-art theory and applications of soft computing together with an in-depth discussion of current and future challenges in the field, providing readers with a 360-degree view on soft computing. Topics range from fuzzy sets, to fuzzy logic, fuzzy mathematics, neuro-fuzzy systems, fuzzy control, decision making in fuzzy environments, image processing and many more. The book is dedicated to Lotfi A. Zadeh, a renowned specialist in signal analysis and control systems research who proposed the idea of fuzzy sets, in which an element may have a partial membership, in the early 1960s, followed by the idea of fuzzy logic, in which a statement can be true only to a certain degree, with degrees described by numbers in the interval [0,1]. The performance of fuzzy systems can often be improved with the help of optimization techniques, e.g. evolutionary co...
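    The notion of partial membership mentioned in this abstract is easy to make concrete: a fuzzy set is just a membership function mapping each element to a degree in [0, 1]. A minimal sketch follows; the "warm temperature" set and its breakpoints are invented illustration values, not from the book.

```python
def triangular(a, b, c):
    """Triangular fuzzy membership function: 0 outside [a, c], peaking at 1 at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)   # rising edge
        return (c - x) / (c - b)       # falling edge
    return mu

# Hypothetical fuzzy set "warm temperature" in degrees Celsius.
warm = triangular(15.0, 22.0, 30.0)
print(warm(22.0))  # 1.0 (full membership at the peak)
print(warm(18.5))  # 0.5 (partial membership)
```

    Fuzzy logic then extends Boolean connectives to such degrees, e.g. taking min for AND and max for OR in the classical Zadeh formulation.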

  13. Molecular Imaging : Computer Reconstruction and Practice - Proceedings of the NATO Advanced Study Institute on Molecular Imaging from Physical Principles to Computer Reconstruction and Practice

    CERN Document Server

    Lemoigne, Yves

    2008-01-01

    This volume collects the lectures presented at the ninth ESI School held at Archamps (FR) in November 2006 and is dedicated to nuclear physics applications in molecular imaging. The lectures focus on the multiple facets of image reconstruction processing and management and illustrate the role of digital imaging in clinical practice. Medical computing and image reconstruction are introduced by analysing the underlying physics principles and their implementation, relevant quality aspects, clinical performance and recent advancements in the field. Several stages of the imaging process are specifically addressed, e.g. optimisation of data acquisition and storage, distributed computing, physiology and detector modelling, computer algorithms for image reconstruction and measurement in tomography applications, for both clinical and biomedical research applications. All topics are presented in a didactic language and style, making this book an appropriate reference for students and professionals seeking a comprehen...

  14. Advances in computational plasma physics

    International Nuclear Information System (INIS)

    Jardin, S.C.

    1990-01-01

    Both the derivation of the Grad-Shafranov form of the plasma equilibrium equation and methods for its numerical solution are now well known. The present work discusses two topics related to computing plasma equilibrium that may not be quite as well known to some people. The first is a method to implement free-boundary boundary conditions efficiently; the second is a technique for obtaining plasma profiles that are stationary on the resistive time scale, as well as on the inertial time scale. (Author)

  15. Advances in randomized parallel computing

    CERN Document Server

    Rajasekaran, Sanguthevar

    1999-01-01

    The technique of randomization has been employed to solve numerous problems of computing, both sequentially and in parallel. Examples abound of randomized algorithms that are asymptotically better than their deterministic counterparts in solving various fundamental problems. Randomized algorithms have the advantages of simplicity and better performance, both in theory and often in practice. This book is a collection of articles written by renowned experts in the area of randomized parallel computing. A brief introduction to randomized algorithms: In the analysis of algorithms, at least three different measures of performance can be used: the best case, the worst case, and the average case. Often, the average case run time of an algorithm is much smaller than the worst case. For instance, the worst case run time of Hoare's quicksort is O(n²), whereas its average case run time is only O(n log n). The average case analysis is conducted with an assumption on the input space. The assumption made to arrive at t...
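    The quicksort example in this abstract is easy to make concrete. Below is a minimal sketch of a randomized quicksort: choosing the pivot uniformly at random yields expected O(n log n) comparisons on every input, whereas a fixed-pivot variant degrades to O(n²) on already-sorted data. The list-comprehension partitioning is chosen for clarity, not efficiency.

```python
import random

def randomized_quicksort(a):
    """Quicksort with a uniformly random pivot: expected O(n log n) comparisons
    on any input, versus O(n^2) worst case for a fixed-pivot variant on
    already-sorted data."""
    if len(a) <= 1:
        return list(a)
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

data = [5, 3, 8, 1, 9, 2, 7]
print(randomized_quicksort(data))  # [1, 2, 3, 5, 7, 8, 9]
```

    Randomizing the pivot does not change the worst case in the absolute sense; it makes the worst case occur only with vanishing probability, independent of the input ordering, which is exactly the average-case-under-assumptions point the abstract raises.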

  16. Advances and challenges in computational plasma science

    International Nuclear Information System (INIS)

    Tang, W M; Chan, V S

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. 

  17. Computation of Asteroid Proper Elements: Recent Advances

    Science.gov (United States)

    Knežević, Z.

    2017-12-01

    The recent advances in computation of asteroid proper elements are briefly reviewed. Although not representing real breakthroughs in computation and stability assessment of proper elements, these advances can still be considered as important improvements offering solutions to some practical problems encountered in the past. The problem of getting unrealistic values of perihelion frequency for very low eccentricity orbits is solved by computing frequencies using the frequency-modified Fourier transform. The synthetic resonant proper elements adjusted to a given secular resonance helped to prove the existence of Astraea asteroid family. The preliminary assessment of stability with time of proper elements computed by means of the analytical theory provides a good indication of their poorer performance with respect to their synthetic counterparts, and advocates in favor of ceasing their regular maintenance; the final decision should, however, be taken on the basis of more comprehensive and reliable direct estimate of their individual and sample average deviations from constancy.
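As a rough illustration of the idea behind such frequency analysis, the sketch below finds the strongest DFT peak of a complex time series (a plain peak search; the frequency-modified Fourier transform mentioned above refines this estimate to far higher accuracy):

```python
import cmath
import math

def dominant_frequency(signal, dt):
    """Return the frequency (cycles per unit time) of the strongest
    DFT bin of a complex-valued, evenly sampled time series."""
    n = len(signal)
    best_k, best_amp = 0, -1.0
    for k in range(n):
        coeff = sum(signal[j] * cmath.exp(-2j * math.pi * k * j / n)
                    for j in range(n))
        if abs(coeff) > best_amp:
            best_k, best_amp = k, abs(coeff)
    if best_k > n // 2:          # map bin index to a signed frequency
        best_k -= n
    return best_k / (n * dt)

# A pure rotation exp(2*pi*i*f*t) with f = 5 cycles per unit time:
sig = [cmath.exp(2j * math.pi * 5 * (t * 0.01)) for t in range(100)]
print(dominant_frequency(sig, 0.01))  # ≈ 5.0
```

For a proper element, the analogous input would be a filtered time series of the orbital elements, and the recovered fundamental frequency corresponds to a secular frequency such as the perihelion precession rate.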

  18. Computation of asteroid proper elements: Recent advances

    Directory of Open Access Journals (Sweden)

    Knežević Z.

    2017-01-01

    The recent advances in computation of asteroid proper elements are briefly reviewed. Although not representing real breakthroughs in computation and stability assessment of proper elements, these advances can still be considered as important improvements offering solutions to some practical problems encountered in the past. The problem of getting unrealistic values of perihelion frequency for very low eccentricity orbits is solved by computing frequencies using the frequency-modified Fourier transform. The synthetic resonant proper elements adjusted to a given secular resonance helped to prove the existence of Astraea asteroid family. The preliminary assessment of stability with time of proper elements computed by means of the analytical theory provides a good indication of their poorer performance with respect to their synthetic counterparts, and advocates in favor of ceasing their regular maintenance; the final decision should, however, be taken on the basis of more comprehensive and reliable direct estimate of their individual and sample average deviations from constancy.

  19. Computational electromagnetics recent advances and engineering applications

    CERN Document Server

    2014-01-01

    Emerging Topics in Computational Electromagnetics presents recent advances in computational electromagnetics (CEM). This book is designed to fill the existing gap in the current CEM literature, which covers only the conventional numerical techniques for solving traditional EM problems. The book examines new algorithms, and applications of these algorithms for solving problems of current interest that are not readily amenable to efficient treatment using existing techniques. The authors discuss solution techniques for problems arising in nanotechnology, bioEM, metamaterials, as well as multiscale problems. They present techniques that utilize recent advances in computer technology, such as parallel architectures, and address the increasing need to solve large and complex problems in a time-efficient manner by using highly scalable algorithms.

  20. Advanced computational electromagnetic methods and applications

    CERN Document Server

    Li, Wenxing; Elsherbeni, Atef; Rahmat-Samii, Yahya

    2015-01-01

    This new resource covers the latest developments in computational electromagnetic methods, with emphasis on cutting-edge applications. This book is designed to extend the existing literature to the latest developments in computational electromagnetic methods, which are of interest to readers in both academic and industrial areas. The topics include advanced techniques in MoM, FEM and FDTD, the spectral domain method, GPU and Phi hardware acceleration, metamaterials, frequency- and time-domain integral equations, and statistical methods in bio-electromagnetics.

  1. Advances and Challenges in Computational Plasma Science

    International Nuclear Information System (INIS)

    Tang, W.M.; Chan, V.S.

    2005-01-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology

  2. Advances and Challenges in Computational Plasma Science

    Energy Technology Data Exchange (ETDEWEB)

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  3. Advanced computers and Monte Carlo

    International Nuclear Information System (INIS)

    Jordan, T.L.

    1979-01-01

    High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as the Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described, and some of its properties and resource requirements are identified for comparison with the corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programmer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables
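The asynchronous parallelism the author argues for can be sketched in miniature (a hypothetical illustration, not the production code discussed): independent Monte Carlo batches run concurrently, and their results are folded together in whatever order they finish, with no synchronization barrier between workers.

```python
import random
from concurrent.futures import ThreadPoolExecutor, as_completed

def batch_hits(n, seed):
    """One independent Monte Carlo batch: darts inside the unit quarter circle."""
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

def estimate_pi(n_batches=8, n_per_batch=100_000):
    hits = 0
    # Batches share no state, so workers never wait on one another;
    # results are folded in completion order, not submission order.
    # (A production code would use processes or MPI ranks for true parallelism.)
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(batch_hits, n_per_batch, s) for s in range(n_batches)]
        for fut in as_completed(futures):
            hits += fut.result()
    return 4.0 * hits / (n_batches * n_per_batch)

if __name__ == "__main__":
    print(estimate_pi())  # ≈ 3.14
```

Because each batch has its own seeded generator, the estimate is reproducible even though completion order is not.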

  4. Computational Intelligence Paradigms in Advanced Pattern Classification

    CERN Document Server

    Jain, Lakhmi

    2012-01-01

    This monograph presents selected areas of application of pattern recognition and classification approaches, including handwriting recognition, medical image analysis and interpretation, development of cognitive systems for computer image understanding, moving object detection, advanced image filtration, and intelligent multi-object labelling and classification. Scientists, application engineers, professors, and students will find this book useful.

  5. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    Energy Technology Data Exchange (ETDEWEB)

    None

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  6. OPENING REMARKS: Scientific Discovery through Advanced Computing

    Science.gov (United States)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  7. Advances in computers improving the web

    CERN Document Server

    Zelkowitz, Marvin

    2010-01-01

    This is volume 78 of Advances in Computers. This series, which began publication in 1960, is the oldest continuously published anthology that chronicles the ever- changing information technology field. In these volumes we publish from 5 to 7 chapters, three times per year, that cover the latest changes to the design, development, use and implications of computer technology on society today.Covers the full breadth of innovations in hardware, software, theory, design, and applications.Many of the in-depth reviews have become standard references that continue to be of significant, lasting value i

  8. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  9. Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  10. Advanced techniques for computer-controlled polishing

    Science.gov (United States)

    Schinhaerl, Markus; Stamp, Richard; Pitschke, Elmar; Rascher, Rolf; Smith, Lyndon; Smith, Gordon; Geiss, Andreas; Sperber, Peter

    2008-08-01

    Computer-controlled polishing has introduced determinism into the finishing of high-quality surfaces, for example those used as optical interfaces. Computer-controlled polishing may overcome many of the disadvantages of traditional polishing techniques. The polishing procedure is computed in terms of the surface error-profile and the material removal characteristic of the polishing tool, the influence function. Determinism and predictability not only enable more economical manufacture but also facilitate considerably increased processing accuracy. However, there are several disadvantages that serve to limit the capabilities of computer-controlled polishing, many of these are considered to be issues associated with determination of the influence function. Magnetorheological finishing has been investigated and various new techniques and approaches that dramatically enhance the potential as well as the economics of computer-controlled polishing have been developed and verified experimentally. Recent developments and advancements in computer-controlled polishing are discussed. The generic results of this research may be used in a wide variety of alternative applications in which controlled material removal is employed to achieve a desired surface specification, ranging from surface treatment processes in technical disciplines, to manipulation of biological surface textures in medical technologies.
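The deterministic principle described above, that predicted material removal is the tool's influence function convolved with its dwell time across the surface, can be illustrated with a minimal one-dimensional sketch (hypothetical numbers, not the authors' magnetorheological process; in practice the dwell map is obtained by deconvolving the measured error profile):

```python
def convolve(dwell, influence):
    """Predicted material removal: discrete convolution of the dwell-time
    map with the tool's influence function (removal rate footprint)."""
    n = len(dwell) + len(influence) - 1
    out = [0.0] * n
    for i, d in enumerate(dwell):
        for j, h in enumerate(influence):
            out[i + j] += d * h
    return out

# Hypothetical values: a triangular influence function, and a dwell map
# that lingers over positions 2-4, where the error profile is deepest.
influence = [0.2, 0.6, 0.2]                 # removal per unit dwell
dwell     = [0, 0, 1.0, 2.0, 1.0, 0, 0]     # seconds at each position
removal   = convolve(dwell, influence)
```

Total removal equals total dwell times the integrated influence function, which is why an inaccurately measured influence function directly limits the achievable figure accuracy.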

  11. Advanced Propulsor Design Studies

    Science.gov (United States)

    1985-06-01

    Report No. CG-D-1-86, Advanced Propulsor Design Studies, by Walter S. Gearhart. This document is available to the U.S. ... Contents include: USCG hulls considered in the preliminary design studies; preliminary design investigation.

  12. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  13. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.

  14. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    International Nuclear Information System (INIS)

    Larsen, R

    2008-01-01

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R and D including application of HA principles to power electronics systems

  15. An introduction to NASA's advanced computing program: Integrated computing systems in advanced multichip modules

    Science.gov (United States)

    Fang, Wai-Chi; Alkalai, Leon

    1996-01-01

    Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.

  16. International Conference on Computers and Advanced Technology in Education

    CERN Document Server

    Advanced Information Technology in Education

    2012-01-01

    The volume includes a set of selected papers, extended and revised, from the 2011 International Conference on Computers and Advanced Technology in Education. With the development of computers and advanced technology, human social activities are changing fundamentally. Education, and especially education reform in different countries, has benefited greatly from computers and advanced technology. Generally speaking, education is a field that needs ever more information, and computers, advanced technology and the internet are good information providers. With their aid, education can be made more effective. Therefore, computers and advanced technology should be regarded as an important medium in modern education. The volume Advanced Information Technology in Education is to provide a forum for researchers, educators, engineers, and government officials involved in the general areas of computers and advanced technology in education to d...

  17. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, historical perspective and recent advances are reviewed on computational technologies to evaluate a transition phase of core disruptive accidents in liquid-metal fast reactors. An analysis of the transition phase requires treatment of multi-phase multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort was initiated when the program of SIMMER-series computer code development was initiated in the late 1970s in the USA. Successful application of the latest SIMMER-II in USA, western Europe and Japan have proved its effectiveness, but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II application through 1980s, a new project of SIMMER-III development is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described with emphasis on recent advances in multi-phase multi-component fluid dynamics technologies and their expected implication on a future reliable transition phase analysis. (author)

  18. New or improved computational methods and advanced reactor design

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Takeda, Toshikazu; Ushio, Tadashi

    1997-01-01

    Computational methods for nuclear analysis have been studied continuously, as a fundamental technology supporting nuclear development. At present, research on computational methods based on new theory, and on calculation techniques previously considered impractical, continues actively, spurred by the remarkable improvement in computer performance. In Japan, many light water reactors are now in operation, new computational methods are being introduced for nuclear design, and considerable effort is devoted to further improving economics and safety. In this paper, some new research results on nuclear computational methods and their application to reactor design are described, introducing recent trends in the nuclear design of reactors: 1) Advancement of computational methods, 2) Core design and management of light water reactors, and 3) Nuclear design of fast reactors. (G.K.)

  19. Transport modeling and advanced computer techniques

    International Nuclear Information System (INIS)

    Wiley, J.C.; Ross, D.W.; Miner, W.H. Jr.

    1988-11-01

    A workshop was held at the University of Texas in June 1988 to consider the current state of transport codes and whether improved user interfaces would make the codes more usable and accessible to the fusion community. Also considered was the possibility that a software standard could be devised to ease the exchange of routines between groups. It was noted that two of the major obstacles to exchanging routines now are the variety of geometrical representations and choices of units. While the workshop formulated no standards, it was generally agreed that good software engineering would aid in the exchange of routines, and that a continued exchange of ideas between groups would be worthwhile. It seems that before we begin to discuss software standards we should review the current state of computer technology, both hardware and software, to see what influence recent advances might have on our software goals. This is done in this paper

  20. Advanced proton imaging in computed tomography

    CERN Document Server

    Mattiazzo, S; Giubilato, P; Pantano, D; Pozzobon, N; Snoeys, W; Wyss, J

    2015-01-01

    In recent years the use of hadrons for cancer radiation treatment has grown in importance, and many facilities are currently operational or under construction worldwide. To fully exploit the therapeutic advantages offered by hadron therapy, precise body imaging for accurate beam delivery is decisive. Proton computed tomography (pCT) scanners, currently in their R&D phase, provide the ultimate 3D imaging for hadrons treatment guidance. A key component of a pCT scanner is the detector used to track the protons, which has great impact on the scanner performances and ultimately limits its maximum speed. In this article, a novel proton-tracking detector was presented that would have higher scanning speed, better spatial resolution and lower material budget with respect to present state-of-the-art detectors, leading to enhanced performances. This advancement in performances is achieved by employing the very latest development in monolithic active pixel detectors (to build high granularity, low material budget, ...

  1. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    Science.gov (United States)

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  2. [Advance in brain-computer interface technology].

    Science.gov (United States)

    Yang, Kunde; Tian, Mengjun; Zhang, Hainan; Zhao, Yamei

    2004-12-01

    This paper introduces brain-computer interface (BCI) technology, one of the youngest and most rapidly growing research fields in biomedical engineering, which can provide augmentative communication and control capabilities to patients with severe motor disabilities. We summarize the first two international BCI meetings and present the most representative research results. The problems in current studies and the directions for future investigation are analyzed.

  3. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Science.gov (United States)

    2012-03-02

    ... Exascale ARRA projects; Magellan final report; Advanced Networking update; status from the Computer Science COV; Early Career technical talks; summary of the Applied Math and Computer Science workshops; ASCR's new SBIR...

  4. First Responders Guide to Computer Forensics: Advanced Topics

    National Research Council Canada - National Science Library

    Nolan, Richard; Baker, Marie; Branson, Jake; Hammerstein, Josh; Rush, Kris; Waits, Cal; Schweinsberg, Elizabeth

    2005-01-01

    First Responders Guide to Computer Forensics: Advanced Topics expands on the technical material presented in SEI handbook CMU/SEI-2005-HB-001, First Responders Guide to Computer Forensics [Nolan 05...

  5. Advanced proton imaging in computed tomography.

    Science.gov (United States)

    Mattiazzo, S; Bisello, D; Giubilato, P; Pantano, D; Pozzobon, N; Snoeys, W; Wyss, J

    2015-09-01

    In recent years the use of hadrons for cancer radiation treatment has grown in importance, and many facilities are currently operational or under construction worldwide. To fully exploit the therapeutic advantages offered by hadron therapy, precise body imaging for accurate beam delivery is decisive. Proton computed tomography (pCT) scanners, currently in their R&D phase, provide the ultimate 3D imaging for hadrons treatment guidance. A key component of a pCT scanner is the detector used to track the protons, which has great impact on the scanner performances and ultimately limits its maximum speed. In this article, a novel proton-tracking detector was presented that would have higher scanning speed, better spatial resolution and lower material budget with respect to present state-of-the-art detectors, leading to enhanced performances. This advancement in performances is achieved by employing the very latest development in monolithic active pixel detectors (to build high granularity, low material budget, large area silicon detectors) and a completely new proprietary architecture (to effectively compress the data). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  7. Advanced topics in security computer system design

    International Nuclear Information System (INIS)

    Stachniak, D.E.; Lamb, W.R.

    1989-01-01

    The capability, performance, and speed of contemporary computer processors, plus the associated performance capability of the operating systems accommodating the processors, have enormously expanded the scope of possibilities for designers of nuclear power plant security computer systems. This paper addresses the choices that could be made by a designer of security computer systems working with contemporary computers and describes the improvement in functionality of contemporary security computer systems based on an optimally chosen design. Primary initial considerations concern the selection of (a) the computer hardware and (b) the operating system. Considerations for hardware selection concern processor and memory word length, memory capacity, and numerous processor features

  8. Some Recent Advances in Computer Graphics.

    Science.gov (United States)

    Whitted, Turner

    1982-01-01

    General principles of computer graphics are reviewed, including discussions of display hardware, geometric modeling, algorithms, and applications in science, computer-aided design, flight training, communications, business, art, and entertainment. (JN)

  9. Designing Serious Computer Games for People With Moderate and Advanced Dementia: Interdisciplinary Theory-Driven Pilot Study

    Science.gov (United States)

    Gross, Daniel; Abikhzer, Judith

    2017-01-01

    Background The field of serious games for people with dementia (PwD) is mostly driven by game-design principles typically applied to games created by and for younger individuals. Little has been done developing serious games to help PwD maintain cognition and to support functionality. Objectives We aimed to create a theory-based serious game for PwD, with input from a multi-disciplinary team familiar with aging, dementia, and gaming theory, as well as direct input from end users (the iterative process). Targeting enhanced self-efficacy in daily activities, the goal was to generate a game that is acceptable, accessible and engaging for PwD. Methods The theory-driven game development was based on the following learning theories: learning in context, errorless learning, building on capacities, and acknowledging biological changes, all with the aim to boost self-efficacy. The iterative participatory process was used for game screen development with input of 34 PwD and 14 healthy community dwelling older adults, aged over 65 years. Development of game screens was informed by the bio-psychological aging-related disabilities (ie, motor, visual, and perception) as well as remaining neuropsychological capacities (ie, implicit memory) of PwD. At the conclusion of the iterative development process, a prototype game with 39 screens was used for a pilot study with 24 PwD and 14 healthy community dwelling older adults. The game was played twice weekly for 10 weeks. Results Quantitative analysis showed that the average speed of successful screen completion was significantly longer for PwD compared with healthy older adults. Both PwD and controls showed an equivalent linear increase in the speed of task completion with practice by the third session. PwD found the game engaging and fun. Healthy older adults found the game too easy. An increase in self-reported self-efficacy was documented for PwD only. Conclusions Our study demonstrated that PwD's speed improved with practice at the same rate

  10. Designing Serious Computer Games for People With Moderate and Advanced Dementia: Interdisciplinary Theory-Driven Pilot Study.

    Science.gov (United States)

    Tziraki, Chariklia; Berenbaum, Rakel; Gross, Daniel; Abikhzer, Judith; Ben-David, Boaz M

    2017-07-31

    The field of serious games for people with dementia (PwD) is mostly driven by game-design principles typically applied to games created by and for younger individuals. Little has been done developing serious games to help PwD maintain cognition and to support functionality. We aimed to create a theory-based serious game for PwD, with input from a multi-disciplinary team familiar with aging, dementia, and gaming theory, as well as direct input from end users (the iterative process). Targeting enhanced self-efficacy in daily activities, the goal was to generate a game that is acceptable, accessible and engaging for PwD. The theory-driven game development was based on the following learning theories: learning in context, errorless learning, building on capacities, and acknowledging biological changes, all with the aim to boost self-efficacy. The iterative participatory process was used for game screen development with input of 34 PwD and 14 healthy community dwelling older adults, aged over 65 years. Development of game screens was informed by the bio-psychological aging-related disabilities (ie, motor, visual, and perception) as well as remaining neuropsychological capacities (ie, implicit memory) of PwD. At the conclusion of the iterative development process, a prototype game with 39 screens was used for a pilot study with 24 PwD and 14 healthy community dwelling older adults. The game was played twice weekly for 10 weeks. Quantitative analysis showed that the average speed of successful screen completion was significantly longer for PwD compared with healthy older adults. Both PwD and controls showed an equivalent linear increase in the speed of task completion with practice by the third session. PwD found the game engaging and fun. Healthy older adults found the game too easy. An increase in self-reported self-efficacy was documented for PwD only. Our study demonstrated that PwD's speed improved with practice at the same rate as healthy older adults. 
This implies that when tasks

  11. An advanced study on the hydrometallurgical processing of waste computer printed circuit boards to extract their valuable content of metals.

    Science.gov (United States)

    Birloaga, Ionela; Coman, Vasile; Kopacek, Bernd; Vegliò, Francesco

    2014-12-01

    This study investigates two chemical leaching systems for the extraction of base and precious metals from waste printed circuit boards (WPCBs): sulfuric acid with hydrogen peroxide was used for the base metals, while thiourea with ferric ion in sulfuric acid medium was employed for the precious metals. Cementation with zinc, copper and iron metal powders was attempted for solution purification. The effects of hydrogen peroxide volume relative to sulfuric acid concentration and of temperature were evaluated for the oxidative leaching process. 2 M H2SO4 (98% w/v), 5% H2O2, 25 °C, a 1/10 S/L ratio and 200 rpm were found to be the optimal conditions for Cu extraction. The thiourea acid leaching process, performed on the solid residue obtained after three oxidative leaching steps, was carried out with 20 g/L of CS(NH2)2, 6 g/L of Fe(3+) and 0.5 M H2SO4. The cross-leaching method was applied by reusing the thiourea leach liquor and adding 5 g/L of fresh reagent for each subsequent leaching experiment. This procedure led to the doubling and tripling, respectively, of the gold and silver concentrations in solution. These results reveal a very efficient, promising and environmentally friendly method for WPCB processing. Copyright © 2014 Elsevier Ltd. All rights reserved.
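    The optimal conditions quoted above (2 M H2SO4, 5% H2O2, 1/10 S/L ratio) translate directly into reagent quantities per batch. A minimal sketch, assuming that S/L = 1/10 means 1 g of WPCB material per 10 mL of leach liquor and that the 5% H2O2 is measured by volume (both assumptions, not stated in the record):

```python
# Back-of-envelope reagent estimate for one oxidative leaching step.
# Assumes S/L = 1/10 (1 g solid per 10 mL liquor) and H2O2 dosed by volume.
def leach_reagents(wpcb_mass_g, s_l_ratio=0.1, h2o2_frac=0.05):
    """Return (total liquor mL, H2O2 mL, 2 M H2SO4 mL) for a given solid mass."""
    liquor_ml = wpcb_mass_g / s_l_ratio   # total liquid volume from the S/L ratio
    h2o2_ml = liquor_ml * h2o2_frac       # 5% of the volume as hydrogen peroxide
    h2so4_ml = liquor_ml - h2o2_ml        # balance as 2 M sulfuric acid
    return liquor_ml, h2o2_ml, h2so4_ml

print(leach_reagents(100))  # 100 g of boards -> (1000.0, 50.0, 950.0)
```

This only budgets volumes; the record's three sequential oxidative steps would each use a fresh batch of liquor.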

  12. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book combines theoretical advances in big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advances are supported with illustrative examples and applications to handling real-life problems, mostly drawn from real situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  13. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    HumanCom focuses on the various aspects of human-centric computing for advances in computer science and its applications, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. EMC (Advances in Embedded and Multimedia Computing) focuses on the various aspects of embedded systems, smart grid, cloud and multimedia computing, and likewise provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. This book therefore includes various theories and practical applications in human-centric computing and in embedded and multimedia computing.

  14. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. 
The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  15. Advanced drilling systems study.

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, Kenneth G.; Livesay, Billy Joe; Finger, John Travis (Livesay Consultants, Encintas, CA)

    1996-05-01

    This report documents the results of a study of advanced drilling concepts conducted jointly for the Natural Gas Technology Branch and the Geothermal Division of the U.S. Department of Energy. A number of alternative rock cutting concepts and drilling systems are examined. The systems cover the range from current technology, through ongoing efforts in drilling research, to highly speculative concepts. Cutting mechanisms that induce stress mechanically, hydraulically, and thermally are included. All functions necessary to drill and case a well are considered. Capital and operating costs are estimated and performance requirements, based on comparisons of the costs for alternative systems to conventional drilling technology, are developed. A number of problems common to several alternatives and to current technology are identified and discussed.

  16. Advanced Computing Architectures for Cognitive Processing

    Science.gov (United States)

    2009-07-01

    for HPRC applications. 4.2 MODELS OF COMPUTATION Ptolemy is a software framework developed at the University of California, Berkeley and is used for...mixing of different “models of computation”. A model of computation varies from another mainly in its notion of “time”. Ptolemy II is a JAVA-based...concurrent or sequential components. Ptolemy II includes a suite of domains, each of which realizes a model of computation. It also includes a
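    The record's point that each "model of computation" differs mainly in its notion of "time" can be sketched outside Ptolemy. The toy discrete-event scheduler below (hypothetical Python, not Ptolemy II's Java API) fires components strictly in timestamp order, which is what distinguishes a discrete-event domain from an untimed dataflow one:

```python
import heapq

# Toy discrete-event "domain": pending firings are (time, label) pairs,
# and the scheduler's only job is to process them in timestamp order.
def run_discrete_event(events):
    """events: list of (time, label). Return firings sorted by model time."""
    heapq.heapify(events)          # priority queue keyed on timestamp
    order = []
    while events:
        t, label = heapq.heappop(events)
        order.append((t, label))   # an actor would "fire" here
    return order

trace = run_discrete_event([(2.0, "sensor"), (0.5, "clock"), (1.0, "filter")])
# trace == [(0.5, 'clock'), (1.0, 'filter'), (2.0, 'sensor')]
```

A dataflow domain, by contrast, would ignore timestamps entirely and fire actors whenever their input tokens are available.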

  17. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Troy Reed; Stefan Miska; Nicholas Takach; Kaveh Ashenayi; Gerald Kane; Mark Pickell; Len Volk; Mike Volk; Affonso Lourenco; Evren Ozbayoglu; Lei Zhou

    2002-01-30

    This is the second quarterly progress report for Year 3 of the ACTS project. It includes a review of progress made in: (1) Flow Loop development and (2) research tasks during the period of time between Oct 1, 2001 and Dec. 31, 2001. This report presents a review of progress on the following specific tasks: (a) Design and development of an Advanced Cuttings Transport Facility (Task 3: Addition of a Cuttings Injection/Collection System), (b) Research project (Task 6): ''Study of Cuttings Transport with Foam Under LPAT Conditions (Joint Project with TUDRP)'', (c) Research project (Task 9): ''Study of Foam Flow Behavior Under EPET Conditions'', (d) Research project (Task 10): ''Study of Cuttings Transport with Aerated Mud Under Elevated Pressure and Temperature Conditions'', (e) Research on instrumentation tasks to measure: Cuttings concentration and distribution in a flowing slurry (Task 11), and Foam properties while transporting cuttings. (Task 12), (f) Development of a Safety program for the ACTS Flow Loop. Progress on a comprehensive safety review of all flow-loop components and operational procedures. (Task 1S). (g) Activities towards technology transfer and developing contacts with Petroleum and service company members, and increasing the number of JIP members.

  18. Application of advanced computational technology to propulsion CFD

    Science.gov (United States)

    Szuch, John R.

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.

  19. Second International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Konar, Amit; Chakraborty, Aruna

    2014-01-01

    Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, pervasive computing amidst many others. This two-volume proceedings explore the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 148 scholarly papers, which have been accepted for presentation from over 640 submissions in the second International Conference on Advanced Computing, Networking and Informatics, 2014, held in Kolkata, India during June 24-26, 2014. The first volume includes innovative computing techniques and relevant research results in informatics with selective applications in pattern recognition, signal/image process...

  20. Advances in Future Computer and Control Systems v.1

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems(FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference focusing on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers all around the world.

  1. Advances in Future Computer and Control Systems v.2

    CERN Document Server

    Lin, Sally; 2012 International Conference on Future Computer and Control Systems(FCCS2012)

    2012-01-01

    FCCS2012 is an integrated conference focusing on Future Computer and Control Systems. “Advances in Future Computer and Control Systems” presents the proceedings of the 2012 International Conference on Future Computer and Control Systems (FCCS2012), held April 21-22, 2012, in Changsha, China, including recent research results on Future Computer and Control Systems from researchers all around the world.

  2. Preface: Special issue on advances in computer entertainment

    OpenAIRE

    Nijholt, Antinus; Romão, T.; Romão, Teresa; Cheok, Adrian D.; Cheok, A.D.

    2013-01-01

    This special issue of the International Journal of Creative Interfaces and Computer Graphics contains a selection of papers from ACE 2012, the 9th International Conference on Advances in Computer Entertainment (Nijholt et al., 2012). ACE is the leading scientific forum for dissemination of cutting-edge research results in the area of entertainment computing. The main goal of ACE is to stimulate discussion in the development of new and compelling entertainment computing and interactive art con...

  3. Computing Algorithms for Nuffield Advanced Physics.

    Science.gov (United States)

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course, to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
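    Recurrence relations of the kind described turn a differential equation into a step-by-step update rule. As a hedged illustration (not the Nuffield course's actual algorithm), the Euler recurrence y[n+1] = y[n] + h·f(x[n], y[n]) for a first-order equation can be generated as follows:

```python
# Euler recurrence for a first-order ODE dy/dx = f(x, y):
#   y[n+1] = y[n] + h * f(x[n], y[n]),  x[n+1] = x[n] + h
def euler(f, x0, y0, h, steps):
    x, y = x0, y0
    for _ in range(steps):
        y += h * f(x, y)   # apply the recurrence once
        x += h
    return y

# Example: dy/dx = y with y(0) = 1; y(1) should approach e ~ 2.718
approx_e = euler(lambda x, y: y, 0.0, 1.0, 0.001, 1000)
```

Second-order equations are handled the same way by carrying a pair of recurrences, one for the value and one for its derivative.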

  4. Power-efficient computer architectures recent advances

    CERN Document Server

    Själander, Magnus; Kaxiras, Stefanos

    2014-01-01

    As Moore's Law and Dennard scaling trends have slowed, the challenges of building high-performance computer architectures while maintaining acceptable power efficiency levels have heightened. Over the past ten years, architecture techniques for power efficiency have shifted from primarily focusing on module-level efficiencies, toward more holistic design styles based on parallelism and heterogeneity. This work highlights and synthesizes recent techniques and trends in power-efficient computer architecture.Table of Contents: Introduction / Voltage and Frequency Management / Heterogeneity and Sp

  5. Design and installation of advanced computer safety related instrumentation

    International Nuclear Information System (INIS)

    Koch, S.; Andolina, K.; Ruether, J.

    1993-01-01

    The rapidly developing area of computer systems creates new opportunities for commercial utilities operating nuclear reactors to improve plant operation and efficiency. Two of the main obstacles to utilizing the new technology in safety-related applications are the current policy of the licensing agencies and the reluctance of decision-making managers to introduce new technologies. Once these obstacles are overcome, advanced diagnostic systems, CRT-based displays, and advanced communication channels can improve plant operation considerably. The article discusses outstanding issues in the area of designing, qualifying, and licensing computer-based instrumentation and control systems. The authors describe the experience gained in designing three safety-related systems, which include a Programmable Logic Controller (PLC) based Safeguard Load Sequencer for NSP Prairie Island, a digital Containment Isolation monitoring system for TVA Browns Ferry, and a study that was conducted for EPRI/NSP regarding a PLC-based Reactor Protection system. This article presents the benefits to be gained by replacing existing, outdated equipment with new advanced instrumentation

  6. Advanced Computational Methods for Monte Carlo Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-12

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.
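    As a reminder of the basic understanding the course presupposes, the canonical introductory Monte Carlo example estimates pi by random sampling (a minimal sketch, not course material):

```python
import random

def estimate_pi(n, seed=0):
    """Basic Monte Carlo: the fraction of uniform random points that land
    inside the unit quarter-circle approximates pi/4."""
    rng = random.Random(seed)  # seeded for reproducibility
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    return 4.0 * hits / n

print(estimate_pi(100_000))  # close to 3.14; statistical error shrinks as 1/sqrt(n)
```

The advanced topics the course covers (variance reduction, tallies, parallel random-number streams) all build on this same sample-and-average structure.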

  7. Advances in Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered include signal and image processing, speech and audio processing, video processing and analysis, artificial intelligence, computing and intelligent systems, machine learning, sensor and neural networks, knowledge discovery and data mining, fuzzy mathematics and applications, knowledge-based systems, hybrid systems modeling and design, risk analysis and management, and system modeling and simulation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find them stimulating.

  8. CISM-IUTAM School on Advanced Turbulent Flow Computations

    CERN Document Server

    Krause, Egon

    2000-01-01

    This book collects the lecture notes of the IUTAM School on Advanced Turbulent Flow Computations held at CISM in Udine, September 7–11, 1998. The course was intended for scientists, engineers and post-graduate students interested in the application of advanced numerical techniques for simulating turbulent flows. The topic comprises two closely connected main subjects, modelling and computation, including the number of mesh points necessary to simulate complex turbulent flows.

  9. 3rd International Conference on Advanced Computing, Networking and Informatics

    CERN Document Server

    Mohapatra, Durga; Chaki, Nabendu

    2016-01-01

    Advanced Computing, Networking and Informatics are three distinct and mutually exclusive disciplines of knowledge with no apparent sharing/overlap among them. However, their convergence is observed in many real world applications, including cyber-security, internet banking, healthcare, sensor networks, cognitive radio, pervasive computing amidst many others. This two volume proceedings explore the combined use of Advanced Computing and Informatics in the next generation wireless networks and security, signal and image processing, ontology and human-computer interfaces (HCI). The two volumes together include 132 scholarly articles, which have been accepted for presentation from over 550 submissions in the Third International Conference on Advanced Computing, Networking and Informatics, 2015, held in Bhubaneswar, India during June 23–25, 2015.

  10. Advanced intelligent computational technologies and decision support systems

    CERN Document Server

    Kountchev, Roumen

    2014-01-01

    This book offers a state-of-the-art collection covering themes related to Advanced Intelligent Computational Technologies and Decision Support Systems, which can be applied to fields like healthcare, assisting humans in solving problems. The book brings forward a wealth of ideas, algorithms and case studies in themes like: intelligent predictive diagnosis; intelligent analysis of medical images; a new format for coding of single and sequences of medical images; Medical Decision Support Systems; diagnosis of Down’s syndrome; computational perspectives for electronic fetal monitoring; efficient compression of CT images; adaptive interpolation and halftoning for medical images; applications of artificial neural networks for solving real-life problems; the present state and perspectives of Electronic Healthcare Record Systems; adaptive approaches for noise reduction in sequences of CT images etc.

  11. Communities advance when computers speak their language ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2012-11-05

    Citizens in remote rural areas in 11 Asian countries are leaping over language barriers and into the Internet age. They may now be able to access government services online, and submit college applications without making an arduous trek to the city. And their children are learning the computer skills that ...

  12. Preface: Special issue on advances in computer entertainment

    NARCIS (Netherlands)

    Nijholt, Antinus; Romão, T.; Romão, Teresa; Cheok, Adrian D.; Cheok, A.D.

    2013-01-01

    This special issue of the International Journal of Creative Interfaces and Computer Graphics contains a selection of papers from ACE 2012, the 9th International Conference on Advances in Computer Entertainment (Nijholt et al., 2012). ACE is the leading scientific forum for dissemination of

  13. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  14. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  15. Advances in Computer Science and Education

    CERN Document Server

    Huang, Xiong

    2012-01-01

    CSE2011 is an integrated conference concentrating its focus on computer science and education. In the proceedings, readers can learn about the work of researchers from all around the world on computer science and education. The main role of the proceedings is to serve as an exchange pillar for researchers working in the mentioned fields. In order to meet the high quality standards of Springer's AISC series, the organization committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous expert referees. Secondly, periodic review meetings were held with the reviewers, about five times, to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference was successful and fruitful

  16. Advances in Computer-Based Autoantibodies Analysis

    Science.gov (United States)

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for the clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  17. Extending the horizons advances in computing, optimization, and decision technologies

    CERN Document Server

    Joseph, Anito; Mehrotra, Anuj; Trick, Michael

    2007-01-01

    Computer Science and Operations Research continue to have a synergistic relationship and this book represents the results of cross-fertilization between OR/MS and CS/AI. It is this interface of OR/CS that makes possible advances that could not have been achieved in isolation. Taken collectively, these articles are indicative of the state-of-the-art in the interface between OR/MS and CS/AI and of the high caliber of research being conducted by members of the INFORMS Computing Society. EXTENDING THE HORIZONS: Advances in Computing, Optimization, and Decision Technologies is a volume that presents the latest, leading research in the design and analysis of algorithms, computational optimization, heuristic search and learning, modeling languages, parallel and distributed computing, simulation, computational logic and visualization. This volume also emphasizes a variety of novel applications in the interface of CS, AI, and OR/MS.

  18. ADVANCED CUTTINGS TRANSPORT STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Stefan Miska; Troy Reed; Ergun Kuru

    2004-09-30

    The Advanced Cuttings Transport Study (ACTS) was a 5-year JIP project undertaken at the University of Tulsa (TU). The project was sponsored by the U.S. Department of Energy (DOE) and JIP member companies. The objectives of the project were: (1) to develop and construct a new research facility that would allow three-phase (gas, liquid and cuttings) flow experiments under ambient and EPET (elevated pressure and temperature) conditions, and at different angles of inclination and drill pipe rotation speeds; (2) to conduct experiments and develop a data base for the industry and academia; and (3) to develop mechanistic models for optimization of drilling hydraulics and cuttings transport. This project consisted of research studies, flow loop construction and instrumentation development. Following a one-year period for basic flow loop construction, a proposal was submitted by TU to the DOE for a five-year project that was organized in such a manner as to provide a logical progression of research experiments as well as additions to the basic flow loop. The flow loop additions and improvements included: (1) elevated temperature capability; (2) two-phase (gas and liquid, foam etc.) capability; (3) cuttings injection and removal system; (4) drill pipe rotation system; and (5) drilling section elevation system. In parallel with the flow loop construction, hydraulics and cuttings transport studies were performed using drilling foams and aerated muds. In addition, hydraulics and rheology of synthetic drilling fluids were investigated. The studies were performed under ambient and EPET conditions. The effects of temperature and pressure on the hydraulics and cuttings transport were investigated. Mechanistic models were developed to predict frictional pressure loss and cuttings transport in horizontal and near-horizontal configurations. Model predictions were compared with the measured data. Predominantly, model predictions show satisfactory agreements with the measured data. 
As a
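    The frictional pressure loss modeling mentioned above builds on standard pipe-flow hydraulics. As a minimal illustrative sketch (the textbook Darcy-Weisbach relation with hypothetical numbers, not the project's mechanistic model):

```python
def darcy_weisbach_dp(friction_factor, length_m, diameter_m, density_kg_m3, velocity_m_s):
    """Frictional pressure loss (Pa) in pipe flow: dP = f * (L/D) * rho * v**2 / 2."""
    return friction_factor * (length_m / diameter_m) * density_kg_m3 * velocity_m_s ** 2 / 2.0

# Hypothetical drilling-mud numbers, for illustration only.
dp_pa = darcy_weisbach_dp(0.02, 100.0, 0.1, 1200.0, 1.5)
print(dp_pa)  # 27000.0 Pa over the 100 m section
```

    A mechanistic cuttings-transport model would replace the constant friction factor with correlations for fluid rheology, annular geometry, and cuttings bed height.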

  19. Recent Advances in Computational Mechanics of the Human Knee Joint

    Science.gov (United States)

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  20. Advances in computer imaging/applications in facial plastic surgery.

    Science.gov (United States)

    Papel, I D; Jiannetto, D F

    1999-01-01

    Rapidly progressing computer technology, ever-increasing expectations of patients, and a confusing medicolegal environment require a clarification of the role of computer imaging/applications. Advances in computer technology and its applications are reviewed. A brief historical discussion is included for perspective. Improvements in both hardware and software with the advent of digital imaging have allowed great increases in speed and accuracy in patient imaging. This facilitates doctor-patient communication and possibly realistic patient expectations. Patients seeking cosmetic surgery now often expect preoperative imaging. Although society in general has become more litigious, a literature search up to 1998 reveals no lawsuits directly involving computer imaging. It appears that conservative utilization of computer imaging by the facial plastic surgeon may actually reduce liability and promote communication. Recent advances have significantly enhanced the value of computer imaging in the practice of facial plastic surgery, and these technological advances appear to offer a useful technique for the field. Inclusion of computer imaging should be given serious consideration as an adjunct to clinical practice.

  1. Computational neuroscience for advancing artificial intelligence

    Directory of Open Access Journals (Sweden)

    Fernando P. Ponce

    2011-07-01

    Full Text Available Summary of the book by Alonso, E. and Mondragón, E. (2011). Hershey, NY: Medical Information Science Reference. Neuroscience as a discipline pursues an understanding of the brain and its relation to the functioning of the mind through analysis of the interaction of diverse physical, chemical and biological processes (Bassett & Gazzaniga, 2011). Numerous other disciplines, such as mathematics, psychology and philosophy, have progressively made significant contributions to this enterprise. As a product of this effort, complementary disciplines such as cognitive neuroscience, neuropsychology and computational neuroscience have appeared alongside traditional neuroscience (Bengio, 2007; Dayan & Abbott, 2005). In the context of computational neuroscience as a discipline complementary to traditional neuroscience, Alonso and Mondragón (2011) edited the book Computational Neuroscience for Advancing Artificial Intelligence: Models, Methods and Applications.

  2. Advanced turbine study

    Science.gov (United States)

    Castro, J. H.

    1985-01-01

    The feasibility of an advanced convective cooling concept applied to rocket turbine airfoils which operate in a high pressure hydrogen and methane environment was investigated. The concept consists of a central structural member in which grooves are machined. The grooves are temporarily filled with a removable filler and the entire airfoil is covered with a layer of electroformed nickel, or nickel base alloy. After removal of the filler, the low thermal resistance of the nickel closure causes the wall temperature to be reduced by heat transfer to the coolant. The program is divided into the following tasks: (1) turbine performance appraisal; (2) coolant geometry evaluation; (3) test hardware design and analysis; and (4) test airfoil fabrication.

  3. Advances in FDTD computational electrodynamics photonics and nanotechnology

    CERN Document Server

    Oskooi, Ardavan; Johnson, Steven G

    2013-01-01

    Advances in photonics and nanotechnology have the potential to revolutionize humanity's ability to communicate and compute. To pursue these advances, it is mandatory to understand and properly model interactions of light with materials such as silicon and gold at the nanoscale, i.e., the span of a few tens of atoms laid side by side. These interactions are governed by the fundamental Maxwell's equations of classical electrodynamics, supplemented by quantum electrodynamics. This book presents the current state-of-the-art in formulating and implementing computational models of these interactions. Maxwell's equations are solved using the finite-difference time-domain (FDTD) technique, pioneered by the senior editor, whose prior Artech books in this area are among the top ten most-cited in the history of engineering. You discover the most important advances in all areas of FDTD and PSTD computational modeling of electromagnetic wave interactions. This cutting-edge resource helps you understand the latest develo...

  4. Notion Of Artificial Labs Slow Global Warming And Advancing Engine Studies Perspectives On A Computational Experiment On Dual-Fuel Compression-Ignition Engine Research

    Directory of Open Access Journals (Sweden)

    Tonye K. Jack

    2017-06-01

    Full Text Available To appreciate clean energy applications of the dual-fuel internal combustion engine (D-FICE) with pilot Diesel fuel, and to aid public policy formulation on present and future benefits to modern transportation, stationary power, and the promotion of oil and gas green-drilling, the brief to an engine research team was to investigate the feasible advantages of dual-fuel compression-ignition engines, guided by the following concerns: (i) sustainable fuel and engine power delivery; (ii) the requirements for fuel flexibility; (iii) low exhaust emissions and environmental pollution; (iv) achieving low specific fuel consumption and economy for maximum power; (v) the comparative advantages over conventional Diesel engines; (vi) thermo-economic modeling and analysis for the optimal blend as a basis for a benefit-cost evaluation. The work was planned in two stages for reduced cost and fast turnaround of results: an initial preliminary stage with basic simple models, and an advanced stage with more detailed complex modeling. The paper describes a simplified MATLAB-based computational experiment predictive model for the thermodynamic combustion and engine performance analysis of dual-fuel compression-ignition engine studies operating on the theoretical limited-pressure cycle with several alternative fuel blends. Environmental implications for extreme temperature moderation are considered by finite-time thermodynamic modeling for maximum power, with predictions for pollutant formation and control by reaction-rate kinetics analysis of systematically reduced plausible coupled chemistry models through the NCN reaction pathway for the gas-phase reaction classes of interest. Controllable variables for engine-out pollutant emissions reduction, and in particular NOx elimination, are identified.
Verifications and validations (V&V) through performance comparisons were made using a clinical approach in the selection of stroke/bore ratios greater than or equal to one (≥1), low-to-high engine speeds and medium
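    The theoretical limited-pressure (dual) cycle referenced above has a standard air-standard efficiency; a minimal sketch of that textbook relation, with illustrative parameter values rather than the paper's:

```python
def dual_cycle_efficiency(r, alpha, beta, gamma=1.4):
    """Air-standard efficiency of the limited-pressure (dual) cycle.

    r: compression ratio, alpha: constant-volume pressure-rise ratio,
    beta: constant-pressure cutoff ratio, gamma: specific-heat ratio.
    Reduces to the Otto cycle at beta = 1 and the Diesel cycle at alpha = 1.
    """
    return 1.0 - (1.0 / r ** (gamma - 1.0)) * (
        (alpha * beta ** gamma - 1.0)
        / ((alpha - 1.0) + gamma * alpha * (beta - 1.0))
    )

print(round(dual_cycle_efficiency(r=16.0, alpha=1.5, beta=1.2), 3))  # 0.664
```

    A thermo-economic analysis would sweep r, alpha and beta over the feasible blend range and weigh the efficiency gains against cost.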

  5. Computational Center for Studies of Microturbulence

    International Nuclear Information System (INIS)

    William Dorland

    2006-01-01

    The Maryland Computational Center for Studies of Microturbulence (CCSM) was one component of a larger, multi-institutional Plasma Microturbulence Project, funded through what eventually became DOE's Scientific Discovery Through Advanced Computing Program. The primary focus of research in CCSM was to develop, deploy, maintain, and utilize kinetic simulation techniques, especially the gyrokinetic code called GS2

  6. Prediction of incomplete primary debulking surgery in patients with advanced ovarian cancer: An external validation study of three models using computed tomography.

    Science.gov (United States)

    Rutten, Iris J G; van de Laar, Rafli; Kruitwagen, Roy F P M; Bakers, Frans C H; Ploegmakers, Marieke J M; Pappot, Teun W F; Beets-Tan, Regina G H; Massuger, Leon F A G; Zusterzeel, Petra L M; Van Gorp, Toon

    2016-01-01

    To test the ability of three prospectively developed computed tomography (CT) models to predict incomplete primary debulking surgery in patients with advanced (International Federation of Gynecology and Obstetrics stages III-IV) ovarian cancer. Three prediction models to predict incomplete surgery (any tumor residual >1 cm in diameter) previously published by Ferrandina (models A and B) and by Gerestein were applied to a validation cohort consisting of 151 patients with advanced epithelial ovarian cancer. All patients were treated with primary debulking surgery in the Eastern part of the Netherlands between 2000 and 2009 and data were retrospectively collected. Three individual readers evaluated the radiographic parameters and gave a subjective assessment. Using the predicted probabilities from the models, the area under the curve (AUC), which represents the discriminative ability of a model, was calculated. The AUC of the Ferrandina models was 0.56, 0.59 and 0.59 in model A, and 0.55, 0.60 and 0.59 in model B for readers 1, 2 and 3, respectively. The AUC of Gerestein's model was 0.69, 0.61 and 0.69 for readers 1, 2 and 3, respectively. AUC values of 0.69 and 0.63 were found for subjective assessment by readers 1 and 3. Models to predict incomplete surgery in advanced ovarian cancer have limited predictive ability and their reproducibility is questionable. Subjective assessment seems as successful as applying predictive models. Present prediction models are not reliable enough to be used in clinical decision-making and should be interpreted with caution. Copyright © 2015 Elsevier Inc. All rights reserved.
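    The discriminative ability reported above is the area under the ROC curve (AUC). A minimal sketch of computing AUC from predicted probabilities via its rank (Mann-Whitney) interpretation, using made-up data rather than the study's:

```python
def auc(labels, scores):
    """AUC as the probability that a randomly chosen positive case
    receives a higher score than a randomly chosen negative case
    (ties count as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical model output: 1 = incomplete debulking, 0 = complete.
labels = [1, 1, 0, 0, 1, 0]
scores = [0.80, 0.60, 0.40, 0.30, 0.50, 0.55]
print(round(auc(labels, scores), 3))  # 0.889; 0.5 would be chance level
```

    Against this scale, the reported AUCs of 0.55-0.69 sit only modestly above chance, which is why the authors judge the models unreliable for clinical decision-making.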

  7. 9th International Conference on Advanced Computing & Communication Technologies

    CERN Document Server

    Mandal, Jyotsna; Auluck, Nitin; Nagarajaram, H

    2016-01-01

    This book highlights a collection of high-quality peer-reviewed research papers presented at the Ninth International Conference on Advanced Computing & Communication Technologies (ICACCT-2015) held at Asia Pacific Institute of Information Technology, Panipat, India during 27–29 November 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the field of Advanced Computing and Communication Technology.

  8. Advanced Clothing Studies

    Science.gov (United States)

    Orndoff, Evelyne; Poritz, Darwin

    2014-01-01

    All human space missions require significant logistical mass and volume that add an unprecedented burden on long-duration missions beyond low-Earth orbit. For these missions with limited cleaning resources, a new wardrobe must be developed to reduce this logistical burden by reducing clothing mass and extending clothing wear. The present studies have been undertaken, for the first time, to measure length of wear and to assess the acceptance of such extended wear. Garments in these studies are commercially available exercise T-shirts and shorts, routine-wear T-shirts, and long-sleeved pullover shirts. Fabric composition (cotton, polyester, light-weight, superfine Merino wool, modacrylic, cotton/rayon, polyester/Cocona, modacrylic/Xstatic, modacrylic/rayon, modacrylic/lyocell/aramid), construction (open knit, tight knit, open weave, tight weave), and finishing treatment (none, quaternary ammonium salt) are the independent variables. Eleven studies are reported here: five studies of exercise T-shirts, three of exercise shorts, two of routine-wear T-shirts, and one of shirts used as sleep-wear. All studies are conducted in a climate-controlled environment, similar to a space vehicle's. For exercise clothing, study participants wear the garments during aerobic exercise. For routine-wear clothing, study participants wear the T-shirts daily in an office or laboratory. Daily questionnaires collected data on ordinal preferences of nine sensory elements and on reasons for retiring a used garment. Study 1 compares knitted cotton, polyester, and Merino exercise T-shirts (61 participants); study 2, knitted polyester, modacrylic, and polyester/Cocona exercise T-shirts (40 participants); study 3, cotton and polyester exercise shorts, knitted and woven (70 participants), all three using factorial experimental designs with and without a finishing treatment, conducted at the Johnson Space Center, sharing study participants. Study 4 compares knitted polyester and ZQ Merino exercise T

  9. Innovations and Advances in Computer, Information, Systems Sciences, and Engineering

    CERN Document Server

    Sobh, Tarek

    2013-01-01

    Innovations and Advances in Computer, Information, Systems Sciences, and Engineering includes the proceedings of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2011). The contents of this book are a set of rigorously reviewed, world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of  Industrial Electronics, Technology and Automation, Telecommunications and Networking, Systems, Computing Sciences and Software Engineering, Engineering Education, Instructional Technology, Assessment, and E-learning.

  10. Advances in computers dependable and secure systems engineering

    CERN Document Server

    Hurson, Ali

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m

  11. Special issue on advances in computer entertainment: editorial

    NARCIS (Netherlands)

    Romão, T.; Romão, Teresa; Nijholt, Antinus; Cheok, J.D.; Cheok, Adrian David

    2015-01-01

    This special issue of the International Journal of Arts and Technology comprises a selection of papers from ACE 2012, the 9th International Conference on Advances in Computer Entertainment (Nijholt et al., 2012). ACE is the leading scientific forum for dissemination of cutting-edge research results

  12. Advances in Computer Entertainment. 10th International Conference, ACE 2013

    NARCIS (Netherlands)

    Reidsma, Dennis; Katayose, H.; Nijholt, Antinus; Unknown, [Unknown

    2013-01-01

    These are the proceedings of the 10th International Conference on Advances in Computer Entertainment (ACE 2013), hosted by the Human Media Interaction research group of the Centre for Telematics and Information Technology at the University of Twente, The Netherlands. The ACE series of conferences,

  13. [Activities of Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  14. 2014 National Workshop on Advances in Communication and Computing

    CERN Document Server

    Prasanna, S; Sarma, Kandarpa; Saikia, Navajit

    2015-01-01

    The present volume is a compilation of research work in computation, communication, vision sciences, device design, fabrication, upcoming materials and related process design, etc. It is derived from selected manuscripts submitted to the 2014 National Workshop on Advances in Communication and Computing (WACC 2014), Assam Engineering College, Guwahati, Assam, India, which is emerging as a premier platform for discussion and dissemination of know-how in this part of the world. The papers included in the volume are indicative of the recent thrust in computation, communications and emerging technologies. Certain recent advances in ZnO nanostructures for alternate energy generation provide emerging insights into an area that has promise for the energy sector, including conservation and green technology. Similarly, scholarly contributions have focused on malware detection and related issues. Several contributions have focused on biomedical aspects including contributions related to cancer detection using act...

  15. Advanced computational method for studying molecular vibrations and spectra for symmetrical systems with many degrees of freedom, and its application to fullerene

    Science.gov (United States)

    Bogush, Igor; Ciobu, Victor; Paladi, Florentin

    2017-10-01

    A computational method for studying molecular vibrations and spectra for symmetrical systems with many degrees of freedom was developed. The algorithm allows overcoming difficulties on the automation of calculus related to the symmetry determination of such oscillations in complex systems with many degrees of freedom. One can find symmetrized displacements and, consequently, obtain and classify normal oscillations and their frequencies. The problem is therefore reduced to the determination of eigenvectors by common numerical methods, and the algorithm simplifies the procedure of symmetry determination for normal oscillations. The proposed method was applied to studying molecular vibrations and spectra of the fullerene molecule C60, and the comparison of theoretical results with experimental data is drawn. The computational method can be further extended to other problems of group theory in physics with applications in clusters and nanostructured materials.
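    The reduction to an eigenvector problem described above can be illustrated on a toy system. A minimal sketch (a hypothetical three-mass spring chain, not the C60 calculation): diagonalizing the force-constant (Hessian) matrix yields the normal modes and their frequencies.

```python
import numpy as np

# Hypothetical 1D chain of three unit masses joined by unit-stiffness springs.
hessian = np.array([[ 1.0, -1.0,  0.0],
                    [-1.0,  2.0, -1.0],
                    [ 0.0, -1.0,  1.0]])

# Normal oscillations are the eigenvectors; for unit masses the squared
# angular frequencies are the eigenvalues (omega**2 = lambda).
eigvals, eigvecs = np.linalg.eigh(hessian)
frequencies = np.sqrt(np.clip(eigvals, 0.0, None))

# One zero-frequency mode (rigid translation) and two vibrational modes.
print(frequencies)  # approximately [0, 1, sqrt(3)]
```

    For a highly symmetric molecule like C60, the method in the paper first block-diagonalizes such a matrix using symmetrized displacements, so each much smaller block can be diagonalized separately and the modes classified by symmetry.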

  16. Advanced computer graphics techniques as applied to the nuclear industry

    International Nuclear Information System (INIS)

    Thomas, J.J.; Koontz, A.S.

    1985-08-01

    Computer graphics is a rapidly advancing technological area in computer science. This is being motivated by increased hardware capability coupled with reduced hardware costs. This paper will cover six topics in computer graphics, with examples forecasting how each of these capabilities could be used in the nuclear industry. These topics are: (1) Image Realism with Surfaces and Transparency; (2) Computer Graphics Motion; (3) Graphics Resolution Issues and Examples; (4) Iconic Interaction; (5) Graphic Workstations; and (6) Data Fusion - illustrating data coming from numerous sources, for display through high dimensional, greater than 3-D, graphics. All topics will be discussed using extensive examples with slides, video tapes, and movies. Illustrations have been omitted from the paper due to the complexity of color reproduction. 11 refs., 2 figs., 3 tabs

  17. Performance evaluation of scientific programs on advanced architecture computers

    International Nuclear Information System (INIS)

    Walker, D.W.; Messina, P.; Baille, C.F.

    1988-01-01

    Recently a number of advanced architecture machines have become commercially available. These new machines promise better cost-performance than traditional computers, and some of them have the potential of competing with current supercomputers, such as the Cray X-MP, in terms of maximum performance. This paper describes an on-going project to evaluate a broad range of advanced architecture computers using a number of complete scientific application programs. The computers to be evaluated include distributed-memory machines such as the NCUBE, INTEL and Caltech/JPL hypercubes, and the MEIKO computing surface; shared-memory, bus architecture machines such as the Sequent Balance and the Alliant; very long instruction word machines such as the Multiflow Trace 7/200 computer; traditional supercomputers such as the Cray X-MP and Cray-2; and SIMD machines such as the Connection Machine. Currently 11 application codes from a number of scientific disciplines have been selected, although it is not intended to run all codes on all machines. Results are presented for two of the codes (QCD and missile tracking), and future work is proposed.

  18. Activities of the Research Institute for Advanced Computer Science

    Science.gov (United States)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  19. Teaching advance care planning to medical students with a computer-based decision aid.

    Science.gov (United States)

    Green, Michael J; Levi, Benjamin H

    2011-03-01

    Discussing end-of-life decisions with cancer patients is a crucial skill for physicians. This article reports findings from a pilot study evaluating the effectiveness of a computer-based decision aid for teaching medical students about advance care planning. Second-year medical students at a single medical school were randomized to use a standard advance directive or a computer-based decision aid to help patients with advance care planning. Students' knowledge, skills, and satisfaction were measured by self-report; their performance was rated by patients. 121/133 (91%) of students participated. The Decision-Aid Group (n = 60) outperformed the Standard Group (n = 61) in terms of students' knowledge (p satisfaction with their learning experience (p student performance. Use of a computer-based decision aid may be an effective way to teach medical students how to discuss advance care planning with cancer patients.

  20. Advances in Cross-Cutting Ideas for Computational Climate Science

    Energy Technology Data Exchange (ETDEWEB)

    Ng, Esmond [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Evans, Katherine J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Caldwell, Peter [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hoffman, Forrest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Jackson, Charles [Univ. of Texas, Austin, TX (United States); Kerstin, Van Dam [Brookhaven National Lab. (BNL), Upton, NY (United States); Leung, Ruby [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Martin, Daniel F. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ostrouchov, George [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tuminaro, Raymond [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ullrich, Paul [Univ. of California, Davis, CA (United States); Wild, S. [Argonne National Lab. (ANL), Argonne, IL (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-01-01

    This report presents results from the DOE-sponsored workshop titled "Advancing X-Cutting Ideas for Computational Climate Science Workshop," known as AXICCS, held on September 12-13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could advance climate science significantly emerged from the workshop discussions. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity yet also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for

  2. Advanced Computing for 21st Century Accelerator Science and Technology

    International Nuclear Information System (INIS)

    Dragt, Alex J.

    2004-01-01

    Dr. Dragt of the University of Maryland is one of the Institutional Principal Investigators for the SciDAC Accelerator Modeling Project "Advanced Computing for 21st Century Accelerator Science and Technology," whose principal investigators are Dr. Kwok Ko (Stanford Linear Accelerator Center) and Dr. Robert Ryne (Lawrence Berkeley National Laboratory). This report covers the activities of Dr. Dragt while at Berkeley during spring 2002 and at Maryland during fall 2003.

  3. Condition Monitoring Through Advanced Sensor and Computational Technology

    International Nuclear Information System (INIS)

    Kim, Jung Taek; Park, Won Man; Kim, Jung Soo; Seong, Soeng Hwan; Hur, Sub; Cho, Jae Hwan; Jung, Hyung Gue

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from the Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and the piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of the processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties

  4. Computational experiment approach to advanced secondary mathematics curriculum

    CERN Document Server

    Abramovich, Sergei

    2014-01-01

    This book promotes the experimental mathematics approach in the context of secondary mathematics curriculum by exploring mathematical models depending on parameters that were typically considered advanced in the pre-digital education era. This approach, by drawing on the power of computers to perform numerical computations and graphical constructions, stimulates formal learning of mathematics through making sense of a computational experiment. It allows one (in the spirit of Freudenthal) to bridge serious mathematical content and contemporary teaching practice. In other words, the notion of teaching experiment can be extended to include a true mathematical experiment. When used appropriately, the approach creates conditions for collateral learning (in the spirit of Dewey) to occur including the development of skills important for engineering applications of mathematics. In the context of a mathematics teacher education program, this book addresses a call for the preparation of teachers capable of utilizing mo...

  5. Computational methods of the Advanced Fluid Dynamics Model

    International Nuclear Information System (INIS)

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development
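
The fractional-step time integration named in this record can be illustrated with a minimal sketch. This is a generic operator-splitting example under simplifying assumptions (a scalar linear decay problem), not AFDM's actual multi-field hydrodynamics scheme: each time step advances the solution by applying the two "operators" in sequence, here with one implicit (backward) Euler sub-step each.

```python
import math

def fractional_step(u0, k1, k2, dt, nsteps):
    """Integrate du/dt = -(k1 + k2) * u by operator splitting: within each
    time step, first advance du/dt = -k1*u, then du/dt = -k2*u, using one
    implicit (backward) Euler sub-step for each operator."""
    u = u0
    for _ in range(nsteps):
        u = u / (1.0 + k1 * dt)   # sub-step 1: implicit Euler for -k1*u
        u = u / (1.0 + k2 * dt)   # sub-step 2: implicit Euler for -k2*u
    return u

u0, k1, k2 = 1.0, 0.7, 0.3
exact = u0 * math.exp(-(k1 + k2) * 1.0)          # exact solution at t = 1
approx = fractional_step(u0, k1, k2, dt=0.001, nsteps=1000)
print(abs(approx - exact))                        # small O(dt) error
```

Because these two linear operators commute, the splitting itself introduces no error here; the O(dt) discrepancy comes only from the implicit Euler sub-steps. For non-commuting operators (as in real fluid dynamics), sequential splitting adds its own first-order error per step.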

  6. Review of research on advanced computational science in FY2015

    International Nuclear Information System (INIS)

    2017-01-01

    Research on advanced computational science for nuclear applications, based on the 'Plan to Achieve Medium- to Long-term Objectives of the Japan Atomic Energy Agency (Medium- to Long-term Plan)', has been performed at the Center for Computational Science and e-Systems (CCSE), Japan Atomic Energy Agency. CCSE established a committee of outside experts and authorities to evaluate the research and provide advice in support of the research and development. This report summarizes the following: (1) results of the R and D performed at CCSE in FY 2015 (April 1st, 2015 - March 31st, 2016), and (2) results of the committee's evaluation of that R and D in FY 2015 (April 1st, 2015 - March 31st, 2016). (author)

  7. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...

  8. Fermilab advanced computer program multi-microprocessor project

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Biel, J.

    1985-06-01

    Fermilab's Advanced Computer Program is constructing a powerful 128 node multi-microprocessor system for data analysis in high-energy physics. The system will use commercial 32-bit microprocessors programmed in Fortran-77. Extensive software supports easy migration of user applications from a uniprocessor environment to the multiprocessor and provides sophisticated program development, debugging, and error handling and recovery tools. This system is designed to be readily copied, providing computing cost effectiveness of below $2200 per VAX 11/780 equivalent. The low cost, commercial availability, compatibility with off-line analysis programs, and high data bandwidths (up to 160 MByte/sec) make the system an ideal choice for applications to on-line triggers as well as an offline data processor

  9. Advanced computational modeling for in vitro nanomaterial dosimetry.

    Science.gov (United States)

    DeLoid, Glen M; Cohen, Joel M; Pyrgiotakis, Georgios; Pirela, Sandra V; Pal, Anoop; Liu, Jiying; Srebric, Jelena; Demokritou, Philip

    2015-10-24

    -affinity binding resulted in faster and eventual complete deposition of material. The advanced models presented provide practical and robust tools for obtaining accurate dose metrics and concentration profiles across the well, for high-throughput screening of ENMs. The DG model allows rapid modeling that accommodates polydispersity, dissolution, and adsorption. Results of the adsorption studies suggest that a reflective lower boundary condition is appropriate for modeling most in vitro ENM exposures.

  10. The ACP (Advanced Computer Program) multiprocessor system at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Case, G.; Cook, A.; Fischler, M.; Gaines, I.; Hance, R.; Husby, D.

    1986-09-01

    The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build including 2 Mbytes of on board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32 bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating-point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53 processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing "nodes" sit are connected via a high-speed "Branch Bus" to one or more MicroVAX computers which act as hosts handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere.

  11. The ACP [Advanced Computer Program] multiprocessor system at Fermilab

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.

    1986-09-01

    The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build including 2 Mbytes of on board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32 bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating-point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53 processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing "nodes" sit are connected via a high-speed "Branch Bus" to one or more MicroVAX computers which act as hosts handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere

  12. Mirror Advanced Reactor Study (MARS)

    International Nuclear Information System (INIS)

    Logan, B.G.

    1983-01-01

    Progress in a two-year study of a 1200 MWe commercial tandem mirror reactor (MARS - Mirror Advanced Reactor Study) has reached the point where major reactor system technologies are identified. New design features of the magnets, blankets, plug heating systems and direct converter are described. With the innovation of radial drift pumping to maintain low plug density, the reactor recirculating power fraction is reduced to 20%. Dominance of radial ion and impurity losses into the halo permits gridless, circular direct converters to be dramatically reduced in size. Comparisons of MARS with the Starfire tokamak design are made

  13. Advanced nuclear systems. Review study

    International Nuclear Information System (INIS)

    Liebert, Wolfgang; Glaser, Alexander; Pistner, Christoph; Baehr, Roland; Hahn, Lothar

    1999-04-01

    The task of this review study is to provide an overview of the developments in the field of the various advanced nuclear systems, and to create the basis for more comprehensive studies of technology assessment. In an overview, the concepts for advanced nuclear systems pursued worldwide are subdivided into eight subgroups. A coarse screening grid is developed to enable a detailed examination of the selected systems. In addition to a focus on enhanced safety features, further aspects are also taken into consideration, such as a lower proliferation risk, improved economic competitiveness of the facilities, and new possible uses (for instance, easing the waste disposal problem or using fuels other than uranium). The expected time span to realization, the obstacles on the way to a commercially usable reactor, and disposal requirements (as far as they can presently be recognized) also play a substantial role. In the central chapter of this study, the documentation of the representatively selected concepts is evaluated, along with existing technology assessment studies and expert opinions. In a few cases where this appears necessary, the corresponding technical literature, further policy advisory reports, expert statements, and other relevant sources are taken into account. Contradictions, differing assessments, and dissents in the literature, as well as a few unsettled questions, are thus indicated. The potential of advanced nuclear systems with respect to economic, societal, and environmental objectives cannot be measured solely by their intrinsic or comparatively notable technical improvements. The acceptability of novel or improved systems in nuclear technology will have to be judged by their convincing solutions for the crucial questions of safety, nuclear waste and risk of proliferation of nuclear weapons

  14. Evaluation Approaches to Intelligent Computer-Assisted Instruction. Testing Study Group: The Impact of Advances in Artificial Intelligence on Test Development.

    Science.gov (United States)

    Baker, Eva L.

    Some special problems associated with evaluating intelligent computer-assisted instruction (ICAI) programs are addressed. This paper intends to describe alternative approaches to the assessment and improvement of such applications and to provide examples of efforts undertaken and shortfalls. Issues discussed stem chiefly from the technical demands…

  15. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex

  16. Computed tomography findings after radiofrequency ablation in locally advanced pancreatic cancer

    NARCIS (Netherlands)

    Rombouts, Steffi J. E.; Derksen, Tyche C.; Nio, Chung Y.; van Hillegersberg, Richard; van Santvoort, Hjalmar C.; Walma, Marieke S.; Molenaar, Izaak Q.; van Leeuwen, Maarten S.

    2018-01-01

    The purpose of the study was to provide a systematic evaluation of the computed tomography (CT) findings after radiofrequency ablation (RFA) in locally advanced pancreatic cancer (LAPC). Eighteen patients with intra-operative RFA-treated LAPC were included in a prospective case series. All CT-scans

  17. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    Energy Technology Data Exchange (ETDEWEB)

    Reed, Daniel [University of Iowa]; Berzins, Martin [University of Utah]; Pennington, Robert; Sarkar, Vivek [Rice University]; Taylor, Valerie [Texas A&M University]

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  18. Recent advances in the computational chemistry of soft porous crystals.

    Science.gov (United States)

    Fraux, Guillaume; Coudert, François-Xavier

    2017-06-29

    Here we highlight recent progress in the field of computational chemistry of nanoporous materials, focusing on methods and studies that address the extraordinary dynamic nature of these systems: the high flexibility of their frameworks, the large-scale structural changes upon external physical or chemical stimulation, and the presence of defects and disorder. The wide variety of behavior demonstrated in soft porous crystals, including the topical class of metal-organic frameworks, opens new challenges for computational chemistry methods at all scales.

  19. International conference on Advances in Intelligent Control and Innovative Computing

    CERN Document Server

    Castillo, Oscar; Huang, Xu; Intelligent Control and Innovative Computing

    2012-01-01

    In the lightning-fast world of intelligent control and cutting-edge computing, it is vitally important to stay abreast of developments that seem to follow each other without pause. This publication features the very latest and some of the very best current research in the field, with 32 revised and extended research articles written by prominent researchers in the field. Culled from contributions to the key 2011 conference Advances in Intelligent Control and Innovative Computing, held in Hong Kong, the articles deal with a wealth of relevant topics, from the most recent work in artificial intelligence and decision-supporting systems, to automated planning, modelling and simulation, signal processing, and industrial applications. Not only does this work communicate the current state of the art in intelligent control and innovative computing, it is also an illuminating guide to up-to-date topics for researchers and graduate students in the field. The quality of the contents is absolutely assured by the high pro...

  20. Computational brain models: Advances from system biology and future challenges

    Directory of Open Access Journals (Sweden)

    George E. Barreto

    2015-02-01

    Full Text Available Computational brain models focused on the interactions between neurons and astrocytes, modeled via metabolic reconstructions, are reviewed. The large source of experimental data provided by the -omics techniques and the advance/application of computational and data-management tools have been fundamental, for instance, in the understanding of the crosstalk between these cells, the key neuroprotective mechanisms mediated by astrocytes in specific metabolic scenarios (1), and the identification of biomarkers for neurodegenerative diseases (2,3). However, the modeling of these interactions demands a clear view of the metabolic and signaling pathways implicated, but most of them are controversial and are still under evaluation (4). Hence, to gain insight into the complexity of these interactions, a current view of the main pathways implicated in neuron-astrocyte communication processes has been assembled from recent experimental reports and reviews. Furthermore, target problems, limitations, and main conclusions have been identified from metabolic models of the brain reported since 2010. Finally, key aspects to take into account in the development of a computational model of the brain, and topics that could be approached from a systems biology perspective in future research, are highlighted.

  1. Advanced Collaborative Emissions Study (ACES)

    Energy Technology Data Exchange (ETDEWEB)

    Greenbaum, Daniel; Costantini, Maria; Van Erp, Annemoon; Shaikh, Rashid; Bailey, Brent; Tennant, Chris; Khalek, Imad; Mauderly, Joe; McDonald, Jacob; Zielinska, Barbara; Bemis, Jeffrey; Storey, John; Hallberg, Lance; Clark, Nigel

    2013-12-31

    The objective of the Advanced Collaborative Emissions Study (ACES) was to determine before widespread commercial deployment whether or not the new, energy-efficient, heavy duty diesel engines (2007 and 2010 EPA Emissions Standards Compliant) may generate anticipated toxic emissions that could adversely affect the environment and human health. ACES was planned to take place in three phases. In Phase 1, extensive emissions characterization of four production-intent prototype engine and control systems designed to meet 2007 standards for nitrogen oxides (NOx) and particulate matter (PM) was conducted at an existing emissions characterization facility: Southwest Research Institute (SwRI). One of the tested engines was selected (at random, after careful comparison of results) for health testing in Phase 3. In Phase 2, extensive emission characterization of three production-intent prototype engine and control systems meeting the 2010 standards (including more advanced NOx controls to meet the more stringent 2010 NOx standards) was conducted at the same test facility. In Phase 3, one engine/aftertreatment system selected from Phase 1 was further characterized during health effects studies (at an existing inhalation toxicology laboratory: Lovelace Respiratory Research Institute, [LRRI]) to form the basis of the ACES safety assessment. The Department of Energy (DOE) award provided funding for emissions characterization in Phases 1 and 2 as well as exposure characterization in Phase 3. The main health analyses in Phase 3 were funded separately and are not reported here.

  2. Extreme Scale Computing Studies

    Science.gov (United States)

    2010-12-01

    partitioned address space. The second category of departmental systems is composed of clusters of PC boxes (often called Beowulf systems), and at first...message passing models as distributed memory machines including low-cost Beowulf clusters became the architecture of choice. In each case, there were...an expert in the field of parallel computer system architecture and parallel programming methods. Dr. Sterling led the Beowulf Project that performed

  3. Advances in Physarum machines sensing and computing with Slime mould

    CERN Document Server

    2016-01-01

    This book is devoted to the slime mould Physarum polycephalum, which is a large single cell capable of distributed sensing, concurrent information processing, parallel computation and decentralized actuation. The ease of culturing and experimenting with Physarum makes this slime mould an ideal substrate for real-world implementations of unconventional sensing and computing devices. The book is a treatise of theoretical and experimental laboratory studies on sensing and computing properties of slime mould, and on the development of mathematical and logical theories of Physarum behavior. It is shown how to make logical gates and circuits, electronic devices (memristors, diodes, transistors, wires, chemical and tactile sensors) with the slime mould. The book demonstrates how to modify properties of Physarum computing circuits with functional nano-particles and polymers, to interface the slime mould with field-programmable arrays, and to use Physarum as a controller of microbial fuel cells. A unique multi-agent model...

  4. SciDAC Advances and Applications in Computational Beam Dynamics

    International Nuclear Information System (INIS)

    Ryne, R.; Abell, D.; Adelmann, A.; Amundson, J.; Bohn, C.; Cary, J.; Colella, P.; Dechow, D.; Decyk, V.; Dragt, A.; Gerber, R.; Habib, S.; Higdon, D.; Katsouleas, T.; Ma, K.-L.; McCorquodale, P.; Mihalcea, D.; Mitchell, C.; Mori, W.; Mottershead, C.T.; Neri, F.; Pogorelov, I.; Qiang, J.; Samulyak, R.; Serafini, D.; Shalf, J.; Siegerist, C.; Spentzouris, P.; Stoltz, P.; Terzic, B.; Venturini, M.; Walstrom, P.

    2005-01-01

    SciDAC has had a major impact on computational beam dynamics and the design of particle accelerators. Particle accelerators--which account for half of the facilities in the DOE Office of Science Facilities for the Future of Science 20 Year Outlook--are crucial for US scientific, industrial, and economic competitiveness. Thanks to SciDAC, accelerator design calculations that were once thought impossible are now carried out routinely, and new challenging and important calculations are within reach. SciDAC accelerator modeling codes are being used to get the most science out of existing facilities, to produce optimal designs for future facilities, and to explore advanced accelerator concepts that may hold the key to qualitatively new ways of accelerating charged particle beams. In this poster we present highlights from the SciDAC Accelerator Science and Technology (AST) project Beam Dynamics focus area in regard to algorithm development, software development, and applications

  5. Advanced information processing system: Inter-computer communication services

    Science.gov (United States)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  6. Computational modeling, optimization and manufacturing simulation of advanced engineering materials

    CERN Document Server

    2016-01-01

    This volume presents recent research work focused on the development of adequate theoretical and numerical formulations to describe the behavior of advanced engineering materials. Particular emphasis is devoted to applications in the fields of biological tissues, phase-changing and porous materials, polymers, and micro/nano scale modeling. Sensitivity analysis and gradient and non-gradient based optimization procedures are involved in many of the chapters, aiming at the solution of constitutive inverse problems and parameter identification. All these relevant topics are exposed by experienced international and inter-institutional research teams, resulting in a high-level compilation. The book is a valuable research reference for scientists, senior undergraduate and graduate students, as well as for engineers acting in the area of computational material modeling.

  7. Introduction to Naval Hydrodynamics using Advanced Computational and Experimental Tools

    Science.gov (United States)

    Buchholz, James; Carrica, Pablo; Russell, Jae-Eun; Pontarelli, Matthew; Krebill, Austin; Berdon, Randall

    2017-11-01

    An undergraduate certificate program in naval hydrodynamics has been recently established at the University of Iowa. Despite several decades of graduate research in this area, this is the first formal introduction to naval hydrodynamics for University of Iowa undergraduate students. Central to the curriculum are two new courses that emphasize open-ended projects conducted in a novel laboratory/learning community that exposes students to advanced tools in computational and experimental fluid mechanics, respectively. Learning is pursued in a loosely-structured environment in which students work in small groups to conduct simulations and experiments relating to resistance, propulsion, and seakeeping using a revised version of the naval hydrodynamics research flow solver, REX, and a small towing tank. Survey responses indicate that the curriculum and course format have strongly increased student interest in naval hydrodynamics and effectively facilitated depth of student learning. This work was supported by the Office of Naval Research under Award Number N00014-15-1-2448.

  8. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop system is under construction. 10 refs., 7 figs
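
The hypercube interconnect mentioned in this record has a simple addressing property that can be sketched in a few lines (an illustration of the topology only, not ACP software): in a d-dimensional hypercube of 2^d crates, two crates are directly linked exactly when their binary addresses differ in a single bit, so a crate's neighbors are found by flipping each address bit in turn.

```python
def hypercube_neighbors(node, dim):
    """Return the nodes directly linked to `node` in a dim-dimensional
    hypercube: XOR flips each of the dim address bits in turn."""
    return [node ^ (1 << bit) for bit in range(dim)]

# In a 4-dimensional hypercube of 16 crates, crate 5 (0b0101) is wired
# directly to the 4 crates whose addresses differ from it in one bit.
print(sorted(hypercube_neighbors(5, 4)))  # [1, 4, 7, 13]
```

Because any two of the 2^d nodes differ in at most d bits, any message needs at most d hops, which is why hypercubes were a popular interconnect for message-passing machines of this era.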

  9. Theoretical Advanced Study Institute: 2014

    Energy Technology Data Exchange (ETDEWEB)

    DeGrand, Thomas [Univ. of Colorado, Boulder, CO (United States)

    2016-08-17

    The Theoretical Advanced Study Institute (TASI) was held at the University of Colorado, Boulder, during June 2-27, 2014. The topic was "Journeys through the Precision Frontier: Amplitudes for Colliders." The organizers were Professors Lance Dixon (SLAC) and Frank Petriello (Northwestern and Argonne). There were fifty-one students. Nineteen lecturers gave sixty 75-minute lectures. A Proceedings was published. This TASI was unique for its large emphasis on methods for calculating amplitudes. This was embedded in a program describing recent theoretical and phenomenological developments in particle physics. Topics included introductions to the Standard Model, to QCD (both in a collider context and on the lattice), effective field theories, Higgs physics, neutrino interactions, an introduction to experimental techniques, and cosmology.

  10. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    Science.gov (United States)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust, and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics, where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high-level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application-neutral options system that provides both human- and machine-readable interfaces based on a single XML schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. 
Custom compiled applications are
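
    The finite element assembly that FEniCS automates for TerraFERMA can be illustrated at toy scale. A hand-assembled sketch (plain Python, not TerraFERMA or FEniCS code) for the 1-D Poisson problem -u'' = 1 on (0,1) with u(0) = u(1) = 0, using piecewise-linear elements and a Thomas-algorithm tridiagonal solve:

```python
def solve_poisson_1d(n=10):
    """P1 finite elements for -u'' = 1 on (0,1), u(0) = u(1) = 0.
    Assembles the tridiagonal stiffness system (diagonal 2/h,
    off-diagonals -1/h, right-hand side h) and solves it with the
    Thomas algorithm. Returns nodal values including the boundaries."""
    h = 1.0 / n
    m = n - 1                      # number of interior unknowns
    diag = [2.0 / h] * m
    off = -1.0 / h                 # constant sub/super-diagonal
    rhs = [h] * m                  # f = 1 integrated against hat functions
    # forward elimination
    for i in range(1, m):
        w = off / diag[i - 1]
        diag[i] -= w * off
        rhs[i] -= w * rhs[i - 1]
    # back substitution
    u = [0.0] * m
    u[-1] = rhs[-1] / diag[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (rhs[i] - off * u[i + 1]) / diag[i]
    return [0.0] + u + [0.0]

u = solve_poisson_1d(8)
# For this problem the nodal values match the exact solution u(x) = x(1-x)/2.
```

    Multi-physics frameworks generate exactly this kind of assembly loop (in compiled form, for arbitrary weak forms and meshes), which is what makes the high-level problem description practical.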

  11. Conceptual study on advanced PWR system

    International Nuclear Information System (INIS)

    Bae, Yoon Young; Chang, M. H.; Yu, K. J.; Lee, D. J.; Cho, B. H.; Kim, H. Y.; Yoon, J. H.; Lee, Y. J.; Kim, J. P.; Park, C. T.; Seo, J. K.; Kang, H. S.; Kim, J. I.; Kim, Y. W.; Kim, Y. H.

    1997-07-01

    In this study, the adoptable essential technologies and reference design concept of the advanced reactor were developed and related basic experiments were performed. 1) Once-through helical steam generator: a performance analysis computer code for the helically coiled steam generator was developed for thermal sizing of the steam generator and determination of thermal-hydraulic parameters. 2) Self-pressurizing pressurizer: a performance analysis computer code for the cold pressurizer was developed. 3) Control rod drive mechanism for fine control: type and function were surveyed. 4) CHF in passive PWR conditions: a prediction model for bundle CHF was developed by introducing a correction factor derived from the database. 5) Passive cooling concepts for concrete containment systems: development of the PCCS heat transfer coefficient. 6) Steam injector concepts: analysis and experiment were conducted. 7) Fluidic diode concepts: analysis and experiment were conducted. 8) Wet thermal insulator: tests for thin steel layers and assessment of materials. 9) Passive residual heat removal system: a performance analysis computer code for the PRHRS was developed and conformance to the EPRI requirements was checked. (author). 18 refs., 55 tabs., 137 figs.

  12. Conceptual study on advanced PWR system

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Yoon Young; Chang, M. H.; Yu, K. J.; Lee, D. J.; Cho, B. H.; Kim, H. Y.; Yoon, J. H.; Lee, Y. J.; Kim, J. P.; Park, C. T.; Seo, J. K.; Kang, H. S.; Kim, J. I.; Kim, Y. W.; Kim, Y. H.

    1997-07-01

    In this study, the adoptable essential technologies and reference design concept of the advanced reactor were developed and related basic experiments were performed. (1) Once-through helical steam generator: a performance analysis computer code for the helically coiled steam generator was developed for thermal sizing of the steam generator and determination of thermal-hydraulic parameters. (2) Self-pressurizing pressurizer: a performance analysis computer code for the cold pressurizer was developed. (3) Control rod drive mechanism for fine control: type and function were surveyed. (4) CHF in passive PWR conditions: a prediction model for bundle CHF was developed by introducing a correction factor derived from the database. (5) Passive cooling concepts for concrete containment systems: development of the PCCS heat transfer coefficient. (6) Steam injector concepts: analysis and experiment were conducted. (7) Fluidic diode concepts: analysis and experiment were conducted. (8) Wet thermal insulator: tests for thin steel layers and assessment of materials. (9) Passive residual heat removal system: a performance analysis computer code for the PRHRS was developed and conformance to the EPRI requirements was checked. (author). 18 refs., 55 tabs., 137 figs.

  13. NATO Advanced Study Institute on Advances in Chemical Reaction Dynamics

    CERN Document Server

    Capellos, Christos

    1986-01-01

    This book contains the formal lectures and contributed papers presented at the NATO Advanced Study Institute on Advances in Chemical Reaction Dynamics. The meeting convened in the city of Iraklion, Crete, Greece on 25 August 1985 and continued to 7 September 1985. The material presented describes fundamental and recent advances in experimental and theoretical aspects of reaction dynamics. A large section is devoted to electronically excited states, ionic species, and free radicals relevant to chemical systems. In addition, recent advances in gas phase polymerization, formation of clusters, and energy release processes in energetic materials were presented. Selected papers deal with topics such as the dynamics of electric field effects in low-polar solutions, high electric field perturbations and relaxation of dipole equilibria, correlation in picosecond/laser pulse scattering, and applications to fast reaction dynamics. Picosecond transient Raman spectroscopy which has been used for the elucidati...

  14. NATO Advanced Study Institute on Superconducting Electronics

    CERN Document Server

    Nisenhoff, Martin; Superconducting Electronics

    1989-01-01

    The genesis of the NATO Advanced Study Institute (ASI) upon which this volume is based occurred during the summer of 1986, when we came to the realization that there had been significant progress during the early 1980s in the field of superconducting electronics and in applications of this technology. Despite this progress, there was a perception among many engineers and scientists that, with the possible exception of a limited number of esoteric fundamental studies and applications (e.g., the Josephson voltage standard or the SQUID magnetometer), there was no significant future for electronic systems incorporating superconducting elements. One of the major reasons for this perception was the aversion to handling liquid helium or including a closed-cycle helium liquefier. In addition, many critics felt that IBM's cancellation of its superconducting computer project in 1983 was "proof" that superconductors could not possibly compete with semiconductors in high-speed signal processing. From our persp...

  15. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    Energy Technology Data Exchange (ETDEWEB)

    Fletcher, James H. [University of North Florida; Cox, Philip [University of North Florida; Harrington, William J [University of North Florida; Campbell, Joseph L [University of North Florida

    2013-09-03

    ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing PROJECT OBJECTIVE The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: To engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF refined existing balance-of-plant component technologies, developed under its current US Army CERDEC project as well as a previous DOE project completed by PolyFuel, to both miniaturize and integrate their functionality and thereby increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  16. Continued rise of the cloud advances and trends in cloud computing

    CERN Document Server

    Mahmood, Zaigham

    2014-01-01

    Cloud computing is no longer a novel paradigm, but instead an increasingly robust and established technology, yet new developments continue to emerge in this area. Continued Rise of the Cloud: Advances and Trends in Cloud Computing captures the state of the art in cloud technologies, infrastructures, and service delivery and deployment models. The book provides guidance and case studies on the development of cloud-based services and infrastructures from an international selection of expert researchers and practitioners. A careful analysis is provided of relevant theoretical frameworks, prac

  17. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications with the emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, bat algorithm, discrete cuckoo search, firefly algorithm, particle swarm optimization, and harmony search as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference f...
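
    Of the nature-inspired algorithms listed, particle swarm optimization is among the simplest to sketch. A minimal 1-D version with standard textbook parameter choices (illustrative only, not code from the volume):

```python
import random

def pso(f, lo, hi, n_particles=20, iters=100, w=0.7, c1=1.4, c2=1.4, seed=42):
    """Minimal 1-D particle swarm optimization: each particle is pulled
    toward its own best position (c1 term) and the swarm's best (c2 term),
    with inertia weight w damping the velocity."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]
    v = [0.0] * n_particles
    pbest = x[:]                       # personal best positions
    pval = [f(p) for p in pbest]
    gbest = min(pbest, key=f)          # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            v[i] = (w * v[i]
                    + c1 * r1 * (pbest[i] - x[i])
                    + c2 * r2 * (gbest - x[i]))
            x[i] += v[i]
            fx = f(x[i])
            if fx < pval[i]:
                pbest[i], pval[i] = x[i], fx
                if fx < f(gbest):
                    gbest = x[i]
    return gbest

best = pso(lambda t: (t - 2.0) ** 2, -10.0, 10.0)
# best converges close to the minimizer t = 2 of this quadratic
```

    The applications surveyed in the book (scheduling, feature selection, layout optimization, and so on) replace the toy objective above with a problem-specific cost function.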

  18. NATO Advanced Study Institute on Nanotechnological Basis for Advanced Sensors

    CERN Document Server

    Reithmaier, Johann Peter; Kulisch, Wilhelm; Popov, Cyril; Petkov, Plamen

    2011-01-01

    Bringing together experts from 15 countries, this book is based on the lectures and contributions of the NATO Advanced Study Institute on “Nanotechnological Basis for Advanced Sensors” held in Sozopol, Bulgaria, 30 May - 11 June, 2010. It gives a broad overview on this topic, and includes articles on: techniques for preparation and characterization of sensor materials; different types of nanoscaled materials for sensor applications, addressing both their structure (nanoparticles, nanocomposites, nanostructured films, etc.) and chemical nature (carbon-based, oxides, glasses, etc.); and on advanced sensors that exploit nanoscience and nanotechnology. In addition, the volume represents an interdisciplinary approach with authors coming from diverse fields such as physics, chemistry, engineering, materials science and biology. A particular strength of the book is its combination of longer papers, introducing the basic knowledge on a certain topic, and brief contributions highlighting special types of sensors a...

  19. NATO Advanced Study Institute on Advanced Physical Oceanographic Numerical Modelling

    CERN Document Server

    1986-01-01

    This book is a direct result of the NATO Advanced Study Institute held in Banyuls-sur-mer, France, June 1985. The Institute had the same title as this book. It was held at Laboratoire Arago. Eighty lecturers and students from almost all NATO countries attended. The purpose was to review the state of the art of physical oceanographic numerical modelling including the parameterization of physical processes. This book represents a cross-section of the lectures presented at the ASI. It covers elementary mathematical aspects through large scale practical aspects of ocean circulation calculations. It does not encompass every facet of the science of oceanographic modelling. We have, however, captured most of the essence of mesoscale and large-scale ocean modelling for blue water and shallow seas. There have been considerable advances in modelling coastal circulation which are not included. The methods section does not include important material on phase and group velocity errors, selection of grid structures, advanc...

  20. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-16

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  1. Oracle joins CERN Openlab to advance grid computing

    CERN Multimedia

    2003-01-01

    "CERN and Oracle Corporation today announced that Oracle is joining the CERN openlab for DataGrid applications to collaborate in creating new grid computing technologies and exploring new computing and data management solutions far beyond today's Internet-based computing" (1 page).

  2. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    Science.gov (United States)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  3. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Phillips, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wampler, Cheryl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meisner, Robert [National Nuclear Security Administration (NNSA), Washington, DC (United States)

    2010-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires balanced resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality, and scientific details); to quantify critical margins and uncertainties; and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  4. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, Robert [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires balanced resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive

  5. Advanced computational model for three-phase slurry reactors

    International Nuclear Information System (INIS)

    Goodarz Ahmadi

    2001-10-01

    In the second year of the project, the Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column is further developed. The approach uses an Eulerian analysis of liquid flows in the bubble column, and makes use of the Lagrangian trajectory analysis for the bubble and particle motions. An experimental setup for studying a two-dimensional bubble column is also developed. The operation of the bubble column is being tested and diagnostic methodology for quantitative measurements is being developed. An Eulerian computational model for the flow condition in the two-dimensional bubble column is also being developed. The liquid and bubble motions are being analyzed and the results are being compared with the experimental data. Solid-fluid mixture flows in ducts and passages at different angles of orientation were analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures are also being studied. Further progress was also made in developing a thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion. The balance laws are obtained and the constitutive laws are being developed. Progress was also made in measuring concentration and velocity of particles of different sizes near a wall in a duct flow. The technique of Phase-Doppler anemometry was used in these studies. The general objective of this project is to provide the needed fundamental understanding of three-phase slurry reactors in Fischer-Tropsch (F-T) liquid fuel synthesis. The other main goal is to develop a computational capability for predicting the transport and processing of three-phase coal slurries. The specific objectives are: (1) To develop a thermodynamically consistent rate-dependent anisotropic model for multiphase slurry flows with and without chemical reaction for application to coal liquefaction. Also establish the

  6. Advanced satellite servicing facility studies

    Science.gov (United States)

    Qualls, Garry D.; Ferebee, Melvin J., Jr.

    1988-01-01

    A NASA-sponsored systems analysis designed to identify and recommend advanced subsystems and technologies specifically for a manned Sun-synchronous platform for satellite management is discussed. An overview of the system design, manned and unmanned servicing facilities, and representative mission scenarios is given. Mission areas discussed include facility-based satellite assembly, checkout, deployment, refueling, repair, and systems upgrade. The ferrying of materials and consumables to and from manufacturing platforms; deorbit, removal, repositioning, or salvage of satellites and debris; and crew rescue of any other manned vehicles are also examined. Impacted subsystems discussed include guidance, navigation and control, propulsion, data management, power, thermal control, structures, life support, and radiation management. In addition, technology issues which would have significant impacts on the system design are discussed.

  7. Parallel computing in genomic research: advances and applications

    Directory of Open Access Journals (Sweden)

    Ocaña K

    2015-11-01

    Full Text Available Kary Ocaña,1 Daniel de Oliveira2 1National Laboratory of Scientific Computing, Petrópolis, Rio de Janeiro, 2Institute of Computing, Fluminense Federal University, Niterói, Brazil Abstract: Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic literature review surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. Keywords: high-performance computing, genomic research, cloud computing, grid computing, cluster computing, parallel computing
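
    The map-style parallelism the review describes, splitting a large set of reads across workers, can be sketched with Python's standard library alone. A toy GC-content computation over a worker pool (a thread pool for brevity; CPU-bound genomics pipelines would typically use a process pool or an HPC scheduler instead):

```python
from concurrent.futures import ThreadPoolExecutor

def gc_content(seq):
    """Fraction of G/C bases in a DNA fragment."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def parallel_gc(reads, workers=4):
    """Fan gc_content out over many reads concurrently and collect
    the results in input order. This is the embarrassingly parallel
    'map' pattern common to genomics pipelines."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(gc_content, reads))

reads = ["ACGT", "GGCC", "ATAT"]
print(parallel_gc(reads))  # -> [0.5, 1.0, 0.0]
```

    Real workloads replace the toy per-read function with an alignment or assembly step, and the pool with a cluster, grid, or cloud back end; the structure of the computation stays the same.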

  8. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-11

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  9. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    Science.gov (United States)

    1985-01-01

    Slides are reproduced that describe the importance of having high-performance number-crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and, in the long term, that Ames knows the best possible solutions for number-crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real-time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  10. Conditional Inference and Advanced Mathematical Study

    Science.gov (United States)

    Inglis, Matthew; Simpson, Adrian

    2008-01-01

    Many mathematicians and curriculum bodies have argued in favour of the theory of formal discipline: that studying advanced mathematics develops one's ability to reason logically. In this paper we explore this view by directly comparing the inferences drawn from abstract conditional statements by advanced mathematics students and well-educated arts…

  11. First Responders Guide to Computer Forensics: Advanced Topics

    National Research Council Canada - National Science Library

    Nolan, Richard; Baker, Marie; Branson, Jake; Hammerstein, Josh; Rush, Kris; Waits, Cal; Schweinsberg, Elizabeth

    2005-01-01

    ... on more advanced technical operations like process characterization and spoofed email. It is designed for experienced security and network professionals who already have a fundamental understanding of forensic methodology...

  12. The Advance of Computing from the Ground to the Cloud

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  13. Advances in soft computing, intelligent robotics and control

    CERN Document Server

    Fullér, Robert

    2014-01-01

    Soft computing, intelligent robotics and control are at the core of contemporary engineering interest. Essential characteristics of soft computing methods are the ability to handle vague information, to apply human-like reasoning, their learning capability, and ease of application. Soft computing techniques are widely applied in the control of dynamic systems, including mobile robots. The present volume is a collection of 20 chapters written by respected experts in these fields, addressing various theoretical and practical aspects of soft computing, intelligent robotics and control. The first part of the book addresses issues in intelligent robotics, including robust fixed point transformation design, experimental verification of the input-output feedback linearization of a differentially driven mobile robot, and applying kinematic synthesis to micro electro-mechanical systems design. The second part of the book is devoted to fundamental aspects of soft computing. This includes practical aspects of fuzzy rule ...

  14. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    2017-07-01

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation, improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation, caused by growing energy production and demand, the penetration of renewable energy, the rapid deployment of smart grid devices, and the trend of operating grids closer to capacity for better efficiency, more and more contingencies must be executed and analyzed quickly to ensure grid reliability and market accuracy. Many researchers have proposed techniques to accelerate the computation of contingency analysis, but little work has been published on how to post-process the large volume of contingency outputs quickly. This paper proposes a parallel post-processing function that analyzes contingency analysis outputs faster and displays them in a web-based visualization tool, helping power engineers work more efficiently by digesting information faster. Case studies using an ESCA-60 bus system and a WECC planning system demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
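The paper's actual post-processing function is not reproduced here, but the general pattern it describes (screening many contingency outputs for limit violations in parallel, then merging the results for display) can be sketched as follows; the record format, branch limits, and worker pool are illustrative assumptions, not details from the paper:

```python
from concurrent.futures import ThreadPoolExecutor

# Assumed record format: (contingency_id, {branch_name: post-contingency flow in MVA}).
# LIMITS holds made-up illustrative branch ratings.
LIMITS = {"line_1": 100.0, "line_2": 80.0}

def screen(record):
    """Return the branch limit violations found in one contingency's output."""
    cid, flows = record
    return [(cid, branch, flow) for branch, flow in flows.items()
            if flow > LIMITS.get(branch, float("inf"))]

def post_process(records, workers=4):
    """Screen contingency outputs concurrently and merge the violation lists;
    a production tool would likely use processes or MPI rather than threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        violations = []
        for partial in pool.map(screen, records):  # map preserves input order
            violations.extend(partial)
    return violations
```

The merged list would then feed a visualization layer for fast inspection.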

  15. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    International Nuclear Information System (INIS)

    Ahmadi, Goodarz

    2004-01-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column, and made use of the Lagrangian trajectory analysis for the bubbles and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentration and velocity of particles of different sizes near a wall in a duct flow were also measured. The technique of Phase-Doppler anemometry was used in these studies. An Eulerian volume of fluid (VOF) computational model for the flow condition in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied. The simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.
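The report's solver is far more elaborate, but the basic Lagrangian step it builds on (integrating a bubble's equation of motion through a prescribed Eulerian liquid field with drag and buoyancy) can be sketched as below; the flow field, response time, and explicit-Euler scheme are illustrative assumptions, not the report's model:

```python
G = 9.81      # gravitational acceleration, m/s^2
TAU = 0.05    # assumed bubble velocity response time, s

def liquid_velocity(x, y):
    """Prescribed (frozen) Eulerian liquid field: a gentle solid-body swirl."""
    return -0.1 * y, 0.1 * x

def advance_bubble(x, y, u, v, dt, steps):
    """Explicit-Euler integration of du/dt = (u_liquid - u)/TAU, with a
    buoyancy term that accelerates the bubble upward (+y direction)."""
    for _ in range(steps):
        ul, vl = liquid_velocity(x, y)
        u += dt * (ul - u) / TAU
        v += dt * ((vl - v) / TAU + G)
        x += dt * u
        y += dt * v
    return x, y, u, v
```

A full two-way-coupled model would also feed the bubble drag back into the Eulerian liquid equations and handle bubble-bubble collisions, as the report describes.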

  16. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    Energy Technology Data Exchange (ETDEWEB)

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column, and made use of the Lagrangian trajectory analysis for the bubbles and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentration and velocity of particles of different sizes near a wall in a duct flow were also measured. The technique of Phase-Doppler anemometry was used in these studies. An Eulerian volume of fluid (VOF) computational model for the flow condition in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied. The simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.

  17. RECENT ADVANCES IN COMPUTATIONAL MECHANICS FOR CIVIL ENGINEERING

    Science.gov (United States)

    Applied Mechanics Committee, Computational Mechanics Subcommittee,

    In order to clarify mechanical phenomena in civil engineering, computational theory and techniques must be improved with the particular characteristics of the objects being analyzed in mind, and computational mechanics must be updated with a focus on practical use. Beyond the analysis of infrastructure, damage prediction for natural disasters such as earthquakes, tsunamis and floods must reflect the broad ranges in space and time inherent to civil engineering, as well as material properties, so it is important to develop new computational methods suited to the particularities of the field. In this context, this paper reviews research trends in computational mechanics methods that are promising for resolving complex mechanics problems in civil engineering.

  18. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  19. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  20. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  1. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    International Nuclear Information System (INIS)

    McCoy, M; Kusnezov, D; Bickel, T; Hopson, J

    2007-01-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  2. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  3. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  4. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  5. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  6. Advanced methods in twin studies.

    Science.gov (United States)

    Kaprio, Jaakko; Silventoinen, Karri

    2011-01-01

    While twin studies have been used to estimate the heritability of different traits and disorders since the beginning of the twentieth century, statistical developments over the past 20 years and more extensive and systematic data collection have greatly expanded the scope of twin studies. This chapter reviews selected possibilities of twin study designs to address specific hypotheses regarding the role of both genetic and environmental factors in the development of traits and diseases. In addition to modelling latent genetic influences, current models permit inclusion of information on specific genetic variants, measured environmental factors and their interactive effects. Examples from studies of anthropometric traits are used to illustrate such approaches.
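As a concrete instance of the classical twin-model arithmetic this chapter builds on, the textbook Falconer estimate derives heritability from the difference between monozygotic and dizygotic twin correlations (a standard formula, not code from the chapter):

```python
def falconer_heritability(r_mz, r_dz):
    """Falconer's formula: h^2 = 2 * (r_MZ - r_DZ), where r_MZ and r_DZ
    are trait correlations within monozygotic and dizygotic twin pairs."""
    return 2.0 * (r_mz - r_dz)

def shared_environment(r_mz, r_dz):
    """Shared-environment component under the same ACE assumptions:
    c^2 = 2 * r_DZ - r_MZ."""
    return 2.0 * r_dz - r_mz
```

The structural equation models the chapter reviews generalize this simple decomposition to include measured genetic variants, environmental covariates, and interaction effects.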

  7. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    Science.gov (United States)

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust, which can help ...

  8. The Advanced Study of Gymnastics.

    Science.gov (United States)

    Salmela, John H., Ed.

    The sport of artistic gymnastics is viewed from a multidisciplinary point of view. The training, performance, and judgment of the sport undergo specialized study of interest to sport scientists, teachers, coaches, and athletes. Organized into five major sections, the book presents such themes as the psychological, physiological, biomechanical,…

  9. Proceedings of the International Conference on Advances in Computational Mechanics 2017

    CERN Document Server

    Phung-Van, Phuc; Rabczuk, Timon

    2018-01-01

    This book provides an overview of state-of-the-art methods in computational engineering for modeling and simulation. This proceedings volume includes a selection of refereed papers presented at the International Conference on Advances in Computational Mechanics (ACOME) 2017, which took place on Phu Quoc Island, Vietnam on August 2-4, 2017. The contributions highlight recent advances in and innovative applications of computational mechanics. Subjects covered include: biological systems; damage, fracture and failure; flow problems; multiscale multiphysics problems; composites and hybrid structures; optimization and inverse problems; lightweight structures; computational mechatronics; computational dynamics; numerical methods; and high-performance computing. The book is intended for academics, including graduate students and experienced researchers interested in state-of-the-art computational methods for solving challenging problems in engineering.

  10. Advanced commercial Tokamak optimization studies

    International Nuclear Information System (INIS)

    Whitley, R.H.; Berwald, D.H.; Gordon, J.D.

    1985-01-01

    Our recent studies have concentrated on developing optimal high-beta (bean-shaped plasma) commercial tokamak configurations using TRW's Tokamak Reactor Systems Code (TRSC), with special emphasis on lower net electric power reactors that are more easily deployable. A wide range of issues was investigated in the search for the most economic configuration: fusion power, reactor size, wall load, magnet type, inboard blanket and shield thickness, plasma aspect ratio, and operational β value. The costs and configurations of both steady-state and pulsed reactors were also investigated. Optimal small and large reactor concepts were developed and compared by studying the cost of electricity from single units and from multiplexed units. Multiplexed units appear to have advantages because they share some plant equipment and require lower initial capital investment than larger single units.

  11. Infrastructure Systems for Advanced Computing in E-science applications

    Science.gov (United States)

    Terzo, Olivier

    2013-04-01

    In the e-science field there is a growing need for computing infrastructure that is more dynamic and customizable, with an on-demand model of use that follows the exact request in terms of resources and storage capacity. Integrating grid and cloud infrastructure solutions allows services to be offered whose availability adapts by scaling resources up and down. The main challenge for e-science domains will be to implement infrastructure solutions for scientific computing that adapt dynamically to the demand for computing resources, with a strong emphasis on optimizing resource use in order to reduce investment costs. Instrumentation, data volumes, algorithms, and analysis all contribute to the increasing complexity of applications that require high processing power and storage for a limited time, often exceeding the computational resources available to most laboratories and research units within an organization. Very often it is necessary to adapt, or even rethink, tools and algorithms, and to consolidate existing applications through a phase of reverse engineering, in order to adapt them for deployment on cloud infrastructure. For example, in areas such as rainfall monitoring, meteorological analysis, hydrometeorology, climatology, bioinformatics, next-generation sequencing, computational electromagnetics, and radio occultation, the complexity of the analysis raises several issues such as processing time, the scheduling of processing tasks, storage of results, and multi-user environments. For these reasons, it is necessary to rethink how e-science applications are written so that they are already adapted to exploit the potential of cloud computing services through the IaaS, PaaS, and SaaS layers. Another important focus is on creating and using hybrid infrastructure, typically a federation between private and public clouds; in this way, when all resources owned by the organization are in use, it will be easy with a federate
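The on-demand up-scaling and down-scaling this abstract describes reduces, at its simplest, to a utilization-driven decision rule of the following shape; the thresholds and node counts are purely illustrative assumptions, not values from the talk:

```python
def scale_decision(queued_jobs, busy_nodes, total_nodes,
                   up_threshold=0.8, down_threshold=0.3, max_nodes=64):
    """Toy elasticity rule: request more nodes when utilization is high and
    work is waiting; release nodes when the cluster is idle. All thresholds
    here are assumed values for illustration."""
    utilization = busy_nodes / total_nodes if total_nodes else 1.0
    if queued_jobs > 0 and utilization >= up_threshold and total_nodes < max_nodes:
        return "scale_up"
    if queued_jobs == 0 and utilization <= down_threshold and total_nodes > 1:
        return "scale_down"
    return "hold"
```

A real IaaS deployment would wire such a rule to the provider's provisioning API and add damping so the cluster does not oscillate between states.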

  12. Advanced Scientific Computing Research Exascale Requirements Review. An Office of Science review sponsored by Advanced Scientific Computing Research, September 27-29, 2016, Rockville, Maryland

    Energy Technology Data Exchange (ETDEWEB)

    Almgren, Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); DeMar, Phil [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Riley, Katherine [Argonne Leadership Computing Facility, Argonne, IL (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bard, Deborah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC); Coffey, Richard [Argonne National Lab. (ANL), Argonne, IL (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Dosanjh, Sudip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hack, James [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Monga, Inder [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Papka, Michael E. [Argonne National Lab. (ANL), Argonne, IL (United States); Rotman, Lauren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Science Network; Straatsma, Tjerk [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wells, Jack [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bethel, Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bosilca, George [Univ. of Tennessee, Knoxville, TN (United States); Cappello, Frank [Argonne National Lab. (ANL), Argonne, IL (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Hill, Judy [Oak Ridge Leadership Computing Facility, Oak Ridge, TN (United States); Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States); McInnes, Lois Curfman [Argonne National Lab. (ANL), Argonne, IL (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Moore, Shirley [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Moreland, Ken [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roser, Rob [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Shende, Sameer [Univ. of Oregon, Eugene, OR (United States); Shipman, Galen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-06-20

The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.

  13. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohr, Bernd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gamblin, Todd [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunst, Holger [Dresden Univ. of Technology (Germany)]

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  14. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  15. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    Energy Technology Data Exchange (ETDEWEB)

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e., Nuclear Gauge Densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurement of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spouted diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurement of the 3D solids flow field, velocity, turbulence parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamics (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design, and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. 
Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains
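
The gamma-ray densitometry proposed above rests on Beer-Lambert attenuation: the drop in transmitted intensity along a beam path encodes the line-averaged solids fraction. The sketch below is illustrative only; the attenuation coefficient, path length, and count values are invented for the example, not project data.

```python
# Hedged sketch of the Beer-Lambert relation behind gamma-ray
# (nuclear gauge) densitometry: I = I0 * exp(-mu * eps_s * L),
# solved for the line-averaged solids holdup eps_s.
import math

def solids_holdup(i_measured, i_empty, mu_solid, path_len):
    """Line-averaged solids fraction along the beam.

    i_measured: transmitted counts with solids in the bed
    i_empty:    transmitted counts through the empty (gas-only) bed
    mu_solid:   linear attenuation coefficient of the solids [1/cm]
    path_len:   chord length through the bed [cm]
    """
    return math.log(i_empty / i_measured) / (mu_solid * path_len)

# Example: a beam attenuated to 60% of its empty-bed intensity
# over a 10 cm chord (illustrative values).
eps = solids_holdup(i_measured=6000.0, i_empty=10000.0,
                    mu_solid=0.2, path_len=10.0)
print(f"solids holdup = {eps:.4f}")
```

In practice the ratio i_empty/i_measured is what makes the technique non-invasive: both intensities are measured outside the vessel, and the holdup follows from a single logarithm.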

  16. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    Science.gov (United States)

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

In the teaching of computer networks, the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization, we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  17. Advances in Computer Science and Information Engineering Volume 2

    CERN Document Server

    Lin, Sally

    2012-01-01

CSIE2012 is an integrated conference concentrating its focus on Computer Science and Information Engineering. In the proceedings, readers will find work on Computer Science and Information Engineering by researchers from around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in these fields. To meet the high quality standards of Springer's AISC series, the organizing committee undertook the following steps. Firstly, papers of poor quality were rejected after review by anonymous expert referees. Secondly, periodic review meetings, around five in all, were held with the reviewers to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of many people and departments, the conference proved successful and fruitful.

  18. Advances in Computer Science and Information Engineering Volume 1

    CERN Document Server

    Lin, Sally

    2012-01-01

CSIE2012 is an integrated conference concentrating its focus on Computer Science and Information Engineering. In the proceedings, readers will find work on Computer Science and Information Engineering by researchers from around the world. The main role of the proceedings is to serve as an exchange platform for researchers working in these fields. To meet the high quality standards of Springer's AISC series, the organizing committee undertook the following steps. Firstly, papers of poor quality were rejected after review by anonymous expert referees. Secondly, periodic review meetings, around five in all, were held with the reviewers to exchange reviewing suggestions. Finally, the conference organizers held several preliminary sessions before the conference. Through the efforts of many people and departments, the conference proved successful and fruitful.

  19. Advances in neural networks computational intelligence for ICT

    CERN Document Server

    Esposito, Anna; Morabito, Francesco; Pasero, Eros

    2016-01-01

This carefully edited book places emphasis on computational and artificial intelligence methods for learning and their applications in robotics, embedded systems, and ICT interfaces for psychological and neurological diseases. The book is a follow-up to the scientific workshop on Neural Networks (WIRN 2015) held in Vietri sul Mare, Italy, from the 20th to the 22nd of May 2015. The workshop, at its 27th edition, has become a traditional scientific event that brings together scientists from many countries and several scientific disciplines. Each chapter is an extended version of the original contribution presented at the workshop and, together with the reviewers’ peer revisions, also benefits from the live discussion during the presentation. The content of the book is organized into the following sections: 1. Introduction, 2. Machine Learning, 3. Artificial Neural Networks: Algorithms and Models, 4. Intelligent Cyberphysical and Embedded Systems, 5. Computational Intelligence Methods for Biomedical ICT in...

  20. Computer Image Generation: Advanced Visual/Sensor Simulation.

    Science.gov (United States)

    1981-10-01

levels of subdivision. 3. The cubic surface may not be appropriate for all surfaces. Flat surfaces may be rendered with slight undulations due to adjacent...dimensions. The alternative is to present cultural objects at a subpixel level, which is simpler but not as dependable because subpixel-size objects will..." Doctoral dissertation, University of Utah, December 1978. Carpenter, L.G., "Computer Rendering of Fractal Curves and Surfaces," Siggraph '80, Special

  1. Advanced Technologies for Human-Computer Interfaces in Mixed Reality

    OpenAIRE

    Marchesi, Marco

    2016-01-01

As human beings, we trust our five senses, which allow us to experience the world and communicate. From birth, the amount of data we can acquire every day is impressive, and such richness reflects the complexity of humankind in arts, technology, etc. The advent of computers and the consequent progress in Data Science and Artificial Intelligence showed how large amounts of data can contain some sort of “intelligence” themselves. Machines learn and create a superimposed layer of reali...

  2. A Computational Cluster for Advanced Plasma Physics Simulations

    Science.gov (United States)

    2010-02-08

were made amongst several cluster manufacturers, including Cray, IBM, Dell, Silicon Mechanics, Rackable Systems, and SiCortex before deciding on the...simulated. The algorithm implements the discontinuous Galerkin method to achieve high-order accuracy and will use body-fitted computational meshes to...APS poster’s work and ICC 2010 made use of the ICE cluster: 2009 APS: ”Plasma Solution Quality in Distorted, Body-Fitted Meshes in SEL/HiFi”, W

  3. Proceedings of the international conference on advances in computer and communication technology

    International Nuclear Information System (INIS)

    Bakal, J.W.; Kunte, A.S.; Walinjkar, P.B.; Karnani, N.K.

    2012-02-01

A nation's development is coupled with the advancement and adoption of new technologies. During the past decade, advancements in computer and communication technologies have grown many-fold. For the growth of any country it is necessary to keep pace with the latest innovations in technology. The International Conference on Advances in Computer and Communication Technology, organised by the Institution of Electronics and Telecommunication Engineers, Mumbai Centre, is an attempt to provide a platform for scientists, engineering students, educators and experts to share their knowledge and discuss their efforts in the field of R&D. The papers relevant to INIS are indexed separately

  4. Parallel computing in genomic research: advances and applications.

    Science.gov (United States)

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

Today's genomic experiments have to process so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments so as to benefit from parallelism techniques and HPC capabilities.
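
As a toy illustration of the parallelism techniques the review surveys, per-read computations in a genomic workload can be fanned out across worker processes. The GC-content task and the synthetic reads below are illustrative assumptions, not material from the article.

```python
# Hedged sketch: parallelize a simple per-sequence computation
# (GC content) across worker processes with the standard library.
from multiprocessing import Pool

def gc_content(seq: str) -> float:
    """Fraction of G/C bases in a nucleotide sequence."""
    return sum(base in "GC" for base in seq) / len(seq)

if __name__ == "__main__":
    # Synthetic stand-in for reads loaded from a FASTA/FASTQ file.
    reads = ["ACGTACGT", "GGGCCC", "ATATATAT", "GCGCATGC"] * 1000
    with Pool(processes=4) as pool:
        gc = pool.map(gc_content, reads)
    print(f"mean GC content over {len(gc)} reads: {sum(gc) / len(gc):.4f}")
```

On real data the reads would be streamed from files, and the per-item work would be heavy enough to amortize the process start-up cost that this tiny example does not.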

  5. Advances in bio-inspired computing for combinatorial optimization problems

    CERN Document Server

    Pintea, Camelia-Mihaela

    2013-01-01

'Advances in Bio-inspired Combinatorial Optimization Problems' illustrates several recent bio-inspired efficient algorithms for solving NP-hard problems. Theoretical bio-inspired concepts and models, in particular for agents, ants and virtual robots, are described. Large-scale optimization problems, for example the Generalized Traveling Salesman Problem and the Railway Traveling Salesman Problem, are solved and their results are discussed. Some of the main concepts and models described in this book are: inner rule to guide ant search - a recent model in ant optimization, heterogeneous sensitive a
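
The ant-based search mentioned above can be sketched compactly for the classical Traveling Salesman Problem. The parameter values and the pheromone-update rule below are generic textbook choices, not the specific "inner rule" or sensitivity models described in the book.

```python
# Hedged sketch of a basic Ant System for a symmetric TSP instance.
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def ant_colony_tsp(dist, n_ants=10, n_iters=50,
                   alpha=1.0, beta=2.0, rho=0.5, q=1.0, seed=0):
    """Return (best_tour, best_length) for a distance matrix."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]          # pheromone trails
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour = [start]
            unvisited = [c for c in range(n) if c != start]
            while unvisited:
                i = tour[-1]
                # Transition rule: pheromone^alpha * (1/distance)^beta.
                weights = [tau[i][j] ** alpha / dist[i][j] ** beta
                           for j in unvisited]
                j = rng.choices(unvisited, weights=weights)[0]
                tour.append(j)
                unvisited.remove(j)
            tours.append((tour, tour_length(tour, dist)))
        # Evaporate, then deposit pheromone proportional to tour quality.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= 1.0 - rho
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
        iter_best = min(tours, key=lambda t: t[1])
        if iter_best[1] < best_len:
            best_tour, best_len = iter_best
    return best_tour, best_len

# Four cities on a unit square; the optimal tour is the perimeter (length 4).
cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[math.dist(a, b) for b in cities] for a in cities]
tour, length = ant_colony_tsp(dist)
print(tour, round(length, 3))
```

The evaporation/deposit step is what distinguishes this from plain randomized greedy search: edges that appear in short tours accumulate pheromone and are sampled more often in later iterations.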

  6. Advanced Demonstration and Test Reactor Options Study

    Energy Technology Data Exchange (ETDEWEB)

    Petti, David Andrew [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hill, R. [Argonne National Lab. (ANL), Argonne, IL (United States); Gehin, J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gougar, Hans David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Heidet, F. [Argonne National Lab. (ANL), Argonne, IL (United States); Kinsey, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Grandy, Christopher [Argonne National Lab. (ANL), Argonne, IL (United States); Qualls, A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brown, Nicholas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoffman, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Croson, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-01-01

Global efforts to address climate change will require large-scale decarbonization of energy production in the United States and elsewhere. Nuclear power already provides 20% of electricity production in the United States (U.S.) and is increasing in countries undergoing rapid growth around the world. Because reliable, grid-stabilizing, low-emission electricity generation, energy security, and energy resource diversity will be increasingly valued, nuclear power’s share of electricity production has the potential to grow. In addition, there are non-electricity applications (e.g., process heat, desalination, hydrogen production) that could be better served by advanced nuclear systems. Thus, the timely development, demonstration, and commercialization of advanced nuclear reactors could diversify the nuclear technologies available and offer attractive technology options to expand the impact of nuclear energy for electricity generation and non-electricity missions. The purpose of this planning study is to provide transparent and defensible technology options for a test and/or demonstration reactor(s) to be built to support public policy, innovation and long-term commercialization within the context of the Department of Energy’s (DOE’s) broader commitment to pursuing an “all of the above” clean energy strategy and associated timelines. This planning study includes identification of the key features and timing needed for advanced test or demonstration reactors to support research, development, and technology demonstration leading to the commercialization of power plants built upon these advanced reactor platforms. This planning study is consistent with the Congressional language contained within the fiscal year 2015 appropriation that directed the DOE to conduct a planning study to evaluate “advanced reactor technology options, capabilities, and requirements within the context of national needs and public policy to support innovation in nuclear energy

  7. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    International Nuclear Information System (INIS)

    Nam, H; Stoitsov, M; Nazarewicz, W; Hagen, G; Kortelainen, M; Pei, J C; Bulgac, A; Maris, P; Vary, J P; Roche, K J; Schunck, N; Thompson, I; Wild, S M

    2012-01-01

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. We illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  8. National facility for advanced computational science: A sustainable path to scientific discovery

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  9. Advanced and intelligent computations in diagnosis and control

    CERN Document Server

    2016-01-01

    This book is devoted to the demands of research and industrial centers for diagnostics, monitoring and decision making systems that result from the increasing complexity of automation and systems, the need to ensure the highest level of reliability and safety, and continuing research and the development of innovative approaches to fault diagnosis. The contributions combine domains of engineering knowledge for diagnosis, including detection, isolation, localization, identification, reconfiguration and fault-tolerant control. The book is divided into six parts:  (I) Fault Detection and Isolation; (II) Estimation and Identification; (III) Robust and Fault Tolerant Control; (IV) Industrial and Medical Diagnostics; (V) Artificial Intelligence; (VI) Expert and Computer Systems.

  10. Advanced Computational Methods for Thermal Radiative Heat Transfer

    Energy Technology Data Exchange (ETDEWEB)

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.,

    2016-10-01

Participating media radiation (PMR) in weapon safety calculations for abnormal thermal environments is too costly to compute routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.
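
The report's specific ROM formulation is not reproduced in the abstract, but the core idea, compressing full-order snapshots into a low-rank basis, can be sketched with a proper orthogonal decomposition (POD) of a 1-D heat-conduction model. The grid size, time step, and rank below are illustrative assumptions, not the authors' setup.

```python
# Illustrative reduced-order-modeling sketch (not the authors' code):
# build a POD basis from snapshots of a 1-D heat equation and check
# a rank-r reconstruction of the full-order state.
import numpy as np

# Full-order model: explicit finite differences for u_t = u_xx.
n, steps, dt = 100, 400, 1e-5
x = np.linspace(0.0, 1.0, n)
u = np.sin(np.pi * x) + 0.5 * np.sin(3 * np.pi * x)
snapshots = []
for _ in range(steps):
    u[1:-1] += dt * (u[2:] - 2 * u[1:-1] + u[:-2]) / (x[1] - x[0]) ** 2
    snapshots.append(u.copy())
S = np.column_stack(snapshots)

# POD: the leading left singular vectors span the dominant dynamics.
U, s, _ = np.linalg.svd(S, full_matrices=False)
r = 2                                # reduced dimension
basis = U[:, :r]

# Project the final state onto the basis and reconstruct it.
u_r = basis @ (basis.T @ u)
err = np.linalg.norm(u - u_r) / np.linalg.norm(u)
print(f"rank-{r} reconstruction error: {err:.2e}")
```

In a production setting the basis would be built offline from a few expensive full-order runs, and the online model would evolve only the r basis coefficients, which is where the cost savings come from.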

  11. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    International Nuclear Information System (INIS)

    Moore, Kevin L.; Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy

  12. Vision 20/20: Automation and advanced computing in clinical radiation oncology.

    Science.gov (United States)

    Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa

    2014-01-01

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  13. Advancements in Violin-Related Human-Computer Interaction

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2014-01-01

Finesse is required while performing with many traditional musical instruments, as they are extremely responsive to human inputs. The violin is specifically examined here, as it excels at translating a performer’s gestures into sound in manners that evoke a wide range of affective qualities. This type of rich responsiveness is simultaneously what makes it so challenging to play, what keeps it interesting to practice for long periods of time, and what makes overcoming these difficulties worthwhile to performer and audience alike. The capability of an instrument to render audible the complexity of human intelligence and emotion is at the core of the Musical Interface Technology Design Space, MITDS. This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human...

  14. Advances in Intelligent Control Systems and Computer Science

    CERN Document Server

    2013-01-01

The conception of real-time control networks that takes into account, as an integrating approach, both the specific aspects of information and knowledge processing and the dynamic and energetic particularities of physical processes and of communication networks represents one of the newest scientific and technological challenges. The new paradigm of Cyber-Physical Systems (CPS) reflects this tendency and will certainly change the evolution of the technology, with major social and economic impact. This book presents significant results in the field of process control and advanced information and knowledge processing, with applications in the fields of robotics, biotechnology, environment, energy, transportation, etc. It introduces intelligent control concepts and strategies as well as real-time implementation aspects for complex control approaches. One of the sections is dedicated to the complex problem of designing software systems for distributed information processing networks. Problems such as complexity an...

  15. Advances in computer-aided design and computer-aided manufacture technology.

    Science.gov (United States)

    Calamia, J R

    1996-01-01

Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: First, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released; consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of its capabilities.

  16. Annual Performance Assessment of Complex Fenestration Systems in Sunny Climates Using Advanced Computer Simulations

    Directory of Open Access Journals (Sweden)

    Chantal Basurto

    2015-12-01

Complex Fenestration Systems (CFS) are advanced daylighting systems that are placed on the upper part of a window to improve the indoor daylight distribution within rooms. Due to their double function of daylight redirection and solar protection, they are considered a solution to mitigate the unfavorable effects of admitting direct sunlight into buildings located in prevailingly sunny climates (risk of glare and overheating). Accordingly, an adequate assessment of their performance should include an annual evaluation of the main aspects relevant to the use of daylight in such regions: the indoor illuminance distribution, thermal comfort, and visual comfort of the occupants. Such evaluation is possible with the use of computer simulations combined with the bi-directional scattering distribution function (BSDF) data of these systems. This study explores the use of available methods to assess the visible and thermal annual performance of five different CFS using advanced computer simulations. To achieve results, on-site daylight monitoring was carried out in a building located in a predominantly sunny climate, and the collected data was used to create and calibrate a virtual model used to carry out the simulations. The results can be employed to select the CFS that improves the visual and thermal interior environment for the occupants.

  17. Advances in neural networks computational and theoretical issues

    CERN Document Server

    Esposito, Anna; Morabito, Francesco

    2015-01-01

    This book collects research works that exploit neural networks and machine learning techniques from a multidisciplinary perspective. Subjects covered include theoretical, methodological and computational topics which are grouped together into chapters devoted to the discussion of novelties and innovations related to the field of Artificial Neural Networks as well as the use of neural networks for applications, pattern recognition, signal processing, and special topics such as the detection and recognition of multimodal emotional expressions and daily cognitive functions, and  bio-inspired memristor-based networks.  Providing insights into the latest research interest from a pool of international experts coming from different research fields, the volume becomes valuable to all those with any interest in a holistic approach to implement believable, autonomous, adaptive, and context-aware Information Communication Technologies.

  18. Advances in x-ray computed microtomography at the NSLS

    International Nuclear Information System (INIS)

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines, from industrial materials processing to environmental science. The most recent applications are presented here, as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented that is based on a refinement of the gridding algorithm first developed for use with radio-astronomical data. This program has reduced the reconstruction time to 8.5 s for a 929 x 929-pixel slice on an R10,000 CPU, more than an 8x reduction compared with the Filtered Back-Projection method
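
For context, the classical filtered back-projection that the FFBT/gridding refinement accelerates can be sketched in a few lines. The simple ramp filter and nearest-neighbour back-projection below are a generic textbook formulation, not the NSLS implementation, and all sizes are illustrative.

```python
# Hedged sketch of filtered back-projection: ramp-filter each
# projection in the Fourier domain, then smear it back across
# the image grid.
import numpy as np

def fbp(sinogram, thetas):
    """Reconstruct an n x n image from a (n_angles, n_det) sinogram."""
    n_angles, n_det = sinogram.shape
    # Ramp (|f|) filter applied along the detector axis.
    ramp = np.abs(np.fft.fftfreq(n_det))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp,
                                   axis=1))
    # Back-projection onto a square grid centred on the rotation axis.
    coords = np.arange(n_det) - n_det / 2
    xx, yy = np.meshgrid(coords, coords)
    image = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, thetas):
        t = xx * np.cos(theta) + yy * np.sin(theta) + n_det / 2
        image += proj[np.clip(t.astype(int), 0, n_det - 1)]
    return image * np.pi / n_angles

# Point object at the detector centre: its sinogram is a constant spike,
# and the reconstruction should peak at the image centre.
n = 64
thetas = np.linspace(0.0, np.pi, 90, endpoint=False)
sino = np.zeros((90, n))
sino[:, n // 2] = 1.0
recon = fbp(sino, thetas)
peak = np.unravel_index(np.argmax(recon), recon.shape)
print("peak at", peak)
```

The per-angle loop above is the expensive part; gridding methods such as the FFBT replace it with interpolation onto a 2-D Fourier grid followed by a single inverse FFT.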

  19. Recent advances in computational intelligence in defense and security

    CERN Document Server

    Falcon, Rafael; Zincir-Heywood, Nur; Abbass, Hussein

    2016-01-01

    This volume is an initiative undertaken by the IEEE Computational Intelligence Society’s Task Force on Security, Surveillance and Defense to consolidate and disseminate the role of CI techniques in the design, development and deployment of security and defense solutions. Applications range from the detection of buried explosive hazards in a battlefield to the control of unmanned underwater vehicles, the delivery of superior video analytics for protecting critical infrastructures, and the development of stronger intrusion detection systems and the design of military surveillance networks. Defense scientists, industry experts, academicians and practitioners alike will benefit from the wide spectrum of successful applications compiled in this volume. Senior undergraduate or graduate students may also discover uncharted territory for their own research endeavors.

  20. NATO Advanced Study Institute on Advances in Microlocal Analysis

    CERN Document Server

    1986-01-01

    The 1985 Castelvecchio-Pascoli NATO Advanced Study Institute was aimed at completing the trilogy with the two former institutes I organized: "Boundary Value Problem for Evolution Partial Differential Operators", Liege, 1976, and "Singularities in Boundary Value Problems", Maratea, 1980. It was indeed necessary to record the considerable progress realized in the field of the propagation of singularities of Schwartz Distributions, which recently led to the birth of a new branch of Mathematical Analysis called Microlocal Analysis. Most of this theory was mainly built to be applied to distribution solutions of linear partial differential problems. A large part of this institute still went in this direction. But, on the other hand, it was also time to explore the new trend to use microlocal analysis in nonlinear differential problems. I hope that the Castelvecchio NATO ASI reached its purposes with the help of the most famous authorities in the field. The meeting was held in Tuscany (Italy) at Castelvecchio-P...

  1. Studi Perbandingan Layanan Cloud Computing [A Comparative Study of Cloud Computing Services]

    Directory of Open Access Journals (Sweden)

    Afdhal Afdhal

    2014-03-01

    Full Text Available In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platforms and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing costs, and creating opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud delivery services, their correlation and inter-dependency. This article compares and contrasts the different levels of delivery services and the development models, identifies issues, and suggests future directions for cloud computing. End-users' comprehension of the cloud computing delivery service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of this article provides several recommendations for cloud computing service providers and end-users.

  2. Experimental and computing strategies in advanced material characterization problems

    Energy Technology Data Exchange (ETDEWEB)

    Bolzon, G. [Department of Civil and Environmental Engineering, Politecnico di Milano, piazza Leonardo da Vinci 32, 20133 Milano, Italy gabriella.bolzon@polimi.it (Italy)

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that permit the acquisition of a large amount of data while, at the same time, reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of laboratory or in situ tests by sophisticated and usually expensive non-linear analyses, while in some situations reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.
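Reduced models of the kind mentioned here are commonly built by decomposition of snapshot data. A minimal proper orthogonal decomposition (POD) sketch, given purely as an illustration of the idea and not as the author's method (all names are ours):

```python
import numpy as np

def pod_basis(snapshots, r):
    # Columns of `snapshots` are full-order states recorded during test
    # simulations; the leading r left singular vectors span the reduced space.
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]

def reduce_and_reconstruct(state, basis):
    # Project a full-order state onto the reduced basis and lift it back;
    # evaluating in the r-dimensional space is what makes repeated
    # simulations cheap enough for near-real-time identification.
    return basis @ (basis.T @ state)
```

If the snapshot matrix is (numerically) low-rank, a small `r` reproduces the states almost exactly, which is the filtering property exploited in parameter identification loops.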

  3. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.

    1996-11-01

    The National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  4. A computational study of high entropy alloys

    Science.gov (United States)

    Wang, Yang; Gao, Michael; Widom, Michael; Hawk, Jeff

    2013-03-01

    As a new class of advanced materials, high-entropy alloys (HEAs) exhibit a wide variety of excellent materials properties, including high strength, reasonable ductility with appreciable work-hardening, corrosion and oxidation resistance, wear resistance, and outstanding diffusion-barrier performance, especially at elevated and high temperatures. In this talk, we will explain our computational approach to the study of HEAs that employs the Korringa-Kohn-Rostoker coherent potential approximation (KKR-CPA) method. The KKR-CPA method uses Green's function technique within the framework of multiple scattering theory and is uniquely designed for the theoretical investigation of random alloys from the first principles. The application of the KKR-CPA method will be discussed as it pertains to the study of structural and mechanical properties of HEAs. In particular, computational results will be presented for AlxCoCrCuFeNi (x = 0, 0.3, 0.5, 0.8, 1.0, 1.3, 2.0, 2.8, and 3.0), and these results will be compared with experimental information from the literature.
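The "high entropy" in HEAs refers to the ideal configurational entropy of mixing, S = -R Σ c_i ln c_i, which is maximized for equimolar compositions. A quick illustrative calculation (our own sketch, not taken from the cited work):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(concentrations):
    # Ideal configurational entropy of mixing: S = -R * sum(c_i * ln(c_i))
    assert abs(sum(concentrations) - 1.0) < 1e-9, "mole fractions must sum to 1"
    return -R * sum(c * math.log(c) for c in concentrations if c > 0)

# An equimolar N-component alloy reaches the maximum R * ln(N); for a
# five-component equimolar alloy such as CoCrCuFeNi (x = 0) this is
# R * ln(5), roughly 13.4 J/(mol K).
```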

  5. Advanced computational model for three-phase slurry reactors

    International Nuclear Information System (INIS)

    Goodarz Ahmadi

    2000-11-01

    In the first year of the project, solid-fluid mixture flows in ducts and passages at different angles of orientation were analyzed. The model predictions were compared with the experimental data and good agreement was found. Progress was also made in analyzing the gravity chute flows of solid-liquid mixtures. An Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column is being developed. The approach uses an Eulerian analysis of gas-liquid flows in the bubble column and a Lagrangian particle tracking procedure to analyze the particle motions. Progress was also made in developing a rate-dependent, thermodynamically consistent model for multiphase slurry flows in a state of turbulent motion. The new model includes the effect of phasic interactions and leads to anisotropic effective phasic stress tensors. Progress was also made in measuring the concentration and velocity of particles of different sizes near a wall in a duct flow. The formulation of a thermodynamically consistent model for chemically active multiphase solid-fluid flows in a turbulent state of motion was also initiated. The general objective of this project is to provide the needed fundamental understanding of three-phase slurry reactors in Fischer-Tropsch (F-T) liquid fuel synthesis. The other main goal is to develop a computational capability for predicting the transport and processing of three-phase coal slurries. The specific objectives are: (1) to develop a thermodynamically consistent, rate-dependent, anisotropic model for multiphase slurry flows with and without chemical reaction for application to coal liquefaction, and to establish the material parameters of the model; (2) to provide experimental data for phasic fluctuation and mean velocities, as well as the solid volume fraction, in shear flow devices; (3) to develop an accurate computational capability incorporating the new rate-dependent and anisotropic model for analyzing reacting and
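The Lagrangian side of such an Eulerian-Lagrangian formulation integrates an equation of motion for each particle. The simplest case, a single particle relaxing toward the local fluid velocity under Stokes drag, can be sketched as follows (illustrative code with made-up parameter values, not the project's model):

```python
import numpy as np

def track_particle(u_fluid, v0, tau_p, dt, steps):
    # Explicit-Euler update of the Stokes-drag equation dv/dt = (u_f - v)/tau_p,
    # where tau_p is the particle relaxation time. Heavier or larger particles
    # have larger tau_p and respond more slowly to the carrier fluid.
    v = v0
    path = [v]
    for _ in range(steps):
        v = v + dt * (u_fluid - v) / tau_p
        path.append(v)
    return np.array(path)
```

In a full bubble-column simulation this update runs per particle inside the Eulerian flow solve, with the fluid velocity interpolated to each particle position and additional forces (buoyancy, lift, turbulence dispersion) added to the right-hand side.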

  6. Advanced Subsonic Airplane Design and Economic Studies

    Science.gov (United States)

    Liebeck, Robert H.; Andrastek, Donald A.; Chau, Johnny; Girvin, Raquel; Lyon, Roger; Rawdon, Blaine K.; Scott, Paul W.; Wright, Robert A.

    1995-01-01

    A study was made to examine the effect of advanced technology engines on the performance of subsonic airplanes and to provide a vision of the potential which these advanced engines offer. The year 2005 was selected as the entry-into-service (EIS) date for the engine/airframe combination. A set of four airplane classes (passenger and design range combinations) envisioned to span the needs of the 2005 EIS period was defined. The airframes for all classes were designed and sized using 2005 EIS advanced technology. Two airplanes were designed and sized for each class: one using current technology (1995) engines to provide a baseline, and one using advanced technology (2005) engines. The resulting engine/airframe combinations were compared and evaluated on the basis of sensitivity to basic engine performance parameters (e.g. SFC and engine weight) as well as DOC+I. The advanced technology engines provided significant reductions in fuel burn, weight, and wing area. Average values were as follows: reduction in fuel burn = 18%, reduction in wing area = 7%, and reduction in TOGW = 9%. The average DOC+I reduction was 3.5% using the pricing model based on payload-range index and 5% using the pricing model based on airframe weight. Noise and emissions were not considered.

  7. Computers in Public Education Study.

    Science.gov (United States)

    HBJ Enterprises, Highland Park, NJ.

    This survey conducted for the National Institute of Education reports the use of computers in U.S. public schools in the areas of instructional computing, student accounting, management of educational resources, research, guidance, testing, and library applications. From a stratified random sample of 1800 schools in varying geographic areas and…

  8. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  9. Development of a computer code for dynamic analysis of the primary circuit of advanced reactors

    International Nuclear Information System (INIS)

    Rocha, Jussie Soares da; Lira, Carlos A.B.O.; Magalhaes, Mardson A. de Sa

    2011-01-01

    Currently, advanced reactors are being developed, seeking enhanced safety, better performance and low environmental impact. Reactor designs must follow several steps and pass numerous tests before a conceptual project can be certified. In this sense, computational tools become indispensable in the preparation of such projects. This study therefore aimed at the development of a computational tool for thermal-hydraulic analysis by coupling two computer codes to evaluate the influence of transients caused by pressure variations and flow surges in the region of the primary circuit of the IRIS reactor between the core and the pressurizer. For the simulation, an 'insurge' situation was used, characterized by the entry of water into the pressurizer due to the expansion of the coolant in the primary circuit. This expansion was represented by a pressure disturbance in step form, through the 'step' block of SIMULINK, thus enabling the transient startup. The results showed that the dynamic tool, obtained through the coupling of the codes, generated very satisfactory responses within the model limitations, preserving the most important phenomena in the process. (author)

  10. Advances in computational modelling for personalised medicine after myocardial infarction.

    Science.gov (United States)

    Mangion, Kenneth; Gao, Hao; Husmeier, Dirk; Luo, Xiaoyu; Berry, Colin

    2018-04-01

    Myocardial infarction (MI) is a leading cause of premature morbidity and mortality worldwide. Determining which patients will experience heart failure and sudden cardiac death after an acute MI is notoriously difficult for clinicians. The extent of heart damage after an acute MI is informed by cardiac imaging, typically using echocardiography or sometimes cardiac magnetic resonance (CMR). These scans provide complex data sets that are only partially exploited by clinicians in daily practice, implying potential for improved risk assessment. Computational modelling of left ventricular (LV) function can bridge the gap towards personalised medicine using cardiac imaging in post-MI patients. Several novel biomechanical parameters have theoretical prognostic value and may be useful to reflect the biomechanical effects of novel preventive therapy for adverse remodelling post-MI. These parameters include myocardial contractility (regional and global), stiffness and stress. Further, the parameters can be delineated spatially to correspond with infarct pathology and the remote zone. While these parameters hold promise, there are challenges for translating MI modelling into clinical practice, including model uncertainty, validation and verification, as well as time-efficient processing. More research is needed to (1) simplify imaging with CMR in post-MI patients, while preserving diagnostic accuracy and patient tolerance, and (2) assess and validate novel biomechanical parameters against established prognostic biomarkers, such as LV ejection fraction and infarct size. Accessible software packages with minimal user interaction are also needed. Translating benefits to patients will be achieved through a multidisciplinary approach including clinicians, mathematicians, statisticians and industry partners. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless

  11. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Full Text Available Modeling response of structures under seismic loads is an important factor in Civil Engineering as it crucially affects the design and management of structures, especially for the high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of centrically braced frame (CBF buildings with lead-rubber bearing (LRB isolation system under ground motion effects. These techniques include least square support vector machine (LSSVM, wavelet neural networks (WNN, and adaptive neurofuzzy inference system (ANFIS along with wavelet denoising. The simulation of a 2D frame model and eight ground motions are considered in this study to evaluate the prediction models. The comparison results indicate that the least square support vector machine is superior to other techniques in estimating the behavior of smart structures.
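Of the techniques compared, least square support vector machine regression is the simplest to state: training reduces to a single linear solve rather than a quadratic program. The following is our own minimal sketch with an RBF kernel (function names and hyperparameter values are illustrative placeholders, not those of the study):

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    # Gaussian (RBF) kernel matrix between two sample sets
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # LSSVM training: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y],
    # where gamma is the regularization parameter.
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

In a seismic-response setting, `X` would hold ground-motion features and `y` the measured structural response; the equality-constrained formulation is what makes LSSVM fast enough for the repeated model fitting such comparisons require.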

  12. Advanced Demonstration and Test Reactor Options Study

    International Nuclear Information System (INIS)

    Petti, David Andrew; Hill, R.; Gehin, J.; Gougar, Hans David; Strydom, Gerhard; Heidet, F.; Kinsey, J.; Grandy, Christopher; Qualls, A.; Brown, Nicholas; Powers, J.; Hoffman, E.; Croson, D.

    2017-01-01

    Global efforts to address climate change will require large-scale decarbonization of energy production in the United States and elsewhere. Nuclear power already provides 20% of electricity production in the United States (U.S.) and is increasing in countries undergoing rapid growth around the world. Because reliable, grid-stabilizing, low-emission electricity generation, energy security, and energy resource diversity will be increasingly valued, nuclear power's share of electricity production has the potential to grow. In addition, there are non-electricity applications (e.g., process heat, desalination, hydrogen production) that could be better served by advanced nuclear systems. Thus, the timely development, demonstration, and commercialization of advanced nuclear reactors could diversify the nuclear technologies available and offer attractive technology options to expand the impact of nuclear energy for electricity generation and non-electricity missions. The purpose of this planning study is to provide transparent and defensible technology options for a test and/or demonstration reactor(s) to be built to support public policy, innovation and long-term commercialization within the context of the Department of Energy's (DOE's) broader commitment to pursuing an 'all of the above' clean energy strategy and associated timelines. This planning study includes identification of the key features and timing needed for advanced test or demonstration reactors to support research, development, and technology demonstration leading to the commercialization of power plants built upon these advanced reactor platforms. This planning study is consistent with the Congressional language contained within the fiscal year 2015 appropriation that directed the DOE to conduct a planning study to evaluate 'advanced reactor technology options, capabilities, and requirements within the context of national needs and public policy to support innovation in nuclear

  13. Advanced entry guidance algorithm with landing footprint computation

    Science.gov (United States)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  14. Advanced computational methodology for full-core neutronics calculations

    Science.gov (United States)

    Hiruta, Hikaru

    The modern computational methodology for reactor physics calculations is based on single-assembly transport calculations with reflective boundary conditions that generate homogenized few-group data, and core-level coarse-mesh diffusion calculations that evaluate the large-scale behavior of the scalar flux. Recently, an alternative approach has been developed. It is based on the low-order equations of the quasidiffusion method in order to account accurately for complicated transport effects in full-core calculations. The low-order quasidiffusion (LOQD) equations can capture transport effects to an arbitrary degree of accuracy. This approach is combined with single-assembly transport calculations that use special albedo boundary conditions which enable one to simulate efficiently the effects of an unlike neighboring assembly on an assembly's group data. In this dissertation, we develop a homogenization methodology based on the LOQD equations and spatially consistent coarse-mesh finite element discretization methods for the 2D low-order quasidiffusion equations for full-core calculations. The coarse-mesh solution generated by this method preserves a number of spatial polynomial moments of the fine-mesh transport solution over coarse cells. The proposed method reproduces accurately the complicated large-scale behavior of the transport solution within assemblies. To demonstrate the accuracy of the developed method, we present numerical results of calculations of test problems that simulate the interaction of MOX and uranium assemblies. We also develop a splitting method that can efficiently solve the coarse-mesh discretized LOQD equations. The presented method splits the LOQD problem into two parts: (i) the D-problem, which captures a significant part of the transport solution in the central parts of assemblies and can be reduced to a diffusion-type equation, and (ii) the Q-problem, which accounts for the complicated behavior of the transport solution near assembly boundaries. Independent
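For orientation, the one-group, 1D slab-geometry form of the low-order quasidiffusion system can be written as follows (our notation, following the general quasidiffusion literature rather than this dissertation's 2D discretization):

```latex
\frac{dJ}{dx} + \Sigma_a \phi = Q , \qquad
\frac{d}{dx}\bigl( E(x)\,\phi \bigr) + \Sigma_t J = 0 ,
\qquad
E(x) = \frac{\int_{-1}^{1} \mu^{2}\,\psi(x,\mu)\,d\mu}{\int_{-1}^{1} \psi(x,\mu)\,d\mu}
```

Here the Eddington factor E is computed from a transport solution ψ, so the low-order system carries exact angular information from transport; setting E = 1/3 recovers ordinary diffusion, which is why the equations can capture transport effects to an arbitrary degree of accuracy while retaining diffusion-like structure.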

  15. Advances in Computational Fluid-Structure Interaction and Flow Simulation Conference

    CERN Document Server

    Takizawa, Kenji

    2016-01-01

    This contributed volume celebrates the work of Tayfun E. Tezduyar on the occasion of his 60th birthday. The articles it contains were born out of the Advances in Computational Fluid-Structure Interaction and Flow Simulation (AFSI 2014) conference, also dedicated to Prof. Tezduyar and held at Waseda University in Tokyo, Japan on March 19-21, 2014. The contributing authors represent a group of international experts in the field who discuss recent trends and new directions in computational fluid dynamics (CFD) and fluid-structure interaction (FSI). Organized into seven distinct parts arranged by thematic topics, the papers included cover basic methods and applications of CFD, flows with moving boundaries and interfaces, phase-field modeling, computer science and high-performance computing (HPC) aspects of flow simulation, mathematical methods, biomedical applications, and FSI. Researchers, practitioners, and advanced graduate students working on CFD, FSI, and related topics will find this collection to be a defi...

  16. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  17. Advanced Cell Development and Degradation Studies

    Energy Technology Data Exchange (ETDEWEB)

    J. E. O'Brien; C. M. Stoots; J. S. Herring; R. C. O'Brien; K. G. Condie; M. Sohal; G. K. Housley; J. J. Hartvigsen; D. Larsen; G. Tao; B. Yildiz; V. Sharma; P. Singh; N. Petigny; T. L. Cable

    2010-09-01

    The Idaho National Laboratory (INL) has been researching the application of solid-oxide electrolysis cells for large-scale hydrogen production from steam over a temperature range of 800 to 900°C. From 2003 to 2009, this work was sponsored by the DOE Nuclear Hydrogen Initiative (NHI). Starting in 2010, the HTE research program has been sponsored by the Next Generation Nuclear Plant (NGNP) program. HTSE research priorities in FY10 are centered on understanding and reducing cell and stack performance degradation to an acceptable level in order to advance the technology readiness level of HTSE and to justify further large-scale demonstration activities. This report provides a summary of our FY10 experimental program, which has been focused on advanced cell and stack development and degradation studies. Advanced cell and stack development activities are under way at five technology partners: MSRI, Versa Power, Ceramatec, NASA Glenn, and St. Gobain. Performance evaluation of the advanced technology cells and stacks has been performed by the technology partners, by MIT and the University of Connecticut, and at the INL HTE Laboratory. Summaries of these development activities and test results are presented.
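The hydrogen output of an electrolysis stack scales with current through Faraday's law, two electrons per H2 molecule. A quick illustrative calculation (our own sketch; the current and cell count below are made-up example values, not from the report):

```python
F = 96485.0  # Faraday constant, C/mol

def h2_production_rate(current_a, cells, faradaic_efficiency=1.0):
    # Faraday's law for water electrolysis: n_H2 = eta * I * N_cells / (2 F),
    # returned in mol/s. Two electrons are transferred per H2 molecule.
    return faradaic_efficiency * current_a * cells / (2.0 * F)

# Example: a 10-cell stack at 25 A produces about 1.3e-3 mol H2 per second
# at 100% faradaic efficiency.
```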

  18. Advanced Modulation Techniques for High-Performance Computing Optical Interconnects

    DEFF Research Database (Denmark)

    Karinou, Fotini; Borkowski, Robert; Zibar, Darko

    2013-01-01

    We experimentally assess the performance of a 64 × 64 optical switch fabric used for ns-speed optical cell switching in supercomputer optical interconnects. More specifically, we study four alternative modulation formats and detection schemes, namely, 10-Gb/s nonreturn-to-zero differential phase-...

  19. System identification advances and case studies

    CERN Document Server

    Mehra, Raman K

    1976-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques, including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat

  20. The advanced statistical methods in aerobiological studies

    Directory of Open Access Journals (Sweden)

    Agnieszka Grinn-Gofroń

    2012-12-01

    Full Text Available Pollen and spore forecasting has become an important aim in aerobiology. The main goal is to provide accurate information on biological particles in the air to sensitive users in order to help them optimize their treatment process. Many statistical methods of data analysis are based on assumptions of linearity and normality that often cannot be fulfilled. Advanced statistical methods can be applied to problems that cannot be solved effectively in any other way, and are suited to predicting the concentration of airborne pollen or spores in relation to weather conditions. The purpose of this study was to review some advanced statistical methods that can be used in aerobiological studies.

  1. Pressure Safety: Advanced Self-Study 30120

    Energy Technology Data Exchange (ETDEWEB)

    Glass, George [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-29

    Pressure Safety Advanced Self-Study (Course 30120) consists of an introduction, five modules, and a quiz. To receive credit in UTrain for completing this course, you must score 80% or better on the 15-question quiz (check UTrain). Directions for initiating the quiz are appended to the end of this training manual. This course contains several links to LANL websites. UTrain might not support active links, so please copy links into the address line in your browser.

  2. [Advances in the studies of concealed penis].

    Science.gov (United States)

    Fan, Sheng-hai; Li, Xue-de

    2015-09-01

    Concealed penis is usually found in children, which affects the patients both physiologically and psychologically. Some of the patients are wrongly treated by circumcision, which may bring about serious consequences to the sexual life of the patients in their adulthood. In the recent years, this disease has been receiving more and more attention from both doctors and parents. However, controversies remain as to its classification, pathogenesis, pathology, and treatment. This paper focuses on the understanding and advances in the studies of concealed penis.

  3. A first attempt to bring computational biology into advanced high school biology classrooms.

    Science.gov (United States)

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S

    2011-10-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and to biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing a computational biology element on genetic evolution into advanced biology classes in two local high schools. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach this alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.
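The curriculum itself is available at the URL above. As an example of the kind of computation a genetic-evolution unit at this level might include (our own illustrative sketch, not material from the paper), a one-locus haploid selection model fits in a few lines:

```python
def allele_frequency_trajectory(p0, w_a, w_b, generations):
    # Deterministic haploid selection: allele A (fitness w_a) competes with
    # allele B (fitness w_b). Each generation, the frequency of A is
    # reweighted by its relative fitness.
    p = p0
    traj = [p]
    for _ in range(generations):
        p = p * w_a / (p * w_a + (1.0 - p) * w_b)
        traj.append(p)
    return traj
```

Even a small fitness advantage drives the favored allele toward fixation, a point students can verify by plotting the trajectory for different selection strengths.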

  4. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Forest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bochev, Pavel B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cameron-Smith, Philip J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Easter, Richard C [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Scott M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ghan, Steven J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Xiaohong [Univ. of Wyoming, Laramie, WY (United States); Lowrie, Robert B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ma, Po-lun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sacks, William J. [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Shrivastava, Manish [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Singh, Balwinder [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tautges, Timothy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Taylor, Mark A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Vertenstein, Mariana [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Worley, Patrick H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  5. Center for Advanced Energy Studies Program Plan

    Energy Technology Data Exchange (ETDEWEB)

    Kevin Kostelnik

    2005-09-01

    The world is facing critical energy-related challenges regarding world and national energy demands, advanced science and energy technology delivery, nuclear engineering educational shortfalls, and adequately trained technical staff. Resolution of these issues is important for the United States to ensure a secure and affordable energy supply, which is essential for maintaining U.S. national security, continued economic prosperity, and future sustainable development. One way that the U.S. Department of Energy (DOE) is addressing these challenges is by tasking the Battelle Energy Alliance, LLC (BEA) with developing the Center for Advanced Energy Studies (CAES) at the Idaho National Laboratory (INL). By 2015, CAES will be a self-sustaining, world-class, academic and research institution where the INL; DOE; Idaho, regional, and other national universities; and the international community will cooperate to conduct critical energy-related research, classroom instruction, technical training, policy conceptualization, public dialogue, and other events.

  6. Study of advanced fuel system concepts for commercial aircraft

    Science.gov (United States)

    Coffinberry, G. A.

    1985-01-01

    An analytical study was performed in order to assess relative performance and economic factors involved with alternative advanced fuel systems for future commercial aircraft operating with broadened property fuels. The DC-10-30 wide-body tri-jet aircraft and the CF6-80X engine were used as a baseline design for the study. Three advanced systems were considered and were specifically aimed at addressing freezing point, thermal stability and lubricity fuel properties. Actual DC-10-30 routes and flight profiles were simulated by computer modeling and resulted in prediction of aircraft and engine fuel system temperatures during a nominal flight and during statistical one-day-per-year cold and hot flights. Emergency conditions were also evaluated. Fuel consumption and weight and power extraction results were obtained. An economic analysis was performed for new aircraft and systems. Advanced system means for fuel tank heating included fuel recirculation loops using engine lube heat and generator heat. Environmental control system bleed air heat was used for tank heating in a water recirculation loop. The results showed that fundamentally all of the three advanced systems are feasible but vary in their degree of compatibility with broadened-property fuel.

  7. Innovations and advances in computing, informatics, systems sciences, networking and engineering

    CERN Document Server

    Elleithy, Khaled

    2015-01-01

    Innovations and Advances in Computing, Informatics, Systems Sciences, Networking and Engineering. This book includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Informatics, and Systems Sciences, and Engineering. It includes selected papers from the conference proceedings of the Eighth and some selected papers of the Ninth International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2012 & CISSE 2013). Coverage includes topics in: Industrial Electronics, Technology & Automation, Telecommunications and Networking, Systems, Computing Sciences and Software Engineering, Engineering Education, Instructional Technology, Assessment, and E-learning. Provides the latest in a series of books growing out of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering; includes chapters in the most a...

  8. NATO Advanced Study Institute on Recent Advances in the Modeling of Hydrologic Systems

    CERN Document Server

    O’Connell, P

    1991-01-01

    Modeling of the rainfall-runoff process is of both scientific and practical significance. Many of the currently used mathematical models of hydrologic systems were developed a generation ago. Much of the effort since then has focused on refining these models rather than on developing new models based on improved scientific understanding. In the past few years, however, a renewed effort has been made to improve both our fundamental understanding of hydrologic processes and to exploit technological advances in computing and remote sensing. It is against this background that the NATO Advanced Study Institute on Recent Advances in the Modeling of Hydrologic Systems was organized. The idea for holding a NATO ASI on this topic grew out of an informal discussion between one of the co-directors and Professor Francisco Nunes-Correia at a previous NATO ASI held at Tucson, Arizona in 1985. The Special Program Panel on Global Transport Mechanisms in the Geo-Sciences of the NATO Scientific Affairs Division agreed to sp...

  9. Nacelle design studies for advanced transport aircraft.

    Science.gov (United States)

    Sussman, M. B.; Gunnarson, D. W.; Edwards, P.

    1972-01-01

    Results are given of several analytical studies of nacelles suitable for advanced subsonic commercial transport aircraft. The impact on the nacelle of reduced aircraft noise and increased cruise Mach number is emphasized and initially developed in terms of the individual nacelle components: inlet, fan cowl, nozzle, etc. This is achieved by relating the noise and cruise speed constraints to which the aircraft system must be designed to specific limitations on the individual nacelle components. Performance assessments are then made (separately for each nacelle component) of competitive design concepts. Overall nacelle designs, synthesized on the basis of the individual component studies, are briefly discussed.

  10. 1st International Conference on Computational Advancement in Communication Circuits and Systems

    CERN Document Server

    Dalapati, Goutam; Banerjee, P; Mallick, Amiya; Mukherjee, Moumita

    2015-01-01

    This book comprises the proceedings of the 1st International Conference on Computational Advancement in Communication Circuits and Systems (ICCACCS 2014), organized by Narula Institute of Technology under the patronage of the JIS group and affiliated to West Bengal University of Technology. The conference was supported by the Technical Education Quality Improvement Program (TEQIP), New Delhi, India, and held in technical collaboration with the IEEE Kolkata Section, with Springer as publication partner. The book contains 62 refereed papers that aim to highlight new theoretical and experimental findings in the field of Electronics and communication engineering, including interdisciplinary fields like Advanced Computing, Pattern Recognition and Analysis, and Signal and Image Processing. The proceedings cover the principles, techniques and applications in microwave & devices, communication & networking, signal & image processing, and computations & mathematics & control. The proceedings reflect the conference's emp...

  11. Creating Educational Technology Curricula for Advanced Studies in Learning Technology

    Directory of Open Access Journals (Sweden)

    Minoru Nakayama

    2016-08-01

    Curriculum design and content are key factors in the area of human resource development. To examine the possibility of using a collaboration of Human Computer Interaction (HCI) and Educational Technology (ET) to develop innovative improvements to the education system, the curricula of these two areas of study were lexically analyzed and compared. As a further example, the curriculum of a joint course in HCI and ET was also lexically analyzed and the contents were examined. These analyses can be used as references in the development of human resources for use in advanced learning environments.
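    A lexical comparison of two curricula, as described in this record, can be sketched with simple term-frequency vectors and cosine similarity. The curriculum phrases below are hypothetical stand-ins for illustration, not data from the study:

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase a curriculum description and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def cosine_similarity(a, b):
    """Cosine similarity between two term-frequency Counters (0.0 .. 1.0)."""
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical one-line course descriptions for the two fields.
hci = Counter(tokenize("usability interaction design evaluation interface prototyping"))
et = Counter(tokenize("instructional design evaluation learning assessment media"))

print(round(cosine_similarity(hci, et), 3))  # prints 0.333
```

The shared vocabulary ("design", "evaluation") is what drives the overlap score; in a real analysis the vectors would be built from full curriculum documents.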

  12. Educational NASA Computational and Scientific Studies (enCOMPASS)

    Science.gov (United States)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is a way of addressing NASA's goals of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after introducing the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches used and often published in a scientific/research paper. Then, after learning about the NASA application and related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge for them to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other side, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and

  13. Computer technology forecast study for general aviation

    Science.gov (United States)

    Seacord, C. L.; Vaughn, D.

    1976-01-01

    A multi-year, multi-faceted program is underway to investigate and develop potential improvements in airframes, engines, and avionics for general aviation aircraft. The objective of this study was to assemble information that will allow the government to assess the trends in computer and computer/operator interface technology that may have application to general aviation in the 1980's and beyond. The current state of the art of computer hardware is assessed, technical developments in computer hardware are predicted, and nonaviation large volume users of computer hardware are identified.

  14. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    Science.gov (United States)

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. Our findings from the Department of Health and Human Services

  15. Computational techniques for ECG analysis and interpretation in light of their contribution to medical advances.

    Science.gov (United States)

    Lyon, Aurore; Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca

    2018-01-01

    Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries and their translation to the clinical world may lead to promising advances. © 2018 The Author(s).
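    As a minimal illustration of the heartbeat-classification step this review discusses, a nearest-centroid classifier over two toy beat features might look like the sketch below. The feature values are made up for illustration, not clinical reference ranges, and real systems use far richer feature sets and models:

```python
import math

# Toy feature vectors per beat: (RR interval in s, QRS duration in s).
# All numbers are illustrative stand-ins, not clinical thresholds.
TRAINING = {
    "normal":  [(0.80, 0.08), (0.85, 0.09), (0.78, 0.08)],
    "ectopic": [(0.55, 0.14), (0.50, 0.15), (0.60, 0.13)],
}

def centroid(points):
    """Mean feature vector of a list of labeled beats."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

CENTROIDS = {label: centroid(beats) for label, beats in TRAINING.items()}

def classify(beat):
    """Label a beat by its nearest class centroid in feature space."""
    return min(CENTROIDS, key=lambda label: math.dist(beat, CENTROIDS[label]))

print(classify((0.82, 0.08)))  # prints: normal
```

The same structure (feature extraction, then distance to class prototypes) underlies many of the beat-classification pipelines surveyed in the review, with learned models replacing the hand-set centroids.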

  16. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body

  17. Technological advances for studying human behavior

    Science.gov (United States)

    Roske-Hofstrand, Renate J.

    1990-01-01

    Technological advances for studying human behavior are noted in viewgraph form. It is asserted that performance-aiding systems are proliferating without a fundamental understanding of how they would interact with the humans who must control them. Two views of automation research, the hardware view and the human-centered view, are listed. Other viewgraphs give information on vital elements for human-centered research, a continuum of the research process, available technologies, new technologies for persistent problems, a sample research infrastructure, the need for metrics, and examples of data-link technology.

  18. Robotics, stem cells, and brain-computer interfaces in rehabilitation and recovery from stroke: updates and advances.

    Science.gov (United States)

    Boninger, Michael L; Wechsler, Lawrence R; Stein, Joel

    2014-11-01

    The aim of this study was to describe the current state and latest advances in robotics, stem cells, and brain-computer interfaces in rehabilitation and recovery for stroke. The authors of this summary recently reviewed this work as part of a national presentation. The article represents the information included in each area. Each area has seen great advances and challenges as products move to market and experiments are ongoing. Robotics, stem cells, and brain-computer interfaces all have tremendous potential to reduce disability and lead to better outcomes for patients with stroke. Continued research and investment will be needed as the field moves forward. With this investment, the potential for recovery of function is likely substantial.

  19. Computational code in atomic and nuclear quantum optics: Advanced computing multiphoton resonance parameters for atoms in a strong laser field

    Science.gov (United States)

    Glushkov, A. V.; Gurskaya, M. Yu; Ignatenko, A. V.; Smirnov, A. V.; Serga, I. N.; Svinarenko, A. A.; Ternovsky, E. V.

    2017-10-01

    The consistent relativistic energy approach to finite Fermi systems (atoms and nuclei) in a strong realistic laser field is presented and applied to computing multiphoton resonance parameters in some atoms and nuclei. The approach is based on the Gell-Mann and Low S-matrix formalism, the multiphoton resonance line moments technique and the advanced Ivanov-Ivanova algorithm for calculating the Green's function of the Dirac equation. Data on the multiphoton resonance width and shift for the Cs atom and the 57Fe nucleus are listed as a function of laser intensity.

  20. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak, Diwakar Shukla

    2018-01-01

    The present era is one of Information and Communication Technology (ICT), and much research is underway on Cloud Computing and Mobile Cloud Computing, covering topics such as security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among the end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  1. Biophysics of the Eye in Computer Vision: Methods and Advanced Technologies

    Science.gov (United States)

    Hammoud, Riad I.; Hansen, Dan Witzner

    The eyes have it! This chapter describes cutting-edge computer vision methods employed in advanced vision sensing technologies for medical, safety, and security applications, where the human eye represents the object of interest for both the imager and the computer. A camera receives light from the real eye to form a sequence of digital images of it. As the eye scans the environment, or focuses on particular objects in the scene, the computer simultaneously localizes the eye position, tracks its movement over time, and infers measures such as the attention level and the gaze direction in real time and fully automatically. The main focus of this chapter is on computer vision and pattern recognition algorithms for eye appearance variability modeling, automatic eye detection, and robust eye position tracking. This chapter offers good readings and solid methodologies to build the two fundamental low-level building blocks of a vision-based eye tracking technology.

  2. NATO Advanced Study Institute on Metal Hydrides

    CERN Document Server

    1981-01-01

    In the last five years, the study of metal hydrides has expanded enormously due to the potential technological importance of this class of materials in hydrogen based energy conversion schemes. The scope of this activity has been worldwide among the industrially advanced nations. There has been a consensus among researchers in both fundamental and applied areas that a more basic understanding of the properties of metal/hydrogen systems is required in order to provide a rational basis for the selection of materials for specific applications. The current worldwide need for and interest in research in metal hydrides indicated the timeliness of an Advanced Study Institute to provide an in-depth view of the field for those active in its various aspects. The inclusion of speakers from non-NATO countries provided the opportunity for cross-fertilization of ideas for future research. While the emphasis of the Institute was on basic properties, there was a conscious effort to stimulate interest in the applic...

  3. Mirror Advanced Reactor Study interim design report

    International Nuclear Information System (INIS)

    1983-04-01

    The status of the design of a tenth-of-a-kind commercial tandem-mirror fusion reactor is described at the midpoint of a two-year study. When completed, the design is to serve as a strategic goal for the mirror fusion program. The main objectives of the Mirror Advanced Reactor Study (MARS) are: (1) to design an attractive tandem-mirror fusion reactor producing electricity and synfuels (in alternate versions), (2) to identify key development and technology needs, and (3) to exploit the potential of fusion for safety, low activation, and simple disposal of radioactive waste. In the first year we have emphasized physics and engineering of the central cell and physics of the end cell. Design optimization and trade studies are continuing, and we expect additional modifications in the end cells to further improve the performance of the final design

  4. Mirror Advanced Reactor Study interim design report

    Energy Technology Data Exchange (ETDEWEB)

    1983-04-01

    The status of the design of a tenth-of-a-kind commercial tandem-mirror fusion reactor is described at the midpoint of a two-year study. When completed, the design is to serve as a strategic goal for the mirror fusion program. The main objectives of the Mirror Advanced Reactor Study (MARS) are: (1) to design an attractive tandem-mirror fusion reactor producing electricity and synfuels (in alternate versions), (2) to identify key development and technology needs, and (3) to exploit the potential of fusion for safety, low activation, and simple disposal of radioactive waste. In the first year we have emphasized physics and engineering of the central cell and physics of the end cell. Design optimization and trade studies are continuing, and we expect additional modifications in the end cells to further improve the performance of the final design.

  5. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    Science.gov (United States)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year the motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  6. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Robert [University of Southern California, Information Sciences Institute]; Ang, James [Sandia National Laboratories]; Bergman, Keren [Columbia University]; Borkar, Shekhar [Intel]; Carlson, William [Institute for Defense Analyses]; Carrington, Laura [University of California, San Diego]; Chiu, George [IBM]; Colwell, Robert [DARPA]; Dally, William [NVIDIA]; Dongarra, Jack [University of Tennessee]; Geist, Al [Oak Ridge National Laboratory]; Haring, Rud [IBM]; Hittinger, Jeffrey [Lawrence Livermore National Laboratory]; Hoisie, Adolfy [Pacific Northwest National Laboratory]; Klein, Dean [Micron]; Kogge, Peter [University of Notre Dame]; Lethin, Richard [Reservoir Labs]; Sarkar, Vivek [Rice University]; Schreiber, Robert [Hewlett Packard]; Shalf, John [Lawrence Berkeley National Laboratory]; Sterling, Thomas [Indiana University]; Stevens, Rick [Argonne National Laboratory]; Bashor, Jon [Lawrence Berkeley National Laboratory]; Brightwell, Ron [Sandia National Laboratories]; Coteus, Paul [IBM]; Debenedictus, Erik [Sandia National Laboratories]; Hiller, Jon [Science and Technology Associates]; Kim, K. H. [IBM]; Langston, Harper [Reservoir Labs]; Murphy, Richard [Micron]; Webster, Clayton [Oak Ridge National Laboratory]; Wild, Stefan [Argonne National Laboratory]; Grider, Gary [Los Alamos National Laboratory]; Ross, Rob [Argonne National Laboratory]; Leyffer, Sven [Argonne National Laboratory]; Laros III, James [Sandia National Laboratories]

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  7. The New Center for Advanced Energy Studies

    Energy Technology Data Exchange (ETDEWEB)

    L.J. Bond; K. Kostelnik; R.A. Wharton; A. Kadak

    2006-06-01

    A secure and affordable energy supply is essential for achieving U.S. national security, continuing U.S. prosperity and laying the foundation to enable future economic growth. The next generation energy workforce in the U.S. is a critical element in meeting both national and global energy needs. The Center for Advanced Energy Studies (CAES) was established in 2005 in response to U.S. Department of Energy (DOE) requirements. CAES, located at the new Idaho National Laboratory (INL), will address critical energy education, research, policy study and training needs. CAES is a unique joint partnership between the Battelle Energy Alliance (BEA), the State of Idaho, an Idaho University Consortium (IUC), and a National University Consortium (NUC). CAES will be based in a new facility that will foster collaborative academic and research efforts among participating institutions.

  8. Advances in mobile cloud computing and big data in the 5G era

    CERN Document Server

    Mastorakis, George; Dobre, Ciprian

    2017-01-01

    This book reports on the latest advances on the theories, practices, standards and strategies that are related to the modern technology paradigms, the Mobile Cloud computing (MCC) and Big Data, as the pillars and their association with the emerging 5G mobile networks. The book includes 15 rigorously refereed chapters written by leading international researchers, providing the readers with technical and scientific information about various aspects of Big Data and Mobile Cloud Computing, from basic concepts to advanced findings, reporting the state-of-the-art on Big Data management. It demonstrates and discusses methods and practices to improve multi-source Big Data manipulation techniques, as well as the integration of resources availability through the 3As (Anywhere, Anything, Anytime) paradigm, using the 5G access technologies.

  9. Use of the INAA [instrumental neutron activation analysis] Advance Prediction Computer Program [APCP] for research reactors

    International Nuclear Information System (INIS)

    Guinn, V.P.

    1990-01-01

    An important aspect of the neutron irradiation of small samples (usually solids or liquids) in research-type nuclear reactors is advance knowledge of the gamma-ray activity levels of the samples at the end of irradiation (EOI) and at subsequent decay times thereafter. Such knowledge is important in neutron activation analysis (NAA) as well as in other work involving reactor irradiations of samples. Obviously, the activity levels of the various neutron-induced radionuclides depend on a variety of factors: sample weight and elemental composition; neutron fluxes; length of irradiation; and length of decay. The instrumental NAA (INAA) Advance Prediction Computer Program (APCP) was developed and tested experimentally some years ago for work in the field of INAA. Very recently, the program has been rewritten for use with an IBM-compatible personal computer. To illustrate some of the features of the APCP output, an example is cited
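The end-of-irradiation activity such a prediction program computes follows the standard activation-decay relation, A = Nσφ(1 − e^(−λ t_irr)) e^(−λ t_dec). A minimal sketch of that relation for a single product radionuclide (all numeric values below are illustrative, not taken from the APCP):

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s, t_decay_s):
    """Standard activation equation for one product radionuclide:
    A = N * sigma * phi * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_decay).
    Returns activity in decays per second (Bq)."""
    lam = math.log(2) / half_life_s          # decay constant
    saturation = 1.0 - math.exp(-lam * t_irr_s)  # buildup during irradiation
    return n_atoms * sigma_cm2 * flux * saturation * math.exp(-lam * t_decay_s)

# Hypothetical sample: 1e18 target atoms, 1 barn cross-section,
# 1e12 n/cm^2/s flux, 15 min half-life, 5 min irradiation.
a_eoi = induced_activity(1e18, 1e-24, 1e12, 900.0, 300.0, 0.0)    # at EOI
a_dec = induced_activity(1e18, 1e-24, 1e12, 900.0, 300.0, 600.0)  # after 10 min decay
```

A full APCP-style prediction sums this relation over every neutron-induced radionuclide implied by the sample's elemental composition.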

  10. Robotics, Stem Cells and Brain Computer Interfaces in Rehabilitation and Recovery from Stroke; Updates and Advances

    Science.gov (United States)

    Boninger, Michael L; Wechsler, Lawrence R.; Stein, Joel

    2014-01-01

Objective To describe the current state and latest advances in robotics, stem cells, and brain computer interfaces in rehabilitation and recovery for stroke. Design The authors of this summary recently reviewed this work as part of a national presentation. The paper represents the information included in each area. Results Each area has seen great advances and challenges as products move to market and experiments are ongoing. Conclusion Robotics, stem cells, and brain computer interfaces all have tremendous potential to reduce disability and lead to better outcomes for patients with stroke. Continued research and investment will be needed as the field moves forward. With this investment, the potential for recovery of function is likely substantial. PMID:25313662

  11. Engaging students for the learning and assessment of the advanced computer graphics module using the latest technologies

    OpenAIRE

    Liu, Y; Yang, l; Han, J; Lu, B; Yuen, P; Zhao, Y; Song, R

    2017-01-01

Advanced computer graphics has been one of the most fundamental and landmark modules in the field of computer science. It usually covers such topics as core mathematics, lighting and shading, texture mapping, colour and depth, and advanced modeling. All such topics involve mathematics for object modeling and transformation, and programming for object visualization and interaction. Since some students are not strong in either mathematics or programming, it is usually a challenge to teach comput...

  12. Encouraging Advanced Second Language Speakers to Recognise Their Language Difficulties: A Personalised Computer-Based Approach

    Science.gov (United States)

    Xu, Jing; Bull, Susan

    2010-01-01

    Despite holding advanced language qualifications, many overseas students studying at English-speaking universities still have difficulties in formulating grammatically correct sentences. This article introduces an "independent open learner model" for advanced second language speakers of English, which confronts students with the state of their…

  13. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    Science.gov (United States)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to

  14. A Codesign Case Study in Computer Graphics

    DEFF Research Database (Denmark)

    Brage, Jens P.; Madsen, Jan

    1994-01-01

    The paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  15. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung-Taek (Korea Atomic Energy Research Institute, Daejon, Korea); Luk, Vincent K.

    2005-05-01

The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.
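Differentiating normal from degraded configurations from vibration signals typically starts from simple statistical features. The sketch below (not the project's actual algorithms; the synthetic signals and thresholds are illustrative only) shows why kurtosis is a common degradation indicator: impulsive events such as impacts or leakage bursts raise it sharply relative to a smooth baseline vibration.

```python
import math

def rms(signal):
    """Root-mean-square amplitude of a sampled signal."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def kurtosis(signal):
    """Pearson kurtosis m4 / var^2; ~1.5 for a sine, larger for impulsive signals."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    if var == 0.0:
        return 0.0
    m4 = sum((x - mean) ** 4 for x in signal) / n
    return m4 / var ** 2

# Synthetic illustration: a smooth "normal" vibration vs. a "degraded"
# signal carrying periodic impulsive spikes.
normal = [math.sin(0.1 * i) for i in range(1000)]
degraded = [x + (5.0 if i % 200 == 0 else 0.0) for i, x in enumerate(normal)]
```

In practice such features would be computed per sensor channel and fed to a pattern-recognition stage; the abstract's caveat about identifying the level and type of degradation is exactly where simple features like these stop being sufficient.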

  16. The Pan American Advanced Studies Institute

    CERN Document Server

    Arous, Gérard; Ferrari, Pablo; Newman, Charles; Sidoravicius, Vladas; Vares, Maria

    2014-01-01

    This volume features selected and peer-reviewed articles from the Pan-American Advanced Studies Institute (PASI). The chapters are written by international specialists who participated in the conference. Topics include developments based on breakthroughs in the mathematical understanding of phenomena describing systems in highly inhomogeneous and disordered media, including the KPZ universality class (describing the evolution of interfaces in two dimensions), spin glasses, random walks in random environment, and percolative systems. PASI fosters a collaboration between North American and Latin American researchers and students. The conference that inspired this volume took place in January 2012 in both Santiago de Chile and Buenos Aires. Researchers and graduate students will find timely research in probability theory, statistical physics and related disciplines.

  17. Using Computational and Mechanical Models to Study Animal Locomotion

    Science.gov (United States)

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  18. Using computational and mechanical models to study animal locomotion.

    Science.gov (United States)

    Miller, Laura A; Goldman, Daniel I; Hedrick, Tyson L; Tytell, Eric D; Wang, Z Jane; Yen, Jeannette; Alben, Silas

    2012-11-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms' performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: "Integrating living and physical systems."

  19. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013) which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in their respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  20. Computer stress study of bone with computed tomography

    International Nuclear Information System (INIS)

    Linden, M.J.; Marom, S.A.; Linden, C.N.

    1986-01-01

A computer processing tool has been developed which, together with a finite element program, determines the stress-deformation pattern in a long bone, utilizing Computed Tomography (CT) data files for the geometry and radiographic density information. The geometry, together with mechanical properties and boundary conditions (loads and displacements), comprises the input of the finite element (FE) computer program. The output of the program is the stresses and deformations in the bone. The processor is capable of developing an accurate three-dimensional finite element model from a scanned human long bone due to the high pixel resolution of CT and the local mechanical properties determined from the radiographic densities of the scanned bone. The processor, together with the finite element program, serves first as an analysis tool towards improved understanding of bone function and remodelling. In this first stage, actual long bones may be scanned and analyzed under applied loads and displacements, determined from existing gait analyses. The stress-deformation patterns thus obtained may be used for studying the biomechanical behavior of particular long bones such as bones with implants and bones with osteoporosis. As a second stage, this processor may serve as a diagnostic tool for analyzing the biomechanical response of a specific patient's long bone under applied loading, by utilizing a CT data file of the specific bone as an input to the processor with the FE program
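The key step the abstract describes, assigning local mechanical properties from radiographic density, is commonly done with a linear Hounsfield-to-density calibration followed by a power-law density-modulus relation. A minimal sketch (all coefficients below are illustrative placeholders, not the values used in this study; real coefficients come from a calibration phantom and the bone-mechanics literature):

```python
def hounsfield_to_density(hu, slope=0.001, intercept=1.0):
    """Linear calibration from CT Hounsfield units to apparent density (g/cm^3).
    slope/intercept are hypothetical; real values are phantom-calibrated."""
    return slope * hu + intercept

def density_to_modulus(rho, a=6850.0, b=1.49):
    """Power-law density-modulus relation E = a * rho**b (MPa).
    Coefficients are illustrative of published relations, not from this study."""
    return a * rho ** b

# Assign each finite element a modulus from the mean HU of its voxels;
# denser (cortical) regions get a higher Young's modulus.
element_hu = [200.0, 800.0, 1400.0]
moduli = [density_to_modulus(hounsfield_to_density(hu)) for hu in element_hu]
```

Each element's modulus then enters the FE stiffness matrix, which is how the CT densities end up shaping the computed stress-deformation pattern.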

  1. NATO Advanced Research Workshop on Exploiting Mental Imagery with Computers in Mathematics Education

    CERN Document Server

    Mason, John

    1995-01-01

    The advent of fast and sophisticated computer graphics has brought dynamic and interactive images under the control of professional mathematicians and mathematics teachers. This volume in the NATO Special Programme on Advanced Educational Technology takes a comprehensive and critical look at how the computer can support the use of visual images in mathematical problem solving. The contributions are written by researchers and teachers from a variety of disciplines including computer science, mathematics, mathematics education, psychology, and design. Some focus on the use of external visual images and others on the development of individual mental imagery. The book is the first collected volume in a research area that is developing rapidly, and the authors pose some challenging new questions.

  2. Computational intelligence in wireless sensor networks recent advances and future challenges

    CERN Document Server

    Falcon, Rafael; Koeppen, Mario

    2017-01-01

    This book emphasizes the increasingly important role that Computational Intelligence (CI) methods are playing in solving a myriad of entangled Wireless Sensor Networks (WSN) related problems. The book serves as a guide for surveying several state-of-the-art WSN scenarios in which CI approaches have been employed. The reader finds in this book how CI has contributed to solve a wide range of challenging problems, ranging from balancing the cost and accuracy of heterogeneous sensor deployments to recovering from real-time sensor failures to detecting attacks launched by malicious sensor nodes and enacting CI-based security schemes. Network managers, industry experts, academicians and practitioners alike (mostly in computer engineering, computer science or applied mathematics) benefit from the spectrum of successful applications reported in this book. Senior undergraduate or graduate students may discover in this book some problems well suited for their own research endeavors. USP: Presents recent advances and fu...

  3. Recent advances in computational methods and clinical applications for spine imaging

    CERN Document Server

    Glocker, Ben; Klinder, Tobias; Li, Shuo

    2015-01-01

This book contains the full papers presented at the MICCAI 2014 workshop on Computational Methods and Clinical Applications for Spine Imaging. The workshop brought together scientists and clinicians in the field of computational spine imaging. The chapters included in this book present and discuss new advances and challenges in these fields, using several methods and techniques to address more efficiently a range of timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modeling, simulation and surgical planning, image-guided robot-assisted surgery, and image-based diagnosis. The book also includes papers and reports from the first challenge on vertebra segmentation held at the workshop.

  4. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    1992-05-01

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January, 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000 fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult with the authors directly for questions or comments about their papers

  5. 17th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2016)

    International Nuclear Information System (INIS)

    2016-01-01

Preface The 2016 edition of the International Workshop on Advanced Computing and Analysis Techniques in Physics Research took place on January 18-22, 2016, at the Universidad Técnica Federico Santa Maria (UTFSM) in Valparaiso, Chile. The present volume of IOP Conference Series is devoted to the selected scientific contributions presented at the workshop. In order to guarantee the scientific quality of the Proceedings, all papers were thoroughly peer-reviewed by an ad-hoc Editorial Committee with the help of many careful reviewers. The ACAT Workshop series has a long tradition starting in 1990 (Lyon, France), and takes place at intervals of a year and a half. Formerly these workshops were known under the name AIHENP (Artificial Intelligence for High Energy and Nuclear Physics). Each edition brings together experimental and theoretical physicists and computer scientists and experts from particle and nuclear physics, astronomy and astrophysics, in order to exchange knowledge and experience in computing and data analysis in physics. Three tracks cover the main topics: computing technology (languages and system architectures), data analysis (algorithms and tools), and theoretical physics (techniques and methods). Although most contributions and discussions are related to particle physics and computing, other fields like condensed matter physics, earth physics and biophysics are often addressed, in the hope of sharing our approaches and visions. The workshop created a forum for exchanging ideas among fields, exploring and promoting cutting-edge computing technologies, and debating hot topics. (paper)

  6. Advances in soil-structure interaction studies

    International Nuclear Information System (INIS)

    Maheshwari, B.K.

    2011-01-01

It is of utmost importance that lifeline infrastructures (such as bridges, hospitals, power plants, dams, etc.) remain safe and functional during earthquakes, as damage to or collapse of these structures may have far-reaching implications. A lifeline's failure may hamper the relief and rescue operations required just after an earthquake, and its indirect economic losses may be very severe. Therefore, the safety of these structures during earthquakes is vital. Further, damage to nuclear facilities during an earthquake may lead to disaster. These structures should be designed adequately, taking into account all the important issues. Soil-Structure Interaction (SSI) is one design issue that is often overlooked and in some cases ignored. The effects of dynamic SSI have been well understood and practiced in the nuclear power industry (for the large foundations of nuclear containment structures) since the sixties. However, in the last decade there have been many advances in SSI techniques, and those need to be incorporated into practice. Failures of many structures occurred during the 1989 Loma Prieta and 1994 Northridge, California earthquakes and the 1995 Kobe, Japan earthquake due to SSI or a related issue. Many jetties failed in the Andaman and Nicobar islands due to the Sumatra earthquake and the ensuing tsunamis. It is because of this recent experience that the importance of SSI for the dynamic response of structures during earthquakes has been fully realized. The general belief that SSI effects are always beneficial for the structure is not correct; some cases are presented where it is shown that SSI effects are detrimental to the stability of the structure. This paper addresses the effects of dynamic SSI on the response of structures and explains its importance. Further advances in SSI studies are also discussed

  7. Recent advances in protein-protein interaction prediction: experimental and computational methods.

    Science.gov (United States)

    Jessulat, Matthew; Pitre, Sylvain; Gui, Yuan; Hooshyar, Mohsen; Omidi, Katayoun; Samanfar, Bahram; Tan, Le Hoa; Alamgir, Md; Green, James; Dehne, Frank; Golshani, Ashkan

    2011-09-01

Proteins within the cell act as part of complex networks, which allow pathways and processes to function. Therefore, understanding how proteins interact is a significant area of current research. This review aims to present an overview of key experimental techniques (yeast two-hybrid, tandem affinity purification and protein microarrays) used to discover protein-protein interactions (PPIs), as well as to briefly discuss certain computational methods for predicting protein interactions based on gene localization, phylogenetic information, 3D structural modeling or primary protein sequence data. Due to the large-scale applicability of primary sequence-based methods, the authors have chosen to focus on this strategy for our review. There is an emphasis on a recent algorithm called Protein Interaction Prediction Engine (PIPE) that can predict global PPIs. The readers will discover recent advances both in the practical determination of protein interaction and in the strategies that are available to attempt to anticipate interactions without the time and costs of experimental work. Global PPI maps can help understand the biology of complex diseases and facilitate the identification of novel drug target sites. This study describes different techniques used for PPI prediction that we believe will significantly impact the development of the field in the near future. We expect to see a growing number of similar techniques capable of large-scale PPI predictions.
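Primary-sequence methods like PIPE work at the level of short sequence windows. The toy sketch below shows only the shared-window counting idea, not PIPE's actual algorithm, which scores window pairs against a database of known interacting protein pairs; the sequences and window length are arbitrary illustrations.

```python
def shared_window_count(seq_a, seq_b, w=3):
    """Count length-w windows of seq_a that also occur in seq_b.
    A crude stand-in for window-based sequence comparison; real
    predictors score windows against known interacting pairs."""
    windows_b = {seq_b[i:i + w] for i in range(len(seq_b) - w + 1)}
    return sum(1 for i in range(len(seq_a) - w + 1)
               if seq_a[i:i + w] in windows_b)

# Two short hypothetical peptide sequences sharing the motif "AYIAK":
score = shared_window_count("MKTAYIAK", "AYIAKQQ")
```

The appeal of this family of methods, as the review notes, is that it scales to genome-wide prediction because it needs only primary sequence data as input.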

  8. Recent advances in the development of antiviral agents using computer-aided structure based approaches.

    Science.gov (United States)

    Kumar, Vikash; Chandra, Sharat; Siddiqi, Mohammad Imran

    2014-01-01

Viral diseases have been affecting the human race since ancient times. Currently, a long list of diseases caused by viruses is available, and extensive research in this area has resulted in understanding the finest details of the molecular mechanisms of pathogenesis caused by these pathogens. Side by side, efforts have been made towards the search for and design of antiviral agents that could interfere with viral pathogenesis. As a result of these efforts a number of effective antiviral agents have been developed and are available in the market. However, the high cost and lengthy protocol of the drug discovery process are some of the major limiting factors in the development of new and more effective antiviral agents. Considering the above fact, the research community is presently trying to integrate short and cost-effective techniques into the modern drug discovery process for the identification and design of novel antiviral agents. Computer-aided drug design (CADD), which comprises various techniques like molecular docking, virtual screening, three-dimensional quantitative structure-activity relationship (3D-QSAR) studies and many more, has the capability to speed up the antiviral drug development process. The successful design of antiviral drugs like Relenza, Saquinavir and Tamiflu has validated the application of these techniques, which hold a bright future in the drug discovery protocol. This review explores the role of CADD in antiviral drug development and highlights the recent advances in antiviral drug research using computer-aided structure-based approaches.
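Virtual screening pipelines of the kind this review surveys usually begin with a cheap drug-likeness pre-filter before any docking is attempted. A minimal sketch of one such filter, Lipinski's rule of five (the candidate names and property values below are hypothetical, and for simplicity this version requires all four criteria rather than allowing one violation as in the original formulation):

```python
def passes_lipinski(mol_weight, logp, h_donors, h_acceptors):
    """Lipinski rule-of-five pre-filter: molecular weight <= 500 Da,
    logP <= 5, <= 5 H-bond donors, <= 10 H-bond acceptors."""
    return (mol_weight <= 500.0 and logp <= 5.0
            and h_donors <= 5 and h_acceptors <= 10)

# Hypothetical candidate molecules: (MW, logP, donors, acceptors)
candidates = {
    "cand_A": (320.0, 2.1, 2, 5),
    "cand_B": (710.0, 6.3, 7, 12),
}
hits = [name for name, props in candidates.items() if passes_lipinski(*props)]
```

Only candidates surviving such filters proceed to the more expensive docking and 3D-QSAR stages, which is how CADD trims the cost of the discovery protocol.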

  9. Advanced scientific computational methods and their applications of nuclear technologies. (1) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (1)

    International Nuclear Information System (INIS)

    Oka, Yoshiaki; Okuda, Hiroshi

    2006-01-01

Scientific computational methods have advanced remarkably with the progress of nuclear development. They have served as the weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has therefore been prepared in serial form. This is the first issue, presenting an overview and an introduction to continuum simulation methods. The finite element method is also reviewed as one of their applications. (T. Tanaka)

  10. Conceptual study of advanced PWR core design

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chang, Moon Hee; Kim, Keung Ku; Joo, Hyung Kuk; Kim, Young Il; Noh, Jae Man; Hwang, Dae Hyun; Kim, Taek Kyum; Yoo, Yon Jong

    1997-09-01

The purpose of this project is to develop and verify core design concepts with enhanced safety and economy, together with the associated methodologies for core analyses. From a study of the state of the art of foreign advanced reactor cores, we developed core concepts such as a soluble-boron-free, highly convertible and enhanced-safety core loaded with semi-tight-lattice hexagonal fuel assemblies. To analyze this hexagonal core, we developed and verified several neutronic and thermal-hydraulic (T/H) analysis methodologies. The HELIOS code was adopted as the assembly code and the HEXFEM code was developed for hexagonal core analysis. Based on experimental data in hexagonal lattices and the COBRA-IV-I code, we developed a thermal-hydraulic analysis code for hexagonal lattices. Using the core analysis code systems developed in this project, we designed a 600 MWe core and studied the feasibility of the core concepts. Two additional scopes were addressed in this project: a study on the operational strategies of a soluble-boron-free core, and a conceptual design of a large-scale passive core. By using the axial BP zoning concept and a suitable design of control rods, this project showed that it is possible to design a soluble-boron-free core in a 600 MWe PWR. The results of the large-scale core design showed that passive concepts and daily load-follow operation could be practiced. (author). 15 refs., 52 tabs., 101 figs.

  11. Conceptual study of advanced PWR core design

    International Nuclear Information System (INIS)

    Kim, Young Jin; Chang, Moon Hee; Kim, Keung Ku; Joo, Hyung Kuk; Kim, Young Il; Noh, Jae Man; Hwang, Dae Hyun; Kim, Taek Kyum; Yoo, Yon Jong.

    1997-09-01

The purpose of this project is to develop and verify core design concepts with enhanced safety and economy, together with the associated methodologies for core analyses. From a study of the state of the art of foreign advanced reactor cores, we developed core concepts such as a soluble-boron-free, highly convertible and enhanced-safety core loaded with semi-tight-lattice hexagonal fuel assemblies. To analyze this hexagonal core, we developed and verified several neutronic and thermal-hydraulic (T/H) analysis methodologies. The HELIOS code was adopted as the assembly code and the HEXFEM code was developed for hexagonal core analysis. Based on experimental data in hexagonal lattices and the COBRA-IV-I code, we developed a thermal-hydraulic analysis code for hexagonal lattices. Using the core analysis code systems developed in this project, we designed a 600 MWe core and studied the feasibility of the core concepts. Two additional scopes were addressed in this project: a study on the operational strategies of a soluble-boron-free core, and a conceptual design of a large-scale passive core. By using the axial BP zoning concept and a suitable design of control rods, this project showed that it is possible to design a soluble-boron-free core in a 600 MWe PWR. The results of the large-scale core design showed that passive concepts and daily load-follow operation could be practiced. (author). 15 refs., 52 tabs., 101 figs.

  12. Computational study of a dynamic contact problem

    Directory of Open Access Journals (Sweden)

    Jigarkumar Patel

    2013-10-01

    Full Text Available. In this article, we describe a computational framework to study the influence of a normal crack on the dynamics of a cantilever beam, i.e., changes in its natural frequency, amplitude, period of vibration, etc.

  13. NATO Advanced Study Institute on Electron Crystallography

    CERN Document Server

    Weirich, Thomas E; Zou, Xiaodong

    2006-01-01

    During the last decade we have witnessed several exciting achievements in electron crystallography. These include structural and charge-density studies on organic molecules, on complicated inorganic and metallic materials in the amorphous, nano-, meso- and quasi-crystalline state, and also the development of new software tailor-made for the special needs of electron crystallography. Moreover, these developments have been accompanied by a newly available generation of computer-controlled electron microscopes equipped with highly coherent field-emission sources, cryo-specimen holders, ultra-fast CCD cameras, imaging plates, energy filters and even correctors for electron-optical distortions. Thus, fast and semi-automatic data acquisition from small sample areas, similar to what we know today from imaging-plate diffraction systems in X-ray crystallography, can be envisioned for the very near future. This progress clearly shows that the contribution of electron crystallography is quite unique, as it enables to r...

  14. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    Science.gov (United States)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).

  15. Computational Studies of Drug Resistance

    DEFF Research Database (Denmark)

    da Silva Martins, João Miguel

    Drug resistance has been an increasing problem in patient treatment and drug development. Starting in the last century and becoming a major worry in the medical and scientific communities in the early part of the current millennium, major research must be performed to address the issues of viral...... is of the utmost importance in developing better and less resistance-inducing drugs. A drug's influence can be characterized in many different ways, however, and the approaches I take in this work reflect those same different influences. This is what I try to achieve in this work, through seemingly unrelated...... approaches that come together in the study of drugs and their influence on proteins and vice-versa. In part I, I aim to understand, through combined theoretical ensemble analysis and free energy calculations, the effects mutations have on the binding affinity and function of the M2 proton channel. This research...

  16. An expanded framework for the advanced computational testing and simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Marques, Osni A.; Drummond, Leroy A.

    2003-11-09

    The Advanced Computational Testing and Simulation (ACTS) Toolkit is a set of computational tools developed primarily at DOE laboratories and is aimed at simplifying the solution of common and important computational problems. The use of the tools reduces the development time for new codes, and the tools provide functionality that might not otherwise be available. This document outlines an agenda for expanding the scope of the ACTS Project based on lessons learned from current activities. Highlights of this agenda include peer-reviewed certification of new tools; finding tools to solve problems that are not currently addressed by the Toolkit; working in collaboration with other software initiatives and DOE computer facilities; expanding outreach efforts; promoting interoperability and further development of the tools; and improving functionality of the ACTS Information Center, among other tasks. The ultimate goal is to make the ACTS tools more widely used and more effective in solving DOE's and the nation's scientific problems through the creation of a reliable software infrastructure for scientific computing.

  17. 3-D Rat Brain Phantom for High-Resolution Molecular Imaging : Experimental studies aimed at advancing understanding of human brain disease and malfunction, and of behavior problems, may be aided by computer models of small laboratory animals

    NARCIS (Netherlands)

    Beekman, F.J.; Vastenhouw, B.; Van der Wilt, G.; Vervloet, M.; Visscher, R.; Booij, J.; Gerrits, M.; Ji, C.; Ramakers, R.; Van der Have, F.

    2009-01-01

    With the steadily improving resolution of novel small-animal single photon emission computed tomography (SPECT) and positron emission tomography devices, highly detailed phantoms are required for testing and optimizing these systems. We present a three-dimensional (3-D) digital and physical phantom

  18. An advanced course in computational nuclear physics bridging the scales from quarks to neutron stars

    CERN Document Server

    Lombardo, Maria; Kolck, Ubirajara

    2017-01-01

    This graduate-level text collects and synthesizes a series of ten lectures on the nuclear quantum many-body problem. Starting from our current understanding of the underlying forces, it presents recent advances within the field of lattice quantum chromodynamics before going on to discuss effective field theories, central many-body methods like Monte Carlo methods, coupled cluster theories, the similarity renormalization group approach, Green’s function methods and large-scale diagonalization approaches. Algorithmic and computational advances show particular promise for breakthroughs in predictive power, including proper error estimates, a better understanding of the underlying effective degrees of freedom and of the respective forces at play. Enabled by recent improvements in theoretical, experimental and numerical techniques, the state-of-the art applications considered in this volume span the entire range, from our smallest components – quarks and gluons as the mediators of the strong force – to the c...

  19. Uncertainty quantification an accelerated course with advanced applications in computational engineering

    CERN Document Server

    Soize, Christian

    2017-01-01

    This book presents the fundamental notions and advanced mathematical tools for the stochastic modeling of uncertainties and their quantification in large-scale computational models in science and engineering. In particular, it focuses on parametric and non-parametric uncertainties, with applications to the structural dynamics and vibroacoustics of complex mechanical systems, and to the micromechanics and multiscale mechanics of heterogeneous materials. Resulting from a course developed by the author, the book begins with a description of the fundamental mathematical tools of probability and statistics that are directly useful for uncertainty quantification. It proceeds with a well-organized description of some basic and advanced methods for constructing stochastic models of uncertainties, paying particular attention to the problem of calibrating and identifying a stochastic model of uncertainty when experimental data are available. This book is intended to be a graduate-level textbook for stu...

  20. Computed Tomography Venography diagnosis of iliocaval venous obstruction in advanced chronic venous insufficiency

    Directory of Open Access Journals (Sweden)

    Fabio Henrique Rossi

    2014-12-01

    Full Text Available. Objective: Iliocaval obstruction is associated with venous hypertension symptoms and may predispose to deep venous thrombosis (DVT). Ultrasonography may fail to achieve noninvasive diagnosis of these obstructions. The possibility of using Computed Tomography Venography (CTV) for these diagnoses is under investigation. Methods: Patients with CVI graded at CEAP clinical classes 3 to 6 and previous treatment failure underwent evaluation with CTV. Percentage obstruction was rated by two independent examiners. Obstruction prevalence and its associations with risk factors and CEAP classification were analyzed. Results: A total of 112 limbs were prospectively evaluated. Mean patient age was 55.8 years and 75.4% were women. Obstructions involved the left lower limb in 71.8% of cases, and 35.8% of patients reported a medical history of deep venous thrombosis. Overall, 57.1% of imaging studies demonstrated venous obstruction of at least 50%, and 10.7% showed obstruction of >80%. The only risk factor found to be independently associated with a significantly higher incidence of >50% venous obstruction was a medical history of DVT (p=0.035, Fisher's exact test). There was a positive relationship between clinical classification (CEAP) and degree of venous obstruction in the limbs studied (chi-square test for linear trend; p=0.011). Conclusion: Patients with advanced CVI are often affected by obstructions in the iliocaval venous territory, and CTV is able to diagnose the degree of obstruction. There is a positive association between degree of obstruction and both a previous history of DVT and the severity of CVI symptoms.
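    The Fisher's exact test used in the study above for the DVT-obstruction association can be computed directly from a 2x2 contingency table via the hypergeometric distribution. A minimal self-contained sketch; the counts passed in at the bottom are hypothetical illustrations, not the study's data:

    ```python
    from math import comb

    def fisher_exact_two_sided(a, b, c, d):
        """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

        Sums the hypergeometric probabilities of every table with the same
        margins whose probability does not exceed that of the observed table.
        """
        row1, row2, col1 = a + b, c + d, a + c
        n = row1 + row2

        def p_table(x):
            # Probability of the table whose top-left cell equals x.
            return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

        p_obs = p_table(a)
        lo, hi = max(0, col1 - row2), min(col1, row1)
        return sum(p_table(x) for x in range(lo, hi + 1)
                   if p_table(x) <= p_obs + 1e-12)

    # Hypothetical counts: rows = DVT history yes/no,
    # columns = obstruction >=50% yes/no.
    p = fisher_exact_two_sided(30, 10, 34, 38)
    ```

    For the classic "lady tasting tea" table [[3, 1], [1, 3]] this returns 34/70, about 0.486, matching the textbook value; where SciPy is available, `scipy.stats.fisher_exact` computes the same quantity.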

  1. Computational mechanics - Advances and trends; Proceedings of the Session - Future directions of Computational Mechanics of the ASME Winter Annual Meeting, Anaheim, CA, Dec. 7-12, 1986

    Science.gov (United States)

    Noor, Ahmed K. (Editor)

    1986-01-01

    The papers contained in this volume provide an overview of the advances made in a number of aspects of computational mechanics, identify some of the anticipated industry needs in this area, discuss the opportunities provided by new hardware and parallel algorithms, and outline some of the current government programs in computational mechanics. Papers are included on advances and trends in parallel algorithms, supercomputers for engineering analysis, material modeling in nonlinear finite-element analysis, the Navier-Stokes computer, and future finite-element software systems.

  2. Advanced scientific computational methods and their applications to nuclear technologies. (4) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (4)

    International Nuclear Information System (INIS)

    Sekimura, Naoto; Okita, Taira

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was therefore prepared in serial form. This is the fourth issue, presenting an overview of scientific computational methods along with an introduction to continuum simulation methods and their applications. Simulation methods for physical radiation effects on materials are reviewed, covering approaches such as the binary collision approximation, molecular dynamics, the kinetic Monte Carlo method, the reaction rate method and dislocation dynamics. (T. Tanaka)

  3. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    Energy Technology Data Exchange (ETDEWEB)

    Hey, Tony [eScience Institute, University of Washington; Agarwal, Deborah [Lawrence Berkeley National Laboratory; Borgman, Christine [University of California, Los Angeles; Cartaro, Concetta [SLAC National Accelerator Laboratory; Crivelli, Silvia [Lawrence Berkeley National Laboratory; Van Dam, Kerstin Kleese [Pacific Northwest National Laboratory; Luce, Richard [University of Oklahoma; Arjun, Shankar [CADES, Oak Ridge National Laboratory; Trefethen, Anne [University of Oxford; Wade, Alex [Microsoft Research, Microsoft Corporation; Williams, Dean [Lawrence Livermore National Laboratory

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI), beginning by assessing the quality and effectiveness of OSTI’s recent and current products and services and commenting on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.

  4. Advancements in remote physiological measurement and applications in human-computer interaction

    Science.gov (United States)

    McDuff, Daniel

    2017-04-01

    Physiological signals are important for tracking health and emotional states. Imaging photoplethysmography (iPPG) is a set of techniques for remotely recovering cardio-pulmonary signals from video of the human body. Advances in iPPG methods over the past decade, combined with the ubiquity of digital cameras, present the possibility of many new, low-cost applications of physiological monitoring. This talk will highlight methods for recovering physiological signals, work characterizing the impact of video parameters and hardware on these measurements, and applications of this technology in human-computer interfaces.

  5. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  6. Advances in Intelligent Modelling and Simulation Artificial Intelligence-Based Models and Techniques in Scalable Computing

    CERN Document Server

    Khan, Samee; Burczyński, Tadeusz

    2012-01-01

    One of the most challenging issues in today’s large-scale computational modeling and design is to effectively manage complex distributed environments, such as computational clouds, grids, ad hoc and P2P networks, operating under various types of users with evolving relationships fraught with uncertainties. In this context, the IT resources and services usually belong to different owners (institutions, enterprises, or individuals) and are managed by different administrators. Moreover, uncertainties are presented to the system at hand in various forms of information that is incomplete, imprecise, fragmentary, or overloading, which hinders the full and precise resolution of the evaluation criteria, the subsequent ranking and selection, and the assignment of scores. Intelligent scalable systems enable flexible routing and charging, advanced user interactions, and the aggregation and sharing of geographically distributed resources in modern large-scale systems.   This book presents new ideas, theories, models...

  7. Review of research on advanced computational science in FY2010-2014

    International Nuclear Information System (INIS)

    2016-03-01

    Research on advanced computational science for nuclear applications, based on 'the plan for meeting the mid-term goal of the Japan Atomic Energy Agency', has been performed at the Center for Computational Science and e-Systems (CCSE), Japan Atomic Energy Agency. CCSE established a committee consisting of outside experts and authorities, which evaluates the research and provides advice in support of the research and development. This report summarizes the following: (1) results of the R and D performed at CCSE in the period of the mid-term plan (April 1st, 2010 - March 31st, 2015); (2) results of the committee's evaluation of that R and D over the same period. (author)

  8. Further development of the Dynamic Control Assemblies Worth Measurement Method for Advanced Reactivity Computers

    International Nuclear Information System (INIS)

    Petenyi, V.; Strmensky, C.; Jagrik, J.; Minarcin, M.; Sarvaic, I.

    2005-01-01

    The dynamic control assemblies worth measurement technique is a quick method for validating predicted control assemblies worth. The dynamic measurement utilizes space-time corrections, calculated by the DYN 3D computer code, for the measured out-of-core ionization chamber readings. The space-time correction arising from the prompt neutron density redistribution in the measured ionization chamber reading can be applied directly in the advanced reactivity computer. The second correction, concerning the difference in the spatial distribution of delayed neutrons, can be calculated by simulating the measurement procedure with the dynamic version of the DYN 3D code. In the paper, some results of dynamic control assemblies worth measurements applied at NPP Mochovce are presented. (Authors)

  9. Recent advances in transient imaging: A computer graphics and vision perspective

    Directory of Open Access Journals (Sweden)

    Adrian Jarabo

    2017-03-01

    Full Text Available. Transient imaging has recently made a huge impact in the computer graphics and computer vision fields. By capturing, reconstructing, or simulating light transport at extreme temporal resolutions, researchers have proposed novel techniques to show movies of light in motion, see around corners, detect objects in highly scattering media, or infer material properties from a distance, to name a few. The key idea is to leverage the wealth of information in the temporal domain at picosecond or nanosecond resolution, information usually lost during capture-time temporal integration. This paper presents recent advances in the field of transient imaging from a graphics and vision perspective, including capture techniques, analysis, applications and simulation. Keywords: Transient imaging, Ultrafast imaging, Time-of-flight

  10. Advanced scientific computational methods and their applications to nuclear technologies. (3) Introduction of continuum simulation methods and their applications (3)

    International Nuclear Information System (INIS)

    Satake, Shin-ichi; Kunugi, Tomoaki

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was therefore prepared in serial form. This is the third issue, presenting an introduction to continuum simulation methods and their applications. Spectral methods and multi-interface calculation methods in fluid dynamics are reviewed. (T. Tanaka)

  11. Using Computer-Assisted Argumentation Mapping to develop effective argumentation skills in high school advanced placement physics

    Science.gov (United States)

    Heglund, Brian

    Educators recognize the importance of reasoning ability for the development of critical thinking skills, conceptual change, metacognition, and participation in 21st century society. There is a recognized need for students to improve their skills of argumentation; however, argumentation is not explicitly taught outside logic and philosophy---subjects that are not part of the K-12 curriculum. One potential way of supporting the development of argumentation skills in the K-12 context is through incorporating Computer-Assisted Argumentation Mapping to evaluate arguments. This quasi-experimental study tested the effects of such argument mapping software and was informed by the following two research questions: 1. To what extent does the collaborative use of Computer-Assisted Argumentation Mapping to evaluate competing theories influence the critical thinking skill of argument evaluation, metacognitive awareness, and conceptual knowledge acquisition in high school Advanced Placement physics, compared to the more traditional method of text tables that does not employ Computer-Assisted Argumentation Mapping? 2. What are the student perceptions of the pros and cons of argument evaluation in the high school Advanced Placement physics environment? This study examined changes in critical thinking skills, including argument evaluation skills, as well as metacognitive awareness and conceptual knowledge, in two groups: a treatment group using Computer-Assisted Argumentation Mapping to evaluate physics arguments, and a comparison group using text tables to evaluate physics arguments. Quantitative and qualitative methods for collecting and analyzing data were used to answer the research questions. Quantitative data indicated no significant difference between the experimental groups, and qualitative data suggested students perceived pros and cons of argument evaluation in the high school Advanced Placement physics environment, such as a self-reported sense of improvement in argument

  12. Advanced control of a water supply system : A case study

    NARCIS (Netherlands)

    Bakker, M.; Rajewicz, T.; Kien, H.; Vreeburg, J.H.G.; Rietveld, L.C.

    2014-01-01

    Conventional automatic production flow control and pump pressure control of water supply systems are robust and simple: production flow is controlled based on the level in the clear water reservoir, and pump pressure is controlled to a static set-point. Recently, more advanced computer-based control

  13. Advanced computational sensors technology: testing and evaluation in visible, SWIR, and LWIR imaging

    Science.gov (United States)

    Rizk, Charbel G.; Wilson, John P.; Pouliquen, Philippe

    2015-05-01

    The Advanced Computational Sensors Team at the Johns Hopkins University Applied Physics Laboratory and the Johns Hopkins University Department of Electrical and Computer Engineering has been developing advanced readout integrated circuit (ROIC) technology for more than 10 years with a particular focus on the key challenges of dynamic range, sampling rate, system interface and bandwidth, and detector materials or band dependencies. Because the pixel array offers parallel sampling by default, the team successfully demonstrated that adding smarts in the pixel and the chip can increase performance significantly. Each pixel becomes a smart sensor and can operate independently in collecting, processing, and sharing data. In addition, building on the digital circuit revolution, the effective well size can be increased by orders of magnitude within the same pixel pitch over analog designs. This research has yielded an innovative class of a system-on-chip concept: the Flexible Readout and Integration Sensor (FRIS) architecture. All key parameters are programmable and/or can be adjusted dynamically, and this architecture can potentially be sensor and application agnostic. This paper reports on the testing and evaluation of one prototype that can support either detector polarity and includes sample results with visible, short-wavelength infrared (SWIR), and long-wavelength infrared (LWIR) imaging.

  14. A Delphi study to validate an advanced practice nursing tool.

    Science.gov (United States)

    Chang, Anne M; Gardner, Glenn E; Duffield, Christine; Ramis, Mary-Anne

    2010-10-01

    This paper is a report of a study conducted to validate an instrument for measuring advanced practice nursing role delineation in an international contemporary health service context using the Delphi technique. Although most countries now have clear definitions and competency standards for nurse practitioners, no such clarity exists for many advanced practice nurse roles, leaving healthcare providers uncertain whether their service needs can or should be met by an advanced practice nurse or a nurse practitioner. The validation of a tool depicting advanced practice nursing is essential for the appropriate deployment of advanced practice nurses. This paper is the second in a three-phase study to develop an operational framework for assigning advanced practice nursing roles. An expert panel was established to review the activities in the Strong Model of Advanced Practice Role Delineation tool. Using the Delphi technique, data were collected via an on-line survey through a series of iterative rounds in 2008. Feedback and statistical summaries of responses were distributed to the panel until the 75% consensus cut-off was obtained. After three rounds and modification of five activities, consensus was obtained for validation of the content of this tool. The Strong Model of Advanced Practice Role Delineation tool is valid for depicting the dimensions of practice of the advanced practice role in an international contemporary health service context thereby having the potential to optimize the utilization of the advanced practice nursing workforce. © 2010 The Authors. Journal of Advanced Nursing © 2010 Blackwell Publishing Ltd.

  15. Computed Tomography Study Of Complicated Bacterial Meningitis ...

    African Journals Online (AJOL)

    To monitor the structural intracranial complications of bacterial meningitis using computed tomography (CT) scan. Retrospective study of medical and radiological records of patients who underwent CT scan over a 4-year period. A University Teaching Hospital in a developing country. Thirty-three patients with clinically and ...

  16. Advances in the operation of the DIII-D neutral beam computer systems

    International Nuclear Information System (INIS)

    Phillips, J.C.; Busath, J.L.; Penaflor, B.G.; Piglowski, D.; Kellman, D.H.; Chiu, H.K.; Hong, R.M.

    1998-02-01

    The DIII-D neutral beam system routinely provides up to 20 MW of deuterium neutral beam heating in support of experiments on the DIII-D tokamak, and is a critical part of the DIII-D physics experimental program. The four computer systems previously used to control neutral beam operation and data acquisition were designed and implemented in the late 1970s and used on DIII and DIII-D from 1981-1996. By comparison to modern standards, they had become expensive to maintain, slow and cumbersome, making it difficult to implement improvements. Most critical of all, they were not networked computers. During the 1997 experimental campaign, these systems were replaced with new Unix-compliant hardware and, for the most part, commercially available software. This paper describes operational experience with the new neutral beam computer systems, and new advances made possible by using features not previously available. These include retention of and access to historical data, an asynchronously fired "rules" base, and a relatively straightforward programming interface. Methods and principles for extending the availability of data beyond the scope of the operator consoles will be discussed.

  17. Advanced nuclear systems. Review study; Fortgeschrittene Nuklearsysteme. Review Study

    Energy Technology Data Exchange (ETDEWEB)

    Liebert, Wolfgang; Glaser, Alexander; Pistner, Christoph [Interdisziplinaere Arbeitsgruppe Naturwissenschaft, Technik und Sicherheit (IANUS), Darmstadt University of Technology, Hochschulstrasse 10, D-64289 Darmstadt (Germany); Baehr, Roland; Hahn, Lothar [Institute for applied ecology (Oeko-Institut), Elisabethenstrasse 55-57, D-64283 Darmstadt (Germany)

    1999-04-01

    The task of this review study is to provide an overview of developments in the field of the various advanced nuclear systems, and to create the basis for more comprehensive technology assessment studies. In an overview, the concepts for advanced nuclear systems pursued worldwide are subdivided into eight subgroups. A coarse examination raster (set pattern) is developed to enable a detailed examination of the selected systems. In addition to a focus on enhanced safety features, further aspects are also taken into consideration, such as the lowering of the proliferation risk, the enhancement of the economic competitiveness of the facilities, and new usage possibilities (for instance, concerning the relaxation of the waste disposal problem or the use of fuels other than uranium). The question of the expected time span for realization and the discussion of the obstacles on the way to a commercially usable reactor also play a substantial role, as do disposal requirements as far as they can presently be recognized. In the central chapter of this study, the documentation of the representatively selected concepts is evaluated, together with existing technology assessment studies and expert opinions. In the few cases where this appears necessary, the relevant technical literature, further policy-advisory reports, expert statements and other relevant sources are taken into account. Contradictions, differing assessments and dissent in the literature, as well as a few unsettled questions, are thus indicated. The potential of advanced nuclear systems with respect to economic, societal and environmental objectives cannot be measured solely by the corresponding intrinsic, or comparatively remarkable, technical improvements. The acceptability of novel or improved systems in nuclear technology will have to be judged by how convincingly they solve the crucial questions of safety, nuclear waste and the risk of proliferation of nuclear weapons.

  18. Conceptual study of advanced PWR systems. A study of passive and inherent safety design concepts for advanced light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Soon Heung; No, Hee Cheon; Baek, Won Pil; Shim Young Jae; Lee, Goung Jin; Na, Man Gyun; Lee, Jae Young; Kim, Han Gon; Kang, Ki Sig; Moon, Sang Ki; Kim, Yun Il; Park, Jae Wook; Yang, Soo Hyung; Kim, Soo Hyung; Lee, Seong Wook; Kim, Hong Che; Park, Hyun Sik; Jeong, Ji Hwan; Lee, Sang Il; Jung, Hae Yong; Kim, Hyong Tae; Chae, Kyung Sun; Moon, Ki Hoon [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1995-08-01

    The five thermal-hydraulic concepts chosen for the advanced PWR have been studied as follows: (1) Critical Heat Flux: review of previous works, analysis of parametric trends, analysis of transient CHF characteristics, extension of the CHF data bank, survey and assessment of correlations, and design of an intermediate-pressure CHF test loop have been performed. (2) Passive Cooling Concepts for the Concrete Containment System: review of condensation phenomena with noncondensable gases, selection of a promising concept (i.e., use of external condensers), and design of a test loop according to scaling laws have been accomplished. (3) Computer programs based on the control-volume approach and the conceptual design of a test loop have been accomplished. (4) Fluidic Diode Concepts: review of previous applications of the concept, analysis of major parameters affecting the performance, development of a computational code, and conceptual investigation of the verification test loop have been performed. (5) Wet Thermal Insulator: review of previous works, selection of promising methods (i.e., ceramic fiber in a steel case and mirror-type insulator), and conceptual design of the experimental loop have been performed. (author). 9 refs.

  19. Eighteenth Workshop on Recent Developments in Computer Simulation Studies in Condensed Matter Physics

    CERN Document Server

    Landau, David P; Schüttler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVIII

    2006-01-01

    This volume represents a "status report" emanating from presentations made during the 18th Annual Workshop on Computer Simulation Studies in Condensed Matter Physics at the Center for Simulational Physics at the University of Georgia in March 2005. It provides a broad overview of the most recent advances in the field, spanning the range from statistical physics to soft condensed matter and biological systems. Results on nanostructures and materials are included, as are several descriptions of advances in quantum simulations and quantum computing, as well as methodological advances.

  20. Studying Kv Channels Function using Computational Methods.

    Science.gov (United States)

    Deyawe, Audrey; Kasimova, Marina A; Delemotte, Lucie; Loussouarn, Gildas; Tarek, Mounir

    2018-01-01

    In recent years, molecular modeling techniques, combined with MD simulations, have provided significant insights into the intrinsic properties of voltage-gated potassium (Kv) channels. Among the success stories are the highlighting of molecular-level details of the effects of mutations, the unraveling of several metastable intermediate states, and the influence of a particular lipid, PIP2, on the stability and modulation of Kv channel function. These computational studies offered a detailed view that could not have been reached through experimental studies alone. With the increase in cross-disciplinary studies, numerous experiments have provided validation of these computational results, which increases the reliability of molecular modeling for the study of Kv channels. This chapter offers a description of the main techniques used to model Kv channels at the atomistic level.

  1. Computational Studies in Molecular Geochemistry and Biogeochemistry

    Energy Technology Data Exchange (ETDEWEB)

    Felmy, Andrew R.; Bylaska, Eric J.; Dixon, David A.; Dupuis, Michel; Halley, James W.; Kawai, R.; Rosso, Kevin M.; Rustad, James R.; Smith, Paul E.; Straatsma, TP; Voth, Gregory A.; Weare, John H.; Yuen, David A.

    2006-04-18

    The ability to predict the transport and transformations of contaminants within the subsurface is critical for decisions on virtually every waste disposal option facing the Department of Energy (DOE), from remediation technologies such as in situ bioremediation to evaluations of the safety of nuclear waste repositories. With this fact in mind, the DOE has recently sponsored a series of workshops on the development of a Strategic Simulation Plan on applications of high performance computing to national problems of significance to the DOE. One of the areas selected for application was subsurface transport and environmental chemistry. Within the SSP on subsurface transport and environmental chemistry, several areas were identified where applications of high performance computing could significantly advance our knowledge of contaminant fate and transport. Within each of these areas, molecular-level simulations were specifically identified as a key capability necessary for the development of a fundamental mechanistic understanding of complex biogeochemical processes. This effort consists of a series of specific molecular-level simulations and program development in four key areas of geochemistry/biogeochemistry (i.e., aqueous hydrolysis, redox chemistry, mineral surface interactions, and microbial surface properties). By addressing these four different, but computationally related, areas it becomes possible to assemble a team of investigators with the necessary expertise in high performance computing, molecular simulation, and geochemistry/biogeochemistry to make significant progress in each area. The specific targeted geochemical/biogeochemical issues include: Microbial surface mediated processes: the effects of lipopolysaccharides present on Gram-negative bacteria. Environmental redox chemistry: dechlorination pathways of carbon tetrachloride and other polychlorinated compounds in the subsurface. Mineral surface interactions: Describing

  2. A Meta-Analysis of Advance-Organizer Studies.

    Science.gov (United States)

    Stone, Carol Leth

    Long-term studies of advance organizers (AOs) were analyzed with Glass's meta-analysis technique. AOs were defined as bridges from the reader's previous knowledge to what is to be learned. The results were compared with predictions from Ausubel's model of assimilative learning. The results of the study indicated that advance organizers were associated…

  3. National Computing Studies Summit: Open Learning Approaches to Computing Studies--An ACCE Discussion Paper

    Science.gov (United States)

    Webb, Ian

    2008-01-01

    In 2005 the Australian Council for Computers in Education (ACCE) was successful in obtaining a grant from the National Centre of Science, Information and Communication Technology and Mathematics Education for Rural and Regional Australia (SiMERR) to undertake the Computing Studies Teachers Network Rural and Regional Focus Project. The project had five…

  4. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication, commensurate with user requirements and available data.

  5. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented at the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book, but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses at different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  6. Advanced parallel computing for the coupled PCR-GLOBWB-MODFLOW model

    Science.gov (United States)

    Verkaik, Jarno; Schmitz, Oliver; Sutanudjaja, Edwin

    2017-04-01

    PCR-GLOBWB (https://github.com/UU-Hydro/PCR-GLOBWB_model) is a large-scale hydrological model intended for global to regional studies and developed at the Department of Physical Geography, Utrecht University (Netherlands). The latest version of the model can simulate terrestrial hydrological and water resource fluxes and storages with a typical spatial resolution of 5 arc-minutes (less than 10 km) at the global extent. One of the recent features in the model development is the inclusion of a global 2-layer MODFLOW model simulating groundwater lateral flow. This advanced feature enables us to simulate and assess groundwater head dynamics at the global extent, including regions with declining groundwater heads. Unfortunately, the current coupled PCR-GLOBWB-MODFLOW model requires long run times, mainly attributable to the current inefficient parallel computing and coupling algorithm. In this work, we aim to improve it by setting up a favorable river-basin partitioning that reduces I/O communication and optimizes load balance between PCR-GLOBWB and MODFLOW. We also aim to replace the MODFLOW-2000 in the current coupled model with MODFLOW-USG. This will allow us to use the new Parallel Krylov Solver (PKS) that can run with the Message Passing Interface (MPI) and can be easily combined with Open Multi-Processing (OpenMP). The latest scaling test carried out on the Cartesius Dutch national supercomputer shows that the use of MODFLOW-USG and the new PKS solver can result in significant MODFLOW calculation speedups (up to 45). The encouraging result of this work opens a possibility for running the model with a more detailed setup and at higher resolution. As MODFLOW-USG supports both structured and unstructured grids, this includes an opportunity to have a next generation of the PCR-GLOBWB-MODFLOW model that has flexibility in grid design for its groundwater flow simulation (e.g. grid design can be used to focus along rivers and around wells, to discretize individual

  7. Advanced neural network-based computational schemes for robust fault diagnosis

    CERN Document Server

    Mrugalski, Marcin

    2014-01-01

    The present book is devoted to problems of adapting artificial neural networks to robust fault diagnosis schemes. It presents neural network-based modelling and estimation techniques used for designing robust fault diagnosis schemes for non-linear dynamic systems. A part of the book focuses on fundamental issues such as architectures of dynamic neural networks, methods for designing neural networks and fault diagnosis schemes, as well as the importance of robustness. The book is of tutorial value and can be perceived as a good starting point for newcomers to this field. The book is also devoted to advanced schemes for the description of neural model uncertainty. In particular, methods for computing neural network uncertainty with robust parameter estimation are presented. Moreover, a novel approach to system identification with the state-space GMDH neural network is delivered. All the concepts described in this book are illustrated by both simple academic illustrative examples and practica...

  8. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles [Argonne National Lab. (ANL), Argonne, IL (United States); Bell, Greg [ESnet, Berkeley, CA (United States); Canon, Shane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [ESnet, Berkeley, CA (United States); Dattoria, Vince [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Goodwin, Dave [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Lee, Jason [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hicks, Susan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Holohan, Ed [Argonne National Lab. (ANL), Argonne, IL (United States); Klasky, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lauzon, Carolyn [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Rogers, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Skinner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tierney, Brian [ESnet, Berkeley, CA (United States)

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  9. Development of an Advanced Computational Model for OMCVD of Indium Nitride

    Science.gov (United States)

    Cardelino, Carlos A.; Moore, Craig E.; Cardelino, Beatriz H.; Zhou, Ning; Lowry, Sam; Krishnan, Anantha; Frazier, Donald O.; Bachmann, Klaus J.

    1999-01-01

    An advanced computational model is being developed to predict the formation of indium nitride (InN) film from the reaction of trimethylindium (In(CH3)3) with ammonia (NH3). The components are introduced into the reactor in the gas phase within a background of molecular nitrogen (N2). Organometallic chemical vapor deposition occurs on a heated sapphire surface. The model simulates heat and mass transport with gas and surface chemistry under steady state and pulsed conditions. The development and validation of an accurate model for the interactions between the diffusion of gas phase species and surface kinetics is essential to enable the regulation of the process in order to produce a low defect material. The validation of the model will be performed in concert with a NASA-North Carolina State University project.

  10. Advanced Computational Modeling of Vapor Deposition in a High-Pressure Reactor

    Science.gov (United States)

    Cardelino, Beatriz H.; Moore, Craig E.; McCall, Sonya D.; Cardelino, Carlos A.; Dietz, Nikolaus; Bachmann, Klaus

    2004-01-01

    In search of novel approaches to produce new materials for electro-optic technologies, advances have been achieved in the development of computer models for vapor deposition reactors in space. Numerical simulations are invaluable tools for costly and difficult processes, such as those experiments designed for high pressures and microgravity conditions. Indium nitride is a candidate compound for high-speed laser and photo diodes for optical communication system, as well as for semiconductor lasers operating into the blue and ultraviolet regions. But InN and other nitride compounds exhibit large thermal decomposition at its optimum growth temperature. In addition, epitaxy at lower temperatures and subatmospheric pressures incorporates indium droplets into the InN films. However, surface stabilization data indicate that InN could be grown at 900 K in high nitrogen pressures, and microgravity could provide laminar flow conditions. Numerical models for chemical vapor deposition have been developed, coupling complex chemical kinetics with fluid dynamic properties.

  11. Proceedings of the NATO-Advanced Study Institute on Computer Aided Analysis of Rigid and Flexible Mechanical Systems Held in Troia, Portugal on 27 Jun-9 Jul, 1993. Volume 2. Contributed Papers

    Science.gov (United States)

    1993-07-09

    at National Institute for Aviation Research (NIAR) and Civil AeroMedical Institute (CAMI), were accomplished. A parametric study of the coefficients... DUPLICATION OF EXPERIMENTS: Some experimental studies, including deceleration sled tests with forward-facing and pitch configurations, were performed at Civil ... Agência de Viagens IDMEC/IST - Instituto de Engenharia Mecânica, INSTITUTO SUPERIOR TÉCNICO

  12. Advance study of fiber-reinforced self-compacting concrete

    Science.gov (United States)

    Mironova, M.; Ivanova, M.; Naidenov, V.; Georgiev, I.; Stary, J.

    2015-10-01

    Incorporating steel macro- and micro-fiber reinforcement with a structural function into the concrete composition increases the degree of ductility of typically brittle cement-containing composites, which in some cases can completely or partially replace conventional steel reinforcement in the form of rods and meshes. This can reduce the manufacturing, detailing and placement of conventional reinforcement, which enhances the productivity and economic efficiency of the building process. In this paper, six self-compacting cement-containing compositions reinforced with different amounts of steel fiber are investigated. Results for some of their main strength-deformation characteristics are presented. An advanced approach for the study of the structural and material properties of this type of composite is proposed, using the methods of industrial computed tomography. The obtained original tomography results on the microstructure and characteristics of individual structural components make it possible to analyze the effective macro-characteristics of the studied composites. The resulting analytical data are relevant for the purposes of multi-dimensional modeling of these systems. A multifactor structural-mechanical analysis of the original scientific results obtained with different methods is proposed. A conclusion is presented on the capabilities and effectiveness of complex analysis in studies characterizing the properties of self-compacting fiber-reinforced concrete.

  13. Advancement of the state system of accounting for mainframe to personal computer (PC) technology

    International Nuclear Information System (INIS)

    Proco, G.; Nardi, J.

    1999-01-01

    The advancement of the U.S. government's state system of accounting from a mainframe computer to a personal computer (PC) has been successfully completed. The accounting system, from 1965 until 1995 a mainframe application, was replaced in September 1995 by an accounting system employing local area network (LAN) capabilities and other state-of-the-art characteristics. The system is called the Nuclear Materials Management and Safeguards System (NMMSS); it tracks nuclear material activities and provides accounting reports for a variety of government and private users. The uses of the system include not only the tracking of nuclear materials for international and domestic safeguards purposes but also facilitating the government's resource management purposes. The system was converted to PC hardware and fourth-generation software to improve upon the mainframe system. The change was motivated by the desire to have a system amenable to frequent modifications, to improve services to users, and to reduce increasing operating costs. Based on two years of operating the new system, it is clear that these objectives were met. Future changes to the system are inevitable, and the national system of accounting for nuclear materials has the technology base to meet the challenges with proven capability. (author)

  14. ABrIL - Advanced Brain Imaging Lab : a cloud based computation environment for cooperative neuroimaging projects.

    Science.gov (United States)

    Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo

    2014-01-01

    Neuroscience is an increasingly multidisciplinary and highly cooperative field in which neuroimaging plays an important role. The rapid evolution of neuroimaging demands a growing number of computing resources and skills that need to be put in place at every lab. Typically each group tries to set up its own servers and workstations to support its neuroimaging needs, having to learn everything from operating system management to the details of specific neuroscience software tools before any results can be obtained from each setup. This setup and learning process is replicated in every lab, even when a strong collaboration among several groups is going on. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services in an interactive, neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as the contribution to a high-impact paper published in the January issue of the Neuroimage journal. The ABrIL system has shown its applicability in several neuroscience projects at a relatively low cost, promoting truly collaborative actions and speeding up project results and their clinical applicability.

  15. Advanced practice nurses' scope of practice: a qualitative study of advanced clinical competencies.

    Science.gov (United States)

    Nieminen, Anna-Lena; Mannevaara, Bodil; Fagerström, Lisbeth

    2011-12-01

    To describe and explore Advanced Practice Nurses' clinical competencies and how these are expressed in clinical practice. Discussion concerning advanced clinical practice has been ongoing in the USA since the 1960s and in the UK since the late 1980s. Approximately 24 countries, excluding the USA, have implemented the role of Advanced Practice Nurse (APN). In the Nordic countries, especially Sweden and Finland, APNs have been introduced in some organizations but their competency domains have not yet been clearly defined. The study's theoretical framework emanates from Aristotle's three-dimensional view of knowledge, that is, epistêmê, technê, and phronesis. Between October 2005 and January 2006, focus group interviews of Clinical Nurse Specialists who provide expert functions in pediatric, internal medicine, and surgical units (n = 26) and APN students (n = 8) were conducted. The data material was analyzed using inductive content analysis. Grouped into five main themes, the study results indicate that APNs possess advanced-level clinical competencies in: (A) assessment of patients' caring needs and nursing care activities, (B) the caring relationship, (C) multi-professional teamwork, (D) development of competence and nursing care, and (E) leadership in a learning and caring culture. Clinical competencies consist of advanced skills, which typify an expanding role that offers new possibilities for holistic patient care practice. APNs' scope of practice is characterized by responsibility and competence in making autonomous judgments based on expanded clinical competence. On an advanced level, clinical competence consists not merely of advanced skills for assessing and meeting the needs of patients but also the creation of safe and trustful relationships with patients and collaboration with colleagues. APNs can realize advanced skills in their actions through their manner of knowing, doing, and being. © 2011 The Authors. Scandinavian Journal of Caring Sciences © 2011

  16. Integrated Computational Materials Engineering Development of Advanced High Strength Steel for Lightweight Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Hector, Jr., Louis G. [General Motors, Warren, MI (United States); McCarty, Eric D. [United States Automotive Materials Partnership LLC (USAMP), Southfield, MI (United States)

    2017-07-31

    The goal of the ICME 3GAHSS project was to successfully demonstrate the applicability of Integrated Computational Materials Engineering (ICME) for the development and deployment of third generation advanced high strength steels (3GAHSS) for immediate weight reduction in passenger vehicles. The ICME approach integrated results from well-established computational and experimental methodologies to develop a suite of material constitutive models (deformation and failure), manufacturing process and performance simulation modules, a properties database, as well as the computational environment linking them together for both performance prediction and material optimization. This is the Final Report for the ICME 3GAHSS project, which achieved the following objectives: 1) Developed a 3GAHSS ICME model, which includes atomistic, crystal plasticity, state variable and forming models. The 3GAHSS model was implemented in commercially available LS-DYNA and a user guide was developed to facilitate use of the model. 2) Developed and produced two 3GAHSS alloys using two different chemistries and manufacturing processes, for use in calibrating and validating the 3GAHSS ICME model. 3) Optimized the design of an automotive subassembly by substituting 3GAHSS for AHSS, yielding a design that met or exceeded all baseline performance requirements with a 30% mass savings. A technical cost model was also developed to estimate the cost per pound of weight saved when substituting 3GAHSS for AHSS. The project demonstrated the potential for 3GAHSS to achieve up to 30% weight savings in an automotive structure at a cost penalty of $0.32 to $1.26 per pound of weight saved. The 3GAHSS ICME model enables the user to design 3GAHSS to desired mechanical properties in terms of strength and ductility.

  17. Whole Body Computed Tomography with Advanced Imaging Techniques: A Research Tool for Measuring Body Composition in Dogs

    Directory of Open Access Journals (Sweden)

    Dharma Purushothaman

    2013-01-01

    The use of computed tomography (CT) to evaluate obesity in canines is limited. Traditional CT image analysis is cumbersome and uses prediction equations that require manual calculations. In order to overcome this, our study investigated the use of advanced image analysis software programs to determine body composition in dogs, with an application to canine obesity research. Beagles and greyhounds were chosen for their differences in morphology and propensity to obesity. Whole body CT scans with regular intervals were performed on six beagles and six greyhounds that were subjected to a 28-day weight-gain protocol. The CT images obtained at days 0 and 28 were analyzed using the software programs OsiriX, ImageJ, and AutoCAT. The CT scanning technique was able to differentiate bone, lean, and fat tissue in dogs and proved sensitive enough to detect increases in both lean and fat during weight gain over a short period. A significant difference in lean:fat ratio was observed between the two breeds on both days 0 and 28 (P < 0.01). Therefore, CT and advanced image analysis proved useful in the current study for the estimation of body composition in dogs and have the potential to be used in canine obesity research.

  18. Cosmos, an international center for advanced studies

    Science.gov (United States)

    Ryzhov, Iurii; Alifanov, Oleg; Sadin, Stanley; Coleman, Paul

    1990-01-01

    The concept of Cosmos, a Soviet operating center for aerospace activities, is presented. The main Cosmos participants are the Institute for Aerospace Education, the Institute for Research and Commercial Development, and the Department of Space Policy and Socio-Economic Studies. Cosmos sponsors a number of educational programs, basic research, and studies of the social impact of space-related technologies.

  19. Computer Aided Theragnosis Using Quantitative Ultrasound Spectroscopy and Maximum Mean Discrepancy in Locally Advanced Breast Cancer.

    Science.gov (United States)

    Gangeh, Mehrdad J; Tadayyon, Hadi; Sannachi, Lakshmanan; Sadeghi-Naini, Ali; Tran, William T; Czarnota, Gregory J

    2016-03-01

    A noninvasive computer-aided-theragnosis (CAT) system was developed for early therapeutic cancer response assessment in patients with locally advanced breast cancer (LABC) treated with neoadjuvant chemotherapy. The proposed CAT system was based on multi-parametric quantitative ultrasound (QUS) spectroscopic methods in conjunction with advanced machine learning techniques. Specifically, a kernel-based metric named maximum mean discrepancy (MMD), a technique for learning from imbalanced data based on random undersampling, and supervised learning were investigated with response-monitoring data from LABC patients. The CAT system was tested on 56 patients using statistical significance tests and leave-one-subject-out classification techniques. Textural features using state-of-the-art local binary patterns (LBP) and gray-scale intensity features were extracted from the spectral parametric maps in the proposed CAT system. The system indicated significant differences in changes between the responding and non-responding patient populations, as well as high accuracy, sensitivity, and specificity in discriminating between the two patient groups early after the start of treatment, i.e., on weeks 1 and 4 of several months of treatment. The proposed CAT system achieved an accuracy of 85%, 87%, and 90% on weeks 1, 4 and 8, respectively. The sensitivity and specificity of the developed CAT system at the same time points were 85%, 95%, 90% and 85%, 85%, 91%, respectively. The proposed CAT system thus establishes a noninvasive framework for monitoring cancer treatment response in tumors using clinical ultrasound imaging in conjunction with machine learning techniques. Such a framework can potentially facilitate the early detection of refractory responses to treatment during a course of therapy, to enable possibly switching to more efficacious treatments.
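    The maximum mean discrepancy metric named in this record is a kernel two-sample statistic; a minimal sketch with an RBF kernel follows (the function names, bandwidth, and toy data are illustrative assumptions, not the authors' implementation):

    ```python
    import numpy as np

    def rbf_kernel(a, b, gamma):
        # Pairwise squared Euclidean distances, then a Gaussian (RBF) kernel.
        d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
        return np.exp(-gamma * d2)

    def mmd2(x, y, gamma=1.0):
        """Biased estimate of squared maximum mean discrepancy between samples x and y."""
        kxx = rbf_kernel(x, x, gamma)
        kyy = rbf_kernel(y, y, gamma)
        kxy = rbf_kernel(x, y, gamma)
        return kxx.mean() + kyy.mean() - 2 * kxy.mean()

    rng = np.random.default_rng(0)
    # Two draws from the same distribution vs. a shifted distribution.
    same = mmd2(rng.normal(0, 1, (100, 2)), rng.normal(0, 1, (100, 2)))
    diff = mmd2(rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2)))
    ```

    With a fixed seed, the shifted pair yields a clearly larger discrepancy than the matched pair, which is the separation property a response-monitoring classifier would exploit.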

  20. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To fully utilize digital techniques for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. Current trends in such fields as optimization theory, control system theory and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as integral parts of solving systems to process large amounts of data, to implement control laws and even to support decision-making. Many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why energy models, energy databases and large computers are used; why economic process computers will be allocated to nuclear plants in the future; and why the super-computer should be demonstrated at once. (Mori, K.)

  1. The traveling salesman problem a computational study

    CERN Document Server

    Applegate, David L; Chvatal, Vasek; Cook, William J

    2006-01-01

    This book presents the latest findings on one of the most intensely investigated subjects in computational mathematics--the traveling salesman problem. It sounds simple enough: given a set of cities and the cost of travel between each pair of them, the problem challenges you to find the cheapest route by which to visit all the cities and return home to where you began. Though seemingly modest, this exercise has inspired studies by mathematicians, chemists, and physicists. Teachers use it in the classroom. It has practical applications in genetics, telecommunications, and neuroscience.
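    The problem statement above can be made concrete with a brute-force sketch on a toy four-city instance; the cost matrix is invented for illustration, and real studies such as this book's rely on far more sophisticated cutting-plane and branch-and-bound methods to handle large instances:

    ```python
    from itertools import permutations

    # Tiny symmetric instance: cost[i][j] = travel cost between cities i and j.
    cost = [
        [0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0],
    ]

    def tour_cost(tour):
        """Cost of visiting the cities in order and returning home."""
        return sum(cost[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

    def brute_force_tsp(n):
        """Fix city 0 as the start and try every ordering of the remaining cities."""
        best = min(permutations(range(1, n)), key=lambda p: tour_cost([0, *p]))
        return [0, *best], tour_cost([0, *best])

    tour, total = brute_force_tsp(4)  # cheapest round trip costs 18
    ```

    Enumerating all (n-1)! orderings is only feasible for tiny n, which is precisely why the problem has inspired so much research into smarter exact and heuristic methods.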

  2. Policy Studies on Bioethical Issues Advance

    Science.gov (United States)

    Chemical and Engineering News, 1976

    1976-01-01

    Describes the policies, operation, and some decisions of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research instituted to study the ethical aspects of scientific research. (MLH)

  3. Advances in clinical study of curcumin.

    Science.gov (United States)

    Yang, Chunfen; Su, Xun; Liu, Anchang; Zhang, Lin; Yu, Aihua; Xi, Yanwei; Zhai, Guangxi

    2013-01-01

    Curcumin has been evaluated as a potential agent for many diseases and has attracted great attention owing to its various pharmacological activities, including anti-cancer and anti-inflammatory effects. Curcumin is now being administered to patients with breast cancer, rheumatoid arthritis, Alzheimer's disease, colorectal cancer, psoriasis, etc. Several clinical trials have indicated that curcumin is sufficiently safe and effective. The objective of this article was to summarize the clinical studies of curcumin and provide a reference for future studies.

  4. Computer visualization for enhanced operator performance for advanced nuclear power plants

    International Nuclear Information System (INIS)

    Simon, B.H.; Raghavan, R.

    1993-01-01

    The operators of nuclear power plants are presented with an often uncoordinated and arbitrary array of displays and controls. Information is presented in different formats and on physically dissimilar instruments. In an accident situation, an operator must be very alert to quickly diagnose and respond to the state of the plant as represented by the control room displays. Improvements in display technology and increased automation have helped reduce operator burden; however, too much automation may lead to operator apathy and decreased efficiency. A proposed approach to the human-system interface uses modern graphics technology and advances in computational power to provide a visualization or "virtual reality" framework for the operator. This virtual reality comprises a simulated perception of another existence, complete with three-dimensional structures, backgrounds, and objects. By placing the operator in an environment that presents an integrated, graphical, and dynamic view of the plant, his attention is directly engaged. Through computer simulation, the operator can view plant equipment, read local displays, and manipulate controls as if he were in the local area. This process not only keeps an operator involved in plant operation and testing procedures, but also reduces personnel exposure. In addition, operator stress is reduced because, with realistic views of plant areas and equipment, the status of the plant can be accurately grasped without interpreting a large number of displays. Since a single operator can quickly "visit" many different plant areas without physically moving from the control room, these techniques are useful in reducing labor requirements for surveillance and maintenance activities. This concept requires a plant dynamic model continuously updated via real-time process monitoring. This model interacts with a three-dimensional, solid-model architectural configuration of the physical plant

  5. Pre-Flight Advanced Clothing Study

    Science.gov (United States)

    Orndoff, Evelyne; Poritz, Darwin; Schlesinger, Thilini; Byrne, Vicky

    2014-01-01

    All human space missions require significant logistical mass and volume that will become an excessive burden for long duration missions beyond low Earth orbit. The current International Space Station (ISS) crew wardrobe has already evolved not only to reduce some of the logistical burden but also to address crew preference. The present study was undertaken to find ways to further reduce this logistical burden while examining human responses to different types of clothing. The primary objective of the study is to measure how long people can wear the same exercise garment, depending on the type of fabric and the presence of antimicrobial treatment. The secondary objective is to assess the reasons for length of wear from perceptions of clothing characteristics, including nine ordinal scales. Cardiovascular exercise was chosen as the activity in this experiment for its profuse sweating effect and because it is considered a more severe treatment of the clothes than every-day usage. Study garments were exercise T-shirts and shorts purchased from various vendors. Fabric construction, fabric composition, and finishing treatment were defined as the key variables. A web-based questionnaire was used for self-reported data collection. The study was divided into three balanced experiments: a cotton-polyester-wool (CPW) T-shirt study with 61 participants, a polyester-modacrylic-polyester/cocona (PMC) T-shirt study with 40 participants, and a shorts study with 70 participants. In the CPW study, the T-shirts were made of 100% cotton, 100% polyester, or 100% wool, and categorized into open and tight knit constructions. In the PMC study, the T-shirts were made of 100% polyester, 82% modacrylic, or 95% polyester with 5% cocona fiber, without construction distinction. The shorts were made either of 100% cotton or of 100% polyester, and were knitted or woven. Some garments were treated with a Bio-Protect 500 antimicrobial finish according to the experimental design

  6. Advanced Hydraulic Studies on Enhancing Particle Removal

    DEFF Research Database (Denmark)

    He, Cheng

    The removal of suspended solids and attached pollutants is one of the main treatment processes in wastewater treatment. This thesis presents studies on the hydraulic conditions of various particle removal facilities for possible ways to increase their treatment capacity and performance by utilizing...... and improving hydraulic conditions. Unlike most traditional theses which usually focus only on one particular subject of study, this thesis contains four relatively independent studies which cover the following topics: a newly proposed particle settling enhancement plate, the redesign of the inlet zone......, Vortex Plate, were tested under various flows and settling conditions. Structure of the Vortex Plate consists of multiple long narrow parallel slots which are built on a flat plate. Vortices are generated by cross-flow passing the long narrow parallel slots. The Vortex Plate can be used in the same way...

  7. Advanced supplier partnership practices: a case study.

    Science.gov (United States)

    Williams, B R

    2000-05-01

    This article describes how a supplier partnership was set up to avoid the typical purchasing relationship--price being inversely proportional to quantity and having the purchaser take all the risk of product obsolescence. The case study also describes how rate-based replenishment replaced time-based delivery, and how all these advantages were achieved at reduced administrative costs.

  8. Advances in Epidemiological Studies of Herpes Zoster

    Directory of Open Access Journals (Sweden)

    Gu Xiaoming

    2015-12-01

    Mycoplasma genitalium (Mg) commonly causes nongonococcal urethritis and cervicitis. Mg is a fastidious bacterium whose isolation and culture are difficult and time-consuming. The lack of specificity of serological tests also hampers clinical research on Mg. With the development of molecular biology, polymerase chain reaction tests, which exhibit high sensitivities and specificities, became primary tools for foundational and clinical studies of Mg.

  9. Advances in Epidemiological Studies of Herpes Zoster

    OpenAIRE

    Gu Xiaoming

    2015-01-01

    Mycoplasma genitalium (Mg) commonly causes nongonococcal urethritis and cervicitis. Mg is a fastidious bacterium whose isolation and culture are difficult and time-consuming. The lack of specificity of serological tests also hampers clinical research on Mg. With the development of molecular biology, polymerase chain reaction tests, which exhibit high sensitivities and specificities, became primary tools for foundational and clinical studies of Mg.

  10. Imaging in rheumatoid arthritis--status and recent advances for magnetic resonance imaging, ultrasonography, computed tomography and conventional radiography

    DEFF Research Database (Denmark)

    Østergaard, Morten; Pedersen, Susanne Juhl; Dohn, U.M.

    2008-01-01

    , and have several documented and potential applications in RA patients. This chapter will review key aspects of the current status and recent important advances in imaging in RA, briefly discussing X-ray and computed tomography, and particularly focusing on MRI and US. Suggestions for use in clinical trials...

  11. Recent advances in Optical Computed Tomography (OCT) imaging system for three dimensional (3D) radiotherapy dosimetry

    Science.gov (United States)

    Rahman, Ahmad Taufek Abdul; Farah Rosli, Nurul; Zain, Shafirah Mohd; Zin, Hafiz M.

    2018-01-01

    Radiotherapy delivery techniques for cancer treatment are becoming more complex and highly focused, to enable accurate radiation dose delivery to the cancerous tissue and minimum dose to the healthy tissue adjacent to the tumour. Instruments that verify complex dose delivery in radiotherapy, such as optical computed tomography (OCT), measure the dose from a three-dimensional (3D) radiochromic dosimeter to ensure the accuracy of the radiotherapy beam delivered to the patient. OCT measures the optical density in radiochromic material, which changes predictably upon exposure to radiotherapy beams. OCT systems have been developed using a photodiode and a charge-coupled device (CCD) as the detector. The existing OCT imaging systems have limitations in terms of the accuracy and speed of the measurement. Advances in on-pixel-intelligence CMOS image sensors (CIS) will be exploited in this work to replace the current detectors in OCT imaging systems. CIS is capable of on-pixel signal processing at very fast imaging speeds (over several hundred images per second), which will allow improvement in the 3D measurement of optical density. The paper will review 3D radiochromic dosimeters and the OCT systems developed, and discuss how CMOS-based OCT imaging will provide accurate and fast optical density measurements in 3D. The paper will also discuss the configuration of the CMOS-based OCT developed in this work and how it may improve the existing OCT system.
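    A minimal sketch of the optical-density measurement the abstract describes, assuming the standard Beer-Lambert definition OD = log10(I0/I); the intensity values and the linear dose-calibration constants below are hypothetical, since real radiochromic dosimeters require per-batch calibration against known doses:

    ```python
    import math

    def optical_density(i_incident, i_transmitted):
        """Beer-Lambert optical density from incident and transmitted light intensity."""
        return math.log10(i_incident / i_transmitted)

    def dose_from_od(delta_od, slope=25.0, intercept=0.0):
        """Hypothetical linear calibration: dose [Gy] = slope * deltaOD + intercept."""
        return slope * delta_od + intercept

    # Pre- and post-irradiation scans of one voxel (illustrative intensities).
    od_before = optical_density(1000.0, 900.0)
    od_after = optical_density(1000.0, 450.0)
    dose = dose_from_od(od_after - od_before)
    ```

    An OCT scanner repeats this measurement along many projection angles and reconstructs a 3D optical-density map, which is then mapped to dose through the calibration curve.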

  12. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    Energy Technology Data Exchange (ETDEWEB)

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily-repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  13. [Advances in studies on bear bile powder].

    Science.gov (United States)

    Zhou, Chao-fan; Gao, Guo-jian; Liu, Ying

    2015-04-01

    In this paper, a detailed analysis was made of the relevant literature on bear bile powder in terms of chemical components, pharmacological effects and clinical efficacy, indicating bear bile powder's significant pharmacological effects and clinical application in treating various diseases. Owing to its complex composition, bear bile powder is relatively toxic. Therefore, efforts shall be made to study bear bile powder's pharmacological effects, clinical application, chemical composition and toxic side effects, with the aim of providing a scientific basis for widespread, reasonable clinical application of bear bile powder.

  14. Advanced Computational Thermal Fluid Physics (CTFP) and Its Assessment for Light Water Reactors and Supercritical Reactors

    International Nuclear Information System (INIS)

    D.M. McEligot; K. G. Condie; G. E. McCreery; H. M. McIlroy; R. J. Pink; L.E. Hochreiter; J.D. Jackson; R.H. Pletcher; B.L. Smith; P. Vukoslavcevic; J.M. Wallace; J.Y. Yoo; J.S. Lee; S.T. Ro; S.O. Park

    2005-01-01

    Background: The ultimate goal of the study is the improvement of predictive methods for safety analyses and design of Generation IV reactor systems such as supercritical water reactors (SCWR) for higher efficiency, improved performance and operation, design simplification, enhanced safety and reduced waste and cost. The objective of this Korean/US/laboratory/university collaboration of coupled fundamental computational and experimental studies is to develop the supporting knowledge needed for improved predictive techniques for use in the technology development of Generation IV reactor concepts and their passive safety systems. The present study emphasizes SCWR concepts in the Generation IV program

  15. Advanced Computational Thermal Fluid Physics (CTFP) and Its Assessment for Light Water Reactors and Supercritical Reactors

    Energy Technology Data Exchange (ETDEWEB)

    D.M. McEligot; K. G. Condie; G. E. McCreery; H. M. McIlroy; R. J. Pink; L.E. Hochreiter; J.D. Jackson; R.H. Pletcher; B.L. Smith; P. Vukoslavcevic; J.M. Wallace; J.Y. Yoo; J.S. Lee; S.T. Ro; S.O. Park

    2005-10-01

    Background: The ultimate goal of the study is the improvement of predictive methods for safety analyses and design of Generation IV reactor systems such as supercritical water reactors (SCWR) for higher efficiency, improved performance and operation, design simplification, enhanced safety and reduced waste and cost. The objective of this Korean / US / laboratory / university collaboration of coupled fundamental computational and experimental studies is to develop the supporting knowledge needed for improved predictive techniques for use in the technology development of Generation IV reactor concepts and their passive safety systems. The present study emphasizes SCWR concepts in the Generation IV program.

  16. Further advances in aging studies for RPCs

    CERN Document Server

    Aielli, G; Cardarelli, R; Di Ciaccio, A; Di Stante, L; Liberti, B; Paoloni, A; Pastori, E; Santonico, R

    2003-01-01

    Aging phenomena in RPCs have been studied since 1996 in the framework of the ATLAS experiment with small-size detectors irradiated with a 60Co source. The results showed a decrease of the detector rate capability at fixed electric field, due to an increase of the total resistance of the electrodes. This was confirmed by an extensive aging test performed on the ATLAS RPC "module-0" at GIF-X5, the CERN irradiation facility. A primary cause of this effect was previously shown to be the degradation of the anodic graphite coating, which distributes the electric field on the bakelite electrode. We present here a systematic study of the graphite aging which fully confirms this interpretation. Moreover, we show that detectors with an improved graphite coating gain a factor of at least two in lifetime. In the framework of these tests, we also show that the aging behavior of a detector working at high current induced by heavy irradiation can be reproduced by operating the detector, filled with pure Argon, in...

  17. Advances in the study of nodavirus

    Directory of Open Access Journals (Sweden)

    Chean Yeah Yong

    2017-09-01

    Nodaviruses are small bipartite RNA viruses belonging to the family Nodaviridae. They are categorized into alpha-nodaviruses, which infect insects, and beta-nodaviruses, which infect fish. Another distinct group of nodaviruses infects shrimps and prawns, and has been proposed as a gamma-nodavirus category. Our current review focuses mainly on recent studies performed on nodaviruses. Nodavirus can be transmitted vertically and horizontally. Recent outbreaks have been reported in China, Indonesia, Singapore and India, affecting the aquaculture industry; nodavirus also decreased the mullet stock in the Caspian Sea. Histopathology and transmission electron microscopy (TEM) are used to examine the presence of nodaviruses in infected fish and prawns. For classification, virus isolation followed by nucleotide sequencing is required. In contrast to partial sequence identification, profiling the whole transcriptome using next-generation sequencing (NGS) offers a more comprehensive comparison and characterization of the virus. For rapid diagnosis of nodavirus, assays targeting the viral RNA based on reverse-transcription PCR (RT-PCR), such as microfluidic chips, reverse-transcription loop-mediated isothermal amplification (RT-LAMP) and RT-LAMP coupled with lateral flow dipstick (RT-LAMP-LFD), have been developed. Besides viral RNA detection, diagnosis based on immunological assays such as enzyme-linked immunosorbent assay (ELISA), immunodot and Western blotting has also been reported. In addition, the immune responses of fish and prawns are also discussed. Overall, in fish, innate immunity, cellular type I interferon immunity and humoral immunity cooperatively prevent nodavirus infections, whereas prawns and shrimps adopt different immune mechanisms against nodavirus infections, through upregulation of superoxide anion, prophenoloxidase, superoxide dismutase (SOD), crustin, peroxinectin, anti-lipopolysaccharides and heat shock proteins (HSP). Potential vaccines ...

  18. Recent Advances in the Studies on Luotonins

    Directory of Open Access Journals (Sweden)

    Yurngdong Jahng

    2011-06-01

    Luotonins are alkaloids from the aerial parts of Peganum nigellastrum Bunge that display three major skeleton types. Luotonins A, B, and E are pyrroloquinazolino-quinoline alkaloids, luotonins C and D are canthin-6-one alkaloids, and luotonin F is a 4(3H)-quinazolinone alkaloid. All six luotonins have shown promising cytotoxicities towards selected human cancer cell lines, especially against leukemia P-388 cells. Luotonin A is the most active, with its activity stemming from topoisomerase I-dependent DNA cleavage. Such intriguing biological activities and unique structures have led not only to the development of methods for the efficient synthesis of these compounds, but also to interest in structural modifications for improving their biological properties. Recent progress in the study of luotonins is covered.

  19. Advanced Sensors and Applications Study (ASAS)

    Science.gov (United States)

    Chism, S. B.; Hughes, C. L.

    1976-01-01

    The present EOD requirements for sensors in the space shuttle era are reported, with emphasis on those applications deemed important enough to warrant separate sections. The application areas developed are: (1) agriculture; (2) atmospheric corrections; (3) cartography; (4) coastal studies; (5) forestry; (6) geology; (7) hydrology; (8) land use; (9) oceanography; and (10) soil moisture. For each application area, the following aspects were covered: (1) specific goals and techniques; (2) individual sensor requirements including types, bands, resolution, etc.; (3) definition of mission requirements, orbit types, coverage, etc.; and (4) discussion of anticipated problem areas and solutions. The remote sensors required for these application areas include: (1) camera systems; (2) multispectral scanners; (3) microwave scatterometers; (4) synthetic aperture radars; (5) microwave radiometers; and (6) vidicons. The emphasis in the remote sensor area was on the evaluation of present technology and its implications for future systems.

  20. Dissociated dislocations in Ni: a computational study

    International Nuclear Information System (INIS)

    Szelestey, P.; Patriarca, M.; Kaski, K.

    2005-01-01

    A systematic computational study of the behavior of a (1/2) dissociated screw dislocation in fcc nickel is presented, in which atomic interactions are described through an embedded-atom potential. A suitable external stress is applied on the system, both for modifying the equilibrium separation distance d and moving the dislocation complex. The structure of the dislocation and its corresponding changes during the motion are studied in the framework of the two-dimensional Peierls model, for different values of the ratio d/a', where a' is the period of the Peierls potential. The distance between the edge and screw components of the partials, as well as their widths, undergo a modulation with period a', as the dislocation moves, and the amplitudes of such oscillations are shown to depend on d/a'. The stress profile acting on the dislocation complex is analyzed and the effective Peierls stress is estimated for different values of d/a'

  1. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    Science.gov (United States)

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-agency-sponsored study, authorized under the...

  2. NATO Advanced Research Workshop on Computational Methods for Polymers and Liquid Crystalline Polymers

    CERN Document Server

    Pasini, Paolo; Žumer, Slobodan; Computer Simulations of Liquid Crystals and Polymers

    2005-01-01

    Liquid crystals, polymers and polymer liquid crystals are soft condensed matter systems of major technological and scientific interest. An understanding of the macroscopic properties of these complex systems, and of their many interesting peculiarities at the molecular level, can nowadays only be attained using computer simulations and statistical mechanical theories. In both the liquid crystal and polymer fields a considerable amount of simulation work has been done in the last few years with various classes of models at different spatial resolutions, ranging from atomistic to molecular and coarse-grained lattice models. Each of the two fields has developed its own set of tools and specialized procedures, and the book aims to provide a state-of-the-art review of the computer simulation studies of polymers and liquid crystals. This is of great importance in view of a potential cross-fertilization between these connected areas, which is particularly apparent for a number of experimental systems like, e.g. poly...

  3. Advances in Computer Science, Engineering & Applications : Proceedings of the Second International Conference on Computer Science, Engineering & Applications

    CERN Document Server

    Zizka, Jan; Nagamalai, Dhinaharan

    2012-01-01

    The International conference series on Computer Science, Engineering & Applications (ICCSEA) aims to bring together researchers and practitioners from academia and industry to focus on understanding computer science, engineering and applications and to establish new collaborations in these areas. The Second International Conference on Computer Science, Engineering & Applications (ICCSEA-2012), held in Delhi, India, during May 25-27, 2012 attracted many local and international delegates, presenting a balanced mixture of  intellect and research both from the East and from the West. Upon a strenuous peer-review process the best submissions were selected leading to an exciting, rich and a high quality technical conference program, which featured high-impact presentations in the latest developments of various areas of computer science, engineering and applications research.  

  5. Computational study of performance characteristics for truncated conical aerospike nozzles

    Science.gov (United States)

    Nair, Prasanth P.; Suryan, Abhilash; Kim, Heuy Dong

    2017-12-01

    Aerospike nozzles are advanced rocket nozzles that can maintain their aerodynamic efficiency over a wide range of altitudes. They belong to the class of altitude-compensating nozzles. A vehicle with an aerospike nozzle uses less fuel at low altitudes, thanks to its altitude adaptability, where most missions have the greatest need for thrust. Aerospike nozzles are better suited to Single Stage to Orbit (SSTO) missions than conventional nozzles. In the current study, the flow through 20% and 40% aerospike nozzles is analyzed in detail using computational fluid dynamics techniques. Steady-state analysis with implicit formulation is carried out. Reynolds-averaged Navier-Stokes equations are solved with the Spalart-Allmaras turbulence model. The results are compared with experimental results from previous work. The transition from open wake to closed wake happens at a lower nozzle pressure ratio for the 20% aerospike nozzle than for the 40% aerospike nozzle.

  6. Non-Determinism: An Abstract Concept in Computer Science Studies

    Science.gov (United States)

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…
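    As a concrete illustration of the concept the study examines, consider a standard formal-language-theory example (not taken from the paper): a non-deterministic finite automaton can be simulated deterministically by tracking the set of states the machine could be in, an on-the-fly subset construction:

    ```python
    # NFA accepting binary strings whose second-to-last symbol is '1'.
    # States: 0 = start, 1 = guessed that the '1' just read is second-to-last,
    # 2 = accept. The transition on (0, '1') is the non-deterministic choice.
    NFA = {
        (0, '0'): {0},
        (0, '1'): {0, 1},   # stay, or guess this '1' is the second-to-last symbol
        (1, '0'): {2},
        (1, '1'): {2},
    }
    ACCEPTING = {2}

    def accepts(word):
        """Simulate the NFA by carrying the set of all reachable states."""
        states = {0}
        for sym in word:
            states = set().union(*(NFA.get((s, sym), set()) for s in states))
        return bool(states & ACCEPTING)
    ```

    The machine "guesses" which '1' is second-to-last; the simulation makes that guess tractable by following every possibility at once, which is exactly the abstraction students reportedly find hard to internalize.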

  7. A computed tomography study of Alzheimer's disease

    International Nuclear Information System (INIS)

    Arai, H.; Kobayashi, K.; Juntendo Univ. School of Medicine, Tokyo; Ikeda, Y.; Nagao, Y.; Ogihara, R.; Kosaka, K.; Psychiatric Research Inst. of Tokyo

    1983-01-01

    Computed tomography (CT) was used to study cerebral atrophy in 18 patients with clinically diagnosed Alzheimer's disease of presenile type and in 14 healthy age-matched subjects as controls. Using the computerized planimetric method, the Subarachnoid Space Volume Index and the Ventricle Volume Index were calculated as measures of cortical atrophy and ventricular dilatation, respectively. From the results the following conclusions were drawn: 1. The cerebral atrophy in Alzheimer patients could be attributable to the disease processes rather than to physiological aging of the brain. 2. The degree of atrophy increases in parallel with the progress of the clinical stage; the cortical atrophy is already apparent at an early stage, whereas the ventricular dilatation becomes pronounced at later stages. 3. CT could be one of the most useful clinical tests available for the diagnosis of Alzheimer's disease. (orig.) [de

  8. Advanced airborne 3D computer image generation systems technologies for the year 2000

    Science.gov (United States)

    Bridges, Alan L.

    1992-07-01

    An airborne 3-D computer image generation system (CIGS) is a modular avionics box that receives commands from and sends status information to other avionics units. The CIGS maintains a large amount of data in secondary storage systems and simultaneously drives several display units. Emerging requirements for CIGS include: advanced avionics system architecture requirements and BIT/fault tolerance; real-time operating systems and graphic interface languages in Ada; and geometric/pixel processing functions, rendering system, and frame buffers/display controllers for pictorial displays. In addition, podded sensors (FLIR, LLTV, radar, etc.) will require multiplexing of high-resolution sensor video with graphics overlays. A combination of head-down AMLCD flat panels, helmet-mounted display (HMD), and Head-Up Display (HUD) will require highly parallel graphics generation technology. Generation of high-resolution, real-time 2-D/3-D displays with anti-aliasing, transparency, shading, and motion, however, emphasizes compute-intensive processing. High-performance graphics engines, powerful floating point processors, and parallel architectures are needed to increase the rendering speed, functionality and reliability, while reducing power, space requirements, and cost. The CIGS of the future will feature special high speed busses geared toward real-time graphics processing. The CIG system will be multi-channel, will have a high addressable resolution to drive HUD, 3-D displays in 4-pi-steradian virtual space, and 3-D panoramic displays; and will include fiber optics video distribution between CIG and display units. The head-down display (HDD) is by far the most complex display in that both background and overlay display elements are required. The background is usually generated from terrain/cultural features data. Terrain data is used to generate 2-D map backgrounds or 3-D perspective views duplicating or substituting for the pilot's out-the-window view. Performance of 150

  9. [The Psychomat computer complex for psychophysiologic studies].

    Science.gov (United States)

    Matveev, E V; Nadezhdin, D S; Shemsudov, A I; Kalinin, A V

    1991-01-01

    The authors analyze the design principles of a computerized psychophysiological system for universal use. They show the effectiveness of computer technology as a combination of the universal computation and control capabilities of a personal computer equipped with problem-oriented specialized facilities for stimulus presentation and detection of the test subject's reactions. They define the hardware and software configuration of the microcomputer psychophysiological system "Psychomat", describe its functional capabilities and basic medico-technical characteristics, and review organizational issues in the maintenance of its full-scale production.

  10. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  11. Advanced Computational Materials Science: Application to Fusion and Generation IV Fission Reactors (Workshop Report)

    Energy Technology Data Exchange (ETDEWEB)

    Stoller, RE

    2004-07-15

    The ''Workshop on Advanced Computational Materials Science: Application to Fusion and Generation IV Fission Reactors'' was convened to determine the degree to which an increased effort in modeling and simulation could help bridge the gap between the data that is needed to support the implementation of these advanced nuclear technologies and the data that can be obtained in available experimental facilities. The need to develop materials capable of performing in the severe operating environments expected in fusion and fission (Generation IV) reactors represents a significant challenge in materials science. There is a range of potential Gen-IV fission reactor design concepts and each concept has its own unique demands. Improved economic performance is a major goal of the Gen-IV designs. As a result, most designs call for significantly higher operating temperatures than the current generation of LWRs to obtain higher thermal efficiency. In many cases, the desired operating temperatures rule out the use of the structural alloys employed today. The very high operating temperature (up to 1000 C) associated with the NGNP is a prime example of an attractive new system that will require the development of new structural materials. Fusion power plants represent an even greater challenge to structural materials development and application. The operating temperatures, neutron exposure levels and thermo-mechanical stresses are comparable to or greater than those for proposed Gen-IV fission reactors. In addition, the transmutation products created in the structural materials by the high energy neutrons produced in the DT plasma can profoundly influence the microstructural evolution and mechanical behavior of these materials. Although the workshop addressed issues relevant to both Gen-IV and fusion reactor materials, much of the discussion focused on fusion; the same focus is reflected in this report. Most of the physical models and computational methods

  12. On the Predictability of Computer simulations: Advances in Verification and Validation

    KAUST Repository

    Prudhomme, Serge

    2014-01-06

    We will present recent advances on the topics of Verification and Validation in order to assess the reliability and predictability of computer simulations. The first part of the talk will focus on goal-oriented error estimation for nonlinear boundary-value problems and nonlinear quantities of interest, in which case the error representation consists of two contributions: 1) a first contribution, involving the residual and the solution of the linearized adjoint problem, which quantifies the discretization or modeling error; and 2) a second contribution, combining higher-order terms that describe the linearization error. The linearization error contribution is in general neglected with respect to the discretization or modeling error. However, when nonlinear effects are significant, it is unclear whether ignoring linearization effects may produce poor convergence of the adaptive process. The objective will be to show how both contributions can be estimated and employed in an adaptive scheme that simultaneously controls the two errors in a balanced manner. In the second part of the talk, we will present a novel approach for the calibration of model parameters. The proposed inverse problem not only involves the minimization of the misfit between experimental observables and their theoretical estimates, but also an objective function that takes into account some design goals on specific design scenarios. The method can be viewed as a regularization approach to the inverse problem, one, however, that best respects some design goals for which mathematical models are intended. The inverse problem is solved by a Bayesian method to account for uncertainties in the data. We will show that it shares the same structure as the deterministic problem that one would obtain by multi-objective optimization theory. The method is illustrated on an example of heat transfer in a two-dimensional fin. The proposed approach has the main benefit that it increases the confidence in predictive
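The two error contributions described in the first part can be sketched in standard goal-oriented notation; the symbols below are generic placeholders, not taken from the talk itself:

```latex
% Q: quantity of interest, u: exact solution, u_h: approximate solution,
% z: solution of the linearized adjoint problem, \mathcal{R}: weak residual.
Q(u) - Q(u_h) \approx
    \underbrace{\mathcal{R}(u_h;\, z)}_{\text{discretization/modeling error}}
  + \underbrace{\Delta(u, u_h;\, z)}_{\text{linearization error (higher-order terms)}}
```

An adaptive scheme that estimates both terms can refine the discretization when the first term dominates and account for linearization effects when the second one does.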

  13. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    Science.gov (United States)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience only ran on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to be able to access. The current code works in the Unity game engine which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project was unable to be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.

  14. A STUDY OF LOCALLY ADVANCED CARCINOMA OF BREAST

    Directory of Open Access Journals (Sweden)

    Prabhakar Jenna

    2017-08-01

    Full Text Available BACKGROUND Worldwide, breast cancer is the most frequent cancer in women and represents the second leading cause of cancer death among women. Locally advanced breast cancer (LABC constitutes 50-70% of the patients presenting for treatment and poses two common problems in treatment: achieving local control, and prolonging survival by preventing or delaying distant metastasis. Today, treatment of LABC requires a combination of systemic and local/regional therapies. The aim of this study is to examine the clinicopathological presentation, age distribution and various modes of management of locally advanced breast carcinoma. MATERIALS AND METHODS The present study includes 50 patients who attended the Department of General Surgery over a period of three years. RESULTS The patients were regularly followed up; at the end of the study, 35 (70% of the patients were doing well, 4 (8% had developed distant metastasis, 3 (6% had developed local recurrence and 8 (16% were lost to follow-up. CONCLUSION About half of the cases presenting with breast cancer are in locally advanced stages. Multimodality therapy is the effective treatment of locally advanced carcinoma of the breast. Breast cancer management is a challenge, and improvements in therapy are needed to extend the disease-free interval and overall survival period.

  15. Study guide to accompany computers data and processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Study Guide to Accompany Computer and Data Processing provides information pertinent to the fundamental aspects of computers and computer technology. This book presents the key benefits of using computers.Organized into five parts encompassing 19 chapters, this book begins with an overview of the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. This text then introduces computer hardware and describes the processor. Other chapters describe how microprocessors are made and describe the physical operation of computers. This book discusses as w

  16. NATO Advanced Study Institute on Microscopic Simulations of Complex Hydrodynamic Phenomena

    CERN Document Server

    Holian, Brad

    1992-01-01

    This volume contains the proceedings of a NATO Advanced Study Institute which was held in Alghero, Sardinia, in July 1991. The development of computers in recent years has led to the emergence of unconventional ideas aiming at solving old problems. Among these, the possibility of computing fluid flows directly from the trajectories of constituent particles has been much exploited in the last few years: lattice-gas cellular automata and, more generally, Molecular Dynamics have been used to reproduce and study complex flows. Whether or not these methods may someday compete with more traditional approaches is a question which cannot be answered at the present time: it will depend on the new computer architectures as well as on the possibility to develop very simple models to reproduce the most complex phenomena taking place in the approach of fully developed turbulence or plastic flows. In any event, these molecular methods are already used, and sometimes in an applied engineering context, to study strong s...

  17. Computer-Based Study Guides II: Educational Components and Advantages.

    Science.gov (United States)

    Harden, R. M.; Smyth, J. J.

    1994-01-01

    Examines the potential contributions of computers to medical education and, in particular, the use of computer-based study guides. Presents the educational functions of various study guide components and lists advantages and disadvantages of computer-based study guides over print-based guides. (LZ)

  18. Conditional Inference and Advanced Mathematical Study: Further Evidence

    Science.gov (United States)

    Inglis, Matthew; Simpson, Adrian

    2009-01-01

    In this paper, we examine the support given for the "theory of formal discipline" by Inglis and Simpson (Educational Studies in Mathematics 67:187-204, 2008). This theory, which is widely accepted by mathematicians and curriculum bodies, suggests that the study of advanced mathematics develops general thinking skills and, in particular, conditional…

  19. Advanced computational workflow for the multi-scale modeling of the bone metabolic processes.

    Science.gov (United States)

    Dao, Tien Tuan

    2017-06-01

    Multi-scale modeling of the musculoskeletal system plays an essential role in the deep understanding of complex mechanisms underlying biological phenomena and processes such as bone metabolic processes. Current multi-scale models suffer from the isolation of sub-models at each anatomical scale. The objective of the present work was to develop a new fully integrated computational workflow for simulating bone metabolic processes at multiple scales. The organ-level model employs multi-body dynamics to estimate body boundary and loading conditions from body kinematics. The tissue-level model uses the finite element method to estimate tissue deformation and mechanical loading under body loading conditions. Finally, the cell-level model includes a bone remodeling mechanism through an agent-based simulation under tissue loading. A case study on the bone remodeling process located in the human jaw was performed and presented. The developed multi-scale model of the human jaw was validated using literature-based data at each anatomical level. Simulation outcomes fall within literature-based ranges of values for estimated muscle force, tissue loading and cell dynamics during the bone remodeling process. This study opens perspectives for accurately simulating bone metabolic processes using a fully integrated computational workflow, leading to a better understanding of musculoskeletal system function across multiple length scales, as well as providing new informative data for clinical decision support and industrial applications.

  20. Computational Studies of Snake Venom Toxins.

    Science.gov (United States)

    Ojeda, Paola G; Ramírez, David; Alzate-Morales, Jans; Caballero, Julio; Kaas, Quentin; González, Wendy

    2017-12-22

    Most snake venom toxins are proteins, and participate in envenomation through a diverse array of bioactivities, such as bleeding, inflammation, pain, and cytotoxic, cardiotoxic or neurotoxic effects. The venom of a single snake species contains hundreds of toxins, and the venoms of the 725 species of venomous snakes represent a large pool of potentially bioactive proteins. Despite considerable discovery efforts, most snake venom toxins are still uncharacterized. Modern bioinformatics tools have recently been developed to mine snake venoms, helping focus experimental research on the most potentially interesting toxins. Some computational techniques predict toxin molecular targets and the binding modes to these targets. This review gives an overview of current knowledge on the ~2200 sequences and more than 400 three-dimensional structures of snake toxins deposited in public repositories, as well as of molecular modeling studies of the interaction between these toxins and their molecular targets. We also describe how modern bioinformatics have been used to study the snake venom protein phospholipase A2, the small basic myotoxin Crotamine, and the three-finger peptide Mambalgin.

  1. Computational Studies of Snake Venom Toxins

    Directory of Open Access Journals (Sweden)

    Paola G. Ojeda

    2017-12-01

    Full Text Available Most snake venom toxins are proteins, and participate in envenomation through a diverse array of bioactivities, such as bleeding, inflammation, pain, and cytotoxic, cardiotoxic or neurotoxic effects. The venom of a single snake species contains hundreds of toxins, and the venoms of the 725 species of venomous snakes represent a large pool of potentially bioactive proteins. Despite considerable discovery efforts, most snake venom toxins are still uncharacterized. Modern bioinformatics tools have recently been developed to mine snake venoms, helping focus experimental research on the most potentially interesting toxins. Some computational techniques predict toxin molecular targets and the binding modes to these targets. This review gives an overview of current knowledge on the ~2200 sequences and more than 400 three-dimensional structures of snake toxins deposited in public repositories, as well as of molecular modeling studies of the interaction between these toxins and their molecular targets. We also describe how modern bioinformatics have been used to study the snake venom protein phospholipase A2, the small basic myotoxin Crotamine, and the three-finger peptide Mambalgin.

  2. Proceedings of the topical meeting on advances in human factors research on man/computer interactions

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    This book discusses the following topics: expert systems and knowledge engineering (I); verification and validation of software; methods for modeling human/computer performance; man/computer interaction problems in producing procedures (1-2); progress and problems with automation (1-2); experience with electronic presentation of procedures (2); intelligent displays and monitors; modeling the user/computer interface; and computer-based human decision-making aids.

  3. Sensitivity analysis to compute advanced stochastic problems in uncertain and complex electromagnetic environments

    Directory of Open Access Journals (Sweden)

    S. Lalléchère

    2012-10-01

    Full Text Available This paper deals with the advanced integration of uncertainties in electromagnetic interference (EMI and electromagnetic compatibility (EMC problems. In this context, the Monte Carlo formalism may provide a reliable reference for statistical assessments. More recently, other less expensive and efficient techniques have been implemented (the unscented transform and stochastic collocation methods, for instance and will be illustrated through uncertain EMC problems. Finally, we present how the use of sensitivity analysis techniques may offer an efficient complement to rough statistical or stochastic studies.
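The trade-off the abstract describes can be illustrated with a one-dimensional sketch; the observable and its uncertain input below are invented for the example (not taken from the paper), comparing a brute-force Monte Carlo estimate with a 7-node stochastic collocation (Gauss-Hermite) estimate:

```python
import numpy as np

# Hypothetical nonlinear EMC observable as a function of an uncertain
# coupling length L (illustrative model, not from the paper).
def observable(L):
    return np.sin(2 * np.pi * L) ** 2 + 0.5 * L

mu, sigma = 0.3, 0.05            # assumed Gaussian uncertainty on L
rng = np.random.default_rng(0)

# Monte Carlo reference: brute-force sampling of the input distribution.
samples = observable(rng.normal(mu, sigma, 100_000))
mc_mean = samples.mean()

# Stochastic collocation: 7 Gauss-Hermite nodes replace 1e5 random draws,
# using E[f(mu + sigma*Z)] ~ (1/sqrt(pi)) * sum_i w_i f(mu + sqrt(2)*sigma*x_i).
nodes, weights = np.polynomial.hermite.hermgauss(7)
sc_mean = np.sum(weights * observable(mu + np.sqrt(2) * sigma * nodes)) / np.sqrt(np.pi)

print(f"Monte Carlo mean (1e5 evaluations): {mc_mean:.4f}")
print(f"Collocation mean (7 evaluations):   {sc_mean:.4f}")
```

For a smooth observable, the two estimates agree closely while the collocation method needs orders of magnitude fewer model evaluations, which is the motivation for such techniques when each evaluation is an expensive electromagnetic simulation.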

  4. Computational protein biomarker prediction: a case study for prostate cancer

    Directory of Open Access Journals (Sweden)

    Adam Bao-Ling

    2004-03-01

    Full Text Available Abstract Background Recent technological advances in mass spectrometry pose challenges in computational mathematics and statistics to process the mass spectral data into predictive models with clinical and biological significance. We discuss several classification-based approaches to finding protein biomarker candidates using protein profiles obtained via mass spectrometry, and we assess their statistical significance. Our overall goal is to implicate peaks that have a high likelihood of being biologically linked to a given disease state, and thus to narrow the search for biomarker candidates. Results Thorough cross-validation studies and randomization tests are performed on a prostate cancer dataset with over 300 patients, obtained at the Eastern Virginia Medical School using SELDI-TOF mass spectrometry. We obtain average classification accuracies of 87% on a four-group classification problem using a two-stage linear SVM-based procedure and just 13 peaks, with other methods performing comparably. Conclusions Modern feature selection and classification methods are powerful techniques for both the identification of biomarker candidates and the related problem of building predictive models from protein mass spectrometric profiles. Cross-validation and randomization are essential tools that must be performed carefully in order not to bias the results unfairly. However, only a biological validation and identification of the underlying proteins will ultimately confirm the actual value and power of any computational predictions.

  5. Energy Efficient Engine program advanced turbofan nacelle definition study

    Science.gov (United States)

    Howe, David C.; Wynosky, T. A.

    1985-01-01

    Advanced, low drag, nacelle configurations were defined for some of the more promising propulsion systems identified in the earlier Benefit/Cost Study, to assess the benefits associated with these advanced technology nacelles and formulate programs for developing these nacelles and low volume thrust reversers/spoilers to a state of technology readiness in the early 1990's. The study results established the design feasibility of advanced technology, slim line nacelles applicable to advanced technology, high bypass ratio turbofan engines. Design feasibility was also established for two low volume thrust reverse/spoiler concepts that meet or exceed the required effectiveness for these engines. These nacelle and thrust reverse/spoiler designs were shown to be applicable in engines with takeoff thrust sizes ranging from 24,000 to 60,000 pounds. The reduced weight, drag, and cost of the advanced technology nacelle installations relative to current technology nacelles offer a mission fuel burn savings ranging from 3.0 to 4.5 percent and direct operating cost plus interest improvements from 1.6 to 2.2 percent.

  6. The ADVANCE Code of Conduct for collaborative vaccine studies.

    Science.gov (United States)

    Kurz, Xavier; Bauchau, Vincent; Mahy, Patrick; Glismann, Steffen; van der Aa, Lieke Maria; Simondon, François

    2017-04-04

    Lessons learnt from the 2009 (H1N1) flu pandemic highlighted factors limiting the capacity to collect European data on vaccine exposure, safety and effectiveness, including lack of rapid access to available data sources or expertise, difficulties to establish efficient interactions between multiple parties, lack of confidence between private and public sectors, concerns about possible or actual conflicts of interest (or perceptions thereof) and inadequate funding mechanisms. The Innovative Medicines Initiative's Accelerated Development of VAccine benefit-risk Collaboration in Europe (ADVANCE) consortium was established to create an efficient and sustainable infrastructure for rapid and integrated monitoring of post-approval benefit-risk of vaccines, including a code of conduct and governance principles for collaborative studies. The development of the code of conduct was guided by three core and common values (best science, strengthening public health, transparency) and a review of existing guidance and relevant published articles. The ADVANCE Code of Conduct includes 45 recommendations in 10 topics (Scientific integrity, Scientific independence, Transparency, Conflicts of interest, Study protocol, Study report, Publication, Subject privacy, Sharing of study data, Research contract). Each topic includes a definition, a set of recommendations and a list of additional reading. The concept of the study team is introduced as a key component of the ADVANCE Code of Conduct with a core set of roles and responsibilities. It is hoped that adoption of the ADVANCE Code of Conduct by all partners involved in a study will facilitate and speed-up its initiation, design, conduct and reporting. Adoption of the ADVANCE Code of Conduct should be stated in the study protocol, study report and publications and journal editors are encouraged to use it as an indication that good principles of public health, science and transparency were followed throughout the study. Copyright © 2017

  7. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also deployed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  9. Computed tomographic study on Mycoplasma pneumoniae pneumonia

    International Nuclear Information System (INIS)

    Tanaka, Hiroshi; Koba, Hiroyuki; Mori, Takuji; Mori, Masaki; Tsunematsu, Kazunori; Natori, Hiroshi; Asakawa, Mitsuo; Suzuki, Akira; Doi, Mikio.

    1985-01-01

    Twenty-one patients with serologically proven Mycoplasma pneumoniae pneumonia showing infiltrative shadows on chest radiograms were studied by computed tomography (CT). Localization of the lesion and the fashion of its progression through the lung were analyzed. The following 3 loci were defined on the basis of critical analysis of the chest radiograms and of radiopathological analysis of an experimental animal model of mycoplasmal pneumonia with soft X-ray imaging. I: peribronchial and periarterial interstitium. II: bronchiole and its surroundings. III: lung parenchyma, in the hilar area as IIIh, in the marginal area as IIIm. Even in the early phase of this disease, radiopathological findings on CT were distributed in all loci mentioned above. The shadow disappeared from locus III by approximately the 14th day from onset; it remained, however, in loci I and II for a long period. These findings suggest that loci I and II are among the major foci of Mycoplasma pneumoniae pneumonia. Volume loss in locus III was observed in 78% of the cases at the 28th day from onset. The shadow in locus IIIh was more prominent than in locus IIIm. The reported analytical method with CT could be widely applied to disclose radiopathological details in other infectious diseases of the lung. (author)

  10. Computational Studies of Protein Hydration Methods

    Science.gov (United States)

    Morozenko, Aleksandr

    It is widely appreciated that water plays a vital role in proteins' functions. Long-range proton transfer inside proteins is usually carried out by the Grotthuss mechanism and requires a chain of hydrogen bonds composed of internal water molecules and amino acid residues of the protein. In other cases, water molecules can facilitate enzymes' catalytic reactions by becoming a temporary proton donor/acceptor. Yet a reliable way of predicting water in the protein interior is still not available to the biophysics community. This thesis presents computational studies that have been performed to gain insights into the problems of fast and accurate prediction of potential water sites inside internal cavities of a protein. Specifically, we focus on the task of attaining correspondence between results obtained from computational experiments and experimental data available from X-ray structures. An overview of existing methods of predicting water molecules in the interior of a protein, along with a discussion of the trustworthiness of these predictions, is a second major subject of this thesis. A description of the differences of water molecules in various media, particularly gas, liquid and protein interior, and theoretical aspects of designing an adequate model of water for the protein environment are discussed in chapters 3 and 4. In chapter 5, we discuss recently developed methods of placement of water molecules into internal cavities of a protein. We propose a new methodology based on the principle of docking water molecules to a protein body which achieves a higher degree of matching the experimental data reported in protein crystal structures than other techniques available in the world of biophysical software. The new methodology is tested on a set of high-resolution crystal structures of oligopeptide-binding protein (OppA) containing a large number of resolved internal water molecules and applied to bovine heart cytochrome c oxidase in the fully

  11. Development of Computational Capabilities to Predict the Corrosion Wastage of Boiler Tubes in Advanced Combustion Systems

    Energy Technology Data Exchange (ETDEWEB)

    Kung, Steven; Rapp, Robert

    2014-08-31

    A comprehensive corrosion research project consisting of pilot-scale combustion testing and long-term laboratory corrosion study has been successfully performed. A pilot-scale combustion facility available at Brigham Young University was selected and modified to enable burning of pulverized coals under the operating conditions typical for advanced coal-fired utility boilers. Eight United States (U.S.) coals were selected for this investigation, with the test conditions for all coals set to have the same heat input to the combustor. In addition, the air/fuel stoichiometric ratio was controlled so that staged combustion was established, with the stoichiometric ratio maintained at 0.85 in the burner zone and 1.15 in the burnout zone. The burner zone represented the lower furnace of utility boilers, while the burnout zone mimicked the upper furnace areas adjacent to the superheaters and reheaters. From this staged combustion, approximately 3% excess oxygen was attained in the combustion gas at the furnace outlet. During each of the pilot-scale combustion tests, extensive online measurements of the flue gas compositions were performed. In addition, deposit samples were collected at the same location for chemical analyses. Such extensive gas and deposit analyses enabled detailed characterization of the actual combustion environments existing at the lower furnace walls under reducing conditions and those adjacent to the superheaters and reheaters under oxidizing conditions in advanced U.S. coal-fired utility boilers. The gas and deposit compositions were then carefully simulated in a series of 1000-hour laboratory corrosion tests, in which the corrosion performances of different commercial candidate alloys and weld overlays were evaluated at various temperatures for advanced boiler systems. Results of this laboratory study led to significant improvement in understanding of the corrosion mechanisms operating on the furnace walls as well as superheaters and reheaters in

  12. Advanced Dementia Research in the Nursing Home: The CASCADE Study

    Science.gov (United States)

    Mitchell, Susan L.; Kiely, Dan K.; Jones, Richard N.; Prigerson, Holly; Volicer, Ladislav; Teno, Joan M.

    2009-01-01

    Despite the growing number of persons with advanced dementia, and the need to improve their end-of-life care, few studies have addressed this important topic. The objectives of this report are to present the methodology established in the CASCADE (Choices, Attitudes, and Strategies for Care of Advanced Dementia at the End-of-Life) study, and to describe how challenges specific to this research were met. The CASCADE study is an ongoing, federally funded, 5-year prospective cohort study of nursing home (NH) residents with advanced dementia and their health care proxies (HCPs) initiated in February 2003. Subjects were recruited from 15 facilities around Boston. The recruitment and data collection protocols are described. The demographic features, ownership, staffing, and quality of care of participant facilities are presented and compared to NHs nationwide. To date, 189 resident/HCP dyads have been enrolled. Baseline data are presented, demonstrating the success of the protocol in recruiting and repeatedly assessing NH residents with advanced dementia and their HCPs. Factors challenging and enabling implementation of the protocol are described. The CASCADE experience establishes the feasibility of conducting rigorous, multisite dementia NH research, and the described methodology serves as a detailed reference for subsequent CASCADE publications as results from the study emerge. PMID:16917187

  13. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    Science.gov (United States)

    Dunn, H. J.

    1981-01-01

    A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power density spectrum (PDS) of time history data. The program interfaces with the advanced continuous simulation language (ACSL) so that a frequency analysis may be performed on ACSL generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
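The abstract does not reproduce the program itself; a minimal periodogram-style PDS estimate built on the FFT, in the same spirit, might look like the following NumPy sketch (the function name, scaling convention, and the 5 Hz test signal are illustrative, not from the original report):

```python
import numpy as np

def power_density_spectrum(x, dt):
    """Estimate a one-sided power density spectrum (PDS) of a sampled
    time history via the FFT (a basic periodogram)."""
    n = len(x)
    X = np.fft.rfft(x)
    # Scale so that summing pds * df approximates the mean-square value.
    pds = (np.abs(X) ** 2) * dt / n
    pds[1:-1] *= 2.0  # fold the negative-frequency half into the one-sided PDS
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, pds

# Illustrative signal: a 5 Hz sine sampled at 100 Hz.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
x = np.sin(2 * np.pi * 5.0 * t)
freqs, pds = power_density_spectrum(x, dt)
print(freqs[np.argmax(pds)])  # → 5.0
```

For a pure 5 Hz sine sampled at 100 Hz, the spectral peak lands exactly on the 5 Hz bin, which is a quick sanity check for any PDS routine of this kind.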

  14. Advanced Topics in Computational Partial Differential Equations: Numerical Methods and Diffpack Programming

    International Nuclear Information System (INIS)

    Katsaounis, T D

    2005-01-01

    The scope of this book is to present well known simple and advanced numerical methods for solving partial differential equations (PDEs) and how to implement these methods using the programming environment of the software package Diffpack. A basic background in PDEs and numerical methods is required by the potential reader. Further, a basic knowledge of the finite element method and its implementation in one and two space dimensions is required. The authors claim that no prior knowledge of the package Diffpack is required, which is true, but the reader should be at least familiar with an object oriented programming language like C++ in order to better comprehend the programming environment of Diffpack. Certainly, a prior knowledge or usage of Diffpack would be a great advantage to the reader. The book consists of 15 chapters, each one written by one or more authors. Each chapter is basically divided into two parts: the first part is about mathematical models described by PDEs and numerical methods to solve these models and the second part describes how to implement the numerical methods using the programming environment of Diffpack. Each chapter closes with a list of references on its subject. The first nine chapters cover well known numerical methods for solving the basic types of PDEs. Further, programming techniques on the serial as well as on the parallel implementation of numerical methods are also included in these chapters. The last five chapters are dedicated to applications, modelled by PDEs, in a variety of fields. In summary, the book focuses on the computational and implementational issues involved in solving partial differential equations. The potential reader should have a basic knowledge of PDEs and the finite difference and finite element methods. The examples presented are solved within the programming framework of Diffpack and the reader should have prior experience with the particular software in order to take full advantage of the book. 

  15. From "fixing women" to "institutional transformation": An ADVANCE case study

    Science.gov (United States)

    Yennello, Sherry; Kaunas, Christine

    2015-12-01

    The United States' position in the global economy requires an influx of women into science, technology, engineering, and mathematics (STEM) fields in order to remain competitive. Despite this, the representation of women in STEM continues to be low. The National Science Foundation's ADVANCE Program addresses this issue by funding projects that aim to increase the representation of women in academic STEM fields through transformation of institutional structures that impede women's progress in academic STEM fields. This paper includes a case study of the Texas A&M University ADVANCE Program.

  16. Pathological study on preoperative concurrent chemoradiation for advanced hypopharyngeal cancer

    International Nuclear Information System (INIS)

    Inoue, Toshiya; Nagata, Motoki; Yukawa, Hisaya

    2008-01-01

    Chemoradiotherapy is frequently applied as the first-line therapy for advanced hypopharyngeal cancer. However, organ-preserving therapy does not allow true pathological assessment of the effectiveness of the therapy. We therefore determined the subsequent treatment modality for advanced hypopharyngeal cancer based on local findings upon the completion of radiotherapy at 40 Gy. Pathological assessments were performed on 33 cases of advanced hypopharyngeal cancer that had undergone extended operation after chemoradiotherapy. The pathological effects were Grade 1 in 12 cases, Grade 2 in 13 cases and Grade 3 in 8 cases. However, only 60% of the extended operations proved tumor-free. In those cases the local lesions were well controlled, but distant metastases influenced the outcome; controlling distant metastasis remains a future issue. Additional studies to select a surgical approach will be needed. (author)

  17. Advancing a distributed multi-scale computing framework for large-scale high-throughput discovery in materials science.

    Science.gov (United States)

    Knap, J; Spear, C E; Borodin, O; Leiter, K W

    2015-10-30

    We describe the development of a large-scale high-throughput application for discovery in materials science. Our point of departure is a computational framework for distributed multi-scale computation. We augment the original framework with a specialized module whose role is to route evaluation requests needed by the high-throughput application to a collection of available computational resources. We evaluate the feasibility and performance of the resulting high-throughput computational framework by carrying out a high-throughput study of battery solvents. Our results indicate that distributed multi-scale computing, by virtue of its adaptive nature, is particularly well-suited for building high-throughput applications.
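As a rough illustration of the routing module described above, the following sketch hands each evaluation request to the next resource in a pool (the resource names, request labels, and round-robin policy are all assumptions; the abstract does not specify the framework's actual scheduling strategy):

```python
import itertools

class EvaluationRouter:
    """Minimal sketch of a module that routes evaluation requests to a
    collection of computational resources (round-robin is an assumption)."""
    def __init__(self, resources):
        self._cycle = itertools.cycle(resources)

    def route(self, request):
        # Hand the request to the next resource in the rotation.
        return next(self._cycle), request

# Hypothetical resource names and solvent-evaluation requests.
router = EvaluationRouter(["node-a", "node-b", "node-c"])
assignments = [router.route(f"solvent-{i}") for i in range(4)]
print(assignments[0], assignments[3])
```

A production router would also track resource availability and failures; the adaptive dispatch the authors describe is what makes the framework suit high-throughput workloads.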

  18. Ganglion Plexus Ablation in Advanced Atrial Fibrillation: The AFACT Study

    NARCIS (Netherlands)

    Driessen, Antoine H. G.; Berger, Wouter R.; Krul, Sébastien P. J.; van den Berg, Nicoline W. E.; Neefs, Jolien; Piersma, Femke R.; Chan Pin Yin, Dean R. P. P.; de Jong, Jonas S. S. G.; van Boven, WimJan P.; de Groot, Joris R.

    2016-01-01

    Patients with long duration of atrial fibrillation (AF), enlarged atria, or failed catheter ablation have advanced AF and may require more extensive treatment than pulmonary vein isolation. The aim of this study was to investigate the efficacy and safety of additional ganglion plexus (GP) ablation

  19. Opportunities for in-situ diffraction studies of advanced materials ...

    Indian Academy of Sciences (India)

    Opportunities for in-situ diffraction studies of advanced materials under extreme conditions at the US spallation neutron source. J P HODGES. Neutron Scattering Sciences Division, Spallation Neutron Source, Oak Ridge National. Laboratory, P.O. Box 2008, TN 37831-6474, USA. E-mail: hodgesj@ornl.gov. Abstract.

  20. A Meta-Analysis of Advanced Organizer Studies.

    Science.gov (United States)

    Stone, Carol Leth

    1983-01-01

    Twenty-nine reports yielding 112 studies were analyzed with Glass's meta-analysis technique, and results were compared with predictions from Ausubel's model of assimilative learning. Overall, advance organizers were shown to be associated with increased learning and retention of material to be learned. (Author)
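Glass's technique standardizes each study's treatment-control difference by the control group's standard deviation before averaging across studies; a small sketch with invented summary data (the per-study means and SDs are not from the report):

```python
import statistics

def glass_delta(mean_treatment, mean_control, sd_control):
    """Glass's effect size: the treatment-control mean difference
    standardized by the control group's standard deviation."""
    return (mean_treatment - mean_control) / sd_control

# Invented per-study summaries: (treatment mean, control mean, control SD).
studies = [
    (78.0, 72.0, 10.0),
    (65.0, 60.0, 8.0),
    (82.0, 80.0, 12.0),
]
deltas = [glass_delta(*s) for s in studies]
mean_effect = statistics.mean(deltas)
print(round(mean_effect, 3))  # → 0.464
```

A positive mean effect size, as found here for advance organizers, indicates that treatment groups outperformed controls on average across the pooled studies.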

  1. Advanced language modeling approaches, case study: Expert search

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2008-01-01

    This tutorial gives a clear and detailed overview of advanced language modeling approaches and tools, including the use of document priors, translation models, relevance models, parsimonious models and expectation maximization training. Expert search will be used as a case study to explain the

  2. Study of variation of thermal diffusivity of advanced composite ...

    Indian Academy of Sciences (India)

    Bulletin of Materials Science, Volume 32, Issue 1. Study of variation of thermal diffusivity of advanced composite materials of E-glass fibre reinforced plastic (GFRP) in temperature range 5–300 K. Kalobaran Das S M Kamaruzzaman Tapas Ranjan Middya Siddhartha Datta. Ceramics and Glasses Volume 32 ...

  3. Integrated Application of Active Controls (IAAC) technology to an advanced subsonic transport project: Current and advanced act control system definition study. Volume 2: Appendices

    Science.gov (United States)

    Hanks, G. W.; Shomber, H. A.; Dethman, H. A.; Gratzer, L. B.; Maeshiro, A.; Gangsaas, D.; Blight, J. D.; Buchan, S. M.; Crumb, C. B.; Dorwart, R. J.

    1981-01-01

    The current status of the Active Controls Technology (ACT) for the advanced subsonic transport project is investigated through analysis of the systems technical data. Control systems technologies under examination include computerized reliability analysis, pitch-axis fly-by-wire actuator, flaperon actuation system design trade study, control law synthesis and analysis, flutter mode control and gust load alleviation analysis, and implementation of alternative ACT systems. Extensive analysis of the computer techniques involved in each system is included.

  4. The advantages of advanced computer-assisted diagnostics and three-dimensional preoperative planning on implant position in orbital reconstruction.

    Science.gov (United States)

    Jansen, Jesper; Schreurs, Ruud; Dubois, Leander; Maal, Thomas J J; Gooris, Peter J J; Becking, Alfred G

    2018-02-26

    Advanced three-dimensional (3D) diagnostics and preoperative planning are the first steps in computer-assisted surgery (CAS). They are an integral part of the workflow, and allow the surgeon to adequately assess the fracture and to perform virtual surgery to find the optimal implant position. The goal of this study was to evaluate the accuracy and predictability of 3D diagnostics and preoperative virtual planning without intraoperative navigation in orbital reconstruction. In 10 cadaveric heads, 19 complex orbital fractures were created. First, all fractures were reconstructed without preoperative planning (control group) and at a later stage the reconstructions were repeated with the help of preoperative planning. Preformed titanium mesh plates were used for the reconstructions by two experienced oral and maxillofacial surgeons. The preoperative virtual planning was easily accessible for the surgeon during the reconstruction. Computed tomographic scans were obtained before and after creation of the orbital fractures and postoperatively. Using a paired t-test, implant positioning accuracy (translation and rotations) of both groups was evaluated by comparing the planned implant position with the position of the implant on the postoperative scan. Implant position improved significantly with preoperative planning (Table 1). Pitch did not improve significantly (P = 0.78). The use of 3D diagnostics and preoperative planning without navigation in complex orbital wall fractures has a positive effect on implant position. This is due to a better assessment of the fracture, the possibility of virtual surgery and because the planning can be used as a virtual guide intraoperatively. The surgeon has more control in positioning the implant in relation to the rim and other bony landmarks. Copyright © 2018 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
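The paired t-test used here compares two measurements on the same specimens; a minimal sketch with invented implant translation errors (the millimeter values below are hypothetical, not the study's data):

```python
import math
import statistics

def paired_t(a, b):
    """Paired t statistic: mean of the per-specimen differences divided
    by the standard error of those differences."""
    d = [x - y for x, y in zip(a, b)]
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(len(d)))

# Invented implant translation errors (mm) for the same five orbits,
# reconstructed without and then with preoperative planning.
without_planning = [2.1, 1.8, 2.5, 2.0, 2.3]
with_planning = [1.2, 1.0, 1.5, 1.1, 1.4]
t = paired_t(without_planning, with_planning)
print(round(t, 2))
```

Pairing by specimen removes between-orbit variability, which is why the study could detect positioning improvements with only 19 fractures.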

  5. Large Advanced Space Systems (LASS) computer-aided design program additions

    Science.gov (United States)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer aided design program that will permit integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam, and attitude control requirements.

  6. Advances in Parallel Computing and Databases for Digital Pathology in Cancer Research

    Science.gov (United States)

    2016-11-13

    Through a number of efforts such as the National Strategic Computing Initiative (NSCI), there has been a push to merge these “Big Data” and ... potential for applications in cancer research. II. PARALLEL COMPUTING AND BIG DATA. Parallel computing is the ability to take a given program and split it ... in-database analytics, hardware-accelerated DBMS operations, and data models that more closely resemble the type of data being stored. For example

  7. The Plant-Window System: A framework for an integrated computing environment at advanced nuclear power plants

    International Nuclear Information System (INIS)

    Wood, R.T.; Mullens, J.A.; Naser, J.A.

    1997-01-01

    Power plant data, and the information that can be derived from it, provide the link to the plant through which the operations, maintenance and engineering staff understand and manage plant performance. The extensive use of computer technology in advanced reactor designs provides the opportunity to greatly expand the capability to obtain, analyze, and present data about the plant to station personnel. However, to support highly efficient and increasingly safe operation of nuclear power plants, it is necessary to transform the vast quantity of available data into clear, concise, and coherent information that can be readily accessed and used throughout the plant. This need can be met by an integrated computer workstation environment that provides the necessary information and software applications, in a manner that can be easily understood and used, to the proper users throughout the plant. As part of a Cooperative Research and Development Agreement with the Electric Power Research Institute, the Oak Ridge National Laboratory has developed functional requirements for a Plant-Wide Integrated Environment Distributed On Workstations (Plant-Window) System. The Plant-Window System (PWS) can serve the needs of operations, engineering, and maintenance personnel at nuclear power stations by providing integrated data and software applications within a common computing environment. The PWS requirements identify functional capabilities and provide guidelines for standardized hardware, software, and display interfaces so as to define a flexible computing environment for both current generation nuclear power plants and advanced reactor designs

  8. An Overview of the Advanced CompuTational Software (ACTS)Collection

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, Leroy A.; Marques, Osni A.

    2005-02-02

    The ACTS Collection brings together a number of general-purpose computational tools that were developed by independent research projects mostly funded and supported by the U.S. Department of Energy. These tools tackle a number of common computational issues found in many applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. In this article, we introduce the numerical tools in the collection and their functionalities, present a model for developing more complex computational applications on top of ACTS tools, and summarize applications that use these tools. Lastly, we present a vision of the ACTS project for deployment of the ACTS Collection by the computational sciences community.

  9. Computed tomographic features of the feline brain change with advancing age?

    Directory of Open Access Journals (Sweden)

    Viviam R. Babicsak

    2015-12-01

    Full Text Available Abstract: A better understanding of normal or expected encephalic changes with increasing age in cats is needed, as a growing number of these animals are attended in veterinary clinics, and imaging data referring to normal age-associated changes are extremely scarce in the literature. The objective of this study was to identify age-related changes in the feline brain using CT imaging. Fifteen non-brachycephalic healthy cats with ages between 1 and 6 years (adult group) and others over 12 years (geriatric group) were submitted to CT scans of the brain. Statistically significant differences were found between the groups for the ability to identify the left lateral ventricle and for falx cerebri calcification, both identified in a greater number of cats of the geriatric group. A significantly higher mean width of the third ventricle was also detected in geriatric animals. There were no statistically significant differences between lateral ventricular dimensions and encephalic parenchymal attenuation on pre- and post-contrast CT phases. The results of the present study show an increase in the incidence of falx cerebri calcification and a third ventricular dilatation with advancing age in cats. Future research using MRI scanners and a greater number of cats is needed in order to identify supplementary age-related changes.

  10. Advanced turbine systems study system scoping and feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    1993-04-01

    United Technologies Research Center, Pratt & Whitney Commercial Engine Business, and Pratt & Whitney Government Engine and Space Propulsion have performed a preliminary analysis of an Advanced Turbine System (ATS) under Contract DE-AC21-92MC29247 with the Morgantown Energy Technology Center. The natural gas-fired reference system identified by the UTC team is the Humid Air Turbine (HAT) Cycle, in which the gas turbine exhaust heat and heat rejected from the intercooler are used in a saturator to humidify the high pressure compressor discharge air. This results in a significant increase in flow through the turbine at no increase in compressor power. Using technology based on the PW FT4000, the industrial engine derivative of the PW4000 currently under development by PW, the system would have an output of approximately 209 MW and an efficiency of 55.3%. Through use of advanced cooling and materials technologies similar to those currently in the newest generation of military aircraft engines, a growth version of this engine could attain approximately 295 MW output at an efficiency of 61.5%. There is the potential for even higher performance in the future as technology from aerospace R&D programs is adapted to aero-derivative industrial engines.

  11. Students' Computing Use and Study: When More is Less

    Directory of Open Access Journals (Sweden)

    Christine A McLachlan

    2016-02-01

    Full Text Available Since the turn of the century there has been a steady decline in enrolments of students in senior secondary computing classes in Australia. A flow-on effect has seen reduced enrolments in tertiary computing courses and subsequent predictions of shortages in skilled computing professionals. This paper investigates the relationship between students’ computing literacy levels, their use of and access to computing tools, and students’ interest in and attitudes to formal computing study. Through the use of secondary data obtained from Australian and international reports, a reverse effect was discovered, indicating that the more students used computing tools, the less interested they became in computing studies.

  12. Recent advances in the reconstruction of cranio-maxillofacial defects using computer-aided design/computer-aided manufacturing.

    Science.gov (United States)

    Oh, Ji-Hyeon

    2018-12-01

    With the development of computer-aided design/computer-aided manufacturing (CAD/CAM) technology, it has been possible to reconstruct the cranio-maxillofacial defect with more accurate preoperative planning, precise patient-specific implants (PSIs), and shorter operation times. The manufacturing processes include subtractive manufacturing and additive manufacturing and should be selected in consideration of the material type, available technology, post-processing, accuracy, lead time, properties, and surface quality. Materials such as titanium, polyethylene, polyetheretherketone (PEEK), hydroxyapatite (HA), poly-DL-lactic acid (PDLLA), polylactide-co-glycolide acid (PLGA), and calcium phosphate are used. Design methods for the reconstruction of cranio-maxillofacial defects include the use of a pre-operative model printed with pre-operative data, printing a cutting guide or template after virtual surgery, a model after virtual surgery printed with reconstructed data using a mirror image, and manufacturing PSIs by directly obtaining PSI data after reconstruction using a mirror image. By selecting the appropriate design method, manufacturing process, and implant material according to the case, it is possible to obtain a more accurate surgical procedure, reduced operation time, the prevention of various complications that can occur using the traditional method, and predictive results compared to the traditional method.

  13. Robustness of advanced nuclear fuel reprocessing processes. Study on solvent extraction processes adjusted to advanced reprocessing process. Document on collaborative study

    International Nuclear Information System (INIS)

    Yamamoto, Ichiro; Enokida, Youichi; Kobayashi, Noboru; Takanashi, Mitsuhiro; Aoshima, Atsushi; Nomura, Kazunori; Shibata, Atsuhiro

    2002-05-01

    The advanced nuclear fuel reprocessing process with crystallization uranium recovery has been proposed to enhance economic incentives and to reduce the amount of discharged waste. Because a solvent extraction process following the crystallization uranium recovery will be operated with new process parameters due to different loadings of heavy metals, decontamination factors, flow rates, etc., fundamental studies on the chemical flowsheet of the process are required to verify the robustness of the process and to understand the influence of process variation upon process performance. In this study, theoretical and computational studies were performed from this kind of aspect. First, separation characteristics with the chemical flowsheet were studied for the steady state, and recovery yields of uranium and plutonium, decontamination factors, and process waste amounts were computed for the normal process condition. Second, transient behaviors were computed with some variations in flow rates, heavy metal loading and so on from the normal process condition. Finally, the influence of small fluctuations of the process condition was analyzed and the robustness of the new solvent extraction process was verified. This work was performed by Nagoya University and Japan Nuclear Cycle Development Institute under the JNC Cooperative Research Scheme on the Nuclear Fuel Cycle. (author)

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  15. Computer game playing and social skills: a pilot study

    OpenAIRE

    Griffiths, MD

    2010-01-01

    Computer game playing is a popular leisure activity. However, little is known about the longer-term effects that regular computer game playing could have on social development. A questionnaire study was conducted with 144 undergraduate students examining frequency of computer game playing behaviour against scores on the Social Situations Questionnaire designed to identify social inadequacy. Results showed that high frequency computer game players exhibited more social anxiety than low frequency players.

  16. Computer Science and Engineering Students Addressing Critical Issues Regarding Gender Differences in Computing: A Case Study

    Science.gov (United States)

    Tsagala, Evrikleia; Kordaki, Maria

    2008-01-01

    This study focuses on how Computer Science and Engineering Students (CSESs) of both genders address certain critical issues for gender differences in the field of Computer Science and Engineering (CSE). This case study is based on research conducted on a sample of 99 Greek CSESs, 43 of which were women. More specifically, these students were asked…

  17. Definition study for temperature control in advanced protein crystal growth

    Science.gov (United States)

    Nyce, Thomas A.; Rosenberger, Franz; Sowers, Jennifer W.; Monaco, Lisa A.

    1990-01-01

    Some of the technical requirements for an expedient application of temperature control to advanced protein crystal growth activities are defined. Lysozyme was used to study the effects of temperature ramping and temperature gradients for nucleation/dissolution and consecutive growth of sizable crystals, and to determine a prototype temperature program. The solubility study was conducted using equine serum albumin (ESA), which is an extremely stable, clinically important protein due to its capability to bind and transport many different small ions and molecules.

  18. Experimental and computational studies of nanofluids

    Science.gov (United States)

    Vajjha, Ravikanth S.

    The goals of this dissertation were (i) to experimentally investigate the fluid dynamic and heat transfer performance of nanofluids in a circular tube, (ii) to study the influence of temperature and particle volumetric concentration of nanofluids on thermophysical properties, heat transfer and pumping power, (iii) to measure the rheological properties of various nanofluids and (iv) to investigate using a computational fluid dynamic (CFD) technique the performance of nanofluids in the flat tube of a radiator. Nanofluids are a new class of fluids prepared by dispersing nanoparticles with average sizes of less than 100 nm in traditional heat transfer fluids such as water, oil, ethylene glycol and propylene glycol. In cold regions of the world, the choice of base fluid for heat transfer applications is an ethylene glycol or propylene glycol mixed with water in different proportions. In the present research, a 60% ethylene glycol (EG) or propylene glycol (PG) and 40% water (W) by mass fluid mixture (60:40 EG/W or 60:40 PG/W) was used as a base fluid, which provides freeze protection to a very low level of temperature. Experiments were conducted to measure the convective heat transfer coefficient and pressure loss of nanofluids flowing in a circular tube in the fully developed turbulent regime. The experimental measurements were carried out for aluminum oxide (Al2O3), copper oxide (CuO) and silicon dioxide (SiO2) nanoparticles dispersed in 60:40 EG/W base fluid. Experiments revealed that the heat transfer coefficient of nanofluids showed an increase with the particle volumetric concentration. Pressure loss was also observed to increase with the nanoparticle volumetric concentration. New correlations for the Nusselt number and the friction factor were developed. The effects of temperature and particle volumetric concentration on different thermophysical properties (e.g. viscosity, thermal conductivity, specific heat and density) and subsequently on the Prandtl number
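The Prandtl number that closes the abstract is a simple combination of the measured thermophysical properties, Pr = cp·μ/k; a sketch with invented property values for a cold 60:40 EG/W mixture (the numbers below are illustrative, not the dissertation's measurements):

```python
def prandtl(mu, cp, k):
    """Prandtl number Pr = cp * mu / k: the ratio of momentum
    diffusivity to thermal diffusivity of the fluid."""
    return cp * mu / k

# Invented property values for a 60:40 EG/W base fluid:
# mu in Pa·s, cp in J/(kg·K), k in W/(m·K).
pr = prandtl(mu=4.0e-3, cp=3300.0, k=0.38)
print(round(pr, 1))  # → 34.7
```

Because glycol mixtures are far more viscous than water, their Prandtl numbers are high, which is one reason nanoparticle loading shifts both heat transfer and pumping power noticeably.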

  19. Young Researchers Advancing Computational Science: Perspectives of the Young Scientists Conference 2015

    NARCIS (Netherlands)

    Boukhanovsky, A.V.; Ilyin, V.A; Krzhizhanovskaya, V.V.; Athanassoulis, G.A.; Klimentov, A.A.; Sloot, P.M.A.

    2015-01-01

    We present an annual international Young Scientists Conference (YSC) on computational science http://ysc.escience.ifmo.ru/, which brings together renowned experts and young researchers working in high-performance computing, data-driven modeling, and simulation of large-scale complex systems. The

  20. Cloud and fog computing in 5G mobile networks emerging advances and applications

    CERN Document Server

    Markakis, Evangelos; Mavromoustakis, Constandinos X; Pallis, Evangelos

    2017-01-01

    This book focuses on the challenges and solutions related to cloud and fog computing for 5G mobile networks, and presents novel approaches to the frameworks and schemes that carry out storage, communication, computation and control in the fog/cloud paradigm.

  1. Kids at CERN Grids for Kids programme leads to advanced computing knowledge.

    CERN Multimedia

    2008-01-01

    Children as young as 10 were learning computing skills, such as middleware, parallel processing and supercomputing, at CERN, the European Organisation for Nuclear Research, last week. The initiative for 10 to 12 year olds is part of the Grids for Kids programme, which aims to introduce Grid computing as a tool for research.

  2. Young Researchers Advancing Computational Science: Perspectives of the Young Scientists Conference 2015

    CERN Document Server

    Boukhanovsky, Alexander V; Krzhizhanovskaya, Valeria V; Athanassoulis, Gerassimos A; Klimentov, Alexei A; Sloot, Peter M A

    2015-01-01

    We present an annual international Young Scientists Conference (YSC) on computational science http://ysc.escience.ifmo.ru/, which brings together renowned experts and young researchers working in high-performance computing, data-driven modeling, and simulation of large-scale complex systems. The first YSC event was organized in 2012 by the University of Amsterdam, the Netherlands and ITMO University, Russia with the goal of opening a dialogue on the present and the future of computational science and its applications. We believe that the YSC conferences will strengthen the ties between young scientists in different countries, thus promoting future collaboration. In this paper we briefly introduce the challenges the millennial generation is facing; describe the YSC conference history and topics; and list the keynote speakers and program committee members. This volume of Procedia Computer Science presents selected papers from the 4th International Young Scientists Conference on Computational Science held on 25 ...

  3. BOOK REVIEW: Advanced Topics in Computational Partial Differential Equations: Numerical Methods and Diffpack Programming

    Science.gov (United States)

    Katsaounis, T. D.

    2005-02-01

    The scope of this book is to present well-known simple and advanced numerical methods for solving partial differential equations (PDEs) and how to implement these methods using the programming environment of the software package Diffpack. A basic background in PDEs and numerical methods is required by the potential reader. Further, a basic knowledge of the finite element method and its implementation in one and two space dimensions is required. The authors claim that no prior knowledge of the package Diffpack is required, which is true, but the reader should be at least familiar with an object-oriented programming language like C++ in order to better comprehend the programming environment of Diffpack. Certainly, prior knowledge or use of Diffpack would be a great advantage to the reader. The book consists of 15 chapters, each one written by one or more authors. Each chapter is basically divided into two parts: the first part is about mathematical models described by PDEs and numerical methods to solve these models; the second part describes how to implement the numerical methods using the programming environment of Diffpack. Each chapter closes with a list of references on its subject. The first nine chapters cover well-known numerical methods for solving the basic types of PDEs. Further, programming techniques on the serial as well as the parallel implementation of numerical methods are also included in these chapters. The last five chapters are dedicated to applications, modelled by PDEs, in a variety of fields. The first chapter is an introduction to parallel processing. It covers fundamentals of parallel processing in a simple and concrete way, and no prior knowledge of the subject is required. Examples of the parallel implementation of basic linear algebra operations are presented using the Message Passing Interface (MPI) programming environment. Here, some knowledge of MPI routines is required by the reader.
Examples of solving simple PDEs in parallel using
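The kind of numerical method the book implements in Diffpack can be illustrated in miniature. Below is a minimal pure-Python sketch of one explicit finite-difference step for the 1-D heat equation; Diffpack itself is a C++ environment, so this only illustrates the numerical idea, not the package's API:

```python
def heat_step(u, alpha=0.25):
    """One explicit finite-difference step of the 1-D heat equation
    u_t = u_xx with fixed (Dirichlet) endpoints.
    alpha = dt / dx**2 must be <= 0.5 for stability."""
    interior = [u[i] + alpha * (u[i - 1] - 2.0 * u[i] + u[i + 1])
                for i in range(1, len(u) - 1)]
    return [u[0]] + interior + [u[-1]]
```

Repeatedly applying `heat_step` relaxes an initial profile toward the linear steady state set by the boundary values.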

  4. Computational study of patterns in simple nonequilibrium systems

    International Nuclear Information System (INIS)

    Barach, J.P.

    1997-01-01

    We present computational studies of a two-component reaction-diffusion system of the Gray and Scott type. The calculation involves a discrete treatment of the diffusion equation, and some details of that problem are explained. As the simulation runs over a 200×200 square spatial field, ridge-like patterns develop if one diffusion coefficient is about twice the size of the other and if the rate parameters are in a narrow range. Pattern development is faster when the reaction rates are larger, within this range. It is shown that for an advancing wave, the lead component has a wider front than the other, although in steady state the two components obey a ridge/valley or valley/ridge equilibrium. We investigate ways in which a more complex time dependence could be introduced to the system and display one example of such a possible expansion of the study. A correlation coefficient study shows a modest but distinct difference between our pattern development and a random field. copyright 1997 American Institute of Physics
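The discrete treatment of the diffusion equation described in the abstract can be sketched with a five-point Laplacian. The following pure-Python step uses the standard Gray-Scott reaction terms; the diffusion coefficients and rate parameters are illustrative defaults, not the paper's actual values:

```python
def laplacian(grid, i, j):
    """Five-point discrete Laplacian with periodic boundaries."""
    n = len(grid)
    return (grid[(i - 1) % n][j] + grid[(i + 1) % n][j]
            + grid[i][(j - 1) % n] + grid[i][(j + 1) % n]
            - 4.0 * grid[i][j])

def gray_scott_step(u, v, du=0.16, dv=0.08, f=0.035, k=0.065, dt=1.0):
    """One explicit Euler step of the Gray-Scott reaction-diffusion system.

    du ~ 2 * dv mirrors the abstract's observation that patterns form when
    one diffusion coefficient is about twice the other; f and k are
    illustrative rate parameters.
    """
    n = len(u)
    u_new = [row[:] for row in u]
    v_new = [row[:] for row in v]
    for i in range(n):
        for j in range(n):
            uvv = u[i][j] * v[i][j] ** 2
            u_new[i][j] = u[i][j] + dt * (du * laplacian(u, i, j)
                                          - uvv + f * (1.0 - u[i][j]))
            v_new[i][j] = v[i][j] + dt * (dv * laplacian(v, i, j)
                                          + uvv - (f + k) * v[i][j])
    return u_new, v_new
```

Starting from the uniform state (u = 1, v = 0) and perturbing a small patch of v, repeated steps over a 200×200 field develop the kinds of patterns the paper studies.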

  5. Evaluation Statistics Computed for the Wave Information Studies (WIS)

    Science.gov (United States)

    2016-07-01

    defining it as the standard deviation of the errors (i.e., demeaned RMSE) divided by the mean of the observations (Mentaschi et al. 2013), as done by...SI. To overcome these challenges, the wave model community should make strides in standardizing statistical metrics to advance the objective...ERDC/CHL CHETN-I-91 July 2016 Approved for public release; distribution is unlimited. Evaluation Statistics Computed for the Wave Information
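The statistic described in the snippet above (the standard deviation of the errors, i.e., demeaned RMSE, divided by the mean of the observations) can be sketched as follows; this is a generic illustration of that definition, not the WIS evaluation code:

```python
from math import sqrt

def demeaned_rmse_si(model, obs):
    """Scatter-index-style statistic: standard deviation of the errors
    (demeaned RMSE) divided by the mean of the observations, following
    the definition attributed to Mentaschi et al. (2013)."""
    errors = [m - o for m, o in zip(model, obs)]
    bias = sum(errors) / len(errors)
    std_err = sqrt(sum((e - bias) ** 2 for e in errors) / len(errors))
    return std_err / (sum(obs) / len(obs))
```

Because the bias is removed first, a model that is consistently offset from the observations but tracks their variability scores 0 on this metric.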

  6. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    Science.gov (United States)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost, parallel systems to increase system performance. Research conducted in the development of specialized computer architecture for the algorithmic execution of an avionics guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.

  7. A comparison between ten advanced and soft computing models for groundwater qanat potential assessment in Iran using R and GIS

    Science.gov (United States)

    Naghibi, Seyed Amir; Pourghasemi, Hamid Reza; Abbaspour, Karim

    2018-02-01

    Considering the unstable condition of water resources in Iran and many other countries in arid and semi-arid regions, groundwater studies are very important. Therefore, the aim of this study is to model groundwater potential using qanat locations as indicators and ten advanced and soft computing models, applied to the Beheshtabad Watershed, Iran. A qanat is a man-made underground construction which gathers groundwater from higher altitudes and transmits it to lowland areas where it can be used for different purposes. For this purpose, at first, the locations of the qanats were detected using extensive field surveys. These qanats were divided into two datasets: training (70%) and validation (30%). Then, 14 influence factors depicting the region's physical, morphological, lithological, and hydrological features were identified to model groundwater potential. Linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), flexible discriminant analysis (FDA), penalized discriminant analysis (PDA), boosted regression tree (BRT), random forest (RF), artificial neural network (ANN), K-nearest neighbor (KNN), multivariate adaptive regression splines (MARS), and support vector machine (SVM) models were applied in R scripts to produce groundwater potential maps. To evaluate the performance accuracy of the developed models, the ROC curve and kappa index were implemented. According to the results, RF had the best performance, followed by the SVM and BRT models. Our results showed that qanat locations could be used as a good indicator of groundwater potential. Furthermore, altitude, slope, plan curvature, and profile curvature were found to be the most important influence factors, whereas lithology, land use, and slope aspect were the least significant. The methodology in the current study could be used by land use and terrestrial planners and water resource managers to reduce the costs of groundwater resource discovery.
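Of the two evaluation measures mentioned, the kappa index can be sketched for a binary (potential / no-potential) classification against validation locations. This is a generic Cohen's kappa, not the study's R code, and the confusion-matrix counts in the test are hypothetical:

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa for a 2x2 confusion matrix:
    agreement between prediction and reference beyond chance."""
    n = tp + fp + fn + tn
    p_observed = (tp + tn) / n
    # Chance agreement from the marginal frequencies of each class
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)
    p_no = ((fn + tn) / n) * ((fp + tn) / n)
    p_expected = p_yes + p_no
    return (p_observed - p_expected) / (1.0 - p_expected)
```

Kappa is 1 for perfect agreement, 0 for chance-level agreement, and negative when agreement is worse than chance, which makes it a stricter map-accuracy measure than raw percent correct.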

  8. Standardized uptake value on positron emission tomography/computed tomography predicts prognosis in patients with locally advanced pancreatic cancer.

    Science.gov (United States)

    Wang, Si-Liang; Cao, Shuo; Sun, Yu-Nan; Wu, Rong; Chi, Feng; Tang, Mei-Yue; Jin, Xue-Ying; Chen, Xiao-Dong

    2015-10-01

    The aim of the present study was to investigate the use and value of maximum standardized uptake value (SUVmax) on positron emission tomography/computed tomography (PET/CT) images as a prognostic marker for patients with locally advanced pancreatic cancer (LAPC). The medical records of all consecutive patients who underwent PET/CT examination in our institution were retrospectively reviewed. Inclusion criteria were histologically or cytologically proven LAPC. Patients with distant metastasis were excluded. For statistical analysis, the SUVmax of the primary pancreatic cancer was measured. Survival rates were calculated using the Kaplan-Meier method, and multivariable analysis was performed to determine the association of SUVmax with overall survival (OS) and progression-free survival (PFS) using a Cox proportional hazards model. Between July 2006 and June 2013, 69 patients were enrolled in the present study. OS and PFS were 14.9 months [95% confidence interval (CI) 13.1-16.7] and 8.3 months (95% CI 7.1-9.5), respectively. A high SUVmax (>5.5) was observed in 35 patients, who had significantly worse OS and PFS than the remaining patients with a low SUVmax (P = 0.025 and P = 0.003). Univariate analysis showed that SUVmax and tumor size were prognostic factors for OS, with hazard ratios of 1.90 and 1.81, respectively. A high SUVmax was an independent prognostic factor, with a hazard ratio of 1.89 (95% CI 1.015-3.519, P = 0.045). The present study suggests that increased SUVmax is a predictor of poor prognosis in patients with LAPC.
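The Kaplan-Meier method used for the survival rates above can be sketched as a product-limit estimator. This is a generic illustration with made-up follow-up times, not the study's analysis code:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve (product-limit estimator).

    times:  follow-up time for each patient
    events: 1 if the event (death/progression) occurred, 0 if censored
    Returns a list of (time, survival probability) at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        at_this_time = [e for tt, e in data if tt == t]
        deaths = sum(at_this_time)
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        # Censored patients leave the risk set but do not drop the curve
        n_at_risk -= len(at_this_time)
    return curve
```

Comparing the curves of the high- and low-SUVmax groups (e.g., with a log-rank test) is the standard next step the study's P values correspond to.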

  9. FY 1997 Blue Book: High Performance Computing and Communications: Advancing the Frontiers of Information Technology

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — The Federal High Performance Computing and Communications HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of...

  10. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR Staging

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict dru

  11. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Poster

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  12. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict dru

  13. Advances in thermal-hydraulic studies of a transmutation advanced device for sustainable energy applications

    Energy Technology Data Exchange (ETDEWEB)

    Fajardo, Laura Garcia, E-mail: laura.gf@cern.ch [European Organization for Nuclear Research (CERN), Geneva (Switzerland). Technology Department; Hernandez, Carlos Garcia; Mazaira, Leorlen Rojas, E-mail: cgh@instec.cu, E-mail: irojas@instec.cu [Higher Institute of Technologies and Applied Sciences (INSTEC), Habana (Cuba); Castells, Facundo Alberto Escriva, E-mail: aescriva@iqn.upv.es [University of Valencia (UV), Valencia (Spain). Energetic Engineering Institute; Lira, Carlos Brayner de Olivera, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Engenharia Nuclear

    2013-07-01

    The Transmutation Advanced Device for Sustainable Energy Applications (TADSEA) is a pebble-bed Accelerator Driven System (ADS) with a graphite-gas configuration, designed for nuclear waste transmutation and for obtaining heat at very high temperatures to produce hydrogen. In previous work, the TADSEA's nuclear core was modelled as a porous medium with a CFD code, and thermal-hydraulic studies of the nuclear core were presented. In this paper, the heat transfer from the fuel to the coolant was analyzed for three core states during normal operation. The heat transfer inside the spherical fuel elements was also studied. Three critical fuel element groups were defined regarding their position inside the core. Results were compared with a realistic CFD model of the critical fuel element groups. During the steady state, no critical elements reached the limit temperature of this type of fuel. (author)

  14. 5th Conference on Advanced Mathematical and Computational Tools in Metrology

    CERN Document Server

    Cox, M G; Filipe, E; Pavese, F; Richter, D

    2001-01-01

    Advances in metrology depend on improvements in scientific and technical knowledge and in instrumentation quality, as well as on better use of advanced mathematical tools and development of new ones. In this volume, scientists from both the mathematical and the metrological fields exchange their experiences. Industrial sectors, such as instrumentation and software, will benefit from this exchange, since metrology has a high impact on the overall quality of industrial products, and applied mathematics is becoming more and more important in industrial processes. This book is of interest to people

  15. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    Science.gov (United States)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED for use with biomechanics models in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV and C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  16. Observational study of sleep disturbances in advanced cancer.

    Science.gov (United States)

    Davies, Andrew Neil; Patel, Shuchita D; Gregory, Amanda; Lee, Bernadette

    2017-12-01

    To determine the prevalence of nightmares, sleep terrors and vivid dreams in patients with advanced cancer (and the factors associated with them in this group of patients). The study was a multicentre, prospective observational study. Participants were patients with locally advanced/metastatic cancer who were under the care of a specialist palliative care team. Data were collected on demographics, cancer diagnosis, cancer treatment, current medication, performance status, sleep quality (Pittsburgh Sleep Quality Index), dreams and nightmares, and physical and psychological symptoms (Memorial Symptom Assessment Scale-Short Form). 174 patients completed the study. Sleep quality was poor in 70.5% of participants and was worse in younger patients and in inpatients (hospital, hospice). 18% of patients reported nightmares, 8% sleep terrors and 34% vivid dreams. Nightmares were associated with poor sleep quality and greater sleep disturbance; nightmares were also associated with greater physical and psychological burden. Nightmares (and vivid dreams) were not associated with the use of opioid analgesics. Nightmares do not seem to be especially common in patients with advanced cancer, and when they do occur, there is often an association with sleep disturbance and/or physical and psychological burden. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. A computed tomographic study on epilepsy

    International Nuclear Information System (INIS)

    Bae, Hoon Sik

    1980-01-01

    140 patients with epileptic seizure were studied by computed tomography during the period from Feb. 1979 to Aug. 1979 in the Department of Radiology, College of Medicine, Hanyang University. The findings on CT and clinical records, including EEG findings, were reviewed. The results were as follows: 1. Age distribution of the total 140 patients was broad, ranging from 1 month to 63 years. 73.5% of patients were below the age of 30. The patient population comprised 93 males and 47 females, a male to female ratio of 2:1. 2. The types of epileptic seizure were classified according to the International League Against Epilepsy. 42.9% of patients had primary generalized seizure, 47.1% partial seizure, and 10% non-classifiable seizure. 3. Among additional symptoms and signs besides seizure, headache was most common, followed by nausea and vomiting. Uncommonly, there were also insomnia, personality change, and memory disturbance. 4. 37.1% of patients had less than 1 month of seizure history, 19.3% between 1 year and 5 years. 5. EEG findings were available in 41 patients and were normal in 15 cases. 26 patients revealed abnormal findings. Among those abnormal findings, focal slowing appeared in 19.5% and generalized slowing in 17.1%. 6. 52% of patients showed abnormal findings on CT. The most common abnormal finding was focal low density (30%), followed by diffuse hydrocephalus (7.1%). After contrast infusion, contrast enhancement occurred in cases with focal low density or focal high or isodense mass density. In patients with focal low density, ring or nodular enhancement was common, and diffuse or serpentine enhancement in focal high or isodense mass density. 7. The frequency of structural abnormalities on CT was more common in patients below the age of 10 and over 30 than in other age groups. Epilepsy starting below 10 and over 30 years of age showed structural abnormalities in 63.6-100%. 8. The patients who had less than 6 months of

  18. The effect of psychosocial stress on muscle activity during computer work: Comparative study between desktop computer and mobile computing products.

    Science.gov (United States)

    Taib, Mohd Firdaus Mohd; Bahn, Sangwoo; Yun, Myung Hwan

    2016-06-27

    The popularity of mobile computing products is well known. Thus, it is crucial to evaluate their contribution to musculoskeletal disorders during computer usage in both comfortable and stressful environments. This study explores the effect on muscle activity of using different computer products with different tasks used to induce psychosocial stress. Fourteen male subjects performed computer tasks: sixteen combinations of four different computer products with four different tasks used to induce stress. Electromyography for four muscles in the forearm, shoulder and neck regions, as well as task performance, was recorded. The increment in trapezius muscle activity depended on the task used to induce the stress, with a higher level of stress producing a greater increment. However, this relationship was not found in the other three muscles. In addition, compared to desktop and laptop use, the lowest activity for all muscles was obtained during the use of a tablet or smartphone. The best net performance was obtained in a comfortable environment. However, during stressful conditions, the best performance can be obtained using the device that a user is most comfortable or experienced with. Different computer products and different levels of stress play a big role in muscle activity during computer work. Both of these factors must be taken into account in order to reduce the occurrence of musculoskeletal disorders or problems.

  19. Computed tomographic study in children with microcephaly

    International Nuclear Information System (INIS)

    Ito, Masatoshi; Okuno, Takehiko; Mikawa, Haruki

    1989-01-01

    Computed tomographic (CT) brain scanning was performed on fifty-eight infants and children with microcephaly. CT scans were useful for detecting unsuspected brain lesions and for diagnosing underlying diseases. The head size did not correlate with the CT findings, the degree of mental retardation, or the existence of motor disturbance or epilepsy. On the other hand, the CT findings were correlated with the degree of mental retardation, and the existence of motor disturbance or epilepsy. CT scans were useful for determining the prognosis of the microcephaly. (author)

  20. NATO Advanced Study Institute on Low Temperature Molecular Spectroscopy

    CERN Document Server

    1996-01-01

    Molecular spectroscopy has achieved rapid and significant progress in recent years, the low temperature techniques in particular having proved very useful for the study of reactive species, phase transitions, molecular clusters and crystals, superconductors and semiconductors, biochemical systems, astrophysical problems, etc. The widening range of applications has been accompanied by significant improvements in experimental methods, and low temperature molecular spectroscopy has been revealed as the best technique, in many cases, to establish the connection between experiment and theoretical calculations. This, in turn, has led to a rapidly increasing ability to predict molecular spectroscopic properties. The combination of an advanced tutorial standpoint with an emphasis on recent advances and new perspectives in both experimental and theoretical molecular spectroscopy contained in this book offers the reader insight into a wide range of techniques, particular emphasis being given to supersonic jet and matri...

  1. Concurrent chemoradiotherapy for advanced cervical cancer. A pilot study

    International Nuclear Information System (INIS)

    Kodama, Junichi; Hashimoto, Ichiro; Seki, Noriko; Hongo, Atsushi; Mizutani, Yasushi; Miyagi, Yasunari; Yoshinouchi, Mitsuo; Kudo, Takafumi

    2001-01-01

    Recently, attempts have been made to use radiotherapy in combination with chemotherapy in various solid tumors, including cervical cancer. Twenty-four patients with locally advanced cervical cancer were treated with concurrent Carboplatin (16-24 mg/m²/day) or Nedaplatin (20 mg/m²/week) and conventional radiotherapy. Of 13 evaluable patients, there were nine complete responders and four partial responders. There was no renal damage or grade 4 hematological toxicity. Gastrointestinal adverse reactions were mild. One patient had grade 3 dermatologic toxicity after delayed radiation therapy. This pilot study suggests that daily Carboplatin or weekly Nedaplatin administered with standard radiation therapy is safe, well-tolerated, and thus may be useful as a radiation sensitizer in the treatment of locally advanced cervical cancer. (author)

  2. Computer Science and Engineering Students Addressing Critical Issues Regarding Gender Differences in Computing: a Case Study

    OpenAIRE

    Evrikleia Tsagala; Maria Kordaki

    2008-01-01

    This study focuses on how Computer Science and Engineering Students (CSESs) of both genders address certain critical issues for gender differences in the field of Computer Science and Engineering (CSE). This case study is based on research conducted on a sample of 99 Greek CSESs, 43 of which were women. More specifically, these students were asked to respond to a specially designed questionnaire addressing the following issues: a) essential motives in selecting CSE as a subject of study, thei...

  3. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    Energy Technology Data Exchange (ETDEWEB)

    Simunovic, S.; Aramayo, G.A.; Zacharia, T. [Oak Ridge National Lab., TN (United States); Toridis, T.G. [George Washington Univ., Washington, DC (United States); Bandak, F.; Ragland, C.L. [Dept. of Transportation, Washington, DC (United States)

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Transportation Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted and digitized and directly compared with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows a very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.

  4. Recent Advances of Computational Modeling for Predicting Drug Metabolism: A Perspective.

    Science.gov (United States)

    Kar, Supratik; Leszczynski, Jerzy

    2017-01-01

    Absorption, Distribution, Metabolism, Excretion (ADME) properties along with drug-induced adverse effects are the major reasons for the late-stage failure of drug candidates as well as the cause of the expensive withdrawal of many approved drugs from the market. Considering the adverse effects of drugs, metabolism has great importance in medicinal chemistry and clinical pharmacology because it influences the deactivation, activation, detoxification and toxification of drugs. Computational methods are effective approaches to reduce the number of safety issues by analyzing possible links between chemical structures and metabolism followed by adverse effects, as they serve to integrate information on several levels to enhance the reliability of outcomes. In silico profiling of drug metabolism can help progress only those molecules along the discovery chain that are less likely to fail later in the drug discovery process. This positively impacts the very high costs of drug discovery and development. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically consequential way, computational tools may improve the capability to identify and evaluate potential drug molecules considering the pharmacokinetic properties of drugs. Herein, current trends in computational modeling for predicting drug metabolism are reviewed, highlighting new computational tools for drug metabolism prediction, followed by a report on large and integrated databases of approved drugs associated with diverse metabolism issues. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis computer program user's manual

    Science.gov (United States)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The computer program user's manual for the ADPACAPES (Advanced Ducted Propfan Analysis Code-Average Passage Engine Simulation) program is included. The objective of the computer program is development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes meeting the requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. The efficiency of the solution procedure was shown to be the same as the original analysis.
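The four-stage Runge-Kutta time-marching procedure mentioned above can be illustrated with the classical fourth-order scheme for a scalar ODE; the solver's own four-stage variant, applied to finite-volume state vectors, may differ in detail:

```python
def rk4_step(f, t, y, dt):
    """One classical four-stage Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2.0, y + dt * k1 / 2.0)
    k3 = f(t + dt / 2.0, y + dt * k2 / 2.0)
    k4 = f(t + dt, y + dt * k3)
    # Weighted average of the four slope estimates (fourth-order accurate)
    return y + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
```

For dy/dt = y with y(0) = 1, one step of size 0.1 reproduces exp(0.1) to about seven decimal places, which is why four-stage marching is a common workhorse in flow solvers of this kind.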

  6. [Diagnosis. Radiological study. Ultrasound, computed tomography and magnetic resonance imaging].

    Science.gov (United States)

    Gallo Vallejo, Francisco Javier; Giner Ruiz, Vicente

    2014-01-01

    Because of its low cost, availability in primary care and ease of interpretation, simple X-ray should be the first-line imaging technique used by family physicians for the diagnosis and/or follow-up of patients with osteoarthritis. Nevertheless, this technique should only be used if there are sound indications and if the results will influence decision-making. Despite the increase of indications in patients with rheumatological disease, the role of ultrasound in patients with osteoarthritis continues to be limited. Computed tomography (CT) is of some -although limited- use in osteoarthritis, especially in the study of complex joints (such as the sacroiliac joint and facet joints). Magnetic resonance imaging (MRI) has represented a major advance in the evaluation of joint cartilage and subchondral bone in patients with osteoarthritis but, because of its high cost and diagnostic-prognostic yield, this technique should only be used in highly selected patients. The indications for ultrasound, CT and MRI in patients with osteoarthritis continue to be limited in primary care and often coincide with situations in which the patient may require hospital referral. Patient safety should be bourne in mind. Patients should be protected from excessive ionizing radiation due to unnecessary repeat X-rays or inadequate views or to requests for tests such as CT, when not indicated. Copyright © 2014 Elsevier España, S.L. All rights reserved.

  7. A computational study on outliers in world music.

    Directory of Open Access Journals (Sweden)

    Maria Panteli

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as 'outliers'. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the 'uniqueness' of the music of each country.

  8. A computational study on outliers in world music.

    Science.gov (United States)

    Panteli, Maria; Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as 'outliers'. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the 'uniqueness' of the music of each country.

  9. A computational study on outliers in world music

    Science.gov (United States)

    Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as ‘outliers’. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the ‘uniqueness’ of the music of each country. PMID:29253027
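The outlier notion used in this work (recordings most distinct from the rest of the corpus) can be sketched as a simple distance-based criterion: flag items whose mean distance to all others is unusually high. The feature vectors and z-score threshold below are illustrative, not the study's actual audio features or criterion:

```python
from math import sqrt

def euclidean(a, b):
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def distance_outliers(features, z_cut=2.0):
    """Flag items whose mean distance to all others is unusually high.

    features: list of feature vectors (one per recording)
    z_cut:    z-score threshold on the mean-distance distribution
    Returns the indices of the flagged outliers.
    """
    n = len(features)
    mean_dist = [sum(euclidean(features[i], features[j])
                     for j in range(n) if j != i) / (n - 1)
                 for i in range(n)]
    mu = sum(mean_dist) / n
    sigma = sqrt(sum((d - mu) ** 2 for d in mean_dist) / n)
    return [i for i in range(n)
            if sigma > 0 and (mean_dist[i] - mu) / sigma > z_cut]
```

In the study, a spatial-statistics correction is additionally applied so that a recording is not flagged merely for being geographically far from well-sampled regions; the sketch above omits that step.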

  10. Advanced computer-aided design for bone tissue-engineering scaffolds.

    Science.gov (United States)

    Ramin, E; Harris, R A

    2009-04-01

    The design of scaffolds with an intricate and controlled internal structure represents a challenge for tissue engineering. Several scaffold-manufacturing techniques allow the creation of complex architectures but with little or no control over the main features of the channel network such as the size, shape, and interconnectivity of each individual channel, resulting in intricate but random structures. The combined use of computer-aided design (CAD) systems and layer-manufacturing techniques allows a high degree of control over these parameters with few limitations in terms of achievable complexity. However, the design of complex and intricate networks of channels required in CAD is extremely time-consuming since manually modelling hundreds of different geometrical elements, all with different parameters, may require several days to design individual scaffold structures. An automated design methodology is proposed by this research to overcome these limitations. This approach involves the investigation of novel software algorithms, which are able to interact with a conventional CAD program and permit the automated design of several geometrical elements, each with a different size and shape. In this work, the variability of the parameters required to define each geometry has been set as random, but any other distribution could have been adopted. This methodology has been used to design five cubic scaffolds with interconnected pore channels that range from 200 to 800 μm in diameter, each with an increased complexity of the internal geometrical arrangement. A clinical case study, consisting of an integration of one of these geometries with a craniofacial implant, is then presented.

  11. Developing advanced X-ray scattering methods combined with crystallography and computation.

    Science.gov (United States)

    Perry, J Jefferson P; Tainer, John A

    2013-03-01

    The extensive use of small angle X-ray scattering (SAXS) over the last few years is rapidly providing new insights into protein interactions, complex formation and conformational states in solution. This SAXS methodology allows for detailed biophysical quantification of samples of interest. Initial analyses provide a judgment of sample quality, revealing the potential presence of aggregation, the overall extent of folding or disorder, the radius of gyration, maximum particle dimensions and oligomerization state. Structural characterizations include ab initio approaches from SAXS data alone, and, when combined with previously determined crystal/NMR structures, atomistic modeling can further enhance structural solutions and assess validity. This combination can provide definitions of architectures, spatial organizations of protein domains within a complex, including those not determined by crystallography or NMR, as well as defining key conformational states of a protein interaction. SAXS is not generally constrained by macromolecule size, and the rapid collection of data in a 96-well plate format provides methods to screen sample conditions. This includes screening for co-factors, substrates, differing protein or nucleotide partners or small molecule inhibitors, to more fully characterize the variations within assembly states and key conformational changes. Such analyses may be useful for screening constructs and conditions to determine those most likely to promote crystal growth of a complex under study. Moreover, these high throughput structural determinations can be leveraged to define how polymorphisms affect assembly formations and activities. This is in addition to potentially providing architectural characterizations of complexes and interactions for systems biology-based research, and distinctions in assemblies and interactions in comparative genomics. Thus, SAXS combined with crystallography/NMR and computation provides a unique set of tools that should be considered

  12. Leveraging Advanced Computational Algorithms and Satellite Remote Sensing Products for Flood Forecasting

    Science.gov (United States)

    Aristizabal, F.; Judge, J.; Rangarajan, A.

    2016-12-01

    According to the 2015 World Disasters Report published by the International Federation of Red Cross and Red Crescent Societies, floods, on an average annual reported basis during the period of 2005 to 2014, produced over 5,900 fatalities, over 86 million affected individuals, and over $34 billion in damages. Flood warning and forecasting systems relying on earth observations have generally employed deterministic, process-based hydrology models for near real-time operational applications. Limited research exists examining the application of computational and informatics algorithms for flood forecasting from earth observation data products. The goal of this study is to determine the efficacy of data-driven methodologies for accurately forecasting riverine flooding while exclusively utilizing remotely-sensed, satellite-based observations. Several data-driven spatiotemporal techniques for other applications have been employed, such as adaptive resonance theory-MMAP (ART-MMAP) networks and associative neural networks. The initial methodology for flood prediction, similar to ART-MMAP, will be applied to a significant, historical flood spanning a data-rich agricultural area. Existing flood monitoring systems that could be appropriated as dependent data include MODIS Near-Real Time Global Flood Mapping and the Global Flood Detection System. Auxiliary data that are relational and causational to flooding include ground moisture, precipitation, elevation models, and land cover, all of which are available from earth observation products. Ground data from observational records and in-situ stream gauges will be used to verify the accuracy of predictions and quantify omission and commission errors.

  13. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decrease in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we have made an overview on distributed computing. In this paper we studied the difference between parallel and distributed computing, terminologies used in distributed computing, task allocation in distribute...

  14. High performance computing and communications: Advancing the frontiers of information technology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.

  15. Advances in computational dynamics of particles, materials and structures a unified approach

    CERN Document Server

    Har, Jason

    2012-01-01

    Computational methods for the modeling and simulation of the dynamic response and behavior of particles, materials and structural systems have had a profound influence on science, engineering and technology. Complex science and engineering applications dealing with complicated structural geometries and materials that would be very difficult to treat using analytical methods have been successfully simulated using computational tools. With the incorporation of quantum, molecular and biological mechanics into new models, these methods are poised to play an even bigger role in the future. Ad

  16. Further assessment studies of the Advanced Cold Process Canister

    International Nuclear Information System (INIS)

    Henshaw, J.; Hoch, A.; Sharland, S.M.

    1990-08-01

    A preliminary assessment of the performance of the Advanced Cold Process Canister (ACPC) was carried out recently by Marsh. The aim of the study presented in this report is to re-examine the validity of some of the assumptions made, and re-evaluate the canister performance as appropriate. Two areas were highlighted in the preliminary study as requiring more detailed quantitative evaluation. 1) Assessment of the risk of internal stress-corrosion cracking induced by irradiation of moist air inside the canister if, under fault conditions, significant water was carried into the canister before sealing. 2) Evaluation of the corrosion behaviour subsequent to first breach of outer container. (author)

  17. Advances in molecular genetic studies of primary dystonia

    Directory of Open Access Journals (Sweden)

    MA Ling-yan

    2013-07-01

    Full Text Available Dystonias are heterogeneous hyperkinetic movement disorders characterized by involuntary muscle contractions which result in twisting, repetitive movements and abnormal postures. In recent years, there have been great advances in molecular genetic studies of primary dystonia. This paper will review the clinical characteristics and molecular genetic studies of primary dystonia, including early-onset generalized torsion dystonia (DYT1), whispering dysphonia (DYT4), dopa-responsive dystonia (DYT5), mixed-type dystonia (DYT6), paroxysmal kinesigenic dyskinesia (DYT10), myoclonus-dystonia syndrome (DYT11), rapid-onset dystonia parkinsonism (DYT12), adult-onset cervical dystonia (DYT23), craniocervical dystonia (DYT24) and primary torsion dystonia (DYT25).

  18. Studies on computer analysis for radioisotope images

    International Nuclear Information System (INIS)

    Takizawa, Masaomi

    1977-01-01

    A hybrid-type image file and processing system was devised by the author for filing radioisotope images and processing them with analog display. The system has the following functions: ten thousand images can be stored on a 60-foot video tape recorder (VTR) tape; the maximum access time for an image on the VTR tape is within 15 sec; and image display is enabled by the analog memory, which provides more than 15 gray levels of brightness. By using the analog memories, effective image processing can be done by a small computer, and many signal sources can be input to the hybrid system. The system can be applied in many fields, for both routine work and multi-purpose radioisotope image processing. (auth.)

  19. Experimental PIV and CFD studies of UV-peroxide advanced oxidation reactors for water treatment

    International Nuclear Information System (INIS)

    Sozzi, A.; Taghipour, F.

    2004-01-01

    An experimental and numerical study of the flow characteristics in an annular UV reactor, as used for drinking water disinfection or Advanced Oxidation Processes, was carried out using Particle Image Velocimetry (PIV) and Computational Fluid Dynamics (CFD). The influence of different turbulence models and mesh structures on the CFD results was investigated. By qualitative and quantitative comparison of CFD and PIV experimental data, it was shown that the Realizable k-ε turbulence model is best suited for simulating the hydrodynamics of this geometry. (author)

  20. Computational and experimental study of laminar flames

    Energy Technology Data Exchange (ETDEWEB)

    Smooke, Mitchell [Yale Univ., New Haven, CT (United States)

    2015-05-29

    During the past three years, our research has centered on an investigation of the effects of complex chemistry and detailed transport on the structure and extinction of hydrocarbon flames in coflowing axisymmetric configurations. We have pursued both computational and experimental aspects of the research in parallel on both steady-state and time-dependent systems. The computational work has focused on the application of accurate and efficient numerical methods for the solution of the steady-state and time-dependent boundary value problems describing the various reacting systems. Detailed experimental measurements were performed on axisymmetric coflow flames using two-dimensional imaging techniques. Previously, spontaneous Raman scattering, chemiluminescence, and laser-induced fluorescence were used to measure the temperature, major and minor species profiles. Particle image velocimetry (PIV) has been used to investigate velocity distributions and for calibration of time-varying flames. Laser-induced incandescence (LII) with an extinction calibration was used to determine soot volume fractions, while soot surface temperatures were measured with three-color optical pyrometry using a color digital camera. A blackbody calibration of the camera allows for determination of soot volume fraction as well, which can be compared with the LII measurements. More recently, we have concentrated on a detailed characterization of soot using a variety of techniques including time-resolved LII (TiRe-LII) for soot primary particle sizes, multi-angle light scattering (MALS) for soot radius of gyration, and spectrally-resolved line of sight attenuation (spec-LOSA). Combining the information from all of these soot measurements can be used to determine the soot optical properties, which are observed to vary significantly depending on spatial location and fuel dilution. Our goal has been to obtain a more fundamental understanding of the important fluid dynamic and chemical interactions in

  1. Study on PCS heat and mass transfer of advanced PWR with CFD code

    Energy Technology Data Exchange (ETDEWEB)

    Huang, X. G.; Cheng, X. [Shanghai Jiao Tong Univ., Shanghai (China); Wang, F. N.; Zhang, Z. D.; Cheng, X. [State Nuclear Power Technology Company, Beijing (China)

    2012-03-15

    During the hypothetical Double-Ended Cold Leg Guillotine (DECLG) break of a large advanced pressurized water reactor (PWR), a large amount of steam is ejected from the break into the containment. The passive containment cooling system (PCS) is implemented to prevent over-pressure and over-temperature. The computational fluid dynamics (CFD) code GASFLOW, coupled with the Film Coverage and Evaporation Model (FICEM), is applied in this study to analyze PCS performance during the DECLG. FICEM can calculate the film coverage rate, film evaporation rate and containment heat removal capability. Results show that the modified GASFLOW version coupled with FICEM is feasible for analyzing the thermal-hydraulic behavior of the PCS of an advanced passive PWR. The capability of the PCS for a large-scale PWR is investigated using the modified GASFLOW code.

  2. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

    This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools, development and test of several advanced control concepts and applications for Chemical Looping transport control and investigation of several sensor concepts and establishment of two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  3. The views of older Malaysians on advanced directive and advanced care planning: a qualitative study.

    Science.gov (United States)

    Htut, Y; Shahrul, K; Poi, P J H

    2007-01-01

    The provision of optimum care for the ageing population is dependent on the understanding of their views and values on end-of-life issues. A qualitative descriptive study was conducted to describe the views of elderly Malaysians on Advanced Care Planning (henceforth ACP) and Advanced Directives (henceforth AD), and to explore factors influencing these views. Fifteen elderly subjects, with ages ranging from 65 to 83 years and representing different ethnic and religious groups in Malaysia, were selected for in-depth interviews guided by a questionnaire. Five core themes were extracted from the interviews: 1) considering the future; 2) contingency plans for future illnesses; 3) attitudes towards life-prolonging treatment procedures; 4) doctor-patient relationships; and 5) influence of religion on decisions related to future illness. Despite the lack of knowledge of ACP and AD, older respondents were very receptive to these concepts. Although the majority agreed on the importance of planning for future medical management and having open discussion of end-of-life issues with their doctor, they felt it unnecessary to make a formal written AD. Most felt that the future was best left to fate or God, and none had made any contingency plan for severe future illness, citing religion as the reason for this view. Cardiopulmonary resuscitation, mechanical ventilation and dialysis were considered by most to be invasive life-prolonging treatments. We suggest that doctors initiate discussions on end-of-life care with every older patient and their family so as to promote awareness and introduce the concept of ACP/AD to a Malaysian setting.

  4. Preferred computer activities among individuals with dementia: a pilot study.

    Science.gov (United States)

    Tak, Sunghee H; Zhang, Hongmei; Hong, Song Hee

    2015-03-01

    Computers offer new activities that are easily accessible, cognitively stimulating, and enjoyable for individuals with dementia. The current descriptive study examined preferred computer activities among nursing home residents with different severity levels of dementia. A secondary data analysis was conducted using activity observation logs from 15 study participants with dementia (severe = 115 logs, moderate = 234 logs, and mild = 124 logs) who participated in a computer activity program. Significant differences existed in preferred computer activities among groups with different severity levels of dementia. Participants with severe dementia spent significantly more time watching slide shows with music than those with both mild and moderate dementia (F [2,12] = 9.72, p = 0.003). Preference in playing games also differed significantly across the three groups. It is critical to consider individuals' interests and functional abilities when computer activities are provided for individuals with dementia. A practice guideline for tailoring computer activities is detailed. Copyright 2015, SLACK Incorporated.
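    The reported statistic F[2,12] = 9.72 comes from a one-way ANOVA across the three severity groups. As a hedged sketch of how such a value is obtained (the minutes-per-session data below are invented for illustration, not the study's data), the F value for k groups is the ratio of between-group to within-group mean squares:

```python
# One-way ANOVA F statistic from first principles, on invented data.

def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a list of numeric groups."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group sizes times squared mean offsets.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread around each group's own mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

severe   = [30, 35, 32, 31, 33]   # hypothetical minutes watching slide shows
moderate = [20, 22, 19, 21, 23]
mild     = [18, 17, 20, 19, 16]

f, df_b, df_w = one_way_anova_f([severe, moderate, mild])
print(f"F[{df_b},{df_w}] = {f:.2f}")
```

    With three groups of five participants each, the degrees of freedom come out as [2,12], matching the shape of the statistic reported in the abstract.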

  5. Advancement and Implementation of Integrated Computational Materials Engineering (ICME) for Aerospace Applications

    Science.gov (United States)

    2010-03-01

    (Fragmentary abstract; only scattered table entries survived extraction.) Advanced cooling configurations have greatly increased turbine temperature and efficiency. The surveyed ICME toolset spans phase diagrams and thermophysical-property databases (judged generally sound), material-processing and casting/solidification modeling with commercial codes (ProCAST, Magma), elevated-temperature flow stress models, and a university yield model for gamma-prime superalloys developed by Prof. Pollock at the University of Michigan.

  6. The coupling of fluids, dynamics, and controls on advanced architecture computers

    Science.gov (United States)

    Atwood, Christopher

    1995-01-01

    This grant provided for the demonstration of coupled controls, body dynamics, and fluids computations in a workstation cluster environment; and an investigation of the impact of peer-peer communication on flow solver performance and robustness. The findings of these investigations were documented in the conference articles. The attached publication, 'Towards Distributed Fluids/Controls Simulations', documents the solution and scaling of the coupled Navier-Stokes, Euler rigid-body dynamics, and state feedback control equations for a two-dimensional canard-wing. The poor scaling shown was due to serialized grid connectivity computation and Ethernet bandwidth limits. The scaling of a peer-to-peer communication flow code on an IBM SP-2 was also shown. The scaling of the code on the switched fabric-linked nodes was good, with a 2.4 percent loss due to communication of intergrid boundary point information. The code performance on 30 worker nodes was 1.7 μs/point/iteration, or a factor of three over a Cray C-90 head. The attached paper, 'Nonlinear Fluid Computations in a Distributed Environment', documents the effect of several computational rate enhancing methods on convergence. For the cases shown, the highest throughput was achieved using boundary updates at each step, with the manager process performing communication tasks only. Constrained domain decomposition of the implicit fluid equations did not degrade the convergence rate or final solution. The scaling of a coupled body/fluid dynamics problem on an Ethernet-linked cluster was also shown.

  7. Exploring Interactive and Dynamic Simulations Using a Computer Algebra System in an Advanced Placement Chemistry Course

    Science.gov (United States)

    Matsumoto, Paul S.

    2014-01-01

    The article describes the use of Mathematica, a computer algebra system (CAS), in a high school chemistry course. Mathematica was used to generate a graph, where a slider controls the value of parameter(s) in the equation; thus, students can visualize the effect of the parameter(s) on the behavior of the system. Also, Mathematica can show the…

  8. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    Science.gov (United States)

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  9. Some research advances in computer graphics that will enhance applications to engineering design

    Science.gov (United States)

    Allan, J. J., III

    1975-01-01

    Research in man/machine interactions and graphics hardware/software that will enhance applications to engineering design was described. Research aspects of executive systems, command languages, and networking used in the computer applications laboratory are mentioned. Finally, a few areas where little or no research is being done were identified.

  10. Collaborative Learning: Cognitive and Computational Approaches. Advances in Learning and Instruction Series.

    Science.gov (United States)

    Dillenbourg, Pierre, Ed.

    Intended to illustrate the benefits of collaboration between scientists from psychology and computer science, namely machine learning, this book contains the following chapters, most of which are co-authored by scholars from both sides: (1) "Introduction: What Do You Mean by 'Collaborative Learning'?" (Pierre Dillenbourg); (2)…

  11. Interactive computer program for optimal designs of longitudinal cohort studies.

    Science.gov (United States)

    Tekle, Fetene B; Tan, Frans E S; Berger, Martijn P F

    2009-05-01

    Many large-scale longitudinal cohort studies have been carried out or are ongoing in different fields of science. Such studies need careful planning to obtain the desired quality of results with the available resources. In the past, a number of studies have been performed on optimal designs for longitudinal studies. However, no computer program was yet available to help researchers plan their longitudinal cohort design in an optimal way. A new interactive computer program for the optimization of designs of longitudinal cohort studies is therefore presented. The computer program helps users to identify the optimal cohort design with an optimal number of repeated measurements per subject and an optimal allocation of time points within a given study period. Further, users can compute the loss in relative efficiency of any other alternative design compared to the optimal one. The computer program is described and illustrated using a practical example.
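    The notion of relative efficiency that such a program computes can be illustrated for the simplest case, a straight-line growth model, where the variance of the slope estimate is inversely proportional to the spread of the chosen time points. The two designs below are hypothetical, not the program's actual defaults:

```python
# For a linear trend y = a + b*t + noise, Var(b_hat) is proportional to
# 1 / sum((t - mean_t)^2), so designs with the same number of visits can be
# compared by that spread alone. Time points here are invented.

def slope_variance_factor(times):
    """1 / S_tt, proportional to the variance of the OLS slope estimate."""
    mean_t = sum(times) / len(times)
    s_tt = sum((t - mean_t) ** 2 for t in times)
    return 1.0 / s_tt

equally_spaced = [0, 1, 2, 3, 4]   # five visits spread over the study period
endpoints_only = [0, 0, 2, 4, 4]   # same visit count, mass at the extremes

# Efficiency of the equally spaced design relative to the endpoint design:
re = slope_variance_factor(endpoints_only) / slope_variance_factor(equally_spaced)
print(f"relative efficiency: {re:.3f}")  # < 1: equal spacing is less efficient here
```

    A value below 1 quantifies the "loss in relative efficiency" mentioned in the abstract: for estimating a straight-line slope, pushing measurements toward the endpoints shrinks the slope variance.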

  12. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
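    The core idea, that the submission order of jobs with predicted runtimes changes total cost when compute is billed in fixed blocks, can be sketched with a toy first-fit packing model. The job runtimes and the one-hour billing unit below are illustrative assumptions, not the paper's actual cost model:

```python
# Toy cost model: compute is billed in whole one-hour blocks, and each job's
# runtime is predicted in advance. Submitting long jobs first packs the paid
# hours more tightly than submitting them in arbitrary order.

def first_fit_hours(jobs):
    """Pack jobs (runtimes in hours, each <= 1.0) into 1-hour billing blocks."""
    blocks = []                        # remaining capacity of each open block
    for job in jobs:
        for i, free in enumerate(blocks):
            if job <= free + 1e-9:     # fits inside an already-billed hour
                blocks[i] = free - job
                break
        else:
            blocks.append(1.0 - job)   # open (and pay for) a new hour
    return len(blocks)

jobs = [0.4, 0.4, 0.4, 0.6, 0.6, 0.6]                  # predicted runtimes
naive = first_fit_hours(jobs)                          # submit in given order
planned = first_fit_hours(sorted(jobs, reverse=True))  # longest jobs first
print(naive, planned)
```

    In this toy instance the runtime-ordered schedule needs one fewer billed hour than the naive order, the same qualitative effect as the paper's 40% saving, though its real model also chooses cluster size.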

  13. 13th International Workshop on Advanced Computing and Analysis Techniques in Physics Research

    Science.gov (United States)

    Speer, T.; Boudjema, F.; Lauret, J.; Naumann, A.; Teodorescu, L.; Uwer, P.

    "Beyond the Cutting edge in Computing" Fundamental research is dealing, by definition, with the two extremes: the extremely small and the extremely large. The LHC and Astroparticle physics experiments will soon offer new glimpses beyond the current frontiers. And the computing infrastructure to support such physics research needs to look beyond the cutting edge. Once more it seems that we are on the edge of a computing revolution. But perhaps what we are seeing now is an even more epochal change where not only the pace of the revolution is changing, but also its very nature. Change is no longer an "event" meant to open new possibilities that have to be understood first and exploited then to prepare the ground for a new leap. Change is becoming the very essence of the computing reality, sustained by a continuous flow of technical and paradigmatic innovation. The hardware is definitely moving toward more massive parallelism, in a breathtaking synthesis of all the past techniques of concurrent computation. New many-core machines offer opportunities for all sorts of Single/Multiple Instructions, Single/Multiple Data and Vector computations that in the past required specialised hardware. At the same time, all levels of virtualisation imagined till now seem to be possible via Clouds, and possibly many more. Information Technology has been the working backbone of the Global Village, and now, in more than one sense, it is becoming itself the Global Village. Between these two, the gap between the need for adapting applications to exploit the new hardware possibilities and the push toward virtualisation of resources is widening, creating more challenges as technical and intellectual progress continues. ACAT 2010 proposes to explore and confront the different boundaries of the evolution of computing, and its possible consequences on our scientific activity. What do these new technologies entail for physics research? How will physics research benefit from this revolution in

  14. NATO Advanced Study Institute on Relativistic and Electron Correlation Effects in Molecules and Solids

    CERN Document Server

    1994-01-01

    The NATO Advanced Study Institute (ASI) on "Relativistic and Electron Correlation Effects in Molecules and Solids", co-sponsored by Simon Fraser University (SFU) and the Natural Sciences and Engineering Research Council of Canada (NSERC), was held Aug 10-21, 1992 at the University of British Columbia (UBC), Vancouver, Canada. A total of 90 lecturers and students with backgrounds in Chemistry, Physics, Mathematics and various interdisciplinary subjects attended the ASI. In my proposal submitted to NATO for financial support for this ASI, I pointed out that a NATO ASI on the effects of relativity in many-electron systems was held ten years ago [see G.L. Malli (ed), Relativistic Effects in Atoms, Molecules and Solids, Plenum Press, Vol B87, New York, 1983]. Moreover, at a NATO Advanced Research Workshop (ARW) on advanced methods for molecular electronic structure "an assessment of state-of-the-art of Electron Correlation ..." was carried out [see C.E. Dykstra (ed), Advanced Theories and Computational Approa...

  15. Advances in Grid Computing for the FabrIc for Frontier Experiments Project at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Herner, K. [Fermilab; Alba Hernandez, A. F. [Fermilab; Bhat, S. [Fermilab; Box, D. [Fermilab; Boyd, J. [Fermilab; Di Benedetto, V. [Fermilab; Ding, P. [Fermilab; Dykstra, D. [Fermilab; Fattoruso, M. [Fermilab; Garzoglio, G. [Fermilab; Kirby, M. [Fermilab; Kreymer, A. [Fermilab; Levshina, T. [Fermilab; Mazzacane, A. [Fermilab; Mengel, M. [Fermilab; Mhashilkar, P. [Fermilab; Podstavkov, V. [Fermilab; Retzke, K. [Fermilab; Sharma, N. [Fermilab; Teheran, J. [Fermilab

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, in helping experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed

  16. Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab

    Science.gov (United States)

    Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.

    2017-10-01

    The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, in helping experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called

  17. Studying an Eulerian Computer Model on Different High-performance Computer Platforms and Some Applications

    Science.gov (United States)

    Georgiev, K.; Zlatev, Z.

    2010-11-01

    The Danish Eulerian Model (DEM) is an Eulerian model for studying the transport of air pollutants on a large scale. Originally, the model was developed at the National Environmental Research Institute of Denmark. The model's computational domain covers Europe and some neighbouring parts of the Atlantic Ocean, Asia and Africa. If the DEM is applied on fine grids, its discretization leads to a huge computational problem, which implies that a model such as DEM must be run on high-performance computer architectures. The implementation and tuning of such a complex large-scale model on each different computer is a non-trivial task. Here, comparison results are presented from running this model on different kinds of vector computers (CRAY C92A, Fujitsu, etc.), parallel computers with distributed memory (IBM SP, CRAY T3E, Beowulf clusters, Macintosh G4 clusters, etc.), parallel computers with shared memory (SGI Origin, SUN, etc.) and parallel computers with two levels of parallelism (IBM SMP, IBM BlueGene/P, clusters of multiprocessor nodes, etc.). The main idea in the parallel version of DEM is a domain partitioning approach. The effective use of the cache and hierarchical memories of modern computers is discussed, along with the performance, speed-ups and efficiency achieved. The parallel code of DEM, created using the MPI standard library, appears to be highly portable and shows good efficiency and scalability on different kinds of vector and parallel computers. Some important applications of the computer model output are presented briefly.
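The domain partitioning idea behind the parallel DEM can be illustrated with a minimal, self-contained sketch. This is not the actual DEM code: the one-dimensional diffusion stencil, the one-cell halos, and all function names are illustrative assumptions. Each subdomain carries halo copies of its neighbours' edge cells, is updated independently, and the reassembled result matches the single-domain computation:

```python
import numpy as np

def split_with_halo(field, nparts):
    """Split a 1D field into subdomains, each padded with one halo cell."""
    chunks = np.array_split(field, nparts)
    padded = []
    for i, c in enumerate(chunks):
        left = chunks[i - 1][-1] if i > 0 else c[0]            # neighbour or boundary copy
        right = chunks[i + 1][0] if i < nparts - 1 else c[-1]
        padded.append(np.concatenate(([left], c, [right])))
    return padded

def diffuse_step(sub, alpha=0.1):
    """One explicit diffusion step on the interior cells of a padded subdomain."""
    return sub[1:-1] + alpha * (sub[2:] - 2 * sub[1:-1] + sub[:-2])

def parallel_step(field, nparts, alpha=0.1):
    """Partition, update each subdomain independently, and reassemble."""
    subs = split_with_halo(field, nparts)
    return np.concatenate([diffuse_step(s, alpha) for s in subs])

field = np.linspace(0.0, 1.0, 16)
serial = parallel_step(field, 1)       # single-domain reference
partitioned = parallel_step(field, 4)  # four subdomains with halos
print(np.allclose(serial, partitioned))
```

In a real MPI implementation the halo values would be exchanged between processes with point-to-point messages rather than copied locally, but the update structure is the same.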

  18. Preliminary design studies of an advanced general aviation aircraft

    Science.gov (United States)

    Barrett, Ron; Demoss, Shane; Dirkzwager, AB; Evans, Darryl; Gomer, Charles; Keiter, Jerry; Knipp, Darren; Seier, Glen; Smith, Steve; Wenninger, ED

    1991-01-01

    The preliminary design results of the advanced aircraft design project are presented. The goal was to take a revolutionary look at the design of a general aviation aircraft. Phase 1 of the project included the preliminary design of two configurations: a pusher and a tractor. Phase 2 included the selection of one configuration for further study. The pusher configuration was selected on the basis of performance characteristics, cabin noise, natural laminar flow, and system layouts. The design was then iterated to achieve higher levels of performance.

  19. Recent progress in orbital-free density functional theory (recent advances in computational chemistry)

    CERN Document Server

    Wesolowski, Tomasz A

    2013-01-01

    This is a comprehensive overview of state-of-the-art computational methods based on the orbital-free formulation of density functional theory, complemented by the most recent developments concerning the exact properties, approximations, and interpretations of the relevant quantities in density functional theory. The book is a compilation of contributions stemming from a series of workshops held since 2002. It not only chronicles many of the latest developments but also summarises some of the more significant ones. The chapters are mainly reviews of sub-domains but also include original research. Readership: Graduate students, academics and researchers in computational chemistry. Atomic & molecular physicists, theoretical physicists, theoretical chemists, physical chemists and chemical physicists.

  20. Research in advanced formal theorem-proving techniques. [design and implementation of computer languages

    Science.gov (United States)

    Raphael, B.; Fikes, R.; Waldinger, R.

    1973-01-01

    The results of a project aimed at the design and implementation of computer languages to aid in expressing problem-solving procedures in several areas of artificial intelligence, including automatic programming, theorem proving, and robot planning, are summarised. The principal results of the project were the design and implementation of two complete systems, QA4 and QLISP, and their preliminary experimental use. The various applications of both QA4 and QLISP are given.

  1. Development of advanced computational fluid dynamics tools and their application to simulation of internal turbulent flows

    Science.gov (United States)

    Emelyanov, V. N.; Karpenko, A. G.; Volkov, K. N.

    2015-06-01

    Modern graphics processing units (GPU) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities of the use of GPUs for the simulation of internal fluid flows are discussed. The finite volume method is applied to solve three-dimensional (3D) unsteady compressible Euler and Navier-Stokes equations on unstructured meshes. Compute Unified Device Architecture (CUDA) technology is used for the programming implementation of parallel computational algorithms. Solutions of some fluid dynamics problems on GPUs are presented, and approaches to optimization of the CFD code related to the use of different types of memory are discussed. Speedup of the solution on GPUs with respect to the solution on a central processing unit (CPU) is compared for different meshes and different methods of distributing input data into blocks. Performance measurements show that the numerical schemes developed achieve a 20- to 50-fold speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
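The data-parallel structure that makes finite-volume CFD map so well onto GPUs can be sketched with a toy example. This is not the authors' 3D unstructured solver: it is an illustrative 1D linear-advection scheme with an upwind flux on a periodic domain, where each cell update depends only on the fluxes at its own interfaces, so every cell could be assigned to an independent GPU thread:

```python
import numpy as np

def upwind_flux(u, a=1.0):
    """Upwind numerical flux at cell interfaces for linear advection (a > 0)."""
    return a * u  # flux leaving each cell to the right

def fv_step(u, dt, dx, a=1.0):
    """One explicit finite-volume update; each cell depends only on its
    neighbours, so the loop over cells maps directly onto GPU threads."""
    f = upwind_flux(u, a)
    # periodic domain: the flux entering cell i is the flux leaving cell i-1
    return u - (dt / dx) * (f - np.roll(f, 1))

n = 64
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.exp(-100 * (x - 0.5) ** 2)   # initial Gaussian pulse
dx, dt = 1.0 / n, 0.5 / n           # CFL number a*dt/dx = 0.5
mass0 = u.sum() * dx
for _ in range(n):
    u = fv_step(u, dt, dx)
print(abs(u.sum() * dx - mass0) < 1e-12)  # finite-volume schemes conserve mass
```

In a CUDA version the vectorized `fv_step` becomes a kernel with one thread per cell; the memory-layout questions the abstract mentions correspond to choosing how `u` and `f` are placed in global and shared memory.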

  2. Computer simulation of a 20-kHz power system for advanced launch systems

    Science.gov (United States)

    Sudhoff, S. D.; Wasynczuk, O.; Krause, P. C.; Kenny, B. H.

    1993-01-01

    The performance of two 20-kHz actuator power systems being built for an advanced launch system is evaluated for a typical launch scenario using an end-to-end system simulation. Aspects of system performance ranging from the switching of the power electronic devices to the vehicle aerodynamics are represented in the simulation. It is shown that both systems adequately stabilize the vehicle against a wind gust during launch. However, it is also shown that in both cases there are bus voltage and current fluctuations which make system power quality a concern.

  3. Advanced Fusion Power Plant Studies. Annual Report for 1999

    International Nuclear Information System (INIS)

    Chan, V.S.; Chu, M.S.; Greenfield, C.M.; Kinsey, J.E.

    2000-01-01

    Significant progress in physics understanding of the reversed shear advanced tokamak regime has been made since the last ARIES-RS study was completed in 1996. The 1999 study aimed at updating the physics design of ARIES-RS, which has been renamed ARIES-AT, using the improved understanding achieved in the last few years. The new study focused on: improvement of beta-limit stability calculations to include important non-ideal effects such as resistive wall modes and neo-classical tearing modes; use of a physics-based transport model for internal transport barrier (ITB) formation and sustainment; comparison of current drive and rotational flow drive using fast waves, electron cyclotron waves and neutral particle beams; improvement in heat and particle control; and integrated modeling of the optimized scenario with self-consistent current and transport profiles to study the robustness of the bootstrap alignment, ITB sustainment, and stable path to high beta and high bootstrap fraction operation

  4. NASA Computational Case Study: The Flight of Friendship 7

    Science.gov (United States)

    Simpson, David G.

    2012-01-01

    In this case study, we learn how to compute the position of an Earth-orbiting spacecraft as a function of time. As an exercise, we compute the position of John Glenn's Mercury spacecraft Friendship 7 as it orbited the Earth during the third flight of NASA's Mercury program.
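The kind of position-versus-time computation the case study describes can be sketched by solving Kepler's equation for the eccentric anomaly and converting to radius and true anomaly. The orbital elements below are illustrative values for a near-circular low Earth orbit, not Friendship 7's actual elements:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6     # mean Earth radius, m

def kepler_E(M, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly
    by Newton's method."""
    E = M
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def position(t, a, e):
    """Radius and true anomaly of an orbiting body t seconds after perigee."""
    n = math.sqrt(MU / a**3)             # mean motion, rad/s
    M = n * t                            # mean anomaly
    E = kepler_E(M % (2 * math.pi), e)
    nu = 2 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                        math.sqrt(1 - e) * math.cos(E / 2))
    r = a * (1 - e * math.cos(E))
    return r, nu

# illustrative low Earth orbit, roughly Mercury-era altitude
a = R_EARTH + 2.1e5                      # semi-major axis, m
e = 0.008
period = 2 * math.pi * math.sqrt(a**3 / MU)
r0, _ = position(0.0, a, e)              # perigee radius, a*(1 - e)
r1, _ = position(period / 2, a, e)       # apogee radius, a*(1 + e)
print(period / 60)                       # orbital period in minutes (~88-89)
```

Propagating the true anomaly around the orbit and rotating by the orbital elements would give the full three-dimensional position at any time.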

  5. Factors Affecting Softlifting Intention of Computing Students: An Empirical Study.

    Science.gov (United States)

    Rahim, Md. Mahbubur; Seyal, Afzaal H.; Rahman, Mohd. Noah Abd.

    2001-01-01

    Discusses softlifting as a form of software piracy and describes a study that analyzed the softlifting intentions of computing students in Brunei Darussalam. Considers student attitudes; gender; family income; personal computer ownership; experience; faculty remarks; institutional monitoring; and implications for attempts to curb software piracy.…

  6. Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment.

    Science.gov (United States)

    Dilsizian, Steven E; Siegel, Eliot L

    2014-01-01

    Although advances in information technology in the past decade have come in quantum leaps in nearly every aspect of our lives, they seem to be coming at a slower pace in the field of medicine. However, the implementation of electronic health records (EHR) in hospitals is increasing rapidly, accelerated by the meaningful use initiatives associated with the Centers for Medicare & Medicaid Services EHR Incentive Programs. The transition to electronic medical records and availability of patient data has been associated with increases in the volume and complexity of patient information, as well as an increase in medical alerts, with resulting "alert fatigue" and increased expectations for rapid and accurate diagnosis and treatment. Unfortunately, these increased demands on health care providers create greater risk for diagnostic and therapeutic errors. In the near future, artificial intelligence (AI)/machine learning will likely assist physicians with differential diagnosis of disease, suggestions of treatment options and recommendations, and, in the case of medical imaging, with cues in image interpretation. Mining and advanced analysis of "big data" in health care provide the potential not only to perform "in silico" research but also to provide "real time" diagnostic and (potentially) therapeutic recommendations based on empirical data. "On demand" access to high-performance computing and large health care databases will support and sustain our ability to achieve personalized medicine. The IBM Jeopardy! Challenge, which pitted the best all-time human players against the Watson computer, captured the imagination of millions of people across the world and demonstrated the potential to apply AI approaches to a wide variety of subject matter, including medicine. The combination of AI, big data, and massively parallel computing offers the potential to create a revolutionary way of practicing evidence-based, personalized medicine.

  7. Study of check image using computed radiography

    International Nuclear Information System (INIS)

    Sato, Hiroshi

    2002-01-01

    There are two image forming methods in the linacogram: the check image and the portal image. An image forming method for the check image using computed radiography (CR) has been established, whereas a corresponding method for the portal image using CR has not yet been established. Usually, an electronic portal imaging device (EPID) is used just before the start of radiotherapy. A portal image forming method by CR, used in place of the EPID, makes it possible to confirm the precision of positioning of the irradiated part and of the irradiation method for the human organs. A long-standing technical problem is that linac graphy (LG) images have low resolution. To improve the resolution of LG images, CR imaging technology was introduced into the check image forming method. A heavy metallic sheet (HMS) is used on the front side of the CR imaging plate (IP) cassette, and a high-contactness sponge on the back side. The improved contact between the HMS and the IP provided by the sponge contributes to improved resolution in the check images; many papers connected with these findings have been reported. Imaging plate ST-III should be used to maintain high sensitivity in the check film image forming method. The same image forming method established for check images by CR was introduced into the portal image forming method in order to improve resolution. However, high-resolution portal images could not be acquired with the combination of ST-III and radiotherapy doses. After several trials, it was recognized that the HR-V imaging plate for mammography is the most useful for maintaining high resolution in the portal images. It is also possible to modify the image quality by changing the GS parameter, one of the image processing parameters in CR. 
Furthermore, in case

  8. Comparative study of mesothelioma and asbestosis using computed tomography and conventional chest radiography

    International Nuclear Information System (INIS)

    Rabinowitz, T.G.; Efremidis, S.C.; Cohen, B.; Dan, S.; Efremidis, A.; Chahinian, A.P.; Teirstein, A.S.

    1982-01-01

    A comparative study using computed tomography and conventional posteroanterior radiography was performed on 27 patients with mesothelioma and 13 patients with advanced asbestosis. The major pathologic features of both asbestosis and mesothelioma were well demonstrated by both modalities; computed tomography demonstrated the findings more frequently and in greater detail. No distinguishing features could be established based on configuration and size of the lesion. Many pleural plaques associated with advanced asbestosis were large and irregular and resembled those associated with mesothelioma. However, nodular involvement of the pleural fissures, pleural effusion, and ipsilateral volume loss with a fixed mediastinum were features predominating in mesothelioma. Growth determination of the plaques associated with asbestosis may be of minimal value since such plaques also undergo growth due to active inflammatory changes

  9. Methods and advances in the study of aeroelasticity with uncertainties

    Directory of Open Access Journals (Sweden)

    Dai Yuting

    2014-06-01

    Full Text Available Uncertainties denote the operators which describe data error, numerical error and model error in the mathematical methods. The study of aeroelasticity with uncertainty embedded in the subsystems, such as the uncertainty in the modeling of structures and aerodynamics, has been a hot topic in the last decades. In this paper, advances of the analysis and design in aeroelasticity with uncertainty are summarized in detail. According to the non-probabilistic or probabilistic uncertainty, the developments of theories, methods and experiments with application to both robust and probabilistic aeroelasticity analysis are presented, respectively. In addition, the advances in aeroelastic design considering either probabilistic or non-probabilistic uncertainties are introduced along with aeroelastic analysis. This review focuses on the robust aeroelasticity study based on the structured singular value method, namely the μ method. It covers the numerical calculation algorithm of the structured singular value, uncertainty model construction, robust aeroelastic stability analysis algorithms, uncertainty level verification, and robust flutter boundary prediction in the flight test, etc. The key results and conclusions are explored. Finally, several promising problems on aeroelasticity with uncertainty are proposed for future investigation.
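The μ method highlighted in the review rests on the standard structured-singular-value bounds ρ(M) ≤ μ(M) ≤ min_D σ̄(D M D⁻¹), where D ranges over scalings that commute with the uncertainty structure. A minimal sketch for a 2×2 matrix with two scalar uncertainty blocks, using a grid search over the single free diagonal scaling (the matrix and grid are illustrative, and real μ software uses convex optimization instead):

```python
import numpy as np

def sigma_max(M):
    """Largest singular value."""
    return np.linalg.svd(M, compute_uv=False)[0]

def mu_upper_bound(M, grid=np.logspace(-2, 2, 201)):
    """Upper bound on the structured singular value of a 2x2 matrix with
    two scalar uncertainty blocks: minimise the largest singular value of
    D M D^{-1} over diagonal scalings D = diag(d, 1)."""
    best = sigma_max(M)
    for d in grid:
        D = np.diag([d, 1.0])
        Dinv = np.diag([1.0 / d, 1.0])
        best = min(best, sigma_max(D @ M @ Dinv))
    return best

M = np.array([[1.0, 10.0],
              [0.1, 2.0]])
rho = max(abs(np.linalg.eigvals(M)))  # spectral radius: lower bound on mu
ub = mu_upper_bound(M)
print(rho, ub, sigma_max(M))          # rho <= ub <= largest singular value
```

For this illustrative matrix the optimal scaling symmetrizes the off-diagonal entries and the upper bound collapses onto the spectral-radius lower bound, so μ is computed exactly; in general the two bounds bracket μ.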

  10. NATO Advanced Study Institute on Magnetic Resonance : Introduction, Advanced Topics and Applications to Fossil Energy

    CERN Document Server

    Fraissard, Jacques

    1984-01-01

    This volume contains the lectures presented at an Advanced Study Institute on "Magnetic Resonance Techniques in Fossil Energy Problems," which was held at the village of Maleme, Crete, in July of 1983. As of this writing, a different popular attitude prevails from that when the ASI was proposed as far as how critical the world energy picture is. In the popular press, a panglossian attitude (the "petroleum glut" of the 80's) has replaced the jeremiads of the 70's (a catastrophic "energy crisis"). Yet, there are certain important constants: (a) for the foreseeable future, fossil energy sources (petroleum, coal, oil shale, etc.) will continue to be of paramount importance; and (b) science and technology of the highest order are needed to extend the fossil energy resource base and to utilize it in a cost-effective manner that is also environmentally acceptable. It is precisely this second item that this volume addresses. The volume introduces the phenomenology of magnetic resonance in a unified and detailed man...

  11. Computer Aided Design of Advanced Turbine Airfoil Alloys for Industrial Gas Turbines in Coal Fired Environments

    Energy Technology Data Exchange (ETDEWEB)

    G.E. Fuchs

    2007-12-31

    Recent initiatives for fuel flexibility, increased efficiency and decreased emissions in power generating industrial gas turbines (IGT's) have highlighted the need for the development of techniques to produce large single crystal or columnar grained, directionally solidified Ni-base superalloy turbine blades and vanes. In order to address the technical difficulties of producing large single crystal components, a program has been initiated to better understand, using computational materials science, how alloy composition in potential IGT alloys and solidification conditions during processing affect castability, defect formation and environmental resistance. This program will help to identify potential routes for the development of high strength, corrosion resistant airfoil/vane alloys, which would be a benefit to all IGT's, including small IGT's and even aerospace gas turbines. During the first year, collaboration with Siemens Power Corporation (SPC), Rolls-Royce, Howmet and Solar Turbines identified and evaluated about 50 alloy compositions that are of interest for this potential application. In addition, alloy modifications to an existing alloy (CMSX-4) were also evaluated. Collaboration with SPC, using computational software at SPC to evaluate about 50 alloy compositions, identified 5 candidate alloys for experimental evaluation. The results obtained from the experimentally determined phase transformation temperatures did not compare well to the calculated values in many cases. The effects of small additions of boundary strengtheners (i.e., C, B and N) to CMSX-4 were also examined. The calculated phase transformation temperatures were somewhat closer to the experimentally determined values than for the 5 candidate alloys, discussed above. The calculated partitioning coefficients were similar for all of the CMSX-4 alloys, similar to the experimentally determined segregation behavior. In general, it appears that computational materials

  12. Computer Aided Diagnosis for Confocal Laser Endomicroscopy in Advanced Colorectal Adenocarcinoma

    DEFF Research Database (Denmark)

    Ştefănescu, Daniela; Streba, Costin; Cârţână, Elena Tatiana

    2016-01-01

    -layer feed forward neural network was used to train and automatically diagnose the malignant samples, based on the seven parameters tested. The neural network was evaluated by cross-entropy, with the results: training: 0.53, validation: 1.17, testing: 1.17; and by percent error: training: 16.14, validation: 17.42, testing: 15.48. The diagnosis accuracy error was 15.5%. CONCLUSIONS: Computer aided diagnosis via fractal analysis of glandular structures can complement the traditional histological and minimally invasive imaging methods. A larger dataset from colorectal and other pathologies should...
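A feed-forward network trained against a cross-entropy loss, as in the study, can be sketched on synthetic data. Everything below is illustrative rather than the paper's setup: the data, the single hidden layer of 8 tanh units, and the hyperparameters are assumptions, with seven input features standing in for the seven measured parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy two-class dataset: seven features, linearly separable labels
X = rng.normal(size=(200, 7))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

# one hidden tanh layer, sigmoid output, binary cross-entropy loss
W1 = rng.normal(scale=0.5, size=(7, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return h, p

lr = 0.5
for _ in range(500):
    h, p = forward(X)
    dz2 = (p - y) / len(y)                 # gradient of mean cross-entropy wrt logits
    gW2 = h.T @ dz2; gb2 = dz2.sum()
    dh = np.outer(dz2, W2) * (1 - h ** 2)  # backprop through tanh
    gW1 = X.T @ dh;  gb1 = dh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, p = forward(X)
pc = np.clip(p, 1e-9, 1 - 1e-9)            # guard the logarithms
ce = -np.mean(y * np.log(pc) + (1 - y) * np.log(1 - pc))
err = 100.0 * np.mean((p > 0.5) != y)
print(f"training cross-entropy {ce:.2f}, percent error {err:.1f}")
```

The cross-entropy and percent-error figures printed here play the same role as the training/validation/testing metrics quoted in the abstract, though on held-out data both would be higher than on the training set.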

  13. Advanced Numerical Methods for Computing Statistical Quantities of Interest from Solutions of SPDES

    Science.gov (United States)

    2012-01-19

    ...u_DNS(T)]‖_{L²(Ω)} and error(var) = ‖var[(u_ADM − u_DNS), x, T]‖_{L²(Ω)}. A mesh-refinement table lists error(E), rate(E), error(var) and rate(var): h = 1/4: error(E) = 0.0724682, error(var) = 0.00319198; h = 1/8: error(E) = 0.0297043, rate(E) = 1.287... The approach builds on the spectral viscosity method developed by Tadmor and co-workers in [25] and several related papers, later extended to the finite element and wavelet cases... finite element techniques; Comput. Methods Appl. Mech. Engrg. 190 (2001), 6359-6372. [18] D. Diez, M. Gunzburger and A. Kunoth, An adaptive wavelet viscosity

  14. Quantitative Study on Computer Self-Efficacy and Computer Anxiety Differences in Academic Major and Residential Status

    Science.gov (United States)

    Binkley, Zachary Wayne McClellan

    2017-01-01

    This study investigates computer self-efficacy and computer anxiety in 61 students across two academic majors, Aviation and Sports and Exercise Science, while examining the impact residential status, age, and gender have on those two psychological constructs. The purpose of the study is to find whether computer self-efficacy and computer anxiety…

  15. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    Full Text Available A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  16. Case Study on Algebraic Software Methodologies for Scientific Computing

    Directory of Open Access Journals (Sweden)

    Magne Haveraaen

    2000-01-01

    Full Text Available The use of domain specific languages and appropriate software architectures are currently seen as the way to enhance reusability and improve software productivity. Here we outline a use of algebraic software methodologies and advanced program constructors to improve the abstraction level of software for scientific computing. This leads us to the language of coordinate free numerics as an alternative to the traditional coordinate dependent array notation. This provides the backdrop for the three accompanying papers: Coordinate Free Programming of Computational Fluid Dynamics Problems, centered around an example of using coordinate free numerics, Machine and Collection Abstractions for User-Implemented Data-Parallel Programming, exploiting the higher abstraction level when parallelising code, and An Algebraic Programming Style for Numerical Software and its Optimization, looking at high-level transformations enabled by the domain specific programming style.
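The coordinate-free style advocated above can be sketched with a generic conjugate-gradient solver written against an abstract vector interface: the algorithm uses only addition, scalar multiplication and an inner product, so the same code runs unchanged on a flat coordinate vector and on a grid-shaped field. The solver, operators and test problems are illustrative, not taken from the accompanying papers:

```python
import numpy as np

def cg(apply_A, b, x0, dot, iters=50, tol=1e-10):
    """Conjugate gradients against an abstract vector interface: only +, -,
    scalar multiplication and the supplied inner product are used, so any
    representation supporting them works."""
    x = x0
    r = b - apply_A(x)
    p = r
    rr = dot(r, r)
    for _ in range(iters):
        Ap = apply_A(p)
        alpha = rr / dot(p, Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rr_new = dot(r, r)
        if rr_new < tol:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x

# "coordinate" view: a symmetric positive definite 2x2 system on flat vectors
A = np.array([[4.0, 1.0], [1.0, 3.0]])
x_flat = cg(lambda v: A @ v, np.array([1.0, 2.0]), np.zeros(2),
            dot=lambda u, v: float(u @ v))

# "coordinate-free" view: a Laplacian-like operator acting on a 2D field
def op(U):
    return 5 * U - np.roll(U, 1, 0) - np.roll(U, -1, 0) \
                 - np.roll(U, 1, 1) - np.roll(U, -1, 1)

B = np.ones((4, 4))
x_grid = cg(op, B, np.zeros((4, 4)), dot=lambda u, v: float((u * v).sum()))
print(np.allclose(A @ x_flat, [1.0, 2.0]), np.allclose(op(x_grid), B))
```

The solver never inspects components or shapes, which is the essence of the coordinate-free numerics the paper describes: the representation (flat array, grid, distributed field) is a detail of the data type, not of the algorithm.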

  17. TRAC-PF1: an advanced best-estimate computer program for pressurized water reactor analysis

    International Nuclear Information System (INIS)

    Liles, D.R.; Mahaffy, J.H.

    1984-02-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos National Laboratory to provide advanced best-estimate predictions of postulated accidents in light water reactors. The TRAC-PF1 program provides this capability for pressurized water reactors and for many thermal-hydraulic experimental facilities. The code features either a one-dimensional or a three-dimensional treatment of the pressure vessel and its associated internals; a two-phase, two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field; flow-regime-dependent constitutive equation treatment; optional reflood tracking capability for both bottom flood and falling-film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. This report describes the thermal-hydraulic models and the numerical solution methods used in the code. Detailed programming and user information also are provided

  18. TRAC-PF1: an advanced best-estimate computer program for pressurized water reactor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Liles, D.R.; Mahaffy, J.H.

    1984-02-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos National Laboratory to provide advanced best-estimate predictions of postulated accidents in light water reactors. The TRAC-PF1 program provides this capability for pressurized water reactors and for many thermal-hydraulic experimental facilities. The code features either a one-dimensional or a three-dimensional treatment of the pressure vessel and its associated internals; a two-phase, two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field; flow-regime-dependent constitutive equation treatment; optional reflood tracking capability for both bottom flood and falling-film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. This report describes the thermal-hydraulic models and the numerical solution methods used in the code. Detailed programming and user information also are provided.

  19. Advanced computed tomography system for the inspection of large aluminium car bodies

    Energy Technology Data Exchange (ETDEWEB)

    Simon, M.; Tiseanu, I.; Sauerwein, C. [Hans Waelischmiller, Meersburg (Germany); Sindel, M.; Brodmann, M.; Schmuecker, M. [AUDI AG, Neckarsulm (Germany)

    2006-07-01

    An advanced 3D CT system with the capability to scan parts ranging in size from 3 mm up to 5000 mm was developed. The newly designed non-destructive inspection system overcomes existing limitations of conventional CT systems in terms of part size and resolution. Reconstruction and scan algorithms were developed that allow achieving three-dimensional information on material and geometry in large automotive bodies with a resolution of up to 30 μm. In micro 3D CT mode a resolution of up to 3 μm can be achieved. The development of the mechatronic inspection system includes aspects of mechanics, electronics, software, and algorithms. For the manipulation of the full range of parts, a high precision manipulation system and an industrial robot are used. The system allows the car manufacturer to inspect non-destructively a variety of join connections in car body parts. The capability of the system is demonstrated by different applications. (orig.)

  20. Computer-based regulating control system for the Advanced Test Reactor

    International Nuclear Information System (INIS)

    Johnson, M.R.

    1983-01-01

    This paper describes a new control system which has recently been designed and installed at the Advanced Test Reactor at INEL, replacing an older system that had been in service for some 17 years. Based on modern digital technology, the new system provides improved capability, reliability, and an enhanced man/machine interface that includes comprehensive failure and error messages and voice synthesis. In addition to control functions, and transparent to the operator, the system performs continual on-line checks to sense subsystem failures and takes appropriate automatic action. In the maintenance mode, service technicians can carry on a dialog with the controller to quickly identify faulty components. The operational capabilities of the new system are summarized, and reactor operator training, experience, and acceptance of the system are discussed