WorldWideScience

Sample records for computer evolution project

  1. Effective Strategies for Teaching Evolution: The Primary Evolution Project

    Science.gov (United States)

    Hatcher, Chris

    2015-01-01

    When Chris Hatcher joined the Primary Evolution Project team at the University of Reading, his goal was to find effective strategies to teach evolution in a way that keeps children engaged and enthused. Hatcher has collaborated with colleagues at the University's Institute of Education to break the evolution unit down into distinct topics and…

  2. '95 computer system operation project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1995-12-01

    This report describes the overall project work related to the operation of mainframe computers, the management of nuclear computer codes and the nuclear computer code conversion project. The results of the project are as follows: 1. the operation and maintenance of the three mainframe computers and other utilities; 2. the management of the nuclear computer codes; 3. the completion of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  3. '95 computer system operation project

    International Nuclear Information System (INIS)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung

    1995-12-01

    This report describes the overall project work related to the operation of mainframe computers, the management of nuclear computer codes and the nuclear computer code conversion project. The results of the project are as follows: 1. the operation and maintenance of the three mainframe computers and other utilities; 2. the management of the nuclear computer codes; 3. the completion of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  4. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  5. Computer Interactives for the Mars Atmospheric and Volatile Evolution (MAVEN) Mission through NASA's "Project Spectra!"

    Science.gov (United States)

    Wood, E. L.

    2014-12-01

    "Project Spectra!" is a standards-based E-M spectrum and engineering program that includes paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games, students experience and manipulate information making abstract concepts accessible, solidifying understanding and enhancing retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new interactives. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature. Students design a planet that is able to maintain liquid water on the surface. In the second interactive, students are asked to consider conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives were pilot tested at Arvada High School in Colorado.

  6. Computer simulations for the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission through NASA's "Project Spectra!"

    Science.gov (United States)

    Christofferson, R.; Wood, E. L.; Euler, G.

    2012-12-01

    "Project Spectra!" is a standards-based light science and engineering program on solar system exploration that includes both hands-on paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games where students experience and manipulate the information makes abstract concepts accessible. Visualizing lessons with multi-media tools solidifies understanding and retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As a part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new "Project Spectra!" interactives that go hand-in-hand with a paper and pencil activity. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature and how they contribute. Students are asked to create a scenario in which a planet they build and design is able to maintain liquid water on the surface. In the second interactive, students are asked to consider Mars and the conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives are currently being pilot tested at Arvada High School in Colorado.

  7. Computer simulations for the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission through NASA's 'Project Spectra!'

    Science.gov (United States)

    Wood, E. L.

    2013-12-01

    'Project Spectra!' is a standards-based light science and engineering program on solar system exploration that includes both hands-on paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games where students experience and manipulate the information makes abstract concepts accessible. Visualizing lessons with multi-media tools solidifies understanding and retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As a part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new 'Project Spectra!' interactives that go hand-in-hand with a paper and pencil activity. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature and how they contribute. Students are asked to create a scenario in which a planet they build and design is able to maintain liquid water on the surface. In the second interactive, students are asked to consider Mars and the conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives were pilot tested at Arvada High School in Colorado.

  8. Stable numerical method in computation of stellar evolution

    International Nuclear Information System (INIS)

    Sugimoto, Daiichiro; Eriguchi, Yoshiharu; Nomoto, Ken-ichi.

    1982-01-01

    To compute stellar structure and evolution in different stages, such as (1) red-giant stars in which the density and density gradient change over quite wide ranges, (2) rapid evolution with neutrino loss or unstable nuclear flashes, (3) hydrodynamical stages of star formation or supernova explosion, (4) transition phases from quasi-static to dynamical evolution, (5) mass-accreting or mass-losing stars in binary-star systems, and (6) evolution of a stellar core whose mass is increasing by shell burning or decreasing by penetration of the convective envelope into the core, we face 'multi-timescale problems' which can be treated by neither a simple-minded explicit scheme nor an implicit one. This problem has been resolved by three prescriptions: one by introducing a hybrid scheme suited to the multi-timescale problems of quasi-static evolution with heat transport, another by introducing a similar hybrid scheme suited to the multi-timescale problems of hydrodynamic evolution, and the third by introducing the Eulerian or, in other words, mass-fraction coordinate for evolution with changing mass. When all of them are combined in a single computer code, we can compute numerically stably any phase of stellar evolution, including transition phases, as long as the star is spherically symmetric. (author)

  9. Curve Evolution in Subspaces and Exploring the Metameric Class of Histogram of Gradient Orientation based Features using Nonlinear Projection Methods

    DEFF Research Database (Denmark)

    Tatu, Aditya Jayant

    This thesis deals with two unrelated issues, restricting curve evolution to subspaces and computing image patches in the equivalence class of Histogram of Gradient orientation based features using nonlinear projection methods. Curve evolution is a well known method used in various applications like tracking interfaces, active contour based segmentation methods and others. It can also be used to study shape spaces, as deforming a shape can be thought of as evolving its boundary curve. During curve evolution a curve traces out a path in the infinite dimensional space of curves. Due to application-specific requirements like shape priors or a given data model, and due to limitations of the computer, the computed curve evolution forms a path in some finite dimensional subspace of the space of curves. We give methods to restrict the curve evolution to a finite dimensional linear or implicitly defined…

  10. The evolution of computer technology

    CERN Document Server

    Kamar, Haq

    2018-01-01

    Today it seems that computers occupy every single space in life. This book traces the evolution of computers from their humble beginnings as simple calculators up to the modern-day jack-of-all-trades devices like the iPhone. Readers will learn how computers evolved from humongous military-issue refrigerators to the spiffy, delicate, and intriguing devices that many modern people feel they can't live without anymore. Readers will also discover the historical significance of computers, and their pivotal roles in World War II, the Space Race, and the emergence of modern Western powers.

  11. Computation of a long-time evolution in a Schroedinger system

    International Nuclear Information System (INIS)

    Girard, R.; Kroeger, H.; Labelle, P.; Bajzer, Z.

    1988-01-01

    We compare different techniques for the computation of a long-time evolution and the S matrix in a Schroedinger system. As an application we consider a two-nucleon system interacting via the Yamaguchi potential. We suggest computing the time evolution for a very short time using Padé approximants, the long-time evolution then being obtained by iterative squaring. Within the technique of strong approximation of Møller wave operators (SAM) we compare our calculation with computation of the time evolution in the eigenrepresentation of the Hamiltonian and with the standard Lippmann-Schwinger solution for the S matrix. We find numerical agreement between these alternative methods for time-evolution computation up to half the number of digits of internal machine precision, and fairly rapid convergence of both techniques towards the Lippmann-Schwinger solution.
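
    The short-time-plus-squaring idea can be sketched as follows. This is a minimal illustration, not the authors' code: the (1,1) Padé (Cayley) approximant of exp(-iHΔt) is squared repeatedly to reach the full evolution time, and the result is checked against direct diagonalisation for an arbitrary small Hermitian matrix chosen only for the example.

```python
# Minimal sketch of short-time Pade propagation followed by iterative squaring.
import numpy as np

def short_time_propagator(H, dt):
    """(1,1) Pade approximant of exp(-i H dt), i.e. the Cayley form."""
    n = H.shape[0]
    A = np.eye(n) + 0.5j * dt * H
    B = np.eye(n) - 0.5j * dt * H
    return np.linalg.solve(A, B)          # (I + i dt H/2)^-1 (I - i dt H/2)

def long_time_propagator(H, total_time, n_doublings=20):
    dt = total_time / 2 ** n_doublings     # very short initial step
    U = short_time_propagator(H, dt)
    for _ in range(n_doublings):           # U(2t) = U(t) @ U(t)
        U = U @ U
    return U

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    H = (M + M.conj().T) / 2               # Hermitian test Hamiltonian
    U = long_time_propagator(H, total_time=5.0)
    w, V = np.linalg.eigh(H)               # compare with direct diagonalisation
    U_exact = V @ np.diag(np.exp(-1j * w * 5.0)) @ V.conj().T
    print("max deviation:", np.abs(U - U_exact).max())
```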

  12. Enzyme (re)design: lessons from natural evolution and computation.

    Science.gov (United States)

    Gerlt, John A; Babbitt, Patricia C

    2009-02-01

    The (re)design of enzymes to catalyze 'new' reactions is a topic of considerable practical and intellectual interest. Directed evolution (random mutagenesis followed by screening/selection) has been used widely to identify novel biocatalysts. However, 'rational' approaches using either natural divergent evolution or computational predictions based on chemical principles have been less successful. This review summarizes recent progress in evolution-based and computation-based (re)design.

  13. Framework for Computer-Aided Evolution of Object-Oriented Designs

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Aksit, Mehmet

    2008-01-01

    In this paper, we describe a framework for the computer-aided evolution of the designs of object-oriented software systems. Evolution mechanisms are software structures that prepare software for certain types of evolution. The framework uses a database which holds the evolution mechanisms, modeled

  14. EVOLUT - a computer program for fast burnup evaluation

    International Nuclear Information System (INIS)

    Craciunescu, T.; Dobrin, R.; Stamatescu, L.; Alexa, A.

    1999-01-01

    EVOLUT is a computer program for burnup evaluation. The input data consist, on the one hand, of axial and radial gamma-scanning profiles (for the experimental evaluation of the number of nuclei of a fission product - the burnup monitor - at the end of irradiation) and, on the other hand, of the irradiation history (the time length and values proportional to the neutron flux for each irradiation step). Using the evolution equation of the burnup monitor, the flux values are iteratively adjusted, by a multiplier factor, until the calculated number of nuclei equals the experimental one. The flux values are then used in the evolution equations of the fissile and fertile nuclei to determine the number of fissions and consequently the burnup. EVOLUT was successfully used in the analysis of several hundred CANDU and TRIGA-type fuel rods. We consider EVOLUT a useful tool for burnup evaluation based on gamma spectrometry measurements. EVOLUT can be run on an ordinary AT computer, in which case the results are obtained in a few minutes. It has an original and user-friendly graphical interface and also provides output as MATLAB script files for graphical representation and further numerical analysis. The program needs only simple input data and is valuable especially when a large number of burnup analyses are required quickly. (authors)
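
    A schematic sketch of this iteration is given below; it is hypothetical and not the EVOLUT source. A single multiplier scales the relative flux history until the computed end-of-irradiation inventory of the burnup monitor matches the value deduced from the gamma scans; the nuclear constants and the measured value are placeholders.

```python
# Hypothetical sketch of fitting a flux multiplier to a measured monitor inventory.
import math

LAMBDA = 6.0e-7               # decay constant of the monitor nuclide (1/s), illustrative
PRODUCTION_PER_FLUX = 1.0e-16  # fission yield * macroscopic cross-section, illustrative

def monitor_atoms(flux_history, multiplier):
    """Integrate dN/dt = P*phi*k - lambda*N over a stepwise irradiation history.
    flux_history: list of (duration_s, relative_flux) steps."""
    N = 0.0
    for duration, rel_flux in flux_history:
        production = PRODUCTION_PER_FLUX * rel_flux * multiplier
        # exact solution of the linear ODE over a constant-flux step
        decay = math.exp(-LAMBDA * duration)
        N = N * decay + production / LAMBDA * (1.0 - decay)
    return N

def fit_multiplier(flux_history, measured_atoms, tol=1e-6):
    """Bisection on the multiplier; the inventory is monotonically increasing in it."""
    lo, hi = 0.0, 1.0
    while monitor_atoms(flux_history, hi) < measured_atoms:
        hi *= 2.0
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if monitor_atoms(flux_history, mid) < measured_atoms:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    history = [(3.0e6, 0.9), (1.0e6, 0.0), (2.5e6, 1.1)]  # irradiation / shutdown / irradiation
    k = fit_multiplier(history, measured_atoms=2.0e5)
    print("fitted flux multiplier:", k)
```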

  15. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    Science.gov (United States)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. Firstly, a cloud computing task scheduling model is established and a fitness function is derived from it; the improved differential evolution algorithm then optimizes this fitness function, using a generation-dependent dynamic selection strategy and a dynamic mutation strategy to ensure both global and local search ability. A performance test was carried out on the CloudSim simulation platform; the experimental results show that the improved differential evolution algorithm can reduce task execution time and save user cost, achieving optimal scheduling of cloud computing tasks.
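
    As a rough illustration of this kind of scheme (not the authors' algorithm, and without their dynamic selection and mutation strategies), the sketch below applies plain DE/rand/1/bin differential evolution to a toy task-to-VM assignment problem with makespan as the fitness; all task and VM parameters are invented.

```python
# Minimal differential-evolution sketch for task-to-VM scheduling (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
task_length = rng.uniform(100, 1000, size=40)   # instructions per task (illustrative)
vm_speed = np.array([200.0, 350.0, 500.0])      # instructions per second per VM

def makespan(assignment):
    """Completion time of the most loaded VM under a task->VM assignment."""
    loads = np.zeros(len(vm_speed))
    for t, vm in enumerate(assignment):
        loads[vm] += task_length[t] / vm_speed[vm]
    return loads.max()

def decode(x):
    """Map a continuous candidate vector to integer VM indices."""
    return np.clip(np.floor(x), 0, len(vm_speed) - 1).astype(int)

def differential_evolution(pop_size=30, generations=200, F=0.5, CR=0.9):
    dim = len(task_length)
    pop = rng.uniform(0, len(vm_speed), size=(pop_size, dim))
    fit = np.array([makespan(decode(x)) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = a + F * (b - c)                 # DE/rand/1 mutation
            cross = rng.random(dim) < CR             # binomial crossover
            trial = np.where(cross, mutant, pop[i])
            f = makespan(decode(trial))
            if f < fit[i]:                           # greedy selection
                pop[i], fit[i] = trial, f
    best = fit.argmin()
    return decode(pop[best]), fit[best]

if __name__ == "__main__":
    schedule, t = differential_evolution()
    print("best makespan (s):", round(t, 2))
```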

  16. Building an application for computing the resource requests such as disk, CPU, and tape and studying the time evolution of computing model

    CERN Document Server

    Noormandipour, Mohammad Reza

    2017-01-01

    The goal of this project was to build an application that calculates the computing resources needed by the LHCb experiment for data processing and analysis, and predicts their evolution in future years. The source code was developed in the Python programming language, and the application was built and developed in CERN GitLab. This application will facilitate the calculation of the resources required by LHCb in both qualitative and quantitative aspects. The granularity of the computations is improved to a weekly basis, in contrast with the yearly basis used so far. The LHCb computing model will benefit from the new possibilities and options added, as the new predictions and calculations are aimed at giving more realistic and accurate estimates.

  17. Evolution of Computational Toxicology-from Primitive ...

    Science.gov (United States)

    Presentation at the Health Canada seminar in Ottawa, ON, Canada, on Nov. 15, 2016, on the Evolution of Computational Toxicology - from Primitive Beginnings to Sophisticated Application.

  18. From evolutionary computation to the evolution of things

    NARCIS (Netherlands)

    Eiben, A.E.; Smith, J.E.

    2015-01-01

    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as

  19. Computer Assets Recovery Project

    Science.gov (United States)

    CortesPena, Aida Yoguely

    2010-01-01

    This document reports on the project that was performed during the internship of the author. The project involved locating and recovering machines in various locations that Boeing no longer needs, so that they can be transferred to another user or to a non-profit organization. Other projects that the author performed were an inventory of toner and printers, and loading new computers and connecting them to the network.

  20. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  1. Bibliography. Computer-Oriented Projects, 1987.

    Science.gov (United States)

    Smith, Richard L., Comp.

    1988-01-01

    Provides an annotated list of references on computer-oriented projects. Includes information on computers; hands-on versus simulations; games; instruction; students' attitudes and learning styles; artificial intelligence; tutoring; and application of spreadsheets. (RT)

  2. The Evolution of Computing: Slowing down? Not Yet!

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Dr Sutherland will review the evolution of computing over the past decade, focusing particularly on the development of the database and middleware from client-server to Internet computing. But what are the next steps from the perspective of a software company? Dr Sutherland will discuss the development of Grid as well as the future applications revolving around collaborative working, which are appearing as the next wave of computing applications.

  3. [Earth Science Technology Office's Computational Technologies Project]

    Science.gov (United States)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  4. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    CERN Document Server

    AUTHOR|(SzGeCERN)377840; Fressard-Batraneanu, Silvia Maria; Ballestrero, Sergio; Contescu, Alexandru Cristian; Fazio, Daniel; Di Girolamo, Alessandro; Lee, Christopher Jon; Pozo Astigarraga, Mikel Eukeni; Scannicchio, Diana; Sedov, Alexey; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2015-01-01

    During the LHC Long Shutdown 1 period (LS1), which started in 2013, the Simulation at Point1 (Sim@P1) Project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 virtual machines (VMs) provided with 8 CPU cores each, for a total of up to 22000 parallel running jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 Project, operating a large scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 50 million CPU-hours and it generated more than 1.7 billion Monte Carlo events to various analysis communities. The design aspects a...

  5. Design, Results, Evolution and Status of the ATLAS simulation in Point1 project.

    CERN Document Server

    Ballestrero, Sergio; The ATLAS collaboration; Brasolin, Franco; Contescu, Alexandru Cristian; Fazio, Daniel; Di Girolamo, Alessandro; Lee, Christopher Jon; Pozo Astigarraga, Mikel Eukeni; Scannicchio, Diana; Sedov, Alexey; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2015-01-01

    During the LHC long shutdown period (LS1), which started in 2013, the simulation in Point1 (Sim@P1) project takes advantage, in an opportunistic way, of the trigger and data acquisition (TDAQ) farm of the ATLAS experiment. The farm provides more than 1500 computer nodes, which are particularly suitable for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2500 virtual machines (VM) provided with 8 CPU cores each, for a total of up to 20000 parallel running jobs. This contribution gives a thorough review of the design, the results and the evolution of the Sim@P1 project, operating a large scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ farm computing resources. During LS1, Sim@P1 was one of the most productive GRID sites: it delivered more than 50 million CPU-hours and it generated more than 1.7 billion Monte Carlo events to various analysis communities within the ATLAS collaboration. The particular design ...

  6. Computer modelling as a tool for understanding language evolution

    NARCIS (Netherlands)

    de Boer, Bart; Gontier, N; VanBendegem, JP; Aerts, D

    2006-01-01

    This paper describes the uses of computer models in studying the evolution of language. Language is a complex dynamic system that can be studied at the level of the individual and at the level of the population. Much of the dynamics of language evolution and language change occur because of the

  7. ["R" - project for statistical computing]

    DEFF Research Database (Denmark)

    Dessau, R.B.; Pipper, Christian Bressen

    2008-01-01

    An introduction to the R project for statistical computing (www.R-project.org) is presented. The main topics are: 1. To make the professional community aware of "R" as a potent and free software package for graphical and statistical analysis of medical data; 2. Simple well-known statistical tests are fairly easy to perform in R, but more complex modelling requires programming skills; 3. R is seen as a tool for teaching statistics and implementing complex modelling of medical data among medical professionals. Publication date: 28 January 2008.

  8. Projected evolution superoperators and the density operator

    International Nuclear Information System (INIS)

    Turner, R.E.; Dahler, J.S.; Snider, R.F.

    1982-01-01

    The projection operator method of Zwanzig and Feshbach is used to construct the time dependent density operator associated with a binary scattering event. The formula developed to describe this time dependence involves time-ordered cosine and sine projected evolution (memory) superoperators. Both Schroedinger and interaction picture results are presented. The former is used to demonstrate the equivalence of the time dependent solution of the von Neumann equation and the more familiar frequency dependent Laplace transform solution. For two particular classes of projection superoperators projected density operators are shown to be equivalent to projected wave functions. Except for these two special cases, no projected wave function analogs of projected density operators exist. Along with the decoupled-motions approximation, projected interaction picture density operators are applied to inelastic scattering events. Simple illustrations are provided of how this formalism is related to previously established results for two-state processes, namely, the theory of resonant transfer events, the first order Magnus approximation, and the Landau-Zener theory

  9. Open Compute Project at CERN

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Open Compute Project, OCP (http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware designs, with the goal of developing servers and data centers following the model traditionally associated with open source software projects. We have been following the OCP project for some time and decided to buy two OCP twin servers in 2013 to get some hands-on experience. The servers have been tested and compared with our standard hardware regularly acquired through large tenders. In this presentation we will give some relevant results from this testing and also discuss some of the more important differences that can matter for a larger deployment at CERN. Finally, we will outline the details of a possible project for a larger deployment of OCP hardware for production use at CERN.

  10. Project PEACH at UCLH: Student Projects in Healthcare Computing.

    Science.gov (United States)

    Ramachandran, Navin; Mohamedally, Dean; Taylor, Paul

    2017-01-01

    A collaboration between clinicians at UCLH and the Dept of Computer Science at UCL is giving students of computer science the opportunity to undertake real healthcare computing projects as part of their education. This is enabling the creation of a significant research computing platform within the Trust, based on open source components and hosted in the cloud, while providing a large group of students with experience of the specific challenges of health IT.

  11. Computer models of vocal tract evolution: an overview and critique

    NARCIS (Netherlands)

    de Boer, B.; Fitch, W. T.

    2010-01-01

    Human speech has been investigated with computer models since the invention of digital computers, and models of the evolution of speech first appeared in the late 1960s and early 1970s. Speech science and computer models have a long shared history because speech is a physical signal and can be

  12. Problem-Based Service Learning: The Evolution of a Team Project

    Science.gov (United States)

    Connor-Greene, Patricia A.

    2002-01-01

    In this article, I describe the evolution of a problem-based service learning project in an undergraduate Abnormal Psychology course. Students worked in teams on a semester-long project to locate and evaluate information and treatment for specific psychiatric disorders. As part of the project, each team selected relevant bibliographic materials,…

  13. Group Projects and the Computer Science Curriculum

    Science.gov (United States)

    Joy, Mike

    2005-01-01

    Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…

  14. Pervasive Computing Support for Hospitals: An Overview of the Activity-Based Computing Project

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob E

    2007-01-01

    The activity-based computing project researched pervasive computing support for clinical hospital work. Such technologies have potential for supporting the mobile, collaborative, and disruptive use of heterogeneous embedded devices in a hospital...

  15. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  16. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

    SART is an important algorithm for image reconstruction, in which the projection computation takes over half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with a memory optimization, is presented in this paper. Different from the usual method, projection lines are located based on every pixel, and the subsequent projection coefficient computation can make use of the results. The correlation between projection lines and pixels can be used to optimize the computation. (authors)

  17. Competitiveness in organizational integrated computer system project management

    Directory of Open Access Journals (Sweden)

    Zenovic GHERASIM

    2010-06-01

    The organizational integrated computer system project management aims at achieving competitiveness by unitary, connected and personalised treatment of the requirements for this type of project, along with the adequate application of all the basic management, administration and project planning principles, as well as of the basic concepts of organisational information management development. The paper presents some aspects of organizational computer systems project management competitiveness with specific reference to some Romanian companies’ projects.

  18. Numerical evaluation of methods for computing tomographic projections

    International Nuclear Information System (INIS)

    Zhuang, W.; Gopal, S.S.; Hebert, T.J.

    1994-01-01

    Methods for computing forward/back projections of 2-D images can be viewed as numerical integration techniques. The accuracy of any ray-driven projection method can be improved by increasing the number of ray-paths that are traced per projection bin. The accuracy of pixel-driven projection methods can be increased by dividing each pixel into a number of smaller sub-pixels and projecting each sub-pixel. The authors compared four competing methods of computing forward/back projections: bilinear interpolation, ray-tracing, pixel-driven projection based upon sub-pixels, and pixel-driven projection based upon circular, rather than square, pixels. This latter method is equivalent to a fast, bi-nonlinear interpolation. These methods and the choice of the number of ray-paths per projection bin or the number of sub-pixels per pixel present a trade-off between computational speed and accuracy. To solve the problem of assessing backprojection accuracy, the analytical inverse Fourier transform of the ramp filtered forward projection of the Shepp and Logan head phantom is derived
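
    To make the pixel-driven trade-off concrete, here is a minimal sketch (assuming parallel-beam geometry, unit-sized pixels and detector bins; it is not the authors' implementation) of a pixel-driven forward projection in which each pixel can be split into sub-pixels, illustrating how accuracy is bought with extra computation.

```python
# Pixel-driven parallel-beam forward projection with optional sub-pixel splitting.
import numpy as np

def pixel_driven_projection(image, theta, n_bins, subdiv=1):
    """Project `image` onto a 1-D detector at angle `theta` (radians).
    Each pixel is split into subdiv x subdiv sub-pixels; every sub-pixel's value is
    spread linearly over the two nearest detector bins."""
    ny, nx = image.shape
    proj = np.zeros(n_bins)
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    step = 1.0 / subdiv
    offsets = (np.arange(subdiv) + 0.5) * step - 0.5      # sub-pixel centre offsets
    for iy in range(ny):
        for ix in range(nx):
            val = image[iy, ix] / subdiv ** 2
            if val == 0.0:
                continue
            for oy in offsets:
                for ox in offsets:
                    x = ix - nx / 2 + 0.5 + ox
                    y = iy - ny / 2 + 0.5 + oy
                    s = x * cos_t + y * sin_t + n_bins / 2  # detector coordinate
                    b = int(np.floor(s))
                    w = s - b                               # linear interpolation weight
                    if 0 <= b < n_bins:
                        proj[b] += (1 - w) * val
                    if 0 <= b + 1 < n_bins:
                        proj[b + 1] += w * val
    return proj

if __name__ == "__main__":
    img = np.zeros((32, 32)); img[12:20, 12:20] = 1.0       # small square phantom
    coarse = pixel_driven_projection(img, np.pi / 6, 48, subdiv=1)
    fine = pixel_driven_projection(img, np.pi / 6, 48, subdiv=4)
    print("total intensity preserved:", coarse.sum(), fine.sum())
```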

  19. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    Science.gov (United States)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; descriptions of automatic fault diagnosis of ECLSS subsystems; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems.

  20. Evolution of the ATLAS distributed computing system during the LHC long shutdown

    Science.gov (United States)

    Campana, S.; Atlas Collaboration

    2014-06-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the Worldwide LHC Computing Grid (WLCG) distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1 PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileup. We will describe the evolution of the ADC software foreseen during this period. This includes consolidating the existing Production and Distributed Analysis framework (PanDA) and ATLAS Grid Information System (AGIS), together with the development and commissioning of next generation systems for distributed data management (DDM/Rucio) and production (Prodsys-2). We will explain how new technologies such as Cloud Computing and NoSQL databases, which ATLAS investigated as R&D projects in past years, will be integrated in production. Finally, we will describe more fundamental developments such as breaking job-to-data locality by exploiting storage federations and caches, and event level (rather than file or dataset level) workload engines.

  1. Evolution of the ATLAS distributed computing system during the LHC long shutdown

    International Nuclear Information System (INIS)

    Campana, S

    2014-01-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the Worldwide LHC Computing Grid (WLCG) distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1 PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileup. We will describe the evolution of the ADC software foreseen during this period. This includes consolidating the existing Production and Distributed Analysis framework (PanDA) and ATLAS Grid Information System (AGIS), together with the development and commissioning of next generation systems for distributed data management (DDM/Rucio) and production (Prodsys-2). We will explain how new technologies such as Cloud Computing and NoSQL databases, which ATLAS investigated as R and D projects in past years, will be integrated in production. Finally, we will describe more fundamental developments such as breaking job-to-data locality by exploiting storage federations and caches, and event level (rather than file or dataset level) workload engines.

  2. Computing as Empirical Science – Evolution of a Concept

    Directory of Open Access Journals (Sweden)

    Polak Paweł

    2016-12-01

    This article presents the evolution of philosophical and methodological considerations concerning empiricism in computer/computing science. In this study, we trace the most important currents in the history of reflection on computing. The forerunners of Artificial Intelligence, H.A. Simon and A. Newell, started these considerations in their paper Computer Science as Empirical Inquiry (1975). Later the concept of empirical computer science was developed by S.S. Shapiro, P. Wegner, A.H. Eden and P.J. Denning. They showed various empirical aspects of computing. This led to a view of the science of computing (or science of information processing) as a science of general scope. Some interesting contemporary ways towards a generalized perspective on computations were also shown (e.g. natural computing).

  3. Nonlinear evolution equations and solving algebraic systems: the importance of computer algebra

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Kostov, N.A.

    1989-01-01

    In the present paper we study the application of computer algebra to solve the nonlinear polynomial systems which arise in the investigation of nonlinear evolution equations. We consider several systems which are obtained in the classification of integrable nonlinear evolution equations with uniform rank. Other polynomial systems are related to finding the algebraic curves for finite-gap elliptic potentials of Lame type and generalizations. All systems under consideration are solved using a method based on construction of the Groebner basis for the corresponding polynomial ideals. The computations have been carried out using computer algebra systems. 20 refs
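
    As a small illustration of the Groebner-basis approach (using SymPy rather than the computer algebra systems cited, and a toy polynomial system rather than one of the classification systems from the paper), a lexicographic basis triangularises the system so it can be solved by back-substitution:

```python
# Toy Groebner-basis elimination example with SymPy (illustrative only).
from sympy import symbols, groebner, solve

x, y = symbols('x y')
system = [x**2 + y**2 - 5, x*y - 2]

# A lex Groebner basis eliminates x: the last basis element is univariate in y,
# so the system can be solved by back-substitution.
G = groebner(system, x, y, order='lex')
print(list(G))

# Solving the original system recovers all common roots.
print(solve(system, [x, y]))
```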

  4. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    Science.gov (United States)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.

    2015-12-01

    During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs) each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 project, operating a large scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and it generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interference with TDAQ operations and guarantees the security and the usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for the data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from Folsom to the Icehouse release, including the scalability issues addressed.

  5. The Evolution of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; Berghaus, Frank; Brasolin, Franco; Cordeiro, Cristovao; Desmarais, Ron; Field, Laurence; Gable, Ian; Giordano, Domenico; Di Girolamo, Alessandro; Hover, John; Leblanc, Matthew Edgar; Love, Peter; Paterson, Michael; Sobie, Randall; Zaytsev, Alexandr

    2015-01-01

    The ATLAS experiment has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This paper describes the overall evolution of cloud computing in ATLAS. The current status of the virtual machine (VM) management systems used for harnessing infrastructure as a service (IaaS) resources are discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for ma...

  6. The Evolution of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Berghaus, Frank; Love, Peter; Leblanc, Matthew Edgar; Di Girolamo, Alessandro; Paterson, Michael; Gable, Ian; Sobie, Randall; Field, Laurence

    2015-01-01

    The ATLAS experiment has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This work will describe the overall evolution of cloud computing in ATLAS. The current status of the VM management systems used for harnessing IaaS resources will be discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for managing VM images across multiple clouds, ...

  7. Computer Tutors: An Innovative Approach to Computer Literacy. Part I: The Early Stages.

    Science.gov (United States)

    Targ, Joan

    1981-01-01

    In Part I of this two-part article, the author describes the evolution of the Computer Tutor project in Palo Alto, California, and the strategies she incorporated into a successful student-taught computer literacy program. Journal availability: Educational Computer, P.O. Box 535, Cupertino, CA 95015. (Editor/SJL)

  8. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 paper [19], which documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project, documenting all of the project’s four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has...

  9. Applying natural evolution for solving computational problems - Lecture 1

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists in solving computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
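
    A minimal sketch of the evolutionary loop the lecture outlines (initialisation, evaluation, selection, crossover, mutation) is shown below on the classic OneMax toy problem; this is generic evolutionary computing, not ECJ or the lecture's own code.

```python
# Minimal generational evolutionary algorithm on the OneMax problem (illustrative only).
import random

GENOME_LEN, POP_SIZE, GENERATIONS = 40, 50, 60
random.seed(3)

def fitness(ind):                        # OneMax: count of 1-bits
    return sum(ind)

def tournament(pop, k=3):                # selection phase
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):                     # one-point crossover
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(ind, rate=1.0 / GENOME_LEN):  # bit-flip mutation
    return [bit ^ 1 if random.random() < rate else bit for bit in ind]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]
best = max(population, key=fitness)
print("best fitness:", fitness(best), "of", GENOME_LEN)
```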

  10. Applying natural evolution for solving computational problems - Lecture 2

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists in solving computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.

  11. Discovering local patterns of co - evolution: computational aspects and biological examples

    Directory of Open Access Journals (Sweden)

    Tuller Tamir

    2010-01-01

    Background: Co-evolution is the process in which two (or more) sets of orthologs exhibit a similar or correlative pattern of evolution. Co-evolution is a powerful way to learn about the functional interdependencies between sets of genes and cellular functions and to predict physical interactions. More generally, it can be used for answering fundamental questions about the evolution of biological systems. Orthologs that exhibit a strong signal of co-evolution in a certain part of the evolutionary tree may show a mild signal of co-evolution in other branches of the tree. The major reasons for this phenomenon are noise in the biological input, genes that gain or lose functions, and the fact that some measures of co-evolution relate to rare events such as positive selection. Previous publications in the field dealt with the problem of finding sets of genes that co-evolved along an entire underlying phylogenetic tree, without considering the fact that co-evolution is often local. Results: In this work, we describe a new set of biological problems that are related to finding patterns of local co-evolution. We discuss their computational complexity and design algorithms for solving them. These algorithms outperform other bi-clustering methods as they are designed specifically for solving the set of problems mentioned above. We use our approach to trace the co-evolution of fungal, eukaryotic, and mammalian genes at high resolution across the different parts of the corresponding phylogenetic trees. Specifically, we discover regions in the fungi tree that are enriched with positive evolution. We show that metabolic genes exhibit a remarkable level of co-evolution and different patterns of co-evolution in various biological datasets. In addition, we find that protein complexes that are related to gene expression exhibit non-homogeneous levels of co-evolution across different parts of the fungi evolutionary line. In the case of mammalian evolution

  12. Schedule evolution during the life-time of the LHC project

    CERN Document Server

    Foraz, K; Gaillard, H; Hauviller, Claude; Weisz, S

    2007-01-01

    The Large Hadron Collider Project was approved by the CERN Council in December 1994. The CERN management opted from the beginning of the project for a very aggressive installation planning based on just-in-time sequencing of all activities. This paper aims to show how different factors (technical development, procurement, logistics and organization) have affected the schedule evolution through the lifetime of the project. It describes the cause-effect analysis of the major rescheduling that occurred during the installation of the LHC and presents some general conclusions potentially applicable to other projects.

  13. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  14. Nascence project: nanoscale engineering for novel computation using evolution

    NARCIS (Netherlands)

    Broersma, Haitze J.; Gomez, Faustino; Miller, Julian; Petty, Mike; Tufte, Gunnar

    2012-01-01

    Living systems are able to achieve prodigious feats of computation with remarkable speed and efficiency (e.g. navigation in a complex environment, object recognition, decision making, and reasoning). Many of these tasks have not been adequately solved using algorithms running on our most powerful

  15. Virtual Mockup test based on computational science and engineering. Near future technology projected by JSPS-RFTF ADVENTURE project

    International Nuclear Information System (INIS)

    Yoshimura, Shinobu

    2001-01-01

    The ADVENTURE project began in August 1997 as a project in the 'computational science' field of JSPS-RFTF, and is being carried out as a five-year project. In this project, using versatile parallel computing environments such as PC clusters and massively parallel supercomputers, a large-scale parallel computational mechanics system (the ADVENTURE system) is being further developed: it can solve actual dynamical equations for arbitrary shapes with 10 to 100 million degrees of freedom while maintaining a general-purpose analysis capability comparable to present general-purpose computational mechanics systems, and it is capable of carrying out design optimization on shapes, physical properties, loading conditions, and so on. Here, after outlining the background and features of the R and D on the ADVENTURE system, the near-future virtual mockup testing projected from it is described. (G.K.)

  16. The three-dimensional matrix -- An evolution in project management

    Energy Technology Data Exchange (ETDEWEB)

    Glidewell, D.

    1996-09-01

    In the Functional Department Dimension, functional departments such as project management, design, and construction would be maintained to maximize consistency among project teams, evenly allocate training opportunities, and facilitate the crossfeeding of lessons learned and innovative ideas. Functional departments were also determined to be the surest way of complying uniformly with all project control systems required by the Department of Energy (Sandia's primary external customer). The Technical Discipline dimension was maintained to enhance communication within the technical disciplines, such as electrical engineering, mechanical engineering, civil engineering, etc., and to evenly allocate technical training opportunities, reduce technical obsolescence, and enhance design standards. The third dimension, the Project Dimension, represents the next step in the project management evolution at Sandia, and together with Functional Department and Technical Discipline Dimensions constitutes the three-dimensional matrix. It is this Project Dimension that will be explored thoroughly in this paper, including a discussion of the specific roles and responsibilities of both management and the project team.

  17. Digital Genesis: Computers, Evolution and Artificial Life

    OpenAIRE

    Taylor, Tim; Dorin, Alan; Korb, Kevin

    2015-01-01

    The application of evolution in the digital realm, with the goal of creating artificial intelligence and artificial life, has a history as long as that of the digital computer itself. We illustrate the intertwined history of these ideas, starting with the early theoretical work of John von Neumann and the pioneering experimental work of Nils Aall Barricelli. We argue that evolutionary thinking and artificial life will continue to play an integral role in the future development of the digital ...

  18. Collaborative Computational Project for Electron cryo-Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Chris; Burnley, Tom [Science and Technology Facilities Council, Research Complex at Harwell, Didcot OX11 0FA (United Kingdom); Patwardhan, Ardan [European Molecular Biology Laboratory, Wellcome Trust Genome Campus, Hinxton, Cambridge CB10 1SD (United Kingdom); Scheres, Sjors [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH (United Kingdom); Topf, Maya [University of London, Malet Street, London WC1E 7HX (United Kingdom); Roseman, Alan [University of Manchester, Oxford Road, Manchester M13 9PT (United Kingdom); Winn, Martyn, E-mail: martyn.winn@stfc.ac.uk [Science and Technology Facilities Council, Daresbury Laboratory, Warrington WA4 4AD (United Kingdom); Science and Technology Facilities Council, Research Complex at Harwell, Didcot OX11 0FA (United Kingdom)

    2015-01-01

    The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) is a new initiative for the structural biology community, following the success of CCP4 for macromolecular crystallography. Progress in supporting the users and developers of cryoEM software is reported. The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has recently been established. The aims of the project are threefold: to build a coherent cryoEM community which will provide support for individual scientists and will act as a focal point for liaising with other communities, to support practising scientists in their use of cryoEM software and finally to support software developers in producing and disseminating robust and user-friendly programs. The project is closely modelled on CCP4 for macromolecular crystallography, and areas of common interest such as model fitting, underlying software libraries and tools for building program packages are being exploited. Nevertheless, cryoEM includes a number of techniques covering a large range of resolutions and a distinct project is required. In this article, progress so far is reported and future plans are discussed.

  19. Collaborative Computational Project for Electron cryo-Microscopy

    International Nuclear Information System (INIS)

    Wood, Chris; Burnley, Tom; Patwardhan, Ardan; Scheres, Sjors; Topf, Maya; Roseman, Alan; Winn, Martyn

    2015-01-01

    The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) is a new initiative for the structural biology community, following the success of CCP4 for macromolecular crystallography. Progress in supporting the users and developers of cryoEM software is reported. The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has recently been established. The aims of the project are threefold: to build a coherent cryoEM community which will provide support for individual scientists and will act as a focal point for liaising with other communities, to support practising scientists in their use of cryoEM software and finally to support software developers in producing and disseminating robust and user-friendly programs. The project is closely modelled on CCP4 for macromolecular crystallography, and areas of common interest such as model fitting, underlying software libraries and tools for building program packages are being exploited. Nevertheless, cryoEM includes a number of techniques covering a large range of resolutions and a distinct project is required. In this article, progress so far is reported and future plans are discussed

  20. Exploitation of cloud computing in management of construction projects in Slovakia

    Directory of Open Access Journals (Sweden)

    Mandičák Tomáš

    2016-12-01

    Full Text Available Cloud computing is a highly topical issue. Cloud computing represents a new model for information technology (IT) services based on the exploitation of the Web (it represents a cloud) and other application platforms, as well as software as a service. In general, the exploitation of cloud computing in construction project management has several advantages, as demonstrated by several research reports. Currently, research quantifying the exploitation of cloud computing in the Slovak construction industry has not yet been carried out. The article discusses the issue of exploitation of cloud computing in construction project management in Slovakia. The main objective of the research is to confirm whether factors such as size of construction enterprise, owner of construction enterprise and participant of construction project have any impact on the exploitation level of cloud computing in construction project management. It includes confirmation of differences in use between different participants of the construction project or between construction enterprises broken down by size and shareholders.

  1. A new major SETI project based on Project SERENDIP data and 100,000 personal computers

    Science.gov (United States)

    Sullivan, Woodruff T., III; Werthimer, Dan; Bowyer, Stuart; Cobb, Jeff; Gedye, David; Anderson, David

    1997-01-01

    We are now developing an innovative SETI project involving massively parallel computation on desktop computers scattered around the world. The public will be uniquely involved in a real scientific project. Individuals will download a screensaver program that will not only provide the usual attractive graphics when their computer is idle, but will also perform sophisticated analysis of SETI data using the host computer. The data are tapped off Project SERENDIP IV's receiver and SETI survey operating on the 305-m-diameter Arecibo radio telescope. We make a continuous tape-recording of a 2-MHz bandwidth signal centered on the 21-cm H I line. The data on these tapes are then preliminarily screened and parceled out by a server that supplies small chunks of data over the Internet to clients possessing the screensaver software. After the client computer has automatically analyzed a complete chunk of data, a report on the best candidate signals is sent back to the server, whereupon a new chunk of data is sent out. If 50,000-100,000 customers can be achieved, the computing power will be equivalent to a substantial fraction of a typical supercomputer, and the project will cover a volume of parameter space comparable to that of SERENDIP IV.
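
    A minimal sketch of the client-side work loop this abstract describes, written in Python; the server interface and the FFT-based narrowband peak search below are illustrative placeholders, not the project's actual protocol or analysis pipeline:

        import numpy as np

        def fetch_chunk(server_url):
            # Placeholder: the real client downloads a small chunk of the recorded
            # 2-MHz-bandwidth data from the project's data server.
            rng = np.random.default_rng()
            return rng.standard_normal(2**16)  # fake time-series samples

        def analyze_chunk(samples, top_n=5):
            # Stand-in analysis: look for narrowband peaks in the power spectrum.
            spectrum = np.abs(np.fft.rfft(samples))**2
            noise_floor = np.median(spectrum)
            best_bins = np.argsort(spectrum)[-top_n:][::-1]
            return [(int(b), float(spectrum[b] / noise_floor)) for b in best_bins]

        def report_candidates(server_url, candidates):
            # Placeholder: the real client would send results back to the server.
            print(f"reporting {len(candidates)} candidate signals to {server_url}")

        def client_loop(server_url, n_chunks=3):
            # Idle-time loop: fetch a chunk, analyze it, report the best candidates.
            for _ in range(n_chunks):
                samples = fetch_chunk(server_url)
                candidates = analyze_chunk(samples)
                report_candidates(server_url, candidates)

        if __name__ == "__main__":
            client_loop("http://example.org/seti-server")  # hypothetical server URL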

  2. Vinayaka : A Semi-Supervised Projected Clustering Method Using Differential Evolution

    OpenAIRE

    Satish Gajawada; Durga Toshniwal

    2012-01-01

    Differential Evolution (DE) is an algorithm for evolutionary optimization. Clustering problems have been solved by using DE based clustering methods but these methods may fail to find clusters hidden in subspaces of high dimensional datasets. Subspace and projected clustering methods have been proposed in literature to find subspace clusters that are present in subspaces of dataset. In this paper we propose VINAYAKA, a semi-supervised projected clustering method based on DE. In this method DE opt...
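
    For readers unfamiliar with DE, a minimal Python sketch of the classic DE/rand/1/bin scheme on a toy objective follows; this is generic DE, not the semi-supervised projected-clustering variant proposed in the paper:

        import numpy as np

        def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                                   generations=100, seed=0):
            rng = np.random.default_rng(seed)
            dim = len(bounds)
            lo, hi = np.array(bounds, dtype=float).T
            pop = rng.uniform(lo, hi, size=(pop_size, dim))
            fitness = np.array([objective(x) for x in pop])
            for _ in range(generations):
                for i in range(pop_size):
                    # Mutation: combine three distinct random vectors (DE/rand/1).
                    a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                             3, replace=False)]
                    mutant = np.clip(a + F * (b - c), lo, hi)
                    # Binomial crossover between target vector and mutant.
                    cross = rng.random(dim) < CR
                    cross[rng.integers(dim)] = True
                    trial = np.where(cross, mutant, pop[i])
                    # Greedy selection: keep the better of target and trial.
                    f_trial = objective(trial)
                    if f_trial < fitness[i]:
                        pop[i], fitness[i] = trial, f_trial
            best = np.argmin(fitness)
            return pop[best], fitness[best]

        # Toy usage: minimize the sphere function in 5 dimensions.
        x_best, f_best = differential_evolution(lambda x: float(np.sum(x**2)),
                                                bounds=[(-5, 5)] * 5)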

  3. AGIS: Evolution of Distributed Computing information system for ATLAS

    Science.gov (United States)

    Anisenkov, A.; Di Girolamo, A.; Alandes, M.; Karavakis, E.

    2015-12-01

    ATLAS, a particle physics experiment at the Large Hadron Collider at CERN, produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm and a high degree of decentralization of computing resources in order to meet the ATLAS requirements of petabyte-scale data operations. It has evolved after the first period of LHC data taking (Run-1) in order to cope with the new challenges of the upcoming Run-2. In this paper we describe the evolution and recent developments of the ATLAS Grid Information System (AGIS), developed in order to integrate configuration and status information about resources, services and topology of the computing infrastructure used by the ATLAS Distributed Computing applications and services.

  4. Computational nuclear quantum many-body problem: The UNEDF project

    Science.gov (United States)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  5. Norwegian computers in European energy research project

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    16 NORD computers have been ordered for the JET data acquisition and storage system. The computers will be arranged in a 'double star' configuration, developed by CERN. Two control consoles each have their own computer. All computers for communication, control, diagnostics, consoles and testing are NORD-100s, while the computer for data storage and analysis is a NORD-500. The operating system is SINTRAN. A CAMAC serial highway with fibre optics is to be used for long communication paths. The programming languages FORTRAN, NODAL, NORD PL, PASCAL and BASIC may be used. The JET project and TOKAMAK type machines are briefly described. (JIW)

  6. Computing element evolution towards Exascale and its impact on legacy simulation codes

    International Nuclear Information System (INIS)

    Colin de Verdiere, Guillaume J.L.

    2015-01-01

    In the light of the current race towards the Exascale, this article highlights the main features of the forthcoming computing elements that will be at the core of next generations of supercomputers. The market analysis, underlying this work, shows that computers are facing a major evolution in terms of architecture. As a consequence, it is important to understand the impacts of those evolutions on legacy codes or programming methods. The problems of dissipated power and memory access are discussed and will lead to a vision of what should be an exascale system. To survive, programming languages had to respond to the hardware evolutions either by evolving or with the creation of new ones. From the previous elements, we elaborate why vectorization, multithreading, data locality awareness and hybrid programming will be the key to reach the exascale, implying that it is time to start rewriting codes. (orig.)

  7. Computing element evolution towards Exascale and its impact on legacy simulation codes

    Science.gov (United States)

    Colin de Verdière, Guillaume J. L.

    2015-12-01

    In the light of the current race towards the Exascale, this article highlights the main features of the forthcoming computing elements that will be at the core of next generations of supercomputers. The market analysis, underlying this work, shows that computers are facing a major evolution in terms of architecture. As a consequence, it is important to understand the impacts of those evolutions on legacy codes or programming methods. The problems of dissipated power and memory access are discussed and will lead to a vision of what should be an exascale system. To survive, programming languages had to respond to the hardware evolutions either by evolving or with the creation of new ones. From the previous elements, we elaborate why vectorization, multithreading, data locality awareness and hybrid programming will be the key to reach the exascale, implying that it is time to start rewriting codes.

  8. The fifth generation computer project state of the art report 111

    CERN Document Server

    Scarrott

    1983-01-01

    The Fifth Generation Computer Project is a two-part book consisting of the invited papers and the analysis. The invited papers examine various aspects of The Fifth Generation Computer Project. The analysis part assesses the major advances of the Fifth Generation Computer Project and provides a balanced analysis of the state of the art in The Fifth Generation. This part provides a balanced and comprehensive view of the development in Fifth Generation Computer technology. The Bibliography compiles the most important published material on the subject of The Fifth Generation.

  9. ABrIL - Advanced Brain Imaging Lab : a cloud based computation environment for cooperative neuroimaging projects.

    Science.gov (United States)

    Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo

    2014-01-01

    Neuroscience is an increasingly multidisciplinary and highly cooperative field where neuroimaging plays an important role. The rapid evolution of neuroimaging demands a growing number of computing resources and skills that need to be put in place at every lab. Typically each group tries to set up their own servers and workstations to support their neuroimaging needs, having to learn everything from operating system management to the details of specific neuroscience software tools before any results can be obtained from each setup. This setup and learning process is replicated in every lab, even if a strong collaboration among several groups is going on. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - the Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services in an interactive neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as the contribution to a high-impact paper published in the January issue of the Neuroimage journal. The ABrIL system has shown its applicability in several neuroscience projects at a relatively low cost, promoting truly collaborative actions and speeding up project results and their clinical applicability.

  10. Topographic evolution of sandbars: Flume experiment and computational modeling

    Science.gov (United States)

    Kinzel, Paul J.; Nelson, Jonathan M.; McDonald, Richard R.; Logan, Brandy L.

    2010-01-01

    Measurements of sandbar formation and evolution were carried out in a laboratory flume and the topographic characteristics of these barforms were compared to predictions from a computational flow and sediment transport model with bed evolution. The flume experiment produced sandbars with approximate mode 2, whereas numerical simulations produced a bed morphology better approximated as alternate bars, mode 1. In addition, bar formation occurred more rapidly in the laboratory channel than for the model channel. This paper focuses on a steady-flow laboratory experiment without upstream sediment supply. Future experiments will examine the effects of unsteady flow and sediment supply and the use of numerical models to simulate the response of barform topography to these influences.

  11. Optimized temporal pattern of brain stimulation designed by computational evolution.

    Science.gov (United States)

    Brocker, David T; Swan, Brandon D; So, Rosa Q; Turner, Dennis A; Gross, Robert E; Grill, Warren M

    2017-01-04

    Brain stimulation is a promising therapy for several neurological disorders, including Parkinson's disease. Stimulation parameters are selected empirically and are limited to the frequency and intensity of stimulation. We varied the temporal pattern of deep brain stimulation to ameliorate symptoms in a parkinsonian animal model and in humans with Parkinson's disease. We used model-based computational evolution to optimize the stimulation pattern. The optimized pattern produced symptom relief comparable to that from standard high-frequency stimulation (a constant rate of 130 or 185 Hz) and outperformed frequency-matched standard stimulation in a parkinsonian rat model and in patients. Both optimized and standard high-frequency stimulation suppressed abnormal oscillatory activity in the basal ganglia of rats and humans. The results illustrate the utility of model-based computational evolution of temporal patterns to increase the efficiency of brain stimulation in treating Parkinson's disease and thereby reduce the energy required for successful treatment below that of current brain stimulation paradigms. Copyright © 2017, American Association for the Advancement of Science.
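
    A toy Python sketch of what model-based computational evolution of a temporal pattern can look like: binary pulse patterns over one stimulation cycle are evolved by a simple genetic algorithm against a surrogate cost; model_cost below is a made-up placeholder, not the biophysical basal ganglia model used in the study:

        import numpy as np

        rng = np.random.default_rng(1)
        N_BINS = 30          # time bins per stimulation cycle
        POP, GENS = 40, 200

        def model_cost(pattern):
            # Placeholder surrogate: penalize deviation from a target mean pulse rate
            # and reward irregular inter-pulse intervals; the real study evaluated
            # candidate patterns in a computational model of the basal ganglia.
            rate_term = (pattern.mean() - 0.4) ** 2
            gaps = np.diff(np.flatnonzero(pattern), prepend=0)
            irregularity = gaps.std() if gaps.size else 0.0
            return rate_term - 0.01 * irregularity

        def evolve():
            pop = rng.integers(0, 2, size=(POP, N_BINS))
            for _ in range(GENS):
                costs = np.array([model_cost(p) for p in pop])
                parents = pop[np.argsort(costs)[: POP // 2]]   # truncation selection
                children = parents.copy()
                cuts = rng.integers(1, N_BINS, size=len(children))
                for child, cut, mate in zip(children, cuts, parents[::-1]):
                    child[cut:] = mate[cut:]                   # one-point crossover
                flip = rng.random(children.shape) < 0.02       # bit-flip mutation
                children = np.where(flip, 1 - children, children)
                pop = np.vstack([parents, children])
            costs = np.array([model_cost(p) for p in pop])
            return pop[np.argmin(costs)]

        best_pattern = evolve()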

  12. Projects Using a Computer Algebra System in First-Year Undergraduate Mathematics

    Science.gov (United States)

    Rosenzweig, Martin

    2007-01-01

    This paper illustrates the use of computer-based projects in two one-semester first-year undergraduate mathematics classes. Developed over a period of years, the approach is one in which the classes are organised into work-groups, with computer-based projects being undertaken periodically to illustrate the class material. These projects are…

  13. Evaluating the Effectiveness of Collaborative Computer-Intensive Projects in an Undergraduate Psychometrics Course

    Science.gov (United States)

    Barchard, Kimberly A.; Pace, Larry A.

    2010-01-01

    Undergraduate psychometrics classes often use computer-intensive active learning projects. However, little research has examined active learning or computer-intensive projects in psychometrics courses. We describe two computer-intensive collaborative learning projects used to teach the design and evaluation of psychological tests. Course…

  14. Evolution of the heteroharmonic strategy for target-range computation in the echolocation of Mormoopidae.

    Directory of Open Access Journals (Sweden)

    Emanuel C Mora

    2013-06-01

    Full Text Available Echolocating bats use the time elapsed from biosonar pulse emission to the arrival of echo (defined as echo-delay) to assess target-distance. Target-distance is represented in the brain by delay-tuned neurons that are classified as either heteroharmonic or homoharmonic. Heteroharmonic neurons respond more strongly to pulse-echo pairs in which the timing of the pulse is given by the fundamental biosonar harmonic while the timing of echoes is provided by one (or several) of the higher order harmonics. On the other hand, homoharmonic neurons are tuned to the echo delay between similar harmonics in the emitted pulse and echo. It is generally accepted that heteroharmonic computations are advantageous over homoharmonic computations; i.e. heteroharmonic neurons receive information from call and echo in different frequency-bands which helps to avoid jamming between pulse and echo signals. Heteroharmonic neurons have been found in two species of the family Mormoopidae (Pteronotus parnellii and Pteronotus quadridens) and in Rhinolophus rouxi. Recently, it was proposed that heteroharmonic target-range computations are a primitive feature of the genus Pteronotus that was preserved in the evolution of the genus. Here we review recent findings on the evolution of echolocation in Mormoopidae, and try to link those findings to the evolution of the heteroharmonic computation strategy. We stress the hypothesis that the ability to perform heteroharmonic computations evolved separately from the ability of using long constant-frequency echolocation calls, high duty cycle echolocation and Doppler Shift Compensation. Also, we present the idea that heteroharmonic computations might have been of advantage for categorizing prey size, hunting eared insects and living in large conspecific colonies. We make five testable predictions that might help future investigations to clarify the evolution of the heteroharmonic echolocation in Mormoopidae and other families.
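
    The echo-delay to target-range conversion underlying all of this is itself elementary; a small Python sketch, assuming a nominal speed of sound in air:

        SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degC (assumed)

        def target_range(echo_delay_s):
            # The echo travels to the target and back, so range is half the round-trip path.
            return SPEED_OF_SOUND * echo_delay_s / 2.0

        # Example: a 10 ms pulse-to-echo delay corresponds to about 1.7 m.
        print(target_range(0.010))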

  15. Computational Modeling of Microstructural-Evolution in AISI 1005 Steel During Gas Metal Arc Butt Welding

    Science.gov (United States)

    2013-05-01

    A fully coupled (two-way

  16. The Physics of Open Ended Evolution

    Science.gov (United States)

    Adams, Alyssa M.

    What makes living systems different than non-living ones? Unfortunately this question is impossible to answer, at least currently. Instead, we must face computationally tangible questions based on our current understanding of physics, computation, information, and biology. Yet we have few insights into how living systems might quantifiably differ from their non-living counterparts, as in a mathematical foundation to explain away our observations of biological evolution, emergence, innovation, and organization. The development of a theory of living systems, if at all possible, demands a mathematical understanding of how data generated by complex biological systems changes over time. In addition, this theory ought to be broad enough as to not be constrained to an Earth-based biochemistry. In this dissertation, the philosophy of studying living systems from the perspective of traditional physics is first explored as a motivating discussion for subsequent research. Traditionally, we have often thought of the physical world from a bottom-up approach: things happening on a smaller scale aggregate into things happening on a larger scale. In addition, the laws of physics are generally considered static over time. Research suggests that biological evolution may follow dynamic laws that (at least in part) change as a function of the state of the system. Of the three featured research projects, cellular automata (CA) are used as a model to study certain aspects of living systems in two of them. These aspects include self-reference, open-ended evolution, local physical universality, subjectivity, and information processing. Open-ended evolution and local physical universality are attributed to the vast amount of innovation observed throughout biological evolution. Biological systems may distinguish themselves in terms of information processing and storage, not outside the theory of computation. The final research project concretely explores real-world phenomenon by means of

  17. VIP visit of LHC Computing Grid Project

    CERN Multimedia

    Krajewski, Yann Tadeusz

    2015-01-01

    VIP visit of LHC Computing Grid Project with Dr.-Ing. Tarek Kamel [Senior Advisor to the President for Government Engagement, ICANN Geneva Office] and Dr Nigel Hickson [VP, IGO Engagement, ICANN Geneva Office].

  18. Spiral and Project-Based Learning with Peer Assessment in a Computer Science Project Management Course

    Science.gov (United States)

    Jaime, Arturo; Blanco, José Miguel; Domínguez, César; Sánchez, Ana; Heras, Jónathan; Usandizaga, Imanol

    2016-01-01

    Different learning methods such as project-based learning, spiral learning and peer assessment have been implemented in science disciplines with different outcomes. This paper presents a proposal for a project management course in the context of a computer science degree. Our proposal combines three well-known methods: project-based learning,…

  19. Computer architecture evaluation for structural dynamics computations: Project summary

    Science.gov (United States)

    Standley, Hilda M.

    1989-01-01

    The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.

  20. Elastic Multi-scale Mechanisms: Computation and Biological Evolution.

    Science.gov (United States)

    Diaz Ochoa, Juan G

    2018-01-01

    Explanations based on low-level interacting elements are valuable and powerful since they contribute to identify the key mechanisms of biological functions. However, many dynamic systems based on low-level interacting elements with unambiguous, finite, and complete information of initial states generate future states that cannot be predicted, implying an increase of complexity and open-ended evolution. Such systems are like Turing machines, that overlap with dynamical systems that cannot halt. We argue that organisms find halting conditions by distorting these mechanisms, creating conditions for a constant creativity that drives evolution. We introduce a modulus of elasticity to measure the changes in these mechanisms in response to changes in the computed environment. We test this concept in a population of predators and predated cells with chemotactic mechanisms and demonstrate how the selection of a given mechanism depends on the entire population. We finally explore this concept in different frameworks and postulate that the identification of predictive mechanisms is only successful with small elasticity modulus.

  1. Computer-integrated design and information management for nuclear projects

    International Nuclear Information System (INIS)

    Gonzalez, A.; Martin-Guirado, L.; Nebrera, F.

    1987-01-01

    Over the past seven years, Empresarios Agrupados has been developing a comprehensive, computer-integrated system to perform the majority of the engineering, design, procurement and construction management activities in nuclear, fossil-fired as well as hydro power plant projects. This system, which is already in a production environment, comprises a large number of computer programs and data bases designed using a modular approach. Each software module, dedicated to meeting the needs of a particular design group or project discipline, facilitates the performance of functional tasks characteristic of the power plant engineering process

  2. Modeling the evolution of channel shape: Balancing computational efficiency with hydraulic fidelity

    Science.gov (United States)

    Wobus, C.W.; Kean, J.W.; Tucker, G.E.; Anderson, R. Scott

    2008-01-01

    The cross-sectional shape of a natural river channel controls the capacity of the system to carry water off a landscape, to convey sediment derived from hillslopes, and to erode its bed and banks. Numerical models that describe the response of a landscape to changes in climate or tectonics therefore require formulations that can accommodate evolution of channel cross-sectional geometry. However, fully two-dimensional (2-D) flow models are too computationally expensive to implement in large-scale landscape evolution models, while available simple empirical relationships between width and discharge do not adequately capture the dynamics of channel adjustment. We have developed a simplified 2-D numerical model of channel evolution in a cohesive, detachment-limited substrate subject to steady, unidirectional flow. Erosion is assumed to be proportional to boundary shear stress, which is calculated using an approximation of the flow field in which log-velocity profiles are assumed to apply along vectors that are perpendicular to the local channel bed. Model predictions of the velocity structure, peak boundary shear stress, and equilibrium channel shape compare well with predictions of a more sophisticated but more computationally demanding ray-isovel model. For example, the mean velocities computed by the two models are consistent to within ~3%, and the predicted peak shear stress is consistent to within ~7%. Furthermore, the shear stress distributions predicted by our model compare favorably with available laboratory measurements for prescribed channel shapes. A modification to our simplified code in which the flow includes a high-velocity core allows the model to be extended to estimate shear stress distributions in channels with large width-to-depth ratios. Our model is efficient enough to incorporate into large-scale landscape evolution codes and can be used to examine how channels adjust both cross-sectional shape and slope in response to tectonic and climatic
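
    A minimal Python sketch of the kind of closure described above: boundary shear stress estimated from a logarithmic velocity profile and a detachment-limited erosion rate proportional to that stress; the coefficients and the single-vertical treatment are illustrative assumptions, not the paper's full 2-D formulation:

        import numpy as np

        RHO = 1000.0    # water density, kg/m^3
        KAPPA = 0.41    # von Karman constant
        Z0 = 0.001      # roughness length, m (assumed)
        K_E = 1e-6      # erodibility coefficient, m/yr per Pa (assumed)

        def shear_velocity(u, z):
            # Invert the law of the wall u(z) = (u*/kappa) * ln(z/z0)
            # using a velocity u measured at height z above the bed.
            return KAPPA * u / np.log(z / Z0)

        def boundary_shear_stress(u, z):
            return RHO * shear_velocity(u, z) ** 2

        def erosion_rate(u, z):
            # Detachment-limited erosion proportional to boundary shear stress.
            return K_E * boundary_shear_stress(u, z)

        # Example: near-bed velocity of 1 m/s measured 0.5 m above the bed.
        print(boundary_shear_stress(1.0, 0.5), erosion_rate(1.0, 0.5))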

  3. Computational Nuclear Quantum Many-Body Problem: The UNEDF Project

    OpenAIRE

    Bogner, Scott; Bulgac, Aurel; Carlson, Joseph A.; Engel, Jonathan; Fann, George; Furnstahl, Richard J.; Gandolfi, Stefano; Hagen, Gaute; Horoi, Mihai; Johnson, Calvin W.; Kortelainen, Markus; Lusk, Ewing; Maris, Pieter; Nam, Hai Ah; Navratil, Petr

    2013-01-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  4. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    1992-05-01

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January, 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000 fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult with the authors directly for questions or comments about their papers

  5. 2nd Generation QUATARA Flight Computer Project

    Science.gov (United States)

    Falker, Jay; Keys, Andrew; Fraticelli, Jose Molina; Capo-Iugo, Pedro; Peeples, Steven

    2015-01-01

    Single core flight computer boards have been designed, developed, and tested (DD&T) to be flown in small satellites for the last few years. In this project, a prototype flight computer will be designed as a distributed multi-core system containing four microprocessors running code in parallel. This flight computer will be capable of performing multiple computationally intensive tasks such as processing digital and/or analog data, controlling actuator systems, managing cameras, operating robotic manipulators and transmitting/receiving from/to a ground station. In addition, this flight computer will be designed to be fault tolerant by creating both a robust physical hardware connection and by using a software voting scheme to determine the processor's performance. This voting scheme will leverage on the work done for the Space Launch System (SLS) flight software. The prototype flight computer will be constructed with Commercial Off-The-Shelf (COTS) components which are estimated to survive for two years in a low-Earth orbit.
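
    A toy Python illustration of a software majority-voting scheme of the kind mentioned above; this is a generic sketch, not the SLS-derived voting logic referenced in the abstract:

        from collections import Counter

        def majority_vote(results):
            # results: one output per redundant processor/core.
            # Returns (value, healthy); healthy is False if no strict majority exists.
            counts = Counter(results)
            value, votes = counts.most_common(1)[0]
            return value, votes > len(results) // 2

        def run_redundant(task, inputs, n_copies=4):
            # Run the same task on each core (sequentially here for illustration)
            # and vote on the outputs; disagreeing cores would be flagged as faulty.
            outputs = [task(inputs) for _ in range(n_copies)]
            return majority_vote(outputs)

        # Example: vote over four redundant evaluations of the same computation.
        value, healthy = run_redundant(lambda xs: sum(xs), [1, 2, 3])
        print(value, healthy)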

  6. Computer applications in project KARP

    International Nuclear Information System (INIS)

    Raju, R.P.; Siddiqui, H.R.

    1992-01-01

    For effective project implementation of the Kalpakkam Reprocessing Plant (KARP) at Kalpakkam, an elaborate Management Information System (MIS) was developed in-house for physical and financial progress monitoring and reporting. Computer aided design software for the design of process piping layout was also developed and implemented for the generation of process cell piping drawings for construction purposes. Modelling and simulation studies were carried out to optimize process parameters, and fault tree analysis techniques were utilised for evaluating plant availability factors. (author). 2 tabs.

  7. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  8. Evolution of facility layout requirements and CAD [computer-aided design] system development

    International Nuclear Information System (INIS)

    Jones, M.

    1990-06-01

    The overall configuration of the Superconducting Super Collider (SSC) including the infrastructure and land boundary requirements were developed using a computer-aided design (CAD) system. The evolution of the facility layout requirements and the use of the CAD system are discussed. The emphasis has been on minimizing the amount of input required and maximizing the speed by which the output may be obtained. The computer system used to store the data is also described

  9. Power-Efficient Computing: Experiences from the COSA Project

    Directory of Open Access Journals (Sweden)

    Daniele Cesini

    2017-01-01

    Full Text Available Energy consumption is today one of the most relevant issues in operating HPC systems for scientific applications. The use of unconventional computing systems is therefore of great interest for several scientific communities looking for a better tradeoff between time-to-solution and energy-to-solution. In this context, the performance assessment of processors with a high ratio of performance per watt is necessary to understand how to realize energy-efficient computing systems for scientific applications, using this class of processors. Computing On SOC Architecture (COSA) is a three-year project (2015–2017) funded by the Scientific Commission V of the Italian Institute for Nuclear Physics (INFN), which aims to investigate the performance and the total cost of ownership offered by computing systems based on commodity low-power Systems on Chip (SoCs) and high energy-efficient systems based on GP-GPUs. In this work, we present the results of the project analyzing the performance of several scientific applications on several GPU- and SoC-based systems. We also describe the methodology we have used to measure energy performance and the tools we have implemented to monitor the power drained by applications while running.
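
    As a concrete note on the metric, energy-to-solution is the time integral of the drained power over a run; a small Python sketch, assuming the power monitor delivers timestamped samples:

        import numpy as np

        def energy_to_solution(timestamps_s, power_w):
            # Trapezoidal integration of sampled power (W) over time (s) gives energy in joules.
            return np.trapz(power_w, timestamps_s)

        # Example: a 100 s run sampled once per second at roughly 75 W average draw.
        t = np.arange(0, 101, 1.0)
        p = 75.0 + 5.0 * np.sin(t / 10.0)
        print(energy_to_solution(t, p), "J")   # roughly 75 W * 100 s = 7.5 kJ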

  10. Volunteer computing experience with ATLAS@Home

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00068610; The ATLAS collaboration; Bianchi, Riccardo-Maria; Cameron, David; Filipčič, Andrej; Lançon, Eric; Wu, Wenjing

    2016-01-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  11. Volunteer Computing Experience with ATLAS@Home

    CERN Document Server

    Cameron, David; The ATLAS collaboration; Bourdarios, Claire; Lançon, Eric

    2016-01-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers' resources make up a sizable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one job to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  12. Volunteer Computing Experience with ATLAS@Home

    Science.gov (United States)

    Adam-Bourdarios, C.; Bianchi, R.; Cameron, D.; Filipčič, A.; Isacchini, G.; Lançon, E.; Wu, W.; ATLAS Collaboration

    2017-10-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  13. Weierstrass Elliptic Function Solutions to Nonlinear Evolution Equations

    International Nuclear Information System (INIS)

    Yu Jianping; Sun Yongli

    2008-01-01

    This paper is based on the relations between projective Riccati equations and the Weierstrass elliptic equation, combined with Groebner bases in symbolic computation. Using these relations, a novel method for constructing Weierstrass elliptic function solutions to nonlinear evolution equations is given.
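
    For reference, the Weierstrass elliptic equation mentioned above is the defining ODE of the Weierstrass function, and travelling-wave solutions of an evolution equation are typically sought as finite expansions in that function (a schematic statement of the general approach, not this paper's specific construction):

        \left(\wp'(\xi)\right)^{2} = 4\,\wp^{3}(\xi) - g_{2}\,\wp(\xi) - g_{3},
        \qquad u(x,t) = U(\xi), \quad \xi = x - c t, \quad
        U(\xi) = a_{0} + \sum_{j=1}^{m} a_{j}\,\wp^{j}(\xi; g_{2}, g_{3}).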

  14. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Energy Technology Data Exchange (ETDEWEB)

    Brower, Richard [Boston U.; Christ, Norman [Columbia U.; DeTar, Carleton [Utah U.; Edwards, Robert [Jefferson Lab; Mackenzie, Paul [Fermilab

    2017-10-30

    In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020's. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  15. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Science.gov (United States)

    Brower, Richard; Christ, Norman; DeTar, Carleton; Edwards, Robert; Mackenzie, Paul

    2018-03-01

    In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020's. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  16. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Directory of Open Access Journals (Sweden)

    Brower Richard

    2018-01-01

    Full Text Available In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020’s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  17. Two-qubit quantum computing in a projected subspace

    International Nuclear Information System (INIS)

    Bi Qiao; Ruda, H.E.; Zhan, M.S.

    2002-01-01

    A formulation for performing quantum computing in a projected subspace is presented, based on the subdynamical kinetic equation (SKE) for an open quantum system. The eigenvectors of the kinetic equation are shown to remain invariant before and after interaction with the environment. However, the eigenvalues in the projected subspace exhibit a type of phase shift to the evolutionary states. This phase shift does not destroy the decoherence-free (DF) property of the subspace because the associated fidelity is 1. This permits a universal formalism to be presented--the eigenprojectors of the free part of the Hamiltonian for the system and bath may be used to construct a DF projected subspace based on the SKE. To eliminate possible phase or unitary errors induced by the change in the eigenvalues, a cancellation technique is proposed, using the adjustment of the coupling time, and applied to a two-qubit computing system. A general criterion for constructing a DF projected subspace from the SKE is discussed. Finally, a proposal for using triangulation to realize a decoherence-free subsystem based on the SKE is presented. The concrete formulation for a two-qubit model is given exactly. Our approach is general and appears to be applicable to any type of decoherence.

  18. Evolution of Project-Based Learning in Small Groups in Environmental Engineering Courses

    Science.gov (United States)

    Requies, Jesús M.; Agirre, Ion; Barrio, V. Laura; Graells, Moisès

    2018-01-01

    This work presents the assessment of the development and evolution of an active methodology (Project-Based Learning--PBL) implemented on the course "Unit Operations in Environmental Engineering", within the bachelor's degree in Environmental Engineering, with the purpose of decreasing the dropout rate in this course. After the initial…

  19. Evolution of the GATE project: new results and developments

    Energy Technology Data Exchange (ETDEWEB)

    Santin, G. [ESA-ESTEC, Keplerlaan 1, 2200 AG Noordwijk (Netherlands); Staelens, S. [ELIS Department, Ghent University, B-9000 Ghent (Belgium); Taschereau, R. [CRUMP Institute for Molecular Imaging, University of California Los Angeles, 700 Westwood Plaza A438, Los Angeles, CA 90095-1770 (United States); Descourt, P. [U650 INSERM, LaTIM, Brest (France); Schmidtlein, C.R. [Memorial Sloan-Kettering Cancer Center, New York, New York, US (United States); Simon, L. [Department of Radiation Oncology, Institut Curie, Paris (France); Visvikis, D. [U650 INSERM, LaTIM, Brest (France); Jan, S. [Service Hospitalier Frederic Joliot (SHFJ), CEA-Orsay, Orsay (France); Buvat, I. [U678 INSERM, CHU Pitie-Salpetriere, Paris (France)

    2007-10-15

    We present the status of the Geant4 Application for Emission Tomography (GATE) project, a Monte Carlo simulator for Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). Its main features are recalled, including the modelling of time-dependent phenomena and a versatile, user-friendly scripting interface. The focus of this manuscript will be on new developments introduced in the past 4 years. New results have been achieved in the fields of validation on real medical and research PET and SPECT systems, voxel geometries, digitisation, distributed computing and dosimetry.

  20. Evolution of the GATE project: new results and developments

    International Nuclear Information System (INIS)

    Santin, G.; Staelens, S.; Taschereau, R.; Descourt, P.; Schmidtlein, C.R.; Simon, L.; Visvikis, D.; Jan, S.; Buvat, I.

    2007-01-01

    We present the status of the Geant4 Application for Emission Tomography (GATE) project, a Monte Carlo simulator for Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). Its main features are recalled, including the modelling of time-dependent phenomena and a versatile, user-friendly scripting interface. The focus of this manuscript will be on new developments introduced in the past 4 years. New results have been achieved in the fields of validation on real medical and research PET and SPECT systems, voxel geometries, digitisation, distributed computing and dosimetry.

  1. Quantum ballistic evolution in quantum mechanics: Application to quantum computers

    International Nuclear Information System (INIS)

    Benioff, P.

    1996-01-01

    Quantum computers are important examples of processes whose evolution can be described in terms of iterations of single-step operators or their adjoints. Based on this, Hamiltonian evolution of processes with associated step operators T is investigated here. The main limitation of this paper is to processes which evolve quantum ballistically, i.e., motion restricted to a collection of nonintersecting or distinct paths on an arbitrary basis. The main goal of this paper is proof of a theorem which gives necessary and sufficient conditions that T must satisfy so that there exists a Hamiltonian description of quantum ballistic evolution for the process, namely, that T is a partial isometry and is orthogonality preserving and stable on some basis. Simple examples of quantum ballistic evolution for quantum Turing machines with one and with more than one type of elementary step are discussed. It is seen that for nondeterministic machines the basis set can be quite complex with much entanglement present. It is also proven that, given a step operator T for an arbitrary deterministic quantum Turing machine, it is decidable if T is stable and orthogonality preserving, and if quantum ballistic evolution is possible. The proof fails if T is a step operator for a nondeterministic machine. It is an open question if such a decision procedure exists for nondeterministic machines. This problem does not occur in classical mechanics. Also the definition of quantum Turing machines used here is compared with that used by other authors. copyright 1996 The American Physical Society
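
    For context, the partial-isometry requirement on the step operator T has a compact standard characterization (general operator theory, not specific to this paper's proof):

        T \text{ is a partial isometry} \;\Longleftrightarrow\; T^{\dagger}T \text{ is an orthogonal projection} \;\Longleftrightarrow\; T\,T^{\dagger}T = T .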

  2. Computational performance of a projection and rescaling algorithm

    OpenAIRE

    Pena, Javier; Soheili, Negar

    2018-01-01

    This paper documents a computational implementation of a projection and rescaling algorithm for finding most interior solutions to the pair of feasibility problems: find $x \in L \cap \mathbb{R}^n_{+}$ and find $\hat x \in L^\perp \cap \mathbb{R}^n_{+}$, where $L$ denotes a linear subspace in $\mathbb{R}^n$ and $L^\perp$ denotes its orthogonal complement. The projection and rescaling algorithm is a recently developed method that combines a ...

  3. Computing the maximum volume inscribed ellipsoid of a polytopic projection

    NARCIS (Netherlands)

    Zhen, Jianzhe; den Hertog, Dick

    We introduce a novel scheme based on a blending of Fourier-Motzkin elimination (FME) and adjustable robust optimization techniques to compute the maximum volume inscribed ellipsoid (MVE) in a polytopic projection. It is well-known that deriving an explicit description of a projected polytope is

  4. Computing the Maximum Volume Inscribed Ellipsoid of a Polytopic Projection

    NARCIS (Netherlands)

    Zhen, J.; den Hertog, D.

    2015-01-01

    We introduce a novel scheme based on a blending of Fourier-Motzkin elimination (FME) and adjustable robust optimization techniques to compute the maximum volume inscribed ellipsoid (MVE) in a polytopic projection. It is well-known that deriving an explicit description of a projected polytope is
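
    For comparison with the projected-polytope setting studied in the two records above, the classical maximum volume inscribed ellipsoid of an explicitly described polytope {x : Ax <= b} can be computed as a convex program; a Python/cvxpy sketch under that assumption (an explicit A and b are exactly what a projection does not provide):

        import cvxpy as cp
        import numpy as np

        def max_volume_inscribed_ellipsoid(A, b):
            # Ellipsoid parametrized as E = { B u + d : ||u|| <= 1 } with B symmetric PSD.
            # E lies inside {x : A x <= b} iff ||B a_i|| + a_i^T d <= b_i for every row a_i.
            n = A.shape[1]
            B = cp.Variable((n, n), symmetric=True)
            d = cp.Variable(n)
            constraints = [cp.norm(B @ A[i]) + A[i] @ d <= b[i] for i in range(A.shape[0])]
            prob = cp.Problem(cp.Maximize(cp.log_det(B)), constraints)
            prob.solve()
            return B.value, d.value

        # Example: the unit box [-1, 1]^2; expect B close to the identity and d close to 0.
        A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
        b = np.ones(4)
        B, d = max_volume_inscribed_ellipsoid(A, b)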

  5. Distributed and grid computing projects with research focus in human health.

    Science.gov (United States)

    Diomidous, Marianna; Zikos, Dimitrios

    2012-01-01

    Distributed systems and grid computing systems are used to connect several computers to obtain a higher level of performance, in order to solve a problem. During the last decade, projects use the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large scale distributed and grid computing projects with research focus in human health. Eleven active projects with more than 2000 Processing Units (PUs) each have been found and are presented. The research focus for most of them is molecular biology and, specifically, on understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease provoking genes and drug design. Though not in all cases explicitly stated, common target diseases include research to find a cure for HIV, dengue, Duchenne dystrophy, Parkinson's disease, various types of cancer and influenza. Other diseases include malaria, anthrax, Alzheimer's disease. The need for national initiatives and European Collaboration for larger scale projects is stressed, to raise the awareness of citizens to participate in order to create a culture of internet volunteering altruism.

  6. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    Science.gov (United States)

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  7. Computer-aided engineering for Qinshan CANDU projects

    International Nuclear Information System (INIS)

    Huang Zhizhang; Goland, D.

    1999-01-01

    The author briefly describes AECL's work in applying computer-aided engineering tools to the Qinshan CANDU Project. The main emphasis is on introducing the major CADD software tools and their use in civil design, process design and EI and C design. Other special software tools and non-CADD tools and their applications are also briefly introduced.

  8. Selection Finder (SelFi: A computational metabolic engineering tool to enable directed evolution of enzymes

    Directory of Open Access Journals (Sweden)

    Neda Hassanpour

    2017-06-01

    Full Text Available Directed evolution of enzymes consists of an iterative process of creating mutant libraries and choosing desired phenotypes through screening or selection until the enzymatic activity reaches a desired goal. The biggest challenge in directed enzyme evolution is identifying high-throughput screens or selections to isolate the variant(s) with the desired property. We present in this paper a computational metabolic engineering framework, Selection Finder (SelFi), to construct a selection pathway from a desired enzymatic product to a cellular host and to couple the pathway with cell survival. We applied SelFi to construct selection pathways for four enzymes and their desired enzymatic products xylitol, D-ribulose-1,5-bisphosphate, methanol, and aniline. Two of the selection pathways identified by SelFi were previously experimentally validated for engineering Xylose Reductase and RuBisCO. Importantly, SelFi advances directed evolution of enzymes as there are currently no known generalized strategies or computational techniques for identifying high-throughput selections for engineering enzymes.

  9. The models of the life cycle of a computer system

    Directory of Open Access Journals (Sweden)

    Sorina-Carmen Luca

    2006-01-01

    Full Text Available The paper presents a comparative study on the patterns of the life cycle of a computer system. The advantages of each pattern are analyzed, and graphic schemes are presented that point out each stage and step in the evolution of a computer system. In the end, the classifications of the methods of designing computer systems are discussed.

  10. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

    To reduce the patient dose, several approaches, such as spectral imaging using photon-counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images with low dose, true signal-to-noise properties are mainly determined by image quality in projections. We are developing an analytical simulation platform describing projections to investigate how quantum-interaction physics in each component configuring CT systems affects image quality in projections. This simulator will be very useful for improved and economical design or optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. The remaining work before the meeting includes the following: Each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations. We will build up energy-dependent scatter and pixel-crosstalk kernels, and show their effects on image quality in projections and reconstructed images. We will investigate the effects of projections obtained from various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics due to photon-counting detectors into the simulation platform. Detailed descriptions of the simulator will be presented with discussions on its performance and limitation as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method in this study is simple and can be used conveniently in a lab environment.
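
    A minimal Python sketch of the projection-generation stage described above: a single-view parallel-beam forward projection through a 2-D attenuation map, followed by Beer-Lambert attenuation and Poisson counting noise; the real cascaded model in the abstract additionally includes energy-dependent scatter and pixel-crosstalk kernels:

        import numpy as np

        def forward_projection(mu_map, pixel_size_cm, i0=1e5, rng=None):
            # Parallel-beam rays travel along image columns (a single 0-degree view).
            line_integrals = mu_map.sum(axis=0) * pixel_size_cm   # unitless (mu in 1/cm)
            expected_counts = i0 * np.exp(-line_integrals)         # Beer-Lambert law
            rng = rng or np.random.default_rng()
            return rng.poisson(expected_counts)                    # quantum (Poisson) noise

        # Example: a water-like disc (mu ~ 0.2 /cm) inside air.
        n = 128
        y, x = np.mgrid[:n, :n]
        mu_map = np.where((x - n / 2) ** 2 + (y - n / 2) ** 2 < (n / 4) ** 2, 0.2, 0.0)
        projection = forward_projection(mu_map, pixel_size_cm=0.1)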

  11. Dynamic computer model for the metallogenesis and tectonics of the Circum-North Pacific

    Science.gov (United States)

    Scotese, Christopher R.; Nokleberg, Warren J.; Monger, James W.H.; Norton, Ian O.; Parfenov, Leonid M.; Khanchuk, Alexander I.; Bundtzen, Thomas K.; Dawson, Kenneth M.; Eremin, Roman A.; Frolov, Yuri F.; Fujita, Kazuya; Goryachev, Nikolai A.; Pozdeev, Anany I.; Ratkin, Vladimir V.; Rodinov, Sergey M.; Rozenblum, Ilya S.; Scholl, David W.; Shpikerman, Vladimir I.; Sidorov, Anatoly A.; Stone, David B.

    2001-01-01

    The digital files on this report consist of a dynamic computer model of the metallogenesis and tectonics of the Circum-North Pacific, and background articles, figures, and maps. The tectonic part of the dynamic computer model is derived from a major analysis of the tectonic evolution of the Circum-North Pacific which is also contained in directory tectevol. The dynamic computer model and associated materials on this CD-ROM are part of a project on the major mineral deposits, metallogenesis, and tectonics of the Russian Far East, Alaska, and the Canadian Cordillera. The project provides critical information on bedrock geology and geophysics, tectonics, major metalliferous mineral resources, metallogenic patterns, and crustal origin and evolution of mineralizing systems for this region. The major scientific goals and benefits of the project are to: (1) provide a comprehensive international data base on the mineral resources of the region that is the first, extensive knowledge available in English; (2) provide major new interpretations of the origin and crustal evolution of mineralizing systems and their host rocks, thereby enabling enhanced, broad-scale tectonic reconstructions and interpretations; and (3) promote trade and scientific and technical exchanges between North America and Eastern Asia.

  12. Exploring Students' Computational Thinking Skills in Modeling and Simulation Projects: : A Pilot Study

    NARCIS (Netherlands)

    Grgurina, Natasa; van Veen, Klaas; Barendsen, Erik; Zwaneveld, Bert; Suhre, Cor; Gal-Ezer, Judith; Sentance, Sue; Vahrenhold, Jan

    2015-01-01

    Computational Thinking (CT) is gaining a lot of attention in education. We explored how to discern the occurrences of CT in the projects of 12th grade high school students in the computer science (CS) course. Within the projects, they constructed models and ran simulations of phenomena from other

  13. apeNEXT A Multi-Tflops LQCD Computing Project

    CERN Document Server

    Alfieri, R; Onofri, E.; Bartoloni, A.; Battista, C.; Cabibbo, N.; Cosimi, M.; Lonardo, A.; Michelotti, A.; Rapuano, F.; Proietti, B.; Rossetti, D.; Sacco, G.; Tassa, S.; Torelli, M.; Vicini, P.; Boucaud, Philippe; Pene, O.; Errico, W.; Magazzu, G.; Sartori, L.; Schifano, F.; Tripiccione, R.; De Riso, P.; Petronzio, R.; Destri, C.; Frezzotti, R.; Marchesini, G.; Gensch, U.; Kretzschmann, A.; Leich, H.; Paschedag, N.; Schwendicke, U.; Simma, H.; Sommer, R.; Sulanke, K.; Wegner, P.; Pleiter, D.; Jansen, K.; Fucci, A.; Martin, B.; Pech, J.; Panizzi, E.; Petricola, A.

    2001-01-01

    This paper is a slightly modified and reduced version of the proposal of the apeNEXT project, which was submitted to DESY and INFN in spring 2000. It presents the basic motivations and ideas of a next-generation lattice QCD (LQCD) computing project, whose goal is the construction and operation of several large scale Multi-TFlops LQCD engines, providing an integrated peak performance of tens of TFlops, and a sustained (double precision) performance on key LQCD kernels of about 50% of peak speed.

  14. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    Integration of case study approach, project design and computer modeling in managerial accounting education ... Journal of Fundamental and Applied Sciences ... in the Laboratory of Management Accounting and Controlling Systems at the ...

  15. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    Science.gov (United States)

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  16. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently setup a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithms improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving an enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the Atlas Computing Agora (ACA) web platform will be presented as well as some of the specific material developed for some of the projects.

  17. The LHC Computing Grid Project

    CERN Multimedia

    Åkesson, T

    In the last ATLAS eNews I reported on the preparations for the LHC Computing Grid Project (LCGP). Significant LCGP resources were mobilized during the summer, and there have been numerous iterations on the formal paper to put forward to the CERN Council to establish the LCGP. ATLAS, and also the other LHC-experiments, has been very active in this process to maximally influence the outcome. Our main priorities were to ensure that the global aspects are properly taken into account, that the CERN non-member states are also included in the structure, that the experiments are properly involved in the LCGP execution and that the LCGP takes operative responsibility during the data challenges. A Project Launch Board (PLB) was active from the end of July until the 10th of September. It was chaired by Hans Hoffmann and had the IT division leader as secretary. Each experiment had a representative (me for ATLAS), and the large CERN member states were each represented while the smaller were represented as clusters ac...

  18. Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab

    Science.gov (United States)

    Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.

    2017-10-01

    The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called

  19. Grid computing the European Data Grid Project

    CERN Document Server

    Segal, B; Gagliardi, F; Carminati, F

    2000-01-01

    The goal of this project is the development of a novel environment to support globally distributed scientific exploration involving multi- PetaByte datasets. The project will devise and develop middleware solutions and testbeds capable of scaling to handle many PetaBytes of distributed data, tens of thousands of resources (processors, disks, etc.), and thousands of simultaneous users. The scale of the problem and the distribution of the resources and user community preclude straightforward replication of the data at different sites, while the aim of providing a general purpose application environment precludes distributing the data using static policies. We will construct this environment by combining and extending newly emerging "Grid" technologies to manage large distributed datasets in addition to computational elements. A consequence of this project will be the emergence of fundamental new modes of scientific exploration, as access to fundamental scientific data is no longer constrained to the producer of...

  20. The Challenge '88 Project: Interfacing of Chemical Instruments to Computers.

    Science.gov (United States)

    Lyons, Jim; Verghese, Manoj

    The main part of this project involved using a computer, either an Apple or an IBM, as a chart recorder for the infrared (IR) and nuclear magnetic resonance (NMR) spectrophotometers. The computer "reads" these machines and displays spectra on its monitor. The graphs can then be stored for future reference and manipulation. The program to…

  1. Remote handling prospects. Computer aided remote handling

    International Nuclear Information System (INIS)

    Vertut, J.

    1984-01-01

    Mechanical manipulators, electrical control manipulators and computer aided manipulators were successively developed. The aim of computer aided manipulators is the realization of complex or tricky jobs in adverse environments, but a human operator is still required for non-routine work or for evolving situations. The French effort is developed in the framework of the project on automation and advanced robotics, and new problems have to be solved, particularly at the man/machine interface [fr

  2. Click! 101 Computer Activities and Art Projects for Kids and Grown-Ups.

    Science.gov (United States)

    Bundesen, Lynne; And Others

    This book presents 101 computer activities and projects geared toward children and adults. The activities for both personal computers (PCs) and Macintosh were developed on the Windows 95 computer operating system, but they are adaptable to non-Windows personal computers as well. The book is divided into two parts. The first part provides an…

  3. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Forest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bochev, Pavel B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cameron-Smith, Philip J.. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Easter, Richard C [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Scott M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ghan, Steven J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Xiaohong [Univ. of Wyoming, Laramie, WY (United States); Lowrie, Robert B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ma, Po-lun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sacks, William J. [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Shrivastava, Manish [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Singh, Balwinder [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tautges, Timothy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Taylor, Mark A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Vertenstein, Mariana [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Worley, Patrick H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  4. The QUANTGRID Project (RO)—Quantum Security in GRID Computing Applications

    Science.gov (United States)

    Dima, M.; Dulea, M.; Petre, M.; Petre, C.; Mitrica, B.; Stoica, M.; Udrea, M.; Sterian, R.; Sterian, P.

    2010-01-01

    The QUANTGRID Project, financed through the National Center for Programme Management (CNMP-Romania), is the first attempt at using Quantum Crypted Communications (QCC) in large scale operations, such as GRID Computing, and conceivably in the years ahead in the banking sector and other security tight communications. In relation with the GRID activities of the Center for Computing & Communications (Nat.'l Inst. Nucl. Phys.—IFIN-HH), the Quantum Optics Lab. (Nat.'l Inst. Plasma and Lasers—INFLPR) and the Physics Dept. (University Polytechnica—UPB) the project will build a demonstrator infrastructure for this technology. The status of the project in its incipient phase is reported, featuring tests for communications in classical security mode: socket level communications under AES (Advanced Encryption Std.), both proprietary code in C++ technology. An outline of the planned undertaking of the project is communicated, highlighting its impact in quantum physics, coherent optics and information technology.

  5. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  6. Computer-Automated Evolution of Spacecraft X-Band Antennas

    Science.gov (United States)

    Lohn, Jason D.; Homby, Gregory S.; Linden, Derek S.

    2010-01-01

    A document discusses the use of computer-aided evolution in arriving at a design for X-band communication antennas for NASA's three Space Technology 5 (ST5) satellites, which were launched on March 22, 2006. Two evolutionary algorithms, incorporating different representations of the antenna design and different fitness functions, were used to automatically design and optimize an X-band antenna design. A set of antenna designs satisfying initial ST5 mission requirements was evolved by use of these algorithms. The two best antennas - one from each evolutionary algorithm - were built. During flight-qualification testing of these antennas, the mission requirements were changed. After minimal changes in the evolutionary algorithms - mostly in the fitness functions - new antenna designs satisfying the changed mission requirements were evolved, and within one month of this change, two new antennas were designed and prototypes of the antennas were built and tested. One of these newly evolved antennas was approved for deployment on the ST5 mission, and flight-qualified versions of this design were built and installed on the spacecraft. At the time of writing the document, these antennas were the first computer-evolved hardware in outer space.
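
    The generic shape of such an evolutionary loop is easy to sketch in Python: a (1+lambda) strategy that mutates a real-valued design vector and keeps the best child only if its fitness improves. This is not the ST5 antenna software; the vector length, mutation scale and the stand-in "fitness" (distance to an arbitrary target) are invented for illustration.

      # Generic (1 + lambda) evolutionary loop on a design vector (illustration only).
      import numpy as np

      rng = np.random.default_rng(1)
      target = rng.uniform(-1, 1, size=8)              # pretend optimum (e.g. wire-segment angles)
      fitness = lambda x: -np.linalg.norm(x - target)  # stand-in for an antenna simulation score

      parent = rng.uniform(-1, 1, size=8)
      for generation in range(200):
          children = parent + rng.normal(0.0, 0.1, size=(20, 8))     # mutate
          best = children[np.argmax([fitness(c) for c in children])]
          if fitness(best) > fitness(parent):                        # select
              parent = best
      print(round(-fitness(parent), 4))                # remaining distance to the target design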

  7. Really Large Scale Computer Graphic Projection Using Lasers and Laser Substitutes

    Science.gov (United States)

    Rother, Paul

    1989-07-01

    This paper reflects on past laser projects to display vector scanned computer graphic images onto very large and irregular surfaces. Since the availability of microprocessors and high powered visible lasers, very large scale computer graphics projection has become a reality. Due to the independence from a focusing lens, lasers easily project onto distant and irregular surfaces and have been used for amusement parks, theatrical performances, concert performances, industrial trade shows and dance clubs. Lasers have been used to project onto mountains, buildings, 360° globes, clouds of smoke and water. These methods have proven successful in installations at Epcot Theme Park in Florida, Stone Mountain Park in Georgia, and the 1984 Olympics in Los Angeles, as well as in hundreds of corporate trade shows and thousands of musical performances. Using new ColorRayTM technology, the use of costly and fragile lasers is no longer necessary. Utilizing fiber optic technology, the functionality of lasers can be duplicated for new and exciting projection possibilities. The use of ColorRayTM technology has enjoyed worldwide recognition in conjunction with Pink Floyd's and George Michael's worldwide tours.

  8. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  9. Volcanoes: Where and Why? Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    Science.gov (United States)

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  10. The evolution of the Waste Isolation Pilot Plant (WIPP) project's public affairs program

    International Nuclear Information System (INIS)

    Walter, L.H.

    1988-01-01

    As a first-of-a-kind facility, the Waste Isolation Pilot Plant (WIPP) presents a unique perspective on the value of designing a public affairs program that grows with and complements a project's evolution from construction to operations. Like the project itself, the public affairs program progressed through several stages to its present scope. During the construction phase, foundations were laid in the community. Then, in the past year, as the project entered preoperational status, emphasis shifted to broadening the positive image that had been created locally. In this stage, public affairs presented the project's positive elements to the various state agencies, government officials, and federal organizations involved in our country's radioactive waste management program. Most recently, and continuing until receipt of the first shipment of waste in October 1988, an even broader, more aggressive public affairs program is planned

  11. The UK Human Genome Mapping Project online computing service.

    Science.gov (United States)

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.

  12. The Cc1 Project – System For Private Cloud Computing

    Directory of Open Access Journals (Sweden)

    J Chwastowski

    2012-01-01

    Full Text Available The main features of the Cloud Computing system developed at IFJ PAN are described. The project is financed from the structural resources provided by the European Commission and the Polish Ministry of Science and Higher Education (Innovative Economy, National Cohesion Strategy). The system delivers a solution for carrying out computer calculations on a Private Cloud computing infrastructure. It consists of an intuitive Web based user interface, a module for user and resource administration, and the standard EC2 interface implementation. Thanks to the distributed character of the system, it allows for the integration of a geographically distant federation of computer clusters within a uniform user environment.

  13. Projection matrix acquisition for cone-beam computed tomography iterative reconstruction

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Shi, Wenlong; Zhang, Caixin; Gao, Zongzhao

    2017-02-01

    The projection matrix is an essential and time-consuming part of computed tomography (CT) iterative reconstruction. In this article a novel calculation algorithm for the three-dimensional (3D) projection matrix is proposed to quickly acquire the matrix for cone-beam CT (CBCT). The CT volume to be reconstructed is considered as consisting of three orthogonal sets of equally spaced, parallel planes rather than of individual voxels. After obtaining the intersections of the rays with the surfaces of the voxels, the coordinates of the intersection points are compared with the voxel vertices to obtain the indices of the voxels that the ray traversed. Without considering the slope of the ray with respect to each voxel, only the positions of two points need to be compared. Finally, a computer simulation is used to verify the effectiveness of the algorithm.
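
    A much-simplified 2-D Python sketch of this plane-intersection idea (a Siddon-style traversal) is given below: the ray is intersected with the grid's x- and y-planes, the sorted intersection parameters define segments, and the midpoint of each segment identifies the pixel it crosses. The grid size and ray endpoints are assumptions, and the algorithm in the article works in 3-D.

      # Siddon-style traversal of a 2-D pixel grid (simplified illustration).
      import numpy as np

      def traversed_pixels(p0, p1, n=4):
          """Yield (ix, iy, length) for pixels of an n x n unit grid crossed by the ray p0 -> p1."""
          p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
          d = p1 - p0
          alphas = [0.0, 1.0]
          for axis in range(2):                        # planes x = 0..n and y = 0..n
              if d[axis] != 0:
                  a = (np.arange(n + 1) - p0[axis]) / d[axis]
                  alphas.extend(a[(a > 0) & (a < 1)])
          alphas = np.unique(alphas)                   # sorted intersection parameters
          for a0, a1 in zip(alphas[:-1], alphas[1:]):
              mid = p0 + 0.5 * (a0 + a1) * d           # segment midpoint identifies the pixel
              ix, iy = np.floor(mid).astype(int)
              if 0 <= ix < n and 0 <= iy < n:
                  yield ix, iy, (a1 - a0) * np.linalg.norm(d)

      for entry in traversed_pixels((-0.5, 0.2), (4.5, 3.7)):
          print(entry)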

  14. Technological Evolution on Computed Tomography and Radioprotection

    Energy Technology Data Exchange (ETDEWEB)

    Leite, Bruno Barros; Ribeiro, Nuno Carrilho [Servico de Radiologia, Hospital de Curry Cabral, Rua da Beneficencia, 8, 1069-166 Lisboa (Portugal)

    2006-05-15

    Computed Tomography (CT) has been available since the 70s and has experienced a dramatic technical evolution. Multi-detector technology is our current standard, offering capabilities unthinkable only a decade ago. Yet, we must not forget the ionizing nature of CT's scanning energy (X-rays). It represents the most important cause of medical-associated radiation exposure to the general public, with a trend to increase. It is compulsory to intervene with the objective of dose reduction, following ALARA policies. Currently there are some technical advances that allow dose reduction without sacrificing diagnostic image capabilities. However, human intervention is also essential. We must keep investing in education so that CT exams are done when they are really useful for clinical decision-making. Alternative techniques should also be considered. Image quality must not be pursued while disregarding the biological effects of radiation. Generally, it is possible to obtain clinically acceptable images with lower dose protocols. (author)

  15. Technological Evolution on Computed Tomography and Radioprotection

    International Nuclear Information System (INIS)

    Leite, Bruno Barros; Ribeiro, Nuno Carrilho

    2006-01-01

    Computed Tomography (CT) has been available since the 70s and has experienced a dramatic technical evolution. Multi-detector technology is our current standard, offering capabilities unthinkable only a decade ago. Yet, we must not forget the ionizing nature of CT's scanning energy (X-rays). It represents the most important cause of medical-associated radiation exposure to the general public, with a trend to increase. It is compulsory to intervene with the objective of dose reduction, following ALARA policies. Currently there are some technical advances that allow dose reduction without sacrificing diagnostic image capabilities. However, human intervention is also essential. We must keep investing in education so that CT exams are done when they are really useful for clinical decision-making. Alternative techniques should also be considered. Image quality must not be pursued while disregarding the biological effects of radiation. Generally, it is possible to obtain clinically acceptable images with lower dose protocols. (author)

  16. Advances in Grid Computing for the FabrIc for Frontier Experiments Project at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Herner, K. [Fermilab; Alba Hernandex, A. F. [Fermilab; Bhat, S. [Fermilab; Box, D. [Fermilab; Boyd, J. [Fermilab; Di Benedetto, V. [Fermilab; Ding, P. [Fermilab; Dykstra, D. [Fermilab; Fattoruso, M. [Fermilab; Garzoglio, G. [Fermilab; Kirby, M. [Fermilab; Kreymer, A. [Fermilab; Levshina, T. [Fermilab; Mazzacane, A. [Fermilab; Mengel, M. [Fermilab; Mhashilkar, P. [Fermilab; Podstavkov, V. [Fermilab; Retzke, K. [Fermilab; Sharma, N. [Fermilab; Teheran, J. [Fermilab

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed

  17. PERPHECLIM ACCAF Project - Perennial fruit crops and forest phenology evolution facing climatic changes

    Science.gov (United States)

    Garcia de Cortazar-Atauri, Iñaki; Audergon, Jean Marc; Bertuzzi, Patrick; Anger, Christel; Bonhomme, Marc; Chuine, Isabelle; Davi, Hendrik; Delzon, Sylvain; Duchêne, Eric; Legave, Jean Michel; Raynal, Hélène; Pichot, Christian; Van Leeuwen, Cornelis; Perpheclim Team

    2015-04-01

    Phenology is a bio-indicator of climate evolution. Measurements of phenological stages of perennial species already provide significant illustrations and assessments of the impact of climate change. Phenology is also one of the main key characteristics of the capacity of adaptation of perennial species, generating questions about the consequences for plant growth and development or for fruit quality. Predicting the evolution of phenology and the adaptive capacities of perennial species requires overcoming three main methodological limitations: 1) existing observations and associated databases are scattered and sometimes incomplete, making multi-site studies of genotype-environment interactions difficult to implement; 2) there are no common protocols for observing phenological stages; 3) access to generic phenological modelling platforms is still very limited. In this context, the PERPHECLIM project, which is funded by the Adapting Agriculture and Forestry to Climate Change Meta-Program (ACCAF) of INRA (French National Institute of Agronomic Research), has the objective of developing the necessary infrastructure at INRA level (observatories, information system, modeling tools) to enable partners to study the phenology of various perennial species (grapevine, fruit trees and forest trees). Currently the PERPHECLIM project involves 27 research units in France. The main activities currently under development are: defining protocols and observation forms to observe phenology for the various species of interest to the project; organizing observation training; developing generic modeling solutions to simulate phenology (the Phenological Modelling Platform and modelling platform solutions); supporting the building of research projects at national and international level; developing environment/genotype observation networks for fruit tree species; and developing an information system managing data and documentation concerning phenology. Finally, the PERPHECLIM project aims to build strong collaborations with public

  18. Project Management of a personnel radiation records computer system

    International Nuclear Information System (INIS)

    Labenski, T.

    1984-01-01

    Project Management techniques have been used to develop a data base management information system to provide storage and retrieval of personnel radiation and Health Physics records. The system is currently being developed on a Hewlett Packard 1000 Series E Computer with provisions to include plant radiation survey information, radiation work permit information, inventory management for Health Physics supplies and instrumentation, and control of personnel access to radiological controlled areas. The methodologies used to manage the overall project are presented along with selection and management of software vendors

  19. LightKone Project: Lightweight Computation for Networks at the Edge

    OpenAIRE

    Van Roy, Peter; TEKK Tour Digital Wallonia

    2017-01-01

    LightKone combines two recent advances in distributed computing to enable general-purpose computing on edge networks: * Synchronization-free programming: Large-scale applications can run efficiently on edge networks by using convergent data structures (based on Lasp and Antidote from previous project SyncFree) → tolerates dynamicity and loose coupling of edge networks * Hybrid gossip: Communication can be made highly resilient on edge networks by combining gossip with classical distributed al...
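
    A minimal Python sketch of a convergent ("synchronization-free") data structure is shown below: a grow-only counter whose merge is commutative and idempotent, so replicas that gossip in any order converge to the same value. It is only an illustration of the programming style, not the Lasp or Antidote API, and the node identifiers are hypothetical.

      # Grow-only counter CRDT: merges commute, so gossip order does not matter.
      class GCounter:
          def __init__(self, node_id):
              self.node_id = node_id
              self.counts = {}                                   # per-node increment totals

          def increment(self, n=1):
              self.counts[self.node_id] = self.counts.get(self.node_id, 0) + n

          def merge(self, other):
              for node, c in other.counts.items():               # element-wise maximum
                  self.counts[node] = max(self.counts.get(node, 0), c)

          def value(self):
              return sum(self.counts.values())

      a, b = GCounter("edge-A"), GCounter("edge-B")
      a.increment(3); b.increment(5)
      a.merge(b); b.merge(a)                                     # gossip in either order converges
      print(a.value(), b.value())                                # 8 8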

  20. Computer simulation of the time evolution of a quenched model alloy in the nucleation region

    International Nuclear Information System (INIS)

    Marro, J.; Lebowitz, J.L.; Kalos, M.H.

    1979-01-01

    The time evolution of the structure function and of the cluster (or grain) distribution following quenching in a model binary alloy with a small concentration of minority atoms is obtained from computer simulations. The structure function S̄(k,t) obeys a simple scaling relation, S̄(k,t) = K⁻³ F(k/K), with K(t) ∝ t⁻ᵃ and a ≈ 0.25, during the latter and larger part of the evolution. During the same period, the mean cluster size grows approximately linearly with time

  1. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States); Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub
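
    The static/dynamic split described above can be caricatured with a toy Python sketch: a geometry-dependent sparse system matrix is "pre-computed" once and then reused for every forward and back projection inside an iterative loop, so that only cheap sparse matrix-vector products remain per iteration. The matrix here is random, and the sizes, step size and loop are arbitrary assumptions rather than the LIVE implementation.

      # Pre-computed (static) sparse projector reused across (dynamic) iterations.
      import numpy as np
      from scipy.sparse import random as sparse_random

      n_rays, n_voxels = 2000, 4096
      A = sparse_random(n_rays, n_voxels, density=0.01, random_state=0, format="csr")

      def project(volume):                     # dynamic part: sparse mat-vec per iteration
          return A @ volume

      def backproject(residual):
          return A.T @ residual

      truth = np.random.default_rng(0).random(n_voxels)
      measured = project(truth)
      x = np.zeros(n_voxels)
      for _ in range(10):                      # toy gradient iterations reusing the same A
          x += 2.5e-4 * backproject(measured - project(x))
      print(float(np.linalg.norm(measured - project(x))))   # residual after a few iterations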

  2. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    International Nuclear Information System (INIS)

    Iliopoulos, AS; Sun, X; Pitsianis, N; Yin, FF; Ren, L

    2015-01-01

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub

  3. "Simulated molecular evolution" or computer-generated artifacts?

    Science.gov (United States)

    Darius, F; Rojas, R

    1994-11-01

    1. The authors define a function with value 1 for the positive examples and 0 for the negative ones. They fit a continuous function but do not deal at all with the error margin of the fit, which is almost as large as the function values they compute. 2. The term "quality" for the value of the fitted function gives the impression that some biological significance is associated with values of the fitted function strictly between 0 and 1, but there is no justification for this kind of interpretation, and finding the point where the fit achieves its maximum does not make sense. 3. By neglecting the error margin the authors try to optimize the fitted function using differences in the second, third, fourth, and even fifth decimal place, which have no statistical significance. 4. Even if such a fit could profit from more data points, the authors should first prove that the region of interest has some kind of smoothness, that is, that a continuous fit makes any sense at all. 5. "Simulated molecular evolution" is a misnomer. We are dealing here with random search. Since the margin of error is so large, the fitted function does not provide statistically significant information about the points in search space where strings with cleavage sites could be found. This implies that the method is a highly unreliable stochastic search in the space of strings, even if the neural network is capable of learning some simple correlations. 6. For these kinds of problems with so few data points, classical statistical methods are clearly superior to the neural networks used as a "black box" by the authors, which, in the way they are structured, provide a model with an error margin as large as the numbers being computed. 7. And finally, even if someone were to provide us with a function which separates strings with cleavage sites from strings without them perfectly, so-called simulated molecular evolution would not be better than random selection. Since a perfect fit would only produce exactly ones or

  4. Image reconstruction from projections and its application in emission computer tomography

    International Nuclear Information System (INIS)

    Kuba, Attila; Csernay, Laszlo

    1989-01-01

    Computer tomography is an imaging technique for producing cross sectional images by reconstruction from projections. Its two main branches are called transmission and emission computer tomography, TCT and ECT, resp. After an overview of the theory and practice of TCT and ECT, the first Hungarian ECT type MB 9300 SPECT consisting of a gamma camera and Ketronic Medax N computer is described, and its applications to radiological patient observations are discussed briefly. (R.P.) 28 refs.; 4 figs

  5. Galactic evolution of copper in the light of NLTE computations

    Science.gov (United States)

    Andrievsky, S.; Bonifacio, P.; Caffau, E.; Korotin, S.; Spite, M.; Spite, F.; Sbordone, L.; Zhukova, A. V.

    2018-01-01

    We have developed a model atom for Cu with which we perform statistical equilibrium computations that allow us to compute the line formation of Cu I lines in stellar atmospheres without assuming local thermodynamic equilibrium (LTE). We validate this model atom by reproducing the observed line profiles of the Sun, Procyon and 11 metal-poor stars. Our sample of stars includes both dwarfs and giants. Over a wide range of stellar parameters, we obtain excellent agreement among different Cu I lines. The 11 metal-poor stars have iron abundances in the range -4.2 ≤ [Fe/H] ≤ -1.4; the weighted mean of the [Cu/Fe] ratios is -0.22 dex, with a scatter of 0.15 dex. This is very different from the results of LTE analysis (the difference between NLTE and LTE abundances reaches 1 dex) and, in spite of the small size of our sample, it prompts a revision of the Galactic evolution of Cu.

  6. Evolution of brain-computer interfaces: going beyond classic motor physiology

    Science.gov (United States)

    Leuthardt, Eric C.; Schalk, Gerwin; Roland, Jarod; Rouse, Adam; Moran, Daniel W.

    2010-01-01

    The notion that a computer can decode brain signals to infer the intentions of a human and then enact those intentions directly through a machine is becoming a realistic technical possibility. These types of devices are known as brain-computer interfaces (BCIs). The evolution of these neuroprosthetic technologies could have significant implications for patients with motor disabilities by enhancing their ability to interact and communicate with their environment. The cortical physiology most investigated and used for device control has been brain signals from the primary motor cortex. To date, this classic motor physiology has been an effective substrate for demonstrating the potential efficacy of BCI-based control. However, emerging research now stands to further enhance our understanding of the cortical physiology underpinning human intent and provide further signals for more complex brain-derived control. In this review, the authors report the current status of BCIs and detail the emerging research trends that stand to augment clinical applications in the future. PMID:19569892

  7. Cross-cultural dataset for the evolution of religion and morality project.

    Science.gov (United States)

    Purzycki, Benjamin Grant; Apicella, Coren; Atkinson, Quentin D; Cohen, Emma; McNamara, Rita Anne; Willard, Aiyana K; Xygalatas, Dimitris; Norenzayan, Ara; Henrich, Joseph

    2016-11-08

    A considerable body of research cross-culturally examines the evolution of religious traditions, beliefs and behaviors. The bulk of this research, however, draws from coded qualitative ethnographies rather than from standardized methods specifically designed to measure religious beliefs and behaviors. Psychological data sets that examine religious thought and behavior in controlled conditions tend to be disproportionately sampled from student populations. Some cross-national databases employ standardized methods at the individual level, but are primarily focused on fully market integrated, state-level societies. The Evolution of Religion and Morality Project sought to generate a data set that systematically probed individual level measures sampling across a wider range of human populations. The set includes data from behavioral economic experiments and detailed surveys of demographics, religious beliefs and practices, material security, and intergroup perceptions. This paper describes the methods and variables, briefly introduces the sites and sampling techniques, notes inconsistencies across sites, and provides some basic reporting for the data set.

  8. Fitting models of continuous trait evolution to incompletely sampled comparative data using approximate Bayesian computation.

    Science.gov (United States)

    Slater, Graham J; Harmon, Luke J; Wegmann, Daniel; Joyce, Paul; Revell, Liam J; Alfaro, Michael E

    2012-03-01

    In recent years, a suite of methods has been developed to fit multiple rate models to phylogenetic comparative data. However, most methods have limited utility at broad phylogenetic scales because they typically require complete sampling of both the tree and the associated phenotypic data. Here, we develop and implement a new, tree-based method called MECCA (Modeling Evolution of Continuous Characters using ABC) that uses a hybrid likelihood/approximate Bayesian computation (ABC)-Markov-Chain Monte Carlo approach to simultaneously infer rates of diversification and trait evolution from incompletely sampled phylogenies and trait data. We demonstrate via simulation that MECCA has considerable power to choose among single versus multiple evolutionary rate models, and thus can be used to test hypotheses about changes in the rate of trait evolution across an incomplete tree of life. We finally apply MECCA to an empirical example of body size evolution in carnivores, and show that there is no evidence for an elevated rate of body size evolution in the pinnipeds relative to terrestrial carnivores. ABC approaches can provide a useful alternative set of tools for future macroevolutionary studies where likelihood-dependent approaches are lacking. © 2011 The Author(s). Evolution© 2011 The Society for the Study of Evolution.
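
    The simulate-summarize-accept loop at the heart of ABC can be illustrated with a few lines of Python; the sketch below infers the rate of a simple Brownian-motion trait model from a single summary statistic and is not MECCA itself. The sample size, prior, tolerance and the use of the trait variance as the summary are all assumptions made for illustration.

      # Minimal ABC rejection sampler for the rate of a toy trait-evolution model.
      import numpy as np

      rng = np.random.default_rng(42)
      n_tips, true_sigma = 50, 0.7
      observed = rng.normal(0.0, true_sigma, n_tips)        # stand-in for tip trait data
      obs_stat = observed.var()                             # summary statistic

      accepted = []
      for _ in range(20000):
          sigma = rng.uniform(0.0, 2.0)                     # draw from the prior
          sim_stat = rng.normal(0.0, sigma, n_tips).var()   # simulate and summarize
          if abs(sim_stat - obs_stat) < 0.05:               # accept if close enough
              accepted.append(sigma)

      # Accepted draws approximate the posterior; their mean should sit near the true rate.
      print(len(accepted), round(float(np.mean(accepted)), 3))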

  9. Lessons from two Dutch projects for the introduction of computers in schools

    NARCIS (Netherlands)

    ten Brummelhuis, A.C.A.; Plomp, T.

    1993-01-01

    The systematic introduction of computers in schools for general secondary education in The Netherlands started in the early 1980s. Initially, the Dutch government experimented in 1983 with a project in 100 lower general secondary schools limited in scope to gain experience with educational computer

  10. Taiwan links up to world's first LHC computing grid project

    CERN Multimedia

    2003-01-01

    "Taiwan's Academia Sinica was linked up to the Large Hadron Collider (LHC) Computing Grid Project last week to work jointly with 12 other countries to construct the world's largest and most powerful particle accelerator" (1/2 page).

  11. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Science.gov (United States)

    Bednar, David; Beerens, Koen; Sebestova, Eva; Bendl, Jaroslav; Khare, Sagar; Chaloupkova, Radka; Prokop, Zbynek; Brezovsky, Jan; Baker, David; Damborsky, Jiri

    2015-11-01

    There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.
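
    The filtering-and-combining step can be caricatured in a few lines of Python: keep only single-point mutations whose predicted stabilization passes a cutoff, skip mutations at nearby (potentially interacting) positions, and join the survivors into one multiple-point design. The ddG values, cutoff and separation rule below are invented for illustration and are not FireProt's actual criteria.

      # Toy combination of predicted stabilizing single-point mutations (illustration only).
      predicted = {            # mutation -> predicted ddG (kcal/mol, negative = stabilizing)
          "A15V": -1.8, "G72A": -1.2, "S73T": -0.9, "K140R": -0.3, "D201E": -1.5,
      }
      DDG_CUTOFF = -1.0        # ignore weakly stabilizing predictions
      MIN_SEPARATION = 5       # avoid combining mutations at close sequence positions

      def position(mut):
          return int(mut[1:-1])

      kept = []
      for mut, ddg in sorted(predicted.items(), key=lambda kv: kv[1]):   # most stabilizing first
          if ddg > DDG_CUTOFF:
              continue
          if all(abs(position(mut) - position(k)) >= MIN_SEPARATION for k in kept):
              kept.append(mut)

      print("multiple-point mutant:", "+".join(kept))      # A15V+D201E+G72A for these toy inputs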

  12. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Directory of Open Access Journals (Sweden)

    David Bednar

    2015-11-01

    Full Text Available There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.

  13. Evolution and experience with the ATLAS Simulation at Point1 Project

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00389536; The ATLAS collaboration; Brasolin, Franco; Kouba, Tomas; Schovancova, Jaroslava; Fazio, Daniel; Di Girolamo, Alessandro; Scannicchio, Diana; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander; Lee, Christopher

    2017-01-01

    The Simulation at Point1 project is successfully running standard ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We present our experience with using the Event Service that provides the event-level granularity of computations. We show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources is also presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  14. Evolution and experience with the ATLAS simulation at Point1 project

    CERN Document Server

    Ballestrero, Sergio; The ATLAS collaboration; Fazio, Daniel; Di Girolamo, Alessandro; Kouba, Tomas; Lee, Christopher; Scannicchio, Diana; Schovancova, Jaroslava; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2016-01-01

    The Simulation at Point1 project is successfully running traditional ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We will present our experience with using the Event Service that provides the event-level granularity of computations. We will show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources will also be presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  15. An Interdisciplinary Team Project: Psychology and Computer Science Students Create Online Cognitive Tasks

    Science.gov (United States)

    Flannery, Kathleen A.; Malita, Mihaela

    2014-01-01

    We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…

  16. Drifting Continents and Wandering Poles. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    Science.gov (United States)

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  17. Drifting Continents and Magnetic Fields. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    Science.gov (United States)

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  18. Evolution of perturbed dynamical systems: analytical computation with time independent accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Gurzadyan, A.V. [Russian-Armenian (Slavonic) University, Department of Mathematics and Mathematical Modelling, Yerevan (Armenia); Kocharyan, A.A. [Monash University, School of Physics and Astronomy, Clayton (Australia)

    2016-12-15

    An analytical method for investigating the evolution of dynamical systems with time-independent accuracy is developed for perturbed Hamiltonian systems. Error-free estimation using computer algebra enables the application of the method to complex multi-dimensional Hamiltonian and dissipative systems. It also opens up principal opportunities for the qualitative study of chaotic trajectories. The performance of the method is demonstrated on perturbed two-oscillator systems. It can be applied to various non-linear physical and astrophysical systems, e.g. to long-term planetary dynamics. (orig.)

  19. Computer simulation of the topography evolution on ion bombarded surfaces

    CERN Document Server

    Zier, M

    2003-01-01

    The development of roughness on ion bombarded surfaces (facets, ripples) of single crystalline and amorphous homogeneous solids plays an important role, for example, in depth profiling techniques. To verify a faceting mechanism based not only on sputtering by directly impinging ions but also on the contribution of reflected ions and the redeposition of sputtered material, a computer simulation has been carried out. The surface in this model is treated as a two-dimensional line segment profile. The model describes the topography evolution on ion bombarded surfaces, including the growth mechanism of a facetted surface, using only the interplay of reflected and primary ions and redeposited atoms.

  20. Evaluating Students' Perceptions and Attitudes toward Computer-Mediated Project-Based Learning Environment: A Case Study

    Science.gov (United States)

    Seet, Ling Ying Britta; Quek, Choon Lang

    2010-01-01

    This research investigated 68 secondary school students' perceptions of their computer-mediated project-based learning environment and their attitudes towards Project Work (PW) using two instruments--Project Work Classroom Learning Environment Questionnaire (PWCLEQ) and Project Work Related Attitudes Instrument (PWRAI). In this project-based…

  1. Computational methods for planning and evaluating geothermal energy projects

    International Nuclear Information System (INIS)

    Goumas, M.G.; Lygerou, V.A.; Papayannakis, L.E.

    1999-01-01

    In planning, designing and evaluating a geothermal energy project, a number of technical, economic, social and environmental parameters should be considered. The use of computational methods provides a rigorous analysis improving the decision-making process. This article demonstrates the application of decision-making methods developed in operational research for the optimum exploitation of geothermal resources. Two characteristic problems are considered: (1) the economic evaluation of a geothermal energy project under uncertain conditions using a stochastic analysis approach and (2) the evaluation of alternative exploitation schemes for optimum development of a low enthalpy geothermal field using a multicriteria decision-making procedure. (Author)
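
    A minimal sketch of the stochastic evaluation approach mentioned above: a Monte Carlo net-present-value analysis that propagates uncertainty in cost and revenue parameters into a distribution of project outcomes. The cash-flow figures, distributions and discount rate below are hypothetical placeholders, not values from the article.

        import numpy as np

        rng = np.random.default_rng(0)

        def npv(cash_flows, rate):
            """Discount a series of yearly cash flows (year 0 first) to present value."""
            years = np.arange(len(cash_flows))
            return np.sum(cash_flows / (1.0 + rate) ** years)

        n_trials = 10_000
        results = np.empty(n_trials)
        for i in range(n_trials):
            capex = rng.normal(5.0e6, 0.5e6)        # hypothetical drilling/plant cost
            revenue = rng.normal(8.0e5, 1.0e5)      # hypothetical yearly heat sales
            om_cost = rng.normal(2.0e5, 0.3e5)      # hypothetical yearly operation and maintenance
            flows = np.concatenate(([-capex], np.full(20, revenue - om_cost)))
            results[i] = npv(flows, rate=0.07)

        print(f"mean NPV  : {results.mean():,.0f}")
        print(f"P(NPV < 0): {np.mean(results < 0):.2%}")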

  2. Hamiltonian evolutions of twisted polygons in RPn

    International Nuclear Information System (INIS)

    Beffa, Gloria Marì; Wang, Jing Ping

    2013-01-01

    In this paper we find a discrete moving frame and their associated invariants along projective polygons in RP^n, and we use them to describe invariant evolutions of projective N-gons. We then apply a reduction process to obtain a natural Hamiltonian structure on the space of projective invariants for polygons, establishing a close relationship between the projective N-gon invariant evolutions and the Hamiltonian evolutions on the invariants of the flow. We prove that any Hamiltonian evolution is induced on invariants by an invariant evolution of N-gons—what we call a projective realization—and both evolutions are connected explicitly in a very simple way. Finally, we provide a completely integrable evolution (the Boussinesq lattice related to the lattice W_3-algebra), its projective realization in RP^2 and its Hamiltonian pencil. We generalize both structures to n dimensions and we prove that they are Poisson, defining explicitly the n-dimensional generalization of the planar evolution (a discretization of the W_n-algebra). We prove that the generalization is completely integrable, and we also give its projective realization, which turns out to be very simple. (paper)

  3. (The evolution of) post-secondary education: a computational model and experiments

    Czech Academy of Sciences Publication Activity Database

    Ortmann, Andreas; Slobodyan, Sergey

    -, č. 355 (2008), s. 1-46 ISSN 1211-3298 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:MSM0021620846 Keywords : post-secondary education * for-profit higher education providers * computational simulations Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp355.pdf

  4. The Evolution of Culture-Climate Interplay in Temporary Multi-Organisations: The Case of Construction Alliancing Projects

    OpenAIRE

    Kusuma, I. C.

    2016-01-01

    Organisational culture has been a long-standing debate in management research. However, in the field of construction project management, it is relatively under-explored. This is mainly due to the different organisational context of Temporary Multi-Organisations (TMOs). This research re-explores the notion of organisational culture in construction projects. Based on Darwin's theory of evolution, this research goes back to the very beginning, illustrating the exact meaning and dynamics of organi...

  5. COMPUTER GRAPHICAL REPRESENTATION, IN TREBLE ORTHOGONAL PROJECTION, OF A POINT

    Directory of Open Access Journals (Sweden)

    SLONOVSCHI Andrei

    2017-05-01

    Full Text Available In the stages in which students come to understand and study descriptive geometry, the treble orthogonal projection of a point creates problems in situations where one or more of the descriptive coordinates are zero. Starting from these considerations, the authors have created an original computer program which offers students the possibility of easily understanding how a point is represented, in draught, in the treble orthogonal projection, whatever the values of its descriptive coordinates.
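
    The construction the program teaches can be reproduced in a few lines. The sketch below is only an illustration of the underlying geometry (it is not the authors' program); the plane names follow the usual descriptive-geometry convention of horizontal, frontal and profile projection planes.

        def treble_orthogonal_projection(x, y, z):
            """Return the three orthogonal projections of the point A(x, y, z).

            Horizontal plane: drop the height   -> (x, y, 0)
            Frontal plane:    drop the depth    -> (x, 0, z)
            Profile plane:    drop the abscissa -> (0, y, z)
            """
            return {"horizontal": (x, y, 0.0),
                    "frontal":    (x, 0.0, z),
                    "profile":    (0.0, y, z)}

        # A point with a zero coordinate lies in one of the projection planes,
        # which is exactly the special case the teaching program targets.
        print(treble_orthogonal_projection(3.0, 0.0, 5.0))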

  6. (The evolution of) post-secondary education: a computational model and experiments

    Czech Academy of Sciences Publication Activity Database

    Ortmann, Andreas; Slobodyan, Sergey

    -, č. 355 (2008), s. 1-46 ISSN 1211-3298 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords : post-secondary education * for-profit higher education providers * computational simulations Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp355.pdf

  7. Taiwan links up to world's 1st LHC Computing Grid Project

    CERN Multimedia

    2003-01-01

    Taiwan's Academia Sinica was linked up to the Large Hadron Collider (LHC) Computing Grid Project to work jointly with 12 other countries to construct the world's largest and most powerful particle accelerator

  8. The method of projected characteristics for the evolution of magnetic arches

    Science.gov (United States)

    Nakagawa, Y.; Hu, Y. Q.; Wu, S. T.

    1987-01-01

    A numerical method for solving the fully nonlinear MHD equations is described. In particular, the formulation based on the newly developed method of projected characteristics (Nakagawa, 1981), suitable for studying the evolution of magnetic arches due to motions of their foot-points, is presented. The final formulation is given in the form of difference equations; therefore, the analysis of numerical stability is also presented. Further, the derivation of physically self-consistent, time-dependent boundary conditions (i.e. the evolving boundary equations), which is of central importance, is given in detail, and some results obtained with such boundary equations are reported.

  9. A Project-Based Learning Approach to Programmable Logic Design and Computer Architecture

    Science.gov (United States)

    Kellett, C. M.

    2012-01-01

    This paper describes a course in programmable logic design and computer architecture as it is taught at the University of Newcastle, Australia. The course is designed around a major design project and has two supplemental assessment tasks that are also described. The context of the Computer Engineering degree program within which the course is…

  10. Computational issues in alternating projection algorithms for fixed-order control design

    DEFF Research Database (Denmark)

    Beran, Eric Bengt; Grigoriadis, K.

    1997-01-01

    Alternating projection algorithms have been introduced recently to solve fixed-order controller design problems described by linear matrix inequalities and non-convex coupling rank constraints. In this work, an extensive numerical experimentation using proposed benchmark fixed-order control design...... examples is used to indicate the computational efficiency of the method. These results indicate that the proposed alternating projections are effective in obtaining low-order controllers for small and medium order problems...
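
    The alternating projection idea can be sketched generically: iterate between the projection onto a convex set and the projection onto the non-convex set of matrices of rank at most r (a truncated SVD). The affine trace constraint below is an illustrative stand-in for the LMI sets of the fixed-order control problem, not the sets used in the paper.

        import numpy as np

        def project_rank(X, r):
            """Nearest matrix of rank <= r in the Frobenius norm (truncated SVD)."""
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            s[r:] = 0.0
            return (U * s) @ Vt

        def project_affine(X, target_trace):
            """Hypothetical convex constraint: fix the trace of X."""
            n = X.shape[0]
            return X + (target_trace - np.trace(X)) / n * np.eye(n)

        rng = np.random.default_rng(1)
        X = rng.standard_normal((6, 6))
        for _ in range(500):                    # alternate between the two sets
            X = project_affine(X, target_trace=3.0)
            X = project_rank(X, r=2)

        print("rank :", np.linalg.matrix_rank(X, tol=1e-8))
        print("trace:", round(float(np.trace(X)), 4))   # approaches 3.0 at convergence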

  11. Computational representation of Alzheimer's disease evolution applied to a cooking activity.

    Science.gov (United States)

    Serna, Audrey; Rialle, Vincent; Pigot, Hélène

    2006-01-01

    This article presents a computational model and a simulation of the decline in performance of activities of daily living due to Alzheimer's disease. The disease evolution is simulated using the cognitive architecture ACT-R. Activities are represented according to the retrieval of semantic units in declarative memory and the triggering of rules in procedural memory. The decline caused by Alzheimer's disease is simulated through the variation of subsymbolic parameters. The model is applied to a cooking activity. Simulation of 100 subjects shows results similar to those obtained in a standardized assessment with human subjects.

  12. Evolution of the Atlas data and computing model for a Tier-2 in the EGI infrastructure

    CERN Document Server

    Fernandez, A; The ATLAS collaboration; AMOROS, G; VILLAPLANA, M; FASSI, F; KACI, M; LAMAS, A; OLIVER, E; SALT, J; SANCHEZ, J; SANCHEZ, V

    2012-01-01

    ABSTRACT ISCG 2012 Evolution of the Atlas data and computing model for a Tier2 in the EGI infrastructure. During the last years the ATLAS computing model has moved from a stricter design, where every Tier2 had a liaison and a network dependence on a Tier1, to a more meshed approach where every cloud can be connected. Evolution of ATLAS data models requires changes in ATLAS Tier2 policy for data replication, dynamic data caching and remote data access. It also requires rethinking the network infrastructure to enable any Tier2 and associated Tier3 to easily connect to any Tier1 or Tier2. Tier2s are becoming more and more important in the ATLAS computing model as they allow more data to be readily accessible for analysis jobs by all users, independently of their geographical location. The Tier2 disk space has been reserved for real, simulated, calibration and alignment, group, and user data. A buffer disk space is needed for input and output data for simulation jobs. Tier2s are going to be used more effic...

  13. A generative representation for the evolution of jazz solos

    DEFF Research Database (Denmark)

    Bäckman, Kjell; Dahlstedt, Palle

    2008-01-01

    This paper describes a system developed to create computer based jazz improvisation solos. The generation of the improvisation material uses interactive evolution, based on a dual genetic representation: a basic melody line representation, with energy constraints ("rubber band") and a hierarchic...... developed for this specific type of music. This is the first published part of an ongoing research project in generative jazz, based on probabilistic and evolutionary strategies....

  14. Hot Spots in the Earth's Crust. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    Science.gov (United States)

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  15. Research Progress in Mathematical Analysis of Map Projection by Computer Algebra

    Directory of Open Access Journals (Sweden)

    BIAN Shaofeng

    2017-10-01

    Full Text Available Map projection is an important component of modern cartography, and involves many tedious mathematical analysis processes, such as the power series expansions of elliptical functions, differentiation of complex and implicit functions, elliptical integrals and the arithmetic of complex numbers. Deriving these expressions by hand not only consumes much time and energy but is also error-prone, and sometimes cannot be accomplished at all because of prohibitive complexity. The research achievements in mathematical analysis of map projection by computer algebra are systematically reviewed in five aspects, i.e., the symbolic expressions of forward and inverse solutions of ellipsoidal latitudes, the direct transformations between map projections with different distortion properties, expressions of the Gauss projection by complex functions, mathematical analysis of the oblique Mercator projection, and polar chart projection with its transformation. The main problems that need to be further solved in this research field are analyzed. This will be helpful in promoting the development of map projection.
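
    A small SymPy sketch conveys the flavour of the work being reviewed: expanding the meridian arc-length integrand of an ellipsoid into a power series of the first eccentricity, a step that is tedious by hand but immediate with computer algebra. The truncation order here is arbitrary and the snippet is illustrative only.

        import sympy as sp

        phi, e, a = sp.symbols('phi e a', positive=True)

        # Meridian arc-length integrand for an ellipsoid with semi-major axis a and
        # first eccentricity e; its power-series expansion in e is the classic first
        # step of Gauss and transverse Mercator projection derivations.
        integrand = a * (1 - e**2) / (1 - e**2 * sp.sin(phi)**2)**sp.Rational(3, 2)

        series_e = sp.series(integrand, e, 0, 7).removeO()   # terms up to e**6
        print(sp.simplify(series_e))
        # Term-by-term integration of this series over phi then yields the familiar
        # meridian-arc series used throughout ellipsoidal map projection theory.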

  16. Computers in Education: An Overview. Publication Number One. Software Engineering/Education Cooperative Project.

    Science.gov (United States)

    Collis, Betty; Muir, Walter

    The first of four major sections in this report presents an overview of the background and evolution of computer applications to learning and teaching. It begins with the early attempts toward "automated teaching" of the 1920s, and the "teaching machines" of B. F. Skinner of the 1940s through the 1960s. It then traces the…

  17. Studying fatigue damage evolution in uni-directional composites using x-ray computed tomography

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    , it will be possible to lower the costs of energy for wind energy based electricity. In the present work, a lab-source x-ray computed tomography equipment (Zeiss Xradia 520 Versa) has been used in connection with ex-situ fatigue testing of uni-directional composites in order to identify fibre failure during...... comparable x-ray studies) have been used in order to ensure a representative test volume during the ex-situ fatigue testing. Using the ability of the x-ray computed tomography to zoom into regions of interest, non-destructive, the fatigue damage evolution in a repeating ex-situ fatigue loaded test sample has...... improving the fatigue resistance of non-crimp fabric used in the wind turbine industry can be made....

  18. Descriptive and Computer Aided Drawing Perspective on an Unfolded Polyhedral Projection Surface

    Science.gov (United States)

    Dzwierzynska, Jolanta

    2017-10-01

    The aim of the present study is to develop a method of direct and practical mapping of perspective on an unfolded prism polyhedral projection surface. The considered perspective representation is a rectilinear central projection onto a surface composed of several flat elements. In the paper two descriptive methods of drawing perspective are presented: direct and indirect. The graphical mapping of the effects of the representation is realized directly on the unfolded flat projection surface. That is due to the projective and graphical connection between points displayed on the polyhedral background and their counterparts received on the unfolded flat surface. To significantly improve the construction of lines, analytical algorithms are formulated. They draw the perspective image of a line segment passing through two different points determined by their coordinates in a spatial coordinate system with axes x, y, z. Compared to other perspective construction methods used in computer vision and computer aided design, which rely on information about points, our algorithms utilize data about lines, which occur very often in architectural forms. The possibility of drawing lines in the considered perspective enables drawing an edge perspective image of an architectural object. The application of changeable base elements of the perspective, such as the horizon height and the station point location, enables drawing perspective images from different viewing positions. The analytical algorithms for drawing perspective images are formulated in Mathcad software; however, they can be implemented in the majority of computer graphics packages, which can make drawing perspective more efficient and easier. The representation presented in the paper and the way of its direct mapping onto the flat unfolded projection surface can find application in the presentation of architectural space in advertising and art.

  19. High performance simulation for the Silva project using the tera computer

    International Nuclear Information System (INIS)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F.; Boulet, M.; Scheurer, B.; Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A.

    2003-01-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues for optimizing the parallelization of the PRODIGE code on TERA. Thus, we discuss advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it has been found fruitful to tune the code in three respects: memory allocation, MPI communications and interconnection network bandwidth usage. We stress the value of MPI/IO in this context and the benefit obtained for production computations on TERA. Finally, we shall illustrate our developments. We indicate some performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of the laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  20. High performance simulation for the Silva project using the tera computer

    Energy Technology Data Exchange (ETDEWEB)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F. [CS Communication and Systemes, 92 - Clamart (France); Boulet, M.; Scheurer, B. [CEA Bruyeres-le-Chatel, 91 - Bruyeres-le-Chatel (France); Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A. [CEA Saclay, 91 - Gif sur Yvette (France)

    2003-07-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues for optimizing the parallelization of the PRODIGE code on TERA. Thus, we discuss advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it has been found fruitful to tune the code in three respects: memory allocation, MPI communications and interconnection network bandwidth usage. We stress the value of MPI/IO in this context and the benefit obtained for production computations on TERA. Finally, we shall illustrate our developments. We indicate some performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of the laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  1. Organization and evolution of primate centromeric DNA from whole-genome shotgun sequence data.

    Directory of Open Access Journals (Sweden)

    Can Alkan

    2007-09-01

    Full Text Available The major DNA constituent of primate centromeres is alpha satellite DNA. As much as 2%-5% of sequence generated as part of primate genome sequencing projects consists of this material, which is fragmented or not assembled as part of published genome sequences due to its highly repetitive nature. Here, we develop computational methods to rapidly recover and categorize alpha-satellite sequences from previously uncharacterized whole-genome shotgun sequence data. We present an algorithm to computationally predict potential higher-order array structure based on paired-end sequence data and then experimentally validate its organization and distribution by experimental analyses. Using whole-genome shotgun data from the human, chimpanzee, and macaque genomes, we examine the phylogenetic relationship of these sequences and provide further support for a model for their evolution and mutation over the last 25 million years. Our results confirm fundamental differences in the dispersal and evolution of centromeric satellites in the Old World monkey and ape lineages of evolution.

  2. Organization and evolution of primate centromeric DNA from whole-genome shotgun sequence data.

    Science.gov (United States)

    Alkan, Can; Ventura, Mario; Archidiacono, Nicoletta; Rocchi, Mariano; Sahinalp, S Cenk; Eichler, Evan E

    2007-09-01

    The major DNA constituent of primate centromeres is alpha satellite DNA. As much as 2%-5% of sequence generated as part of primate genome sequencing projects consists of this material, which is fragmented or not assembled as part of published genome sequences due to its highly repetitive nature. Here, we develop computational methods to rapidly recover and categorize alpha-satellite sequences from previously uncharacterized whole-genome shotgun sequence data. We present an algorithm to computationally predict potential higher-order array structure based on paired-end sequence data and then experimentally validate its organization and distribution by experimental analyses. Using whole-genome shotgun data from the human, chimpanzee, and macaque genomes, we examine the phylogenetic relationship of these sequences and provide further support for a model for their evolution and mutation over the last 25 million years. Our results confirm fundamental differences in the dispersal and evolution of centromeric satellites in the Old World monkey and ape lineages of evolution.

  3. The ZAP Project: Designing Interactive Computer Tools for Learning Psychology

    Science.gov (United States)

    Hulshof, Casper; Eysink, Tessa; de Jong, Ton

    2006-01-01

    In the ZAP project, a set of interactive computer programs called "ZAPs" was developed. The programs were designed in such a way that first-year students experience psychological phenomena in a vivid and self-explanatory way. Students can either take the role of participant in a psychological experiment, they can experience phenomena themselves,…

  4. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  5. Evolution of Things

    OpenAIRE

    Eiben, A. E.; Ferreira, N.; Schut, M.; Kernbach, S.

    2011-01-01

    Evolution is one of the major omnipresent powers in the universe that has been studied for about two centuries. Recent scientific and technical developments make it possible to make the transition from passively understanding to actively mastering evolution. As of today, the only area where human experimenters can design and manipulate evolutionary processes in full is that of Evolutionary Computing, where evolutionary processes are carried out in a digital space, inside computers, in simulat...

  6. A Framework for Debugging Geoscience Projects in a High Performance Computing Environment

    Science.gov (United States)

    Baxter, C.; Matott, L.

    2012-12-01

    High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems, a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.

  7. Dual phase evolution

    CERN Document Server

    Green, David G; Abbass, Hussein A

    2014-01-01

    This book explains how dual phase evolution operates in all these settings and provides a detailed treatment of the subject. The authors discuss the theoretical foundations for the theory, how it relates to other phase transition phenomena and its advantages in evolutionary computation and complex adaptive systems. The book provides methods and techniques to use this concept for problem solving. Dual phase evolution concerns systems that evolve via repeated phase shifts in the connectivity of their elements. It occurs in vast range of settings, including natural systems (species evolution, landscape ecology, geomorphology), socio-economic systems (social networks) and in artificial systems (annealing, evolutionary computing).

  8. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    Full Text Available We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  9. Development and application of project management computer system in nuclear power station

    International Nuclear Information System (INIS)

    Chen Junpu

    2000-01-01

    Based on the experience gained in the construction of the Daya Bay and Lingao nuclear power plants, the necessity of using computers for management and their application in nuclear power engineering projects are explained.

  10. A Computer Supported Teamwork Project for People with a Visual Impairment.

    Science.gov (United States)

    Hale, Greg

    2000-01-01

    Discussion of the use of computer supported teamwork (CSTW) in team-based organizations focuses on problems that visually impaired people have reading graphical user interface software via screen reader software. Describes a project that successfully used email for CSTW, and suggests issues needing further research. (LRW)

  11. SPECT: Theoretical aspects and evolution of emission computed axial tomography

    International Nuclear Information System (INIS)

    Brunol, J.; Nuta, V.

    1981-01-01

    We have detailed some of the elements of 3-D image reconstruction from axial projections. Two aspects specific to nuclear medicine have been analysed, namely self-absorption and statistics. In our view, the development of ECAT in the months to come must proceed in two essential directions. The first is application to dynamic (multigated) cardiac imagery; results of this type have been obtained over 8 months in the Radioisotope Service of Cochin Hospital in Paris. It must be stressed that the number of images to be processed then becomes considerable (multiplication by the gate factor yields more than 100 images), while the statistics are reduced because of the temporal separation. Obtaining good image quality requires sophisticated quadri-dimensional processing, and with the mini-computers available in nuclear medicine the computing times become much too great (several hours) to envisage routine hospital use. This is the reason why we connected an array processor to the IMAC system; this very powerful system (several tens of times the power of a mini-computer) will reduce such computing times to less than 10 minutes. In addition, new elements can be introduced into the reconstruction algorithm (the static case, as opposed to the foregoing dynamic one). These important improvements come at the expense of memory and hence of computing time; here again, the use of an array processor appears indispensable. It should be recalled that ECAT is today a routinely used method, and the theoretical analyses it has required have opened the way to new effective methods of 'slanted hole' tomography. (orig.)

  12. Developing a project-based computational physics course grounded in expert practice

    Science.gov (United States)

    Burke, Christopher J.; Atherton, Timothy J.

    2017-04-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  13. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a computational modules, (b data-processing scripts, and (c research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world have contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  14. Correction of computed tomography motion artifacts using pixel-specific back-projection

    International Nuclear Information System (INIS)

    Ritchie, C.J.; Crawford, C.R.; Godwin, J.D.; Kim, Y.; King, K.F.

    1996-01-01

    Cardiac and respiratory motion can cause artifacts in computed tomography scans of the chest. The authors describe a new method for reducing these artifacts called pixel-specific back-projection (PSBP). PSBP reduces artifacts caused by in-plane motion by reconstructing each pixel in a frame of reference that moves with the in-plane motion in the volume being scanned. The motion of the frame of reference is specified by constructing maps that describe the motion of each pixel in the image at the time each projection was measured; these maps are based on measurements of the in-plane motion. PSBP has been tested in computer simulations and with volunteer data. In computer simulations, PSBP removed the structured artifacts caused by motion. In scans of two volunteers, PSBP reduced doubling and streaking in chest scans to a level that made the images clinically useful. PSBP corrections of liver scans were less satisfactory because the motion of the liver is predominantly superior-inferior (S-I). PSBP uses a unique set of motion parameters to describe the motion at each point in the chest as opposed to requiring that the motion be described by a single set of parameters. Therefore, PSBP may be more useful in correcting clinical scans than are other correction techniques previously described
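
    The principle can be illustrated with a simplified parallel-beam sketch (this is not the authors' CT implementation): plain unfiltered back-projection accumulates, for every pixel, the projection sample at detector coordinate x cos(theta) + y sin(theta), whereas pixel-specific back-projection evaluates that coordinate at the pixel position displaced by its estimated in-plane motion at the time the view was acquired.

        import numpy as np

        def backproject(sinogram, angles, motion=None):
            """Unfiltered parallel-beam back-projection onto an N x N grid.

            sinogram : (n_angles, n_det) array of projections
            motion   : optional function motion(view_index, X, Y) -> (dX, dY) giving the
                       in-plane displacement of every pixel when that view was acquired
            """
            n_det = sinogram.shape[1]
            centre = (n_det - 1) / 2.0
            ys, xs = np.mgrid[0:n_det, 0:n_det]
            X, Y = xs - centre, ys - centre
            image = np.zeros((n_det, n_det))
            for k, theta in enumerate(angles):
                if motion is None:
                    Xk, Yk = X, Y
                else:
                    dX, dY = motion(k, X, Y)
                    Xk, Yk = X + dX, Y + dY          # pixel-specific frame of reference
                t = Xk * np.cos(theta) + Yk * np.sin(theta) + centre
                t0 = np.clip(np.floor(t).astype(int), 0, n_det - 2)
                w = t - t0
                image += (1 - w) * sinogram[k, t0] + w * sinogram[k, t0 + 1]
            return image * np.pi / len(angles)

        # With motion=None this reduces to conventional back-projection; supplying a
        # measured per-pixel motion map gives the pixel-specific variant.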

  15. A computational genomics pipeline for prokaryotic sequencing projects.

    Science.gov (United States)

    Kislyuk, Andrey O; Katz, Lee S; Agrawal, Sonia; Hagen, Matthew S; Conley, Andrew B; Jayaraman, Pushkala; Nelakuditi, Viswateja; Humphrey, Jay C; Sammons, Scott A; Govil, Dhwani; Mair, Raydel D; Tatti, Kathleen M; Tondella, Maria L; Harcourt, Brian H; Mayer, Leonard W; Jordan, I King

    2010-08-01

    New sequencing technologies have accelerated research on prokaryotic genomes and have made genome sequencing operations outside major genome sequencing centers routine. However, no off-the-shelf solution exists for the combined assembly, gene prediction, genome annotation and data presentation necessary to interpret sequencing data. The resulting requirement to invest significant resources into custom informatics support for genome sequencing projects remains a major impediment to the accessibility of high-throughput sequence data. We present a self-contained, automated high-throughput open source genome sequencing and computational genomics pipeline suitable for prokaryotic sequencing projects. The pipeline has been used at the Georgia Institute of Technology and the Centers for Disease Control and Prevention for the analysis of Neisseria meningitidis and Bordetella bronchiseptica genomes. The pipeline is capable of enhanced or manually assisted reference-based assembly using multiple assemblers and modes; gene predictor combining; and functional annotation of genes and gene products. Because every component of the pipeline is executed on a local machine with no need to access resources over the Internet, the pipeline is suitable for projects of a sensitive nature. Annotation of virulence-related features makes the pipeline particularly useful for projects working with pathogenic prokaryotes. The pipeline is licensed under the open-source GNU General Public License and available at the Georgia Tech Neisseria Base (http://nbase.biology.gatech.edu/). The pipeline is implemented with a combination of Perl, Bourne Shell and MySQL and is compatible with Linux and other Unix systems.

  16. A computer graphics pilot project - Spacecraft mission support with an interactive graphics workstation

    Science.gov (United States)

    Hagedorn, John; Ehrner, Marie-Jacqueline; Reese, Jodi; Chang, Kan; Tseng, Irene

    1986-01-01

    The NASA Computer Graphics Pilot Project was undertaken to enhance the quality control, productivity and efficiency of mission support operations at the Goddard Operations Support Computing Facility. The Project evolved into a set of demonstration programs for graphics intensive simulated control room operations, particularly in connection with the complex space missions that began in the 1980s. Complex missions mean more data. Graphic displays are a means to reduce the probabilities of operator errors. Workstations were selected with 1024 x 768 pixel color displays controlled by a custom VLSI chip coupled to an MC68010 chip running UNIX within a shell that permits operations through the medium of mouse-accessed pulldown window menus. The distributed workstations run off a host NAS 8040 computer. Applications of the system for tracking spacecraft orbits and monitoring Shuttle payload handling illustrate the system capabilities, noting the built-in capabilities of shifting the point of view and rotating and zooming in on three-dimensional views of spacecraft.

  17. LHC Computing Grid Project Launches into Action with International Support. A thousand times more computing power by 2006

    CERN Multimedia

    2001-01-01

    The first phase of the LHC Computing Grid project was approved at an extraordinary meeting of the Council on 20 September 2001. CERN is preparing for the unprecedented avalanche of data that will be produced by the Large Hadron Collider experiments. A thousand times more computer power will be needed by 2006! CERN's need for a dramatic advance in computing capacity is urgent. As from 2006, the four giant detectors observing trillions of elementary particle collisions at the LHC will accumulate over ten million Gigabytes of data, equivalent to the contents of about 20 million CD-ROMs, each year of its operation. A thousand times more computing power will be needed than is available to CERN today. The strategy the collaborations have adopted to analyse and store this unprecedented amount of data is the coordinated deployment of Grid technologies at hundreds of institutes which will be able to search out and analyse information from an interconnected worldwide grid of tens of thousands of computers and storag...

  18. CMS computing upgrade and evolution

    CERN Document Server

    Hernandez Calama, Jose

    2013-01-01

    The distributed Grid computing infrastructure has been instrumental in the successful exploitation of the LHC data leading to the discovery of the Higgs boson. The computing system will need to face new challenges from 2015 on when LHC restarts with an anticipated higher detector output rate and event complexity, but with only a limited increase in the computing resources. A more efficient use of the available resources will be mandatory. CMS is improving the data storage, distribution and access as well as the processing efficiency. Remote access to the data through the WAN, dynamic data replication and deletion based on the data access patterns, and separation of disk and tape storage are some of the areas being actively developed. Multi-core processing and scheduling is being pursued in order to make a better use of the multi-core nodes available at the sites. In addition, CMS is exploring new computing techniques, such as Cloud Computing, to get access to opportunistic resources or as a means of using wit...

  19. The SILCC (SImulating the LifeCycle of molecular Clouds) project - I. Chemical evolution of the supernova-driven ISM

    Czech Academy of Sciences Publication Activity Database

    Walch, S.; Girichidis, P.; Naab, T.; Gatto, A.; Glover, S.C.O.; Wünsch, Richard; Klessen, R.S.; Clark, P.C.; Peters, T.; Derigs, D.; Baczynski, C.

    2015-01-01

    Roč. 454, č. 1 (2015), s. 238-268 ISSN 0035-8711 R&D Projects: GA ČR GAP209/12/1795 Institutional support: RVO:67985815 Keywords : magnetohydrodynamics * ISM clouds * ISM evolution Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 4.952, year: 2015

  20. Evolution of the future plants operation for a better safety

    International Nuclear Information System (INIS)

    Papin, B.; Malvache, P.

    1994-01-01

    This paper describes a coordinated research project of the French CEA addressing evolutions in plant operation that can bring perceptible and assessable improvements in operational safety. This program has been scheduled for the 1992-1996 period, with a global effort of 40 man-years. The present status of the two main parts of the project is presented: ESCRIME (a program aiming at defining the optimal sharing of tasks between humans and computers in plant operation) and IMAGIN (research in the domain of plant information management, in order to ensure the global coherence of the image of the plant used by the different actors in plant operation). (authors). 3 refs., 4 figs

  1. Computers in construction

    DEFF Research Database (Denmark)

    Howard, Rob

    The evolution of technology, particularly computing in building, learning from the past in order to anticipate what may happen in the future...

  2. Cloud/Fog Computing System Architecture and Key Technologies for South-North Water Transfer Project Safety

    Directory of Open Access Journals (Sweden)

    Yaoling Fan

    2018-01-01

    Full Text Available In view of the real-time and distributed features of Internet of Things (IoT) safety systems in water conservancy engineering, this study proposed a new safety system architecture for water conservancy engineering based on cloud/fog computing and put forward a method of data reliability detection to suppress false alarms caused by spurious abnormal data from the bottom-level sensors. Designed for the South-North Water Transfer Project (SNWTP), the architecture integrated project safety, water quality safety, and human safety. Using IoT devices, a fog computing layer was constructed between the cloud server and the safety detection devices in water conservancy projects. Technologies such as real-time sensing, intelligent processing, and information interconnection were developed. Therefore, accurate forecasting, accurate positioning, and efficient management were implemented as required by safety prevention of the SNWTP, safety protection of water conservancy projects was effectively improved, and intelligent water conservancy engineering was advanced.
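
    A minimal sketch of the kind of data-reliability check described above (the median-based rule and tolerance are assumptions for illustration, not the method specified in the article): before an abnormal reading from a bottom-level sensor is allowed to raise an alarm, it is compared with the sensor's recent history and with neighbouring sensors, so that an isolated spurious value does not trigger a false alarm.

        from statistics import median

        def is_reliable(reading, history, neighbours, rel_tol=0.3):
            """Return True if an abnormal reading is corroborated by its context.

            history    : recent readings from the same sensor
            neighbours : simultaneous readings from nearby sensors
            rel_tol    : hypothetical relative-deviation tolerance
            """
            ref = median(list(history) + list(neighbours))
            if ref == 0:
                return abs(reading) < rel_tol
            return abs(reading - ref) / abs(ref) < rel_tol

        # An isolated spike is flagged as unreliable (suppressing the alarm),
        # while a value consistent with its context is accepted.
        print(is_reliable(9.8, history=[2.1, 2.0, 2.2], neighbours=[2.1, 1.9]))  # False
        print(is_reliable(2.3, history=[2.1, 2.0, 2.2], neighbours=[2.1, 1.9]))  # True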

  3. Projection-based curve clustering

    International Nuclear Information System (INIS)

    Auder, Benjamin; Fischer, Aurelie

    2012-01-01

    This paper focuses on unsupervised curve classification in the context of the nuclear industry. At the Commissariat a l'Energie Atomique (CEA), Cadarache (France), the thermal-hydraulic computer code CATHARE is used to study the reliability of reactor vessels. The code inputs are physical parameters and the outputs are time evolution curves of a few other physical quantities. As the CATHARE code is quite complex and CPU time-consuming, it has to be approximated by a regression model. This regression process involves a clustering step. In the present paper, the CATHARE output curves are clustered using a k-means scheme, with a projection onto a lower dimensional space. We study the properties of the empirically optimal cluster centres found by the clustering method based on projections, compared with the 'true' ones. The choice of the projection basis is discussed, and an algorithm is implemented to select the best projection basis among a library of orthonormal bases. The approach is illustrated on a simulated example and then applied to the industrial problem. (authors)
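
    A compact sketch of the projection-plus-k-means scheme (the basis here is simply taken from an SVD of the curve matrix for illustration; the paper instead selects the best basis from a library of orthonormal bases):

        import numpy as np

        def cluster_curves(curves, n_clusters, dim, n_iter=100, seed=0):
            """Project curves onto a dim-dimensional orthonormal basis, then run k-means."""
            rng = np.random.default_rng(seed)
            # Orthonormal basis of the curve space: leading right singular vectors.
            _, _, Vt = np.linalg.svd(curves - curves.mean(axis=0), full_matrices=False)
            coords = curves @ Vt[:dim].T                     # projection coefficients

            centres = coords[rng.choice(len(coords), n_clusters, replace=False)]
            for _ in range(n_iter):                          # Lloyd's iterations
                dist = np.linalg.norm(coords[:, None, :] - centres[None, :, :], axis=2)
                labels = dist.argmin(axis=1)
                centres = np.array([coords[labels == k].mean(axis=0) if np.any(labels == k)
                                    else centres[k] for k in range(n_clusters)])
            return labels, centres @ Vt[:dim]                # cluster centres as curves

        # Hypothetical usage on simulated code-output-like curves:
        t = np.linspace(0, 1, 200)
        curves = np.vstack([np.sin(2 * np.pi * (1 + k % 3) * t) + 0.05 * np.random.randn(200)
                            for k in range(90)])
        labels, centre_curves = cluster_curves(curves, n_clusters=3, dim=5)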

  4. Reservoir computer predictions for the Three Meter magnetic field time evolution

    Science.gov (United States)

    Perevalov, A.; Rojas, R.; Lathrop, D. P.; Shani, I.; Hunt, B. R.

    2017-12-01

    The source of the Earth's magnetic field is the turbulent flow of liquid metal in the outer core. Our experiment's goal is to create an Earth-like dynamo, to explore the mechanisms and to understand the dynamics of the magnetic and velocity fields. Since it is a complicated system, prediction of the magnetic field is a challenging problem. We present results of mimicking the Three Meter experiment by a reservoir computer deep learning algorithm. The experiment consists of a three-meter diameter outer sphere and a one-meter diameter inner sphere with the gap filled with liquid sodium. The spheres can rotate at up to 4 and 14 Hz respectively, giving a Reynolds number near 10^8. Two external electromagnets apply magnetic fields, while an array of 31 external and 2 internal Hall sensors measures the resulting induced fields. We use these magnetic probe data to train a reservoir computer to predict the 3M time evolution and mimic waves in the experiment. Surprisingly accurate predictions can be made for several magnetic dipole time scales. This shows that such a complicated MHD system's behavior can be predicted. We gratefully acknowledge support from NSF EAR-1417148.
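
    For readers unfamiliar with the technique, the sketch below is a generic minimal echo state network (a common form of reservoir computer) trained by ridge regression to predict the next sample of a signal. The reservoir size, spectral radius and regularisation are arbitrary illustrative choices, and the toy signal merely stands in for a magnetic-probe time series; none of this reflects the configuration used for the Three Meter data.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(4000) * 0.02
        signal = np.sin(t) + 0.5 * np.sin(2.3 * t)        # toy stand-in for probe data

        n_res, rho, ridge = 300, 0.9, 1e-6
        W_in = rng.uniform(-0.5, 0.5, n_res)
        W = rng.standard_normal((n_res, n_res))
        W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # set the spectral radius

        def run_reservoir(u):
            """Drive the reservoir with input sequence u and collect its states."""
            x = np.zeros(n_res)
            states = np.empty((len(u), n_res))
            for i, ui in enumerate(u):
                x = np.tanh(W @ x + W_in * ui)
                states[i] = x
            return states

        # Linear read-out mapping the state at time i to the signal at time i+1.
        X, Y = run_reservoir(signal[:-1]), signal[1:]
        W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
        pred = X @ W_out
        print("one-step prediction RMS error:", np.sqrt(np.mean((pred - Y) ** 2)))
        # Free-running prediction would feed the network's own output back as input.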

  5. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    International Nuclear Information System (INIS)

    Bogdanov, A.V.; Yuzhanin, N.V.; Zolotarev, V.I.; Ezhakova, T.R.

    2017-01-01

    In this article the problem of supporting scientific projects throughout their lifecycle in a computer center is considered in every aspect of support. The Configuration Management system plays a connecting role in processes related to the provision and support of services of a computer center. In view of the strong integration of IT infrastructure components through the use of virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  6. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
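
    To make the formulation concrete, here is a small sketch for a two-player bimatrix game: a mixed-strategy profile is encoded as a real vector, normalised onto the probability simplices, and the nonnegative objective is the sum of squared unilateral gains from deviating to any pure strategy, which vanishes exactly at a Nash equilibrium. SciPy's differential evolution serves as the global optimiser; the Matching Pennies payoffs are an illustrative choice, not an example from the paper.

        import numpy as np
        from scipy.optimize import differential_evolution

        A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # player 1 payoffs (Matching Pennies)
        B = -A                                      # player 2 payoffs (zero-sum)

        def nash_objective(v):
            """Sum of squared deviation gains; zero iff (p, q) is a Nash equilibrium."""
            p = np.abs(v[:2]) + 1e-12
            q = np.abs(v[2:]) + 1e-12
            p, q = p / p.sum(), q / q.sum()
            u1, u2 = p @ A @ q, p @ B @ q
            gain1 = np.maximum(A @ q - u1, 0.0)     # gains from pure-row deviations
            gain2 = np.maximum(p @ B - u2, 0.0)     # gains from pure-column deviations
            return np.sum(gain1**2) + np.sum(gain2**2)

        result = differential_evolution(nash_objective, bounds=[(0, 1)] * 4, seed=1, tol=1e-10)
        p = result.x[:2] / result.x[:2].sum()
        q = result.x[2:] / result.x[2:].sum()
        print("p* =", np.round(p, 3), " q* =", np.round(q, 3), " objective =", result.fun)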

  7. Time evolution of a quenched binary alloy: computer simulation of a three-dimensional model system

    International Nuclear Information System (INIS)

    Marro, J.; Bortz, A.B.; Kalos, M.H.; Lebowitz, J.L.; Sur, A.

    1976-01-01

    Results are presented of computer simulation of the time evolution for a model of a binary alloy, such as ZnAl, following quenching. The model system is a simple cubic lattice the sites of which are occupied either by A or B particles. There is a nearest neighbor interaction favoring segregation into an A rich and a B rich phase at low temperatures, T < T_c. Starting from a random configuration, T >> T_c, the system is quenched to and evolves at a temperature T < T_c. The evolution takes place through exchanges between A and B atoms on nearest neighbor sites. The probability of such an exchange is assumed proportional to e^(-βΔU) [1 + e^(-βΔU)]^(-1), where β = (k_B T)^(-1) and ΔU is the change in energy resulting from the exchange. In the simulations either a 30 x 30 x 30 or a 50 x 50 x 50 lattice is used with various fractions of the sites occupied by A particles. The evolution of the Fourier transform of the spherically averaged structure function S(k,t), the energy, and the cluster distribution were computed. Comparison is made with various theories of this process and with some experiments. It is found in particular that the results disagree with the predictions of the linearized Cahn-Hilliard theory of spinodal decomposition. The qualitative form of the results appears to be unaffected if the change in the positions of the atoms takes place via a vacancy mechanism rather than through direct exchanges
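
    The exchange rule quoted above translates directly into code. The sketch below implements nearest-neighbour A-B exchange dynamics on a small simple cubic lattice with acceptance probability e^(-βΔU)[1 + e^(-βΔU)]^(-1); the lattice size, coupling and temperature are illustrative and much smaller than the 30x30x30 and 50x50x50 systems of the study.

        import numpy as np

        rng = np.random.default_rng(0)
        L, J, beta = 12, 1.0, 1.5                 # lattice edge, coupling, inverse temperature
        # Random initial configuration (quench from T >> T_c): +1 = A atom, -1 = B atom.
        lattice = np.where(rng.random((L, L, L)) < 0.2, 1, -1)

        def local_energy(s, site):
            """Energy of one site with its six nearest neighbours (periodic boundaries)."""
            x, y, z = site
            nbrs = [((x + 1) % L, y, z), ((x - 1) % L, y, z),
                    (x, (y + 1) % L, z), (x, (y - 1) % L, z),
                    (x, y, (z + 1) % L), (x, y, (z - 1) % L)]
            return -J * sum(s[site] * s[n] for n in nbrs)

        def kawasaki_step(s):
            """Attempt one exchange between a random pair of nearest-neighbour sites."""
            site = tuple(rng.integers(0, L, 3))
            nbr = list(site)
            axis = rng.integers(0, 3)
            nbr[axis] = (nbr[axis] + 1) % L
            nbr = tuple(nbr)
            if s[site] == s[nbr]:
                return                             # identical atoms: nothing to exchange
            before = local_energy(s, site) + local_energy(s, nbr)
            s[site], s[nbr] = s[nbr], s[site]
            dU = local_energy(s, site) + local_energy(s, nbr) - before
            if rng.random() >= np.exp(-beta * dU) / (1 + np.exp(-beta * dU)):
                s[site], s[nbr] = s[nbr], s[site]  # reject the move: swap back

        for _ in range(200_000):
            kawasaki_step(lattice)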

  8. Fast computation of statistical uncertainty for spatiotemporal distributions estimated directly from dynamic cone beam SPECT projections

    International Nuclear Information System (INIS)

    Reutter, Bryan W.; Gullberg, Grant T.; Huesman, Ronald H.

    2001-01-01

    The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of

  9. Software and man-machine interface considerations for a nuclear plant computer replacement and upgrade project

    International Nuclear Information System (INIS)

    Diamond, G.; Robinson, E.

    1984-01-01

    Some of the key software functions and Man-Machine Interface considerations in a computer replacement and upgrade project for a nuclear power plant are described. The project involves the installation of two separate computer systems: an Emergency Response Facilities Computer System (ERFCS) and a Plant Process Computer System (PPCS). These systems employ state-of-the-art computer hardware and software. The ERFCS is a new system intended to provide enhanced functions to meet NRC post-TMI guidelines. The PPCS is intended to replace and upgrade an existing obsolete plant computer system. A general overview of the hardware and software aspects of the replacement and upgrade is presented. The work done to develop the upgraded Man-Machine Interface is described. For the ERFCS, a detailed discussion is presented of the work done to develop logic to evaluate the readiness and performance of safety systems and their supporting functions. The Man-Machine Interface considerations of reporting readiness and performance to the operator are discussed. Finally, the considerations involved in the implementation of this logic in real-time software are discussed. For the PPCS, a detailed discussion is presented of some new features

  10. Education as an Agent of Social Evolution: The Educational Projects of Patrick Geddes in Late-Victorian Scotland

    Science.gov (United States)

    Sutherland, Douglas

    2009-01-01

    This paper examines the educational projects of Patrick Geddes in late-Victorian Scotland. Initially a natural scientist, Geddes drew on an eclectic mix of social theory to develop his own ideas on social evolution. For him education was a vital agent of social change which, he believed, had the potential to develop active citizens whose…

  11. Summary of computational support and general documentation for computer code (GENTREE) used in Office of Nuclear Waste Isolation Pilot Salt Site Selection Project

    International Nuclear Information System (INIS)

    Beatty, J.A.; Younker, J.L.; Rousseau, W.F.; Elayat, H.A.

    1983-01-01

    A Decision Tree Computer Model was adapted for the purposes of a Pilot Salt Site Selection Project conducted by the Office of Nuclear Waste Isolation (ONWI). A deterministic computer model was developed to structure the site selection problem with submodels reflecting the five major outcome categories (Cost, Safety, Delay, Environment, Community Impact) to be evaluated in the decision process. Time-saving modifications were made in the tree code as part of the effort. In addition, format changes allowed retention of information items which are valuable in directing future research and in isolation of key variabilities in the Site Selection Decision Model. The deterministic code was linked to the modified tree code and the entire program was transferred to the ONWI-VAX computer for future use by the ONWI project

  12. High Performance Parallel Processing Project: Industrial computing initiative. Progress reports for fiscal year 1995

    Energy Technology Data Exchange (ETDEWEB)

    Koniges, A.

    1996-02-09

    This project is a package of 11 individual CRADAs plus hardware. This innovative project established a three-year multi-party collaboration that is significantly accelerating the availability of commercial massively parallel processing computing software technology to U.S. government, academic, and industrial end-users. This report contains individual presentations from nine principal investigators along with overall program information.

  13. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...

  14. Management evolution in the LSST project

    Science.gov (United States)

    Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industry. The public-private collaboration aims to complete the estimated $450 M observatory in the 2017 timeframe. During the design phase of the project, from 2003 to the present, the management structure has been remarkably stable. At the same time, the funding levels, staffing levels and scientific community participation have grown dramatically. The LSSTC has introduced project controls and tools required to manage the LSST's complex funding model, technical structure and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools for risk management, configuration control and resource-loaded scheduling have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each subsystem has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager who have overall authority.

  15. Development of computer assisted learning program using cone beam projection for head radiography

    International Nuclear Information System (INIS)

    Nakazeko, Kazuma; Araki, Misao; Kajiwara, Hironori; Watanabe, Hiroyuki; Kuwayama, Jun; Karube, Shuhei; Hashimoto, Takeyuki; Shinohara, Hiroyuki

    2012-01-01

    We present a computer assisted learning (CAL) program to simulate head radiography. The program provides cone beam projections of a target volume, simulating three-dimensional computed tomography (CT) of a head phantom. The generated image is 512 x 512 x 512 pixels with each pixel 0.6 mm on a side. The imaging geometry, such as X-ray tube orientation and phantom orientation, can be varied. The graphical user interface (GUI) of the CAL program allows the study of the effects of varying the imaging geometry; each simulated projection image is shown quickly in an adjoining window. Simulated images with an assigned geometry were compared with the image obtained using the standard geometry in clinical use. The accuracy of the simulated image was verified through comparison with the image acquired using radiography of the head phantom, subsequently processed with a computed radiography system (CR image). Based on correlation coefficient analysis and visual assessment, it was concluded that the CAL program can satisfactorily simulate the CR image. Therefore, it should be useful for the training of head radiography. (author)
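
    The CAL program itself uses a cone-beam geometry and a 512-cubed CT volume; the following much-simplified Python sketch, with an invented phantom and parameters, only illustrates the general idea of generating a projection image of a voxel volume at a variable viewing angle, using a parallel-beam approximation (rotate the volume, then sum along one axis) rather than cone-beam projection.

      # Much-simplified parallel-beam stand-in for the cone-beam projections described
      # above: rotate a small spherical "head" phantom about its z-axis and integrate
      # along one axis to obtain a projection image. Phantom and angles are invented.
      import numpy as np
      from scipy import ndimage

      n = 64
      z, y, x = np.mgrid[:n, :n, :n]
      r = np.sqrt((x - n / 2) ** 2 + (y - n / 2) ** 2 + (z - n / 2) ** 2)
      phantom = np.where(r < 28, 1.0, 0.0) + np.where((r > 24) & (r < 28), 2.0, 0.0)

      def parallel_projection(volume, angle_deg):
          """Rotate the volume about its z-axis and sum along y (a ray-sum image)."""
          rotated = ndimage.rotate(volume, angle_deg, axes=(1, 2), reshape=False, order=1)
          return rotated.sum(axis=1)

      for angle in (0.0, 30.0, 90.0):
          proj = parallel_projection(phantom, angle)
          print(f"angle {angle:5.1f} deg -> projection {proj.shape}, max ray-sum {proj.max():.1f}")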

  16. Design Principles for "Thriving in Our Digital World": A High School Computer Science Course

    Science.gov (United States)

    Veletsianos, George; Beth, Bradley; Lin, Calvin; Russell, Gregory

    2016-01-01

    "Thriving in Our Digital World" is a technology-enhanced dual enrollment course introducing high school students to computer science through project- and problem-based learning. This article describes the evolution of the course and five lessons learned during the design, development, implementation, and iteration of the course from its…

  17. Evolution of Safeguards over Time: Past, Present, and Projected Facilities, Material, and Budget

    Energy Technology Data Exchange (ETDEWEB)

    Kollar, Lenka; Mathews, Caroline E.

    2009-07-01

    This study examines the past trends and evolution of safeguards over time and projects growth through 2030. The report documents the amount of nuclear material and facilities under safeguards from 1970 until present, along with the corresponding budget. Estimates for the future amount of facilities and material under safeguards are made according to non-nuclear-weapons states’ (NNWS) plans to build more nuclear capacity and sustain current nuclear infrastructure. Since nuclear energy is seen as a clean and economic option for base load electric power, many countries are seeking to either expand their current nuclear infrastructure, or introduce nuclear power. In order to feed new nuclear power plants and sustain existing ones, more nuclear facilities will need to be built, and thus more nuclear material will be introduced into the safeguards system. The projections in this study conclude that a zero real growth scenario for the IAEA safeguards budget will result in large resource gaps in the near future.

  18. Evolution of Safeguards over Time: Past, Present, and Projected Facilities, Material, and Budget

    International Nuclear Information System (INIS)

    Kollar, Lenka; Mathews, Caroline E.

    2009-01-01

    This study examines the past trends and evolution of safeguards over time and projects growth through 2030. The report documents the amount of nuclear material and facilities under safeguards from 1970 until present, along with the corresponding budget. Estimates for the future amount of facilities and material under safeguards are made according to non-nuclear-weapons states (NNWS) plans to build more nuclear capacity and sustain current nuclear infrastructure. Since nuclear energy is seen as a clean and economic option for base load electric power, many countries are seeking to either expand their current nuclear infrastructure, or introduce nuclear power. In order to feed new nuclear power plants and sustain existing ones, more nuclear facilities will need to be built, and thus more nuclear material will be introduced into the safeguards system. The projections in this study conclude that a zero real growth scenario for the IAEA safeguards budget will result in large resource gaps in the near future.

  19. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    Science.gov (United States)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    In this article, the problem of supporting scientific projects throughout their lifecycle in the computer center is considered in every aspect of support. The Configuration Management system plays a connecting role in the processes related to the provision and support of the services of a computer center. In view of the strong integration of IT infrastructure components with the use of virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  20. Resource-constrained project scheduling: computing lower bounds by solving minimum cut problems

    NARCIS (Netherlands)

    Möhring, R.H.; Nesetril, J.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    1999-01-01

    We present a novel approach to compute Lagrangian lower bounds on the objective function value of a wide class of resource-constrained project scheduling problems. The basis is a polynomial-time algorithm to solve the following scheduling problem: Given a set of activities with start-time dependent

  1. Glaciation and geosphere evolution - Greenland Analogue Project

    International Nuclear Information System (INIS)

    Hirschorn, S.; Vorauer, A.; Belfadhel, M.B.; Jensen, M.

    2011-01-01

    permafrost occurrence, amongst other attributes; Evolution of deep groundwater systems and impacts of Coupled Thermo-Hydro-Mechanical effects imposed by glacial cycles; Impacts of climate change on redox stability using both numerical simulations and paleohydrogeological investigations; and Potential for seismicity and faulting induced by glacial rebound. This paper presents an overview of studies underway as part of the Greenland Analogue Project (GAP) to evaluate the impact of an ice sheet on groundwater chemistry at repository depth using the Greenland Ice Sheet as an analogue to future glaciations in North America. The study of the Greenland Ice Sheet will allow us to increase our understanding of hydrological, hydrogeological and geochemical processes during glacial conditions. (author)

  2. Evolution of Cloud Computing and Enabling Technologies

    OpenAIRE

    Rabi Prasad Padhy; Manas Ranjan Patra

    2012-01-01

    We present an overview of the history of forecasting software over the past 25 years, concentrating especially on the interaction between computing and technologies, from mainframe computing to cloud computing, the latest of these. To deliver the vision of these various computing models, this paper briefly explains the architecture, characteristics, advantages, applications and issues of computing models such as PC computing and internet computing, and related technologie...

  3. PREPS2 - a PC-based computer program for performing economic analysis of capital projects

    International Nuclear Information System (INIS)

    Blake, M.W.; Brand, D.O.; Chastain, E.T.; Johnson, E.D.

    1990-01-01

    In these times of increased spending to finance new capacity and to meet clean air act legislation, many electric utilities are giving a high priority to controlling capital expenditures at existing generating facilities. Determining the level of capital expenditures which are economically justified is very difficult; units which have higher capacity factors are worth more to the utility. Therefore, the utility can more readily justify higher capital expenditures to improve or maintain reliability and heat rate than on units with lower capacity factors. This paper describes a PC-based computer program (PREPS2) which performs an economic analysis of individual capital projects. The program incorporates tables which describe the worth to the system of making improvements in each unit. This computer program is currently being used by the six Southern Company operating companies to evaluate all production capital projects over $50,000. Approximately 500 projects representing about $300 million are being analyzed each year
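
    PREPS2's system-worth tables are not reproduced here; as a hedged illustration of the kind of screening such a program performs, the following Python sketch discounts a capital project's yearly benefits and reports net present value, benefit-cost ratio and simple payback, with all figures and the discount rate invented.

      # Illustrative project-economics screening: discount yearly benefits, compare
      # with the installed cost, and report NPV, benefit/cost ratio and simple payback.
      # All numbers are invented and do not come from PREPS2.
      capital_cost = 250_000.0                 # $ spent in year 0
      annual_benefits = [45_000.0] * 10        # $ per year of reliability/heat-rate value
      discount_rate = 0.08

      present_value = sum(b / (1.0 + discount_rate) ** (year + 1)
                          for year, b in enumerate(annual_benefits))
      npv = present_value - capital_cost
      bc_ratio = present_value / capital_cost

      cumulative, payback = 0.0, None          # simple (undiscounted) payback in years
      for year, b in enumerate(annual_benefits, start=1):
          cumulative += b
          if payback is None and cumulative >= capital_cost:
              payback = year

      print(f"NPV            : ${npv:,.0f}")
      print(f"Benefit/cost   : {bc_ratio:.2f}")
      print(f"Simple payback : {payback} years")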

  4. eCodonOpt: a systematic computational framework for optimizing codon usage in directed evolution experiments

    OpenAIRE

    Moore, Gregory L.; Maranas, Costas D.

    2002-01-01

    We present a systematic computational framework, eCodonOpt, for designing parental DNA sequences for directed evolution experiments through codon usage optimization. Given a set of homologous parental proteins to be recombined at the DNA level, the optimal DNA sequences encoding these proteins are sought for a given diversity objective. We find that the free energy of annealing between the recombining DNA sequences is a much better descriptor of the extent of crossover formation than sequence...

  5. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.

  6. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  7. ATLAS distributed computing: experience and evolution

    International Nuclear Information System (INIS)

    Nairz, A

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb-1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, energies and event complexities. An essential requirement will be the efficient utilisation of current and future processor technologies as well as a broad range of computing platforms, including supercomputing and cloud resources. We will report on experience gained thus far and our progress in preparing ATLAS computing for the future

  8. Reconstruction of computed tomographic image from a few x-ray projections by means of accelerative gradient method

    International Nuclear Information System (INIS)

    Kobayashi, Fujio; Yamaguchi, Shoichiro

    1982-01-01

    A method for the reconstruction of computed tomographic images was proposed to reduce the X-ray exposure dose. The method reconstructs from a small number of X-ray projections by means of an accelerative gradient method. The procedures of computation are described. The algorithm of these procedures is simple, the convergence of the computation is fast, and the required memory capacity is small. Numerical simulation was carried out to confirm the validity of this method. A sample of simple shape was considered, projection data were given, and the images were reconstructed from 6 views. Good results were obtained, and the method is considered to be useful. (Kato, T.)
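
    The paper's accelerative gradient scheme and six-view geometry are not reproduced here; the following toy Python sketch, with an invented 8 x 8 object and only four ray directions, merely illustrates the principle of reconstructing an image from a few projections by a plain gradient method applied to the least-squares data-fit term.

      # Toy few-view reconstruction by (plain, not accelerated) gradient descent on
      # 0.5*||Ax - b||^2, where the rows of A are row, column and diagonal ray sums.
      # With so few views the system is underdetermined, so the fit reproduces the
      # measured projections even though the image may not match the object exactly.
      import numpy as np

      n = 8
      true_image = np.zeros((n, n))
      true_image[2:6, 3:7] = 1.0                      # a simple rectangular object

      def ray_matrix(n):
          """Rows are row-sums, column-sums and both diagonal-direction sums."""
          rows, idx = [], np.arange(n * n).reshape(n, n)
          for i in range(n):
              for sel in (idx[i, :], idx[:, i]):
                  r = np.zeros(n * n); r[sel] = 1.0; rows.append(r)
          for offset in range(-n + 1, n):
              for d in (np.diagonal(idx, offset), np.diagonal(np.fliplr(idx), offset)):
                  r = np.zeros(n * n); r[d] = 1.0; rows.append(r)
          return np.array(rows)

      A = ray_matrix(n)
      b = A @ true_image.ravel()                       # noise-free ray sums
      x = np.zeros(n * n)
      step = 1.0 / np.linalg.norm(A, 2) ** 2           # safe step for least squares
      for _ in range(2000):
          x -= step * (A.T @ (A @ x - b))              # gradient step

      print("projection residual:", np.linalg.norm(A @ x - b))
      print("max image error    :", np.abs(x.reshape(n, n) - true_image).max())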

  9. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

    Characterization - We extended our previous work on studying the time evolution of patterns associated with phase separation in conserved concentration fields. (6) Probabilistic Homology Validation - work on microstructure characterization is based on numerically studying the homology of certain sublevel sets of a function, whose evolution is described by deterministic or stochastic evolution equations. (7) Computational Homology and Dynamics - Topological methods can be used to rigorously describe the dynamics of nonlinear systems. We are approaching this problem from several perspectives and through a variety of systems. (8) Stress Networks in Polycrystals - we have characterized stress networks in polycrystals. This part of the project is aimed at developing homological metrics which can aid in distinguishing not only microstructures, but also derived mechanical response fields. (9) Microstructure-Controlled Drug Release - This part of the project is concerned with the development of topological metrics in the context of controlled drug delivery systems, such as drug-eluting stents. We are particularly interested in developing metrics which can be used to link the processing stage to the resulting microstructure, and ultimately to the achieved system response in terms of drug release profiles. (10) Microstructure of Fuel Cells - we have been using our computational homology software to analyze the topological structure of the void, metal and ceramic components of a Solid Oxide Fuel Cell.

  10. Multiscale analysis of nonlinear systems using computational homology

    Energy Technology Data Exchange (ETDEWEB)

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology, Michael Schatz, Georgia Institute of Technology, William Kalies, Florida Atlantic University, Thomas Wanner,George Mason University

    2010-05-19

    Characterization - We extended our previous work on studying the time evolution of patterns associated with phase separation in conserved concentration fields. (6) Probabilistic Homology Validation - work on microstructure characterization is based on numerically studying the homology of certain sublevel sets of a function, whose evolution is described by deterministic or stochastic evolution equations. (7) Computational Homology and Dynamics - Topological methods can be used to rigorously describe the dynamics of nonlinear systems. We are approaching this problem from several perspectives and through a variety of systems. (8) Stress Networks in Polycrystals - we have characterized stress networks in polycrystals. This part of the project is aimed at developing homological metrics which can aid in distinguishing not only microstructures, but also derived mechanical response fields. (9) Microstructure-Controlled Drug Release - This part of the project is concerned with the development of topological metrics in the context of controlled drug delivery systems, such as drug-eluting stents. We are particularly interested in developing metrics which can be used to link the processing stage to the resulting microstructure, and ultimately to the achieved system response in terms of drug release profiles. (10) Microstructure of Fuel Cells - we have been using our computational homology software to analyze the topological structure of the void, metal and ceramic components of a Solid Oxide Fuel Cell.

  11. The evolution of the project management

    Directory of Open Access Journals (Sweden)

    Catalin Drob

    2009-12-01

    Full Text Available Project management appeared and developed on the basis of scientific management theory during the 1950s and 1960s. Since the 1990s, we can say that project management has truly become an independent discipline, one that has a huge impact on the success or failure of companies engaged in major projects.

  12. The Variation Theorem Applied to H-2+: A Simple Quantum Chemistry Computer Project

    Science.gov (United States)

    Robiette, Alan G.

    1975-01-01

    Describes a student project which requires limited knowledge of Fortran and only minimal computing resources. The results illustrate such important principles of quantum mechanics as the variation theorem and the virial theorem. Presents sample calculations and the subprogram for energy calculations. (GS)
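
    The original project used Fortran and treated H2+; as a simpler, hedged analogue of the same principle, the following Python sketch applies the variation theorem to the hydrogen atom with a single Gaussian trial function, for which the optimal variational energy is known to lie above the exact ground-state value of -0.5 hartree.

      # Variation theorem for the hydrogen atom (atomic units) with the trial function
      # psi(r) ~ exp(-a r^2): E(a) = 3a/2 - 2*sqrt(2a/pi). The minimum, about
      # -0.424 hartree at a = 8/(9*pi), lies above the exact -0.5 hartree, as the
      # variation theorem guarantees. This is an analogue, not the H2+ calculation.
      import numpy as np
      from scipy.optimize import minimize_scalar

      def trial_energy(a):
          return 1.5 * a - 2.0 * np.sqrt(2.0 * a / np.pi)

      result = minimize_scalar(trial_energy, bounds=(1e-3, 5.0), method="bounded")
      a_opt = result.x
      print(f"optimal exponent a       : {a_opt:.4f} (analytic 8/(9*pi) = {8 / (9 * np.pi):.4f})")
      print(f"variational energy       : {trial_energy(a_opt):.4f} hartree")
      print("exact ground-state energy: -0.5000 hartree (the estimate lies above it)")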

  13. Evolutionary computation in zoology and ecology.

    Science.gov (United States)

    Boone, Randall B

    2017-12-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolutionary strategies. In evolutionary computation, a population is represented in a way that allows for an objective function to be assessed that is relevant to the problem of interest. The poorest performing members are removed from the population, and remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing: egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and a review of the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
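
    The cycle described above (score the population, remove the poorest members, let survivors reproduce with mutation, repeat until a stopping condition) can be made concrete with a minimal sketch; the objective function and all parameters in the following Python example are invented purely for illustration.

      # Minimal genetic-algorithm sketch of the evaluate/cull/reproduce/mutate cycle.
      # The objective (maximize -sum(x^2), best value 0 at x = 0) is invented.
      import random

      GENES, POP, KEEP, GENERATIONS, MUT_STD = 5, 40, 10, 100, 0.2

      def fitness(ind):
          return -sum(g * g for g in ind)

      population = [[random.uniform(-3, 3) for _ in range(GENES)] for _ in range(POP)]
      for _ in range(GENERATIONS):
          population.sort(key=fitness, reverse=True)
          survivors = population[:KEEP]                 # drop the poorest performers
          offspring = []
          while len(survivors) + len(offspring) < POP:
              mum, dad = random.sample(survivors, 2)
              cut = random.randrange(1, GENES)          # one-point crossover
              child = [g + random.gauss(0, MUT_STD) for g in mum[:cut] + dad[cut:]]
              offspring.append(child)                   # crossover plus mutation
          population = survivors + offspring

      best = max(population, key=fitness)
      print(f"best fitness after {GENERATIONS} generations: {fitness(best):.4f}")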

  14. The harmonic distortion evolution of current in computers; A evolucao da distorcao harmonica de corrente em computadores

    Energy Technology Data Exchange (ETDEWEB)

    Bollen, Math; Larsson, Anders; Lundmark, Martin [Universidade de Tecnologia de Lulea (LTU) (Sweden); Wahlberg, Mats; Roennberg, Sarah [Skelleftea Kraft (Sweden)

    2010-05-15

    This project performed supply measurements on large groups of computers during gaming events between 2002 and 2008, including the magnitude of the current in each phase and in the neutral conductor, the energy consumption and the harmonic spectrum. The results presented show that the harmonic distortion has diminished significantly, while the energy consumption per computer shows no important increase.

  15. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1993-94. OER Report.

    Science.gov (United States)

    Greene, Judy

    Students Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its fourth year of operation. The project operated at two high schools in Brooklyn and one in Manhattan (New York). In the 1993-94 school year, the project served 393 students of…

  16. ATLAS Distributed Computing: Experience and Evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2013-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb-1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centers around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics program including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2014 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  17. ATLAS distributed computing: experience and evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb-1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  18. MBEToolbox: a Matlab toolbox for sequence data analysis in molecular biology and evolution

    Directory of Open Access Journals (Sweden)

    Xia Xuhua

    2005-03-01

    Full Text Available Abstract Background MATLAB is a high-performance language for technical computing, integrating computation, visualization, and programming in an easy-to-use environment. It has been widely used in many areas, such as mathematics and computation, algorithm development, data acquisition, modeling, simulation, and scientific and engineering graphics. However, few functions are freely available in MATLAB to perform the sequence data analyses specifically required for molecular biology and evolution. Results We have developed a MATLAB toolbox, called MBEToolbox, aimed at filling this gap by offering efficient implementations of the most needed functions in molecular biology and evolution. It can be used to manipulate aligned sequences, calculate evolutionary distances, estimate synonymous and nonsynonymous substitution rates, and infer phylogenetic trees. Moreover, it provides an extensible, functional framework for users with more specialized requirements to explore and analyze aligned nucleotide or protein sequences from an evolutionary perspective. The full functions in the toolbox are accessible through the command-line for seasoned MATLAB users. A graphical user interface, that may be especially useful for non-specialist end users, is also provided. Conclusion MBEToolbox is a useful tool that can aid in the exploration, interpretation and visualization of data in molecular biology and evolution. The software is publicly available at http://web.hku.hk/~jamescai/mbetoolbox/ and http://bioinformatics.org/project/?group_id=454.
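
    MBEToolbox itself is a MATLAB package; as a language-neutral illustration of one of the evolutionary-distance calculations such a toolbox offers, the following Python sketch applies the Jukes-Cantor (JC69) correction, d = -(3/4) ln(1 - 4p/3), to a pair of aligned nucleotide sequences (the sequences are invented).

      # Jukes-Cantor (JC69) evolutionary distance between two aligned sequences,
      # where p is the observed proportion of differing (unambiguous) sites.
      import math

      def jc69_distance(seq1, seq2):
          if len(seq1) != len(seq2):
              raise ValueError("sequences must be aligned to equal length")
          pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
                   if a in "ACGT" and b in "ACGT"]      # skip gaps and ambiguities
          p = sum(a != b for a, b in pairs) / len(pairs)
          if p >= 0.75:
              raise ValueError("sequences too divergent for the JC69 correction")
          return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

      print(jc69_distance("ACGTACGTACGTACGT", "ACGTACGAACGTTCGT"))   # invented pair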

  19. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2017-11-22

    An intensive R&D and programming effort is required to meet the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting in parallel particles in complex geometries exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics model effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
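
    The actual GeantV/Geant4 test suite and data are not reproduced here; the following Python sketch only illustrates the general form of such an automated consistency check, applying a two-sample Kolmogorov-Smirnov test to invented toy distributions that stand in for the reference and vectorized model outputs.

      # Generic consistency check between a "reference" and a "vectorized" simulation:
      # draw samples from each and test whether they are statistically compatible.
      # Distributions, sample sizes and the 1% threshold are invented.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      reference_sample = rng.exponential(scale=1.0, size=20_000)    # e.g. energy deposits
      vectorized_sample = rng.exponential(scale=1.0, size=20_000)   # same physics, new code

      statistic, p_value = stats.ks_2samp(reference_sample, vectorized_sample)
      print(f"KS statistic = {statistic:.4f}, p-value = {p_value:.3f}")
      if p_value < 0.01:
          print("FLAG: samples inconsistent at the 1% level -> investigate the port")
      else:
          print("no significant discrepancy between the two implementations")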

  20. Expanding the Understanding of Evolution

    Science.gov (United States)

    Musante, Susan

    2011-01-01

    Originally designed for K-12 teachers, the Understanding Evolution (UE) Web site ("www.understandingevolution.org") is a one-stop shop for all of a teacher's evolution education needs, with lesson plans, teaching tips, lists of common evolution misconceptions, and much more. However, during the past five years, the UE project team learned that…

  1. Evolution of the ATLAS Distributed Computing during the LHC long shutdown

    CERN Document Server

    Campana, S; The ATLAS collaboration

    2013-01-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the WLCG distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileu...

  2. Revealing fatigue damage evolution in unidirectional composites for wind turbine blades using x-ray computed tomography

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    Thereby, it will be possible to lower the cost of energy for wind-energy-based electricity. In the presented work, lab-source x-ray computed tomography equipment (Zeiss Xradia 520 Versa) has been used in connection with ex-situ fatigue testing of uni-directional composites in order to identify fibre...... to other comparable x-ray studies) have been used in order to ensure a representative test volume during the ex-situ fatigue testing. Using the ability of the x-ray computed tomography to zoom into regions of interest, non-destructively, the fatigue damage evolution in a repeating ex-situ fatigue loaded test...... improving the fatigue resistance of non-crimp fabric used in the wind turbine industry can be made....

  3. Summary of papers on predicting aggregated-scale coastal evolution

    NARCIS (Netherlands)

    Hulscher, Suzanne J.M.H.

    2003-01-01

    Coastal evolution poses many questions, to coastal engineers as well as to scientists. The project PACE (Predicting Aggregated-Scale Coastal Evolution) is a successful project in which the two meet. This paper puts the overview papers of the project into perspective and highlights the results.

  4. Evolution of the U.S. energy service company industry: Market size and project performance from 1990–2008

    International Nuclear Information System (INIS)

    Larsen, Peter H.; Goldman, Charles A.; Satchwell, Andrew

    2012-01-01

    The U.S. energy service company (ESCO) industry is an example of a private sector business model where energy savings are delivered to customers primarily through the use of performance-based contracts. This study was conceived as a snapshot of the ESCO industry prior to the economic slowdown and the introduction of federal stimulus funding mandated by enactment of the American Recovery and Reinvestment Act of 2009 (ARRA). This study utilizes two parallel analytic approaches to characterize ESCO industry and market trends in the U.S.: (1) a “top-down” approach involving a survey of individual ESCOs to estimate aggregate industry activity and (2) a “bottom-up” analysis of a database of ∼3250 projects (representing over $8B in project investment) that reports market trends including installed EE retrofit strategies, project installation costs and savings, project payback times, and benefit-cost ratios over time. Despite the onset of a severe economic recession, the U.S. ESCO industry managed to grow at about 7% per year between 2006 and 2008. ESCO industry revenues were about $4.1 billion in 2008 and ESCOs anticipate accelerated growth through 2011 (25% per year). We found that 2484 ESCO projects in our database generated ∼$4.0 billion ($2009) in net, direct economic benefits to their customers. We estimate that the ESCO project database includes about 20% of all U.S. ESCO market activity from 1990–2008. Assuming the net benefits per project are comparable for ESCO projects that are not included in the LBNL database, this would suggest that the ESCO industry has generated ∼$23 billion in net direct economic benefits for customers at projects installed between 1990 and 2008. There is empirical evidence confirming that the industry is evolving by installing more comprehensive and complex measures—including onsite generation and measures to address deferred maintenance—but this evolution has significant implications for customer project economics

  5. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
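
    The toolkit described above is R-based and targets the site-stratified Cox model and singular value decomposition; the following Python sketch, with invented data, only illustrates the underlying principle for a likelihood-based model (here logistic regression): the global gradient is the sum of per-site gradients, so only intermediate results, never raw patient records, need to be exchanged.

      # Distributed maximum-likelihood fitting without pooling raw data: each site
      # computes the gradient of its local log-likelihood and transmits only that.
      # Data, site sizes and the step size are invented.
      import numpy as np

      rng = np.random.default_rng(1)
      true_beta = np.array([0.8, -1.2, 0.5])

      def make_site(n):
          """Invent one site's private data set (never shared in this scheme)."""
          X = rng.normal(size=(n, 3))
          y = rng.random(n) < 1.0 / (1.0 + np.exp(-X @ true_beta))
          return X, y.astype(float)

      sites = [make_site(n) for n in (400, 250, 600)]       # e.g. three hospitals

      def local_gradient(X, y, beta):
          """Gradient of the local log-likelihood; the only quantity a site sends."""
          p = 1.0 / (1.0 + np.exp(-X @ beta))
          return X.T @ (y - p)

      beta = np.zeros(3)
      for _ in range(500):                                   # simple distributed ascent
          beta += 0.001 * sum(local_gradient(X, y, beta) for X, y in sites)

      print("estimated coefficients:", np.round(beta, 2), " true:", true_beta)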

  6. Scientific computing and algorithms in industrial simulations projects and products of Fraunhofer SCAI

    CERN Document Server

    Schüller, Anton; Schweitzer, Marc

    2017-01-01

    The contributions gathered here provide an overview of current research projects and selected software products of the Fraunhofer Institute for Algorithms and Scientific Computing SCAI. They show the wide range of challenges that scientific computing currently faces, the solutions it offers, and its important role in developing applications for industry. Given the exciting field of applied collaborative research and development it discusses, the book will appeal to scientists, practitioners, and students alike. The Fraunhofer Institute for Algorithms and Scientific Computing SCAI combines excellent research and application-oriented development to provide added value for our partners. SCAI develops numerical techniques, parallel algorithms and specialized software tools to support and optimize industrial simulations. Moreover, it implements custom software solutions for production and logistics, and offers calculations on high-performance computers. Its services and products are based on state-of-the-art metho...

  7. US QCD computational performance studies with PERI

    International Nuclear Information System (INIS)

    Zhang, Y; Fowler, R; Huck, K; Malony, A; Porterfield, A; Reed, D; Shende, S; Taylor, V; Wu, X

    2007-01-01

    We report on some of the interactions between two SciDAC projects: The National Computational Infrastructure for Lattice Gauge Theory (USQCD), and the Performance Engineering Research Institute (PERI). Many modern scientific programs consistently report the need for faster computational resources to maintain global competitiveness. However, as the size and complexity of emerging high end computing (HEC) systems continue to rise, achieving good performance on such systems is becoming ever more challenging. In order to take full advantage of the resources, it is crucial to understand the characteristics of relevant scientific applications and the systems these applications are running on. Using tools developed under PERI and by other performance measurement researchers, we studied the performance of two applications, MILC and Chroma, on several high performance computing systems at DOE laboratories. In the case of Chroma, we discuss how the use of C++ and modern software engineering and programming methods are driving the evolution of performance tools

  8. Computing for magnetic fusion energy research: An updated vision

    International Nuclear Information System (INIS)

    Henline, P.; Giarrusso, J.; Davis, S.; Casper, T.

    1993-01-01

    This Fusion Computing Council perspective is written to present the primary concerns of the fusion computing community at the time of publication of the report, necessarily as a summary of the information contained in the individual sections. These concerns reflect FCC discussions during final review of contributions from the various working groups and portray our latest information. This report itself should be considered as dynamic, requiring periodic updating in an attempt to track the rapid evolution of the computer industry relevant to requirements for magnetic fusion research. The most significant common concern among the Fusion Computing Council working groups is networking capability. All groups see an increasing need for network services due to the use of workstations, distributed computing environments, increased use of graphic services, X-window usage, remote experimental collaborations, remote data access for specific projects and other collaborations. Other areas of concern include support for workstations, enhanced infrastructure to support collaborations, the User Service Centers, NERSC and future massively parallel computers, and FCC-sponsored workshops

  9. Gradient optimization of finite projected entangled pair states

    Science.gov (United States)

    Liu, Wen-Yuan; Dong, Shao-Jun; Han, Yong-Jian; Guo, Guang-Can; He, Lixin

    2017-05-01

    Projected entangled pair states (PEPS) methods have been proven to be powerful tools to solve strongly correlated quantum many-body problems in two dimensions. However, due to the high computational scaling with the virtual bond dimension D , in a practical application, PEPS are often limited to rather small bond dimensions, which may not be large enough for some highly entangled systems, for instance, frustrated systems. Optimization of the ground state using the imaginary time evolution method with a simple update scheme may go to a larger bond dimension. However, the accuracy of the rough approximation to the environment of the local tensors is questionable. Here, we demonstrate that by combining the imaginary time evolution method with a simple update, Monte Carlo sampling techniques and gradient optimization will offer an efficient method to calculate the PEPS ground state. By taking advantage of massive parallel computing, we can study quantum systems with larger bond dimensions up to D =10 without resorting to any symmetry. Benchmark tests of the method on the J1-J2 model give impressive accuracy compared with exact results.

  10. FPGAs in High Perfomance Computing: Results from Two LDRD Projects.

    Energy Technology Data Exchange (ETDEWEB)

    Underwood, Keith D; Ulmer, Craig D.; Thompson, David; Hemmert, Karl Scott

    2006-11-01

    Field programmable gate arrays (FPGAs) have been used as alternative computational devices for over a decade; however, they have not been used for traditional scientific computing due to their perceived lack of floating-point performance. In recent years, there has been a surge of interest in alternatives to traditional microprocessors for high performance computing. Sandia National Labs began two projects to determine whether FPGAs would be a suitable alternative to microprocessors for high performance scientific computing and, if so, how they should be integrated into the system. We present results that indicate that FPGAs could have a significant impact on future systems. FPGAs have the potential to have order-of-magnitude performance wins on several key algorithms; however, there are serious questions as to whether the system integration challenge can be met. Furthermore, there remain challenges in FPGA programming and system-level reliability when using FPGA devices. Acknowledgment: Arun Rodrigues provided valuable support and assistance in the use of the Structural Simulation Toolkit within an FPGA context. Curtis Janssen and Steve Plimpton provided valuable insights into the workings of two Sandia applications (MPQC and LAMMPS, respectively).

  11. Constrained evolution in numerical relativity

    Science.gov (United States)

    Anderson, Matthew William

    The strongest potential source of gravitational radiation for current and future detectors is the merger of binary black holes. Full numerical simulation of such mergers can provide realistic signal predictions and enhance the probability of detection. Numerical simulation of the Einstein equations, however, is fraught with difficulty. Stability even in static test cases of single black holes has proven elusive. Common to unstable simulations is the growth of constraint violations. This work examines the effect of controlling the growth of constraint violations by solving the constraints periodically during a simulation, an approach called constrained evolution. The effects of constrained evolution are contrasted with the results of unconstrained evolution, evolution where the constraints are not solved during the course of a simulation. Two different formulations of the Einstein equations are examined: the standard ADM formulation and the generalized Frittelli-Reula formulation. In most cases constrained evolution vastly improves the stability of a simulation at minimal computational cost when compared with unconstrained evolution. However, in the more demanding test cases examined, constrained evolution fails to produce simulations with long-term stability in spite of producing improvements in simulation lifetime when compared with unconstrained evolution. Constrained evolution is also examined in conjunction with a wide variety of promising numerical techniques, including mesh refinement and overlapping Cartesian and spherical computational grids. Constrained evolution in boosted black hole spacetimes is investigated using overlapping grids. Constrained evolution proves to be central to the host of innovations required in carrying out such intensive simulations.
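
    As a hedged toy analogue of constrained versus unconstrained evolution (an ordinary differential equation, not the Einstein equations), the following Python sketch integrates circular motion with forward Euler, whose numerical error makes the constraint x^2 + y^2 = 1 drift, and shows how periodically re-solving the constraint (projecting the state back onto the constraint surface) keeps the violation small at little extra cost.

      # Toy constrained vs. unconstrained evolution: x' = -y, y' = x conserves
      # r = sqrt(x^2 + y^2), but forward Euler lets r grow. The constrained run
      # periodically projects the state back onto r = 1.
      import math

      def evolve(steps, dt, enforce_every=None):
          x, y = 1.0, 0.0
          for n in range(1, steps + 1):
              x, y = x - dt * y, y + dt * x                 # forward Euler step
              if enforce_every and n % enforce_every == 0:  # "solve the constraint"
                  r = math.hypot(x, y)
                  x, y = x / r, y / r
          return math.hypot(x, y) - 1.0                     # final constraint violation

      dt, steps = 1e-3, 200_000
      print(f"unconstrained violation: {evolve(steps, dt):.3e}")
      print(f"constrained violation  : {evolve(steps, dt, enforce_every=1_000):.3e}")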

  12. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group have achieved significant progress towards the goal, producing a set of novel techniques for improving the memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.

  13. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    Science.gov (United States)

    Baun, Christian

    2016-01-01

    Clusters usually consist of servers, workstations or personal computers as nodes. But especially for academic purposes like student projects or scientific projects, the cost of purchase and operation can be a challenge. Single board computers cannot compete with the performance or energy-efficiency of higher-value systems, but they are an option for building inexpensive cluster systems. Because of their compact design and modest energy consumption, it is possible to build clusters of single board computers in a way that makes them mobile and easily transported by the users. This paper describes the construction of such a cluster, useful applications and the performance of the single nodes. Furthermore, the cluster's performance and energy-efficiency are analyzed by executing the High Performance Linpack benchmark with different numbers of nodes and different proportions of the system's total main memory utilized.

  14. Cloud computing task scheduling strategy based on differential evolution and ant colony optimization

    Science.gov (United States)

    Ge, Junwei; Cai, Yu; Fang, Yiqiu

    2018-05-01

    This paper proposes a task scheduling strategy, DEACO, based on the combination of Differential Evolution (DE) and Ant Colony Optimization (ACO). To move beyond the single optimization objective typical of cloud computing task scheduling, it jointly considers the shortest task completion time, cost and load balancing. DEACO uses the solution of the DE to initialize the pheromone of the ACO, reducing the time the ACO spends accumulating pheromone in its early stages, and improves the pheromone updating rule through a load factor. The proposed algorithm is simulated on CloudSim and compared with min-min and ACO. The experimental results show that DEACO is superior in terms of time, cost and load.
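
    The paper's exact algorithm and CloudSim experiments are not reproduced here; the following heavily simplified Python sketch, with invented tasks, virtual machines and parameters, illustrates the core idea: a differential-evolution pass finds a reasonable task-to-VM mapping, that mapping seeds the ant-colony pheromone matrix, and a short ACO loop then refines it with a load-factor-weighted pheromone update.

      # Simplified DE-seeded ACO for task-to-VM assignment (illustrative only).
      import numpy as np
      from scipy.optimize import differential_evolution

      rng = np.random.default_rng(7)
      n_tasks, n_vms = 20, 4
      lengths = rng.uniform(10, 100, n_tasks)          # task lengths (invented units)
      speeds = rng.uniform(1, 3, n_vms)                # VM processing speeds

      def makespan(assign):
          loads = np.zeros(n_vms)
          for t, v in enumerate(assign):
              loads[v] += lengths[t] / speeds[v]
          return loads.max()

      def load_factor(assign):
          loads = np.bincount(assign, weights=lengths, minlength=n_vms) / speeds
          return loads.mean() / loads.max()            # 1.0 means perfectly balanced

      # Stage 1: differential evolution over a continuous encoding of the mapping.
      de = differential_evolution(lambda v: makespan(v.astype(int) % n_vms),
                                  bounds=[(0, n_vms - 1e-9)] * n_tasks,
                                  maxiter=50, seed=7, polish=False)
      de_assign = de.x.astype(int) % n_vms

      # Stage 2: ACO seeded with the DE solution instead of a flat pheromone field.
      alpha, beta, rho, n_ants, n_iters = 1.0, 2.0, 0.1, 10, 30
      tau = np.full((n_tasks, n_vms), 0.1)
      tau[np.arange(n_tasks), de_assign] += 1.0        # pheromone boost from DE result
      eta = speeds[None, :] / lengths[:, None]         # heuristic: prefer fast finishes

      best_assign, best_cost = de_assign, makespan(de_assign)
      for _ in range(n_iters):
          probs = (tau ** alpha) * (eta ** beta)
          probs /= probs.sum(axis=1, keepdims=True)
          for _ant in range(n_ants):
              assign = np.array([rng.choice(n_vms, p=probs[t]) for t in range(n_tasks)])
              cost = makespan(assign)
              if cost < best_cost:
                  best_assign, best_cost = assign, cost
          tau *= 1.0 - rho                             # evaporation
          tau[np.arange(n_tasks), best_assign] += load_factor(best_assign) / best_cost

      print(f"DE-only makespan: {makespan(de_assign):.1f}")
      print(f"DEACO makespan  : {best_cost:.1f} (load factor {load_factor(best_assign):.2f})")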

  15. Computer graphics and research projects

    International Nuclear Information System (INIS)

    Ingtrakul, P.

    1994-01-01

    This report was prepared as an account of scientific visualization tools and application tools for scientists and engineers. It provides a set of tools to create pictures and to interact with them in natural ways. It applies many techniques of computer graphics and computer animation through a number of full-color presentations, such as computer-animated commercials, 3D computer graphics, dynamic and environmental simulations, scientific modeling and visualization, physically based modelling, and behavioral, skeletal, dynamics, and particle animation. It also looks in depth at the original hardware and the limitations of existing PC graphics adapters that constrain system performance, especially with graphics-intensive application programs and user interfaces

  16. IMASIS computer-based medical record project: dealing with the human factor.

    Science.gov (United States)

    Martín-Baranera, M; Planas, I; Palau, J; Sanz, F

    1995-01-01

    level, problems to be solved in utilization of the system, errors detected in the systems' database, and the personal interest in participating in the IMASIS project. The questionnaire was also intended to be a tool to monitor IMASIS evolution. Our study showed that medical staff had a lack of information about the current HIS, leading to a poor utilization of some system options. Another major characteristic, related to the above, was the feeling that the project would negatively affect the organization of work at the hospitals. A computer-based medical record was feared to degrade the physician-patient relationship, introduce supplementary administrative burden in clinicians' day-to-day work, unnecessarily slow history taking, and imply too-rigid patterns of work. The most frequent problems in using the current system could be classified into two groups: problems related to lack of agility and consistency in user interface design, and those derived from lack of a common patient identification number. Duplication of medical records was the most frequent error detected by physicians. Analysis of physicians' attitudes towards IMASIS revealed a lack of confidence globally. This was probably the consequence of two current features: a lack of complete information about IMASIS possibilities and problems faced when using the system. To deal with such factors, three types of measures have been planned. First, an effort is to be made to ensure that every physician is able to adequately use the current system and understands the long-term benefits of the project. This task will be better accomplished by personal interaction between clinicians and a physician from the Informatics Department than through formal teaching of IMASIS. Secondly, a protocol for evaluating the HIS is being developed and will be systematically applied to detect both database errors and system's design pitfalls. Finally, the IMASIS project has to find a convenient point for starting, to offer short-term re

  17. KWIKPLAN: a computer program for projecting the annual requirements of nuclear fuel cycle operations

    International Nuclear Information System (INIS)

    Salmon, R.; Kee, C.W.

    1977-06-01

    The computer code KWIKPLAN was written to facilitate the calculation of projected nuclear fuel cycle activities. Using given projections of power generation, the code calculates annual requirements for fuel fabrication, fuel reprocessing, uranium mining, and plutonium use and production. The code uses installed capacity projections and mass flow data for six types of reactors to calculate projected fuel cycle activities and inventories. It calculates fissile uranium and plutonium flows and inventories after allowing for an economy with limited reprocessing capacity and a backlog of unreprocessed fuel. All calculations are made on a quarterly basis; printed and punched output of the projected fuel cycle activities are made on an annual basis. Since the punched information is used in another code to determine waste inventories, the code punches a table from which the effective average burnup can be calculated for the fuel being reprocessed
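
    KWIKPLAN's actual reactor types, mass-flow data and quarterly accounting are not reproduced here; the following Python sketch, with invented numbers and an annual rather than quarterly time step, illustrates the kind of bookkeeping described: fabrication requirements follow from projected capacity and per-GWe mass flows, while reprocessing is limited by plant capacity so that unreprocessed fuel accumulates as a backlog.

      # Invented-number sketch of fuel-cycle requirements derived from a capacity
      # projection, with a reprocessing backlog when capacity is insufficient.
      mass_flow = {"LWR": 27.0, "HWR": 130.0}       # tonnes heavy metal per GWe-year (illustrative)
      capacity = {                                  # projected installed capacity, GWe (illustrative)
          2030: {"LWR": 300.0, "HWR": 30.0},
          2031: {"LWR": 320.0, "HWR": 32.0},
          2032: {"LWR": 345.0, "HWR": 34.0},
      }
      reprocessing_capacity = 8000.0                # tonnes/year the plants can handle
      backlog = 0.0                                 # unreprocessed spent fuel carried forward

      for year in sorted(capacity):
          fabrication = sum(capacity[year][r] * mass_flow[r] for r in capacity[year])
          discharged = fabrication                  # steady state: discharge matches feed
          to_reprocess = backlog + discharged
          reprocessed = min(to_reprocess, reprocessing_capacity)
          backlog = to_reprocess - reprocessed
          print(f"{year}: fabrication {fabrication:7.0f} t, "
                f"reprocessed {reprocessed:7.0f} t, backlog {backlog:7.0f} t")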

  18. Nonlinear simulations with and computational issues for NIMROD

    International Nuclear Information System (INIS)

    Sovinec, C.R.

    1998-01-01

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this

  19. Nonlinear simulations with and computational issues for NIMROD

    Energy Technology Data Exchange (ETDEWEB)

    Sovinec, C.R. [Los Alamos National Lab., NM (United States)

    1998-12-31

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  20. The community project COSA: comparison of geo-mechanical computer codes for salt

    International Nuclear Information System (INIS)

    Lowe, M.J.S.; Knowles, N.C.

    1986-01-01

    Two benchmark problems related to waste disposal in salt were tackled by ten European organisations using twelve rock-mechanics finite element computer codes. The two problems represented increasing complexity, with first a hypothetical verification case and then the simulation of a laboratory experiment. The project provided a snapshot of the current combined expertise of European organisations in the modelling of salt behaviour.

  1. Bringing molecules back into molecular evolution.

    Directory of Open Access Journals (Sweden)

    Claus O Wilke

    Full Text Available Much molecular-evolution research is concerned with sequence analysis. Yet these sequences represent real, three-dimensional molecules with complex structure and function. Here I highlight a growing trend in the field to incorporate molecular structure and function into computational molecular-evolution work. I consider three focus areas: reconstruction and analysis of past evolutionary events, such as phylogenetic inference or methods to infer selection pressures; development of toy models and simulations to identify fundamental principles of molecular evolution; and atom-level, highly realistic computational modeling of molecular structure and function aimed at making predictions about possible future evolutionary events.

  2. A reconstruction algorithm for coherent scatter computed tomography based on filtered back-projection

    International Nuclear Information System (INIS)

    Stevendaal, U. van; Schlomka, J.-P.; Harding, A.; Grass, M.

    2003-01-01

    Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter form factor of the investigated object. Reconstruction from coherently scattered x-rays is commonly done using algebraic reconstruction techniques (ART). In this paper, we propose an alternative approach based on filtered back-projection. For the first time, a three-dimensional (3D) filtered back-projection technique using curved 3D back-projection lines is applied to two-dimensional coherent scatter projection data. The proposed algorithm is tested with simulated projection data as well as with projection data acquired with a demonstrator setup similar to a multi-line CT scanner geometry. While yielding image quality comparable to that of ART reconstruction, the modified 3D filtered back-projection algorithm is about two orders of magnitude faster. In contrast to iterative reconstruction schemes, it has the advantage that subfield-of-view reconstruction becomes feasible. This allows a selective reconstruction of the coherent-scatter form factor for a region of interest. The proposed modified 3D filtered back-projection algorithm is a powerful reconstruction technique to be implemented in a CSCT scanning system. This method gives coherent scatter CT the potential of becoming a competitive modality for medical imaging or nondestructive testing.
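
    The curved-line 3D CSCT reconstruction itself is specific to the paper; as a rough illustration of the underlying filtered back-projection principle only, the sketch below implements a plain 2D parallel-beam FBP with a ramp filter. All geometry and test data are invented for the example and are not taken from the paper.

```python
# Minimal 2D parallel-beam filtered back-projection sketch (ramp filter).
# This is not the curved-line 3D CSCT algorithm of the paper, only an
# illustration of the generic FBP principle it builds on.
import numpy as np

def fbp_reconstruct(sinogram, angles_deg):
    """sinogram: (num_angles, num_detectors); returns a square image."""
    n_ang, n_det = sinogram.shape
    # Ramp filter applied per projection in Fourier space.
    freqs = np.fft.fftfreq(n_det)
    ramp = np.abs(freqs)
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

    # Back-project along straight lines with linear interpolation.
    image = np.zeros((n_det, n_det))
    centre = (n_det - 1) / 2.0
    y, x = np.mgrid[0:n_det, 0:n_det] - centre
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        # Detector coordinate of each pixel for this view.
        t = np.clip(x * np.cos(theta) + y * np.sin(theta) + centre, 0, n_det - 1)
        t0 = np.clip(np.floor(t).astype(int), 0, n_det - 2)
        w = t - t0
        image += (1 - w) * proj[t0] + w * proj[t0 + 1]
    return image * np.pi / (2 * n_ang)

# Tiny smoke test: an angle-independent sinogram reconstructs to a roughly
# radially symmetric blob.
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sino = np.tile(np.exp(-((np.arange(64) - 31.5) / 10.0) ** 2), (60, 1))
img = fbp_reconstruct(sino, angles)
print(img.shape, float(img.max()))
```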

  3. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1992-93. OER Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Educational Research.

    Student Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its third year of operation. Project SUCCESS served 460 students of limited English proficiency at two high schools in Brooklyn and one high school in Manhattan (New York City).…

  4. Contemporary evolution strategies

    CERN Document Server

    Bäck, Thomas; Krause, Peter

    2013-01-01

    Evolution strategies have more than 50 years of history in the field of evolutionary computation. Since the early 1990s, many algorithmic variations of evolution strategies have been developed, characterized by the fact that they use the so-called derandomization concept for strategy parameter adaptation. Most importantly, the covariance matrix adaptation strategy (CMA-ES) and its successors are the key representatives of this group of contemporary evolution strategies. This book provides an overview of the key algorithm developments between 1990 and 2012, including brief descriptions of the a

  5. Conceptual Evolution and Importance of Andragogy towards the Scope Optimization of University Academic Rural Development Programs and Projects

    Directory of Open Access Journals (Sweden)

    José Bernal Azofeifa-Bolaños

    2016-12-01

    Full Text Available This study was carried out with the objective of describing the evolution and importance of andragogical processes in the search for rural profiles committed to university work in the development and implementation of programs and projects. Among its main contributions, it highlights the importance of university coordinators of programs and projects knowing and applying teaching processes designed specifically for adults. Applying this kind of knowledge allows efficient use of institutional financial resources, secures the real commitment of the rural adult community to the implementation of field activities, and helps achieve the expected academic results in a shorter term. A successful project experience is described in which andragogical strategies were applied through extension work, producing better participation and engagement of rural people with the projects developed by the University. Consequently, the applicability of these concepts in university-promoted rural development programs and projects must lay the foundation for regional rural development strategies, with the ultimate goal of finding ways to improve the quality of life of people in particular scenarios.

  6. ATLAS Cloud Computing R&D project

    CERN Document Server

    Panitkin, S; The ATLAS collaboration; Caballero Bejar, J; Benjamin, D; DiGirolamo, A; Gable, I; Hendrix, V; Hover, J; Kucharczuk, K; Medrano LLamas, R; Ohman, H; Paterson, M; Sobie, R; Taylor, R; Walker, R; Zaytsev, A

    2013-01-01

    The computing model of the ATLAS experiment was designed around the concept of grid computing and, since the start of data taking, this model has proven very successful. However, new cloud computing technologies bring attractive features to improve the operations and elasticity of scientific distributed computing. ATLAS sees grid and cloud computing as complementary technologies that will coexist at different levels of resource abstraction, and two years ago created an R&D working group to investigate the different integration scenarios. The ATLAS Cloud Computing R&D has been able to demonstrate the feasibility of offloading work from grid to cloud sites and, as of today, is able to integrate transparently various cloud resources into the PanDA workload management system. The ATLAS Cloud Computing R&D is operating various PanDA queues on private and public resources and has provided several hundred thousand CPU days to the experiment. As a result, the ATLAS Cloud Computing R&D group has gained...

  7. Code and papers: computing publication patterns in the LHC era

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Publications in scholarly journals establish the body of knowledge deriving from scientific research; they also play a fundamental role in the career path of scientists and in the evaluation criteria of funding agencies. This presentation reviews the evolution of computing-oriented publications in HEP following the start of operation of LHC. Quantitative analyses are illustrated, which document the production of scholarly papers on computing-related topics by HEP experiments and core tools projects (including distributed computing R&D), and the citations they receive. Several scientometric indicators are analyzed to characterize the role of computing in HEP literature. Distinctive features of scholarly publication production in the software-oriented and hardware-oriented experimental HEP communities are highlighted. Current patterns and trends are compared to the situation in previous generations' HEP experiments at LEP, Tevatron and B-factories. The results of this scientometric analysis document objec...

  8. Large Atmospheric Computation on the Earth Simulator: The LACES Project

    Directory of Open Access Journals (Sweden)

    Michel Desgagné

    2006-01-01

    Full Text Available The Large Atmospheric Computation on the Earth Simulator (LACES project is a joint initiative between Canadian and Japanese meteorological services and academic institutions that focuses on the high resolution simulation of Hurricane Earl (1998. The unique aspect of this effort is the extent of the computational domain, which covers all of North America and Europe with a grid spacing of 1 km. The Canadian Mesoscale Compressible Community (MC2 model is shown to parallelize effectively on the Japanese Earth Simulator (ES supercomputer; however, even using the extensive computing resources of the ES Center (ESC, the full simulation for the majority of Hurricane Earl's lifecycle takes over eight days to perform and produces over 5.2 TB of raw data. Preliminary diagnostics show that the results of the LACES simulation for the tropical stage of Hurricane Earl's lifecycle compare well with available observations for the storm. Further studies involving advanced diagnostics have commenced, taking advantage of the uniquely large spatial extent of the high resolution LACES simulation to investigate multiscale interactions in the hurricane and its environment. It is hoped that these studies will enhance our understanding of processes occurring within the hurricane and between the hurricane and its planetary-scale environment.

  9. History and evolution of the pharmacophore concept in computer-aided drug design.

    Science.gov (United States)

    Güner, Osman F

    2002-12-01

    With computer-aided drug design established as an integral part of the lead discovery and optimization process, pharmacophores have become a focal point for conceptualizing and understanding receptor-ligand interactions. In the structure-based design process, pharmacophores can be used to align molecules based on the three-dimensional arrangement of chemical features or to develop predictive models (e.g., 3D-QSAR) that correlate with the experimental activities of a given training set. Pharmacophores can be also used as search queries for retrieving potential leads from structural databases, for designing molecules with specific desired attributes, or as fingerprints for assessing similarity and diversity of molecules. This review article presents a historical perspective on the evolution and use of the pharmacophore concept in the pharmaceutical, biotechnology, and fragrances industry with published examples of how the technology has contributed and advanced the field.

  10. Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP

    Science.gov (United States)

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…

  11. Extinction Events Can Accelerate Evolution

    DEFF Research Database (Denmark)

    Lehman, Joel; Miikkulainen, Risto

    2015-01-01

    Extinction events impact the trajectory of biological evolution significantly. They are often viewed as upheavals to the evolutionary process. In contrast, this paper supports the hypothesis that although they are unpredictably destructive, extinction events may in the long term accelerate evolution by increasing evolvability. In particular, if extinction events extinguish indiscriminately many ways of life, indirectly they may select for the ability to expand rapidly through vacated niches. Lineages with such an ability are more likely to persist through multiple extinctions. Lending computational support for this hypothesis, this paper shows how increased evolvability will result from simulated extinction events in two computational models of evolved behavior. The conclusion is that although they are destructive in the short term, extinction events may make evolution more prolific...

  12. Comparing genomes to computer operating systems in terms of the topology and evolution of their regulatory control networks.

    Science.gov (United States)

    Yan, Koon-Kiu; Fang, Gang; Bhardwaj, Nitin; Alexander, Roger P; Gerstein, Mark

    2010-05-18

    The genome has often been called the operating system (OS) for a living organism. A computer OS is described by a regulatory control network termed the call graph, which is analogous to the transcriptional regulatory network in a cell. To apply our firsthand knowledge of the architecture of software systems to understand cellular design principles, we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology and evolution. We show that both networks have a fundamentally hierarchical layout, but there is a key difference: The transcriptional regulatory network possesses a few global regulators at the top and many targets at the bottom; conversely, the call graph has many regulators controlling a small set of generic functions. This top-heavy organization leads to highly overlapping functional modules in the call graph, in contrast to the relatively independent modules in the regulatory network. We further develop a way to measure evolutionary rates comparably between the two networks and explain this difference in terms of network evolution. The process of biological evolution via random mutation and subsequent selection tightly constrains the evolution of regulatory network hubs. The call graph, however, exhibits rapid evolution of its highly connected generic components, made possible by designers' continual fine-tuning. These findings stem from the design principles of the two systems: robustness for biological systems and cost effectiveness (reuse) for software systems.
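
    As a toy illustration of the regulator-versus-target bookkeeping described above (not the paper's actual E. coli or Linux data), the following sketch compares the in- and out-degree structure of two small made-up directed networks using the networkx package; the edge lists are assumptions chosen only to mimic a "top-heavy" call graph and a "bottom-heavy" regulatory network.

```python
# Toy comparison of regulator-vs-target structure in two directed networks,
# in the spirit of the paper's E. coli-vs-Linux comparison.  Edge lists are
# made-up placeholders, not the actual transcriptional network or call graph.
import networkx as nx

def hub_summary(edges, name):
    g = nx.DiGraph(edges)
    out_deg = dict(g.out_degree())        # regulators / callers
    in_deg = dict(g.in_degree())          # targets / callees
    regulators = sum(1 for v in out_deg.values() if v > 0)
    targets = sum(1 for v in in_deg.values() if v > 0)
    print(f"{name}: {g.number_of_nodes()} nodes, "
          f"{regulators} regulating nodes, {targets} regulated nodes, "
          f"max out-degree {max(out_deg.values())}, "
          f"max in-degree {max(in_deg.values())}")

# Top-heavy "call graph"-like toy: many callers share a few generic callees.
call_graph = ([(f"f{i}", "memcpy") for i in range(8)] +
              [(f"f{i}", "printf") for i in range(8)])
# Bottom-heavy "regulatory"-like toy: few master regulators, many targets.
reg_net = ([("crp", f"gene{i}") for i in range(8)] +
           [("fnr", f"gene{i}") for i in range(8, 12)])

hub_summary(call_graph, "toy call graph")
hub_summary(reg_net, "toy regulatory network")
```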

  13. Projective methodical system of students training to the course «History of computer science»

    OpenAIRE

    С А Виденин

    2008-01-01

    Components of teachers' readiness for professional activity are described in the article. Projective methods of teaching the course «History of computer science», intended to improve the professional grounding of students, are considered.

  14. INTEGRATION OF ECONOMIC AND COMPUTER SKILLS AT IMPLEMENTATION OF STUDENTS PROJECT «BUSINESS PLAN PRODUCING IN MICROSOFT WORD»

    Directory of Open Access Journals (Sweden)

    Y.B. Samchinska

    2012-07-01

    Full Text Available The article substantiates the expedience of having students of economic specialities carry out a complex project in Informatics and Computer Science based on creating a business plan with modern information technologies; methodical recommendations on the implementation of this project are also presented.

  15. Evolution of the ATLAS Distributed Computing system during the LHC Long shutdown

    CERN Document Server

    Campana, S; The ATLAS collaboration

    2014-01-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the WLCG distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileu...

  16. Distributed management of scientific projects - An analysis of two computer-conferencing experiments at NASA

    Science.gov (United States)

    Vallee, J.; Gibbs, B.

    1976-01-01

    Between August 1975 and March 1976, two NASA projects with geographically separated participants used a computer-conferencing system developed by the Institute for the Future for portions of their work. Monthly usage statistics for the system were collected in order to examine the group and individual participation figures for all conferences. The conference transcripts were analysed to derive observations about the use of the medium. In addition to the results of these analyses, the attitudes of users and the major components of the costs of computer conferencing are discussed.

  17. Project planning and project management of Baseball II-T

    International Nuclear Information System (INIS)

    Kozman, T.A.; Chargin, A.K.

    1975-01-01

    The details of the project planning and project management work done on the Baseball II-T experiment are reviewed. The LLL Baseball program is a plasma confinement experiment accomplished with a superconducting magnet in the shape of a baseball seam. Both project planning and project management made use of the Critical Path Management (CPM) computer code. The computer code, input, and results from the project planning and project management runs, and the cost and effectiveness of this method of systems planning are discussed

  18. The Lower Manhattan Project: A New Approach to Computer-Assisted Learning in History Classrooms.

    Science.gov (United States)

    Crozier, William; Gaffield, Chad

    1990-01-01

    The Lower Manhattan Project, a computer-assisted undergraduate course in U.S. history, enhances student appreciation of the historical process through research and writing. Focuses on the late nineteenth and early twentieth centuries emphasizing massive immigration, rapid industrialization, and the growth of cities. Includes a reading list and…

  19. RF heating systems evolution for the WEST project

    Energy Technology Data Exchange (ETDEWEB)

    Magne, R.; Achard, J.; Armitano, A.; Argouarch, A.; Berger-By, G.; Bernard, J. M.; Bouquey, F.; Charabot, N.; Colas, L.; Corbel, E.; Delpech, L.; Ekedahl, A.; Goniche, M.; Guilhem, D.; Hillairet, J.; Jacquot, J.; Joffrin, E.; Litaudon, X.; Lombard, G.; Mollard, P. [CEA, IRFM, F-13108 Saint-Paul-lez-Durance (France); and others

    2014-02-12

    Tore Supra is dedicated to long pulse operation at high power, with a record in injected energy of 1 GJ (2.8 MW × 380 s) and an achieved capability of 12 MW injected power delivered by 3 RF systems: Lower Hybrid Current Drive (LHCD), Ion Cyclotron Resonance Heating (ICRH) and Electron Cyclotron Resonance Heating (ECRH). The new WEST project (W [tungsten] Environment in Steady-state Tokamak) aims at fitting Tore Supra with an actively cooled tungsten coated wall and a bulk tungsten divertor. This new device will offer to ITER a test bed for validating the relevant technologies for actively cooled metallic components, with D-shaped H-mode plasmas. For WEST operation, different scenarios able to reproduce ITER relevant conditions in terms of steady state heat loads have been identified, ranging from a high RF power scenario (15 MW, 30 s) to a high fluence scenario (10 MW, 1000 s). This paper focuses on the evolution of the RF systems required for WEST. For the ICRH system, the main issues are its ELM resilience and its CW compatibility; three new actively cooled antennas are being designed with the aim of reducing their sensitivity to the load variations induced by ELMs. The LH system has recently been upgraded with new klystrons and the PAM antenna; a possible reshaping of the antenna mouths is presently being studied for matching with the magnetic field line in the WEST configuration. For the ECRH system, the device for the poloidal movement of the antenna mirrors is being changed for higher accuracy and speed.

  20. Projection multiplex recording of computer-synthesised one-dimensional Fourier holograms for holographic memory systems: mathematical and experimental modelling

    Energy Technology Data Exchange (ETDEWEB)

    Betin, A Yu; Bobrinev, V I; Verenikina, N M; Donchenko, S S; Odinokov, S B [Research Institute ' Radiotronics and Laser Engineering' , Bauman Moscow State Technical University, Moscow (Russian Federation); Evtikhiev, N N; Zlokazov, E Yu; Starikov, S N; Starikov, R S [National Reseach Nuclear University MEPhI (Moscow Engineering Physics Institute), Moscow (Russian Federation)

    2015-08-31

    A multiplex method of recording computer-synthesised one-dimensional Fourier holograms intended for holographic memory devices is proposed. The method potentially allows increasing the recording density in the previously proposed holographic memory system based on the computer synthesis and projection recording of data page holograms. (holographic memory)

  1. Computation of the glandular radiation dose in digital tomosynthesis of the breast

    International Nuclear Information System (INIS)

    Sechopoulos, Ioannis; Suryanarayanan, Sankararaman; Vedantham, Srinivasan; D'Orsi, Carl; Karellas, Andrew

    2007-01-01

    Tomosynthesis of the breast is currently a topic of intense interest as a logical next step in the evolution of digital mammography. This study reports on the computation of glandular radiation dose in digital tomosynthesis of the breast. Previously, glandular dose estimations in tomosynthesis have been performed using data from studies of radiation dose in conventional planar mammography. This study evaluates, using Monte Carlo methods, the normalized glandular dose (DgN) to the breast during a tomosynthesis study, and characterizes its dependence on breast size, tissue composition, and x-ray spectrum. The conditions during digital tomosynthesis imaging of the breast were simulated using a computer program based on the Geant4 toolkit. With the use of simulated breasts of varying size, thickness and tissue composition, the DgN to the breast tissue was computed for varying x-ray spectra and tomosynthesis projection angle. Tomosynthesis projections centered about both the cranio-caudal (CC) and medio-lateral oblique (MLO) views were simulated. For each projection angle, the ratio of the glandular dose for that projection to the glandular dose for the zero degree projection was computed. This ratio was denoted the relative glandular dose (RGD) coefficient, and its variation under different imaging parameters was analyzed. Within mammographic energies, the RGD was found to have a weak dependence on glandular fraction and x-ray spectrum for both views. A substantial dependence on breast size and thickness was found for the MLO view, and to a lesser extent for the CC view. Although RGD values deviate substantially from unity as a function of projection angle, the RGD averaged over all projections in a complete tomosynthesis study varies from 0.91 to 1.01. The RGD results were fit to mathematical functions and the resulting equations are provided.
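
    A minimal sketch of how the reported RGD coefficients could be used in practice is given below; the DgN value, exposure split and RGD numbers are placeholders chosen only to mirror the 0.91-1.01 range quoted in the abstract, not values from the paper.

```python
# Sketch of the bookkeeping implied by the abstract: total mean glandular dose
# for a tomosynthesis scan from the zero-degree DgN and per-projection relative
# glandular dose (RGD) coefficients.  All numbers below are placeholders.

def scan_glandular_dose(dgn_zero, rgd_per_projection, exposure_per_projection):
    """dgn_zero: DgN at the 0-degree projection (dose per unit exposure);
    rgd_per_projection: RGD coefficient for each projection angle;
    exposure_per_projection: exposure delivered at each projection."""
    return sum(dgn_zero * rgd * x
               for rgd, x in zip(rgd_per_projection, exposure_per_projection))

# 11 projections, RGD near unity as the abstract reports on average (0.91-1.01).
rgd = [0.93, 0.95, 0.97, 0.99, 1.00, 1.00, 1.00, 0.99, 0.97, 0.95, 0.93]
exposure = [1.0 / len(rgd)] * len(rgd)     # equal exposure split, arbitrary units
print(scan_glandular_dose(dgn_zero=0.2, rgd_per_projection=rgd,
                          exposure_per_projection=exposure))
```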

  2. Hybrid quantum computation

    International Nuclear Information System (INIS)

    Sehrawat, Arun; Englert, Berthold-Georg; Zemann, Daniel

    2011-01-01

    We present a hybrid model of the unitary-evolution-based quantum computation model and the measurement-based quantum computation model. In the hybrid model, part of a quantum circuit is simulated by unitary evolution and the rest by measurements on star graph states, thereby combining the advantages of the two standard quantum computation models. In the hybrid model, a complicated unitary gate under simulation is decomposed in terms of a sequence of single-qubit operations, the controlled-z gates, and multiqubit rotations around the z axis. Every single-qubit and the controlled-z gate are realized by a respective unitary evolution, and every multiqubit rotation is executed by a single measurement on a required star graph state. The classical information processing in our model requires only an information flow vector and propagation matrices. We provide the implementation of multicontrol gates in the hybrid model. They are very useful for implementing Grover's search algorithm, which is studied as an illustrative example.
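
    To make the gate set named in the abstract concrete, the sketch below builds the single-qubit z rotation, the controlled-z gate and a multiqubit rotation around the z axis as explicit matrices and composes a small two-qubit circuit; it illustrates only the unitary-evolution side of the hybrid model, not the measurements on star graph states.

```python
# Sketch of the gate set named in the abstract -- single-qubit rotations,
# controlled-Z, and multiqubit rotations about z -- as explicit matrices.
import numpy as np

I = np.eye(2)
Z = np.diag([1.0, -1.0])

def rz(theta):                       # single-qubit z rotation
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

CZ = np.diag([1.0, 1.0, 1.0, -1.0])  # controlled-Z on two qubits

def multi_rz(theta, n):              # rotation exp(-i*theta/2 * Z...Z) on n qubits
    zz = Z
    for _ in range(n - 1):
        zz = np.kron(zz, Z)
    eigvals = np.diag(zz)            # +/-1 eigenvalues of the Z...Z string
    return np.diag(np.exp(-1j * theta / 2 * eigvals))

# Compose a tiny 2-qubit circuit: Rz on qubit 0, then CZ, then a ZZ rotation.
U = multi_rz(np.pi / 4, 2) @ CZ @ np.kron(rz(np.pi / 2), I)
print(np.round(U, 3))
```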

  3. ON THE SIMULTANEOUS EVOLUTION OF MASSIVE PROTOSTARS AND THEIR HOST CORES

    International Nuclear Information System (INIS)

    Kuiper, R.; Yorke, H. W.

    2013-01-01

    Studies of the evolution of massive protostars and the evolution of their host molecular cloud cores are commonly treated as separate problems. However, interdependencies between the two can be significant. Here, we study the simultaneous evolution of massive protostars and their host molecular cores using a multi-dimensional radiation hydrodynamics code that incorporates the effects of the thermal pressure and radiative acceleration feedback of the centrally forming protostar. The evolution of the massive protostar is computed simultaneously using the stellar evolution code STELLAR, modified to include the effects of variable accretion. The interdependencies are studied in three different collapse scenarios. For comparison, stellar evolutionary tracks at constant accretion rates and the evolution of the host cores using pre-computed stellar evolutionary tracks are computed. The resulting interdependencies of the protostellar evolution and the evolution of the environment are extremely diverse and depend on the order of events, in particular the time of circumstellar accretion disk formation with respect to the onset of the bloating phase of the star. Feedback mechanisms affect the instantaneous accretion rate and the protostar's radius, temperature, and luminosity on timescales t ≤ 5 kyr, corresponding to the accretion timescale and Kelvin-Helmholtz contraction timescale, respectively. Nevertheless, it is possible to approximate the overall protostellar evolution in many cases by pre-computed stellar evolutionary tracks assuming appropriate constant average accretion rates

  4. Computability, Gödel's incompleteness theorem, and an inherent limit on the predictability of evolution.

    Science.gov (United States)

    Day, Troy

    2012-04-07

    The process of evolutionary diversification unfolds in a vast genotypic space of potential outcomes. During the past century, there have been remarkable advances in the development of theory for this diversification, and the theory's success rests, in part, on the scope of its applicability. A great deal of this theory focuses on a relatively small subset of the space of potential genotypes, chosen largely based on historical or contemporary patterns, and then predicts the evolutionary dynamics within this pre-defined set. To what extent can such an approach be pushed to a broader perspective that accounts for the potential open-endedness of evolutionary diversification? There have been a number of significant theoretical developments along these lines but the question of how far such theory can be pushed has not been addressed. Here a theorem is proven demonstrating that, because of the digital nature of inheritance, there are inherent limits on the kinds of questions that can be answered using such an approach. In particular, even in extremely simple evolutionary systems, a complete theory accounting for the potential open-endedness of evolution is unattainable unless evolution is progressive. The theorem is closely related to Gödel's incompleteness theorem, and to the halting problem from computability theory.

  5. Nonlinear evolution of MHD instabilities

    International Nuclear Information System (INIS)

    Bateman, G.; Hicks, H.R.; Wooten, J.W.; Dory, R.A.

    1975-01-01

    A 3-D nonlinear MHD computer code was used to study the time evolution of internal instabilities. Velocity vortex cells are observed to persist into the nonlinear evolution. Pressure and density profiles convect around these cells for a weak localized instability, or convect into the wall for a strong instability. (U.S.)

  6. A Computer-Based Laboratory Project for the Study of Stimulus Generalization and Peak Shift

    Science.gov (United States)

    Derenne, Adam; Loshek, Eevett

    2009-01-01

    This paper describes materials designed for classroom projects on stimulus generalization and peak shift. A computer program (originally written in QuickBASIC) is used for data collection and a Microsoft Excel file with macros organizes the raw data on a spreadsheet and creates generalization gradients. The program is designed for use with human…

  7. 2nd Colombian Congress on Computational Biology and Bioinformatics

    CERN Document Server

    Cristancho, Marco; Isaza, Gustavo; Pinzón, Andrés; Rodríguez, Juan

    2014-01-01

    This volume compiles accepted contributions for the 2nd Edition of the Colombian Computational Biology and Bioinformatics Congress CCBCOL, after a rigorous review process in which 54 papers were accepted for publication from 119 submitted contributions. Bioinformatics and Computational Biology are areas of knowledge that have emerged due to advances that have taken place in the Biological Sciences and its integration with Information Sciences. The expansion of projects involving the study of genomes has led the way in the production of vast amounts of sequence data which needs to be organized, analyzed and stored to understand phenomena associated with living organisms related to their evolution, behavior in different ecosystems, and the development of applications that can be derived from this analysis.

  8. Development of a numerical 2-dimensional beach evolution model

    DEFF Research Database (Denmark)

    Baykal, Cüneyt

    2014-01-01

    This paper presents the description of a 2-dimensional numerical model constructed for the simulation of beach evolution under the action of wind waves only, over arbitrary land and sea topographies around existing coastal structures and formations. The developed beach evolution numerical model is composed of 4 submodels: a nearshore spectral wave transformation model based on an energy balance equation including random wave breaking and diffraction terms to compute the nearshore wave characteristics, a nearshore wave-induced circulation model based on the nonlinear shallow water equations to compute the nearshore depth-averaged wave-induced current velocities and mean water level changes, a sediment transport model to compute the local total sediment transport rates occurring under the action of wind waves, and a bottom evolution model to compute the bed level changes in time based...

  9. The Evolution Process on Information Technology Outsourcing Relationship

    Directory of Open Access Journals (Sweden)

    Duan Weihua

    2017-01-01

    Full Text Available The information technology outsourcing relationship is one of the key factors in IT outsourcing success. To explore how to manage and promote the IT outsourcing relationship, it is necessary to understand its evolution process. Firstly, the types of IT outsourcing based on relationship quality and IT outsourcing project level are analyzed; secondly, two evolution process models of the IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT outsourcing relationship evolution process is outlined; finally, an IT outsourcing relationship evolution process model is developed, and the development process of the IT outsourcing relationship from low to high under internal and external forces is explained.

  10. Optimization of the cumulative risk assessment of pesticides and biocides using computational techniques: Pilot project

    DEFF Research Database (Denmark)

    Jonsdottir, Svava Osk; Reffstrup, Trine Klein; Petersen, Annette

    This pilot project is intended as the first step in developing a computational strategy to assist in refining methods for higher tier cumulative and aggregate risk assessment of exposure to mixtures of pesticides and biocides. For this purpose, physiologically based toxicokinetic (PBTK) models were...

  11. Definition and evolution of quantum cellular automata with two qubits per cell

    International Nuclear Information System (INIS)

    Karafyllidis, Ioannis G.

    2004-01-01

    Studies of quantum computer implementations suggest cellular quantum computer architectures. These architectures can simulate the evolution of quantum cellular automata, which can possibly simulate both quantum and classical physical systems and processes. It is however known that except for the trivial case, unitary evolution of one-dimensional homogeneous quantum cellular automata with one qubit per cell is not possible. Quantum cellular automata that comprise two qubits per cell are defined and their evolution is studied using a quantum computer simulator. The evolution is unitary and its linearity manifests itself as a periodic structure in the probability distribution patterns

  12. A projection gradient method for computing ground state of spin-2 Bose–Einstein condensates

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hanquan, E-mail: hanquan.wang@gmail.com [School of Statistics and Mathematics, Yunnan University of Finance and Economics, Kunming, Yunnan Province, 650221 (China); Yunnan Tongchang Scientific Computing and Data Mining Research Center, Kunming, Yunnan Province, 650221 (China)

    2014-10-01

    In this paper, a projection gradient method is presented for computing the ground state of spin-2 Bose–Einstein condensates (BEC). We first propose the general projection gradient method for solving an energy functional minimization problem under multiple constraints, in which the energy functional takes real functions as independent variables. We next extend the method to solve a similar problem, where the energy functional now takes complex functions as independent variables. We finally employ the method to find the ground state of spin-2 BEC. The key of our method is that, by constructing continuous gradient flows (CGFs), the ground state of spin-2 BEC can be computed as the steady state solution of such CGFs. We discretized the CGFs by a conservative finite difference method along with a proper way to deal with the nonlinear terms. We show that the numerical discretization is normalization and magnetization conservative and energy diminishing. Numerical results for the ground state of spin-2 BEC and its energy are reported to demonstrate the effectiveness of the numerical method.

  13. A projection gradient method for computing ground state of spin-2 Bose–Einstein condensates

    International Nuclear Information System (INIS)

    Wang, Hanquan

    2014-01-01

    In this paper, a projection gradient method is presented for computing the ground state of spin-2 Bose–Einstein condensates (BEC). We first propose the general projection gradient method for solving an energy functional minimization problem under multiple constraints, in which the energy functional takes real functions as independent variables. We next extend the method to solve a similar problem, where the energy functional now takes complex functions as independent variables. We finally employ the method to find the ground state of spin-2 BEC. The key of our method is that, by constructing continuous gradient flows (CGFs), the ground state of spin-2 BEC can be computed as the steady state solution of such CGFs. We discretized the CGFs by a conservative finite difference method along with a proper way to deal with the nonlinear terms. We show that the numerical discretization is normalization and magnetization conservative and energy diminishing. Numerical results for the ground state of spin-2 BEC and its energy are reported to demonstrate the effectiveness of the numerical method
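
    The spin-2 scheme with its coupled normalization and magnetization constraints is not reproduced here; as a simplified illustration of the projection idea only, the sketch below runs a discrete normalized gradient flow (imaginary-time stepping followed by projection back to unit norm) for a single-component 1D condensate, using a split-step spectral discretization rather than the conservative finite-difference scheme of the paper.

```python
# Much-simplified illustration of the projection idea: a discrete normalized
# gradient flow (imaginary-time propagation + renormalization) for the ground
# state of a 1D single-component condensate in a harmonic trap.  The paper
# treats the harder spin-2 case with normalization *and* magnetization
# constraints; this sketch keeps only the normalization projection.
import numpy as np

def ground_state_1d(n=256, L=16.0, beta=100.0, dt=1e-3, steps=20000):
    x = np.linspace(-L / 2, L / 2, n, endpoint=False)
    dx = x[1] - x[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    V = 0.5 * x**2                                   # harmonic trap
    psi = np.exp(-x**2 / 2)
    psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dx)
    for _ in range(steps):
        # Gradient-flow step (split kinetic/potential), then project to unit norm.
        psi = np.fft.ifft(np.exp(-0.5 * dt * k**2) * np.fft.fft(psi))
        psi = psi * np.exp(-dt * (V + beta * np.abs(psi)**2))
        psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalization projection
    return x, psi

x, psi = ground_state_1d()
print("norm =", np.sum(np.abs(psi)**2) * (x[1] - x[0]))
```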

  14. Evolution of project-based learning in small groups in environmental engineering courses

    Directory of Open Access Journals (Sweden)

    Jesús M. Requies

    2018-03-01

    Full Text Available This work presents the assessment of the development and evolution of an active methodology (Project-Based Learning, PBL) implemented on the course “Unit Operations in Environmental Engineering”, within the bachelor's degree in Environmental Engineering, with the purpose of decreasing the dropout rate in this course. After the initial design and implementation of this methodology during the first academic year (12/13), different modifications were adopted in the following ones (13-14, 14-15 & 15-16) in order to optimize the students' and professor's workload as well as correct some malfunctions observed in the initial design of the PBL. This active methodology seeks to make students the main architects of their own learning processes. Accordingly, they have to identify their learning needs, which is a highly motivating approach both for their curricular development and for attaining the required learning outcomes in this field of knowledge. The results obtained show that working in small teams (cooperative work) enhances each group member's self-learning capabilities. Moreover, academic marks improve when compared to traditional learning methodologies. Nevertheless, the implementation of more active methodologies, such as project-based learning, in small groups has certain specific characteristics. In this case it has been implemented simultaneously in two different groups of 10 students each. Such small groups are more heterogeneous, since the presence or absence of a couple of highly motivated students can affect the whole group's attitude and academic results.

  15. Hyperbolicity and constrained evolution in linearized gravity

    International Nuclear Information System (INIS)

    Matzner, Richard A.

    2005-01-01

    Solving the 4-d Einstein equations as evolution in time requires solving equations of two types: the four elliptic initial data (constraint) equations, followed by the six second order evolution equations. Analytically the constraint equations remain solved under the action of the evolution, and one approach is to simply monitor them (unconstrained evolution). Since computational solution of differential equations introduces almost inevitable errors, it is clearly 'more correct' to introduce a scheme which actively maintains the constraints by solution (constrained evolution). This has shown promise in computational settings, but the analysis of the resulting mixed elliptic hyperbolic method has not been completely carried out. We present such an analysis for one method of constrained evolution, applied to a simple vacuum system, linearized gravitational waves. We begin with a study of the hyperbolicity of the unconstrained Einstein equations. (Because the study of hyperbolicity deals only with the highest derivative order in the equations, linearization loses no essential details.) We then give explicit analytical construction of the effect of initial data setting and constrained evolution for linearized gravitational waves. While this is clearly a toy model with regard to constrained evolution, certain interesting features are found which have relevance to the full nonlinear Einstein equations

  16. Computational Phenotypes: Where the Theory of Computation Meets Evo-Devo

    Directory of Open Access Journals (Sweden)

    Sergio Balari

    2009-03-01

    Full Text Available This article argues that the Chomsky Hierarchy can be reinterpreted as a developmental morphospace constraining the evolution of a discrete and finite series of computational phenotypes. In doing so, the theory of Morphological Evolution as stated by Pere Alberch, a pioneering figure of Evo–Devo thinking, is adhered to.

  17. TMX-U computer system in evolution

    International Nuclear Information System (INIS)

    Casper, T.A.; Bell, H.; Brown, M.; Gorvad, M.; Jenkins, S.; Meyer, W.; Moller, J.; Perkins, D.

    1986-01-01

    Over the past three years, the total TMX-U diagnostic data base has grown to exceed 10 megabytes from over 1300 channels, roughly triple the originally designed size. This acquisition and processing load has resulted in an experiment repetition rate exceeding 10 minutes per shot using the five original Hewlett-Packard HP-1000 computers with their shared disks. Our new diagnostics tend to be multichannel instruments, which, in our environment, can be more easily managed using local computers. For this purpose, we are using HP series 9000 computers for instrument control, data acquisition, and analysis. Fourteen such systems are operational, with processed-format output exchanged via a shared resource manager. We are presently implementing the necessary hardware and software changes to create a local area network allowing us to combine the data from these systems with our main data archive. The expansion of our diagnostic system using the parallel acquisition and processing concept allows us to increase our data base with a minimum of impact on the experimental repetition rate.

  18. Design reality gap issues within an ICT4D project:an assessment of Jigawa State Community Computer Center

    OpenAIRE

    Kanya, Rislana Abdulazeez; Good, Alice

    2013-01-01

    This paper evaluates the Jigawa State Government Community Computer centre project using the design reality gap framework. The purpose of this was to analyse the shortfall between design expectations and implementation realities, in order to find out the current situation of the project, and furthermore to analyse whether it would meet the key stakeholders' expectations. The majority of Government ICT projects are classified as either failures or partial failures. Our research will underpin a case st...

  19. Evolution of extreme temperature events in short term climate projection for Iberian Peninsula.

    Science.gov (United States)

    Rodriguez, Alfredo; Tarquis, Ana M.; Sanchez, Enrique; Dosio, Alessandro; Ruiz-Ramos, Margarita

    2014-05-01

    Extreme events of maximum and minimum temperatures are a main hazard for agricultural production in the Iberian Peninsula. In this study we analyze projections of their evolution that could be valid for the next decade, represented here by the 30-year period 2004-2034 (target period). For this purpose two kinds of data were used: 1) observations from the station network of AEMET (Spanish National Meteorological Agency) for five Spanish locations, and 2) simulated data on a 50 × 50 km horizontal grid derived from the outputs of twelve Regional Climate Models (RCMs) taken from project ENSEMBLES (van der Linden and Mitchell, 2009), with a bias correction (Dosio and Paruolo, 2011; Dosio et al., 2012) regarding the observational dataset Spain02 (Herrera et al., 2012). To validate the simulated climate, the available period of observations was compared to a baseline period (1964-1994) of simulated climate for all locations. Then, to analyze the changes for the present/very near future, probabilities of extreme temperature events for 2004-2034 were compared to those of the baseline period. Although only minor changes are expected, small variations in variability may have a significant impact on crop performance. The objective of the work is to evaluate the utility of these short term projections for potential users, for instance insurance companies. References Dosio A. and Paruolo P., 2011. Bias correction of the ENSEMBLES high-resolution climate change projections for use by impact models: Evaluation on the present climate. Journal of Geophysical Research, Vol. 116, D16106, doi:10.1029/2011JD015934. Dosio A., Paruolo P. and Rojas R., 2012. Bias correction of the ENSEMBLES high resolution climate change projections for use by impact models: Analysis of the climate change signal. Journal of Geophysical Research, Vol. 117, D17, doi:10.1029/2012JD017968. Herrera et al. (2012) Development and Analysis of a 50 year high

  20. Status of the Advanced Teleoperation Project in the French A.R.A. program

    International Nuclear Information System (INIS)

    Andre, G.; Fournier, R.

    1987-01-01

    This paper reports the research and development work carried out in the French advanced teleoperation project. The successful achievement of significant progress, in recent years, allows to considerably advance the state of the art so that it objectively constitutes the foundation of a new generation of remote systems. After briefly recalling the organization of this project, the authors outline the basic concepts related to the evolution of teleoperation with regard to the notions of flexibility, adaptivity, autonomy, transparency. The authors present the overall architecture of the computer aided teleoperation system. The following sections deal with fundamental studies which have been realized and key subsystems which have been developed. The authors emphasize on the computer control system which includes: generalized bilateral control and supervisory control. Secondly, they underline the role of sophisticated technologies: sensory system, computer graphics. . ., for generating adaptive control functions and for providing new interfaces. Thirdly, they describe the integrated experimental site and, a set of generic experiments in nuclear applications. The paper ends with future perspectives

  1. The advanced software development workstation project

    Science.gov (United States)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.

  2. Assessment of methods for computing the closest point projection, penetration, and gap functions in contact searching problems

    Czech Academy of Sciences Publication Activity Database

    Kopačka, Ján; Gabriel, Dušan; Plešek, Jiří; Ulbin, M.

    2016-01-01

    Roč. 105, č. 11 (2016), s. 803-833 ISSN 0029-5981 R&D Projects: GA ČR(CZ) GAP101/12/2315; GA MŠk(CZ) ME10114 Institutional support: RVO:61388998 Keywords : closest point projection * local contact search * quadratic elements * Newtons methods * geometric iteration methods * simplex method Subject RIV: JC - Computer Hardware ; Software Impact factor: 2.162, year: 2016 http://onlinelibrary.wiley.com/doi/10.1002/nme.4994/abstract

  3. Redefining Project Management Information Systems with New IT Services

    Directory of Open Access Journals (Sweden)

    Luminita Hurbean

    2013-04-01

    Full Text Available Achieving successful adoption of an innovative project management information system should involve influencing the project management environment by providing useful tools, training, reusable templates, techniques, and methods that improve the project manager's ability to successfully execute. This paper suggests that project management practice, enabled by emerging IT, could more explicitly recognize, represent, and manage the interdependencies that are pervasive throughout projects, thereby fully exploiting the potential of the IT to improve overall project performance. Over the last few years, the evolution of IT&C has led to new approaches to application and infrastructure architecture. Breaking from the traditional procedures used by organizations, they propose a cloud operating platform that reduces complexity and improves agility and scalability by altering the approach to the way data centres are built, applications are developed, infrastructure is managed, and organizations align and collaborate. Further, the paper explores the growing impact of mobile computing, cloud delivery and social business collaboration on project management information systems and proposes a shift to Five C's for information systems in a cloud based operating platform, driven by cooperation, teamwork and continuous improvement. The proposed shift in the cloud indicates actual tools that may be adopted for better collaboration and higher business value of project information management.

  4. Fast and accurate computation of projected two-point functions

    Science.gov (United States)

    Grasshorn Gebhardt, Henry S.; Jeong, Donghui

    2018-01-01

    We present the two-point function from the fast and accurate spherical Bessel transformation (2-FAST) algorithm (our code is available at https://github.com/hsgg/twoFAST) for a fast and accurate computation of integrals involving one or two spherical Bessel functions. These types of integrals occur when projecting the galaxy power spectrum P(k) onto the configuration space, ξℓν(r), or spherical harmonic space, Cℓ(χ, χ'). First, we employ the FFTLog transformation of the power spectrum to divide the calculation into P(k)-dependent coefficients and P(k)-independent integrations of basis functions multiplied by spherical Bessel functions. We find analytical expressions for the latter integrals in terms of special functions, for which recursion provides a fast and accurate evaluation. The algorithm, therefore, circumvents direct integration of highly oscillating spherical Bessel functions.
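
    The simplest projection integral of this kind is ξℓ(r) = (1/2π²) ∫ dk k² P(k) jℓ(kr); the brute-force quadrature below (with a toy power spectrum, not a cosmological one) is the slow, oscillation-plagued baseline that the FFTLog-based 2-FAST algorithm is designed to avoid.

```python
# Brute-force baseline for the projection integral that 2-FAST accelerates:
#   xi_l(r) = 1/(2 pi^2) * Integral dk k^2 P(k) j_l(k r)
# Direct quadrature of the oscillatory integrand, as done here, is exactly what
# the FFTLog-based 2-FAST algorithm avoids.  The power spectrum is a toy
# placeholder, not a cosmological P(k).
import numpy as np
from scipy.special import spherical_jn

def xi_ell_direct(r, ell, pk_func, kmin=1e-4, kmax=10.0, nk=20000):
    k = np.logspace(np.log10(kmin), np.log10(kmax), nk)
    integrand = k**2 * pk_func(k) * spherical_jn(ell, k * r)
    # Trapezoidal rule on the logarithmic k grid.
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(k)) / (2 * np.pi**2)

toy_pk = lambda k: k / (1.0 + (k / 0.1)**3)        # placeholder spectrum
print(xi_ell_direct(r=50.0, ell=0, pk_func=toy_pk))
```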

  5. The Effect of a Graph-Oriented Computer-Assisted Project-Based Learning Environment on Argumentation Skills

    Science.gov (United States)

    Hsu, P. -S.; Van Dyke, M.; Chen, Y.; Smith, T. J.

    2015-01-01

    The purpose of this quasi-experimental study was to explore how seventh graders in a suburban school in the United States developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application. A total of 54 students (three classes) comprised this treatment…

  6. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    International Nuclear Information System (INIS)

    Bryant, R M; Holloway, F W; Van Arsdall, P J.

    1999-01-01

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-prong approach comprised of a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control point hardware with a supervisory system by providing the services needed for distributed control such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  7. Synthesis on the spent fuel long term evolution

    Energy Technology Data Exchange (ETDEWEB)

    Ferry, C.; Poinssot, Ch.; Lovera, P.; Poulesquen, A. [CEA Saclay, Dept. de Physico-Chimie (DEN/DPC), 91 - Gif sur Yvette (France); Broudic, V. [CEA Cadarache, Direction des Reacteurs Nucleaires (DRN), 13 - Saint Paul lez Durance (France); Cappelaere, Ch. [CEA Saclay, Dept. des Materiaux pour le Nucleaire(DMN), 91 - Gif-sur-Yvette (France); Desgranges, L. [CEA Cadarache, Direction des Reacteurs Nucleaires (DRN), 13 - Saint-Paul-lez-Durance (France); Garcia, Ph. [CEA Cadarache, Dept. d' Etudes des Combustibles (DEC), 13 - Saint Paul lez Durance (France); Jegou, Ch.; Roudil, D. [CEA Valrho, Dir. de l' Energie Nucleaire (DEN), 30 - Marcoule (France); Lovera, P.; Poulesquen, A. [CEA Saclay, Dept. de Physico-Chimie (DPC), 91 - Gif sur Yvette (France); Marimbeau, P. [CEA Cadarache, Dir. de l' Energie Nucleaire (DEN), 13 - Saint-Paul-lez-Durance (France); Gras, J.M.; Bouffioux, P. [Electricite de France (EDF), 75 - Paris (France)

    2005-07-01

    The French research on spent fuel long term evolution has been performed by CEA (Commissariat a l'Energie Atomique) since 1999 in the PRECCI project with the support of EDF (Electricite de France). These studies focused on the spent fuel behaviour under various conditions encountered in dry storage or in deep geological disposal. Three main types of conditions were discerned: - The evolution in a closed system which corresponds to the normal scenario in storage and to the first confinement phase in disposal; - The evolution in air which corresponds to an incidental loss of confinement during storage or to a rupture of the canister before the site re-saturation in geological disposal; - The evolution in water which corresponds to the normal scenario after the breaching of the canister in repository conditions. This document produced in the frame of the PRECCI project is an overview of the state of knowledge in 2004 concerning the long-term behavior of spent fuel under these various conditions. The state of the art was derived from the results obtained under the PRECCI project as well as from a review of the literature and of data acquired under the European project on Spent Fuel Stability under Repository Conditions. The main results issued from the French research are underlined. (authors)

  8. Synthesis on the spent fuel long term evolution

    International Nuclear Information System (INIS)

    Ferry, C.; Poinssot, Ch.; Lovera, P.; Poulesquen, A.; Broudic, V.; Cappelaere, Ch.; Desgranges, L.; Garcia, Ph.; Jegou, Ch.; Roudil, D.; Lovera, P.; Poulesquen, A.; Marimbeau, P.; Gras, J.M.; Bouffioux, P.

    2005-01-01

    The French research on spent fuel long term evolution has been performed by CEA (Commissariat a l'Energie Atomique) since 1999 in the PRECCI project with the support of EDF (Electricite de France). These studies focused on the spent fuel behaviour under various conditions encountered in dry storage or in deep geological disposal. Three main types of conditions were discerned: - The evolution in a closed system which corresponds to the normal scenario in storage and to the first confinement phase in disposal; - The evolution in air which corresponds to an incidental loss of confinement during storage or to a rupture of the canister before the site re-saturation in geological disposal; - The evolution in water which corresponds to the normal scenario after the breaching of the canister in repository conditions. This document produced in the frame of the PRECCI project is an overview of the state of knowledge in 2004 concerning the long-term behavior of spent fuel under these various conditions. The state of the art was derived from the results obtained under the PRECCI project as well as from a review of the literature and of data acquired under the European project on Spent Fuel Stability under Repository Conditions. The main results issued from the French research are underlined. (authors)

  9. Removing a barrier to computer-based outbreak and disease surveillance--the RODS Open Source Project.

    Science.gov (United States)

    Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J

    2004-09-24

    Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.

  10. Transitioning the GED[R] Mathematics Test to Computer with and without Accommodations: A Pilot Project

    Science.gov (United States)

    Patterson, Margaret Becker; Higgins, Jennifer; Bozman, Martha; Katz, Michael

    2011-01-01

    We conducted a pilot study to see how the GED Mathematics Test could be administered on computer with embedded accessibility tools. We examined test scores and test-taker experience. Nineteen GED test centers across five states and 216 randomly assigned GED test candidates participated in the project. GED candidates completed two GED mathematics…

  11. The Students Upgrading through Computer and Career Education Systems Services (Project SUCCESS). 1990-91 Final Evaluation Profile. OREA Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Research, Evaluation, and Assessment.

    An evaluation was done of the New York City Public Schools' Student Upgrading through Computer and Career Education Systems Services Program (Project SUCCESS). Project SUCCESS operated at 3 high schools in Brooklyn and Manhattan (Murry Bergtraum High School, Edward R. Murrow High School, and John Dewey High School). It enrolled limited English…

  12. Computer aided process planning at the Oak Ridge Y-12 plant: a pilot project

    International Nuclear Information System (INIS)

    Hewgley, R.E. Jr.; Prewett, H.P. Jr.

    1979-01-01

    In 1976, a formal needs analysis was conducted in one of the Fabrication Division shops, covering all activities from the receipt of an order through final machining. The results indicated deficiencies in process planning activities involving special production work. A pilot program was organized to investigate the benefits of emerging CAM technology and the application of GT concepts to machining operations at the Y-12 Plant. The objective of the CAPP Project was to provide computer-assisted process planning for special production machining in the shop. The CAPP team was charged with the specific goal of demonstrating computer-aided process planning within a four-year term. The CAPP charter included a plan with intermediate measurable milestones for achieving its mission. In three years, the CAPP project demonstrated benefits to process planning. A capability to retrieve historical records for similar parts, to review accurately the status of all staff assignments, and to generate detailed machining procedures can definitely affect the way in which a machine shop prepares for new orders. The real payoff is in the hardcopy output (N/C programs, studies, sequence plans, and procedures). 4 figures.

  13. A rapid parallelization of cone-beam projection and back-projection operator based on texture fetching interpolation

    Science.gov (United States)

    Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao

    2015-03-01

    Projection and back-projection are the most computationally demanding parts of Computed Tomography (CT) reconstruction. Parallelization strategies using GPU computing techniques have been introduced. In this paper we present a new parallelization scheme for both projection and back-projection. The proposed method is based on the CUDA technology developed by NVIDIA Corporation. Instead of building a complex model, we aimed at optimizing the existing algorithm and making it suitable for CUDA implementation so as to gain fast computation speed. Besides making use of the texture fetching operation, which provides faster interpolation, we fixed the number of samples in the projection computation to ensure the synchronization of blocks and threads, thus preventing the latency caused by inconsistent computational complexity. Experimental results demonstrate the computational efficiency and imaging quality of the proposed method.
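    The core trick described above, giving every ray the same fixed number of interpolated samples so that GPU threads stay in lockstep, can be sketched in plain Python/NumPy, with explicit bilinear interpolation standing in for the CUDA texture fetch. The function and parameter names below are illustrative and are not taken from the paper.

```python
import numpy as np

def forward_project_ray(image, src, det, n_samples=256):
    """Line integral along one ray, using the same fixed number of samples per ray.

    Fixing n_samples mirrors the idea of keeping the per-thread workload identical
    so GPU threads stay synchronized; the bilinear interpolation below is what a
    CUDA texture fetch would perform in hardware.
    """
    ts = np.linspace(0.0, 1.0, n_samples)           # evenly spaced sample positions
    xs = src[0] + ts * (det[0] - src[0])
    ys = src[1] + ts * (det[1] - src[1])

    x0 = np.clip(np.floor(xs).astype(int), 0, image.shape[1] - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, image.shape[0] - 2)
    fx, fy = xs - x0, ys - y0                       # fractional offsets inside a pixel
    vals = (image[y0, x0] * (1 - fx) * (1 - fy) +
            image[y0, x0 + 1] * fx * (1 - fy) +
            image[y0 + 1, x0] * (1 - fx) * fy +
            image[y0 + 1, x0 + 1] * fx * fy)

    step = np.hypot(det[0] - src[0], det[1] - src[1]) / (n_samples - 1)
    return vals.sum() * step

phantom = np.zeros((128, 128))
phantom[48:80, 48:80] = 1.0                         # simple square phantom
print(forward_project_ray(phantom, src=(0.0, 64.0), det=(127.0, 64.0)))
```

    On a GPU, each ray would map to one thread and the interpolation would be done by the texture unit in hardware.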

  14. ProjectQ: an open source software framework for quantum computing

    Directory of Open Access Journals (Sweden)

    Damian S. Steiger

    2018-01-01

    Full Text Available We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through simulation and enables running them on actual quantum hardware using a back-end connecting to the IBM Quantum Experience cloud service. Through extension mechanisms, users can provide back-ends to further quantum hardware, and scientists working on quantum compilation can provide plug-ins for additional compilation, optimization, gate synthesis, and layout strategies.
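    For orientation, the kind of minimal program typically written with this framework looks roughly as follows: a sketch of the commonly documented ProjectQ API for preparing and measuring a Bell state (exact module layout and engine options may differ between releases).

```python
from projectq import MainEngine
from projectq.ops import All, CNOT, H, Measure

eng = MainEngine()                 # default compiler with a built-in simulator back-end
qureg = eng.allocate_qureg(2)      # two qubits

H | qureg[0]                       # put the first qubit into superposition
CNOT | (qureg[0], qureg[1])        # entangle it with the second qubit
All(Measure) | qureg               # measure both qubits

eng.flush()                        # send the compiled circuit to the back-end
print("Measured:", int(qureg[0]), int(qureg[1]))
```

    Running the sketch repeatedly should print correlated outcomes (00 or 11), since the simulator back-end collapses the entangled pair on measurement.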

  15. The Evolution Process on Information Technology Outsourcing Relationship

    OpenAIRE

    Duan Weihua

    2017-01-01

    The information technology outsourcing relationship is one of the key issues for IT outsourcing success. To explore how to manage and promote the IT outsourcing relationship, it is necessary to understand its evolution process. First, the types of IT outsourcing based on relationship quality and IT outsourcing project level are analyzed; second, two evolution process models of the IT outsourcing relationship are proposed based on relationship quality and IT outsourcing project level, and the IT ou...

  16. The MELANIE project: from a biopsy to automatic protein map interpretation by computer.

    Science.gov (United States)

    Appel, R D; Hochstrasser, D F; Funk, M; Vargas, J R; Pellegrini, C; Muller, A F; Scherrer, J R

    1991-10-01

    The goals of the MELANIE project are to determine if disease-associated patterns can be detected in high resolution two-dimensional polyacrylamide gel electrophoresis (HR 2D-PAGE) images and if a diagnosis can be established automatically by computer. The ELSIE/MELANIE system is a set of computer programs which automatically detect, quantify, and compare protein spots shown on HR 2D-PAGE images. Classification programs help the physician to find disease-associated patterns from a given set of two-dimensional gel electrophoresis images and to form diagnostic rules. Prototype expert systems that use these rules to establish a diagnosis from new HR 2D-PAGE images have been developed. They successfully diagnosed cirrhosis of the liver and were able to distinguish a variety of cancer types from biopsies known to be cancerous.
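    The detect-and-quantify step at the heart of such a system can be mimicked with generic image-processing tools. The sketch below uses SciPy connected-component labelling on a synthetic gel; it illustrates the general idea only and is not the ELSIE/MELANIE algorithm, and the threshold and names are invented.

```python
import numpy as np
from scipy import ndimage

def detect_spots(gel_image, threshold=0.5):
    """Crude protein-spot detection on a 2D gel image (darker = more protein).

    Returns a list of (row, col, integrated_intensity) per detected spot.
    """
    signal = gel_image.max() - gel_image          # invert: spots become bright
    mask = signal > threshold * signal.max()      # simple global threshold
    labels, n_spots = ndimage.label(mask)         # connected components = spots
    centers = ndimage.center_of_mass(signal, labels, range(1, n_spots + 1))
    volumes = ndimage.sum(signal, labels, range(1, n_spots + 1))
    return [(r, c, v) for (r, c), v in zip(centers, volumes)]

# Synthetic gel with two dark spots on a light background.
gel = np.ones((64, 64))
gel[20:25, 20:25] = 0.1
gel[40:46, 50:55] = 0.2
for row, col, vol in detect_spots(gel):
    print(f"spot at ({row:.1f}, {col:.1f}), volume {vol:.1f}")
```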

  17. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations, and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping up to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  18. The Main Tendencies in the Development of Startup Projects as a Form of Innovative-Creative Enterprises in the Ukrainian Computer Programming Market

    Directory of Open Access Journals (Sweden)

    Garafonova Olga I.

    2017-10-01

    Full Text Available The article is aimed at studying the main tendencies in the development of startup projects as a form of innovative-creative enterprises in the Ukrainian computer programming market. A definition of «innovative-creative enterprises» has been proposed, and the main features of startups as a form of innovative-creative enterprises have been considered. The directions of development of the computer programming market were analyzed, considering the most significant future trends, products, and services in the computer programming sector. An analysis of startups in the Ukrainian computer programming market, based on the volume of investments made, was carried out. A model for the development of startup projects as a form of innovative-creative enterprises has been designed. Promising spheres in which startups have not yet been launched in the Ukrainian computer programming market have also been indicated.

  19. Computation of fragment velocities and projection angles of an anti-aircraft round

    CSIR Research Space (South Africa)

    Snyman, IM

    2014-09-01

    Full Text Available [Fragment of the full text, not an abstract: the round is modelled as a series of cylindrical rings measured from a reference point, with the position, characteristics, and calculated mass of each ring listed in Table 1 of the paper; the explosive is assigned the same projection power as Hexal P30 (Langen and Barth, 1979); to facilitate the computation, the mass of the fuse is modelled as a rectangular aluminium solid; the model is set up in ANSYS...]

  20. Trans-Amazon Drilling Project (TADP): origins and evolution of the forests, climate, and hydrology of the South American tropics

    Science.gov (United States)

    Baker, P. A.; Fritz, S. C.; Silva, C. G.; Rigsby, C. A.; Absy, M. L.; Almeida, R. P.; Caputo, M.; Chiessi, C. M.; Cruz, F. W.; Dick, C. W.; Feakins, S. J.; Figueiredo, J.; Freeman, K. H.; Hoorn, C.; Jaramillo, C.; Kern, A. K.; Latrubesse, E. M.; Ledru, M. P.; Marzoli, A.; Myrbo, A.; Noren, A.; Piller, W. E.; Ramos, M. I. F.; Ribas, C. C.; Trnadade, R.; West, A. J.; Wahnfried, I.; Willard, D. A.

    2015-12-01

    This article presents the scientific rationale for an ambitious ICDP drilling project to continuously sample Late Cretaceous to modern sediment in four different sedimentary basins that transect the equatorial Amazon of Brazil, from the Andean foreland to the Atlantic Ocean. The goals of this project are to document the evolution of plant biodiversity in the Amazon forests and to relate biotic diversification to changes in the physical environment, including climate, tectonism, and the surface landscape. These goals require long sedimentary records from each of the major sedimentary basins across the heart of the Brazilian Amazon, which can only be obtained by drilling because of the scarcity of Cenozoic outcrops. The proposed drilling will provide the first long, nearly continuous regional records of the Cenozoic history of the forests, their plant diversity, and the associated changes in climate and environment. It also will address fundamental questions about landscape evolution, including the history of Andean uplift and erosion as recorded in Andean foreland basins and the development of west-to-east hydrologic continuity between the Andes, the Amazon lowlands, and the equatorial Atlantic. Because many modern rivers of the Amazon basin flow along the major axes of the old sedimentary basins, we plan to locate drill sites on the margin of large rivers and to access the targeted drill sites by navigation along these rivers.

  1. Trans-Amazon Drilling Project (TADP): origins and evolution of the forests, climate, and hydrology of the South American tropics

    Science.gov (United States)

    Baker, P.A.; Fritz, S.C.; Silva, C.G.; Rigsby, C.A.; Absy, M.L.; Almeida, R.P.; Caputo, Maria C.; Chiessi, C.M.; Cruz, F.W.; Dick, C.W.; Feakins, S.J.; Figueiredo, J.; Freeman, K.H.; Hoorn, C.; Jaramillo, C.A.; Kern, A.; Latrubesse, E.M.; Ledru, M.P.; Marzoli, A.; Myrbo, A.; Noren, A.; Piller, W.E.; Ramos, M.I.F.; Ribas, C.C.; Trinadade, R.; West, A.J.; Wahnfried, I.; Willard, Debra A.

    2015-01-01

    This article presents the scientific rationale for an ambitious ICDP drilling project to continuously sample Late Cretaceous to modern sediment in four different sedimentary basins that transect the equatorial Amazon of Brazil, from the Andean foreland to the Atlantic Ocean. The goals of this project are to document the evolution of plant biodiversity in the Amazon forests and to relate biotic diversification to changes in the physical environment, including climate, tectonism, and the surface landscape. These goals require long sedimentary records from each of the major sedimentary basins across the heart of the Brazilian Amazon, which can only be obtained by drilling because of the scarcity of Cenozoic outcrops. The proposed drilling will provide the first long, nearly continuous regional records of the Cenozoic history of the forests, their plant diversity, and the associated changes in climate and environment. It also will address fundamental questions about landscape evolution, including the history of Andean uplift and erosion as recorded in Andean foreland basins and the development of west-to-east hydrologic continuity between the Andes, the Amazon lowlands, and the equatorial Atlantic. Because many modern rivers of the Amazon basin flow along the major axes of the old sedimentary basins, we plan to locate drill sites on the margin of large rivers and to access the targeted drill sites by navigation along these rivers.

  2. Common envelope evolution

    NARCIS (Netherlands)

    Taam, Ronald E.; Ricker, Paul M.

    2010-01-01

    The common envelope phase of binary star evolution plays a central role in many evolutionary pathways leading to the formation of compact objects in short period systems. Using three dimensional hydrodynamical computations, we review the major features of this evolutionary phase, focusing on the

  3. Interactive Evolution of Complex Behaviours Through Skill Encapsulation

    DEFF Research Database (Denmark)

    González de Prado Salas, Pablo; Risi, Sebastian

    2017-01-01

    Human-based computation (HBC) is an emerging research area in which humans and machines collaborate to solve tasks that neither one can solve in isolation. In evolutionary computation, HBC is often realized through interactive evolutionary computation (IEC), in which a user guides evolution by iteratively selecting the parents for the next generation. IEC has shown promise in a variety of different domains, but evolving more complex or hierarchically composed behaviours remains challenging with the traditional IEC approach. To overcome this challenge, this paper combines the recently introduced ESP... in evolutionary computation and, as the results in this paper show, IEC-ESP is able to solve complex control problems that are challenging for a traditional fitness-based approach.

  4. Collective Efficacy and Its Relationship with Leadership in a Computer-Mediated Project-Based Group Work

    Science.gov (United States)

    Huh, Yeol; Reigeluth, Charles M.; Lee, Dabae

    2014-01-01

    Based on Bandura's work, the four sources of efficacy shaping were examined with regard to frequency and students' perception of importance in a computer-mediated, project-based high school classroom. In a context of group work where there was no designated leader, groups' collective efficacy was examined to determine whether it has any relationship with individual's…

  5. A computer-based approach for Material, Manpower and Equipment management in the Construction Projects

    Science.gov (United States)

    Sasidhar, Jaladanki; Muthu, D.; Venkatasubramanian, C.; Ramakrishnan, K.

    2017-07-01

    The success of any construction project depends on managing resources efficiently so that the project is completed within a reasonable budget and time without compromising quality. Inefficient or late procurement of materials, failure to deploy adequate labour at the right time, and delayed mobilization of machinery all cause delays, loss of quality, and ultimately increased project cost. It is a known fact that project cost can be controlled by taking corrective actions on the mobilization of resources at the right time. This research focuses on integrating management systems with the computer to generate a model based on an OOM data structure, including automatic commodity code generation, automatic takeoff execution, intelligent purchase order generation, and integration of design and schedule components, to overcome stock-out problems. To address equipment management, an inventory management module is suggested, and a data set is analyzed comprising equipment registration number, equipment number, description, date of purchase, manufacturer, equipment price, market value, life of equipment, and production data (equipment number, date, name of the job, hourly rate, insurance, depreciation cost, taxes, storage cost, interest, and oil, grease, and fuel consumption); from this, decision support systems are generated to overcome the problems arising from improper management. Labour is managed through scheduling and the strategic management of human resources. With the generated decision support tools, resources are mobilized at the right time, helping the project manager finish the project on time and avoid abnormal project cost. The research also quantifies the achievable improvement and determines the percentage of delays caused by poor management of materials, manpower, and machinery in different types of projects.
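    As a toy illustration of the equipment and material records described above, the following sketch uses Python dataclasses; the field names follow the data set listed in the abstract, but the classes, the straight-line depreciation rule, and the reorder check are hypothetical simplifications, not the paper's model.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Equipment:
    """One record of the equipment register described in the abstract."""
    registration_no: str
    description: str
    purchase_date: date
    price: float
    market_value: float
    life_years: int
    hourly_rate: float

    def annual_depreciation(self) -> float:
        # Straight-line depreciation: an assumed, simplified rule.
        return self.price / self.life_years

@dataclass
class MaterialStock:
    code: str
    quantity: int
    reorder_level: int

    def needs_reorder(self) -> bool:
        # Stock-out prevention: flag the item before it runs dry.
        return self.quantity <= self.reorder_level

excavator = Equipment("EQ-017", "Hydraulic excavator", date(2015, 3, 1),
                      120000.0, 80000.0, 10, 55.0)
cement = MaterialStock(code="MAT-CEM-43", quantity=120, reorder_level=150)

print("Depreciation per year:", excavator.annual_depreciation())
print("Reorder cement?", cement.needs_reorder())
```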

  6. Computers for lattice field theories

    International Nuclear Information System (INIS)

    Iwasaki, Y.

    1994-01-01

    Parallel computers dedicated to lattice field theories are reviewed with emphasis on the three recent projects, the Teraflops project in the US, the CP-PACS project in Japan and the 0.5-Teraflops project in the US. Some new commercial parallel computers are also discussed. Recent development of semiconductor technologies is briefly surveyed in relation to possible approaches toward Teraflops computers. (orig.)

  7. Modeling investigation of the stability and irradiation-induced evolution of nanoscale precipitates in advanced structural materials

    International Nuclear Information System (INIS)

    Wirth, Brian

    2015-01-01

    Materials used in extremely hostile environments such as nuclear reactors are subject to a high flux of neutron irradiation, and thus vast concentrations of vacancy and interstitial point defects are produced because of collisions of energetic neutrons with host lattice atoms. The fate of these defects depends on various reaction mechanisms which occur immediately following the displacement cascade evolution and during the longer-time kinetically dominated evolution such as annihilation, recombination, clustering or trapping at sinks of vacancies, interstitials and their clusters. The long-range diffusional transport and evolution of point defects and self-defect clusters drive a microstructural and microchemical evolution that is known to produce degradation of mechanical properties including the creep rate, yield strength, ductility, or fracture toughness, and correspondingly affect material serviceability and lifetimes in nuclear applications. Therefore, a detailed understanding of microstructural evolution in materials at different time and length scales is of significant importance. The primary objective of this work is to utilize a hierarchical computational modeling approach i) to evaluate the potential for nanoscale precipitates to enhance point defect recombination rates and thereby the self-healing ability of advanced structural materials, and ii) to evaluate the stability and irradiation-induced evolution of such nanoscale precipitates resulting from enhanced point defect transport to and annihilation at precipitate interfaces. This project will utilize, and as necessary develop, computational materials modeling techniques within a hierarchical computational modeling approach, principally including molecular dynamics, kinetic Monte Carlo and spatially-dependent cluster dynamics modeling, to identify and understand the most important physical processes relevant to promoting the ''self-healing'' or radiation resistance in advanced

  8. Modeling investigation of the stability and irradiation-induced evolution of nanoscale precipitates in advanced structural materials

    Energy Technology Data Exchange (ETDEWEB)

    Wirth, Brian [Univ. of Tennessee, Knoxville, TN (United States)

    2015-04-08

    Materials used in extremely hostile environments such as nuclear reactors are subject to a high flux of neutron irradiation, and thus vast concentrations of vacancy and interstitial point defects are produced because of collisions of energetic neutrons with host lattice atoms. The fate of these defects depends on various reaction mechanisms which occur immediately following the displacement cascade evolution and during the longer-time kinetically dominated evolution such as annihilation, recombination, clustering or trapping at sinks of vacancies, interstitials and their clusters. The long-range diffusional transport and evolution of point defects and self-defect clusters drive a microstructural and microchemical evolution that is known to produce degradation of mechanical properties including the creep rate, yield strength, ductility, or fracture toughness, and correspondingly affect material serviceability and lifetimes in nuclear applications. Therefore, a detailed understanding of microstructural evolution in materials at different time and length scales is of significant importance. The primary objective of this work is to utilize a hierarchical computational modeling approach i) to evaluate the potential for nanoscale precipitates to enhance point defect recombination rates and thereby the self-healing ability of advanced structural materials, and ii) to evaluate the stability and irradiation-induced evolution of such nanoscale precipitates resulting from enhanced point defect transport to and annihilation at precipitate interfaces. This project will utilize, and as necessary develop, computational materials modeling techniques within a hierarchical computational modeling approach, principally including molecular dynamics, kinetic Monte Carlo and spatially-dependent cluster dynamics modeling, to identify and understand the most important physical processes relevant to promoting the “self-healing” or radiation resistance in advanced materials containing

  9. Project of computer program for designing the steel with the assumed CCT diagram

    OpenAIRE

    S. Malara; J. Trzaska; L.A. Dobrzański

    2007-01-01

    Purpose: The aim of this paper was to develop a computer-aided method for designing the chemical composition of steel with an assumed CCT diagram. Design/methodology/approach: The purpose has been achieved in four stages. In the first stage, characteristic points of the CCT diagram have been determined. In the second stage, neural networks have been developed, and next, CCT diagram terms of similarity have been worked out at the third stage. In the last stage, steel chemical composition optimizat...
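    A minimal sketch of the neural-network stage (regressing characteristic CCT points on chemical composition) could look like the following. The training data, feature set, and network size here are invented for illustration and are not taken from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Invented training data: wt.% of C, Mn, Cr, Mo for a handful of steels,
# and two assumed "characteristic points" of their CCT diagrams (deg C).
X = np.array([[0.20, 0.80, 0.10, 0.00],
              [0.40, 0.70, 1.00, 0.20],
              [0.30, 1.20, 0.50, 0.10],
              [0.55, 0.60, 0.90, 0.25],
              [0.25, 1.00, 0.30, 0.05]])
y = np.array([[430.0, 560.0],
              [340.0, 520.0],
              [380.0, 540.0],
              [300.0, 500.0],
              [410.0, 550.0]])

# Small multi-output neural network mapping composition -> CCT points.
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

candidate = np.array([[0.35, 0.90, 0.60, 0.15]])   # a hypothetical steel
print("Predicted CCT characteristic points:", model.predict(candidate))
```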

  10. Magnetohydrodynamics: Parallel computation of the dynamics of thermonuclear and astrophysical plasmas. 1. Annual report of massively parallel computing pilot project 93MPR05

    International Nuclear Information System (INIS)

    1994-08-01

    This is the first annual report of the MPP pilot project 93MPR05. In this pilot project four research groups with different, complementary backgrounds collaborate with the aim to develop new algorithms and codes to simulate the magnetohydrodynamics of thermonuclear and astrophysical plasmas on massively parallel machines. The expected speed-up is required to simulate the dynamics of the hot plasmas of interest which are characterized by very large magnetic Reynolds numbers and, hence, require high spatial and temporal resolutions (for details see section 1). The four research groups that collaborated to produce the results reported here are: The MHD group of Prof. Dr. J.P. Goedbloed at the FOM-Institute for Plasma Physics 'Rijnhuizen' in Nieuwegein, the group of Prof. Dr. H. van der Vorst at the Mathematics Institute of Utrecht University, the group of Prof. Dr. A.G. Hearn at the Astronomical Institute of Utrecht University, and the group of Dr. Ir. H.J.J. te Riele at the CWI in Amsterdam. The full project team met frequently during this first project year to discuss progress reports, current problems, etc. (see section 2). The main results of the first project year are: - Proof of the scalability of typical linear and nonlinear MHD codes - development and testing of a parallel version of the Arnoldi algorithm - development and testing of alternative methods for solving large non-Hermitian eigenvalue problems - porting of the 3D nonlinear semi-implicit time evolution code HERA to an MPP system. The steps that were scheduled to reach these intended results are given in section 3. (orig./WL)

  11. Magnetohydrodynamics: Parallel computation of the dynamics of thermonuclear and astrophysical plasmas. 1. Annual report of massively parallel computing pilot project 93MPR05

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-08-01

    This is the first annual report of the MPP pilot project 93MPR05. In this pilot project four research groups with different, complementary backgrounds collaborate with the aim to develop new algorithms and codes to simulate the magnetohydrodynamics of thermonuclear and astrophysical plasmas on massively parallel machines. The expected speed-up is required to simulate the dynamics of the hot plasmas of interest which are characterized by very large magnetic Reynolds numbers and, hence, require high spatial and temporal resolutions (for details see section 1). The four research groups that collaborated to produce the results reported here are: The MHD group of Prof. Dr. J.P. Goedbloed at the FOM-Institute for Plasma Physics `Rijnhuizen` in Nieuwegein, the group of Prof. Dr. H. van der Vorst at the Mathematics Institute of Utrecht University, the group of Prof. Dr. A.G. Hearn at the Astronomical Institute of Utrecht University, and the group of Dr. Ir. H.J.J. te Riele at the CWI in Amsterdam. The full project team met frequently during this first project year to discuss progress reports, current problems, etc. (see section 2). The main results of the first project year are: - Proof of the scalability of typical linear and nonlinear MHD codes - development and testing of a parallel version of the Arnoldi algorithm - development and testing of alternative methods for solving large non-Hermitian eigenvalue problems - porting of the 3D nonlinear semi-implicit time evolution code HERA to an MPP system. The steps that were scheduled to reach these intended results are given in section 3. (orig./WL).
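    The Arnoldi algorithm mentioned in both reports builds an orthonormal Krylov basis whose small projected Hessenberg matrix approximates the extreme eigenvalues of a large non-Hermitian operator. A serial NumPy sketch of the textbook iteration (not the project's parallel implementation) is:

```python
import numpy as np

def arnoldi(A, v0, m):
    """m steps of the Arnoldi iteration for a (possibly non-Hermitian) matrix A.

    Returns an orthonormal basis V (n x (m+1)) and the upper Hessenberg
    matrix H ((m+1) x m) such that A @ V[:, :m] = V @ H.
    """
    n = A.shape[0]
    V = np.zeros((n, m + 1), dtype=complex)
    H = np.zeros((m + 1, m), dtype=complex)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt orthogonalization
            H[i, j] = np.vdot(V[:, i], w)
            w -= H[i, j] * V[:, i]
        h_next = np.linalg.norm(w)
        H[j + 1, j] = h_next
        if h_next < 1e-12:                     # "happy breakdown": invariant subspace found
            return V[:, :j + 1], H[:j + 1, :j]
        V[:, j + 1] = w / h_next
    return V, H

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))            # stand-in for a large non-Hermitian operator
V, H = arnoldi(A, rng.standard_normal(200), m=30)
ritz = np.linalg.eigvals(H[:-1, :])            # Ritz values approximate outer eigenvalues
print(sorted(ritz, key=abs, reverse=True)[:3])
```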

  12. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  13. Evolution of project planning tools in a matrix organization

    Energy Technology Data Exchange (ETDEWEB)

    Furaus, J.P.; Figueroa-McInteer, C.; McKeever, P.S.; Wisler, D.B. [Sandia National Labs., Albuquerque, NM (United States); Zavadil, J.T. [Infomatrix (United States)

    1996-10-01

    Until recently, the Corporate Construction Program at Sandia was experiencing difficulties in managing projects: poor planning and cost estimating caused schedule and budget problems. The first step taken was a Microsoft® Project schedule that provides a standard template for scheduling individual construction projects. It is broken down according to the life cycle of the project and prevents the project team from leaving out an important item. A WBS (work breakdown structure) dictionary was also developed that describes how capital and operating funds are used to develop, design, construct, equip, and manage projects. We also developed a matrix chart that maps the planning guide against the major types of construction projects at Sandia. The guide, dictionary, and matrix chart offer enough flexibility that the project manager can make choices about how to structure work, yet ensure that all work rolls up to the cost categories and key DOE WBS elements. As requirements change, the tools can be updated; they also serve as training tools for new project team members.

  14. WRF4G project: Adaptation of WRF Model to Distributed Computing Infrastructures

    Science.gov (United States)

    Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García Díez, Markel; Blanco Real, Jose C.; Fernández, Jesús

    2013-04-01

    Nowadays Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the first objective of this project is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case study simulations, regional hind-cast/forecast, sensitivity studies, etc.). The WRF model is also used as input by the energy and natural hazards communities, which will therefore benefit as well. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for a long period of time, which makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the jobs and the data. Thus, the second objective of the project consists of the development of a generic adaptation of WRF for the Grid (WRF4G), to be distributed as open source and to be integrated in the official WRF development cycle. The use of this WRF adaptation should be transparent and useful for any of the previously described studies, and should avoid the problems of the Grid infrastructure. Moreover, it should simplify access to Grid infrastructures for research teams, and also free them from the technical and computational aspects of the use of the Grid. Finally, in order to

  15. Computer-Assisted Classification Patterns in Autoimmune Diagnostics: The AIDA Project

    Directory of Open Access Journals (Sweden)

    Amel Benammar Elgaaied

    2016-01-01

    Full Text Available Antinuclear antibodies (ANAs) are significant biomarkers in the diagnosis of autoimmune diseases in humans, detected by means of the Indirect ImmunoFluorescence (IIF) method and assessed by analyzing patterns and fluorescence intensity. This paper introduces the AIDA Project (autoimmunity: diagnosis assisted by computer), developed in the framework of an Italy-Tunisia cross-border cooperation, and its preliminary results. A database of interpreted IIF images is being collected through the exchange of images and double reporting, and a Gold Standard database, containing around 1000 double-reported images, has been established. The Gold Standard database is used for optimization of a CAD (Computer Aided Detection) solution and for the assessment of its added value, in order to be applied along with an Immunologist as a second Reader in the detection of autoantibodies. This CAD system is able to identify on IIF images the fluorescence intensity and the fluorescence pattern. Preliminary results show that the CAD, used as second Reader, appeared to perform better than Junior Immunologists and hence may significantly improve their efficacy; compared with two Junior Immunologists, the CAD system showed higher Intensity Accuracy (85.5% versus 66.0% and 66.0%), higher Patterns Accuracy (79.3% versus 48.0% and 66.2%), and higher Mean Class Accuracy (79.4% versus 56.7% and 64.2%).

  16. Computer-Assisted Classification Patterns in Autoimmune Diagnostics: The AIDA Project.

    Science.gov (United States)

    Benammar Elgaaied, Amel; Cascio, Donato; Bruno, Salvatore; Ciaccio, Maria Cristina; Cipolla, Marco; Fauci, Alessandro; Morgante, Rossella; Taormina, Vincenzo; Gorgi, Yousr; Marrakchi Triki, Raja; Ben Ahmed, Melika; Louzir, Hechmi; Yalaoui, Sadok; Imene, Sfar; Issaoui, Yassine; Abidi, Ahmed; Ammar, Myriam; Bedhiafi, Walid; Ben Fraj, Oussama; Bouhaha, Rym; Hamdi, Khouloud; Soumaya, Koudhi; Neili, Bilel; Asma, Gati; Lucchese, Mariano; Catanzaro, Maria; Barbara, Vincenza; Brusca, Ignazio; Fregapane, Maria; Amato, Gaetano; Friscia, Giuseppe; Neila, Trai; Turkia, Souayeh; Youssra, Haouami; Rekik, Raja; Bouokez, Hayet; Vasile Simone, Maria; Fauci, Francesco; Raso, Giuseppe

    2016-01-01

    Antinuclear antibodies (ANAs) are significant biomarkers in the diagnosis of autoimmune diseases in humans, detected by means of the Indirect ImmunoFluorescence (IIF) method and assessed by analyzing patterns and fluorescence intensity. This paper introduces the AIDA Project (autoimmunity: diagnosis assisted by computer), developed in the framework of an Italy-Tunisia cross-border cooperation, and its preliminary results. A database of interpreted IIF images is being collected through the exchange of images and double reporting, and a Gold Standard database, containing around 1000 double-reported images, has been established. The Gold Standard database is used for optimization of a CAD (Computer Aided Detection) solution and for the assessment of its added value, in order to be applied along with an Immunologist as a second Reader in the detection of autoantibodies. This CAD system is able to identify on IIF images the fluorescence intensity and the fluorescence pattern. Preliminary results show that the CAD, used as second Reader, appeared to perform better than Junior Immunologists and hence may significantly improve their efficacy; compared with two Junior Immunologists, the CAD system showed higher Intensity Accuracy (85.5% versus 66.0% and 66.0%), higher Patterns Accuracy (79.3% versus 48.0% and 66.2%), and higher Mean Class Accuracy (79.4% versus 56.7% and 64.2%).

  17. Computer-Assisted Classification Patterns in Autoimmune Diagnostics: The AIDA Project

    Science.gov (United States)

    Benammar Elgaaied, Amel; Cascio, Donato; Bruno, Salvatore; Ciaccio, Maria Cristina; Cipolla, Marco; Fauci, Alessandro; Morgante, Rossella; Taormina, Vincenzo; Gorgi, Yousr; Marrakchi Triki, Raja; Ben Ahmed, Melika; Louzir, Hechmi; Yalaoui, Sadok; Imene, Sfar; Issaoui, Yassine; Abidi, Ahmed; Ammar, Myriam; Bedhiafi, Walid; Ben Fraj, Oussama; Bouhaha, Rym; Hamdi, Khouloud; Soumaya, Koudhi; Neili, Bilel; Asma, Gati; Lucchese, Mariano; Catanzaro, Maria; Barbara, Vincenza; Brusca, Ignazio; Fregapane, Maria; Amato, Gaetano; Friscia, Giuseppe; Neila, Trai; Turkia, Souayeh; Youssra, Haouami; Rekik, Raja; Bouokez, Hayet; Vasile Simone, Maria; Fauci, Francesco; Raso, Giuseppe

    2016-01-01

    Antinuclear antibodies (ANAs) are significant biomarkers in the diagnosis of autoimmune diseases in humans, detected by means of the Indirect ImmunoFluorescence (IIF) method and assessed by analyzing patterns and fluorescence intensity. This paper introduces the AIDA Project (autoimmunity: diagnosis assisted by computer), developed in the framework of an Italy-Tunisia cross-border cooperation, and its preliminary results. A database of interpreted IIF images is being collected through the exchange of images and double reporting, and a Gold Standard database, containing around 1000 double-reported images, has been established. The Gold Standard database is used for optimization of a CAD (Computer Aided Detection) solution and for the assessment of its added value, in order to be applied along with an Immunologist as a second Reader in the detection of autoantibodies. This CAD system is able to identify on IIF images the fluorescence intensity and the fluorescence pattern. Preliminary results show that the CAD, used as second Reader, appeared to perform better than Junior Immunologists and hence may significantly improve their efficacy; compared with two Junior Immunologists, the CAD system showed higher Intensity Accuracy (85.5% versus 66.0% and 66.0%), higher Patterns Accuracy (79.3% versus 48.0% and 66.2%), and higher Mean Class Accuracy (79.4% versus 56.7% and 64.2%). PMID:27042658
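    The comparison metrics quoted in these records (per-class accuracy and its unweighted mean, the Mean Class Accuracy) can be computed from a confusion matrix as below; the matrix values are invented and the pattern classes are only examples, not the AIDA data.

```python
import numpy as np

def class_accuracies(confusion):
    """Per-class accuracy (recall) and their unweighted mean from a confusion matrix.

    Rows are true classes, columns are predicted classes.
    """
    confusion = np.asarray(confusion, dtype=float)
    per_class = np.diag(confusion) / confusion.sum(axis=1)
    return per_class, per_class.mean()

# Invented 3-class example (e.g. homogeneous / speckled / nucleolar patterns).
cm = [[80, 10, 10],
      [12, 70, 18],
      [ 5, 15, 80]]
per_class, mean_class_acc = class_accuracies(cm)
print("Per-class accuracy:", np.round(per_class, 3))
print("Mean class accuracy:", round(mean_class_acc, 3))
```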

  18. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of the cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. The theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique. Human head-work is being automated and man is losing function. (orig.) [de

  19. Method of computer generation and projection recording of microholograms for holographic memory systems: mathematical modelling and experimental implementation

    International Nuclear Information System (INIS)

    Betin, A Yu; Bobrinev, V I; Evtikhiev, N N; Zherdev, A Yu; Zlokazov, E Yu; Lushnikov, D S; Markin, V V; Odinokov, S B; Starikov, S N; Starikov, R S

    2013-01-01

    A method of computer generation and projection recording of microholograms for holographic memory systems is presented; the results of mathematical modelling and experimental implementation of the method are demonstrated. (holographic memory)
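    As a generic illustration of how a hologram can be generated numerically, the sketch below computes a binary Fourier hologram of a small data page with an FFT. It is a textbook-style example, not the projection-recording method of the paper, and all names and parameters are invented.

```python
import numpy as np

def fourier_hologram(data_page, seed=0):
    """Compute a binary computer-generated Fourier hologram of a 2D data page.

    A random phase spreads the page's spectrum, the far field is obtained with
    an FFT, and the interference with a tilted reference wave is thresholded
    to a binary transmission pattern.
    """
    rng = np.random.default_rng(seed)
    page = data_page.astype(complex) * np.exp(2j * np.pi * rng.random(data_page.shape))
    far_field = np.fft.fftshift(np.fft.fft2(page))
    reference = np.exp(2j * np.pi * np.arange(data_page.shape[1]) / 8.0)  # tilted plane wave
    interference = np.abs(far_field + reference) ** 2
    return (interference > interference.mean()).astype(np.uint8)

page = np.zeros((64, 64))
page[16:48:4, 16:48:4] = 1.0                    # a sparse grid of data bits
holo = fourier_hologram(page)
print("Hologram fill factor:", holo.mean())
```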

  20. The E-MOSAICS project: simulating the formation and co-evolution of galaxies and their star cluster populations

    Science.gov (United States)

    Pfeffer, Joel; Kruijssen, J. M. Diederik; Crain, Robert A.; Bastian, Nate

    2018-04-01

    We introduce the MOdelling Star cluster population Assembly In Cosmological Simulations within EAGLE (E-MOSAICS) project. E-MOSAICS incorporates models describing the formation, evolution, and disruption of star clusters into the EAGLE galaxy formation simulations, enabling the examination of the co-evolution of star clusters and their host galaxies in a fully cosmological context. A fraction of the star formation rate of dense gas is assumed to yield a cluster population; this fraction and the population's initial properties are governed by the physical properties of the natal gas. The subsequent evolution and disruption of the entire cluster population are followed accounting for two-body relaxation, stellar evolution, and gravitational shocks induced by the local tidal field. This introductory paper presents a detailed description of the model and initial results from a suite of 10 simulations of ˜L⋆ galaxies with disc-like morphologies at z = 0. The simulations broadly reproduce key observed characteristics of young star clusters and globular clusters (GCs), without invoking separate formation mechanisms for each population. The simulated GCs are the surviving population of massive clusters formed at early epochs (z ≳ 1-2), when the characteristic pressures and surface densities of star-forming gas were significantly higher than observed in local galaxies. We examine the influence of the star formation and assembly histories of galaxies on their cluster populations, finding that (at similar present-day mass) earlier-forming galaxies foster a more massive and disruption-resilient cluster population, while galaxies with late mergers are capable of forming massive clusters even at late cosmic epochs. We find that the phenomenological treatment of interstellar gas in EAGLE precludes the accurate modelling of cluster disruption in low-density environments, but infer that simulations incorporating an explicitly modelled cold interstellar gas phase will overcome

  1. Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.

    Science.gov (United States)

    Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector

    2018-01-01

    Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.
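    The complexity measures discussed here (sophistication, coarse sophistication, busy beaver logical depth) are uncomputable, so any practical illustration must rely on crude proxies. The sketch below uses compressed size as a stand-in for plain description length and decompression time as a rough stand-in for logical depth; it is a heavily caveated illustration of that general idea, not the paper's formal definitions.

```python
import os
import time
import zlib

def complexity_proxies(data: bytes, repeats: int = 200):
    """Crude, computable stand-ins for uncomputable complexity measures.

    Compressed size approximates plain description length; the average time
    needed to decompress is a rough proxy for logical depth. Both are only
    illustrative: the measures discussed in the paper cannot be computed.
    """
    compressed = zlib.compress(data, level=9)
    start = time.perf_counter()
    for _ in range(repeats):
        zlib.decompress(compressed)
    depth_proxy = (time.perf_counter() - start) / repeats
    return len(compressed), depth_proxy

regular = b"ab" * 5000            # highly regular string
random_ = os.urandom(10000)       # incompressible string

for name, s in [("regular", regular), ("random", random_)]:
    size, depth = complexity_proxies(s)
    print(f"{name:8s} compressed size = {size:6d} bytes, depth proxy = {depth * 1e6:.1f} us")
```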

  2. 100 million years of multigene family evolution: origin and evolution of the avian MHC class IIB

    Czech Academy of Sciences Publication Activity Database

    Goebel, J.; Promerová, Marta; Bonadonna, F.; McCoy, K. D.; Serbielle, C.; Strandh, M.; Yannic, G.; Burri, R.; Fumagalli, L.

    2017-01-01

    Vol. 18, No. 460 (2017), pp. 1-9, ISSN 1471-2164. R&D Projects: GA ČR GAP505/10/1871. Institutional support: RVO:68081766. Keywords: Birds * Birth-death evolution * Concerted evolution * Gene duplication * Gene conversion * Major histocompatibility complex * Recombination. Subject RIV: EG - Zoology. OECD field: Genetics and heredity (medical genetics to be 3). Impact factor: 3.729, year: 2016

  3. 100 million years of multigene family evolution: origin and evolution of the avian MHC class IIB

    Czech Academy of Sciences Publication Activity Database

    Goebel, J.; Promerová, Marta; Bonadonna, F.; McCoy, K. D.; Serbielle, C.; Strandh, M.; Yannic, G.; Burri, R.; Fumagalli, L.

    2017-01-01

    Vol. 18, No. 460 (2017), pp. 1-9, ISSN 1471-2164. R&D Projects: GA ČR GAP505/10/1871. Institutional support: RVO:68081766. Keywords: Birds * Birth-death evolution * Concerted evolution * Gene duplication * Gene conversion * Major histocompatibility complex * Recombination. Subject RIV: EG - Zoology. OECD field: Genetics and heredity (medical genetics to be 3). Impact factor: 3.729, year: 2016

  4. A directory of computer codes suitable for stress analysis of HLW containers - Compas project

    International Nuclear Information System (INIS)

    1989-01-01

    This document reports the work carried out for the Compas project, which looked at the capabilities of various computer codes for the stress analysis of high-level nuclear-waste containers and overpacks. The report concentrates on codes used by the project partners, but also includes a number of the major commercial finite element codes. The report falls into two parts. The first part describes the capabilities of the codes, including details of the solution methods used, the types of analysis they can carry out, and the interfacing with pre- and post-processing packages; this is the more comprehensive section of the report. The second part looks at the performance of a selection of the codes (those used by the project partners), examining how the codes perform in a number of test problems which require calculations typical of those encountered in the design and analysis of high-level waste containers and overpacks.

  5. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Science.gov (United States)

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers

  6. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Directory of Open Access Journals (Sweden)

    Nicholas V Olijnyk

    Full Text Available This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology
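    The co-word cluster network analysis used in the second phase boils down to counting how often keyword pairs co-occur in publication records and then grouping strongly linked terms. A standard-library-only sketch (with illustrative keyword lists and an arbitrary threshold, not the authors' pipeline) is:

```python
from collections import Counter
from itertools import combinations

# Invented keyword lists, one per publication record.
records = [
    ["quantum-key-distribution", "decoy-state", "photons"],
    ["quantum-key-distribution", "photons", "quantum-optics"],
    ["quantum-entanglement", "bell-state", "photons"],
    ["quantum-entanglement", "decoy-state", "quantum-key-distribution"],
    ["network-protocols", "attack-strategies"],
]

# Co-occurrence counts over unordered keyword pairs.
cooc = Counter()
for keywords in records:
    cooc.update(combinations(sorted(set(keywords)), 2))

# Keep only "strong" links and group terms into clusters (connected components).
threshold = 2
links = {pair for pair, n in cooc.items() if n >= threshold}
adjacency = {}
for a, b in links:
    adjacency.setdefault(a, set()).add(b)
    adjacency.setdefault(b, set()).add(a)

seen, clusters = set(), []
for node in adjacency:
    if node in seen:
        continue
    stack, component = [node], set()
    while stack:                      # simple depth-first traversal
        current = stack.pop()
        if current in component:
            continue
        component.add(current)
        stack.extend(adjacency[current] - component)
    seen |= component
    clusters.append(sorted(component))

print("Strong links:", sorted(links))
print("Thematic clusters:", clusters)
```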

  7. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  8. Evolution of project management research: a bibliometric study of International Journal of Project Management

    Directory of Open Access Journals (Sweden)

    Fábio Cocchi da Silva Eiras

    2017-03-01

    Full Text Available Over the past decades, the project management field has evolved and consolidated. In view of this growth, this research aims to identify the main trends of research in the area, as well as to provide an overview of publications, identifying new issues, changes in approaches, and the development of knowledge areas. To do so, a systematic review of the literature was performed using a bibliometric study of the papers of the International Journal of Project Management (IJPM), indexed in SCOPUS, from its first volume to 2015, covering a period of more than 30 years. It was found that developing countries are increasingly engaged in research in the field of project management, especially in mega infrastructure projects and public-private partnerships. Risk is a central topic in all periods of analysis; however, strategic topics such as success in project and portfolio management are among the fastest growing. Issues related to the soft side of project management, such as skills, culture, and knowledge management, have emerged in recent periods. By industry, construction projects and projects in information technology are the most studied throughout the period analysed.

  9. Chirp subbottom profiler data collected in Pamlico Sound on cruise EPamSh-2016 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  10. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  11. The Impact of Project Manager on Project Success – The Case of ICT Sector

    OpenAIRE

    Blaskovics, Bálint

    2016-01-01

    The project management literature on project success is rich. Numerous papers focus on the evolution of the understanding of project success, identification of success criteria and critical success factors. Critical success factors increase the potential for achieving project success, while project success can be evaluated with the help of success criteria. Although the interrelationships between critical success factors and success criteria are rarely analyzed, yet there is a strong demand f...

  12. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  13. Public outreach: (R)evolution by the lakeside

    CERN Multimedia

    2006-01-01

    Why do the planets revolve around the Sun? Has genetic science shaken Darwin's theories to their foundations? Are viruses the champions of evolution? Is progress a form of tradition? On 8 and 9 July, Geneva's Science History Museum is inviting you to a Science Night on the theme of 'Evolution, revolution'. The Sixth Science Night will host some 60 stands and offer workshops for children, guided tours, exhibitions and shows. Anticipating the (r)evolutions from the LHC, CERN will also be taking part in the event. The future accelerator promises to deliver scientific advances and may even turn our understanding of the infinitesimally small on its head. However, the LHC has already led to technological breakthroughs. The Laboratory's stand will put a special emphasis on the part played by CERN in the computing revolution, from the Web to the Computing Grid. Computer animations will be used to explain these technologies which have spin-offs well beyond the field of particle physics that are of benefit to the whol...

  14. Exploring the use of computer-mediated video communication in engineering projects in South Africa

    Directory of Open Access Journals (Sweden)

    Meyer, Izak P.

    2016-08-01

    Full Text Available Globally-expanding organisations that are trying to capitalise on distributed skills are increasingly using virtual project teams to shorten product development time and increase quality. These virtual teams, which are distributed across countries, cultures, and time zones, are required to use faster and better ways of interacting. Past research has shown that virtual teams that use computer-mediated communication (CMC) instead of face-to-face communication are less cohesive because they struggle with mistrust, controlling behaviour, and communication breakdowns. This study aims to determine whether project practitioners in South Africa perceive virtual teams that use videoconferencing as suffering from the same CMC disadvantages described in past research in other environments; and if they do, what the possible causes could be. This paper reports on a survey of 106 project practitioners in South Africa. The results show that these project practitioners prefer face-to-face communication over CMC, and perceive virtual teams using videoconferencing to be less cohesive and to suffer from mistrust and communication breakdowns, but not from increased conflict and power struggles. The perceived shortcomings of videoconferencing might result from virtual teams that use this medium having less time to build interpersonal relationships.

  15. An overview of the Environmental Monitoring Computer Automation Project

    International Nuclear Information System (INIS)

    Johnson, S.M.; Lorenz, R.

    1992-01-01

    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained are important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS

  16. Computing segmentations directly from x-ray projection data via parametric deformable curves

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen; Dahl, Anders Bjorholm; Hansen, Per Christian

    2018-01-01

    We describe an efficient algorithm that computes a segmented reconstruction directly from x-ray projection data. Our algorithm uses a parametric curve to define the segmentation. Unlike similar approaches which are based on level-sets, our method avoids a pixel or voxel grid; hence the number of unknowns is reduced to the set of points that define the curve, and attenuation coefficients of the segments. Our current implementation uses a simple closed curve and is capable of separating one object from the background. However, our basic algorithm can be applied to an arbitrary topology and multiple...

  17. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2013-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during the LHC Run I. The ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or a service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and the ATLAS Management, for long-term trends and accounting information about the ATLAS Distributed Computing resources. During the LHC Run I a significant development effort has been invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind visual identity of the provided graphical elements, and re-usability of the visua...

  18. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2014-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during the LHC Run I. The ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or a service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and the ATLAS Management, for long-term trends and accounting information about the ATLAS Distributed Computing resources. During the LHC Run I a significant development effort has been invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind visual identity of the provided graphical elements, and re-usability of the visua...

  19. The evolution of single stars

    International Nuclear Information System (INIS)

    Tayler, R.J.

    1982-01-01

    The general outline of the evolution of single stars is well understood but at most stages of evolution important uncertainties remain. This paper contains a very personal view of what are the major uncertainties and of what problems remain to be solved before one can be satisfied with the theory. It is suggested that some problems may be essentially insoluble even with the very large and fast computers that are currently available. (author)

  20. Computational Fluid Dynamics (CFD) Computations With Zonal Navier-Stokes Flow Solver (ZNSFLOW) Common High Performance Computing Scalable Software Initiative (CHSSI) Software

    National Research Council Canada - National Science Library

    Edge, Harris

    1999-01-01

    ...), computational fluid dynamics (CFD) 6 project. Under the project, a proven zonal Navier-Stokes solver was rewritten for scalable parallel performance on both shared memory and distributed memory high performance computers...

  1. 1990 CERN School of Computing

    International Nuclear Information System (INIS)

    1991-01-01

    These Proceedings contain written versions of lectures delivered at the 1990 CERN School of Computing, covering a variety of topics. Computer networks are treated in three papers: standards in computer networking; evolution of local and metropolitan area networks; asynchronous transfer mode, the solution for broadband ISDN. Data acquisition and analysis are the topic of papers on: data acquisition using MODEL software; graphical event analysis. Two papers in the field of signal processing treat digital image processing and the use of digital signal processors in HEP. Another paper reviews the present state of digital optical computing. Operating systems and programming discipline are covered in two papers: UNIX, evolution towards distributed systems; new developments in program verification. Three papers treat miscellaneous topics: computer security within the CERN environment; numerical simulation in fluid mechanics; fractals. An introduction to transputers and Occam gives an account of the tutorial lectures given at the School. (orig.)

  2. ORGANIZATION OF FUTURE ENGINEERS' PROJECT-BASED LEARNING WHEN STUDYING THE PROJECT MANAGEMENT METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Halyna V. Lutsenko

    2015-02-01

    Full Text Available The peculiarities of worldwide experience in implementing project-based learning in engineering education have been considered. The potential role and place of projects in learning activity have been analyzed. A methodology for organizing the project-based activity of engineering students when studying the project management methodology and computer systems for project management has been proposed. The requirements for documentation and for the actual results of students' projects have been described in detail. The requirements for computer-aided project management systems developed using Microsoft Project, in the scope of calendar scheduling and resource planning, have been formulated.

  3. Inference of Tumor Evolution during Chemotherapy by Computational Modeling and In Situ Analysis of Genetic and Phenotypic Cellular Diversity

    Directory of Open Access Journals (Sweden)

    Vanessa Almendro

    2014-02-01

    Full Text Available Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and posttreatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

  4. Inference of tumor evolution during chemotherapy by computational modeling and in situ analysis of genetic and phenotypic cellular diversity

    International Nuclear Information System (INIS)

    Almendro, Vanessa; Cheng, Yu-Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege G.; Helland, Åslaug; Rye, Inga H.; Borresen-Dale, Anne-Lise; Maruyama, Reo; Van Oudenaarden, Alexander; Dowsett, Mitchell; Jones, Robin L.; Reis-Filho, Jorge; Gascon, Pere; Gönen, Mithat; Michor, Franziska; Polyak, Kornelia

    2014-01-01

    Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and post-treatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution

  5. Knowledge-based geographic information systems on the Macintosh computer: a component of the GypsES project

    Science.gov (United States)

    Gregory Elmes; Thomas Millette; Charles B. Yuill

    1991-01-01

    GypsES, a decision-support and expert system for the management of Gypsy Moth addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...

  6. IPAD project overview

    Science.gov (United States)

    Fulton, R. E.

    1980-01-01

    To respond to national needs for improved productivity in engineering design and manufacturing, a NASA supported joint industry/government project is underway denoted Integrated Programs for Aerospace-Vehicle Design (IPAD). The objective is to improve engineering productivity through better use of computer technology. It focuses on development of technology and associated software for integrated company-wide management of engineering information. The project has been underway since 1976 under the guidance of an Industry Technical Advisory Board (ITAB) composed of representatives of major engineering and computer companies and in close collaboration with the Air Force Integrated Computer-Aided Manufacturing (ICAM) program. Results to date on the IPAD project include an in-depth documentation of a representative design process for a large engineering project, the definition and design of computer-aided design software needed to support that process, and the release of prototype software to integrate selected design functions. Ongoing work concentrates on development of prototype software to manage engineering information, and initial software is nearing release.

  7. [Restoration filtering based on projection power spectrum for single-photon emission computed tomography].

    Science.gov (United States)

    Kubo, N

    1995-04-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical "least squares filter" theory. It is necessary to know the object power spectrum and the noise power spectrum. The power spectrum is estimated from the power spectrum of a projection, when the high-frequency power spectrum of a projection is adequately approximated as a polynomial exponential expression. A study of the restoration with the filter based on a projection power spectrum was conducted, and compared with that of the "Butterworth" filtering method (cut-off frequency of 0.15 cycles/pixel), and "Wiener" filtering (signal-to-noise power spectrum ratio was a constant). Normalized mean-squared errors (NMSE) of the phantom, two line sources located in a 99mTc filled cylinder, were used. NMSE of the "Butterworth" filter, "Wiener" filter, and filtering based on a power spectrum were 0.77, 0.83, and 0.76 respectively. Clinically, brain SPECT images utilizing this new restoration filter improved the contrast. Thus, this filter may be useful in diagnosis of SPECT images.
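
    The filter described above is a least-squares (Wiener-type) restoration that needs an estimate of the object power spectrum and of the noise power spectrum. The sketch below is a rough illustration only: it applies a Wiener-type gain to a noisy 1-D projection and, for comparison, a Butterworth low-pass at the 0.15 cycles/pixel cut-off quoted in the abstract. The flat noise floor and the two-line-source test signal are assumptions made for the example; the paper's polynomial-exponential fit to the high-frequency spectrum is not reproduced.

```python
import numpy as np

def butterworth_lowpass(freqs, cutoff=0.15, order=4):
    """Butterworth low-pass response; cutoff in cycles/pixel (as in the abstract)."""
    return 1.0 / np.sqrt(1.0 + (freqs / cutoff) ** (2 * order))

def wiener_like_filter(proj, noise_power):
    """Restore a 1-D projection with a simple Wiener-type (least-squares) filter.

    The object power spectrum is estimated as the projection's own power spectrum
    minus a flat noise floor -- a crude stand-in for the paper's fit.
    """
    F = np.fft.rfft(proj)
    p_proj = np.abs(F) ** 2
    p_obj = np.clip(p_proj - noise_power, 0.0, None)   # estimated object spectrum
    gain = p_obj / (p_obj + noise_power + 1e-12)       # least-squares (Wiener) gain
    return np.fft.irfft(gain * F, n=proj.size)

# toy projection: two "line sources" blurred and corrupted by noise
rng = np.random.default_rng(0)
x = np.zeros(128); x[40] = x[88] = 100.0
blur = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2); blur /= blur.sum()
proj = np.convolve(x, blur, mode="same") + rng.normal(0.0, 1.0, 128)

restored = wiener_like_filter(proj, noise_power=128 * 1.0 ** 2)
freqs = np.fft.rfftfreq(128)   # cycles/pixel
smoothed = np.fft.irfft(butterworth_lowpass(freqs) * np.fft.rfft(proj), n=128)
```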

  8. Restoration filtering based on projection power spectrum for single-photon emission computed tomography

    International Nuclear Information System (INIS)

    Kubo, Naoki

    1995-01-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical 'least squares filter' theory. It is necessary to know the object power spectrum and the noise power spectrum. The power spectrum is estimated from the power spectrum of a projection, when the high-frequency power spectrum of a projection is adequately approximated as a polynomial exponential expression. A study of the restoration with the filter based on a projection power spectrum was conducted, and compared with that of the 'Butterworth' filtering method (cut-off frequency of 0.15 cycles/pixel), and 'Wiener' filtering (signal-to-noise power spectrum ratio was a constant). Normalized mean-squared errors (NMSE) of the phantom, two line sources located in a 99mTc filled cylinder, were used. NMSE of the 'Butterworth' filter, 'Wiener' filter, and filtering based on a power spectrum were 0.77, 0.83, and 0.76 respectively. Clinically, brain SPECT images utilizing this new restoration filter improved the contrast. Thus, this filter may be useful in diagnosis of SPECT images. (author)

  9. The fast debris evolution model

    Science.gov (United States)

    Lewis, H. G.; Swinerd, G. G.; Newland, R. J.; Saunders, A.

    2009-09-01

    The 'particles-in-a-box' (PIB) model introduced by Talent [Talent, D.L. Analytic model for orbital debris environmental management. J. Spacecraft Rocket, 29 (4), 508-513, 1992.] removed the need for computer-intensive Monte Carlo simulation to predict the gross characteristics of an evolving debris environment. The PIB model was described using a differential equation that allows the stability of the low Earth orbit (LEO) environment to be tested by a straightforward analysis of the equation's coefficients. As part of an ongoing research effort to investigate more efficient approaches to evolutionary modelling and to develop a suite of educational tools, a new PIB model has been developed. The model, entitled Fast Debris Evolution (FADE), employs a first-order differential equation to describe the rate at which new objects ⩾10 cm are added and removed from the environment. Whilst Talent [Talent, D.L. Analytic model for orbital debris environmental management. J. Spacecraft Rocket, 29 (4), 508-513, 1992.] based the collision theory for the PIB approach on collisions between gas particles and adopted specific values for the parameters of the model from a number of references, the form and coefficients of the FADE model equations can be inferred from the outputs of future projections produced by high-fidelity models, such as the DAMAGE model. The FADE model has been implemented as a client-side, web-based service using JavaScript embedded within a HTML document. Due to the simple nature of the algorithm, FADE can deliver the results of future projections immediately in a graphical format, with complete user-control over key simulation parameters. Historical and future projections for the ⩾10 cm LEO debris environment under a variety of different scenarios are possible, including business as usual, no future launches, post-mission disposal and remediation. A selection of results is presented with comparisons with predictions made using the DAMAGE environment model
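
    The FADE abstract describes a first-order differential equation for the rate at which objects of 10 cm and larger are added to and removed from the environment. The sketch below integrates a generic particles-in-a-box style rate equation of that kind with a simple forward-Euler step; the coefficients and initial population are purely illustrative placeholders, not values inferred from DAMAGE or any published model.

```python
import numpy as np

# Illustrative "particles-in-a-box" style coefficients (NOT calibrated FADE values):
A = 300.0      # net deposition from launches and explosions [objects/year]
B = -2e-3      # per-object removal, e.g. atmospheric drag decay [1/year]
C = 1e-10      # pairwise collision term generating new fragments [1/(object*year)]

def dNdt(N, launches=True):
    """First-order PIB/FADE-style rate equation for the >=10 cm population."""
    return (A if launches else 0.0) + B * N + C * N ** 2

def project(N0=19000.0, years=200, dt=0.1, launches=True):
    """Simple forward-Euler projection of the LEO population N(t)."""
    N = np.empty(int(years / dt) + 1)
    N[0] = N0
    for i in range(1, N.size):
        N[i] = N[i - 1] + dt * dNdt(N[i - 1], launches)
    return N

business_as_usual = project(launches=True)    # one of the scenarios named above
no_future_launch  = project(launches=False)
```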

  10. Evolution of teaching and evaluation methodologies: The experience in the computer programming course at the Universidad Nacional de Colombia

    Directory of Open Access Journals (Sweden)

    Jonatan Gomez Perdomo

    2014-05-01

    Full Text Available In this paper, we present the evolution of a computer-programming course at the Universidad Nacional de Colombia (UNAL. The teaching methodology has evolved from a linear and non-standardized methodology to a flexible, non-linear and student-centered methodology. Our methodology uses an e-learning platform that supports the learning process by offering students and professors custom navigation between the content and material in an interactive way (book chapters, exercises, videos. Moreover, the platform is open access, and approximately 900 students from the university take this course each term. However, our evaluation methodology has evolved from static evaluations based on paper tests to an online process based on computer adaptive testing (CAT that chooses the questions to ask a student and assigns the student a grade according to the student’s ability.
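
    The evaluation side of the course relies on computer adaptive testing, which chooses the next question from the student's estimated ability and grades accordingly. The sketch below shows one common way such a loop can work, assuming a 1-parameter (Rasch) item response model, maximum-information item selection, and a grid-based ability estimate; the course's actual platform and item model are not described in the record, so every detail here is an assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

def rasch_p(theta, b):
    """Probability of a correct answer under the 1-parameter (Rasch) IRT model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def adaptive_test(true_theta, item_bank, n_items=15):
    """Pick the most informative remaining item (difficulty closest to the current
    ability estimate), simulate the answer, and re-estimate ability on a grid."""
    grid = np.linspace(-4, 4, 161)
    log_post = np.zeros_like(grid)            # flat prior over ability
    available = list(range(len(item_bank)))
    theta_hat = 0.0
    for _ in range(n_items):
        i = min(available, key=lambda j: abs(item_bank[j] - theta_hat))
        available.remove(i)
        correct = rng.random() < rasch_p(true_theta, item_bank[i])
        p = rasch_p(grid, item_bank[i])
        log_post += np.log(p if correct else 1.0 - p)
        theta_hat = grid[np.argmax(log_post)]
    return theta_hat

bank = np.linspace(-3, 3, 100)                # illustrative item difficulties
estimate = adaptive_test(true_theta=1.2, item_bank=bank)
```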

  11. AGIS: Evolution of Distributed Computing Information system for ATLAS

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria; Karavakis, Edward

    2015-01-01

    The variety of the ATLAS Computing Infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data which are needed by the various ATLAS software components. The ATLAS Grid Information System is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services.

  12. Evolution of spatio-temporal drought characteristics: validation, projections and effect of adaptation scenarios

    Science.gov (United States)

    Vidal, J.-P.; Martin, E.; Kitova, N.; Najac, J.; Soubeyroux, J.-M.

    2012-08-01

    Drought events develop in both space and time and they are therefore best described through summary joint spatio-temporal characteristics, such as mean duration, mean affected area and total magnitude. This paper addresses the issue of future projections of such characteristics of drought events over France through three main research questions: (1) Are downscaled climate projections able to simulate spatio-temporal characteristics of meteorological and agricultural droughts in France over a present-day period? (2) How such characteristics will evolve over the 21st century? (3) How to use standardized drought indices to represent theoretical adaptation scenarios? These questions are addressed using the Isba land surface model, downscaled climate projections from the ARPEGE General Circulation Model under three emissions scenarios, as well as results from a previously performed 50-yr multilevel and multiscale drought reanalysis over France. Spatio-temporal characteristics of meteorological and agricultural drought events are computed using the Standardized Precipitation Index and the Standardized Soil Wetness Index, respectively, and for time scales of 3 and 12 months. Results first show that the distributions of joint spatio-temporal characteristics of observed events are well simulated by the downscaled hydroclimate projections over a present-day period. All spatio-temporal characteristics of drought events are then found to dramatically increase over the 21st century, with stronger changes for agricultural droughts. Two theoretical adaptation scenarios are eventually built based on hypotheses of adaptation to evolving climate and hydrological normals, either retrospective or prospective. The perceived spatio-temporal characteristics of drought events derived from these theoretical adaptation scenarios show much reduced changes, but they call for more realistic scenarios at both the catchment and national scale in order to accurately assess the combined effect of
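
    The drought characteristics in this study are computed from the Standardized Precipitation Index (SPI) and the Standardized Soil Wetness Index. As a simplified illustration of the SPI part only, the sketch below accumulates monthly precipitation over a chosen time scale, fits a gamma distribution, and maps the cumulative probabilities to standard-normal deviates. Per-calendar-month fitting, zero-precipitation handling, and the Isba-based soil wetness index are omitted, and the synthetic data are an assumption for the example.

```python
import numpy as np
from scipy import stats

def spi(precip_monthly, scale=3):
    """Standardized Precipitation Index at a given time scale (in months)."""
    p = np.asarray(precip_monthly, dtype=float)
    acc = np.convolve(p, np.ones(scale), mode="valid")       # k-month totals
    a, loc, b = stats.gamma.fit(acc, floc=0)                  # fit gamma, location fixed at 0
    cdf = stats.gamma.cdf(acc, a, loc=loc, scale=b)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))      # standardized index

# toy use: 50 years of synthetic monthly precipitation
rng = np.random.default_rng(1)
precip = rng.gamma(shape=2.0, scale=30.0, size=600)
spi3 = spi(precip, scale=3)       # values below about -1 flag drought months
```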

  13. Evolution of the U.S. Energy Service Company Industry: Market Size and Project Performance from 1990-2008

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Goldman, Charles A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-05-08

    investment. There is empirical evidence confirming that the industry is responding to customer demand by installing more comprehensive and complex measures—including onsite generation and measures to address deferred maintenance—but this evolution has significant implications for customer project economics, especially at K-12 schools. We found that the median simple payback time has increased from 1.9 to 3.2 years in private sector projects since the early-to-mid 1990s and from 5.2 to 10.5 years in public sector projects for the same time period.

  14. Chirp subbottom profiler data collected in Pamlico Sound on cruise RVRiggs_07_31_2013 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  15. Chirp subbottom profiler data collected in Pamlico Sound on cruise RVRiggs_07_30_2013 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  16. A Comparison of the CHILD and Landlab Computational Landscape Evolution Models and Examples of Best Practices in Numerical Modeling of Surface Processes

    Science.gov (United States)

    Gasparini, N. M.; Hobley, D. E. J.; Tucker, G. E.; Istanbulluoglu, E.; Adams, J. M.; Nudurupati, S. S.; Hutton, E. W. H.

    2014-12-01

    Computational models are important tools that can be used to quantitatively understand the evolution of real landscapes. Commonalities exist among most landscape evolution models, although they are also idiosyncratic, in that they are coded in different languages, require different input values, and are designed to tackle a unique set of questions. These differences can make applying a landscape evolution model challenging, especially for novice programmers. In this study, we compare and contrast two landscape evolution models that are designed to tackle similar questions, but the actual model designs are quite different. The first model, CHILD, is over a decade-old and is relatively well-tested, well-developed and well-used. It is coded in C++, operates on an irregular grid and was designed more with function rather than user-experience in mind. In contrast, the second model, Landlab, is relatively new and was designed to be accessible to a wide range of scientists, including those who have not previously used or developed a numerical model. Landlab is coded in Python, a relatively easy language for the non-proficient programmer, and has the ability to model landscapes described on both regular and irregular grids. We present landscape simulations from both modeling platforms. Our goal is to illustrate best practices for implementing a new process module in a landscape evolution model, and therefore the simulations are applicable regardless of the modeling platform. We contrast differences and highlight similarities between the use of the two models, including setting-up the model and input file for different evolutionary scenarios, computational time, and model output. Whenever possible, we compare model output with analytical solutions and illustrate the effects, or lack thereof, of a uniform vs. non-uniform grid. Our simulations focus on implementing a single process, including detachment-limited or transport-limited fluvial bedrock incision and linear or non
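
    Both CHILD and Landlab implement, among other processes, detachment-limited fluvial incision. The sketch below is a generic, model-agnostic illustration of that single process on a 1-D river profile using the stream-power law dz/dt = U - K A^m S^n; it does not use the CHILD or Landlab APIs, and the grid, drainage-area proxy, and parameter values are assumptions chosen only to give a stable toy run.

```python
import numpy as np

def evolve_profile(n_nodes=200, dx=100.0, K=1e-5, m=0.5, n=1.0,
                   uplift=1e-3, dt=100.0, steps=5000):
    """Detachment-limited stream-power incision on a 1-D river profile:
        dz/dt = U - K * A**m * S**n
    Drainage area A is taken to grow linearly toward the outlet (a Hack-like
    shortcut); the outlet (node 0) is held at base level."""
    z = np.linspace(0.0, 200.0, n_nodes)           # initial ramp, outlet at x = 0
    area = np.arange(n_nodes, 0, -1) * dx * 1e3    # crude drainage-area proxy [m^2]
    for _ in range(steps):
        slope = np.zeros(n_nodes)
        slope[1:] = np.maximum((z[1:] - z[:-1]) / dx, 0.0)   # downstream slope
        erosion = K * area ** m * slope ** n
        z[1:] += dt * (uplift - erosion[1:])                  # interior nodes
        z[0] = 0.0                                            # fixed base level
    return z

steady_profile = evolve_profile()
```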

  17. Evolution of and projections for automated composite material placement equipment in the aerospace industry

    Science.gov (United States)

    McCarville, Douglas A.

    2009-12-01

    As the commercial aircraft industry attempts to improve airplane fuel efficiency by shifting from aluminum to composites (reinforced plastics), there is a concern that composite processing equipment is not mature enough to meet increasing demand and that delivery delays and loss of high tech jobs could result. The research questions focused on the evolution of composite placement machines, improvement of machine functionality by equipment vendors, and the probability of new inventions helping to avoid production shortfalls. An extensive review of the literature found no studies that addressed these issues. Since the early twentieth century, exploratory case study of pivotal technological advances has been an accepted means of performing historic analysis and furthering understanding of rapidly changing marketplaces and industries. This qualitative case study investigated evolution of automated placement equipment by (a) codifying and mapping patent data (e.g., claims and functionality descriptions), (b) triangulating archival data (i.e., trade literature, vender Web sites, and scholarly texts), and (c) interviewing expert witnesses. An industry-level sensitivity model developed by the author showed that expanding the vendor base and increasing the number of performance enhancing inventions will most likely allow the industry to make the transition from aluminum to composites without schedule delays. This study will promote social change by (a) advancing individual and community knowledge (e.g., teaching modules for students, practitioners, and professional society members) and (b) providing an empirical model that will help in the understanding and projection of next generation composite processing equipment demand and productivity output.

  18. Project Chrysalis: The Evolution of a Community School.

    Science.gov (United States)

    Garrett, K.

    1996-01-01

    Describes the creation and operation of Project Chrysalis, a community, service-learning school transformed from row houses, where children can learn, work, and gain inspiration from artists and social entrepreneurs involved with Houston's Project Row Houses. Personal narratives of two teachers highlight the school's and students' accomplishments…

  19. What Is Our Current Understanding of One-to-One Computer Projects: A Systematic Narrative Research Review

    Science.gov (United States)

    Fleischer, Hakan

    2012-01-01

    The aim of this article is to review cross-disciplinary accumulated empirical research on one-to-one computer projects in school settings as published in peer-reviewed journals between 2005 and 2010, particularly the results of teacher- and pupil-oriented studies. Six hundred and five research articles were screened at the abstract and title…

  20. Rana computatrix to human language: towards a computational neuroethology of language evolution.

    Science.gov (United States)

    Arbib, Michael A

    2003-10-15

    Walter's Machina speculatrix inspired the name Rana computatrix for a family of models of visuomotor coordination in the frog, which contributed to the development of computational neuroethology. We offer here an 'evolutionary' perspective on models in the same tradition for rat, monkey and human. For rat, we show how the frog-like taxon affordance model provides a basis for the spatial navigation mechanisms that involve the hippocampus and other brain regions. For monkey, we recall two models of neural mechanisms for visuomotor coordination. The first, for saccades, shows how interactions between the parietal and frontal cortex augment superior colliculus seen as the homologue of frog tectum. The second, for grasping, continues the theme of parieto-frontal interactions, linking parietal affordances to motor schemas in premotor cortex. It further emphasizes the mirror system for grasping, in which neurons are active both when the monkey executes a specific grasp and when it observes a similar grasp executed by others. The model of human-brain mechanisms is based on the mirror-system hypothesis of the evolution of the language-ready brain, which sees the human Broca's area as an evolved extension of the mirror system for grasping.

  1. Evolution of supernova remnants. III. Thermal waves

    International Nuclear Information System (INIS)

    Chevalier, R.A.

    1975-01-01

    The effect of heat conduction on the evolution of supernova remnants is investigated. A thermal wave, or electron conduction front, can travel more rapidly than a shock wave during the first thousand years of the remnant's evolution. A self-similar solution describing this phase has been found by Barenblatt. Numerical computations verify the solution and give the evolution past the thermal wave phase. While shell formation is not impeded, the interior density and temperature profiles are smoothed by the action of conduction

  2. Evolution of the ATLAS data and computing model for a Tier2 in the EGI infrastructure

    CERN Document Server

    Fernández Casaní, A; The ATLAS collaboration; González de la Hoz, S; Salt Cairols, J; Fassi, F; Kaci, M; Lamas, A; Oliver, E; Sánchez, J; Sánchez, V

    2012-01-01

    Since the start of the LHC pp collisions in 2010, the ATLAS computing model has moved from a more strict design, where every Tier2 had a liaison and a network dependence from a Tier1, to a more meshed approach where every cloud could be connected. Evolution of ATLAS data models requires changes in ATLAS Tier2s policy for the data replication, dynamic data caching and remote data access. It also requires rethinking the network infrastructure to enable any Tier2 and associated Tier3 to easily connect to any Tier1 or Tier2. Tier2s are becoming more and more important in the ATLAS computing model as it allows more data to be readily accessible for analysis jobs to all users, independently of their geographical location. The Tier2s disk space has been reserved for real, simulated, calibration and alignment, group, and user data. A buffer disk space is needed for input and output data for simulations jobs. Tier2s are going to be used more efficiently. In this way Tier1s and Tier2s are becoming more equivalent for t...

  3. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has come up as a new form of distributed computing. Unlike other computing paradigms like Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework as well as some novel techniques, based on standard Grid protocols, we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  4. Computational intelligence and neuromorphic computing potential for cybersecurity applications

    Science.gov (United States)

    Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.

    2013-05-01

    In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. Therefore, it is believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. In computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes and form the foundation for intelligent system architectures.

  5. Incomplete projection reconstruction of computed tomography based on the modified discrete algebraic reconstruction technique

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Gao, Zongzhao; Yang, YaFei

    2018-02-01

    Based on the discrete algebraic reconstruction technique (DART), this study aims to develop and test an improved algorithm for incomplete projection data that generates high-quality reconstructions by reducing artifacts and noise in computed tomography. For the incomplete projections, an augmented Lagrangian method based on compressed sensing is first used in the initial reconstruction for the DART segmentation, to obtain higher contrast between boundary and non-boundary pixels. Then, the block-matching 3D filtering operator is used to suppress noise and improve the gray-level distribution of the reconstructed image. Finally, simulation studies on a polychromatic spectrum were performed to test the performance of the new algorithm. Study results show a significant improvement in the signal-to-noise ratios (SNRs) and average gradients (AGs) of the images reconstructed from incomplete data. The SNRs and AGs of the new images reconstructed by DART-ALBM were on average 30%-40% and 10% higher than those of images reconstructed by DART algorithms. Since the improved DART-ALBM algorithm is more robust to limited-view reconstruction, sharpening image edges and improving the gray-level distribution of non-boundary pixels, it has the potential to improve image quality from incomplete or sparse projections.
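
    DART alternates a continuous algebraic reconstruction with segmentation to a small set of known grey levels, then re-solves only for pixels near segment boundaries. The sketch below is a heavily reduced skeleton of that idea built around a plain SIRT-style solver and a dense system matrix; the paper's specific additions (the augmented-Lagrangian initialization based on compressed sensing and the block-matching 3D filter) are not reproduced, the boundary detection is one-sided for brevity, and the random projector in the toy example is only a stand-in for a real CT system matrix.

```python
import numpy as np

def sirt(A, b, x0, n_iter=100, relax=0.1):
    """Plain SIRT-style iteration for A x = b (the algebraic step inside DART)."""
    x = x0.copy()
    col_norm = (A ** 2).sum(axis=0) + 1e-12
    for _ in range(n_iter):
        x += relax * (A.T @ (b - A @ x)) / col_norm
    return x

def dart_like(A, b, shape, grey_levels=(0.0, 1.0), n_outer=10):
    """Reduced DART-style loop: reconstruct, segment to known grey levels,
    keep non-boundary pixels fixed, and re-solve only for pixels near an edge."""
    x = sirt(A, b, np.zeros(A.shape[1]), n_iter=300)               # continuous start
    levels = np.asarray(grey_levels, dtype=float)
    for _ in range(n_outer):
        seg = levels[np.abs(x[:, None] - levels).argmin(axis=1)]   # nearest grey level
        img = seg.reshape(shape)
        grad = np.abs(np.diff(img, axis=0, prepend=img[:1, :])) + \
               np.abs(np.diff(img, axis=1, prepend=img[:, :1]))
        boundary = (grad > 0).ravel()      # one-sided edge mask (DART marks both sides)
        x = seg.copy()
        if boundary.any():
            residual = b - A @ x           # data not explained by the fixed pixels
            x[boundary] += sirt(A[:, boundary], residual,
                                np.zeros(int(boundary.sum())), n_iter=100)
    return x.reshape(shape)

# toy use: a 16x16 binary phantom seen through an under-determined random projector
rng = np.random.default_rng(4)
shape = (16, 16)
phantom = np.zeros(shape); phantom[4:12, 5:11] = 1.0
A = rng.random((120, phantom.size))        # stand-in for a real CT system matrix
b = A @ phantom.ravel()
recon = dart_like(A, b, shape)
```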

  6. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Brian James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Yin, Lin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stark, David James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-06

    This proposal sought of order 1M core-hours of Institutional Computing time intended to enable computing by a new LANL Postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was “off-cycle,” initiating in June of 2016 with a postdoc hire.

  7. Evolution and Reactivity in the Semantic Web

    Science.gov (United States)

    Alferes, José Júlio; Eckert, Michael; May, Wolfgang

    Evolution and reactivity in the Semantic Web address the vision and concrete need for an active Web, where data sources evolve autonomously and perceive and react to events. In 2004, when the Rewerse project started, regarding work on Evolution and Reactivity in the Semantic Web there wasn’t much more than a vision of such an active Web.

  8. QCDNUM: Fast QCD evolution and convolution

    Science.gov (United States)

    Botje, M.

    2011-02-01

    The QCDNUM program numerically solves the evolution equations for parton densities and fragmentation functions in perturbative QCD. Un-polarised parton densities can be evolved up to next-to-next-to-leading order in powers of the strong coupling constant, while polarised densities or fragmentation functions can be evolved up to next-to-leading order. Other types of evolution can be accessed by feeding alternative sets of evolution kernels into the program. A versatile convolution engine provides tools to compute parton luminosities, cross-sections in hadron-hadron scattering, and deep inelastic structure functions in the zero-mass scheme or in generalised mass schemes. Input to these calculations are either the QCDNUM evolved densities, or those read in from an external parton density repository. Included in the software distribution are packages to calculate zero-mass structure functions in un-polarised deep inelastic scattering, and heavy flavour contributions to these structure functions in the fixed flavour number scheme. Program summary: Program title: QCDNUM, version 17.00. Catalogue identifier: AEHV_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHV_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU Public Licence. No. of lines in distributed program, including test data, etc.: 45 736. No. of bytes in distributed program, including test data, etc.: 911 569. Distribution format: tar.gz. Programming language: Fortran-77. Computer: All. Operating system: All. RAM: typically 3 Mbytes. Classification: 11.5. Nature of problem: evolution of the strong coupling constant and parton densities, up to next-to-next-to-leading order in perturbative QCD; computation of observable quantities by Mellin convolution of the evolved densities with partonic cross-sections. Solution method: parametrisation of the parton densities as linear or quadratic splines on a discrete grid, and evolution of the spline

  9. Computational Experiment Approach to Controlled Evolution of Procurement Pattern in Cluster Supply Chain

    Directory of Open Access Journals (Sweden)

    Xiao Xue

    2015-01-01

    Full Text Available Companies have become aware of the benefits of developing Cluster Supply Chains (CSCs), and they are spending a great deal of time and money attempting to develop this new business pattern. Yet the traditional techniques for identifying CSCs have strong theoretical antecedents but seem to have little traction in the field. We believe this is because the standard techniques fail to capture evolution over time and do not provide useful intervention measures for reaching goals. To address these problems, we introduce an agent-based modeling approach to evaluate CSCs. Taking collaborative procurement as the research object, our approach is composed of three parts: model construction, model instantiation, and computational experiment. We use the approach to explore the service-charging policy problem in collaborative procurement. Three kinds of service-charging policies are compared in the same experimental environment. Finally, “Fixed Cost” is identified as the optimal policy under a stable market environment. The case study helps to illustrate the workflow of applying the approach and provides valuable decision support to industry.

  10. Coordinated research projects (CRP). Coordinated research project (CRP)

    International Nuclear Information System (INIS)

    Takagi, Hidekazu; Koike, Fumihiro; Nakamura, Nobuyuki

    2013-01-01

    In the present paper, the contributions of Japanese scientists to coordinated research projects on thermonuclear fusion are summarized. Representative subjects addressed in the seven projects are the precise computation of theoretical data on electron-molecule collisions in the peripheral plasma, the computation of spectroscopic data for multi-charged tungsten ions, the spectroscopic measurement of multi-charged tungsten ions using an ion trap device, the development of a collisional-radiative model for plasmas including hydrogen and helium, computational and theoretical studies on the behavior of tungsten and beryllium in the plasma-wall interaction, and the study of the properties of dust generated in fusion devices. These subjects are among the most important issues for ITER. (author)

  11. Lessons learned from the K computer project—from the K computer to Exascale

    International Nuclear Information System (INIS)

    Oyanagi, Yoshio

    2014-01-01

    The history of the supercomputers in Japan and the U.S. is briefly summarized and the difference between the two is discussed. The development of the K Computer project in Japan is described as compared to other PetaFlops projects. The difficulties to be solved in the Exascale computer project now being developed are discussed

  12. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  13. The DataGrid Project

    CERN Document Server

    Ruggieri, F

    2001-01-01

    An overview of the objectives and status of the DataGrid Project is presented, together with a brief introduction to the Grid metaphor and some references to the Grid activities and initiatives related to DataGrid. High energy physics experiments have always requested state of the art computing facilities to efficiently perform several computing activities related with the handling of large amounts of data and fairly large computing resources. Some of the ideas born inside the community to enhance the user friendliness of all the steps in the computing chain have been, sometimes, successfully applied also in other contexts: one bright example is the World Wide Web. The LHC computing challenge has triggered inside the high energy physics community, the start of the DataGrid Project. The objective of the project is to enable next generation scientific exploration requiring intensive computation and analysis of shared large-scale databases. (12 refs).

  14. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    Science.gov (United States)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open source solution for grid computing over the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with, at least, one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour and to compare it with the electroencephalographic signals measured in real brains.
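
    The Neurona@Home model is a cellular-automaton version of the integrate-and-fire neuron on a random network, with excitatory and inhibitory links and probabilistic activation. A minimal synchronous-update sketch of such a model is given below; the network size, wiring, threshold, and probabilities are illustrative assumptions, not the project's actual parameters (which target at least one million neurons distributed over BOINC clients).

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 10_000, 20                 # neurons and random inputs per neuron (toy scale)
THRESHOLD, P_FIRE = 3, 0.9        # integrate-and-fire threshold, firing probability

inputs  = rng.integers(0, N, size=(N, K))                  # random network wiring
weights = rng.choice([+1, -1], size=(N, K), p=[0.8, 0.2])  # excitatory / inhibitory

state = (rng.random(N) < 0.05).astype(np.int8)             # initial active nodes

def step(state):
    """One synchronous cellular-automaton update of the integrate-and-fire rule."""
    drive = (weights * state[inputs]).sum(axis=1)           # summed input per node
    fire = (drive >= THRESHOLD) & (rng.random(N) < P_FIRE)  # probabilistic firing
    return fire.astype(np.int8)

activity = []
for _ in range(200):
    state = step(state)
    activity.append(state.mean())    # fraction of active neurons over time
```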

  15. New Frontiers in Language Evolution and Development.

    Science.gov (United States)

    Oller, D Kimbrough; Dale, Rick; Griebel, Ulrike

    2016-04-01

    This article introduces the Special Issue and its focus on research in language evolution with emphasis on theory as well as computational and robotic modeling. A key theme is based on the growth of evolutionary developmental biology or evo-devo. The Special Issue consists of 13 articles organized in two sections: A) Theoretical foundations and B) Modeling and simulation studies. All the papers are interdisciplinary in nature, encompassing work in biological and linguistic foundations for the study of language evolution as well as a variety of computational and robotic modeling efforts shedding light on how language may be developed and may have evolved. Copyright © 2016 Cognitive Science Society, Inc.

  16. Early-state damage detection, characterization, and evolution using high-resolution computed tomography

    Science.gov (United States)

    Grandin, Robert John

    Safely using materials in high performance applications requires adequately understanding the mechanisms which control the nucleation and evolution of damage. Most of a material's operational life is spent in a state of noncritical damage; in metals, for example, only a small portion of the life falls within the classical Paris Law regime of crack growth. Developing proper structural health and prognosis models requires understanding the behavior of damage in these early stages of the material's life, and this early-stage damage occurs on length scales at which the material may be considered "granular" in the sense that the discrete regions which comprise the whole are large enough to require special consideration. Material performance depends upon the characteristics of the granules themselves as well as the interfaces between granules. As a result, properly studying early-stage damage in complex, granular materials requires a means to characterize changes in the granules and interfaces. The granular scale can range from tenths of microns in ceramics, to single microns in fiber-reinforced composites, to tens of millimeters in concrete. The difficulty of direct study is often overcome by exhaustive testing of macro-scale damage caused by gross material loads and abuse. Such testing, for example optical or electron microscopy, is destructive and, furthermore, costly when used to study the evolution of damage within a material, and often limits the study to a few snapshots. New developments in high-resolution computed tomography (HRCT) provide the necessary spatial resolution to directly image the granule length-scale of many materials. Successful application of HRCT to fiber-reinforced composites, however, requires extending HRCT performance beyond current limits. This dissertation will discuss improvements made in the field of CT reconstruction which enable resolutions to be pushed to the point of being able to image the fiber-scale damage structures and

  17. Programmed evolution for optimization of orthogonal metabolic output in bacteria.

    Directory of Open Access Journals (Sweden)

    Todd T Eckdahl

    Full Text Available Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields - evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in

  18. Programmed Evolution for Optimization of Orthogonal Metabolic Output in Bacteria

    Science.gov (United States)

    Eckdahl, Todd T.; Campbell, A. Malcolm; Heyer, Laurie J.; Poet, Jeffrey L.; Blauch, David N.; Snyder, Nicole L.; Atchley, Dustin T.; Baker, Erich J.; Brown, Micah; Brunner, Elizabeth C.; Callen, Sean A.; Campbell, Jesse S.; Carr, Caleb J.; Carr, David R.; Chadinha, Spencer A.; Chester, Grace I.; Chester, Josh; Clarkson, Ben R.; Cochran, Kelly E.; Doherty, Shannon E.; Doyle, Catherine; Dwyer, Sarah; Edlin, Linnea M.; Evans, Rebecca A.; Fluharty, Taylor; Frederick, Janna; Galeota-Sprung, Jonah; Gammon, Betsy L.; Grieshaber, Brandon; Gronniger, Jessica; Gutteridge, Katelyn; Henningsen, Joel; Isom, Bradley; Itell, Hannah L.; Keffeler, Erica C.; Lantz, Andrew J.; Lim, Jonathan N.; McGuire, Erin P.; Moore, Alexander K.; Morton, Jerrad; Nakano, Meredith; Pearson, Sara A.; Perkins, Virginia; Parrish, Phoebe; Pierson, Claire E.; Polpityaarachchige, Sachith; Quaney, Michael J.; Slattery, Abagael; Smith, Kathryn E.; Spell, Jackson; Spencer, Morgan; Taye, Telavive; Trueblood, Kamay; Vrana, Caroline J.; Whitesides, E. Tucker

    2015-01-01

    Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields – evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in energy

  19. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by the Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…

  20. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  1. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
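
    As a minimal illustration of the idea of mapping a quantum program onto the time evolution of spin-1/2 objects (a toy sketch, not the authors' simulation software), a two-qubit register can be represented as a state vector with gates applied as unitary steps; the gates and qubit ordering below are standard textbook choices.

```python
import numpy as np

# Minimal sketch (not the authors' software): a 2-qubit register as a state
# vector, with gates applied as unitary "time evolution" steps.
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)         # control = first qubit

state = np.zeros(4); state[0] = 1.0                  # |00>
state = np.kron(H, I) @ state                        # Hadamard on the first qubit
state = CNOT @ state                                 # entangling step
print(np.round(state, 3))                            # Bell state (|00> + |11>)/sqrt(2)
```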

  2. Chirp subbottom profiler data collected in Pamlico Sound on cruise RVRiggs_05_23_24_2012 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  3. Relevance of East African Drill Cores to Human Evolution: the Case of the Olorgesailie Drilling Project

    Science.gov (United States)

    Potts, R.

    2016-12-01

    Drill cores reaching the local basement of the East African Rift were obtained in 2012 south of the Olorgesailie Basin, Kenya, 20 km from excavations that document key benchmarks in the origin of Homo sapiens. Sediments totaling 216 m were obtained from two drilling locations representing the past 1 million years. The cores were acquired to build a detailed environmental record spatially associated with the transition from Acheulean to Middle Stone Age technology and extensive turnover in mammalian species. The project seeks precise tests of how climate dynamics and tectonic events were linked with these transitions. Core lithology (A.K. Behrensmeyer), geochronology (A. Deino), diatoms (R.B. Owen), phytoliths (R. Kinyanjui), geochemistry (N. Rabideaux, D. Deocampo), among other indicators, show evidence of strong environmental variability in agreement with predicted high-eccentricity modulation of climate during the evolutionary transitions. Increased hominin mobility, elaboration of symbolic behavior, and concurrent turnover in mammalian species indicating heightened adaptability to unpredictable ecosystems point to a direct link between the evolutionary transitions and the landscape dynamics reflected in the Olorgesailie drill cores. For paleoanthropologists and Earth scientists, any link between evolutionary transitions and environmental dynamics requires robust evolutionary datasets pertinent to how selection, extinction, population divergence, and other evolutionary processes were impacted by the dynamics uncovered in drill core studies. Fossil and archeological data offer a rich source of robust environment-evolution explanations that must be integrated into efforts by Earth scientists who seek to examine high-resolution climate records of human evolution. Paleoanthropological examples will illustrate the opportunities that exist for connecting evolutionary benchmarks to the data obtained from drilled African muds. Project members: R. Potts, A

  4. Chirp subbottom profiler data collected in Pamlico Sound on cruise RVRiggs_05_20_22_2014 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers the...

  5. Computer-aided detection of breast carcinoma in standard mammographic projections with digital mammography

    International Nuclear Information System (INIS)

    Destounis, S.; Hanson, S.

    2007-01-01

    This study was conducted to retrospectively evaluate a computer-aided detection system's ability to detect breast carcinoma in multiple standard mammographic projections. Forty-five lesions in 44 patients who were imaged with digital mammography (Selenia®, Hologic, Bedford, MA; Senographe®, GE, Milwaukee, WI) and had computer-aided detection (CAD, Image-checker® V 8.3.15, Hologic/R2, Santa Clara, CA) applied at the time of examination were identified for review; all were subsequently recommended for biopsy, which revealed cancer. These lesions were determined by the study Radiologist to be visible in both standard mammographic images (mediolateral oblique, MLO; craniocaudal, CC). For each patient, case data included patient age, tissue density, lesion type, BIRADS® assessment, lesion size, lesion visibility (visible on MLO and/or CC view), ability of CAD to correctly mark the cancerous lesion, number of CAD marks per image, needle core biopsy results and surgical pathologic correlation. For this study cohort, CAD lesion/case sensitivity was 87% (n = 39), and image sensitivity was 69% (n = 31) for the MLO view and 78% (n = 35) for the CC view. For the study cohort, cases presented with a median of four marks per case (range 0-13). Eighty-four percent (n = 38) of lesions proceeded to excision; initial needle biopsy pathology was upgraded at surgical excision from in situ disease to invasive for 24% (n = 9) of lesions. CAD has demonstrated the potential to detect mammographically visible cancers in multiple standard mammographic projections in all categories of lesions in this study cohort. (orig.)

  6. Climate change-driven cliff and beach evolution at decadal to centennial time scales

    Science.gov (United States)

    Erikson, Li; O'Neill, Andrea; Barnard, Patrick; Vitousek, Sean; Limber, Patrick

    2017-01-01

    Here we develop a computationally efficient method that evolves cross-shore profiles of sand beaches with or without cliffs along natural and urban coastal environments and across expansive geographic areas at decadal to centennial time-scales driven by 21st century climate change projections. The model requires projected sea level rise rates, extrema of nearshore wave conditions, bluff recession and shoreline change rates, and cross-shore profiles representing present-day conditions. The model is applied to the ~470-km long coast of the Southern California Bight, USA, using recently available projected nearshore waves and bluff recession and shoreline change rates. The results indicate that eroded cliff material, from unarmored cliffs, contributes 11% to 26% to the total sediment budget. Historical beach nourishment rates will need to increase by more than 30% for a 0.25 m sea level rise (~2044) and by at least 75% by the year 2100 for a 1 m sea level rise, if evolution of the shoreline is to keep pace with rising sea levels.

  7. The FIFE Project at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Box, D. [Fermilab; Boyd, J. [Fermilab; Di Benedetto, V. [Fermilab; Ding, P. [Fermilab; Dykstra, D. [Fermilab; Fattoruso, M. [Fermilab; Garzoglio, G. [Fermilab; Herner, K. [Fermilab; Levshina, T. [Fermilab; Kirby, M. [Fermilab; Kreymer, A. [Fermilab; Mazzacane, A. [Fermilab; Mengel, M. [Fermilab; Mhashilkar, P. [Fermilab; Podstavkov, V. [Fermilab; Retzke, K. [Fermilab; Sharma, N. [Fermilab

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  8. Evolution of product lifespan and implications for environmental assessment and management: a case study of personal computers in higher education.

    Science.gov (United States)

    Babbitt, Callie W; Kahhat, Ramzy; Williams, Eric; Babbitt, Gregory A

    2009-07-01

    Product lifespan is a fundamental variable in understanding the environmental impacts associated with the life cycle of products. Existing life cycle and materials flow studies of products, almost without exception, consider lifespan to be constant over time. To determine the validity of this assumption, this study provides an empirical documentation of the long-term evolution of personal computer lifespan, using a major U.S. university as a case study. Results indicate that over the period 1985-2000, computer lifespan (purchase to "disposal") decreased steadily from a mean of 10.7 years in 1985 to 5.5 years in 2000. The distribution of lifespan also evolved, becoming narrower over time. Overall, however, lifespan distribution was broader than normally considered in life cycle assessments or materials flow forecasts of electronic waste management for policy. We argue that these results suggest that at least for computers, the assumption of constant lifespan is problematic and that it is important to work toward understanding the dynamics of use patterns. We modify an age-structured model of population dynamics from biology as a modeling approach to describe product life cycles. Lastly, the purchase share and generation of obsolete computers from the higher education sector is estimated using different scenarios for the dynamics of product lifespan.
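
    A generic sketch of the age-structured modeling approach the authors adapt from population biology is given below; the purchase numbers and age-specific retirement probabilities are invented for illustration and are not the paper's calibrated values.

```python
import numpy as np

# Generic cohort-survival sketch of an age-structured product stock
# (illustrative only: purchase numbers and retirement probabilities are made up,
# not taken from the paper's data).
years, max_age = 10, 8
purchases = np.full(years, 1000.0)             # new computers bought each year
retire_prob = np.linspace(0.05, 0.9, max_age)  # chance a unit of age a is retired this year

stock = np.zeros(max_age)                      # units currently in use, by age
for t in range(years):
    retired = stock * retire_prob              # age-specific retirements
    surviving = stock - retired
    obsolete = retired.sum() + surviving[-1]   # the oldest cohort also exits
    stock = np.concatenate(([purchases[t]], surviving[:-1]))  # everything ages one year
    print(f"year {t}: in use {stock.sum():7.1f}, newly obsolete {obsolete:7.1f}")
```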

  9. Analysis and modeling of social influence in high performance computing workloads

    KAUST Repository

    Zheng, Shuai

    2011-01-01

    Social influence among users (e.g., collaboration on a project) creates bursty behavior in the underlying high performance computing (HPC) workloads. Using representative HPC and cluster workload logs, this paper identifies, analyzes, and quantifies the level of social influence across HPC users. We show the existence of a social graph that is characterized by a pattern of dominant users and followers. This pattern also follows a power-law distribution, which is consistent with those observed in mainstream social networks. Given its potential impact on HPC workload prediction and scheduling, we propose a fast-converging, computationally efficient online learning algorithm for identifying social groups. Extensive evaluation shows that our online algorithm can (1) quickly identify the social relationships by using a small portion of incoming jobs and (2) efficiently track group evolution over time. © 2011 Springer-Verlag.

  10. A Web-Based Monitoring System for Multidisciplinary Design Projects

    Science.gov (United States)

    Rogers, James L.; Salas, Andrea O.; Weston, Robert P.

    1998-01-01

    In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary computational environments, is defined as a hardware and software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, integrated with an existing framework, can improve these areas of weakness. This paper describes a Web-based system that optimizes and controls the execution sequence of design processes and monitors project status and results. The three-stage evolution of the system with increasingly complex problems demonstrates the feasibility of this approach.

  11. Pengembangan Model Pembelajaran Project Based Learning pada Mata Kuliah Computer Aided Design [Development of a Project-Based Learning Model for the Computer Aided Design Course]

    Directory of Open Access Journals (Sweden)

    Satoto Endar Nayono

    2013-09-01

    Full Text Available One of the key competencies of graduates majoring in Civil Engineering and Planning Education, Faculty of Engineering, Yogyakarta State University (YSU) is the ability to plan buildings. CAD courses aim to train students to translate planning concepts into drawings. One of the obstacles faced in the course is that the concepts and drawings created by the students often do not correspond to the standards used in the field. This study aims to develop a project-based learning model so that the students’ drawings are more in line with actual conditions in the field. The study was carried out through the following stages: (1) pre-test, (2) planning of learning, (3) implementation of the project-based learning model, (4) monitoring and evaluation, (5) reflection and revision, (6) implementation of learning in the next cycle, and (7) evaluation of the learning outcomes. The study was conducted over four months in 2012 in the Department of Civil Engineering and Planning Education, Faculty of Engineering, YSU. The subjects were the students who took the Computer Aided Design course. The data were analysed using qualitative description and descriptive statistics. The results were: (1) the implementation of the project-based learning model was shown to improve the learning process and the learning outcomes of students in the CAD course through building-planning drawing tasks for school buildings based on real conditions in the field; the task was given in every meeting and improved based on feedback from the lecturers; (2) the project-based learning model is easier to implement when it is accompanied by peer tutoring and the PAIKEM learning model.

  12. Computational and Experimental Studies of Microstructure-Scale Porosity in Metallic Fuels for Improved Gas Swelling Behavior

    Energy Technology Data Exchange (ETDEWEB)

    Millett, Paul [Univ. of Arkansas, Fayetteville, AR (United States); McDeavitt, Sean [Texas A & M Univ., College Station, TX (United States); Deo, Chaitanya [Georgia Inst. of Technology, Atlanta, GA (United States); Mariani, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2018-01-29

    This proposal will investigate the stability of bimodal pore size distributions in metallic uranium and uranium-zirconium alloys during sintering and re-sintering annealing treatments. The project will utilize both computational and experimental approaches. The computational approach includes both Molecular Dynamics simulations to determine the self-diffusion coefficients in pure U and U-Zr alloys in single crystals, grain boundaries, and free surfaces, as well as calculations of grain boundary and free surface interfacial energies. Phase-field simulations using MOOSE will be conducted to study pore and grain structure evolution in microstructures with bimodal pore size distributions. Experiments will also be performed to validate the simulations, and measure the time-dependent densification of bimodal porous compacts.

  13. Community petascale project for accelerator science and simulation: Advancing computational science for future accelerators and accelerator technologies

    International Nuclear Information System (INIS)

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R and D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  14. Chirp subbottom profiler data collected in Pamlico Sound on cruise SndPt_05_21_22_2012 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  15. Launching "the evolution of cooperation".

    Science.gov (United States)

    Axelrod, Robert

    2012-04-21

    This article describes three aspects of the author's early work on the evolution of the cooperation. First, it explains how the idea for a computer tournament for the iterated Prisoner's Dilemma was inspired by the artificial intelligence research on computer checkers and computer chess. Second, it shows how the vulnerability of simple reciprocity of misunderstanding or misimplementation can be eliminated with the addition of some degree of generosity or contrition. Third, it recounts the unusual collaboration between the author, a political scientist, and William D. Hamilton, an evolutionary biologist. Copyright © 2011 Elsevier Ltd. All rights reserved.
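
    A minimal sketch of the kind of tournament play discussed, pitting plain tit-for-tat against a "generous" variant under noisy implementation, is shown below; the payoff matrix is the conventional Prisoner's Dilemma choice (T=5, R=3, P=1, S=0), and the noise and generosity levels are illustrative, not Axelrod's tournament settings.

```python
import random

# Noisy iterated Prisoner's Dilemma sketch: plain tit-for-tat vs. a "generous"
# variant that sometimes forgives a defection. Payoffs, noise level and
# generosity are illustrative assumptions, not the original tournament's.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(opponent_last):
    return 'C' if opponent_last is None else opponent_last

def generous_tit_for_tat(opponent_last, generosity=0.1):
    move = tit_for_tat(opponent_last)
    if move == 'D' and random.random() < generosity:
        return 'C'                                   # occasionally forgive
    return move

def play(rounds=200, noise=0.05):
    last_a = last_b = None
    score_a = score_b = 0
    for _ in range(rounds):
        a = generous_tit_for_tat(last_b)
        b = tit_for_tat(last_a)
        # noise: a move is occasionally misimplemented
        if random.random() < noise: a = 'D' if a == 'C' else 'C'
        if random.random() < noise: b = 'D' if b == 'C' else 'C'
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        last_a, last_b = a, b
    return score_a, score_b

print(play())
```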

  16. MakerBot projects blueprints

    CERN Document Server

    Larson, Joseph

    2013-01-01

    MakerBot Projects Blueprints is a project-based book, with each chapter taking you through the creation of an awesome stand-alone project. MakerBot Project Blueprints is for anyone with an interest in the 3D printing revolution and the slightest bit of computer skills. Whether you own a 3D printer or not you can design for them. All it takes is Blender, a free 3D modeling tool, this book and a little creativity and someday you'll be able to hold something you designed in the computer in your hands.

  17. Evolution and non-equilibrium physics

    DEFF Research Database (Denmark)

    Becker, Nikolaj; Sibani, Paolo

    2014-01-01

    We argue that the stochastic dynamics of interacting agents which replicate, mutate and die constitutes a non-equilibrium physical process akin to aging in complex materials. Specifically, our study uses extensive computer simulations of the Tangled Nature Model (TNM) of biological evolution...

  18. Sociohistorical evolution of judo: introductory approaches

    Directory of Open Access Journals (Sweden)

    Orozimbo Cordeiro Júnior

    2008-06-01

    Full Text Available This article discusses the sociohistorical evolution of judo as developed in the research project Methodology for teaching judo from the critical–excelling stance. The aim of the project was to establish a plan for systematizing judo as a constituent of body culture and as scholastic knowledge in physical education. The accompanying pedagogical material comprises an introduction, objectives, contents, a teaching methodology and an evaluation system.

  19. Darwinian evolution on a chip.

    Directory of Open Access Journals (Sweden)

    Brian M Paegel

    2008-04-01

    Full Text Available Computer control of Darwinian evolution has been demonstrated by propagating a population of RNA enzymes in a microfluidic device. The RNA population was challenged to catalyze the ligation of an oligonucleotide substrate under conditions of progressively lower substrate concentrations. A microchip-based serial dilution circuit automated an exponential growth phase followed by a 10-fold dilution, which was repeated for 500 log-growth iterations. Evolution was observed in real time as the population adapted and achieved progressively faster growth rates over time. The final evolved enzyme contained a set of 11 mutations that conferred a 90-fold improvement in substrate utilization, coinciding with the applied selective pressure. This system reduces evolution to a microfluidic algorithm, allowing the experimenter to observe and manipulate adaptation.
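
    A toy numerical sketch of the growth-then-dilution cycle described above follows: two variants grow exponentially, the mixture is diluted 10-fold each cycle, and the faster-growing mutant sweeps the population. Growth rates and starting counts are invented for illustration, not the experiment's values.

```python
import numpy as np

# Toy sketch of serial-dilution selection: two RNA variants grow exponentially,
# then the population is diluted 10-fold each cycle. Rates and the initial
# fraction of the faster mutant are invented for illustration.
rates = np.array([1.0, 1.2])          # per-cycle log-growth rates (wild type, mutant)
pop = np.array([1e6, 1e3])            # starting molecule counts

for cycle in range(30):
    pop = pop * np.exp(rates)         # exponential growth phase
    pop = pop / 10.0                  # 10-fold dilution
    frac = pop[1] / pop.sum()
    if cycle % 5 == 0:
        print(f"cycle {cycle:2d}: faster variant fraction = {frac:.3f}")
```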

  20. Explaining the Evolution of Poverty

    DEFF Research Database (Denmark)

    Arndt, Channing; Hussain, Azhar; Jones, Edward Samuel

    2012-01-01

    We provide a comprehensive approach for analyzing the evolution of poverty using Mozambique as a case study. Bringing together data from disparate sources, we develop a novel “back-casting” framework that links a dynamic computable general equilibrium model to a micro-simulation poverty module. ... This framework provides a new approach to explaining and decomposing the evolution of poverty, as well as to examining rigorously the coherence between poverty, economic growth, and inequality outcomes. Finally, various simple but useful and rarely-applied approaches to considering regional changes in poverty...

  1. Computer literacy enhancement in the Teaching Hospital Olomouc. Part I: project management techniques. Short communication.

    Science.gov (United States)

    Sedlár, Drahomír; Potomková, Jarmila; Rehorová, Jarmila; Seckár, Pavel; Sukopová, Vera

    2003-11-01

    Information explosion and globalization make great demands on keeping pace with the new trends in the healthcare sector. The contemporary level of computer and information literacy among most health care professionals in the Teaching Hospital Olomouc (Czech Republic) is not satisfactory for efficient exploitation of modern information technology in diagnostics, therapy and nursing. The present contribution describes the application of two basic problem solving techniques (brainstorming, SWOT analysis) to develop a project aimed at information literacy enhancement.

  2. Evolution of Cloud Storage as Cloud Computing Infrastructure Service

    OpenAIRE

    Rajan, Arokia Paul; Shanmugapriyaa

    2013-01-01

    Enterprises are driving towards less cost, more availability, agility, managed risk - all of which is accelerated towards Cloud Computing. Cloud is not a particular product, but a way of delivering IT services that are consumable on demand, elastic to scale up and down as needed, and follow a pay-for-usage model. Out of the three common types of cloud computing service models, Infrastructure as a Service (IaaS) is a service model that provides servers, computing power, network bandwidth and S...

  3. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  4. The Evolution of Galaxies

    Czech Academy of Sciences Publication Activity Database

    Palouš, Jan

    2007-01-01

    Roč. 17, - (2007), s. 34-40 ISSN 1220-5168. [Heliosphere and galaxy. Sinaia, 03.05.2007-05.05.2007] R&D Projects: GA MŠk(CZ) LC06014 Institutional research plan: CEZ:AV0Z10030501 Keywords: ISM structure * star formation * evolution of galaxies Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics

  5. Fermilab ACP multi-microprocessor project

    International Nuclear Information System (INIS)

    Gaines, I.; Areti, H.; Biel, J.; Bracker, S.; Case, G.; Fischler, M.; Husby, D.; Nash, T.

    1984-08-01

    We report on the status of the Fermilab Advanced Computer Program's project to provide more cost-effective computing engines for the high energy physics community. The project will exploit the cheap, but powerful, commercial microprocessors now available by constructing modular multi-microprocessor systems. A working test bed system as well as plans for the next stages of the project are described

  6. Mixed Waste Treatment Project: Computer simulations of integrated flowsheets

    International Nuclear Information System (INIS)

    Dietsche, L.J.

    1993-12-01

    The disposal of mixed waste, that is, waste containing both hazardous and radioactive components, is a challenging waste management problem of particular concern to DOE sites throughout the United States. Traditional technologies used for the destruction of hazardous wastes need to be re-evaluated for their ability to handle mixed wastes, and in some cases new technologies need to be developed. The Mixed Waste Treatment Project (MWTP) was set up by DOE's Waste Operations Program (EM30) to provide guidance on mixed waste treatment options. One of MWTP's charters is to develop flowsheets for prototype integrated mixed waste treatment facilities which can serve as models for sites developing their own treatment strategies. Evaluation of these flowsheets is being facilitated through the use of computer modelling. The objective of the flowsheet simulations is to provide mass and energy balances, product compositions, and equipment sizing (leading to cost) information. The modelled flowsheets need to be easily modified to examine how alternative technologies and varying feed streams affect the overall integrated process. One such commercially available simulation program is ASPEN PLUS. This report contains details of the Aspen Plus program

  7. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users are becoming more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs) such as limited battery lifetime, limited processing capabilities, and limited storage capacity by offloading the execution and workload to other rich systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
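
    A common way to frame the basic offloading decision discussed in this literature is to offload when remote execution plus data transfer is faster than local execution; the sketch below illustrates that rule with invented device, cloud, and network parameters.

```python
# Minimal sketch of the classic offloading decision rule often discussed in the
# MCC literature: offload when remote execution plus data transfer is cheaper
# than running locally. All device/cloud/network numbers are illustrative.
def should_offload(cycles, data_bytes,
                   local_speed=1e9,      # device CPU, cycles/s
                   cloud_speed=10e9,     # cloud CPU, cycles/s
                   bandwidth=1e6):       # uplink, bytes/s
    t_local = cycles / local_speed
    t_remote = cycles / cloud_speed + data_bytes / bandwidth
    return t_remote < t_local

print(should_offload(cycles=5e9, data_bytes=1e5))   # compute-heavy, small data -> True
print(should_offload(cycles=1e8, data_bytes=5e7))   # light compute, big upload -> False
```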

  8. Land destruction and redevelopment - the use of computer based landscape evolution models for post-mining landscape reconstruction

    Science.gov (United States)

    Hancock, Greg; Willgoose, Garry

    2017-04-01

    Mining provides essential resources for the global economy as well as considerable employment and economic benefits for the community. Mining is necessary for the modern economy. However, in recent decades the scale and environmental impact of mining have grown in line with the global demand for resources. This requires ever increasing areas of land to be disturbed. In particular, open-cast mining removes topsoil, disrupts aquifers and removes uneconomic material to depths of many hundreds of metres. Post-mining, this highly disturbed landscape system requires rehabilitation. The first and most important component of this process is to construct an erosionally stable landform which then can ecologically integrate with the surrounding undisturbed landscape. The scale and importance of this process cannot be overstated, as without planned rehabilitation it is likely that a degraded and highly erosional landscape system will result. Here we discuss computer based landform evolution models which provide essential information on the likely erosional stability of the reconstructed landscape. These models use a digital elevation model to represent the landscape and dynamically adjust the surface in response to erosion and deposition. They provide information on soil erosion rates at the storm event time scale through to annual time scales. The models can also be run to assess landscape evolution at millennial time scales. They also provide information on the type of erosion (i.e. rilling, gullying) and likely gully depths (and if they will occur). Importantly, the latest models have vegetation, armouring and pedogenesis submodels incorporated into their formulation. This allows both the surface and subsurface landscape evolution to be assessed. These models have been widely used and have huge benefits for the assessment of reconstructed landscapes as well as other disturbed landscape systems. Here we outline the state of the art.
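
    As a minimal sketch of how such models operate (not the specific model discussed), a digital elevation model can be evolved with a simple linear hillslope-diffusion rule; the grid size, diffusivity, and time step below are illustrative, not calibrated values.

```python
import numpy as np

# Minimal landform evolution step: a digital elevation model (DEM) is smoothed
# by linear hillslope diffusion, dz/dt = D * laplacian(z), with periodic
# boundaries for simplicity. Parameters are illustrative, not calibrated.
def evolve(dem, diffusivity=0.01, dt=1.0, steps=100, dx=1.0):
    z = dem.astype(float).copy()
    for _ in range(steps):
        lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
               np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z) / dx**2
        z += dt * diffusivity * lap    # erosion where convex, deposition where concave
    return z

dem = np.zeros((50, 50)); dem[20:30, 20:30] = 10.0   # a flat-topped spoil heap
print(evolve(dem).max())                              # the crest lowers as slopes relax
```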

  9. Statistical Emulation of Climate Model Projections Based on Precomputed GCM Runs*

    KAUST Repository

    Castruccio, Stefano

    2014-03-01

    The authors describe a new approach for emulating the output of a fully coupled climate model under arbitrary forcing scenarios that is based on a small set of precomputed runs from the model. Temperature and precipitation are expressed as simple functions of the past trajectory of atmospheric CO2 concentrations, and a statistical model is fit using a limited set of training runs. The approach is demonstrated to be a useful and computationally efficient alternative to pattern scaling and captures the nonlinear evolution of spatial patterns of climate anomalies inherent in transient climates. The approach does as well as pattern scaling in all circumstances and substantially better in many; it is not computationally demanding; and, once the statistical model is fit, it produces emulated climate output effectively instantaneously. It may therefore find wide application in climate impacts assessments and other policy analyses requiring rapid climate projections.

  10. A general exact method for synthesizing parallel-beam projections from cone-beam projections via filtered backprojection

    International Nuclear Information System (INIS)

    Li Liang; Chen Zhiqiang; Xing Yuxiang; Zhang Li; Kang Kejun; Wang Ge

    2006-01-01

    In recent years, image reconstruction methods for cone-beam computed tomography (CT) have been extensively studied. However, few of these studies discussed computing parallel-beam projections from cone-beam projections. In this paper, we focus on the exact synthesis of complete or incomplete parallel-beam projections from cone-beam projections. First, an extended central slice theorem is described to establish a relationship between the Radon space and the Fourier space. Then, data sufficiency conditions are proposed for computing parallel-beam projection data from cone-beam data. Using these results, a general filtered backprojection algorithm is formulated that can exactly synthesize parallel-beam projection data from cone-beam projection data. As an example, we prove that parallel-beam projections can be exactly synthesized in an angular range in the case of circular cone-beam scanning. Interestingly, this angular range is larger than that derived in the Feldkamp reconstruction framework. Numerical experiments are performed in the circular scanning case to verify our method

  11. Evolution and development of virtual inflorescences.

    NARCIS (Netherlands)

    Koes, R.E.

    2008-01-01

    The architecture of inflorescences diverged during the evolution of distinct plant families by mechanisms that remain unknown. Using computer modeling, Przemyslaw Prusinkiewicz and colleagues established a single model for the development of distinct inflorescences. Selection restricts inflorescence

  12. Academic training: From Evolution Theory to Parallel and Distributed Genetic Programming

    CERN Multimedia

    2007-01-01

    2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 15, 16 March From 11:00 to 12:00 - Main Auditorium, bldg. 500 From Evolution Theory to Parallel and Distributed Genetic Programming F. FERNANDEZ DE VEGA / Univ. of Extremadura, SP Lecture No. 1: From Evolution Theory to Evolutionary Computation Evolutionary computation is a subfield of artificial intelligence (more particularly computational intelligence) involving combinatorial optimization problems, which are based to some degree on the evolution of biological life in the natural world. In this tutorial we will review the source of inspiration for this metaheuristic and its capability for solving problems. We will show the main flavours within the field, and different problems that have been successfully solved employing this kind of techniques. Lecture No. 2: Parallel and Distributed Genetic Programming The successful application of Genetic Programming (GP, one of the available Evolutionary Algorithms) to optimization problems has encouraged an ...

  13. Evolution in Many-Sheeted Space-time

    OpenAIRE

    Pitkänen, Matti

    2010-01-01

    The topics of the article have been restricted to those which seem to represent the most well-established ideas about evolution in many-sheeted space-time. a) Basic facts about, and a TGD-based model for, pre-biotic evolution are discussed. b) A model for the ATP-ADP process based on the DNA-as-topological-quantum-computer vision, the identification of universal metabolic energy quanta in terms of zero-point kinetic energies, and the notion of remote metabolism are discussed. c) A model f...

  14. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  15. Computational methods for reversed-field equilibrium

    International Nuclear Information System (INIS)

    Boyd, J.K.; Auerbach, S.P.; Willmann, P.A.; Berk, H.L.; McNamara, B.

    1980-01-01

    Investigating the temporal evolution of reversed-field equilibrium caused by transport processes requires the solution of the Grad-Shafranov equation and computation of field-line-averaged quantities. The technique for field-line averaging and the computation of the Grad-Shafranov equation are presented. Application of Green's function to specify the Grad-Shafranov equation boundary condition is discussed. Hill's vortex formulas used to verify certain computations are detailed. Use of computer software to implement computational methods is described
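
    For reference, the Grad-Shafranov equation referred to above has the standard axisymmetric form (textbook notation; the report's own symbols and boundary treatment may differ):

```latex
% Standard axisymmetric Grad-Shafranov equation for the poloidal flux psi(R, Z)
% (textbook form; the report's own notation may differ).
\[
  R \frac{\partial}{\partial R}\!\left(\frac{1}{R}\frac{\partial \psi}{\partial R}\right)
  + \frac{\partial^{2}\psi}{\partial Z^{2}}
  = -\mu_{0} R^{2} \frac{\mathrm{d}p}{\mathrm{d}\psi} - F\frac{\mathrm{d}F}{\mathrm{d}\psi},
  \qquad F(\psi) = R B_{\phi},
\]
% where p(psi) is the plasma pressure and F(psi) the poloidal current function.
```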

  16. Art as A Playground for Evolution

    DEFF Research Database (Denmark)

    Beloff, Laura

    2016-01-01

    Art works which engage with the topic of human enhancement and evolution have begun appearing parallel to increased awareness about anthropogenic changes to our environment and acceleration of the speed of technological developments that impact us and our biological environment. The article...... and related topics is proposed as play activity for adults, which simultaneously experiments directly with ideas concerning evolution and human development. The author proposes that these kinds of experimental art projects support our mental adaptation to evolutionary changes....

  17. Distributed-memory matrix computations

    DEFF Research Database (Denmark)

    Balle, Susanne Mølleskov

    1995-01-01

    The main goal of this project is to investigate, develop, and implement algorithms for numerical linear algebra on parallel computers in order to acquire expertise in methods for parallel computations. An important motivation for analyzing and investigating the potential for parallelism in these algorithms is that many scientific applications rely heavily on the performance of the involved dense linear algebra building blocks. Even though we consider the distributed-memory as well as the shared-memory programming paradigm, the major part of the thesis is dedicated to distributed-memory architectures. We emphasize distributed-memory massively parallel computers - such as the Connection Machines model CM-200 and model CM-5/CM-5E - available to us at UNI-C and at Thinking Machines Corporation. The CM-200 was at the time this project started one of the few existing massively parallel computers...

  18. Examination of China’s performance and thematic evolution in quantum cryptography research using quantitative and computational techniques

    Science.gov (United States)

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China’s quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001–2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China’s QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China’s performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China’s performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China’s H-index (a normalized indicator) has surpassed all other countries’ over the last several years. The second phase of analysis shows how China’s main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China’s QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology
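
    For reference, the H-index used as one of the performance indicators is the largest h such that h of a unit's papers each have at least h citations; a minimal sketch with hypothetical citation counts follows.

```python
# Minimal sketch of the h-index used as a performance indicator in the study:
# the largest h such that h papers have at least h citations each.
def h_index(citations):
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1]))   # -> 3 (hypothetical citation counts)
```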

  19. The Evolution of Electronic Publishing.

    Science.gov (United States)

    Lancaster, F. W.

    1995-01-01

    Discusses the evolution of electronic publishing from the early 1960s when computers were used merely to produce conventional printed products to the present move toward networked scholarly publishing. Highlights include library development, periodicals on the Internet, online journals versus paper journals, problems, and the future of…

  20. Quantum-mechanical computers and uncomputability

    International Nuclear Information System (INIS)

    Lloyd, S.

    1993-01-01

    The time evolution operator for any quantum-mechanical computer is diagonalizable, but to obtain the diagonal decomposition of a program state of the computer is as hard as actually performing the computation corresponding to the program. In particular, if a quantum-mechanical system is capable of universal computation, then the diagonal decomposition of program states is uncomputable. As a result, in a universe in which local variables support universal computation, a quantum-mechanical theory for that universe that supplies its spectrum cannot supply the spectral decomposition of the computational variables. A "theory of everything" can be simultaneously correct and fundamentally incomplete

  1. The New Mexico Technology Deployment Pilot Project: A technology reinvestment project. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-11-01

    The New Mexico Technology Deployment Project (NMTDP) has been in operation for slightly more than two years. As one of the original TRP projects, NMTDP had the charter to develop and validate a new model for technology extraction which emphasized focused technology collaboration, early industry involvement, and a strong dual use commercialization and productization emphasis. Taken in total, the first two years of the NMTDP have been exceptionally successful, surpassing the goals of the project. This report describes the accomplishments and evolution of the NMTDP to date and discusses the future potential of the project. Despite the end of federal funding, and a subsequent reduction in level of effort, the project partners are committed to continuation of the project.

  2. Evolution of Computed Tomography Findings in Secondary Aortoenteric Fistula

    International Nuclear Information System (INIS)

    Bas, Ahmet; Simsek, Osman; Kandemirli, Sedat Giray; Rafiee, Babak; Gulsen, Fatih; Numan, Furuzan

    2015-01-01

    Aortoenteric fistula is a rare but significant clinical entity associated with high morbidity and mortality if left untreated. Clinical presentation and imaging findings may be subtle, and prompt diagnosis can be difficult. Herein, we present a patient who initially presented with abdominal pain; computed tomography showed an aortic aneurysm compressing the duodenum without any air bubbles. One month later, the patient presented with gastrointestinal bleeding, and computed tomography revealed air bubbles within the aneurysm. With a diagnosis of aortoenteric fistula, endovascular aneurysm repair was carried out. This case uniquely presents the computed tomography findings during the progression of an aneurysm to an aortoenteric fistula

  3. The Use of Computer Tools in the Design Process of Students’ Architectural Projects. Case Studies in Algeria

    Science.gov (United States)

    Saighi, Ouafa; Salah Zerouala, Mohamed

    2017-12-01

    The paper deals with the way in which computer tools are used by students in their design studio projects. Four institutions of architecture education in Algeria are considered as case studies to evaluate the impact of such tools on the student design process. The aim is to inspect such use in depth and to sort out its advantages and shortcomings in order to suggest some solutions. A field survey was undertaken on a sample of students and their teachers at these institutions. The results mainly show that computer tools are used chiefly to improve the quality of drawing representation and of images, seeking observers’ satisfaction and hence influencing their decisions. Some teachers are not very keen on heavy use of the computer during the design phase; they prefer the “traditional” approach. This is the situation Algerian universities currently face, which leads to conflict and disagreement between students and teachers. Meanwhile, there was no doubt that computer tools have effectively contributed to improving the competitive level among students.

  4. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Over time, new areas of interest emerged, concerning technical informatics related to soft computing and more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  5. Computing the distribution of return levels of extreme warm temperatures for future climate projections

    Energy Technology Data Exchange (ETDEWEB)

    Pausader, M.; Parey, S.; Nogaj, M. [EDF/R and D, Chatou Cedex (France); Bernie, D. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-03-15

    In order to take uncertainties in future climate projections into account, there is a growing demand for probabilistic projections of climate change. This paper presents a methodology for producing such a probabilistic analysis of future temperature extremes. The 20- and 100-year return levels are obtained from those of the normalized variable and the changes in mean and standard deviation given by climate models for the desired future periods. Uncertainty in the future change of these extremes is quantified using a multi-model ensemble and a perturbed physics ensemble. The probability density functions of future return levels are computed at a representative location from the joint probability distribution of mean and standard deviation changes given by the two combined ensembles of models. For the studied location, the 100-year return level at the end of the century is lower than 41 °C with 80% confidence. Then, as the number of model simulations is too low to compute a reliable distribution, two techniques proposed in the literature (local pattern scaling and ANOVA) have been used to infer the changes in mean and standard deviation for the combinations of RCM and GCM which have not been run. The ANOVA technique leads to better results for the reconstruction of the mean changes, whereas the two methods fail to correctly infer the changes in standard deviation. As the standard deviation change has a major impact on the return level change, there is a need to improve the models and the different techniques regarding the variance changes. (orig.)
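
    The core relation used in this kind of analysis is that a future return level follows from the return level of the normalized variable plus the projected changes in mean and standard deviation; the sketch below illustrates that relation with invented numbers, not the paper's data.

```python
import numpy as np

# Sketch of the basic idea: if z_T is the T-year return level of the normalized
# variable (X - mu)/sigma, the future return level follows from projected
# changes in mean and standard deviation. All numbers are invented.
z100_normalized = 3.2            # 100-yr return level of the normalized variable
mu_now, sigma_now = 30.0, 2.5    # present-day mean and std of summer maxima (deg C)

# ensemble of projected changes (delta_mu, delta_sigma) from different models
deltas = np.array([[2.0, 0.1], [3.5, 0.4], [1.5, -0.1], [2.8, 0.3]])

future_levels = (mu_now + deltas[:, 0]) + (sigma_now + deltas[:, 1]) * z100_normalized
print(future_levels)                       # distribution of future 100-yr return levels
print(np.percentile(future_levels, 80))    # e.g. an 80% confidence bound
```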

  6. Projection decomposition algorithm for dual-energy computed tomography via deep neural network.

    Science.gov (United States)

    Xu, Yifu; Yan, Bin; Chen, Jian; Zeng, Lei; Li, Lei

    2018-03-15

    Dual-energy computed tomography (DECT) has been widely used to improve identification of substances from different spectral information. Decomposition of the mixed test samples into two materials relies on a well-calibrated material decomposition function. This work aims to establish and validate a data-driven algorithm for estimation of the decomposition function. A deep neural network (DNN) consisting of two sub-nets is proposed to solve the projection decomposition problem. The compressing sub-net, essentially a stacked auto-encoder (SAE), learns a compact representation of the energy spectrum. The decomposing sub-net with a two-layer structure fits the nonlinear transform between energy projection and basic material thickness. The proposed DNN not only delivers images with lower standard deviation and higher quality on both simulated and real data, but also yields the best performance in cases with photon noise. Moreover, the DNN takes only 0.4 s to generate a decomposition of size 360 × 512, which is about 200 times faster than the competing algorithms. The DNN model is applicable to decomposition tasks with different dual energies. Experimental results demonstrated the strong function-fitting ability of the DNN. Thus, the deep learning paradigm provides a promising approach to solving the nonlinear problem in DECT.

  7. [Orthogonal Vector Projection Algorithm for Spectral Unmixing].

    Science.gov (United States)

    Song, Mei-ping; Xu, Xing-wei; Chang, Chein-I; An, Ju-bai; Yao, Li

    2015-12-01

    Spectral unmixing is an important part of hyperspectral technology and is essential for material quantity analysis in hyperspectral imagery. Most linear unmixing algorithms require matrix multiplication and matrix inversion or matrix determinant computations, which are difficult to program and especially hard to realize in hardware; moreover, their computational cost increases significantly as the number of endmembers grows. Here, building on the traditional Orthogonal Subspace Projection algorithm, a new method called Orthogonal Vector Projection is proposed based on the orthogonality principle. It simplifies the process by avoiding matrix multiplication and inversion. It first computes, via the Gram-Schmidt process, the final orthogonal vector for each endmember spectrum; these orthogonal vectors are then used as projection vectors for the pixel signature. The unconstrained abundance is obtained directly by projecting the signature onto the projection vectors and computing the ratio of the projected vector length to the orthogonal vector length. Compared with the Orthogonal Subspace Projection and Least Squares Error algorithms, this method needs no matrix inversion, which is computationally costly and hard to implement in hardware; it completes the orthogonalization process by repeated vector operations, making it easy to apply in both parallel computation and hardware. The soundness of the algorithm is shown through its relationship with the Orthogonal Subspace Projection and Least Squares Error algorithms, and its computational complexity is the lowest of the three. Finally, experimental results on synthetic and real images are provided, giving further evidence of the method's effectiveness.
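
    A minimal sketch of the orthogonal-vector-projection idea as described in the abstract (toy spectra, not the paper's data or exact implementation) follows: each endmember is orthogonalized against the others by Gram-Schmidt, and the unconstrained abundance is the pixel's projection onto that orthogonal vector divided by its squared length.

```python
import numpy as np

# Sketch of the orthogonal-vector-projection idea: orthogonalize each endmember
# against the others (Gram-Schmidt) and project the pixel onto the result.
# Spectra here are random toys, not the paper's data.
def unmix(pixel, endmembers):
    abundances = []
    for i, m in enumerate(endmembers):
        others = [endmembers[j] for j in range(len(endmembers)) if j != i]
        # orthonormalize the other endmembers
        basis = []
        for o in others:
            v = o.copy()
            for b in basis:
                v -= (v @ b) * b
            basis.append(v / np.linalg.norm(v))
        # remove from m the components lying in the span of the others
        u = m.copy()
        for b in basis:
            u -= (u @ b) * b
        abundances.append((pixel @ u) / (u @ u))
    return np.array(abundances)

rng = np.random.default_rng(0)
E = rng.random((3, 50))                    # three toy endmember spectra (50 bands)
x = 0.5 * E[0] + 0.3 * E[1] + 0.2 * E[2]   # noiseless mixed pixel
print(np.round(unmix(x, E), 3))            # ~ [0.5, 0.3, 0.2]
```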

  8. The Magellan Final Report on Cloud Computing

    Energy Technology Data Exchange (ETDEWEB)

    Coghlan, Susan; Yelick, Katherine

    2011-12-21

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid- range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing from performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  9. A Monte Carlo model for 3D grain evolution during welding

    Science.gov (United States)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-09-01

    Welding is one of the most widespread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. The model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.
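
    The building block such a model extends is the standard Potts Monte Carlo grain-growth sweep; a generic 2-D sketch (not the SPPARKS weld model, and with toy lattice size, spin count, and temperature) is shown below.

```python
import numpy as np

# Generic 2-D Potts Monte Carlo grain-growth sweep (the building block the
# SPPARKS weld model builds on); lattice size, spin count and temperature are toy values.
rng = np.random.default_rng(1)
L, q, kT = 64, 20, 0.3
spins = rng.integers(q, size=(L, L))        # each site holds a grain identity

def site_energy(s, i, j, value):
    nbrs = [s[(i + 1) % L, j], s[(i - 1) % L, j], s[i, (j + 1) % L], s[i, (j - 1) % L]]
    return sum(value != n for n in nbrs)    # each unlike neighbour costs 1

for sweep in range(10):
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        new = rng.integers(q)
        dE = site_energy(spins, i, j, new) - site_energy(spins, i, j, spins[i, j])
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            spins[i, j] = new               # accept the flip (Metropolis rule)

print(len(np.unique(spins)), "distinct grain identities remain")
```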

  10. Teachers' Views about the Use of Tablet Computers Distributed in Schools as Part of the Fatih Project

    Science.gov (United States)

    Gökmen, Ömer Faruk; Duman, Ibrahim; Akgün, Özcan Erkan

    2018-01-01

    The purpose of this study is to investigate teachers' views about the use of tablet computers distributed as a part of the FATIH (Movement for Enhancing Opportunities and Improving Technology) Project. In this study, the case study method, one of the qualitative research methods, was used. The participants were 20 teachers from various fields…

  11. Position specific variation in the rate of evolution in transcription factor binding sites

    Energy Technology Data Exchange (ETDEWEB)

    Moses, Alan M.; Chiang, Derek Y.; Kellis, Manolis; Lander, EricS.; Eisen, Michael B.

    2003-08-28

    The binding sites of sequence specific transcription factors are an important and relatively well-understood class of functional non-coding DNAs. Although a wide variety of experimental and computational methods have been developed to characterize transcription factor binding sites, they remain difficult to identify. Comparison of non-coding DNA from related species has shown considerable promise in identifying these functional non-coding sequences, even though relatively little is known about their evolution. Here we analyze the genome sequences of the budding yeasts Saccharomyces cerevisiae, S. bayanus, S. paradoxus and S. mikatae to study the evolution of transcription factor binding sites. As expected, we find that both experimentally characterized and computationally predicted binding sites evolve slower than surrounding sequence, consistent with the hypothesis that they are under purifying selection. We also observe position-specific variation in the rate of evolution within binding sites. We find that the position-specific rate of evolution is positively correlated with degeneracy among binding sites within S. cerevisiae. We test theoretical predictions for the rate of evolution at positions where the base frequencies deviate from background due to purifying selection and find reasonable agreement with the observed rates of evolution. Finally, we show how the evolutionary characteristics of real binding motifs can be used to distinguish them from artifacts of computational motif finding algorithms. As has been observed for protein sequences, the rate of evolution in transcription factor binding sites varies with position, suggesting that some regions are under stronger functional constraint than others. This variation likely reflects the varying importance of different positions in the formation of the protein-DNA complex. The characterization of the pattern of evolution in known binding sites will likely contribute to the effective use of comparative

  12. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  13. From evolution theory to parallel and distributed genetic programming

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    Lecture #1: From Evolution Theory to Evolutionary Computation. Evolutionary computation is a subfield of artificial intelligence (more particularly computational intelligence) involving combinatorial optimization problems, which are based to some degree on the evolution of biological life in the natural world. In this tutorial we will review the source of inspiration for this metaheuristic and its capability for solving problems. We will show the main flavours within the field, and different problems that have been successfully solved employing this kind of techniques. Lecture #2: Parallel and Distributed Genetic Programming. The successful application of Genetic Programming (GP, one of the available Evolutionary Algorithms) to optimization problems has encouraged an increasing number of researchers to apply these techniques to a large set of problems. Given the difficulty of some problems, much effort has been applied to improving the efficiency of GP during the last few years. Among the available proposals,...

  14. Evolution of investment costs related to wood energy collective installations (2000-2006). Final report - Synthesis

    International Nuclear Information System (INIS)

    2009-04-01

    Based on a survey of 90 French projects, and on a comparison with 76 German projects and 36 Austrian projects, this document proposes a synthesis of a study which aimed at identifying and analysing the evolution of investment costs for collective wood heating systems between 2000 and 2006. Data are analysed and commented on, while stressing their limitations, which relate to data quality, project heterogeneity, and the scatter of economic values. The evolution of investment costs of French projects is analysed in terms of global cost and of individual items (heat production, public works, studies and construction, item ratios)

  15. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm for random generation from multivariate probability distributions, rather than as a function optimizer. Finally, some relevant applications of genetic algorithms to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, and design of experiments.
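
    As a sketch of one application mentioned above, selection of variables in regression, the following minimal genetic algorithm searches binary inclusion masks that minimize an AIC-like score; the data are synthetic and the operators (truncation selection, one-point crossover, bit-flip mutation) are generic textbook choices, not the specific methods reviewed in the paper.

      # Minimal genetic algorithm for variable selection in linear regression.
      import numpy as np

      rng = np.random.default_rng(42)
      n, p = 200, 10
      X = rng.normal(size=(n, p))
      beta_true = np.array([3.0, 0, 0, -2.0, 0, 0, 1.5, 0, 0, 0])
      y = X @ beta_true + rng.normal(size=n)

      def score(mask):
          """AIC-like criterion: n*log(RSS/n) + 2*k for the selected columns."""
          k = int(mask.sum())
          if k == 0:
              return n * np.log(np.sum((y - y.mean()) ** 2) / n)
          b, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
          rss = np.sum((y - X[:, mask] @ b) ** 2)
          return n * np.log(rss / n) + 2 * k

      def ga_select(pop_size=30, generations=40, p_mut=0.1):
          pop = rng.random((pop_size, p)) < 0.5            # random inclusion masks
          for _ in range(generations):
              fitness = np.array([score(ind) for ind in pop])
              parents = pop[np.argsort(fitness)[: pop_size // 2]]   # keep the best half
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = parents[rng.integers(len(parents), size=2)]
                  cut = rng.integers(1, p)                 # one-point crossover
                  child = np.concatenate([a[:cut], b[cut:]])
                  child ^= rng.random(p) < p_mut           # bit-flip mutation
                  children.append(child)
              pop = np.vstack([parents] + children)
          fitness = np.array([score(ind) for ind in pop])
          return pop[int(np.argmin(fitness))]

      print("selected variables:", np.flatnonzero(ga_select()))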

  16. Evolution of α-particle distribution in burning plasmas including energy dependent α-transport effects

    International Nuclear Information System (INIS)

    Kamelander, G.; Sigmar, D.; Woloch, F.

    1991-09-01

    This report summarizes the essential results of a joint OEFZS/MIT (Plasma Fusion Center) project to investigate fusion alpha transport. A computer code has been developed that goes beyond standard Fokker-Planck codes, which assume that the fusion products deposit their energy in the plasma at their place of birth. The present transport code allows the calculation of the α-distribution function. By means of the distribution function, the energy deposition rates are calculated. The time evolution of the α-distribution function has been evaluated for an ignited plasma. A description of the transport code, of the subroutines and of the input data, as well as a listing, is appended to this report. (Authors)

  17. Bacterial computing: a form of natural computing and its applications.

    Science.gov (United States)

    Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C

    2014-01-01

    The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Bacterial computing and bacterial intelligence are two general traits manifested along adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the full adaptive possibilities of bacteria may be studied from new angles. For instance, instances of molecular "learning" appear within the mechanisms of evolution. More concretely, and looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two different and related mechanisms for adaptation to the environment: the former in somatic time and the latter in evolutionary time. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes as well as to the solution of real-world problems.

  18. Modernization projects in Santa Maria de Garona

    International Nuclear Information System (INIS)

    Marcos, R.; Alutiz, J. I.; Garcia Sanchez, M.

    2011-01-01

    This article presents an overview of the Santa Maria de Garona power plant modernization guidelines, together with the most significant projects deployed at the plant in the last decade, grouped into mechanical, electrical, instrumentation and IT projects. Three projects are explained in more detail: the change of one of the main transformers, the evolution from paper recorders to paperless video graphic recorders, and the new plant data information system. (Author)

  19. Computer-aided detection of breast carcinoma in standard mammographic projections with digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Destounis, Stamatia [Elizabeth Wende Breast Care, LLC, Rochester, NY (United States); University of Rochester, School of Medicine and Dentistry, Rochester, NY (United States); Hanson, Sarah; Morgan, Renee; Murphy, Philip; Somerville, Patricia; Seifert, Posy; Andolina, Valerie; Arieno, Andrea; Skolny, Melissa; Logan-Young, Wende [Elizabeth Wende Breast Care, LLC, Rochester, NY (United States)

    2009-06-15

    A retrospective evaluation of the ability of computer-aided detection (CAD) to identify breast carcinoma in standard mammographic projections. Forty-five biopsy-proven lesions in 44 patients imaged digitally with CAD applied at examination were reviewed. Forty-four screening BIRADS® category 1 digital mammography examinations were randomly identified to serve as a comparative normal/control population. Data included patient age; BIRADS® breast density; lesion type, size, and visibility; number, type, and location of CAD marks per image; CAD ability to mark lesions; and needle core and surgical pathologic correlation. A CAD lesion/case sensitivity of 87% (n=39) was found, with an image sensitivity of 69% (n=31) for the mediolateral oblique view and 78% (n=35) for the craniocaudal view. The average false positive rate in 44 normal screening cases was 2.0 (range 1-8); this figure is based on 88 reported false positive CAD marks in the 44 normal screening exams. 98% (n=44) of lesions proceeded to excision; initial pathology was upgraded at surgical excision from in situ to invasive disease in 24% (n=9) of lesions. CAD demonstrated potential to detect mammographically visible cancers in standard projections for all lesion types. (orig.)

  20. Using Raspberry Pi to Teach Computing "Inside Out"

    Science.gov (United States)

    Jaokar, Ajit

    2013-01-01

    This article discusses the evolution of computing education in preparing for the next wave of computing. With the proliferation of mobile devices, most agree that we are living in a "post-PC" world. Using the Raspberry Pi computer platform, based in the UK, as an example, the author discusses computing education in a world where the…

  1. Wide-angle display developments by computer graphics

    Science.gov (United States)

    Fetter, William A.

    1989-01-01

    Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, poised to become a major communication medium. Hemispheric film systems have long been present and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions stem not from degrees in science, nor only from a degree in graphic design, but from a history of computer graphics innovations that laid groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.

  2. Computed tomography device

    International Nuclear Information System (INIS)

    Ohhashi, A.

    1985-01-01

    A computed tomography device comprising a subtraction unit which obtains differential data strings representing the difference between each time-serial projection data string of a group of projection data strings corresponding to a prospective reconstruction image generated by projection data strings acquired by a data acquisition system, a convolution unit which convolves each time-serial projection data string of the group of projection data strings corresponding to the prospective reconstruction image, and a back-projection unit which back-projects the convolved data strings

  3. Efficient receiver tuning using differential evolution strategies

    Science.gov (United States)

    Wheeler, Caleb H.; Toland, Trevor G.

    2016-08-01

    Differential evolution (DE) is a powerful and computationally inexpensive optimization strategy that can be used to search an entire parameter space or to converge quickly on a solution. The Kilopixel Array Pathfinder Project (KAPPa) is a heterodyne receiver system delivering 5 GHz of instantaneous bandwidth in the tuning range of 645-695 GHz. The fully automated KAPPa receiver test system finds optimal receiver tuning using performance feedback and DE. We present an adaptation of DE for use in rapid receiver characterization. The KAPPa DE algorithm is written in Python 2.7 and is fully integrated with the KAPPa instrument control, data processing, and visualization code. KAPPa develops the technologies needed to realize heterodyne focal plane arrays containing 1000 pixels. Finding optimal receiver tuning by investigating large parameter spaces is one of many challenges facing the characterization phase of KAPPa. This is a difficult task via by-hand techniques. Characterizing or tuning in an automated fashion without need for human intervention is desirable for future large scale arrays. While many optimization strategies exist, DE is ideal for time and performance constraints because it can be set to converge to a solution rapidly with minimal computational overhead. We discuss how DE is utilized in the KAPPa system and discuss its performance and look toward the future of 1000 pixel array receivers and consider how the KAPPa DE system might be applied.
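
    For orientation, a minimal DE/rand/1/bin iteration is sketched below; the objective function is an arbitrary stand-in for measured receiver performance, and none of this is the actual KAPPa tuning code.

      # Minimal differential evolution (DE/rand/1/bin) sketch.
      import numpy as np

      rng = np.random.default_rng(0)

      def objective(x):
          # Hypothetical surrogate to minimise in place of measured performance.
          return float(np.sum((x - 0.3) ** 2))

      def differential_evolution(bounds, pop_size=20, F=0.8, CR=0.9, generations=100):
          lo, hi = np.array(bounds, float).T
          dim = len(bounds)
          pop = rng.uniform(lo, hi, size=(pop_size, dim))
          scores = np.array([objective(ind) for ind in pop])
          for _ in range(generations):
              for i in range(pop_size):
                  others = [j for j in range(pop_size) if j != i]
                  a, b, c = pop[rng.choice(others, 3, replace=False)]
                  mutant = np.clip(a + F * (b - c), lo, hi)     # mutation
                  cross = rng.random(dim) < CR                  # binomial crossover
                  cross[rng.integers(dim)] = True
                  trial = np.where(cross, mutant, pop[i])
                  s = objective(trial)
                  if s <= scores[i]:                            # greedy selection
                      pop[i], scores[i] = trial, s
          best = int(np.argmin(scores))
          return pop[best], scores[best]

      best_x, best_f = differential_evolution(bounds=[(0.0, 1.0)] * 4)
      print(best_x, best_f)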

  4. The status of US Teraflops-scale projects

    International Nuclear Information System (INIS)

    Mawhinney, R.D.

    1995-01-01

    The current status of United States projects pursuing Teraflops-scale computing resources for lattice field theory is discussed. Two projects are in existence at this time: the Multidisciplinary Teraflops Project, incorporating the physicists of the QCD Teraflops Collaboration, and a smaller project, centered at Columbia, involving the design and construction of a 0.8 Teraflops computer primarily for QCD. (orig.)

  5. Electroweak evolution equations

    International Nuclear Information System (INIS)

    Ciafaloni, Paolo; Comelli, Denis

    2005-01-01

    Enlarging a previous analysis, where only fermions and transverse gauge bosons were taken into account, we write down infrared-collinear evolution equations for the Standard Model of electroweak interactions computing the full set of splitting functions. Due to the presence of double logs which are characteristic of electroweak interactions (Bloch-Nordsieck violation), new infrared singular splitting functions have to be introduced. We also include corrections related to the third generation Yukawa couplings

  6. Embodied artificial evolution

    OpenAIRE

    Eiben, A. E.; Kernbach, S.; Haasdijk, Evert

    2012-01-01

    Evolution is one of the major omnipresent powers in the universe that has been studied for about two centuries. Recent scientific and technical developments make it possible to make the transition from passively understanding to actively using evolutionary processes. Today this is possible in Evolutionary Computing, where human experimenters can design and manipulate all components of evolutionary processes in digital spaces. We argue that in the near future it will be possible to implement a...

  7. Protein consensus-based surface engineering (ProCoS): a computer-assisted method for directed protein evolution.

    Science.gov (United States)

    Shivange, Amol V; Hoeffken, Hans Wolfgang; Haefner, Stefan; Schwaneberg, Ulrich

    2016-12-01

    Protein consensus-based surface engineering (ProCoS) is a simple and efficient method for directed protein evolution combining computational analysis and molecular biology tools to engineer protein surfaces. ProCoS is based on the hypothesis that conserved residues originated from a common ancestor and that these residues are crucial for the function of a protein, whereas highly variable regions (situated on the surface of a protein) can be targeted for surface engineering to maximize performance. ProCoS comprises four main steps: (i) identification of conserved and highly variable regions; (ii) protein sequence design by substituting residues in the highly variable regions, and gene synthesis; (iii) in vitro DNA recombination of synthetic genes; and (iv) screening for active variants. ProCoS is a simple method for surface mutagenesis in which multiple sequence alignment is used for selection of surface residues based on a structural model. To demonstrate the technique's utility for directed evolution, the surface of a phytase enzyme from Yersinia mollaretii (Ymphytase) was subjected to ProCoS. Screening just 1050 clones from ProCoS engineering-guided mutant libraries yielded an enzyme with 34 amino acid substitutions. The surface-engineered Ymphytase exhibited 3.8-fold higher pH stability (at pH 2.8 for 3 h) and retained 40% of the enzyme's specific activity (400 U/mg) compared with the wild-type Ymphytase. The pH stability might be attributed to a significantly increased (20 percentage points; from 9% to 29%) number of negatively charged amino acids on the surface of the engineered phytase.
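
    Step (i), separating conserved from highly variable positions, can be illustrated with a toy per-column entropy calculation over a multiple sequence alignment; the sequences and the 0.5-bit threshold below are invented for illustration and are not part of the ProCoS pipeline.

      # Per-column Shannon entropy of a toy multiple sequence alignment.
      import math
      from collections import Counter

      alignment = [            # hypothetical aligned sequences of equal length
          "MKTAYIAKQR",
          "MKSAYIGKQR",
          "MKTAYLAKHR",
          "MKTGYIAKQR",
      ]

      def column_entropy(column):
          counts = Counter(column)
          n = len(column)
          return -sum((c / n) * math.log2(c / n) for c in counts.values())

      for pos, col in enumerate(zip(*alignment), start=1):
          h = column_entropy(col)
          label = "conserved" if h < 0.5 else "variable"
          print(f"position {pos:2d}: entropy {h:.2f} bits ({label})")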

  8. PAL: an object-oriented programming library for molecular evolution and phylogenetics.

    Science.gov (United States)

    Drummond, A; Strimmer, K

    2001-07-01

    Phylogenetic Analysis Library (PAL) is a collection of Java classes for use in molecular evolution and phylogenetics. PAL provides a modular environment for the rapid construction of both special-purpose and general analysis programs. PAL version 1.1 consists of 145 public classes or interfaces in 13 packages, including classes for models of character evolution, maximum-likelihood estimation, and the coalescent, with a total of more than 27000 lines of code. The PAL project is set up as a collaborative project to facilitate contributions from other researchers. AVAILABILITY: The program is free and is available at http://www.pal-project.org. It requires Java 1.1 or later. PAL is licensed under the GNU General Public License.

  9. Evolution of the cellular communication system: An analysis in the Computational Paradigm

    International Nuclear Information System (INIS)

    Tahir Shah, K.

    1995-03-01

    We discuss the problem of the evolution of the cellular communication system from the RNA world to the progenote to the modern cell. Our method analyses the syntactical structure of molecular fossils in the non-coding regions of DNA within the information-processing gene model developed earlier. We conclude that sequence-specific binding is an ancient communication process with its origin in the RNA world. Moreover, we illustrate our viewpoint using four evolution snapshots, from the first RNA segments, some 4.1 billion years ago, to the first cell, 3.8 billion years ago. (author). 31 refs

  10. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    Science.gov (United States)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  11. High-order hydrodynamic algorithms for exascale computing

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, Nathaniel Ray [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    Hydrodynamic algorithms are at the core of many laboratory missions ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.

  12. Integrated project support environments the ASPECT project

    CERN Document Server

    Brown, Alan W

    1991-01-01

    A major part of software engineering development involves the use of computing tools which facilitate the management, maintenance, security, and building of large-scale software engineering projects. Consequently, there has been a proliferation of CASE tools and IPSEs. This book looks at IPSEs in general and the ASPECT project in particular, providing design and implementation details, as well as locating ASPECT in IPSE developments. It offers a survey of integrated project support environments for more efficient software engineering, a description of a large-scale IPSE, ASPECT, and an evaluation of formal methods in

  13. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    Science.gov (United States)

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  14. Retrofitting of NPP Computer systems

    International Nuclear Information System (INIS)

    Pettersen, G.

    1994-01-01

    Retrofitting of nuclear power plant control rooms is a continuing process for most utilities. This involves introducing and/or extending computer-based solutions for surveillance and control as well as improving the human-computer interface. The paper describes typical requirements when retrofitting NPP process computer systems, and focuses on the activities of the Institutt for energiteknikk, OECD Halden Reactor Project, with respect to such retrofitting, using examples from actual delivery projects. In particular, a project carried out for Forsmarksverket in Sweden comprising an upgrade of the operator system in the control rooms of units 1 and 2 is described. As many of the problems of retrofitting NPP process computer systems are similar to such work in other kinds of process industries, an example from a non-nuclear application area is also given

  15. Final Project Report

    DEFF Research Database (Denmark)

    Workspace

    2003-01-01

    The primary focus of the WORKSPACE project was to augment the working environment through the development of spatial computing components, initially for members of the design professions, but with wider applicability to a range of work domains. The project interpreted the requirements of the Disappearing Computer to be that of “Augmenting reality”, where “Augmented reality” meant: • Augmented user – positioning, visualising • Augmented environment – panels, tables and site-pack • Augmented artifacts – RFID, tagging, tracking • Augmented communications – efficient exchange and integration of the above. The philosophy was to make the computer disappear by both making it large and embedding it into the environment (e.g. furniture). The project has successfully achieved its objectives, and has developed a range of demonstrator prototypes, some of which are in daily use by practitioners within

  16. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  17. TH-E-17A-05: Optimizing Four Dimensional Cone Beam Computed Tomography Projection Allocation to Respiratory Bins

    International Nuclear Information System (INIS)

    OBrien, R; Shieh, C; Kipritidis, J; Keall, P

    2014-01-01

    Purpose: Four dimensional cone beam computed tomography (4DCBCT) is an emerging image guidance strategy but it can suffer from poor image quality. To avoid repeating scans it is beneficial to make the best use of the imaging data obtained. For conventional 4DCBCT the location and size of respiratory bins is fixed and projections are allocated to the respiratory bin within which it falls. Strictly adhering to this rule is unnecessary and can compromise image quality. In this study we optimize the size and location of respiratory bins and allow projections to be sourced from adjacent phases of the respiratory cycle. Methods: A mathematical optimization framework using mixed integer quadratic programming has been developed that determines when to source projections from adjacent respiratory bins and optimizes the size and location of the bins. The method, which we will call projection sharing, runs in under 2 seconds of CPU time. Five 4DCBCT datasets of stage III-IV lung cancer patients were used to test the algorithm. The standard deviation of the angular separation between projections (SD-A) and the standard deviation in the volume of the reconstructed fiducial gold coil (SD-V) were used as proxies to measure streaking artefacts and motion blur respectively. Results: The SD-A using displacement binning and projection sharing was 30%–50% smaller than conventional phase based binning and 59%–76% smaller than conventional displacement binning indicating more uniformly spaced projections and fewer streaking artefacts. The SD-V was 20–90% smaller when using projection sharing than using conventional phase based binning suggesting more uniform marker segmentation and less motion blur. Conclusion: Image quality was visibly and significantly improved with projection sharing. Projection sharing does not require any modifications to existing hardware and offers a more robust replacement to phase based binning, or, an option if phase based reconstruction is not of a

  18. High energy physics and cloud computing

    International Nuclear Information System (INIS)

    Cheng Yaodong; Liu Baoxu; Sun Gongxing; Chen Gang

    2011-01-01

    High Energy Physics (HEP) has been a strong promoter of computing technology, for example the WWW (World Wide Web) and grid computing. In the new era of cloud computing, HEP still has a strong demand, and major international high energy physics laboratories have launched a number of projects to research cloud computing technologies and applications. This paper describes the current developments in cloud computing and its applications in high energy physics. Some ongoing projects at the Institute of High Energy Physics, Chinese Academy of Sciences, including cloud storage, virtual computing clusters, and the BESIII elastic cloud, are also described briefly. (authors)

  19. Prediction of optimal deployment projection for transcatheter aortic valve replacement: angiographic 3-dimensional reconstruction of the aortic root versus multidetector computed tomography.

    OpenAIRE

    Binder Ronald K; Leipsic Jonathon; Wood David; Moore Teri; Toggweiler Stefan; Willson Alex; Gurvitch Ronen; Freeman Melanie; Webb John G

    2012-01-01

    BACKGROUND: Identifying the optimal fluoroscopic projection of the aortic valve is important for successful transcatheter aortic valve replacement (TAVR). Various imaging modalities, including multidetector computed tomography (MDCT), have been proposed for prediction of the optimal deployment projection. We evaluated a method that provides 3-dimensional angiographic reconstructions (3DA) of the aortic root for prediction of the optimal deployment angle and compared it with MDCT. METHODS AND RES...

  20. Peculiarities of organization of project and research activity of students in computer science, physics and technology

    Science.gov (United States)

    Stolyarov, I. V.

    2017-01-01

    The author of this article supervises the project and research activity of students in the areas of computer science, physics, engineering and biology, drawing on experience acquired in these fields. Pupils regularly win competitions and conferences at different levels; for example, three were finalists of Intel ISEF in 2013 in Phoenix (Arizona, USA) and in 2014 in Los Angeles (California, USA). In 2013 A. Makarychev received the "Small Nobel prize" in the Computer Science section and a special sponsors' award from the company CAST. Scientific themes and methods suggested by the author and developed in joint publications with students from Russia, Germany and Austria have led to patents for inventions and registration certificates with ROSPATENT. The article presents the results of the implementation of specific software and hardware systems in physics, engineering and medicine.

  1. ATLAS & Google — "Data Ocean" R&D Project

    CERN Document Server

    The ATLAS collaboration

    2017-01-01

    ATLAS is facing several challenges with respect to its computing requirements for LHC Run-3 (2020-2023) and the HL-LHC runs (2025-2034). The challenges are not specific to ATLAS and/or the LHC, but are common to the HENP computing community. Most importantly, storage continues to be the driving cost factor and at the current growth rate cannot absorb the increased physics output of the experiment. Novel computing models with a more dynamic use of storage and computing resources need to be considered. This project aims to start an R&D effort for evaluating and adopting novel IT technologies for HENP computing. ATLAS and Google plan to launch an R&D project to integrate Google cloud resources (storage and compute) into the ATLAS distributed computing environment. After a series of teleconferences, a face-to-face brainstorming meeting in Denver, CO at the Supercomputing 2017 conference resulted in this proposal for a first prototype of the "Data Ocean" project. The idea is threefold: (a) to allow ATLAS to explore the...

  2. Solving project scheduling problems by minimum cut computations

    NARCIS (Netherlands)

    Möhring, R.H.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    In project scheduling, a set of precedence-constrained jobs has to be scheduled so as to minimize a given objective. In resource-constrained project scheduling, the jobs additionally compete for scarce resources. Due to its universality, the latter problem has a variety of applications in

  3. First International Conference on the Evolution and Development of the Universe

    CERN Document Server

    EDU2008

    2009-01-01

    This document is the Special Issue of the First International Conference on the Evolution and Development of the Universe (EDU 2008). Please refer to the preface and introduction for more details on the contributions. Keywords: acceleration, artificial cosmogenesis, artificial life, Big Bang, Big History, biological evolution, biological universe, biology, causality, classical vacuum energy, complex systems, complexity, computational universe, conscious evolution, cosmological artificial selection, cosmological natural selection, cosmology, critique, cultural evolution, dark energy, dark matter, development of the universe, development, emergence, evolution of the universe, evolution, exobiology, extinction, fine-tuning, fractal space-time, fractal, information, initial conditions, intentional evolution, linear expansion of the universe, log-periodic laws, macroevolution, materialism, meduso-anthropic principle, multiple worlds, natural sciences, Nature, ontology, order, origin of the universe, particle hierarchy, philosophy,...

  4. Improving limited-projection-angle fluorescence molecular tomography using a co-registered x-ray computed tomography scan.

    Science.gov (United States)

    Radrich, Karin; Ale, Angelique; Ermolayev, Vladimir; Ntziachristos, Vasilis

    2012-12-01

    We examine the improvement in imaging performance, such as axial resolution and signal localization, when employing limited-projection-angle fluorescence molecular tomography (FMT) together with x-ray computed tomography (XCT) measurements versus stand-alone FMT. For this purpose, we employed living mice, bearing a spontaneous lung tumor model, and imaged them with FMT and XCT under identical geometrical conditions using fluorescent probes for cancer targeting. The XCT data was employed, herein, as structural prior information to guide the FMT reconstruction. Gold standard images were provided by fluorescence images of mouse cryoslices, providing the ground truth in fluorescence bio-distribution. Upon comparison of FMT images versus images reconstructed using hybrid FMT and XCT data, we demonstrate marked improvements in image accuracy. This work relates to currently disseminated FMT systems, using limited projection scans, and can be employed to enhance their performance.

  5. Implementation of Service Learning and Civic Engagement for Computer Information Systems Students through a Course Project at the Hashemite University

    Science.gov (United States)

    Al-Khasawneh, Ahmad; Hammad, Bashar K.

    2013-01-01

    Service learning methodologies provide information systems students with the opportunity to create and implement systems in real-world, public service-oriented social contexts. This paper presents a case study of integrating a service learning project into an undergraduate Computer Information Systems course titled "Information Systems"…

  6. SciDAC-Data, A Project to Enabling Data Driven Modeling of Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, M.; Ding, P.; Aliaga, L.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.

    2016-10-10

    The SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab Data Center on the organization, movement, and consumption of High Energy Physics data. The project will analyze the analysis patterns and data organization that have been used by the NOvA, MicroBooNE, MINERvA and other experiments, to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We will address the use of the SciDAC-Data distributions acquired from the Fermilab Data Center’s analysis workflows, corresponding to around 71,000 HEP jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular we describe in detail how the Sequential Access via Metadata (SAM) data handling system, in combination with the dCache/Enstore-based data archive facilities, has been analyzed to develop radically different models of HEP data analysis. We present how the simulation may be used to analyze the impact of design choices in archive facilities.

  7. RNAdualPF: software to compute the dual partition function with sample applications in molecular evolution theory.

    Science.gov (United States)

    Garcia-Martin, Juan Antonio; Bayegan, Amir H; Dotu, Ivan; Clote, Peter

    2016-10-19

    -content. Using different inverse folding software, another group had earlier shown that pre-miRNA is mutationally robust, even controlling for compositional bias. Our opposite conclusion suggests a cautionary note that computationally based insights into molecular evolution may heavily depend on the software used. C/C++-software for RNAdualPF is available at http://bioinformatics.bc.edu/clotelab/RNAdualPF .

  8. Evolution of parasitism in kinetoplastid flagellates

    Czech Academy of Sciences Publication Activity Database

    Lukeš, Julius; Skalický, Tomáš; Týč, Jiří; Votýpka, Jan; Yurchenko, Vyacheslav

    2014-01-01

    Roč. 195, č. 2 (2014), s. 115-122 ISSN 0166-6851 R&D Projects: GA MŠk(CZ) EE2.3.30.0032 Institutional support: RVO:60077344 Keywords : Evolution * Phylogeny * Vectors * Diversity * Parasitism * Trypanosome Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 1.787, year: 2014

  9. The evolution of project financing in the geothermal industry

    International Nuclear Information System (INIS)

    Cardenas, G.S.; Miller, D.M.

    1990-01-01

    Sound underlying economics and beneficial contractual relationships are the fundamentals of any project financing. Given these essential elements, the successful transaction must properly allocate the costs, benefits and risks to the appropriate participants in the most efficient manner. In this paper the authors examine four instances in which project financing offered optimal solutions to this problem in a series of transactions for the successive development of the 70 MW Ormesa Geothermal Energy Complex in the Imperial Valley of California

  10. Differences in the Concept of Fitness Between Artificial Evolution and Natural Selection

    OpenAIRE

    Adami, Christoph; Bryson, David M.; Ofria, Charles; Pennock, Robert T.; Lichocki, Pawel; Keller, Laurent; Floreano, Dario

    2012-01-01

    Evolutionary algorithms were proposed to automatically find solutions to computational problems, much like evolution discovers new adaptive traits. Lately, they have been used to address challenging questions about the evolution of modularity, the genetic code, communication, division of labor and cooperation. Evolutionary algorithms are increasingly popular in biological studies, because they give precise control over the experimental conditions and allow the study of evolution at unpreceden...

  11. Cosmic Collisions: Galaxy Mergers and Evolution

    Science.gov (United States)

    Trouille, Laura; Willett, Kyle; Masters, Karen; Lintott, Christopher; Whyte, Laura; Lynn, Stuart; Tremonti, Christina A.

    2014-08-01

    Over the years evidence has mounted for a significant mode of galaxy evolution via mergers. This process links gas-rich, spiral galaxies; starbursting galaxies; active galactic nuclei (AGN); post-starburst galaxies; and gas-poor, elliptical galaxies, as objects representing different phases of major galaxy mergers. The post-starburst phase is particularly interesting because nearly every galaxy that evolves from star-forming to quiescent must pass through it. In essence, this phase is a sort of galaxy evolution “bottleneck” that indicates that a galaxy is actively evolving through important physical transitions. In this talk I will present the results from the ‘Galaxy Zoo Quench’ project - using post-starburst galaxies to place observational constraints on the role of mergers and AGN activity in quenching star formation. ‘Quench’ is the first fully collaborative research project with Zooniverse citizen scientists online, engaging the public in all phases of research, from classification to data analysis and discussion to writing the article and submitting it to a refereed journal.

  12. Evolution of acoustically vaporized microdroplets in gas embolotherapy

    KAUST Repository

    Qamar, Adnan; Wong, ZhengZheng; Fowlkes, Brian Brian; Bull, Joseph L.

    2012-01-01

    Acoustic vaporization dynamics of a superheated dodecafluoropentane (DDFP) microdroplet inside a microtube and the resulting bubble evolution is investigated in the present work. This work is motivated by a developmental gas embolotherapy technique that is intended to treat cancers by infarcting tumors using gas bubbles. A combined theoretical and computational approach is utilized and compared with the experiments to understand the evolution process and to estimate the resulting stress distribution associated with vaporization event. The transient bubble growth is first studied by ultra-high speed imaging and then theoretical and computational modeling is used to predict the entire bubble evolution process. The evolution process consists of three regimes: an initial linear rapid spherical growth followed by a linear compressed oval shaped growth and finally a slow asymptotic nonlinear spherical bubble growth. Although the droplets are small compared to the tube diameter, the bubble evolution is influenced by the tube wall. The final bubble radius is found to scale linearly with the initial droplet radius and is approximately five times the initial droplet radius. A short pressure pulse with amplitude almost twice as that of ambient conditions is observed. The width of this pressure pulse increases with increasing droplet size whereas the amplitude is weakly dependent. Although the rise in shear stress along the tube wall is found to be under peak physiological limits, the shear stress amplitude is found to be more prominently influenced by the initial droplet size. The role of viscous dissipation along the tube wall and ambient bulk fluid pressure is found to be significant in bubble evolution dynamics. © 2012 American Society of Mechanical Engineers.

  13. Computing with words to feasibility study of software projects

    Directory of Open Access Journals (Sweden)

    Marieta Peña Abreu

    2017-02-01

    Objective: This paper proposes a method to analyze the technical, commercial and social feasibility of software projects in environments of uncertainty. It allows working with multiple experts and multiple criteria and facilitates decision-making. Method: The proposal contains two phases: first, the necessary information is collected; second, projects are evaluated using the 2-tuple linguistic representation model. The experts are selected by analyzing their curricular synthesis. The evaluation criteria are defined using the Focus Group technique and weighted in the interval (0,1) according to their importance. Three domains are offered to express preferences: numeric, interval-valued and linguistic. For aggregation, the extended arithmetic mean and the extended weighted average are used, preventing the loss of information. A 2-tuple (feasibility, precision) is obtained as a result for each project. Results: Project P1 was evaluated as having very high feasibility with a precision of -0.33. Project P2 obtained high feasibility with a precision of 0.38, and project P3 achieved medium feasibility with a precision of -0.21. Conclusions: This method is well suited to feasibility analysis of software projects with multiple experts and criteria in environments of uncertainty. It handles heterogeneous assessments without loss of information, and its results are consistent and useful for decision makers.
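
    A minimal sketch of the 2-tuple linguistic representation used in the Method section is given below; the term set, expert weights and assessments are hypothetical, and only the basic translation functions and the extended weighted average are shown.

      # Hedged sketch of the 2-tuple linguistic model: assessments on a term
      # set become (label index, alpha) pairs so aggregation loses no information.
      TERMS = ["very low", "low", "medium", "high", "very high"]   # s_0 .. s_4

      def delta(beta):
          """Delta: a value in [0, g] -> equivalent 2-tuple (term index, alpha)."""
          i = int(round(beta))
          return i, round(beta - i, 2)          # alpha in [-0.5, 0.5)

      def delta_inv(two_tuple):
          """Delta^-1: 2-tuple -> numeric value beta."""
          i, alpha = two_tuple
          return i + alpha

      def weighted_2tuple(assessments, weights):
          """Extended weighted average of 2-tuples (weights sum to 1)."""
          beta = sum(w * delta_inv(a) for a, w in zip(assessments, weights))
          return delta(beta)

      # Three hypothetical experts rate one criterion of a project.
      assessments = [(3, 0.0), (4, 0.0), (2, 0.0)]   # high, very high, medium
      weights = [0.5, 0.3, 0.2]
      label, alpha = weighted_2tuple(assessments, weights)
      print(f"aggregated feasibility: ({TERMS[label]}, {alpha:+.2f})")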

  14. Computational model for simulation small testing launcher, technical solution

    Energy Technology Data Exchange (ETDEWEB)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)

    2014-12-10

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT), used to test space equipment and carry out scientific measurements. The computational model consists of a numerical simulation of the SLT evolution for different launch conditions. The launcher model presented has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multistage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided, use solid-fuel motors for propulsion and follow an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. The project itself can therefore be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as shown in the title, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system able to go into service in a predictable period of time; and second, a long-term objective that consists in the development and testing of some unconventional subsystems which will later be integrated in the satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital
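
    As a heavily simplified, hypothetical illustration of what a variable-mass trajectory simulation involves (one degree of freedom, constant thrust, no drag or control, invented numbers), rather than the 6DOF SLT model itself:

      # 1-DOF vertical ascent of a launcher with variable mass (toy model).
      G0 = 9.81            # gravitational acceleration, m/s^2

      def simulate(m0=500.0, m_dry=200.0, thrust=15e3, burn_time=30.0, dt=0.01):
          mdot = (m0 - m_dry) / burn_time      # constant propellant flow rate
          m, v, h, t = m0, 0.0, 0.0, 0.0
          while h >= 0.0:                      # integrate until impact
              a = (thrust / m - G0) if t < burn_time else -G0
              v += a * dt
              h += v * dt
              if t < burn_time:
                  m -= mdot * dt               # mass decreases during the burn
              t += dt
          return t

      print(f"flight time: {simulate():.1f} s")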

  15. Low latency network and distributed storage for next generation HPC systems: the ExaNeSt project

    Science.gov (United States)

    Ammendola, R.; Biagioni, A.; Cretaro, P.; Frezza, O.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Paolucci, P. S.; Pastorelli, E.; Pisani, F.; Simula, F.; Vicini, P.; Navaridas, J.; Chaix, F.; Chrysos, N.; Katevenis, M.; Papaeustathiou, V.

    2017-10-01

    With processor architecture evolution, the HPC market has undergone a paradigm shift. The adoption of low-cost, Linux-based clusters extended the reach of HPC from its roots in modelling and simulation of complex physical systems to a broader range of industries, from biotechnology, cloud computing, computer analytics and big data challenges to manufacturing sectors. In this perspective, near-future HPC systems can be envisioned as composed of millions of low-power computing cores, densely packed — which implies cooling by appropriate technology — with a tightly interconnected, low-latency and high-performance network, and equipped with a distributed storage architecture. Each of these features — dense packing, distributed storage and high-performance interconnect — represents a challenge, made all the harder by the need to solve them at the same time. These challenges lie as stumbling blocks along the road towards Exascale-class systems; the ExaNeSt project acknowledges them and tasks itself with investigating ways around them.

  16. Development of efficient time-evolution method based on three-term recurrence relation

    International Nuclear Information System (INIS)

    Akama, Tomoko; Kobayashi, Osamu; Nanbu, Shinkoh

    2015-01-01

    The advantage of the real-time (RT) propagation method is that it directly solves the time-dependent Schrödinger equation, which describes frequency properties as well as the full dynamics of a molecular system composed of electrons and nuclei in quantum physics and chemistry. Its applications have been limited by computational feasibility, as the evaluation of the time-evolution operator is computationally demanding. In this article, a new efficient time-evolution method based on the three-term recurrence relation (3TRR) is proposed to reduce the time-consuming numerical procedure. The basic formula of this approach was derived by introducing a transformation of the operator using the arcsine function. Since this operator transformation causes a transformation of time, we derived the relation between the original and transformed time. The formula was applied to the RT time-dependent Hartree-Fock (RT-TDHF) method and to time-dependent density functional theory in order to assess its performance. Compared to the commonly used fourth-order Runge-Kutta method, our new approach decreased the computational time of the RT-TDHF calculation by about a factor of four, showing the 3TRR formula to be an efficient time-evolution method for reducing computational cost
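
    For orientation, the sketch below shows the general idea of recurrence-based propagation using the standard Chebyshev expansion of exp(-iHt), whose polynomials obey the three-term recurrence T_{k+1}(x) = 2xT_k(x) - T_{k-1}(x); this is a textbook scheme shown for comparison only, not the arcsine-transformed formula derived in the paper.

      # Chebyshev (three-term recurrence) propagation of a small quantum state.
      import numpy as np
      from scipy.special import jv            # Bessel functions J_k

      def chebyshev_propagate(H, psi0, t, n_terms=60):
          evals = np.linalg.eigvalsh(H)
          e_min, e_max = evals[0], evals[-1]
          half_span, e_mid = (e_max - e_min) / 2.0, (e_max + e_min) / 2.0
          Hn = (H - e_mid * np.eye(len(H))) / half_span    # spectrum mapped to [-1, 1]
          z = half_span * t
          phi_prev, phi = psi0, Hn @ psi0                  # T_0 psi0 and T_1 psi0
          psi = jv(0, z) * phi_prev + 2.0 * (-1j) * jv(1, z) * phi
          for k in range(2, n_terms):
              phi_prev, phi = phi, 2.0 * (Hn @ phi) - phi_prev   # three-term recurrence
              psi = psi + 2.0 * (-1j) ** k * jv(k, z) * phi
          return np.exp(-1j * e_mid * t) * psi

      # Check against exact diagonalisation for a small random Hermitian H.
      rng = np.random.default_rng(1)
      A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
      H = (A + A.conj().T) / 2.0
      psi0 = np.array([1.0, 0.0, 0.0, 0.0], dtype=complex)
      w, V = np.linalg.eigh(H)
      exact = V @ (np.exp(-1j * w * 1.5) * (V.conj().T @ psi0))
      print(np.allclose(chebyshev_propagate(H, psi0, 1.5), exact))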

  17. Computer science and the recent innovations of the modern society

    Directory of Open Access Journals (Sweden)

    Gheorghe Popescu

    2010-12-01

    The paper “Computer science and the recent innovations of the modern society” presents the importance of computer science, covering the most important historical moments in its evolution, the main theoretical elements of computation science, computer elements and architecture, and the latest innovations in computer science, such as Artificial Intelligence.

  18. General formalism of Hamiltonians for realizing a prescribed evolution of a qubit

    International Nuclear Information System (INIS)

    Tong, D.M.; Chen, J.-L.; Lai, C.H.; Oh, C.H.; Kwek, L.C.

    2003-01-01

    We investigate the inverse problem concerning the evolution of a qubit system; specifically, we consider how one can establish the Hamiltonians that account for the evolution of a qubit along a prescribed path in the projective Hilbert space. For a given path, there are infinitely many Hamiltonians which can realize the same evolution. A general form of the Hamiltonians is constructed, from which one may select the desired one for implementing a prescribed evolution. This scheme can be generalized to higher dimensional systems
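
    As a hedged illustration of this non-uniqueness (a well-known special case, not necessarily the general form constructed in the paper): for a normalized state |ψ(t)⟩ tracing the prescribed path, the Hermitian Hamiltonian

      H(t) = i\hbar\left( |\dot{\psi}(t)\rangle\langle\psi(t)| - |\psi(t)\rangle\langle\dot{\psi}(t)| \right)

    drives the system along that path in the projective Hilbert space, and adding any term \lambda(t)\,|\psi(t)\rangle\langle\psi(t)| merely changes the accumulated phase, which already shows that infinitely many Hamiltonians realize the same projective evolution.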

  19. The Computer as Rorschach: Implications for Management and User Acceptance

    OpenAIRE

    Kaplan, Bonnie

    1983-01-01

    Different views of the computer held by different participants in a medical computing project make it difficult to gain wide acceptance of an application. Researchers', programmers', and clinicians' views illustrate how users project their views onto the computer. Effects of these different views on user acceptance and implications for the management of computer projects are presented.

  20. Results of the deepest all-sky survey for continuous gravitational waves on LIGO S6 data running on the Einstein@Home volunteer distributed computing project

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; et al. (LIGO Scientific Collaboration and Virgo Collaboration)

    2016-01-01

    We report results of a deep all-sky search for periodic gravitational waves from isolated neutron stars in data from the S6 LIGO science run. The search was possible thanks to the computing power provided by the volunteers of the Einstein@Home distributed computing project. We find no significant
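    The record above describes an all-sky search for periodic (continuous-wave) signals carried out on a volunteer distributed-computing project. As a purely illustrative sketch of the general idea, the following Python snippet searches a bank of frequency templates for a weak sinusoid buried in noise and splits the template bank into "work units", loosely mimicking how such a project distributes its workload. All function names, parameters, and the simulated data here are hypothetical; this is not the LIGO or Einstein@Home pipeline.

    import numpy as np

    def make_data(f_signal=61.5, fs=256.0, duration=64.0, amp=0.1, seed=0):
        """Simulate a weak sinusoid buried in white Gaussian noise."""
        rng = np.random.default_rng(seed)
        t = np.arange(0.0, duration, 1.0 / fs)
        signal = amp * np.sin(2.0 * np.pi * f_signal * t)
        noise = rng.standard_normal(t.size)
        return t, signal + noise

    def template_power(t, x, f):
        """Demodulated power of the data against one frequency template."""
        phase = 2.0 * np.pi * f * t
        c = np.dot(x, np.cos(phase))
        s = np.dot(x, np.sin(phase))
        return (c * c + s * s) / t.size

    def search_work_unit(t, x, freqs):
        """Process one work unit: return the loudest template in its band."""
        powers = np.array([template_power(t, x, f) for f in freqs])
        best = int(np.argmax(powers))
        return freqs[best], powers[best]

    if __name__ == "__main__":
        t, x = make_data()
        # Split the full template bank into work units, as a distributed
        # volunteer-computing project would hand them out to participants.
        all_freqs = np.arange(50.0, 75.0, 0.002)
        work_units = np.array_split(all_freqs, 25)
        results = [search_work_unit(t, x, wu) for wu in work_units]
        f_best, p_best = max(results, key=lambda r: r[1])
        print(f"loudest candidate: f = {f_best:.3f} Hz, power = {p_best:.2f}")

    Running the sketch recovers the injected 61.5 Hz tone as the loudest candidate; in a real search the per-work-unit results would be returned to a central server and ranked there.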