WorldWideScience

Sample records for computer evolution project

  1. Nascence project: nanoscale engineering for novel computation using evolution

    NARCIS (Netherlands)

    Broersma, Haitze J.; Gomez, Faustino; Miller, Julian; Petty, Mike; Tufte, Gunnar

    2012-01-01

    Living systems are able to achieve prodigious feats of computation with remarkable speed and efficiency (e.g. navigation in a complex environment, object recognition, decision making, and reasoning). Many of these tasks have not been adequately solved using algorithms running on our most powerful

  2. Computer Interactives for the Mars Atmospheric and Volatile Evolution (MAVEN) Mission through NASA's "Project Spectra!"

    Science.gov (United States)

    Wood, E. L.

    2014-12-01

    "Project Spectra!" is a standards-based E-M spectrum and engineering program that includes paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games, students experience and manipulate information making abstract concepts accessible, solidifying understanding and enhancing retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new interactives. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature. Students design a planet that is able to maintain liquid water on the surface. In the second interactive, students are asked to consider conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives were pilot tested at Arvada High School in Colorado.

  3. Computer simulations for the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission through NASA's "Project Spectra!"

    Science.gov (United States)

    Christofferson, R.; Wood, E. L.; Euler, G.

    2012-12-01

    "Project Spectra!" is a standards-based light science and engineering program on solar system exploration that includes both hands-on paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games where students experience and manipulate the information makes abstract concepts accessible. Visualizing lessons with multi-media tools solidifies understanding and retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As a part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new "Project Spectra!" interactives that go hand-in-hand with a paper and pencil activity. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature and how they contribute. Students are asked to create a scenario in which a planet they build and design is able to maintain liquid water on the surface. In the second interactive, students are asked to consider Mars and the conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives are currently being pilot tested at Arvada High School in Colorado.

  4. Computer simulations for the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission through NASA's 'Project Spectra!'

    Science.gov (United States)

    Wood, E. L.

    2013-12-01

    'Project Spectra!' is a standards-based light science and engineering program on solar system exploration that includes both hands-on paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games where students experience and manipulate the information makes abstract concepts accessible. Visualizing lessons with multi-media tools solidifies understanding and retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As a part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new 'Project Spectra!' interactives that go hand-in-hand with a paper and pencil activity. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature and how they contribute. Students are asked to create a scenario in which a planet they build and design is able to maintain liquid water on the surface. In the second interactive, students are asked to consider Mars and the conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives were pilot tested at Arvada High School in Colorado.

  5. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  6. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  7. Computer Assets Recovery Project

    Science.gov (United States)

    CortesPena, Aida Yoguely

    2010-01-01

    This document reports on the project that was performed during the internship of the author. The project involved locating and recovering machines in various locations that Boeing has no need for, and therefore requires that they be transferred to another user or transferred to a non-profit organization. Other projects that the author performed were an inventory of toner and printers, loading new computers, and connecting them to the network.

  8. The evolution of computer technology

    CERN Document Server

    Kamar, Haq

    2018-01-01

    Today it seems that computers occupy every single space in life. This book traces the evolution of computers from their humble beginnings as simple calculators up to modern-day jack-of-all-trades devices like the iPhone. Readers will learn how computers evolved from humongous military-issue refrigerators to the spiffy, delicate, and intriguing devices that many modern people feel they can't live without anymore. Readers will also discover the historical significance of computers, and their pivotal roles in World War II, the Space Race, and the emergence of modern Western powers.

  9. '95 computer system operation project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-12-01

    This report describes the overall project work related to the operation of mainframe computers, the management of nuclear computer codes and the nuclear computer code conversion project. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The completion of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  10. '95 computer system operation project

    International Nuclear Information System (INIS)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung

    1995-12-01

    This report describes the overall project work related to the operation of mainframe computers, the management of nuclear computer codes and the nuclear computer code conversion project. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The completion of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  11. Effective Strategies for Teaching Evolution: The Primary Evolution Project

    Science.gov (United States)

    Hatcher, Chris

    2015-01-01

    When Chris Hatcher joined the Primary Evolution Project team at the University of Reading, his goal was to find effective strategies to teach evolution in a way that keeps children engaged and enthused. Hatcher has collaborated with colleagues at the University's Institute of Education to break the evolution unit down into distinct topics and…

  12. Open Compute Project at CERN

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Open Compute Project, OCP (http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware designs, with the goal to develop servers and data centers following the model traditionally associated with open source software projects. We have been following the OCP project for some time and decided to buy two OCP twin servers in 2013 to get some hands-on experience. The servers have been tested and compared with our standard hardware regularly acquired through large tenders. In this presentation we will give some relevant results from this testing and also discuss some of the more important differences that can matter for a larger deployment at CERN. Finally, we will outline the details for a possible project for a larger deployment of OCP hardware for production use at CERN.

  13. CMS computing upgrade and evolution

    CERN Document Server

    Hernandez Calama, Jose

    2013-01-01

    The distributed Grid computing infrastructure has been instrumental in the successful exploitation of the LHC data leading to the discovery of the Higgs boson. The computing system will need to face new challenges from 2015 on when LHC restarts with an anticipated higher detector output rate and event complexity, but with only a limited increase in the computing resources. A more efficient use of the available resources will be mandatory. CMS is improving the data storage, distribution and access as well as the processing efficiency. Remote access to the data through the WAN, dynamic data replication and deletion based on the data access patterns, and separation of disk and tape storage are some of the areas being actively developed. Multi-core processing and scheduling is being pursued in order to make a better use of the multi-core nodes available at the sites. In addition, CMS is exploring new computing techniques, such as Cloud Computing, to get access to opportunistic resources or as a means of using wit...

  14. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  15. Computer applications in project KARP

    International Nuclear Information System (INIS)

    Raju, R.P.; Siddiqui, H.R.

    1992-01-01

    For effective project implementation of the Kalpakkam Reprocessing Plant (KARP) at Kalpakkam, an elaborate Management Information System (MIS) was developed in-house for physical and financial progress monitoring and reporting. Computer aided design software for the design of process piping layout was also developed and implemented for the generation of process cell piping drawings for construction purposes. Modelling and simulation studies were carried out to optimize process parameters, and fault tree analysis techniques were utilised for evaluating plant availability factors. (author). 2 tabs

  16. Evolution of Computational Toxicology-from Primitive ...

    Science.gov (United States)

    Presentation at the Health Canada seminar in Ottawa, ON, Canada on Nov. 15, 2016, on the Evolution of Computational Toxicology, from Primitive Beginnings to Sophisticated Application

  17. Projected evolution superoperators and the density operator

    International Nuclear Information System (INIS)

    Turner, R.E.; Dahler, J.S.; Snider, R.F.

    1982-01-01

    The projection operator method of Zwanzig and Feshbach is used to construct the time dependent density operator associated with a binary scattering event. The formula developed to describe this time dependence involves time-ordered cosine and sine projected evolution (memory) superoperators. Both Schroedinger and interaction picture results are presented. The former is used to demonstrate the equivalence of the time dependent solution of the von Neumann equation and the more familiar frequency dependent Laplace transform solution. For two particular classes of projection superoperators projected density operators are shown to be equivalent to projected wave functions. Except for these two special cases, no projected wave function analogs of projected density operators exist. Along with the decoupled-motions approximation, projected interaction picture density operators are applied to inelastic scattering events. Simple illustrations are provided of how this formalism is related to previously established results for two-state processes, namely, the theory of resonant transfer events, the first order Magnus approximation, and the Landau-Zener theory

  18. The Evolution of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; Berghaus, Frank; Brasolin, Franco; Cordeiro, Cristovao; Desmarais, Ron; Field, Laurence; Gable, Ian; Giordano, Domenico; Di Girolamo, Alessandro; Hover, John; Leblanc, Matthew Edgar; Love, Peter; Paterson, Michael; Sobie, Randall; Zaytsev, Alexandr

    2015-01-01

    The ATLAS experiment has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This paper describes the overall evolution of cloud computing in ATLAS. The current status of the virtual machine (VM) management systems used for harnessing infrastructure as a service (IaaS) resources are discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for ma...

  19. The Evolution of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Berghaus, Frank; Love, Peter; Leblanc, Matthew Edgar; Di Girolamo, Alessandro; Paterson, Michael; Gable, Ian; Sobie, Randall; Field, Laurence

    2015-01-01

    The ATLAS experiment has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This work will describe the overall evolution of cloud computing in ATLAS. The current status of the VM management systems used for harnessing IAAS resources will be discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for managing VM images across multiple clouds, ...

  20. Digital Genesis: Computers, Evolution and Artificial Life

    OpenAIRE

    Taylor, Tim; Dorin, Alan; Korb, Kevin

    2015-01-01

    The application of evolution in the digital realm, with the goal of creating artificial intelligence and artificial life, has a history as long as that of the digital computer itself. We illustrate the intertwined history of these ideas, starting with the early theoretical work of John von Neumann and the pioneering experimental work of Nils Aall Barricelli. We argue that evolutionary thinking and artificial life will continue to play an integral role in the future development of the digital ...

  1. The LHC Computing Grid Project

    CERN Multimedia

    Åkesson, T

    In the last ATLAS eNews I reported on the preparations for the LHC Computing Grid Project (LCGP). Significant LCGP resources were mobilized during the summer, and there have been numerous iterations on the formal paper to put forward to the CERN Council to establish the LCGP. ATLAS, and also the other LHC-experiments, has been very active in this process to maximally influence the outcome. Our main priorities were to ensure that the global aspects are properly taken into account, that the CERN non-member states are also included in the structure, that the experiments are properly involved in the LCGP execution and that the LCGP takes operative responsibility during the data challenges. A Project Launch Board (PLB) was active from the end of July until the 10th of September. It was chaired by Hans Hoffmann and had the IT division leader as secretary. Each experiment had a representative (me for ATLAS), and the large CERN member states were each represented while the smaller were represented as clusters ac...

  2. Bibliography. Computer-Oriented Projects, 1987.

    Science.gov (United States)

    Smith, Richard L., Comp.

    1988-01-01

    Provides an annotated list of references on computer-oriented projects. Includes information on computers; hands-on versus simulations; games; instruction; students' attitudes and learning styles; artificial intelligence; tutoring; and application of spreadsheets. (RT)

  3. Group Projects and the Computer Science Curriculum

    Science.gov (United States)

    Joy, Mike

    2005-01-01

    Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…

  4. Computer graphics and research projects

    International Nuclear Information System (INIS)

    Ingtrakul, P.

    1994-01-01

    This report was prepared as an account of scientific visualization tools and application tools for scientists and engineers. It provided a set of tools to create pictures and to interact with them in natural ways. It applied many techniques of computer graphics and computer animation through a number of full-color presentations such as computer animated commercials, 3D computer graphics, dynamic and environmental simulations, scientific modeling and visualization, physically based modelling, and behavioral, skeletal, dynamics, and particle animation. It took an in-depth look at original hardware and the limitations of existing PC graphics adapters concerning system performance, especially with graphics-intensive application programs and user interfaces

  5. Project PEACH at UCLH: Student Projects in Healthcare Computing.

    Science.gov (United States)

    Ramachandran, Navin; Mohamedally, Dean; Taylor, Paul

    2017-01-01

    A collaboration between clinicians at UCLH and the Dept of Computer Science at UCL is giving students of computer science the opportunity to undertake real healthcare computing projects as part of their education. This is enabling the creation of a significant research computing platform within the Trust, based on open source components and hosted in the cloud, while providing a large group of students with experience of the specific challenges of health IT.

  6. Technological Evolution on Computed Tomography and Radioprotection

    Energy Technology Data Exchange (ETDEWEB)

    Leite, Bruno Barros; Ribeiro, Nuno Carrilho [Servico de Radiologia, Hospital de Curry Cabral, Rua da Beneficencia, 8, 1069-166 Lisboa (Portugal)

    2006-05-15

    Computed Tomography (CT) has been available since the 70s and has experienced a dramatic technical evolution. Multi-detector technology is our current standard, offering capabilities unthinkable only a decade ago. Yet, we must not forget the ionizing nature of CT's scanning energy (X-rays). It represents the most important cause of medically associated radiation exposure to the general public, with a trend to increase. It is compulsory to intervene with the objective of dose reduction, following ALARA policies. Currently there are some technical advances that allow dose reduction without sacrificing diagnostic image capabilities. However, human intervention is also essential. We must keep investing in education so that CT exams are done when they are really useful in clinical decision-making. Alternative techniques should also be considered. Image quality must not be pursued while disregarding the biological effects of radiation. Generally, it is possible to obtain clinically acceptable images with lower dose protocols. (author)

  7. Technological Evolution on Computed Tomography and Radioprotection

    International Nuclear Information System (INIS)

    Leite, Bruno Barros; Ribeiro, Nuno Carrilho

    2006-01-01

    Computed Tomography (CT) has been available since the 70s and has experienced a dramatic technical evolution. Multi-detector technology is our current standard, offering capabilities unthinkable only a decade ago. Yet, we must not forget the ionizing nature of CT's scanning energy (X-rays). It represents the most important cause of medically associated radiation exposure to the general public, with a trend to increase. It is compulsory to intervene with the objective of dose reduction, following ALARA policies. Currently there are some technical advances that allow dose reduction without sacrificing diagnostic image capabilities. However, human intervention is also essential. We must keep investing in education so that CT exams are done when they are really useful in clinical decision-making. Alternative techniques should also be considered. Image quality must not be pursued while disregarding the biological effects of radiation. Generally, it is possible to obtain clinically acceptable images with lower dose protocols. (author)

  8. VIP visit of LHC Computing Grid Project

    CERN Multimedia

    Krajewski, Yann Tadeusz

    2015-01-01

    VIP visit of LHC Computing Grid Project with Dr.-Ing. Tarek Kamel [Senior Advisor to the President for Government Engagement, ICANN Geneva Office] and Dr Nigel Hickson [VP, IGO Engagement, ICANN Geneva Office]

  9. [Earth Science Technology Office's Computational Technologies Project

    Science.gov (United States)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  10. Norwegian computers in European energy research project

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    16 NORD computers have been ordered for the JET data acquisition and storage system. The computers will be arranged in a 'double star' configuration, developed by CERN. Two control consoles each have their own computer. All computers for communication, control, diagnostics, consoles and testing are NORD-100s, while the computer for data storage and analysis is a NORD-500. The operating system is SINTRAN; a CAMAC serial highway with fibre optics is to be used for long communications paths. The programming languages FORTRAN, NODAL, NORD PL, PASCAL and BASIC may be used. The JET project and TOKAMAK type machines are briefly described. (JIW)

  11. Glaciation and geosphere evolution - Greenland Analogue Project

    International Nuclear Information System (INIS)

    Hirschorn, S.; Vorauer, A.; Belfadhel, M.B.; Jensen, M.

    2011-01-01

    permafrost occurrence, amongst other attributes; Evolution of deep groundwater systems and impacts of Coupled Thermo-Hydro-Mechanical effects imposed by glacial cycles; Impacts of climate change on redox stability using both numerical simulations and paleohydrogeological investigations; and Potential for seismicity and faulting induced by glacial rebound. This paper presents an overview of studies underway as part of the Greenland Analogue Project (GAP) to evaluate the impact of an ice sheet on groundwater chemistry at repository depth using the Greenland Ice Sheet as an analogue to future glaciations in North America. The study of the Greenland Ice Sheet will allow us to increase our understanding of hydrological, hydrogeological and geochemical processes during glacial conditions. (author)

  12. Evolution of Cloud Computing and Enabling Technologies

    OpenAIRE

    Rabi Prasad Padhy; Manas Ranjan Patra

    2012-01-01

    We present an overview of the history of forecasting software over the past 25 years, concentrating especially on the interaction between computing and technologies, from mainframe computing to cloud computing; cloud computing is the latest one. To deliver the vision of these various computing models, this paper briefly explains the architecture, characteristics, advantages, applications and issues of various computing models like PC computing, internet computing, etc., and related technologie...

  13. Cloud computing and Reservoir project

    International Nuclear Information System (INIS)

    Beco, S.; Maraschini, A.; Pacini, F.; Biran, O.

    2009-01-01

    The support for complex services delivery is becoming a key point in current internet technology. Current trends in internet applications are characterized by on demand delivery of ever growing amounts of content. The future internet of services will have to deliver content intensive applications to users with quality of service and security guarantees. This paper describes the Reservoir project and the challenge of a reliable and effective delivery of services as utilities in a commercial scenario. It starts by analyzing the needs of a future infrastructure provider and introducing the key concept of a service oriented architecture that combines virtualisation-aware grid with grid-aware virtualisation, while being driven by business service management. This article will then focus on the benefits and the innovations derived from the Reservoir approach. Eventually, a high level view of Reservoir general architecture is illustrated.

  14. Enzyme (re)design: lessons from natural evolution and computation.

    Science.gov (United States)

    Gerlt, John A; Babbitt, Patricia C

    2009-02-01

    The (re)design of enzymes to catalyze 'new' reactions is a topic of considerable practical and intellectual interest. Directed evolution (random mutagenesis followed by screening/selection) has been used widely to identify novel biocatalysts. However, 'rational' approaches using either natural divergent evolution or computational predictions based on chemical principles have been less successful. This review summarizes recent progress in evolution-based and computation-based (re)design.

  15. [''R"--project for statistical computing

    DEFF Research Database (Denmark)

    Dessau, R.B.; Pipper, Christian Bressen

    2008-01-01

    An introduction to the R project for statistical computing (www.R-project.org) is presented. The main topics are: 1. To make the professional community aware of "R" as a potent and free software for graphical and statistical analysis of medical data; 2. Simple well-known statistical tests are fairly easy to perform in R, but more complex modelling requires programming skills; 3. R is seen as a tool for teaching statistics and implementing complex modelling of medical data among medical professionals. Publication date: 2008/1/28

  16. The evolution of the project management

    Directory of Open Access Journals (Sweden)

    Catalin Drob

    2009-12-01

    Full Text Available Project management appeared and developed on the basis of scientific management theory during the 1950s and 1960s. Since the 1990s, we can say that project management has truly become an independent discipline, which has a huge impact on the success or failure of companies that are engaged in major projects.

  17. Computer models of vocal tract evolution: an overview and critique

    NARCIS (Netherlands)

    de Boer, B.; Fitch, W. T.

    2010-01-01

    Human speech has been investigated with computer models since the invention of digital computers, and models of the evolution of speech first appeared in the late 1960s and early 1970s. Speech science and computer models have a long shared history because speech is a physical signal and can be

  18. Computer modelling as a tool for understanding language evolution

    NARCIS (Netherlands)

    de Boer, Bart; Gontier, N; VanBendegem, JP; Aerts, D

    2006-01-01

    This paper describes the uses of computer models in studying the evolution of language. Language is a complex dynamic system that can be studied at the level of the individual and at the level of the population. Much of the dynamics of language evolution and language change occur because of the

  19. Management evolution in the LSST project

    Science.gov (United States)

    Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010; the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industry. The public private collaboration aims to complete the estimated $450 M observatory in the 2017 timeframe. During the design phase of the project from 2003 to the present the management structure has been remarkably stable. At the same time, the funding levels, staffing levels and scientific community participation have grown dramatically. The LSSTC has introduced project controls and tools required to manage the LSST's complex funding model, technical structure and distributed work force. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools for risk management, configuration control and resource-loaded schedule have been effective and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each sub-system has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager who have overall authority.

  20. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    Science.gov (United States)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; automatic fault diagnosis of ECLSS subsystems descriptions; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems description.

  1. ATLAS Distributed Computing: Experience and Evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2013-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb-1 of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centers around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics program including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2014 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  2. ATLAS distributed computing: experience and evolution

    CERN Document Server

    Nairz, A; The ATLAS collaboration

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25/fb of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, e...

  3. Stable numerical method in computation of stellar evolution

    International Nuclear Information System (INIS)

    Sugimoto, Daiichiro; Eriguchi, Yoshiharu; Nomoto, Ken-ichi.

    1982-01-01

    To compute the stellar structure and evolution in different stages, such as (1) red-giant stars in which the density and density gradient change over quite wide ranges, (2) rapid evolution with neutrino loss or unstable nuclear flashes, (3) hydrodynamical stages of star formation or supernova explosion, (4) transition phases from quasi-static to dynamical evolutions, (5) mass-accreting or losing stars in binary-star systems, and (6) evolution of a stellar core whose mass is increasing by shell burning or decreasing by penetration of the convective envelope into the core, we face "multi-timescale problems" which can be treated by neither a simple-minded explicit scheme nor an implicit one. This problem has been resolved by three prescriptions: one by introducing a hybrid scheme suitable for the multi-timescale problems of quasi-static evolution with heat transport, another by introducing a hybrid scheme suitable for the multi-timescale problems of hydrodynamic evolution, and the third by introducing the Eulerian or, in other words, the mass-fraction coordinate for evolution with changing mass. When all of them are combined in a single computer code, we can compute numerically stably any phase of stellar evolution, including transition phases, as long as the star is spherically symmetric. (author)
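
    The "multi-timescale" difficulty described above, phases where a purely explicit scheme is unstable while a fully implicit one is wasteful for fast transients, is what hybrid explicit/implicit integrators address. The sketch below shows the blending idea on a single linear test equation only; the theta-blending heuristic and parameters are assumptions for illustration, not the authors' stellar-evolution scheme.

```python
# Theta-blended (hybrid explicit/implicit) step for the test equation y' = -k*y.
# theta = 0 is forward Euler (explicit), theta = 1 is backward Euler (implicit);
# blending lets one integrator cover both the fast (stiff) and slow phases.

def theta_step(y, k, dt, theta):
    """One theta-method step: (1 + theta*k*dt) * y_new = (1 - (1-theta)*k*dt) * y."""
    return y * (1.0 - (1.0 - theta) * k * dt) / (1.0 + theta * k * dt)

def evolve(y0, k, dt, n_steps):
    """Choose theta per step from the stiffness measure k*dt (a simple heuristic)."""
    y = y0
    for _ in range(n_steps):
        theta = min(1.0, 0.5 + 0.5 * k * dt)  # lean implicit when the step is stiff
        y = theta_step(y, k, dt, theta)
    return y

print(evolve(1.0, k=50.0, dt=0.1, n_steps=20))  # decays stably even though k*dt = 5
```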

  4. Computer architecture evaluation for structural dynamics computations: Project summary

    Science.gov (United States)

    Standley, Hilda M.

    1989-01-01

    The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.

  5. ATLAS distributed computing: experience and evolution

    International Nuclear Information System (INIS)

    Nairz, A

    2014-01-01

    The ATLAS experiment has just concluded its first running period which commenced in 2010. After two years of remarkable performance from the LHC and ATLAS, the experiment has accumulated more than 25 fb⁻¹ of data. The total volume of beam and simulated data products exceeds 100 PB distributed across more than 150 computing centres around the world, managed by the experiment's distributed data management system. These sites have provided up to 150,000 computing cores to ATLAS's global production and analysis processing system, enabling a rich physics programme including the discovery of the Higgs-like boson in 2012. The wealth of accumulated experience in global data-intensive computing at this massive scale, and the considerably more challenging requirements of LHC computing from 2015 when the LHC resumes operation, are driving a comprehensive design and development cycle to prepare a revised computing model together with data processing and management systems able to meet the demands of higher trigger rates, energies and event complexities. An essential requirement will be the efficient utilisation of current and future processor technologies as well as a broad range of computing platforms, including supercomputing and cloud resources. We will report on experience gained thus far and our progress in preparing ATLAS computing for the future

  6. From evolutionary computation to the evolution of things

    NARCIS (Netherlands)

    Eiben, A.E.; Smith, J.E.

    2015-01-01

    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as

  7. 2nd Generation QUATARA Flight Computer Project

    Science.gov (United States)

    Falker, Jay; Keys, Andrew; Fraticelli, Jose Molina; Capo-Iugo, Pedro; Peeples, Steven

    2015-01-01

    Single core flight computer boards have been designed, developed, and tested (DD&T) to be flown in small satellites for the last few years. In this project, a prototype flight computer will be designed as a distributed multi-core system containing four microprocessors running code in parallel. This flight computer will be capable of performing multiple computationally intensive tasks such as processing digital and/or analog data, controlling actuator systems, managing cameras, operating robotic manipulators and transmitting/receiving from/to a ground station. In addition, this flight computer will be designed to be fault tolerant by creating both a robust physical hardware connection and by using a software voting scheme to determine the processor's performance. This voting scheme will leverage the work done for the Space Launch System (SLS) flight software. The prototype flight computer will be constructed with Commercial Off-The-Shelf (COTS) components which are estimated to survive for two years in a low-Earth orbit.
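
    The software voting scheme mentioned above, in which outputs from the redundant cores are compared to detect a misbehaving processor, can be illustrated with a plain majority vote. The sketch below is an assumed, simplified illustration and is not the QUATARA or SLS flight software.

```python
# Minimal majority-vote sketch for redundant processor outputs.
# Each core returns a result; the vote accepts a value only if a strict
# majority of cores agree, otherwise the step is flagged for retry/failover.
from collections import Counter
from typing import Optional, Sequence

def majority_vote(results: Sequence[int]) -> Optional[int]:
    """Return the value reported by a strict majority of cores, or None."""
    value, count = Counter(results).most_common(1)[0]
    return value if count > len(results) // 2 else None

# Example: core 3 returns a corrupted value; the other three cores agree.
print(majority_vote([42, 42, 7, 42]))  # -> 42
print(majority_vote([1, 2, 3, 4]))     # -> None (no majority, escalate)
```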

  8. Grid computing the European Data Grid Project

    CERN Document Server

    Segal, B; Gagliardi, F; Carminati, F

    2000-01-01

    The goal of this project is the development of a novel environment to support globally distributed scientific exploration involving multi- PetaByte datasets. The project will devise and develop middleware solutions and testbeds capable of scaling to handle many PetaBytes of distributed data, tens of thousands of resources (processors, disks, etc.), and thousands of simultaneous users. The scale of the problem and the distribution of the resources and user community preclude straightforward replication of the data at different sites, while the aim of providing a general purpose application environment precludes distributing the data using static policies. We will construct this environment by combining and extending newly emerging "Grid" technologies to manage large distributed datasets in addition to computational elements. A consequence of this project will be the emergence of fundamental new modes of scientific exploration, as access to fundamental scientific data is no longer constrained to the producer of...

  9. The Evolution of Computing: Slowing down? Not Yet!

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Dr Sutherland will review the evolution of computing over the past decade, focusing particularly on the development of the database and middleware from client server to Internet computing. But what are the next steps from the perspective of a software company? Dr Sutherland will discuss the development of Grid as well as the future applications revolving around collaborative working, which are appearing as the next wave of computing applications.

  10. Computing as Empirical Science – Evolution of a Concept

    Directory of Open Access Journals (Sweden)

    Polak Paweł

    2016-12-01

    Full Text Available This article presents the evolution of philosophical and methodological considerations concerning empiricism in computer/computing science. In this study, we trace the most important current events in the history of reflection on computing. The forerunners of Artificial Intelligence H.A. Simon and A. Newell in their paper Computer Science As Empirical Inquiry (1975) started these considerations. Later the concept of empirical computer science was developed by S.S. Shapiro, P. Wegner, A.H. Eden and P.J. Denning. They showed various empirical aspects of computing. This led to a view of the science of computing (or science of information processing) - the science of general scope. Some interesting contemporary ways towards a generalized perspective on computations were also shown (e.g. natural computing).

  11. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

    SART is an important algorithm for image reconstruction, in which the projection computation takes over half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with memory optimization, is presented in this paper. Different from the normal method, projection lines are located based on each pixel, and the subsequent projection coefficient computation can make use of the results. The correlation of projection lines and pixels can be used to optimize the computation. (authors)
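
    The pixel-based idea summarized above, locating the projection lines for every pixel and reusing that pixel-to-detector mapping when the projection coefficients are needed, can be sketched for a simplified parallel-beam geometry with nearest-bin interpolation. The geometry, interpolation and function names below are assumptions for illustration, not the authors' actual coefficient computation.

```python
import numpy as np

def pixel_driven_projection(image, angles, n_bins):
    """Parallel-beam forward projection computed pixel by pixel.

    For each pixel, its centre is projected onto every detector axis and its
    contribution is accumulated in the nearest detector bin; this pixel-to-bin
    mapping is what a pixel-based SART would store as projection coefficients.
    """
    ny, nx = image.shape
    sino = np.zeros((len(angles), n_bins))
    # Pixel-centre coordinates relative to the image centre.
    ys, xs = np.mgrid[0:ny, 0:nx]
    xs = xs - (nx - 1) / 2.0
    ys = ys - (ny - 1) / 2.0
    for a, theta in enumerate(angles):
        # Signed distance of each pixel centre along the detector axis.
        t = xs * np.cos(theta) + ys * np.sin(theta)
        bins = np.clip(np.round(t + n_bins / 2.0).astype(int), 0, n_bins - 1)
        np.add.at(sino[a], bins.ravel(), image.ravel())
    return sino

# Tiny example: a 32x32 image with a bright square, 16 view angles, 47 bins.
img = np.zeros((32, 32)); img[12:20, 12:20] = 1.0
print(pixel_driven_projection(img, np.linspace(0, np.pi, 16, endpoint=False), 47).shape)
```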

  12. TMX-U computer system in evolution

    International Nuclear Information System (INIS)

    Casper, T.A.; Bell, H.; Brown, M.; Gorvad, M.; Jenkins, S.; Meyer, W.; Moller, J.; Perkins, D.

    1986-01-01

    Over the past three years, the total TMX-U diagnostic data base has grown to exceed 10 megabytes from over 1300 channels; roughly triple the originally designed size. This acquisition and processing load has resulted in an experiment repetition rate exceeding 10 minutes per shot using the five original Hewlett-Packard HP-1000 computers with their shared disks. Our new diagnostics tend to be multichannel instruments, which, in our environment, can be more easily managed using local computers. For this purpose, we are using HP series 9000 computers for instrument control, data acquisition, and analysis. Fourteen such systems are operational, with processed-format output exchanged via a shared resource manager. We are presently implementing the necessary hardware and software changes to create a local area network allowing us to combine the data from these systems with our main data archive. The expansion of our diagnostic system using the parallel acquisition and processing concept allows us to increase our data base with a minimum of impact on the experimental repetition rate

  13. EVOLUT - a computer program for fast burnup evaluation

    International Nuclear Information System (INIS)

    Craciunescu, T.; Dobrin, R.; Stamatescu, L.; Alexa, A.

    1999-01-01

    EVOLUT is a computer program for burnup evaluation. The input data consist, on the one hand, of axial and radial gamma-scanning profiles (for the experimental evaluation of the number of nuclei of a fission product - the burnup monitor - at the end of irradiation) and, on the other hand, of the history of irradiation (the time length and values proportional to the neutron flux for each step of irradiation). Using the equation of evolution of the burnup monitor, the flux values are iteratively adjusted, by a multiplier factor, until the calculated number of nuclei is equal to the experimental one. The flux values are then used in the equation of evolution of the fissile and fertile nuclei to determine the number of fissions and consequently the burnup. EVOLUT was successfully used in the analysis of several hundred CANDU and TRIGA-type fuel rods. We consider EVOLUT a useful tool for burnup evaluation based on gamma spectrometry measurements. EVOLUT can be run on an ordinary AT computer, and in this case the results are obtained in a few minutes. It has an original and user-friendly graphical interface, and it also provides output as MATLAB script files for graphical representation and further numerical analysis. The computer program needs simple input data and is valuable especially when a large number of burnup analyses are required quickly. (authors)
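
    The iteration described above, scaling the flux history by a single multiplier until the computed end-of-irradiation inventory of the burnup monitor matches the value measured by gamma scanning, can be sketched with a one-group production/decay model and bisection. The model, constants and numbers below are illustrative assumptions, not the EVOLUT implementation.

```python
# Sketch of the flux-multiplier iteration: a one-group model of a burnup monitor
# (produced at a rate proportional to flux, lost by decay) is integrated over the
# irradiation history, and the multiplier on the flux is adjusted by bisection
# until the computed end-of-irradiation number of nuclei matches the measurement.
import math

def monitor_nuclei(flux_history, multiplier, yield_const, decay_const):
    n = 0.0
    for duration, relative_flux in flux_history:
        production = yield_const * multiplier * relative_flux
        decay = math.exp(-decay_const * duration)
        # Exact solution of dn/dt = production - decay_const * n over the step.
        n = n * decay + (production / decay_const) * (1.0 - decay)
    return n

def fit_multiplier(flux_history, n_measured, yield_const, decay_const,
                   lo=0.0, hi=1e3, tol=1e-9):
    for _ in range(200):  # bisection on the flux multiplier
        mid = 0.5 * (lo + hi)
        if monitor_nuclei(flux_history, mid, yield_const, decay_const) < n_measured:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

history = [(30.0, 1.0), (10.0, 0.0), (45.0, 0.8)]   # (days, relative flux)
print(fit_multiplier(history, n_measured=2.5e15, yield_const=1.0e14, decay_const=0.02))
```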

  14. The Evolution of the Meningitis Vaccine Project.

    Science.gov (United States)

    Tiffay, Kathleen; Jodar, Luis; Kieny, Marie-Paule; Socquet, Muriel; LaForce, F Marc

    2015-11-15

    In 2001, the Meningitis Vaccine Project (MVP) was tasked to develop, test, license, and introduce a group A meningococcal (MenA) conjugate vaccine for sub-Saharan Africa. African public health officials emphasized that a vaccine price of less than US$0.50 per dose was necessary to ensure introduction and sustained use of this new vaccine. Initially, MVP envisioned partnering with a multinational vaccine manufacturer, but the target price and opportunity costs were problematic and formal negotiations ended in 2002. MVP chose to become a "virtual vaccine company," and over the next decade managed a network of public-private and public-public partnerships for pharmaceutical development, clinical development, and regulatory submission. MVP supported the transfer of key know-how for the production of group A polysaccharide and a new conjugation method to the Serum Institute of India, Ltd, based in Pune, India. A robust staff structure supported by technical consultants and overseen by advisory groups in Europe and Africa ensured that the MenA conjugate vaccine would meet all international standards. A robust project structure including a team of technical consultants and 3 advisory groups in Europe and Africa ensured that the MenA conjugate vaccine (PsA-TT, MenAfriVac) was licensed by the Drug Controller General of India and prequalified by the World Health Organization in June 2010. The vaccine was introduced in Burkina Faso, Mali, and Niger in December 2010. The development, through a public-private partnership, of a safe, effective, and affordable vaccine for sub-Saharan Africa, PsA-TT, offers a new paradigm for the development of vaccines specifically targeting populations in resource-poor countries. © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society of America.

  15. Computer simulation of dust grain evolution

    Science.gov (United States)

    Liffman, K.

    1989-01-01

    The latest results are reported from a Monte Carlo code that is being developed at NASA Ames. The goal of this program is to derive from the observed and presumed properties of the interstellar medium (ISM) the following information: (1) the size spectrum of interstellar dust; (2) the chemical structure of interstellar dust; (3) interstellar abundances; and (4) the lifetime of a dust grain in the ISM. Presently this study is restricted to refractory interstellar material, i.e., the formation and destruction of ices are not included in the program. The program is embedded in an analytic solution for the bulk chemical evolution of a two-phase interstellar medium in which stars are born in molecular clouds, but new nucleosynthesis products and stellar return are entered into a complementary intercloud medium. The well-mixed matter of each interstellar phase is repeatedly cycled stochastically through the complementary phase and back. Refractory dust is created by thermal condensation as stellar matter flows away from sites of nucleosynthesis such as novae and supernovae and/or from the matter returned from evolved intermediate stars. The history of each particle is traced by standard Monte Carlo techniques as it is sputtered and fragmented by supernova shock waves in the intercloud medium. It also accretes an amorphous mantle of gaseous refractory atoms when its local medium joins with the molecular cloud medium. Finally it encounters the possibility of astration (destruction by star formation) within the molecular clouds.
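
    The stochastic cycling described above can be caricatured by a very small Monte Carlo loop in which a grain alternates between intercloud and cloud phases and is eventually lost either to shock destruction or to astration. The per-cycle probabilities below are arbitrary illustrative numbers, not the parameters of the NASA Ames code.

```python
import random

def grain_lifetime_cycles(p_shock_destroy=0.06, p_astration=0.03, seed=None):
    """Number of cloud/intercloud cycles a grain survives (toy model)."""
    rng = random.Random(seed)
    cycles = 0
    while True:
        cycles += 1
        if rng.random() < p_shock_destroy:   # sputtered/fragmented in the intercloud phase
            return cycles
        if rng.random() < p_astration:       # consumed by star formation in a cloud
            return cycles
        # otherwise the grain accretes a mantle and is cycled again

mean = sum(grain_lifetime_cycles() for _ in range(10_000)) / 10_000
print(f"mean survival: {mean:.1f} cycles")
```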

  16. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  17. Curve Evolution in Subspaces and Exploring the Metameric Class of Histogram of Gradient Orientation based Features using Nonlinear Projection Methods

    DEFF Research Database (Denmark)

    Tatu, Aditya Jayant

    This thesis deals with two unrelated issues, restricting curve evolution to subspaces and computing image patches in the equivalence class of Histogram of Gradient orientation based features using nonlinear projection methods. Curve evolution is a well known method used in various applications like tracking interfaces, active contour based segmentation methods and others. It can also be used to study shape spaces, as deforming a shape can be thought of as evolving its boundary curve. During curve evolution a curve traces out a path in the infinite dimensional space of curves. Due to application-specific requirements like shape priors or a given data model, and due to limitations of the computer, the computed curve evolution forms a path in some finite dimensional subspace of the space of curves. We give methods to restrict the curve evolution to a finite dimensional linear or implicitly defined

  18. Applying natural evolution for solving computational problems - Lecture 1

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists for solving computational problems. In a similar way to how humans and animals have evolved along millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
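
    The generational loop that the lecture abstract describes, evaluate a population, select parents, recombine and mutate, repeat until a good solution appears, fits in a few lines of code. The OneMax toy problem and the parameter values below are assumptions chosen for illustration and are unrelated to the ECJ framework mentioned in the abstract.

```python
import random

def evolve_onemax(n_bits=40, pop_size=30, generations=60, p_mut=0.02, seed=1):
    """Tiny generational GA maximizing the number of 1-bits (OneMax)."""
    rng = random.Random(seed)
    fitness = sum  # fitness of a bit string = number of ones
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            return max(rng.sample(pop, 3), key=fitness)           # selection
        nxt = []
        while len(nxt) < pop_size:
            a, b = tournament(), tournament()
            cut = rng.randrange(1, n_bits)                        # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve_onemax()
print(sum(best), "of", len(best), "bits set")
```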

  19. Applying natural evolution for solving computational problems - Lecture 2

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s theory of natural evolution has inspired computer scientists to solve computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.
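
    The second lecture's topic, genetic programming, represents each candidate solution as an expression tree rather than a fixed-length genome. The sketch below grows, evaluates and mutates such trees for a toy symbolic-regression target; the depth cap is a crude stand-in for bloat control, and nothing here reflects ECJ's actual interfaces.

```python
import random, operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
TERMINALS = ["x", 1.0, 2.0]

def random_tree(depth=3):
    """Grow a random expression tree; the depth cap is a crude form of bloat control."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Recursively evaluate an expression tree at the point x."""
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mutate(tree, depth=2):
    """Point mutation: replace a random subtree with a freshly grown one."""
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, depth), right)
    return (op, left, mutate(right, depth))

def error(tree):
    """Squared error against a toy target f(x) = x*x + 1 on a few samples."""
    xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
    return sum((evaluate(tree, x) - (x * x + 1.0)) ** 2 for x in xs)

pop = [random_tree() for _ in range(200)]
for _ in range(30):
    pop = sorted(pop, key=error)[:50]                       # truncation selection
    pop += [mutate(random.choice(pop)) for _ in range(150)]  # refill with mutants
print("best error:", error(min(pop, key=error)))
```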

  20. Competitiveness in organizational integrated computer system project management

    Directory of Open Access Journals (Sweden)

    Zenovic GHERASIM

    2010-06-01

    Full Text Available The management of organizational integrated computer system projects aims at achieving competitiveness through the unitary, connected and personalised treatment of the requirements of this type of project, along with the adequate application of all the basic principles of management, administration and project planning, as well as of the basic concepts of organisational information management development. The paper presents some aspects of the competitiveness of organizational computer system project management, with specific reference to the projects of some Romanian companies.

  1. AGIS: Evolution of Distributed Computing information system for ATLAS

    Science.gov (United States)

    Anisenkov, A.; Di Girolamo, A.; Alandes, M.; Karavakis, E.

    2015-12-01

    ATLAS, a particle physics experiment at the Large Hadron Collider at CERN, produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm and a high degree of decentralization of computing resources in order to meet the ATLAS requirements of petabyte-scale data operations. It has evolved after the first period of LHC data taking (Run-1) in order to cope with the new challenges of the upcoming Run-2. In this paper we describe the evolution and recent developments of the ATLAS Grid Information System (AGIS), developed in order to integrate configuration and status information about resources, services and topology of the computing infrastructure used by the ATLAS Distributed Computing applications and services.

  2. The three-dimensional matrix -- An evolution in project management

    Energy Technology Data Exchange (ETDEWEB)

    Glidewell, D.

    1996-09-01

    In the Functional Department Dimension, functional departments such as project management, design, and construction would be maintained to maximize consistency among project teams, evenly allocate training opportunities, and facilitate the crossfeeding of lessons learned and innovative ideas. Functional departments were also determined to be the surest way of complying uniformly with all project control systems required by the Department of Energy (Sandia's primary external customer). The Technical Discipline dimension was maintained to enhance communication within the technical disciplines, such as electrical engineering, mechanical engineering, civil engineering, etc., and to evenly allocate technical training opportunities, reduce technical obsolescence, and enhance design standards. The third dimension, the Project Dimension, represents the next step in the project management evolution at Sandia, and together with Functional Department and Technical Discipline Dimensions constitutes the three-dimensional matrix. It is this Project Dimension that will be explored thoroughly in this paper, including a discussion of the specific roles and responsibilities of both management and the project team.

  3. Quantum ballistic evolution in quantum mechanics: Application to quantum computers

    International Nuclear Information System (INIS)

    Benioff, P.

    1996-01-01

    Quantum computers are important examples of processes whose evolution can be described in terms of iterations of single-step operators or their adjoints. Based on this, Hamiltonian evolution of processes with associated step operators T is investigated here. The main limitation of this paper is to processes which evolve quantum ballistically, i.e., motion restricted to a collection of nonintersecting or distinct paths on an arbitrary basis. The main goal of this paper is proof of a theorem which gives necessary and sufficient conditions that T must satisfy so that there exists a Hamiltonian description of quantum ballistic evolution for the process, namely, that T is a partial isometry and is orthogonality preserving and stable on some basis. Simple examples of quantum ballistic evolution for quantum Turing machines with one and with more than one type of elementary step are discussed. It is seen that for nondeterministic machines the basis set can be quite complex with much entanglement present. It is also proven that, given a step operator T for an arbitrary deterministic quantum Turing machine, it is decidable if T is stable and orthogonality preserving, and if quantum ballistic evolution is possible. The proof fails if T is a step operator for a nondeterministic machine. It is an open question if such a decision procedure exists for nondeterministic machines. This problem does not occur in classical mechanics. Also the definition of quantum Turing machines used here is compared with that used by other authors. copyright 1996 The American Physical Society
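
    The theorem's conditions on the step operator T can be checked numerically in small finite-dimensional examples. The sketch below tests only the partial-isometry and orthogonality-preservation conditions on a toy one-sided shift operator; it is not Benioff's construction, just a way to see what the conditions mean concretely.

```python
import numpy as np

def is_partial_isometry(T, tol=1e-10):
    """T is a partial isometry iff T T^dagger T = T (i.e. T^dagger T is a projection)."""
    return np.allclose(T @ T.conj().T @ T, T, atol=tol)

def preserves_orthogonality(T, basis, tol=1e-10):
    """Check that images of distinct basis vectors remain orthogonal (or vanish)."""
    images = [T @ b for b in basis]
    for i in range(len(images)):
        for j in range(i + 1, len(images)):
            if abs(np.vdot(images[i], images[j])) > tol:
                return False
    return True

# Toy step operator: a one-sided shift on a 5-dimensional truncated "tape" basis.
d = 5
T = np.zeros((d, d), dtype=complex)
for n in range(d - 1):
    T[n + 1, n] = 1.0          # |n> -> |n+1>; the last basis state is annihilated

basis = list(np.eye(d, dtype=complex))
print("partial isometry:", is_partial_isometry(T))
print("orthogonality preserving:", preserves_orthogonality(T, basis))
```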

  4. Topographic evolution of sandbars: Flume experiment and computational modeling

    Science.gov (United States)

    Kinzel, Paul J.; Nelson, Jonathan M.; McDonald, Richard R.; Logan, Brandy L.

    2010-01-01

    Measurements of sandbar formation and evolution were carried out in a laboratory flume and the topographic characteristics of these barforms were compared to predictions from a computational flow and sediment transport model with bed evolution. The flume experiment produced sandbars with approximate mode 2, whereas numerical simulations produced a bed morphology better approximated as alternate bars, mode 1. In addition, bar formation occurred more rapidly in the laboratory channel than for the model channel. This paper focuses on a steady-flow laboratory experiment without upstream sediment supply. Future experiments will examine the effects of unsteady flow and sediment supply and the use of numerical models to simulate the response of barform topography to these influences.

  5. Evolution of the GATE project: new results and developments

    Energy Technology Data Exchange (ETDEWEB)

    Santin, G. [ESA-ESTEC, Keplerlaan 1, 2200 AG Noordwijk (Netherlands); Staelens, S. [ELIS Department, Ghent University, B-9000 Ghent (Belgium); Taschereau, R. [CRUMP Institute for Molecular Imaging, University of California Los Angeles, 700 Westwood Plaza A438, Los Angeles, CA 90095-1770 (United States); Descourt, P. [U650 INSERM, LaTIM, Brest (France); Schmidtlein, C.R. [Memorial Sloan-Kettering Cancer Center, New York, New York, US (United States); Simon, L. [Department of Radiation Oncology, Institut Curie, Paris (France); Visvikis, D. [U650 INSERM, LaTIM, Brest (France); Jan, S. [Service Hospitalier Frederic Joliot (SHFJ), CEA-Orsay, Orsay (France); Buvat, I. [U678 INSERM, CHU Pitie-Salpetriere, Paris (France)

    2007-10-15

    We present the status of the Geant4 Application for Emission Tomography (GATE) project, a Monte Carlo simulator for Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). Its main features are recalled, including the modelling of time-dependent phenomena and a versatile, user-friendly scripting interface. The focus of this manuscript will be on new developments introduced in the past 4 years. New results have been achieved in the fields of validation on real medical and research PET and SPECT systems, voxel geometries, digitisation, distributed computing and dosimetry.

  6. Evolution of the GATE project: new results and developments

    International Nuclear Information System (INIS)

    Santin, G.; Staelens, S.; Taschereau, R.; Descourt, P.; Schmidtlein, C.R.; Simon, L.; Visvikis, D.; Jan, S.; Buvat, I.

    2007-01-01

    We present the status of the Geant4 Application for Emission Tomography (GATE) project, a Monte Carlo simulator for Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). Its main features are recalled, including the modelling of time-dependent phenomena and a versatile, user-friendly scripting interface. The focus of this manuscript will be on new developments introduced in the past 4 years. New results have been achieved in the fields of validation on real medical and research PET and SPECT systems, voxel geometries, digitisation, distributed computing and dosimetry.

  7. Computer simulation of the topography evolution on ion bombarded surfaces

    CERN Document Server

    Zier, M

    2003-01-01

    The development of roughness on ion bombarded surfaces (facets, ripples) on single crystalline and amorphous homogeneous solids plays an important role, for example, in depth profiling techniques. To verify a faceting mechanism based not only on sputtering by directly impinging ions but also on the contribution of reflected ions and the redeposition of sputtered material, a computer simulation has been carried out. The surface in this model is treated as a two-dimensional line segment profile. The model describes the topography evolution on ion bombarded surfaces, including the growth mechanism of a facetted surface, using only the interplay of reflected and primary ions and redeposited atoms.
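
    A much-reduced version of this kind of surface-evolution model is easy to sketch: a 1-D height profile eroded at a rate that depends on the local incidence angle. The angle-dependent yield curve, the weak smoothing term and all numbers below are invented for illustration; reflected ions and redeposition, which are central to the model above, are deliberately left out to keep the sketch short.

```python
import numpy as np

# 1-D height profile h(x) under a normal-incidence ion flux.
nx = 400
x = np.linspace(0.0, 1.0, nx)
h = 0.02 * np.sin(8 * np.pi * x) + 0.005 * np.random.randn(nx)
dx = x[1] - x[0]

def yield_vs_angle(theta):
    """Made-up angle-dependent sputter yield, peaking at oblique incidence."""
    return 1.0 + 2.0 * np.sin(theta) ** 2 * np.cos(theta)

dt, flux = 1e-4, 1.0
for _ in range(2000):
    slope = np.gradient(h, dx)
    theta = np.arctan(np.abs(slope))        # local incidence angle w.r.t. surface normal
    # Erosion along the surface normal, projected onto the vertical direction.
    erosion = flux * yield_vs_angle(theta) * np.cos(theta)
    h -= dt * erosion
    # Very weak smoothing, standing in for surface diffusion (keeps the scheme stable).
    h += 1e-6 * np.gradient(np.gradient(h, dx), dx)

print("final rms roughness:", float(np.std(h)))
```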

  8. ATLAS Cloud Computing R&D project

    CERN Document Server

    Panitkin, S; The ATLAS collaboration; Caballero Bejar, J; Benjamin, D; DiGirolamo, A; Gable, I; Hendrix, V; Hover, J; Kucharczuk, K; Medrano LLamas, R; Ohman, H; Paterson, M; Sobie, R; Taylor, R; Walker, R; Zaytsev, A

    2013-01-01

    The computing model of the ATLAS experiment was designed around the concept of grid computing and, since the start of data taking, this model has proven very successful. However, new cloud computing technologies bring attractive features to improve the operations and elasticity of scientific distributed computing. ATLAS sees grid and cloud computing as complementary technologies that will coexist at different levels of resource abstraction, and two years ago created an R&D working group to investigate the different integration scenarios. The ATLAS Cloud Computing R&D has been able to demonstrate the feasibility of offloading work from grid to cloud sites and, as of today, is able to integrate transparently various cloud resources into the PanDA workload management system. The ATLAS Cloud Computing R&D is operating various PanDA queues on private and public resources and has provided several hundred thousand CPU days to the experiment. As a result, the ATLAS Cloud Computing R&D group has gained...

  9. Computational Nuclear Quantum Many-Body Problem: The UNEDF Project

    OpenAIRE

    Bogner, Scott; Bulgac, Aurel; Carlson, Joseph A.; Engel, Jonathan; Fann, George; Furnstahl, Richard J.; Gandolfi, Stefano; Hagen, Gaute; Horoi, Mihai; Johnson, Calvin W.; Kortelainen, Markus; Lusk, Ewing; Maris, Pieter; Nam, Hai Ah; Navratil, Petr

    2013-01-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  10. Optimized temporal pattern of brain stimulation designed by computational evolution.

    Science.gov (United States)

    Brocker, David T; Swan, Brandon D; So, Rosa Q; Turner, Dennis A; Gross, Robert E; Grill, Warren M

    2017-01-04

    Brain stimulation is a promising therapy for several neurological disorders, including Parkinson's disease. Stimulation parameters are selected empirically and are limited to the frequency and intensity of stimulation. We varied the temporal pattern of deep brain stimulation to ameliorate symptoms in a parkinsonian animal model and in humans with Parkinson's disease. We used model-based computational evolution to optimize the stimulation pattern. The optimized pattern produced symptom relief comparable to that from standard high-frequency stimulation (a constant rate of 130 or 185 Hz) and outperformed frequency-matched standard stimulation in a parkinsonian rat model and in patients. Both optimized and standard high-frequency stimulation suppressed abnormal oscillatory activity in the basal ganglia of rats and humans. The results illustrate the utility of model-based computational evolution of temporal patterns to increase the efficiency of brain stimulation in treating Parkinson's disease and thereby reduce the energy required for successful treatment below that of current brain stimulation paradigms. Copyright © 2017, American Association for the Advancement of Science.

  11. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    Full Text Available The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  12. Framework for Computer-Aided Evolution of Object-Oriented Designs

    NARCIS (Netherlands)

    Ciraci, S.; van den Broek, P.M.; Aksit, Mehmet

    2008-01-01

    In this paper, we describe a framework for the computer aided evolution of the designs of object-oriented software systems. Evolution mechanisms are software structures that prepare software for certain type of evolutions. The framework uses a database which holds the evolution mechanisms, modeled

  13. Pervasive Computing Support for Hospitals: An Overview of the Activity-Based Computing Project

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob E

    2007-01-01

    The activity-based computing project researched pervasive computing support for clinical hospital work. Such technologies have potential for supporting the mobile, collaborative, and disruptive use of heterogeneous embedded devices in a hospital...

  14. Computational nuclear quantum many-body problem: The UNEDF project

    Science.gov (United States)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  15. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    Integration of case study approach, project design and computer modeling in managerial accounting education ... Journal of Fundamental and Applied Sciences ... in the Laboratory of Management Accounting and Controlling Systems at the ...

  16. Taiwan links up to world's first LHC computing grid project

    CERN Multimedia

    2003-01-01

    "Taiwan's Academia Sinica was linked up to the Large Hadron Collider (LHC) Computing Grid Project last week to work jointly with 12 other countries to construct the world's largest and most powerful particle accelerator" (1/2 page).

  17. The Caltech Concurrent Computation Program - Project description

    Science.gov (United States)

    Fox, G.; Otto, S.; Lyzenga, G.; Rogstad, D.

    1985-01-01

    The Caltech Concurrent Computation Program, which studies basic issues in computational science, is described. The research builds on initial work in which novel concurrent hardware, the necessary systems software to use it, and twenty significant scientific implementations running on the initial 32-, 64-, and 128-node hypercube machines were constructed. A major goal of the program will be to extend this work into new disciplines and more complex algorithms, including general packages that decompose arbitrary problems in major application areas. New high-performance concurrent processors with up to 1024 nodes, over a gigabyte of memory and multigigaflop performance are being constructed. The implementations cover a wide range of problems in areas such as high-energy physics and astrophysics, condensed matter, chemical reactions, plasma physics, applied mathematics, geophysics, simulation, CAD for VLSI, graphics and image processing. The products of the research program include the concurrent algorithms, hardware, systems software, and complete program implementations.

  18. Elastic Multi-scale Mechanisms: Computation and Biological Evolution.

    Science.gov (United States)

    Diaz Ochoa, Juan G

    2018-01-01

    Explanations based on low-level interacting elements are valuable and powerful since they contribute to identifying the key mechanisms of biological functions. However, many dynamic systems based on low-level interacting elements with unambiguous, finite, and complete information of initial states generate future states that cannot be predicted, implying an increase of complexity and open-ended evolution. Such systems are like Turing machines that overlap with dynamical systems that cannot halt. We argue that organisms find halting conditions by distorting these mechanisms, creating conditions for a constant creativity that drives evolution. We introduce a modulus of elasticity to measure the changes in these mechanisms in response to changes in the computed environment. We test this concept in a population of predator and prey cells with chemotactic mechanisms and demonstrate how the selection of a given mechanism depends on the entire population. We finally explore this concept in different frameworks and postulate that the identification of predictive mechanisms is only successful with a small elasticity modulus.

  19. SPECT: Theoretical aspects and evolution of emission computed axial tomography

    International Nuclear Information System (INIS)

    Brunol, J.; Nuta, V.

    1981-01-01

    We have detailed certain elements of 3-D image reconstruction from axial projections. Two aspects specific to nuclear medicine have been analysed, namely self-absorption and statistics. In our view, the development of ECAT in the months to come must proceed in two essential directions: - application to dynamic (multi-gated) cardiac imagery. Results of this type have been obtained over 8 months in the Radioisotope Service of Cochin Hospital in Paris. It must be stressed here that the number of images to be processed then becomes considerable (multiplication by the gating factor yields more than 100 images), while the statistics are reduced because of the temporal separation. Obtaining good image quality requires sophisticated four-dimensional processing. It follows that the computing times, with all the mini-computers available in nuclear medicine, become much too long to envisage routine hospital use (several hours of computing). This is the reason why we connected an array processor to the IMAC system. This very powerful system (several tens of times the power of a mini-computer) will reduce such computing times to less than 10 minutes. New elements can be introduced into the reconstruction algorithm (the static case, as opposed to the foregoing one). These important improvements come at the expense of memory space and hence of computing time. Here again, the use of an array processor appears indispensable. It should be recalled that ECAT is today a routinely used method; the theoretical analyses it has required have opened the way to new, effective methods of 'slanted-hole' tomography. (orig.)

  20. Numerical evaluation of methods for computing tomographic projections

    International Nuclear Information System (INIS)

    Zhuang, W.; Gopal, S.S.; Hebert, T.J.

    1994-01-01

    Methods for computing forward/back projections of 2-D images can be viewed as numerical integration techniques. The accuracy of any ray-driven projection method can be improved by increasing the number of ray-paths that are traced per projection bin. The accuracy of pixel-driven projection methods can be increased by dividing each pixel into a number of smaller sub-pixels and projecting each sub-pixel. The authors compared four competing methods of computing forward/back projections: bilinear interpolation, ray-tracing, pixel-driven projection based upon sub-pixels, and pixel-driven projection based upon circular, rather than square, pixels. This latter method is equivalent to a fast, bi-nonlinear interpolation. These methods and the choice of the number of ray-paths per projection bin or the number of sub-pixels per pixel present a trade-off between computational speed and accuracy. To solve the problem of assessing backprojection accuracy, the analytical inverse Fourier transform of the ramp filtered forward projection of the Shepp and Logan head phantom is derived
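
    Of the four methods compared, the pixel-driven scheme with sub-pixel subdivision is the simplest to sketch: each pixel is split into s x s sub-pixels, each sub-pixel centre is projected onto the detector axis, and its value is accumulated into the nearest bin. The parallel-beam geometry, nearest-bin accumulation and all sizes below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pixel_driven_projection(image, angle, n_bins, subdiv=1):
    """Parallel-beam forward projection of a square image at one view angle.
    Each pixel is split into subdiv x subdiv sub-pixels; each sub-pixel's centre is
    projected onto the detector axis and its value added to the nearest bin."""
    n = image.shape[0]
    proj = np.zeros(n_bins)
    cos_a, sin_a = np.cos(angle), np.sin(angle)
    step = 1.0 / subdiv
    offsets = (np.arange(subdiv) + 0.5) * step - 0.5      # sub-pixel centre offsets
    for i in range(n):
        for j in range(n):
            val = image[i, j] / subdiv ** 2
            for oy in offsets:
                for ox in offsets:
                    xc = j - n / 2.0 + 0.5 + ox
                    yc = i - n / 2.0 + 0.5 + oy
                    t = xc * cos_a + yc * sin_a            # signed detector coordinate
                    b = int(round(t + n_bins / 2.0 - 0.5))
                    if 0 <= b < n_bins:
                        proj[b] += val
    return proj

# Speed/accuracy trade-off: more sub-pixels per pixel give a smoother projection.
phantom = np.zeros((64, 64))
phantom[20:44, 28:36] = 1.0
coarse = pixel_driven_projection(phantom, np.pi / 6, 96, subdiv=1)
fine = pixel_driven_projection(phantom, np.pi / 6, 96, subdiv=4)
print("difference between subdiv=1 and subdiv=4:", float(np.abs(coarse - fine).sum()))
```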

  1. The Challenge '88 Project: Interfacing of Chemical Instruments to Computers.

    Science.gov (United States)

    Lyons, Jim; Verghese, Manoj

    The main part of this project involved using a computer, either an Apple or an IBM, as a chart recorder for the infrared (IR) and nuclear magnetic resonance (NMR) spectrophotometers. The computer "reads" these machines and displays spectra on its monitor. The graphs can then be stored for future reference and manipulation. The program to…

  2. Computing the maximum volume inscribed ellipsoid of a polytopic projection

    NARCIS (Netherlands)

    Zhen, Jianzhe; den Hertog, Dick

    We introduce a novel scheme based on a blending of Fourier-Motzkin elimination (FME) and adjustable robust optimization techniques to compute the maximum volume inscribed ellipsoid (MVE) in a polytopic projection. It is well-known that deriving an explicit description of a projected polytope is

  3. Computing the Maximum Volume Inscribed Ellipsoid of a Polytopic Projection

    NARCIS (Netherlands)

    Zhen, J.; den Hertog, D.

    2015-01-01

    We introduce a novel scheme based on a blending of Fourier-Motzkin elimination (FME) and adjustable robust optimization techniques to compute the maximum volume inscribed ellipsoid (MVE) in a polytopic projection. It is well-known that deriving an explicit description of a projected polytope is
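
    The two records above concern the MVE of a polytopic projection, which is the hard part handled by the FME and adjustable-robust-optimization machinery. For the simpler case in which the polytope is already given explicitly as {x : Ax <= b}, the MVE can be computed directly with the standard log-det formulation; the sketch below does exactly that with cvxpy and does not reproduce the paper's method. The polytope data are made up.

```python
import cvxpy as cp
import numpy as np

# Polytope {x : A x <= b}, given explicitly (a box cut by a diagonal half-space).
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0, 1.0, 1.0, 1.5])
n = A.shape[1]

# Ellipsoid parameterised as {B u + d : ||u||_2 <= 1}, with B symmetric PSD.
B = cp.Variable((n, n), PSD=True)
d = cp.Variable(n)
constraints = [cp.norm(B @ A[i], 2) + A[i] @ d <= b[i] for i in range(A.shape[0])]
# Maximizing log det(B) maximizes the ellipsoid volume; requires a solver that
# supports log_det (e.g. SCS, which ships with cvxpy).
prob = cp.Problem(cp.Maximize(cp.log_det(B)), constraints)
prob.solve()

print("MVE centre:", d.value)
print("MVE shape matrix:\n", B.value)
```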

  4. Spiral and Project-Based Learning with Peer Assessment in a Computer Science Project Management Course

    Science.gov (United States)

    Jaime, Arturo; Blanco, José Miguel; Domínguez, César; Sánchez, Ana; Heras, Jónathan; Usandizaga, Imanol

    2016-01-01

    Different learning methods such as project-based learning, spiral learning and peer assessment have been implemented in science disciplines with different outcomes. This paper presents a proposal for a project management course in the context of a computer science degree. Our proposal combines three well-known methods: project-based learning,…

  5. Galactic evolution of copper in the light of NLTE computations

    Science.gov (United States)

    Andrievsky, S.; Bonifacio, P.; Caffau, E.; Korotin, S.; Spite, M.; Spite, F.; Sbordone, L.; Zhukova, A. V.

    2018-01-01

    We have developed a model atom for Cu with which we perform statistical equilibrium computations that allow us to compute the line formation of Cu I lines in stellar atmospheres without assuming local thermodynamic equilibrium (LTE). We validate this model atom by reproducing the observed line profiles of the Sun, Procyon and 11 metal-poor stars. Our sample of stars includes both dwarfs and giants. Over a wide range of stellar parameters, we obtain excellent agreement among different Cu I lines. The 11 metal-poor stars have iron abundances in the range -4.2 ≤ [Fe/H] ≤ -1.4; the weighted mean of the [Cu/Fe] ratios is -0.22 dex, with a scatter of 0.15 dex. This is very different from the results of the LTE analysis (the difference between NLTE and LTE abundances reaches 1 dex) and, in spite of the small size of our sample, it prompts a revision of the Galactic evolution of Cu.

  6. Computer-Automated Evolution of Spacecraft X-Band Antennas

    Science.gov (United States)

    Lohn, Jason D.; Homby, Gregory S.; Linden, Derek S.

    2010-01-01

    A document discusses the use of computer-aided evolution in arriving at a design for X-band communication antennas for NASA's three Space Technology 5 (ST5) satellites, which were launched on March 22, 2006. Two evolutionary algorithms, incorporating different representations of the antenna design and different fitness functions, were used to automatically design and optimize an X-band antenna design. A set of antenna designs satisfying initial ST5 mission requirements was evolved by use of these algorithms. The two best antennas - one from each evolutionary algorithm - were built. During flight-qualification testing of these antennas, the mission requirements were changed. After minimal changes in the evolutionary algorithms - mostly in the fitness functions - new antenna designs satisfying the changed mission requirements were evolved; within one month of this change, two new antennas were designed, and prototypes of the antennas were built and tested. One of these newly evolved antennas was approved for deployment on the ST5 mission, and flight-qualified versions of this design were built and installed on the spacecraft. At the time of writing the document, these antennas were the first computer-evolved hardware in outer space.

  7. "Simulated molecular evolution" or computer-generated artifacts?

    Science.gov (United States)

    Darius, F; Rojas, R

    1994-11-01

    1. The authors define a function with value 1 for the positive examples and 0 for the negative ones. They fit a continuous function but do not deal at all with the error margin of the fit, which is almost as large as the function values they compute. 2. The term "quality" for the value of the fitted function gives the impression that some biological significance is associated with values of the fitted function strictly between 0 and 1, but there is no justification for this kind of interpretation, and finding the point where the fit achieves its maximum does not make sense. 3. By neglecting the error margin the authors try to optimize the fitted function using differences in the second, third, fourth, and even fifth decimal place which have no statistical significance. 4. Even if such a fit could profit from more data points, the authors should first prove that the region of interest has some kind of smoothness, that is, that a continuous fit makes any sense at all. 5. "Simulated molecular evolution" is a misnomer. We are dealing here with random search. Since the margin of error is so large, the fitted function does not provide statistically significant information about the points in search space where strings with cleavage sites could be found. This implies that the method is a highly unreliable stochastic search in the space of strings, even if the neural network is capable of learning some simple correlations. 6. Classical statistical methods are, for this kind of problem with so few data points, clearly superior to the neural networks used as a "black box" by the authors, which in the way they are structured provide a model with an error margin as large as the numbers being computed. 7. And finally, even if someone were to provide us with a function which separates strings with cleavage sites from strings without them perfectly, so-called simulated molecular evolution would not be better than random selection. Since a perfect fit would only produce exactly ones or

  8. The Cc1 Project – System For Private Cloud Computing

    Directory of Open Access Journals (Sweden)

    J Chwastowski

    2012-01-01

    Full Text Available The main features of the Cloud Computing system developed at IFJ PAN are described. The project is financed from the structural resources provided by the European Commission and the Polish Ministry of Science and Higher Education (Innovative Economy, National Cohesion Strategy). The system delivers a solution for carrying out computer calculations on a Private Cloud computing infrastructure. It consists of an intuitive Web-based user interface, a module for the administration of users and resources, and an implementation of the standard EC2 interface. Thanks to the distributed character of the system, it allows for the integration of a geographically distant federation of computer clusters within a uniform user environment.

  9. Computer-integrated design and information management for nuclear projects

    International Nuclear Information System (INIS)

    Gonzalez, A.; Martin-Guirado, L.; Nebrera, F.

    1987-01-01

    Over the past seven years, Empresarios Agrupados has been developing a comprehensive, computer-integrated system to perform the majority of the engineering, design, procurement and construction management activities in nuclear, fossil-fired as well as hydro power plant projects. This system, which is already in a production environment, comprises a large number of computer programs and data bases designed using a modular approach. Each software module, dedicated to meeting the needs of a particular design group or project discipline, facilitates the performance of functional tasks characteristic of the power plant engineering process

  10. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 [19] paper that documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project......, documenting all of the project’s four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been...... done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has...

  11. RF heating systems evolution for the WEST project

    Energy Technology Data Exchange (ETDEWEB)

    Magne, R.; Achard, J.; Armitano, A.; Argouarch, A.; Berger-By, G.; Bernard, J. M.; Bouquey, F.; Charabot, N.; Colas, L.; Corbel, E.; Delpech, L.; Ekedahl, A.; Goniche, M.; Guilhem, D.; Hillairet, J.; Jacquot, J.; Joffrin, E.; Litaudon, X.; Lombard, G.; Mollard, P. [CEA, IRFM, F-13108 Saint-Paul-lez-Durance (France); and others

    2014-02-12

    Tore Supra is dedicated to long pulse operation at high power, with a record in injected energy of 1 GJ (2.8 MW × 380 s) and an achieved capability of 12 MW of injected power delivered by 3 RF systems: Lower Hybrid Current Drive (LHCD), Ion Cyclotron Resonance Heating (ICRH) and Electron Cyclotron Resonance Heating (ECRH). The new WEST project (W [tungsten] Environment in Steady-state Tokamak) aims at fitting Tore Supra with an actively cooled tungsten-coated wall and a bulk tungsten divertor. This new device will offer ITER a test bed for validating the relevant technologies for actively cooled metallic components, with D-shaped H-mode plasmas. For WEST operation, different scenarios able to reproduce ITER-relevant conditions in terms of steady-state heat loads have been identified, ranging from a high RF power scenario (15 MW, 30 s) to a high fluence scenario (10 MW, 1000 s). This paper will focus on the evolution of the RF systems required for WEST. For the ICRH system, the main issues are its ELM resilience and its CW compatibility; three new actively cooled antennas are being designed, with the aim of reducing their sensitivity to the load variations induced by ELMs. The LH system has recently been upgraded with new klystrons and the PAM antenna; a possible reshaping of the antenna mouths is presently being studied to match the magnetic field lines in the WEST configuration. For the ECRH system, the device for the poloidal movement of the mirrors of the antenna is being changed for higher accuracy and speed.

  12. COMPUTER GRAPHICAL REPRESENTATION, IN TREBLE ORTHOGONAL PROJECTION, OF A POINT

    Directory of Open Access Journals (Sweden)

    SLONOVSCHI Andrei

    2017-05-01

    Full Text Available When students begin to understand and study descriptive geometry, the treble orthogonal projection of a point creates problems in the situations in which one or more descriptive coordinates are zero. Starting from these considerations, the authors have created an original computer program which offers students an easy way to understand how a point is represented, in draught, in the treble orthogonal projection, whatever the values of its descriptive coordinates.
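
    For readers unfamiliar with the construction, the three projections of a point simply zero out one descriptive coordinate each. The tiny sketch below shows that bookkeeping, including a degenerate case with a zero coordinate of the kind the paper says causes difficulties; it is an illustration only, is unrelated to the authors' program, and the plane naming follows the usual descriptive-geometry convention as an assumption.

```python
def treble_orthogonal_projections(x, y, z):
    """Projections of the point P(x, y, z) onto the three planes of projection:
    horizontal (z = 0), vertical/frontal (y = 0) and lateral (x = 0)."""
    return {
        "horizontal projection p":  (x, y, 0.0),
        "vertical projection p'":   (x, 0.0, z),
        "lateral projection p''":   (0.0, y, z),
    }

# A degenerate case: a zero descriptive coordinate puts the point in a plane of
# projection, so one of its projections coincides with the point itself.
for name, p in treble_orthogonal_projections(3.0, 0.0, 2.0).items():
    print(name, p)
```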

  13. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    Science.gov (United States)

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  14. Computation of a long-time evolution in a Schroedinger system

    International Nuclear Information System (INIS)

    Girard, R.; Kroeger, H.; Labelle, P.; Bajzer, Z.

    1988-01-01

    We compare different techniques for the computation of a long-time evolution and the S matrix in a Schroedinger system. As an application we consider a two-nucleon system interacting via the Yamaguchi potential. We suggest computation of the time evolution for a very short time using Pade approximants, the long-time evolution being obtained by iterative squaring. Within the technique of strong approximation of Moller wave operators (SAM) we compare our calculation with computation of the time evolution in the eigenrepresentation of the Hamiltonian and with the standard Lippmann-Schwinger solution for the S matrix. We find numerical agreement between these alternative methods for time-evolution computation up to half the number of digits of internal machine precision, and fairly rapid convergence of both techniques towards the Lippmann-Schwinger solution
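
    The short-time-plus-iterative-squaring idea is easy to demonstrate on a toy problem: approximate U(δt) = exp(-iHδt) by a low-order Padé (Cayley) form, then square the result k times to obtain U(2^k δt). The Hamiltonian below is a random Hermitian matrix, not the Yamaguchi two-nucleon system, and the comparison against scipy's expm stands in for the Lippmann-Schwinger benchmark used in the paper.

```python
import numpy as np
from scipy.linalg import expm, inv

# Toy Hermitian "Hamiltonian" (not the Yamaguchi two-nucleon problem).
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6)) + 1j * rng.standard_normal((6, 6))
H = 0.5 * (M + M.conj().T)

T_total = 10.0          # long evolution time
k = 12                  # number of squarings
dt = T_total / 2 ** k   # very short time step

# (1,1) Pade (Cayley) approximant of exp(-i H dt): unitary, O(dt^3) accurate per step.
I = np.eye(H.shape[0])
U_dt = inv(I + 0.5j * H * dt) @ (I - 0.5j * H * dt)

# Iterative squaring: U(2^k dt) = (U(dt))^(2^k).
U_long = U_dt.copy()
for _ in range(k):
    U_long = U_long @ U_long

U_exact = expm(-1j * H * T_total)
print("max deviation from expm:", float(np.max(np.abs(U_long - U_exact))))
```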

  15. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    CERN Document Server

    AUTHOR|(SzGeCERN)377840; Fressard-Batraneanu, Silvia Maria; Ballestrero, Sergio; Contescu, Alexandru Cristian; Fazio, Daniel; Di Girolamo, Alessandro; Lee, Christopher Jon; Pozo Astigarraga, Mikel Eukeni; Scannicchio, Diana; Sedov, Alexey; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2015-01-01

    During the LHC Long Shutdown 1 period (LS1), which started in 2013, the Simulation at Point1 (Sim@P1) Project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 virtual machines (VMs) provided with 8 CPU cores each, for a total of up to 22000 parallel running jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 Project, which operates a large-scale OpenStack-based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 50 million CPU-hours and it generated more than 1.7 billion Monte Carlo events to various analysis communities. The design aspects a...

  16. Design, Results, Evolution and Status of the ATLAS simulation in Point1 project.

    CERN Document Server

    Ballestrero, Sergio; The ATLAS collaboration; Brasolin, Franco; Contescu, Alexandru Cristian; Fazio, Daniel; Di Girolamo, Alessandro; Lee, Christopher Jon; Pozo Astigarraga, Mikel Eukeni; Scannicchio, Diana; Sedov, Alexey; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2015-01-01

    During the LHC long shutdown period (LS1), which started in 2013, the Simulation in Point1 (Sim@P1) project takes advantage in an opportunistic way of the trigger and data acquisition (TDAQ) farm of the ATLAS experiment. The farm provides more than 1500 computer nodes, and they are particularly suitable for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2500 virtual machines (VM) provided with 8 CPU cores each, for a total of up to 20000 parallel running jobs. This contribution gives a thorough review of the design, the results and the evolution of the Sim@P1 project operating a large scale Openstack based virtualized platform deployed on top of the ATLAS TDAQ farm computing resources. During LS1, Sim@P1 was one of the most productive GRID sites: it delivered more than 50 million CPU-hours and it generated more than 1.7 billion Monte Carlo events to various analysis communities within the ATLAS collaboration. The particular design ...

  17. The ZAP Project: Designing Interactive Computer Tools for Learning Psychology

    Science.gov (United States)

    Hulshof, Casper; Eysink, Tessa; de Jong, Ton

    2006-01-01

    In the ZAP project, a set of interactive computer programs called "ZAPs" was developed. The programs were designed in such a way that first-year students experience psychological phenomena in a vivid and self-explanatory way. Students can either take the role of participant in a psychological experiment, they can experience phenomena themselves,…

  18. Computer-aided engineering for Qinshan CANDU projects

    International Nuclear Information System (INIS)

    Huang Zhizhang; Goland, D.

    1999-01-01

    The author briefly describes AECL's work in applying computer-aided engineering tools to the Qinshan CANDU Project. The main emphasis is on introducing the major CADD software tools and their use in civil design, process design and EI&C design. Other special software tools, non-CADD tools and their applications are also briefly introduced

  19. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    Science.gov (United States)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.

    2015-12-01

    During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs) each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 project, operating a large scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and it generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interference with TDAQ operations and it guarantees the security and the usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for the data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from Folsom to the Icehouse release, including the scalability issues addressed.

  20. Problem-Based Service Learning: The Evolution of a Team Project

    Science.gov (United States)

    Connor-Greene, Patricia A.

    2002-01-01

    In this article, I describe the evolution of a problem-based service learning project in an undergraduate Abnormal Psychology course. Students worked in teams on a semester-long project to locate and evaluate information and treatment for specific psychiatric disorders. As part of the project, each team selected relevant bibliographic materials,…

  1. Evolution of the ATLAS distributed computing system during the LHC long shutdown

    Science.gov (United States)

    Campana, S.; Atlas Collaboration

    2014-06-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the Worldwide LHC Computing Grid (WLCG) distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1 PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileup. We will describe the evolution of the ADC software foreseen during this period. This includes consolidating the existing Production and Distributed Analysis framework (PanDA) and ATLAS Grid Information System (AGIS), together with the development and commissioning of next generation systems for distributed data management (DDM/Rucio) and production (Prodsys-2). We will explain how new technologies such as Cloud Computing and NoSQL databases, which ATLAS investigated as R&D projects in past years, will be integrated in production. Finally, we will describe more fundamental developments such as breaking job-to-data locality by exploiting storage federations and caches, and event level (rather than file or dataset level) workload engines.

  2. Evolution of the ATLAS distributed computing system during the LHC long shutdown

    International Nuclear Information System (INIS)

    Campana, S

    2014-01-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the Worldwide LHC Computing Grid (WLCG) distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1 PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileup. We will describe the evolution of the ADC software foreseen during this period. This includes consolidating the existing Production and Distributed Analysis framework (PanDA) and ATLAS Grid Information System (AGIS), together with the development and commissioning of next generation systems for distributed data management (DDM/Rucio) and production (Prodsys-2). We will explain how new technologies such as Cloud Computing and NoSQL databases, which ATLAS investigated as R and D projects in past years, will be integrated in production. Finally, we will describe more fundamental developments such as breaking job-to-data locality by exploiting storage federations and caches, and event level (rather than file or dataset level) workload engines.

  3. A new major SETI project based on Project SERENDIP data and 100,000 personal computers

    Science.gov (United States)

    Sullivan, Woodruff T., III; Werthimer, Dan; Bowyer, Stuart; Cobb, Jeff; Gedye, David; Anderson, David

    1997-01-01

    We are now developing an innovative SETI project involving massively parallel computation on desktop computers scattered around the world. The public will be uniquely involved in a real scientific project. Individuals will download a screensaver program that will not only provide the usual attractive graphics when their computer is idle, but will also perform sophisticated analysis of SETI data using the host computer. The data are tapped off Project SERENDIP IV's receiver and SETI survey operating on the 305-m-diameter Arecibo radio telescope. We make a continuous tape-recording of a 2-MHz bandwidth signal centered on the 21-cm H I line. The data on these tapes are then preliminarily screened and parceled out by a server that supplies small chunks of data over the Internet to clients possessing the screen-saver software. After the client computer has automatically analyzed a complete chunk of data, a report on the best candidate signals is sent back to the server, whereupon a new chunk of data is sent out. If 50,000-100,000 customers can be achieved, the computing power will be equivalent to a substantial fraction of a typical supercomputer, and the project will cover a volume of parameter space comparable to that of SERENDIP IV.

  4. The UK Human Genome Mapping Project online computing service.

    Science.gov (United States)

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.

  5. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

    To reduce the patient dose, several approaches, such as spectral imaging using photon-counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images with low dose, true signal-to-noise properties are mainly determined by image quality in projections. We are developing an analytical simulation platform describing projections to investigate how quantum-interaction physics in each component configuring CT systems affects image quality in projections. This simulator will be very useful for improved, more economical design and optimization of CT systems, as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. The remaining work before the meeting includes the following: Each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations. We will build up energy-dependent scatter and pixel-crosstalk kernels, and show their effects on image quality in projections and reconstructed images. We will investigate the effects of projections obtained from various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented with discussions on its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method in this study is simple and can be used conveniently in a lab environment.

  6. AGIS: Evolution of Distributed Computing Information system for ATLAS

    CERN Document Server

    Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria; Karavakis, Edward

    2015-01-01

    The variety of the ATLAS Computing Infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data which are needed by the various ATLAS software components. The ATLAS Grid Information System is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services.

  7. Collaborative Computational Project for Electron cryo-Microscopy

    International Nuclear Information System (INIS)

    Wood, Chris; Burnley, Tom; Patwardhan, Ardan; Scheres, Sjors; Topf, Maya; Roseman, Alan; Winn, Martyn

    2015-01-01

    The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) is a new initiative for the structural biology community, following the success of CCP4 for macromolecular crystallography. Progress in supporting the users and developers of cryoEM software is reported. The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has recently been established. The aims of the project are threefold: to build a coherent cryoEM community which will provide support for individual scientists and will act as a focal point for liaising with other communities, to support practising scientists in their use of cryoEM software and finally to support software developers in producing and disseminating robust and user-friendly programs. The project is closely modelled on CCP4 for macromolecular crystallography, and areas of common interest such as model fitting, underlying software libraries and tools for building program packages are being exploited. Nevertheless, cryoEM includes a number of techniques covering a large range of resolutions and a distinct project is required. In this article, progress so far is reported and future plans are discussed

  8. Collaborative Computational Project for Electron cryo-Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Chris; Burnley, Tom [Science and Technology Facilities Council, Research Complex at Harwell, Didcot OX11 0FA (United Kingdom); Patwardhan, Ardan [European Molecular Biology Laboratory, Wellcome Trust Genome Campus, Hinxton, Cambridge CB10 1SD (United Kingdom); Scheres, Sjors [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge Biomedical Campus, Cambridge CB2 0QH (United Kingdom); Topf, Maya [University of London, Malet Street, London WC1E 7HX (United Kingdom); Roseman, Alan [University of Manchester, Oxford Road, Manchester M13 9PT (United Kingdom); Winn, Martyn, E-mail: martyn.winn@stfc.ac.uk [Science and Technology Facilities Council, Daresbury Laboratory, Warrington WA4 4AD (United Kingdom); Science and Technology Facilities Council, Research Complex at Harwell, Didcot OX11 0FA (United Kingdom)

    2015-01-01

    The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) is a new initiative for the structural biology community, following the success of CCP4 for macromolecular crystallography. Progress in supporting the users and developers of cryoEM software is reported. The Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has recently been established. The aims of the project are threefold: to build a coherent cryoEM community which will provide support for individual scientists and will act as a focal point for liaising with other communities, to support practising scientists in their use of cryoEM software and finally to support software developers in producing and disseminating robust and user-friendly programs. The project is closely modelled on CCP4 for macromolecular crystallography, and areas of common interest such as model fitting, underlying software libraries and tools for building program packages are being exploited. Nevertheless, cryoEM includes a number of techniques covering a large range of resolutions and a distinct project is required. In this article, progress so far is reported and future plans are discussed.

  9. apeNEXT A Multi-Tflops LQCD Computing Project

    CERN Document Server

    Alfieri, R; Onofri, E.; Bartoloni, A.; Battista, C.; Cabibbo, N.; Cosimi, M.; Lonardo, A.; Michelotti, A.; Rapuano, F.; Proietti, B.; Rossetti, D.; Sacco, G.; Tassa, S.; Torelli, M.; Vicini, P.; Boucaud, Philippe; Pene, O.; Errico, W.; Magazzu, G.; Sartori, L.; Schifano, F.; Tripiccione, R.; De Riso, P.; Petronzio, R.; Destri, C.; Frezzotti, R.; Marchesini, G.; Gensch, U.; Kretzschmann, A.; Leich, H.; Paschedag, N.; Schwendicke, U.; Simma, H.; Sommer, R.; Sulanke, K.; Wegner, P.; Pleiter, D.; Jansen, K.; Fucci, A.; Martin, B.; Pech, J.; Panizzi, E.; Petricola, A.

    2001-01-01

    This paper is a slightly modified and reduced version of the proposal of the apeNEXT project, which was submitted to DESY and INFN in spring 2000. It presents the basic motivations and ideas of a next generation lattice QCD (LQCD) computing project, whose goal is the construction and operation of several large scale Multi-TFlops LQCD engines, providing an integrated peak performance of tens of TFlops, and a sustained (double precision) performance on key LQCD kernels of about 50% of peak speed.

  10. Computational performance of a projection and rescaling algorithm

    OpenAIRE

    Pena, Javier; Soheili, Negar

    2018-01-01

    This paper documents a computational implementation of a projection and rescaling algorithm for finding most interior solutions to the pair of feasibility problems: find $x\in L\cap\mathbb{R}^n_{+}$ and find $\hat x\in L^\perp\cap\mathbb{R}^n_{+}$, where $L$ denotes a linear subspace in $\mathbb{R}^n$ and $L^\perp$ denotes its orthogonal complement. The projection and rescaling algorithm is a recently developed method that combines a ...
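
    As a rough illustration of the two feasibility problems stated above (this is not the projection and rescaling algorithm itself; the subspace basis and test point below are invented toy data), one can build orthogonal projectors onto a subspace L and its complement and check approximate membership in L ∩ R^n_+:

    import numpy as np

    # Hypothetical example: L is the column space of B (toy data, not from the paper).
    B = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])              # a 2-dimensional subspace of R^3

    P_L = B @ np.linalg.pinv(B)             # orthogonal projector onto L
    P_Lperp = np.eye(B.shape[0]) - P_L      # projector onto the orthogonal complement

    def is_feasible(x, P, tol=1e-9):
        """Check x in (range of P) intersected with R^n_+ up to a numerical tolerance."""
        return np.allclose(P @ x, x, atol=tol) and bool(np.all(x >= -tol))

    x = B @ np.array([1.0, 2.0])            # a nonnegative point lying in L
    print(is_feasible(x, P_L))              # True
    print(is_feasible(x, P_Lperp))          # False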

  11. Project Management of a personnel radiation records computer system

    International Nuclear Information System (INIS)

    Labenski, T.

    1984-01-01

    Project Management techniques have been used to develop a data base management information system to provide storage and retrieval of personnel radiation and Health Physics records. The system is currently being developed on a Hewlett Packard 1000 Series E Computer with provisions to include plant radiation survey information, radiation work permit information, inventory management for Health Physics supplies and instrumentation, and control of personnel access to radiologically controlled areas. The methodologies used to manage the overall project are presented along with the selection and management of software vendors

  12. Computational methods for planning and evaluating geothermal energy projects

    International Nuclear Information System (INIS)

    Goumas, M.G.; Lygerou, V.A.; Papayannakis, L.E.

    1999-01-01

    In planning, designing and evaluating a geothermal energy project, a number of technical, economic, social and environmental parameters should be considered. The use of computational methods provides a rigorous analysis improving the decision-making process. This article demonstrates the application of decision-making methods developed in operational research for the optimum exploitation of geothermal resources. Two characteristic problems are considered: (1) the economic evaluation of a geothermal energy project under uncertain conditions using a stochastic analysis approach and (2) the evaluation of alternative exploitation schemes for optimum development of a low enthalpy geothermal field using a multicriteria decision-making procedure. (Author)

  13. Evolution of project planning tools in a matrix organization

    Energy Technology Data Exchange (ETDEWEB)

    Furaus, J.P.; Figueroa-McInteer, C.; McKeever, P.S.; Wisler, D.B. [Sandia National Labs., Albuquerque, NM (United States)]; Zavadil, J.T. [Infomatrix (United States)]

    1996-10-01

    Until recently, the Corporate Construction Program at Sandia was experiencing difficulties in managing projects: poor planning and cost estimating caused schedule and budget problems. The first step taken was a Microsoft® Project schedule that provides a standard template for scheduling individual construction projects. It is broken down according to the life cycle of the project and prevents the project team from leaving out an important item. A WBS (work breakdown structure) dictionary was also developed that describes how capital and operating funds are used to develop, design, construct, equip, and manage projects. We also developed a matrix chart that maps the planning guide against the major types of construction projects at Sandia. The guide, dictionary, and matrix chart offer enough flexibility that the project manager can make choices about how to structure work, yet ensure that all work rolls up to the cost categories and key DOE WBS elements. As requirements change, the tools can be updated; they also serve as training tools for new project team members.

  14. Computational Modeling of Microstructural-Evolution in AISI 1005 Steel During Gas Metal Arc Butt Welding

    Science.gov (United States)

    2013-05-01

    A fully coupled (two-way…

  15. Project Chrysalis: The Evolution of a Community School.

    Science.gov (United States)

    Garrett, K.

    1996-01-01

    Describes the creation and operation of Project Chrysalis, a community, service-learning school transformed from row houses, where children can learn, work, and gain inspiration from artists and social entrepreneurs involved with Houston's Project Row Houses. Personal narratives of two teachers highlight the school's and students' accomplishments…

  16. Power-Efficient Computing: Experiences from the COSA Project

    Directory of Open Access Journals (Sweden)

    Daniele Cesini

    2017-01-01

    Energy consumption is today one of the most relevant issues in operating HPC systems for scientific applications. The use of unconventional computing systems is therefore of great interest for several scientific communities looking for a better tradeoff between time-to-solution and energy-to-solution. In this context, the performance assessment of processors with a high ratio of performance per watt is necessary to understand how to realize energy-efficient computing systems for scientific applications, using this class of processors. Computing On SOC Architecture (COSA) is a three-year project (2015–2017) funded by the Scientific Commission V of the Italian Institute for Nuclear Physics (INFN), which aims to investigate the performance and the total cost of ownership offered by computing systems based on commodity low-power Systems on Chip (SoCs) and highly energy-efficient systems based on GP-GPUs. In this work, we present the results of the project, analyzing the performance of several scientific applications on several GPU- and SoC-based systems. We also describe the methodology we have used to measure energy performance and the tools we have implemented to monitor the power drained by applications while running.

  17. Two-qubit quantum computing in a projected subspace

    International Nuclear Information System (INIS)

    Bi Qiao; Ruda, H.E.; Zhan, M.S.

    2002-01-01

    A formulation for performing quantum computing in a projected subspace is presented, based on the subdynamical kinetic equation (SKE) for an open quantum system. The eigenvectors of the kinetic equation are shown to remain invariant before and after interaction with the environment. However, the eigenvalues in the projected subspace exhibit a type of phase shift to the evolutionary states. This phase shift does not destroy the decoherence-free (DF) property of the subspace because the associated fidelity is 1. This permits a universal formalism to be presented--the eigenprojectors of the free part of the Hamiltonian for the system and bath may be used to construct a DF projected subspace based on the SKE. To eliminate possible phase or unitary errors induced by the change in the eigenvalues, a cancellation technique is proposed, using the adjustment of the coupling time, and applied to a two-qubit computing system. A general criterion for constructing a DF-projected subspace from the SKE is discussed. Finally, a proposal for using triangulation to realize a decoherence-free subsystem based on SKE is presented. The concrete formulation for a two-qubit model is given exactly. Our approach is general and appears to be applicable to any type of decoherence

  18. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    Science.gov (United States)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model is established and a fitness function is defined for it; the improved differential evolution algorithm is then used to optimize this fitness function, with a generation-dependent dynamic selection strategy and a dynamic mutation strategy ensuring both global and local search ability. A performance test was carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time and save user cost, achieving optimal scheduling of cloud computing tasks.
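
    As a rough sketch of the underlying differential evolution loop (not the authors' improved algorithm; the toy makespan objective, the plain DE/rand/1/bin scheme and all parameter values below are invented for illustration), candidate task-to-machine assignments can be encoded as continuous vectors and evolved as follows:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy problem: assign 20 tasks to 4 machines so that the
    # longest machine finishing time (makespan) is as small as possible.
    task_len = rng.uniform(1.0, 10.0, size=20)
    n_machines = 4

    def fitness(x):
        # Decode a continuous vector into a task -> machine assignment.
        assign = np.clip(x, 0, n_machines - 1e-9).astype(int)
        loads = np.bincount(assign, weights=task_len, minlength=n_machines)
        return loads.max()                            # makespan to minimize

    pop_size, dim, F, CR, gens = 30, len(task_len), 0.6, 0.9, 200
    pop = rng.uniform(0, n_machines, size=(pop_size, dim))
    cost = np.array([fitness(p) for p in pop])

    for _ in range(gens):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = a + F * (b - c)                  # differential mutation
            cross = rng.random(dim) < CR              # binomial crossover
            trial = np.where(cross, mutant, pop[i])
            f_trial = fitness(trial)
            if f_trial <= cost[i]:                    # greedy selection
                pop[i], cost[i] = trial, f_trial

    print("best makespan:", cost.min())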

  19. Building an application for computing the resource requests such as disk, CPU, and tape and studying the time evolution of computing model

    CERN Document Server

    Noormandipour, Mohammad Reza

    2017-01-01

    The goal of this project was to build an application to calculate the computing resources needed by the LHCb experiment for data processing and analysis, and to predict their evolution in future years. The source code was developed in the Python programming language and the application was built and developed in CERN GitLab. This application will facilitate the calculation of resources required by LHCb in both qualitative and quantitative aspects. The granularity of computations is improved to a weekly basis, in contrast with the yearly basis used so far. The LHCb computing model will benefit from the new possibilities and options added, as the new predictions and calculations are aimed at giving more realistic and accurate estimates.

  20. The evolution of project financing in the geothermal industry

    International Nuclear Information System (INIS)

    Cardenas, G.S.; Miller, D.M.

    1990-01-01

    Sound underlying economics and beneficial contractual relationships are the fundamentals of any project financing. Given these essential elements, the successful transaction must properly allocate the costs, benefits and risks to the appropriate participants in the most efficient manner. In this paper the authors examine four instances in which project financing offered optimal solutions to this problem in a series of transactions for the successive development of the 70 MW Ormesa Geothermal Energy Complex in the Imperial Valley of California

  1. Evolution of Cloud Storage as Cloud Computing Infrastructure Service

    OpenAIRE

    Rajan, Arokia Paul; Shanmugapriyaa

    2013-01-01

    Enterprises are driving towards less cost, more availability, agility, managed risk - all of which is accelerated towards Cloud Computing. Cloud is not a particular product, but a way of delivering IT services that are consumable on demand, elastic to scale up and down as needed, and follow a pay-for-usage model. Out of the three common types of cloud computing service models, Infrastructure as a Service (IaaS) is a service model that provides servers, computing power, network bandwidth and S...

  2. Evolution of Computed Tomography Findings in Secondary Aortoenteric Fistula

    International Nuclear Information System (INIS)

    Bas, Ahmet; Simsek, Osman; Kandemirli, Sedat Giray; Rafiee, Babak; Gulsen, Fatih; Numan, Furuzan

    2015-01-01

    Aortoenteric fistula is a rare but significant clinical entity associated with high morbidity and mortality if it remains untreated. Clinical presentation and imaging findings may be subtle, and prompt diagnosis can be difficult. Herein, we present a patient who initially presented with abdominal pain; computed tomography showed an aortic aneurysm compressing the duodenum without any air bubbles. One month later, the patient presented with gastrointestinal bleeding, and computed tomography revealed air bubbles within the aneurysm. With a diagnosis of aortoenteric fistula, endovascular aneurysm repair was carried out. This case uniquely presents the computed tomography findings in the progression of an aneurysm to an aortoenteric fistula

  3. Discovering local patterns of co-evolution: computational aspects and biological examples

    Directory of Open Access Journals (Sweden)

    Tuller Tamir

    2010-01-01

    Background: Co-evolution is the process in which two (or more) sets of orthologs exhibit a similar or correlative pattern of evolution. Co-evolution is a powerful way to learn about the functional interdependencies between sets of genes and cellular functions and to predict physical interactions. More generally, it can be used for answering fundamental questions about the evolution of biological systems. Orthologs that exhibit a strong signal of co-evolution in a certain part of the evolutionary tree may show a mild signal of co-evolution in other branches of the tree. The major reasons for this phenomenon are noise in the biological input, genes that gain or lose functions, and the fact that some measures of co-evolution relate to rare events such as positive selection. Previous publications in the field dealt with the problem of finding sets of genes that co-evolved along an entire underlying phylogenetic tree, without considering the fact that often co-evolution is local. Results: In this work, we describe a new set of biological problems that are related to finding patterns of local co-evolution. We discuss their computational complexity and design algorithms for solving them. These algorithms outperform other bi-clustering methods as they are designed specifically for solving the set of problems mentioned above. We use our approach to trace the co-evolution of fungal, eukaryotic, and mammalian genes at high resolution across the different parts of the corresponding phylogenetic trees. Specifically, we discover regions in the fungi tree that are enriched with positive evolution. We show that metabolic genes exhibit a remarkable level of co-evolution and different patterns of co-evolution in various biological datasets. In addition, we find that protein complexes that are related to gene expression exhibit non-homogenous levels of co-evolution across different parts of the fungi evolutionary line. In the case of mammalian evolution…

  4. Nonlinear evolution equations and solving algebraic systems: the importance of computer algebra

    International Nuclear Information System (INIS)

    Gerdt, V.P.; Kostov, N.A.

    1989-01-01

    In the present paper we study the application of computer algebra to solve the nonlinear polynomial systems which arise in the investigation of nonlinear evolution equations. We consider several systems which are obtained in the classification of integrable nonlinear evolution equations with uniform rank. Other polynomial systems are related to finding the algebraic curves for finite-gap elliptic potentials of Lamé type and their generalizations. All systems under consideration are solved using the method based on construction of the Groebner basis for the corresponding polynomial ideals. The computations have been carried out using computer algebra systems. 20 refs

  5. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received). However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
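
    As a toy illustration of the distributed-fitting idea described above (the actual tools are R-based and fit the site-stratified Cox model and a distributed SVD; the sketch below uses Python/NumPy, a plain linear model, and synthetic data purely to show how sites can share aggregates instead of raw records):

    import numpy as np

    rng = np.random.default_rng(1)

    def site_summaries(X, y):
        # Each site ships only aggregate quantities, never the raw patient-level rows.
        return X.T @ X, X.T @ y

    # Synthetic data split across three hypothetical sites.
    beta_true = np.array([0.5, -1.0, 2.0])
    sites = []
    for n in (120, 80, 200):
        X = rng.normal(size=(n, 3))
        y = X @ beta_true + rng.normal(scale=0.1, size=n)
        sites.append((X, y))

    # The coordinator sums the per-site aggregates and solves the normal equations.
    XtX = sum(site_summaries(X, y)[0] for X, y in sites)
    Xty = sum(site_summaries(X, y)[1] for X, y in sites)
    beta_hat = np.linalg.solve(XtX, Xty)
    print(beta_hat)   # close to beta_true, computed without pooling raw data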

  6. A computational genomics pipeline for prokaryotic sequencing projects.

    Science.gov (United States)

    Kislyuk, Andrey O; Katz, Lee S; Agrawal, Sonia; Hagen, Matthew S; Conley, Andrew B; Jayaraman, Pushkala; Nelakuditi, Viswateja; Humphrey, Jay C; Sammons, Scott A; Govil, Dhwani; Mair, Raydel D; Tatti, Kathleen M; Tondella, Maria L; Harcourt, Brian H; Mayer, Leonard W; Jordan, I King

    2010-08-01

    New sequencing technologies have accelerated research on prokaryotic genomes and have made genome sequencing operations outside major genome sequencing centers routine. However, no off-the-shelf solution exists for the combined assembly, gene prediction, genome annotation and data presentation necessary to interpret sequencing data. The resulting requirement to invest significant resources into custom informatics support for genome sequencing projects remains a major impediment to the accessibility of high-throughput sequence data. We present a self-contained, automated high-throughput open source genome sequencing and computational genomics pipeline suitable for prokaryotic sequencing projects. The pipeline has been used at the Georgia Institute of Technology and the Centers for Disease Control and Prevention for the analysis of Neisseria meningitidis and Bordetella bronchiseptica genomes. The pipeline is capable of enhanced or manually assisted reference-based assembly using multiple assemblers and modes; gene predictor combining; and functional annotation of genes and gene products. Because every component of the pipeline is executed on a local machine with no need to access resources over the Internet, the pipeline is suitable for projects of a sensitive nature. Annotation of virulence-related features makes the pipeline particularly useful for projects working with pathogenic prokaryotes. The pipeline is licensed under the open-source GNU General Public License and available at the Georgia Tech Neisseria Base (http://nbase.biology.gatech.edu/). The pipeline is implemented with a combination of Perl, Bourne Shell and MySQL and is compatible with Linux and other Unix systems.

  7. Evolution of the Darlington NGS fuel handling computer systems

    International Nuclear Information System (INIS)

    Leung, V.; Crouse, B.

    1996-01-01

    The ability to improve the capabilities and reliability of digital control systems in nuclear power stations to meet changing plant and personnel requirements is a formidable challenge. Many of these systems have high quality assurance standards that must be met to ensure adequate nuclear safety. Also many of these systems contain obsolete hardware along with software that is not easily transported to newer technology computer equipment. Combining modern technology upgrades into a system of obsolete hardware components is not an easy task. Lastly, as users become more accustomed to using modern technology computer systems in other areas of the station (e.g. information systems), their expectations of the capabilities of the plant systems increase. This paper will present three areas of the Darlington NGS fuel handling computer system that have been or are in the process of being upgraded to current technology components within the framework of an existing fuel handling control system. (author). 3 figs

  8. Evolution of the Darlington NGS fuel handling computer systems

    Energy Technology Data Exchange (ETDEWEB)

    Leung, V; Crouse, B [Ontario Hydro, Bowmanville (Canada). Darlington Nuclear Generating Station]

    1997-12-31

    The ability to improve the capabilities and reliability of digital control systems in nuclear power stations to meet changing plant and personnel requirements is a formidable challenge. Many of these systems have high quality assurance standards that must be met to ensure adequate nuclear safety. Also many of these systems contain obsolete hardware along with software that is not easily transported to newer technology computer equipment. Combining modern technology upgrades into a system of obsolete hardware components is not an easy task. Lastly, as users become more accustomed to using modern technology computer systems in other areas of the station (e.g. information systems), their expectations of the capabilities of the plant systems increase. This paper will present three areas of the Darlington NGS fuel handling computer system that have been or are in the process of being upgraded to current technology components within the framework of an existing fuel handling control system. (author). 3 figs.

  9. The evolution of cloud computing how to plan for change

    CERN Document Server

    Longbottom, Clive

    2017-01-01

    Cloud computing has been positioned as today's ideal IT platform. This book looks at what cloud promises and how it's likely to evolve in the future. Readers will be able to ensure that decisions made now will hold them in good stead in the future and will gain an understanding of how cloud can deliver the best outcome for their organisations.

  10. Computational anthropomorphic phantoms for radiation protection dosimetry: evolution and prospects

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lee, Jaiki

    2006-01-01

    Computational anthropomorphic phantoms are computer models of human anatomy used in the calculation of radiation dose distribution in the human body upon exposure to a radiation source. Depending on the manner to represent human anatomy, they are categorized into two classes: stylized and tomographic phantoms. Stylized phantoms, which have mainly been developed at the Oak Ridge National Laboratory (ORNL), describe human anatomy by using simple mathematical equations of analytical geometry. Several improved stylized phantoms such as male and female adults, pediatric series, and enhanced organ models have been developed following the first hermaphrodite adult stylized phantom, Medical Internal Radiation Dose (MIRD)-5 phantom. Although stylized phantoms have significantly contributed to dosimetry calculation, they provide only approximations of the true anatomical features of the human body and the resulting organ dose distribution. An alternative class of computational phantom, the tomographic phantom, is based upon three-dimensional imaging techniques such as Magnetic Resonance (MR) imaging and Computed Tomography (CT). The tomographic phantoms represent the human anatomy with a large number of voxels that are assigned tissue type and organ identity. To date, a total of around 30 tomographic phantoms including male and female adults, pediatric phantoms, and even a pregnant female, have been developed and utilized for realistic radiation dosimetry calculation. They are based on MRI/CT images or sectional color photos from patients, volunteers or cadavers. Several investigators have compared tomographic phantoms with stylized phantoms, and demonstrated the superiority of tomographic phantoms in terms of realistic anatomy and dosimetry calculation. This paper summarizes the history and current status of both stylized and tomographic phantoms, including Korean computational phantoms. Advantages, limitations, and future prospects are also discussed

  11. Evaluating the Effectiveness of Collaborative Computer-Intensive Projects in an Undergraduate Psychometrics Course

    Science.gov (United States)

    Barchard, Kimberly A.; Pace, Larry A.

    2010-01-01

    Undergraduate psychometrics classes often use computer-intensive active learning projects. However, little research has examined active learning or computer-intensive projects in psychometrics courses. We describe two computer-intensive collaborative learning projects used to teach the design and evaluation of psychological tests. Course…

  12. ABrIL - Advanced Brain Imaging Lab: a cloud-based computation environment for cooperative neuroimaging projects.

    Science.gov (United States)

    Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo

    2014-01-01

    Neuroscience is an increasingly multidisciplinary and highly cooperative field in which neuroimaging plays an important role. The rapid evolution of neuroimaging demands a growing number of computing resources and skills that need to be put in place at every lab. Typically each group tries to set up its own servers and workstations to support its neuroimaging needs, having to learn everything from operating system management to the details of specific neuroscience software tools before any results can be obtained from each setup. This setup and learning process is replicated in every lab, even when a strong collaboration among several groups is going on. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - the Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services in an interactive, neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as the contribution to a high-impact paper published in the January issue of the NeuroImage journal. The ABrIL system has shown its applicability in several neuroscience projects at a relatively low cost, promoting truly collaborative actions and speeding up project results and their clinical applicability.

  13. Test computations on the dynamical evolution of star clusters

    International Nuclear Information System (INIS)

    Angeletti, L.; Giannone, P.

    1977-01-01

    Test calculations have been carried out on the evolution of star clusters using the fluid-dynamical method devised by Larson (1970). Large systems of stars have been considered, with specific concern for globular clusters. With reference to the analogous 'standard' model by Larson, the influence of varying each of the free parameters in turn (cluster mass, star mass, tidal radius, mass concentration of the initial model) on the results has been studied. Furthermore, the partial release of some simplifying assumptions with regard to the relaxation time and the distribution of the 'target' stars has been considered. The change of the structural properties is discussed, and the variation of the evolutionary time scale is outlined. An indicative agreement of the results obtained here with structural properties of globular clusters as deduced from previous theoretical models is pointed out. (Auth.)

  14. Symbolic computation of exact solutions for a nonlinear evolution equation

    International Nuclear Information System (INIS)

    Liu Yinping; Li Zhibin; Wang Kuncheng

    2007-01-01

    In this paper, by means of the Jacobi elliptic function method, exact double periodic wave solutions and solitary wave solutions of a nonlinear evolution equation are presented. It is shown that the obtained solitary wave solutions can be loop-shaped, cusp-shaped or hump-shaped for different values of the parameters, and that different types of double periodic wave solutions are also possible, namely periodic loop-shaped, periodic hump-shaped or periodic cusp-shaped wave solutions. Furthermore, the periodic loop-shaped wave solutions degenerate to loop-shaped solitary wave solutions for the same values of the parameters, as do the cusp-shaped and hump-shaped solutions. All these solutions are new and first reported here

  15. Schedule evolution during the life-time of the LHC project

    CERN Document Server

    Foraz, K; Gaillard, H; Hauviller, Claude; Weisz, S

    2007-01-01

    The Large Hadron Collider Project was approved by the CERN Council in December 1994. From the beginning of the project, the CERN management opted for a very aggressive installation planning based on a just-in-time sequencing of all activities. This paper aims to show how different factors (technical development, procurement, logistics and organization) have impacted the schedule evolution throughout the lifetime of the project. It describes the cause-effect analysis of the major rescheduling that occurred during the installation of the LHC and presents some general conclusions potentially applicable to other projects.

  16. Large Atmospheric Computation on the Earth Simulator: The LACES Project

    Directory of Open Access Journals (Sweden)

    Michel Desgagné

    2006-01-01

    The Large Atmospheric Computation on the Earth Simulator (LACES) project is a joint initiative between Canadian and Japanese meteorological services and academic institutions that focuses on the high resolution simulation of Hurricane Earl (1998). The unique aspect of this effort is the extent of the computational domain, which covers all of North America and Europe with a grid spacing of 1 km. The Canadian Mesoscale Compressible Community (MC2) model is shown to parallelize effectively on the Japanese Earth Simulator (ES) supercomputer; however, even using the extensive computing resources of the ES Center (ESC), the full simulation for the majority of Hurricane Earl's lifecycle takes over eight days to perform and produces over 5.2 TB of raw data. Preliminary diagnostics show that the results of the LACES simulation for the tropical stage of Hurricane Earl's lifecycle compare well with available observations for the storm. Further studies involving advanced diagnostics have commenced, taking advantage of the uniquely large spatial extent of the high resolution LACES simulation to investigate multiscale interactions in the hurricane and its environment. It is hoped that these studies will enhance our understanding of processes occurring within the hurricane and between the hurricane and its planetary-scale environment.

  17. FPGAs in High Performance Computing: Results from Two LDRD Projects.

    Energy Technology Data Exchange (ETDEWEB)

    Underwood, Keith D; Ulmer, Craig D.; Thompson, David; Hemmert, Karl Scott

    2006-11-01

    Field programmable gate arrays (FPGAs) have been used as alternative computational devices for over a decade; however, they have not been used for traditional scientific computing due to their perceived lack of floating-point performance. In recent years, there has been a surge of interest in alternatives to traditional microprocessors for high performance computing. Sandia National Labs began two projects to determine whether FPGAs would be a suitable alternative to microprocessors for high performance scientific computing and, if so, how they should be integrated into the system. We present results that indicate that FPGAs could have a significant impact on future systems. FPGAs have the potential to have order of magnitude levels of performance wins on several key algorithms; however, there are serious questions as to whether the system integration challenge can be met. Furthermore, there remain challenges in FPGA programming and system level reliability when using FPGA devices.

  18. Vinayaka: A Semi-Supervised Projected Clustering Method Using Differential Evolution

    OpenAIRE

    Satish Gajawada; Durga Toshniwal

    2012-01-01

    Differential Evolution (DE) is an algorithm for evolutionary optimization. Clustering problems have been solved by using DE based clustering methods, but these methods may fail to find clusters hidden in subspaces of high dimensional datasets. Subspace and projected clustering methods have been proposed in the literature to find subspace clusters that are present in subspaces of a dataset. In this paper we propose VINAYAKA, a semi-supervised projected clustering method based on DE. In this method DE opt...

  19. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  20. LHCb: The Evolution of the LHCb Grid Computing Model

    CERN Multimedia

    Arrabito, L; Bouvet, D; Cattaneo, M; Charpentier, P; Clarke, P; Closier, J; Franchini, P; Graciani, R; Lanciotti, E; Mendez, V; Perazzini, S; Nandkumar, R; Remenska, D; Roiser, S; Romanovskiy, V; Santinelli, R; Stagni, F; Tsaregorodtsev, A; Ubeda Garcia, M; Vedaee, A; Zhelezov, A

    2012-01-01

    The increase of luminosity in the LHC during its second year of operation (2011) was achieved by delivering more protons per bunch and increasing the number of bunches. Taking advantage of these changed conditions, LHCb ran with a higher pileup as well as a much larger charm physics programme, introducing a bigger event size and longer processing times. These changes led to shortages in the offline distributed data processing resources: an increased need of CPU capacity by a factor of 2 for reconstruction, higher storage needs at T1 sites by 70%, and subsequently problems with data throughput for file access from the storage elements. To accommodate these changes the online running conditions and the Computing Model for offline data processing had to be adapted accordingly. This paper describes the changes implemented for the offline data processing on the Grid, relaxing the Monarc model in a first step and going beyond it subsequently. It further describes other operational issues discovered and solved during 2011, presents the ...

  1. Studies on defect evolution in steels: experiments and computer simulations

    International Nuclear Information System (INIS)

    Sundar, C.S.

    2011-01-01

    In this paper, we present the results of our on-going studies on steels that are being carried out with a view to develop radiation resistant steels. The focus is on the use of nano-dispersoids in alloys towards the suppression of void formation and eventual swelling under irradiation. Results on the nucleation and growth of TiC precipitates in Ti modified austenitic steels and investigations on nano Yttria particles in Fe - a model oxide dispersion ferritic steel will be presented. The experimental methods of ion beam irradiation and positron annihilation spectroscopy have been used to elucidate the role of minor alloying elements on swelling behaviour. Computer simulation of defect processes have been carried out using ab-initio methods, molecular dynamics and Monte Carlo simulations. Our perspectives on addressing the multi-scale phenomena of defect processes leading to radiation damage, through a judicious combination of experiments and simulations, would be presented. (author)

  2. Computational design of chimeric protein libraries for directed evolution.

    Science.gov (United States)

    Silberg, Jonathan J; Nguyen, Peter Q; Stevenson, Taylor

    2010-01-01

    The best approach for creating libraries of functional proteins with large numbers of nondisruptive amino acid substitutions is protein recombination, in which structurally related polypeptides are swapped among homologous proteins. Unfortunately, as more distantly related proteins are recombined, the fraction of variants having a disrupted structure increases. One way to enrich the fraction of folded and potentially interesting chimeras in these libraries is to use computational algorithms to anticipate which structural elements can be swapped without disturbing the integrity of a protein's structure. Herein, we describe how the algorithm Schema uses the sequences and structures of the parent proteins recombined to predict the structural disruption of chimeras, and we outline how dynamic programming can be used to find libraries with a range of amino acid substitution levels that are enriched in variants with low Schema disruption.
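
    As a rough sketch of the kind of disruption score such an algorithm computes (the parent sequences, contact list and block boundaries below are invented toy data, and this is only a schematic reading of the Schema idea, not its published implementation), one can count structural contacts whose residue pair never occurs together in any single parent:

    # Toy illustration of a Schema-style disruption count E for one chimera.
    parents = ["ACDEFGHIKL",                     # parent 1 (invented sequence)
               "ASDQFGYIKM"]                     # parent 2 (invented sequence)
    contacts = [(0, 3), (1, 6), (2, 7), (4, 9)]  # invented residue-residue contacts
    blocks = [(0, 5), (5, 10)]                   # two crossover blocks

    def chimera_sequence(block_choice):
        """Build a chimera by taking each block from the chosen parent."""
        seq = ""
        for (start, end), p in zip(blocks, block_choice):
            seq += parents[p][start:end]
        return seq

    def schema_disruption(seq):
        """E = number of contacting residue pairs not found together in any parent."""
        E = 0
        for i, j in contacts:
            pair = (seq[i], seq[j])
            if not any((p[i], p[j]) == pair for p in parents):
                E += 1
        return E

    chim = chimera_sequence([0, 1])   # block 1 from parent 1, block 2 from parent 2
    print(chim, schema_disruption(chim))   # e.g. ACDEFGYIKM 1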

  3. Fast and accurate computation of projected two-point functions

    Science.gov (United States)

    Grasshorn Gebhardt, Henry S.; Jeong, Donghui

    2018-01-01

    We present the two-point function from the fast and accurate spherical Bessel transformation (2-FAST) algorithm (our code is available at https://github.com/hsgg/twoFAST) for a fast and accurate computation of integrals involving one or two spherical Bessel functions. These types of integrals occur when projecting the galaxy power spectrum P(k) onto configuration space, ξ_ℓ^ν(r), or spherical harmonic space, C_ℓ(χ,χ′). First, we employ the FFTLog transformation of the power spectrum to divide the calculation into P(k)-dependent coefficients and P(k)-independent integrations of basis functions multiplied by spherical Bessel functions. We find analytical expressions for the latter integrals in terms of special functions, for which recursion provides a fast and accurate evaluation. The algorithm, therefore, circumvents direct integration of highly oscillating spherical Bessel functions.
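
    For contrast, the simplest of these projections, the monopole correlation function ξ(r) = (1/2π²) ∫ dk k² P(k) j₀(kr), can be evaluated by brute-force quadrature; the sketch below (with a made-up toy power spectrum and arbitrary units) is exactly the kind of direct integration of oscillatory spherical Bessel functions that 2-FAST is designed to avoid:

    import numpy as np
    from scipy.special import spherical_jn
    from scipy.integrate import simpson

    # Toy power spectrum (made up; not a realistic galaxy P(k)).
    def P(k):
        return k * np.exp(-(k / 0.3)**2)

    k = np.logspace(-4, 1, 20000)          # dense k-grid; brute force on purpose
    r = np.linspace(10.0, 150.0, 15)       # separations in the same (arbitrary) units

    def xi0(r_val):
        # xi_0(r) = 1/(2 pi^2) * integral of k^2 P(k) j_0(k r) dk
        integrand = k**2 * P(k) * spherical_jn(0, k * r_val)
        return simpson(integrand, x=k) / (2.0 * np.pi**2)

    xi = np.array([xi0(rv) for rv in r])
    print(xi)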

  4. Evolution of Project-Based Learning in Small Groups in Environmental Engineering Courses

    Science.gov (United States)

    Requies, Jesús M.; Agirre, Ion; Barrio, V. Laura; Graells, Moisès

    2018-01-01

    This work presents the assessment of the development and evolution of an active methodology (Project-Based Learning--PBL) implemented on the course "Unit Operations in Environmental Engineering", within the bachelor's degree in Environmental Engineering, with the purpose of decreasing the dropout rate in this course. After the initial…

  5. Drifting Continents and Wandering Poles. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    Science.gov (United States)

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  6. Drifting Continents and Magnetic Fields. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    Science.gov (United States)

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  7. Volcanoes: Where and Why? Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    Science.gov (United States)

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  8. Hot Spots in the Earth's Crust. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    Science.gov (United States)

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  9. Evolution of facility layout requirements and CAD [computer-aided design] system development

    International Nuclear Information System (INIS)

    Jones, M.

    1990-06-01

    The overall configuration of the Superconducting Super Collider (SSC), including the infrastructure and land boundary requirements, was developed using a computer-aided design (CAD) system. The evolution of the facility layout requirements and the use of the CAD system are discussed. The emphasis has been on minimizing the amount of input required and maximizing the speed with which the output may be obtained. The computer system used to store the data is also described

  10. Virtual Mockup test based on computational science and engineering. Near future technology projected by JSPS-RFTF ADVENTURE project

    International Nuclear Information System (INIS)

    Yoshimura, Shinobu

    2001-01-01

    The ADVENTURE project began in August 1997 as a project in the 'computational science' field of the JSPS-RFTF programme, and is being carried out as a five-year project. In this project, using versatile parallel computing environments such as PC clusters and massively parallel supercomputers, a large-scale parallel computational mechanics system (ADVENTURE system) is being developed that can solve actual dynamical equations for arbitrary shapes using models with 10 to 100 million degrees of freedom, while maintaining the general-purpose analysis capabilities of present general-purpose computational mechanics systems, and that is capable of carrying out optimization of shapes, physical properties, loading conditions, and so on. After outlining the background of the R and D on the ADVENTURE system and its features, the near-future virtual mockup tests foreseen from it are discussed. (G.K.)

  11. Evolution of project management research: a bibliometric study of International Journal of Project Management

    Directory of Open Access Journals (Sweden)

    Fábio Cocchi da Silva Eiras

    2017-03-01

    Over the past decades, the project management field has evolved and consolidated. In view of this growth, this research aims to identify the main trends of research in the area, as well as to provide an overview of publications, identifying new issues, changes in approaches and the development of knowledge areas. To do so, a systematic review of the literature was performed by means of a bibliometric study of the papers of the International Journal of Project Management (IJPM), included in SCOPUS, from its first volume to 2015, covering a period of more than 30 years. It was found that developing countries are increasingly concerned with developing research in the field of project management, especially in mega infrastructure projects and public-private partnerships. Risk is a central topic in all periods of analysis; however, strategic topics such as success in project and portfolio management are among the fastest growing. Issues related to the soft side of project management, such as skills, culture, and knowledge management, have emerged in recent periods. In terms of industry, construction projects and projects in information technology are the most studied over the period analysed.

  12. Towards GLUE 2: evolution of the computing element information model

    International Nuclear Information System (INIS)

    Andreozzi, S; Burke, S; Field, L; Konya, B

    2008-01-01

    A key advantage of Grid systems is the ability to share heterogeneous resources and services between traditional administrative and organizational domains. This ability enables virtual pools of resources to be created and assigned to groups of users. Resource awareness, the capability of users or user agents to have knowledge about the existence and state of resources, is required in order to utilize the resource. This awareness requires a description of the services and resources, typically defined via a community-agreed information model. One of the most popular information models, used by a number of Grid infrastructures, is the GLUE Schema, which provides a common language for describing Grid resources. Other approaches exist; however, they follow different modeling strategies. The presence of different flavors of information models for Grid resources is a barrier to enabling inter-Grid interoperability. In order to solve this problem, the GLUE Working Group was started in the context of the Open Grid Forum. The purpose of the group is to oversee a major redesign of the GLUE Schema which should consider the successful modeling choices and flaws that have emerged from practical experience, as well as modeling choices from other initiatives. In this paper, we present the status of the new model for describing computing resources as the first output from the working group, with the aim of dissemination and soliciting feedback from the community

  13. Building synthetic sterols computationally – unlocking the secrets of evolution?

    Directory of Open Access Journals (Sweden)

    Tomasz Rog

    2015-08-01

    Cholesterol is vital in regulating the physical properties of animal cell membranes. While it remains unclear what renders cholesterol so unique, it is known that other sterols are less capable in modulating membrane properties, and there are membrane proteins whose function is dependent on cholesterol. Practical applications of cholesterol include e.g. its use in liposomes in drug delivery and cosmetics, cholesterol-based detergents in membrane protein crystallography, and its fluorescent analogs in studies of cholesterol transport in cells and tissues. Clearly, in spite of their difficult synthesis, producing the synthetic analogs of cholesterol is of great commercial and scientific interest. In this article, we discuss how synthetic sterols nonexistent in nature can be used to elucidate the roles of cholesterol's structural elements. To this end, we discuss recent atomistic molecular dynamics simulation studies that have predicted new synthetic sterols with properties comparable to those of cholesterol. We also discuss more recent experimental studies that have vindicated these predictions. The paper highlights the strength of computational simulations in making predictions for synthetic biology, thereby guiding experiments.

  14. Computing element evolution towards Exascale and its impact on legacy simulation codes

    International Nuclear Information System (INIS)

    Colin de Verdiere, Guillaume J.L.

    2015-01-01

    In the light of the current race towards the Exascale, this article highlights the main features of the forthcoming computing elements that will be at the core of next generations of supercomputers. The market analysis, underlying this work, shows that computers are facing a major evolution in terms of architecture. As a consequence, it is important to understand the impacts of those evolutions on legacy codes or programming methods. The problems of dissipated power and memory access are discussed and will lead to a vision of what should be an exascale system. To survive, programming languages had to respond to the hardware evolutions either by evolving or with the creation of new ones. From the previous elements, we elaborate why vectorization, multithreading, data locality awareness and hybrid programming will be the key to reach the exascale, implying that it is time to start rewriting codes. (orig.)

  15. Computing element evolution towards Exascale and its impact on legacy simulation codes

    Science.gov (United States)

    Colin de Verdière, Guillaume J. L.

    2015-12-01

    In the light of the current race towards the Exascale, this article highlights the main features of the forthcoming computing elements that will be at the core of next generations of supercomputers. The market analysis, underlying this work, shows that computers are facing a major evolution in terms of architecture. As a consequence, it is important to understand the impacts of those evolutions on legacy codes or programming methods. The problems of dissipated power and memory access are discussed and will lead to a vision of what should be an exascale system. To survive, programming languages had to respond to the hardware evolutions either by evolving or with the creation of new ones. From the previous elements, we elaborate why vectorization, multithreading, data locality awareness and hybrid programming will be the key to reach the exascale, implying that it is time to start rewriting codes.

  16. Fitting models of continuous trait evolution to incompletely sampled comparative data using approximate Bayesian computation.

    Science.gov (United States)

    Slater, Graham J; Harmon, Luke J; Wegmann, Daniel; Joyce, Paul; Revell, Liam J; Alfaro, Michael E

    2012-03-01

    In recent years, a suite of methods has been developed to fit multiple rate models to phylogenetic comparative data. However, most methods have limited utility at broad phylogenetic scales because they typically require complete sampling of both the tree and the associated phenotypic data. Here, we develop and implement a new, tree-based method called MECCA (Modeling Evolution of Continuous Characters using ABC) that uses a hybrid likelihood/approximate Bayesian computation (ABC)-Markov-Chain Monte Carlo approach to simultaneously infer rates of diversification and trait evolution from incompletely sampled phylogenies and trait data. We demonstrate via simulation that MECCA has considerable power to choose among single versus multiple evolutionary rate models, and thus can be used to test hypotheses about changes in the rate of trait evolution across an incomplete tree of life. We finally apply MECCA to an empirical example of body size evolution in carnivores, and show that there is no evidence for an elevated rate of body size evolution in the pinnipeds relative to terrestrial carnivores. ABC approaches can provide a useful alternative set of tools for future macroevolutionary studies where likelihood-dependent approaches are lacking.
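
    As a minimal sketch of the ABC ingredient in such an approach (drastically simplified relative to MECCA: a star phylogeny, a single Brownian-motion rate, one summary statistic, and synthetic data; none of the numbers come from the paper), plain rejection sampling keeps the candidate rates whose simulated tip data reproduce the observed summary statistic:

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical setup: 50 tips on a star phylogeny of depth T evolve a trait
    # under Brownian motion with rate sigma2, so tip values ~ Normal(0, sigma2*T).
    n_tips, T = 50, 10.0
    sigma2_true = 0.8
    observed = rng.normal(0.0, np.sqrt(sigma2_true * T), size=n_tips)
    s_obs = observed.var()                      # summary statistic

    def simulate(sigma2):
        return rng.normal(0.0, np.sqrt(sigma2 * T), size=n_tips).var()

    # ABC rejection: draw rates from the prior, keep those that reproduce s_obs.
    prior_draws = rng.uniform(0.01, 5.0, size=50_000)
    eps = 0.05 * s_obs                          # tolerance (illustrative choice)
    kept = [s2 for s2 in prior_draws if abs(simulate(s2) - s_obs) < eps]

    print(f"posterior mean of sigma2 ~ {np.mean(kept):.2f} (true value {sigma2_true})")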

  17. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…

  18. Projects Using a Computer Algebra System in First-Year Undergraduate Mathematics

    Science.gov (United States)

    Rosenzweig, Martin

    2007-01-01

    This paper illustrates the use of computer-based projects in two one-semester first-year undergraduate mathematics classes. Developed over a period of years, the approach is one in which the classes are organised into work-groups, with computer-based projects being undertaken periodically to illustrate the class material. These projects are…

  19. Evolution of the heteroharmonic strategy for target-range computation in the echolocation of Mormoopidae.

    Directory of Open Access Journals (Sweden)

    Emanuel C Mora

    2013-06-01

    Echolocating bats use the time elapsed from biosonar pulse emission to the arrival of the echo (defined as echo-delay) to assess target-distance. Target-distance is represented in the brain by delay-tuned neurons that are classified as either heteroharmonic or homoharmonic. Heteroharmonic neurons respond more strongly to pulse-echo pairs in which the timing of the pulse is given by the fundamental biosonar harmonic while the timing of echoes is provided by one (or several) of the higher order harmonics. On the other hand, homoharmonic neurons are tuned to the echo delay between similar harmonics in the emitted pulse and echo. It is generally accepted that heteroharmonic computations are advantageous over homoharmonic computations; i.e. heteroharmonic neurons receive information from call and echo in different frequency-bands which helps to avoid jamming between pulse and echo signals. Heteroharmonic neurons have been found in two species of the family Mormoopidae (Pteronotus parnellii and Pteronotus quadridens) and in Rhinolophus rouxi. Recently, it was proposed that heteroharmonic target-range computations are a primitive feature of the genus Pteronotus that was preserved in the evolution of the genus. Here we review recent findings on the evolution of echolocation in Mormoopidae, and try to link those findings to the evolution of the heteroharmonic computation strategy. We stress the hypothesis that the ability to perform heteroharmonic computations evolved separately from the ability of using long constant-frequency echolocation calls, high duty cycle echolocation and Doppler Shift Compensation. Also, we present the idea that heteroharmonic computations might have been of advantage for categorizing prey size, hunting eared insects and living in large conspecific colonies. We make five testable predictions that might help future investigations to clarify the evolution of the heteroharmonic echolocation in Mormoopidae and other families.
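
    As a trivial numerical illustration of the echo-delay-to-range mapping reviewed above (the speed of sound and the delay values are generic round numbers, not measurements from the paper), target range is simply half the round-trip distance travelled by the sound:

    # Echo-delay to target-range conversion: sound travels to the target and back.
    SPEED_OF_SOUND = 343.0            # m/s in air at roughly 20 °C (approximate)

    def target_range(echo_delay_s):
        return SPEED_OF_SOUND * echo_delay_s / 2.0

    for delay_ms in (1.0, 5.0, 10.0):             # illustrative delays
        print(f"{delay_ms:4.1f} ms echo delay -> {target_range(delay_ms / 1000):.2f} m")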

  20. Computer simulation of the time evolution of a quenched model alloy in the nucleation region

    International Nuclear Information System (INIS)

    Marro, J.; Lebowitz, J.L.; Kalos, M.H.

    1979-01-01

    The time evolution of the structure function and of the cluster (or grain) distribution following quenching in a model binary alloy with a small concentration of minority atoms is obtained from computer simulations. The structure function S̄(k,t) obeys a simple scaling relation, S̄(k,t) = K^(-3) F(k/K), with K(t) ∝ t^(-a), a ≈ 0.25, during the latter and larger part of the evolution. During the same period, the mean cluster size grows approximately linearly with time.

  1. An overview of the Environmental Monitoring Computer Automation Project

    International Nuclear Information System (INIS)

    Johnson, S.M.; Lorenz, R.

    1992-01-01

    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained are important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS.

  2. Mixed Waste Treatment Project: Computer simulations of integrated flowsheets

    International Nuclear Information System (INIS)

    Dietsche, L.J.

    1993-12-01

    The disposal of mixed waste, that is, waste containing both hazardous and radioactive components, is a challenging waste management problem of particular concern to DOE sites throughout the United States. Traditional technologies used for the destruction of hazardous wastes need to be re-evaluated for their ability to handle mixed wastes, and in some cases new technologies need to be developed. The Mixed Waste Treatment Project (MWTP) was set up by DOE's Waste Operations Program (EM30) to provide guidance on mixed waste treatment options. One of MWTP's charters is to develop flowsheets for prototype integrated mixed waste treatment facilities which can serve as models for sites developing their own treatment strategies. Evaluation of these flowsheets is being facilitated through the use of computer modelling. The objective of the flowsheet simulations is to provide mass and energy balances, product compositions, and equipment sizing (leading to cost) information. The modelled flowsheets need to be easily modified to examine how alternative technologies and varying feed streams affect the overall integrated process. One such commercially available simulation program is ASPEN PLUS. This report contains details of the ASPEN PLUS program.

  3. Click! 101 Computer Activities and Art Projects for Kids and Grown-Ups.

    Science.gov (United States)

    Bundesen, Lynne; And Others

    This book presents 101 computer activities and projects geared toward children and adults. The activities for both personal computers (PCs) and Macintosh were developed on the Windows 95 computer operating system, but they are adaptable to non-Windows personal computers as well. The book is divided into two parts. The first part provides an…

  4. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    Science.gov (United States)

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  5. The evolution of the Waste Isolation Pilot Plant (WIPP) project's public affairs program

    International Nuclear Information System (INIS)

    Walter, L.H.

    1988-01-01

    As a first-of-a-kind facility, the Waste Isolation Pilot Plant (WIPP) presents a unique perspective on the value of designing a public affairs program that grows with and complements a project's evolution from construction to operations. Like the project itself, the public affairs program progressed through several stages to its present scope. During the construction phase, foundations were laid in the community. Then, in this past year as the project entered a preoperational status, emphasis shifted to broaden the positive image that had been created locally. In this stage, public affairs presented the project's positive elements to the various state agencies, government officials, and federal organizations involved in our country's radioactive waste management program. Most recently, and continuing until receipt of the first shipment of waste in October 1988, an even broader, more aggressive public affairs program is planned.

  6. FY95 software project management plan: TMACS, CASS computer systems

    International Nuclear Information System (INIS)

    Spurling, D.G.

    1994-01-01

    The FY95 Work Plan for TMACS and CASS Software Projects describes the activities planned for the current fiscal year. This plan replaces WHC-SD-WM-SDP-008. The TMACS project schedule is included in the TWRS Integrated Schedule

  7. eCodonOpt: a systematic computational framework for optimizing codon usage in directed evolution experiments

    OpenAIRE

    Moore, Gregory L.; Maranas, Costas D.

    2002-01-01

    We present a systematic computational framework, eCodonOpt, for designing parental DNA sequences for directed evolution experiments through codon usage optimization. Given a set of homologous parental proteins to be recombined at the DNA level, the optimal DNA sequences encoding these proteins are sought for a given diversity objective. We find that the free energy of annealing between the recombining DNA sequences is a much better descriptor of the extent of crossover formation than sequence...

  8. Exploring Students' Computational Thinking Skills in Modeling and Simulation Projects: : A Pilot Study

    NARCIS (Netherlands)

    Grgurina, Natasa; van Veen, Klaas; Barendsen, Erik; Zwaneveld, Bert; Suhre, Cor; Gal-Ezer, Judith; Sentance, Sue; Vahrenhold, Jan

    2015-01-01

    Computational Thinking (CT) is gaining a lot of attention in education. We explored how to discern the occurrences of CT in the projects of 12th grade high school students in the computer science (CS) course. Within the projects, they constructed models and ran simulations of phenomena from other

  9. The fifth generation computer project state of the art report 111

    CERN Document Server

    Scarrott

    1983-01-01

    The Fifth Generation Computer Project is a two-part book consisting of the invited papers and the analysis. The invited papers examine various aspects of The Fifth Generation Computer Project. The analysis part assesses the major advances of the Fifth Generation Computer Project and provides a balanced analysis of the state of the art in The Fifth Generation. This part provides a balanced and comprehensive view of the development in Fifth Generation Computer technology. The Bibliography compiles the most important published material on the subject of The Fifth Generation.

  10. Modeling the evolution of channel shape: Balancing computational efficiency with hydraulic fidelity

    Science.gov (United States)

    Wobus, C.W.; Kean, J.W.; Tucker, G.E.; Anderson, R. Scott

    2008-01-01

    The cross-sectional shape of a natural river channel controls the capacity of the system to carry water off a landscape, to convey sediment derived from hillslopes, and to erode its bed and banks. Numerical models that describe the response of a landscape to changes in climate or tectonics therefore require formulations that can accommodate evolution of channel cross-sectional geometry. However, fully two-dimensional (2-D) flow models are too computationally expensive to implement in large-scale landscape evolution models, while available simple empirical relationships between width and discharge do not adequately capture the dynamics of channel adjustment. We have developed a simplified 2-D numerical model of channel evolution in a cohesive, detachment-limited substrate subject to steady, unidirectional flow. Erosion is assumed to be proportional to boundary shear stress, which is calculated using an approximation of the flow field in which log-velocity profiles are assumed to apply along vectors that are perpendicular to the local channel bed. Model predictions of the velocity structure, peak boundary shear stress, and equilibrium channel shape compare well with predictions of a more sophisticated but more computationally demanding ray-isovel model. For example, the mean velocities computed by the two models are consistent to within ???3%, and the predicted peak shear stress is consistent to within ???7%. Furthermore, the shear stress distributions predicted by our model compare favorably with available laboratory measurements for prescribed channel shapes. A modification to our simplified code in which the flow includes a high-velocity core allows the model to be extended to estimate shear stress distributions in channels with large width-to-depth ratios. Our model is efficient enough to incorporate into large-scale landscape evolution codes and can be used to examine how channels adjust both cross-sectional shape and slope in response to tectonic and climatic
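
    As a rough illustration of how boundary shear stress can be estimated from a log-velocity (law-of-the-wall) profile, the sketch below inverts the depth-averaged form of the profile for the shear velocity. It is a simplified textbook relation, not the ray-isovel or simplified 2-D model of the paper, and the parameter values are assumptions.

    ```python
    # Minimal law-of-the-wall estimate of boundary shear stress from a
    # depth-averaged velocity; a simplified illustration only.
    import math

    RHO = 1000.0   # water density, kg/m^3
    KAPPA = 0.41   # von Karman constant

    def boundary_shear_stress(mean_velocity, depth, roughness_length):
        """Invert the depth-averaged log profile U = (u*/kappa)(ln(h/z0) - 1)
        for the shear velocity u*, then return tau = rho * u*^2."""
        u_star = KAPPA * mean_velocity / (math.log(depth / roughness_length) - 1.0)
        return RHO * u_star ** 2

    # Example: a 2 m deep flow moving at 1.5 m/s over a 5 mm roughness length.
    tau = boundary_shear_stress(mean_velocity=1.5, depth=2.0, roughness_length=0.005)
    print(f"boundary shear stress ~ {tau:.1f} Pa")
    ```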

  11. (The evolution of) post-secondary education: a computational model and experiments

    Czech Academy of Sciences Publication Activity Database

    Ortmann, Andreas; Slobodyan, Sergey

    -, č. 355 (2008), s. 1-46 ISSN 1211-3298 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords : post-secondary education * for-profit higher education providers * computational simulations Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp355.pdf

  12. (The evolution of) post-secondary education: a computational model and experiments

    Czech Academy of Sciences Publication Activity Database

    Ortmann, Andreas; Slobodyan, Sergey

    -, č. 355 (2008), s. 1-46 ISSN 1211-3298 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:MSM0021620846 Keywords : post-secondary education * for-profit higher education providers * computational simulations Subject RIV: AH - Economics http://www.cerge-ei.cz/pdf/wp/Wp355.pdf

  13. Computer-Mediated Collaborative Projects: Processes for Enhancing Group Development

    Science.gov (United States)

    Dupin-Bryant, Pamela A.

    2008-01-01

    Groups are a fundamental part of the business world. Yet, as companies continue to expand internationally, a major challenge lies in promoting effective communication among employees who work in varying time zones. Global expansion often requires group collaboration through computer systems. Computer-mediated groups lead to different communicative…

  14. Solving project scheduling problems by minimum cut computations

    NARCIS (Netherlands)

    Möhring, R.H.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    In project scheduling, a set of precedence-constrained jobs has to be scheduled so as to minimize a given objective. In resource-constrained project scheduling, the jobs additionally compete for scarce resources. Due to its universality, the latter problem has a variety of applications in

  15. Using Computer Conferencing and Electronic Mail to Facilitate Group Projects.

    Science.gov (United States)

    Anderson, Margaret D.

    1996-01-01

    Reports on the use of electronic mail and an electronic conferencing system to conduct group projects in three educational psychology courses at the State University of New York College at Cortland. Course design is explained and group project design is described, including assignments and oral presentations during regular class sessions.…

  16. Cross-cultural dataset for the evolution of religion and morality project.

    Science.gov (United States)

    Purzycki, Benjamin Grant; Apicella, Coren; Atkinson, Quentin D; Cohen, Emma; McNamara, Rita Anne; Willard, Aiyana K; Xygalatas, Dimitris; Norenzayan, Ara; Henrich, Joseph

    2016-11-08

    A considerable body of research cross-culturally examines the evolution of religious traditions, beliefs and behaviors. The bulk of this research, however, draws from coded qualitative ethnographies rather than from standardized methods specifically designed to measure religious beliefs and behaviors. Psychological data sets that examine religious thought and behavior in controlled conditions tend to be disproportionately sampled from student populations. Some cross-national databases employ standardized methods at the individual level, but are primarily focused on fully market integrated, state-level societies. The Evolution of Religion and Morality Project sought to generate a data set that systematically probed individual level measures sampling across a wider range of human populations. The set includes data from behavioral economic experiments and detailed surveys of demographics, religious beliefs and practices, material security, and intergroup perceptions. This paper describes the methods and variables, briefly introduces the sites and sampling techniques, notes inconsistencies across sites, and provides some basic reporting for the data set.

  17. Selection Finder (SelFi): A computational metabolic engineering tool to enable directed evolution of enzymes

    Directory of Open Access Journals (Sweden)

    Neda Hassanpour

    2017-06-01

    Full Text Available Directed evolution of enzymes consists of an iterative process of creating mutant libraries and choosing desired phenotypes through screening or selection until the enzymatic activity reaches a desired goal. The biggest challenge in directed enzyme evolution is identifying high-throughput screens or selections to isolate the variant(s) with the desired property. We present in this paper a computational metabolic engineering framework, Selection Finder (SelFi), to construct a selection pathway from a desired enzymatic product to a cellular host and to couple the pathway with cell survival. We applied SelFi to construct selection pathways for four enzymes and their desired enzymatic products xylitol, D-ribulose-1,5-bisphosphate, methanol, and aniline. Two of the selection pathways identified by SelFi were previously experimentally validated for engineering Xylose Reductase and RuBisCO. Importantly, SelFi advances directed evolution of enzymes as there are currently no known generalized strategies or computational techniques for identifying high-throughput selections for engineering enzymes.
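
    The idea of a selection pathway, i.e. a route that couples the desired enzymatic product to cell survival, can be illustrated with a toy graph search. The network, metabolite names and the networkx-based search below are invented for illustration and are not the SelFi algorithm or its data.

    ```python
    # Toy "selection pathway" search: link the engineered product to a metabolite
    # the cell needs to grow, so that survival reports on enzyme activity.
    # The network and pathway below are hypothetical.
    import networkx as nx

    network = nx.DiGraph()
    network.add_edges_from([
        ("xylitol", "xylulose"),               # hypothetical conversion steps
        ("xylulose", "xylulose-5-P"),
        ("xylulose-5-P", "pentose-phosphate-pathway"),
        ("pentose-phosphate-pathway", "biomass"),
        ("glucose", "biomass"),                # native route that would bypass selection
    ])

    # Remove the bypass so growth depends on the engineered product.
    network.remove_edge("glucose", "biomass")

    selection_pathway = nx.shortest_path(network, source="xylitol", target="biomass")
    print(" -> ".join(selection_pathway))
    ```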

  18. An Interdisciplinary Team Project: Psychology and Computer Science Students Create Online Cognitive Tasks

    Science.gov (United States)

    Flannery, Kathleen A.; Malita, Mihaela

    2014-01-01

    We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…

  19. PERPHECLIM ACCAF Project - Perennial fruit crops and forest phenology evolution facing climatic changes

    Science.gov (United States)

    Garcia de Cortazar-Atauri, Iñaki; Audergon, Jean Marc; Bertuzzi, Patrick; Anger, Christel; Bonhomme, Marc; Chuine, Isabelle; Davi, Hendrik; Delzon, Sylvain; Duchêne, Eric; Legave, Jean Michel; Raynal, Hélène; Pichot, Christian; Van Leeuwen, Cornelis; Perpheclim Team

    2015-04-01

    Phenology is a bio-indicator of climate evolution. Measurements of phenological stages on perennial species provide significant illustrations and assessments of the impact of climate change. Phenology is also one of the main key characteristics of the adaptive capacity of perennial species, raising questions about its consequences for plant growth and development and for fruit quality. Predicting phenological evolution and the adaptive capacities of perennial species requires overcoming three main methodological limitations: 1) existing observations and associated databases are scattered and sometimes incomplete, making multi-site studies of genotype-environment interactions difficult to implement; 2) there are no common protocols for observing phenological stages; 3) access to generic phenological modelling platforms is still very limited. In this context, the PERPHECLIM project, which is funded by the Adapting Agriculture and Forestry to Climate Change Meta-Program (ACCAF) from INRA (French National Institute of Agronomic Research), has the objective of developing the necessary infrastructure at INRA level (observatories, information system, modeling tools) to enable partners to study the phenology of various perennial species (grapevine, fruit trees and forest trees). Currently the PERPHECLIM project involves 27 research units in France. The main activities currently under development are: defining protocols and observation forms for observing the phenology of various species of interest to the project; organizing observation training; developing generic modeling solutions to simulate phenology (Phenological Modelling Platform and modelling platform solutions); supporting the building of research projects at national and international level; developing environment/genotype observation networks for fruit tree species; and developing an information system managing data and documentation concerning phenology. Finally, the PERPHECLIM project aims to build strong collaborations with public

  20. Computing with words to feasibility study of software projects

    Directory of Open Access Journals (Sweden)

    Marieta Peña Abreu

    2017-02-01

    Full Text Available Objective: This paper proposes a method to analyze the technical, commercial and social feasibility of software projects in environments of uncertainty. It allows working with multiple experts and multiple criteria and facilitates decision-making. Method: The proposal contains two phases: first, the necessary information is collected, and second, the projects are evaluated using the 2-tuple linguistic representation model. The experts are selected by analyzing their curricular syntheses. The evaluation criteria are defined using the Focus Group technique and weighted in the interval (0,1) according to their importance. Three domains are offered for expressing preferences: numeric, interval-valued and linguistic. For aggregation, the extended arithmetic mean and the extended weighted average are used, preventing the loss of information. A 2-tuple (feasibility, precision) is obtained as a result for each project. Results: The evaluation of project P1 was a very high feasibility with -0.33 of precision. Project P2 obtained a high feasibility with 0.38 of precision and project P3 achieved a medium feasibility with -0.21 of precision. Conclusions: This method is favorable for software project feasibility analysis in the presence of multiple experts and criteria, in environments of uncertainty. It handles heterogeneous assessments without loss of information. Its results are consistent and useful for decision makers.
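
    A minimal sketch of the 2-tuple linguistic representation and weighted aggregation mentioned above follows; the five-term scale, the weights and the example scores are assumptions, not the authors' implementation.

    ```python
    # Minimal 2-tuple linguistic representation: aggregate expert scores into a
    # (linguistic term, symbolic translation) pair without losing information.
    S = ["very low", "low", "medium", "high", "very high"]   # assumed term set

    def to_2tuple(beta):
        """Delta operator: map beta in [0, len(S)-1] to (term, alpha), alpha in [-0.5, 0.5)."""
        index = int(round(beta))
        return S[index], beta - index

    def aggregate(scores, weights):
        """Weighted-average aggregation of numeric scores, returned as a 2-tuple."""
        beta = sum(s * w for s, w in zip(scores, weights)) / sum(weights)
        return to_2tuple(beta)

    # Three experts rate a project's feasibility on the 0..4 scale (weights assumed).
    term, alpha = aggregate(scores=[4, 3, 3], weights=[0.4, 0.4, 0.2])
    print(f"feasibility: {term} ({alpha:+.2f})")   # e.g. "high (+0.40)"
    ```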

  1. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses of any modern computer room.

  2. Configuration Tool for the Trusted Computing Exemplar Project

    Science.gov (United States)

    2009-12-01

    languages were examined: Microsoft .NET [8], Apple Cocoa (Objective-C) [9], wxPython [10], and Java [11]. Since every language has its pros and...languages using the criteria described above. Based on the developer’s limited experience and knowledge of Microsoft .NET and Apple Cocoa (Objective...became a tabbed panel within a separate window panel. Figure 9 depicts this evolution of the conceptual design. In Figure 9, the table column

  3. Configuration Tool Prototype for the Trusted Computing Exemplar Project

    Science.gov (United States)

    2009-12-01

    languages were examined: Microsoft .NET [8], Apple Cocoa (Objective-C) [9], wxPython [10], and Java [11]. Since every language has its pros and...languages using the criteria described above. Based on the developer’s limited experience and knowledge of Microsoft .NET and Apple Cocoa (Objective...a tabbed panel within a separate window panel. Figure 9 depicts this evolution of the conceptual design. In Figure 9, the table column headers are

  4. The method of projected characteristics for the evolution of magnetic arches

    Science.gov (United States)

    Nakagawa, Y.; Hu, Y. Q.; Wu, S. T.

    1987-01-01

    A numerical method for solving the fully nonlinear MHD equations is described. In particular, a formulation based on the newly developed method of projected characteristics (Nakagawa, 1981), suitable for studying the evolution of magnetic arches due to motions of their foot-points, is presented. The final formulation is given in the form of difference equations; therefore, an analysis of numerical stability is also presented. Further, the most important derivation of physically self-consistent, time-dependent boundary conditions (i.e., the evolving boundary equations) is given in detail, and some results obtained with such boundary equations are reported.

  5. Instructional Computing Project Uses "Multiplier Effect" to Train Florida Teachers.

    Science.gov (United States)

    Roblyer, M. D.; Castine, W. H.

    1987-01-01

    Reviews the efforts undertaken in the Florida Model Microcomputer Trainer Project (FMMTP) and its statewide impact. Outlines its procedural strategies, trainer curriculum, networking system, and the results of its multiplier effect. (ML)

  6. Evolution of the Atlas data and computing model for a Tier-2 in the EGI infrastructure

    CERN Document Server

    Fernandez, A; The ATLAS collaboration; AMOROS, G; VILLAPLANA, M; FASSI, F; KACI, M; LAMAS, A; OLIVER, E; SALT, J; SANCHEZ, J; SANCHEZ, V

    2012-01-01

    During the last years the ATLAS computing model has moved from a stricter design, where every Tier2 had a liaison and a network dependence on a Tier1, to a more meshed approach where every cloud can be connected. The evolution of ATLAS data models requires changes in the ATLAS Tier2 policy for data replication, dynamic data caching and remote data access. It also requires rethinking the network infrastructure to enable any Tier2 and associated Tier3 to easily connect to any Tier1 or Tier2. Tier2s are becoming more and more important in the ATLAS computing model as they allow more data to be readily accessible for analysis jobs by all users, independently of their geographical location. The Tier2 disk space has been reserved for real, simulated, calibration and alignment, group, and user data. A buffer disk space is needed for input and output data for simulation jobs. Tier2s are going to be used more effic...

  7. Evolution and experience with the ATLAS Simulation at Point1 Project

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00389536; The ATLAS collaboration; Brasolin, Franco; Kouba, Tomas; Schovancova, Jaroslava; Fazio, Daniel; Di Girolamo, Alessandro; Scannicchio, Diana; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander; Lee, Christopher

    2017-01-01

    The Simulation at Point1 project is successfully running standard ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We present our experience with using the Event Service that provides the event-level granularity of computations. We show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources is also presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  8. Evolution and experience with the ATLAS simulation at Point1 project

    CERN Document Server

    Ballestrero, Sergio; The ATLAS collaboration; Fazio, Daniel; Di Girolamo, Alessandro; Kouba, Tomas; Lee, Christopher; Scannicchio, Diana; Schovancova, Jaroslava; Twomey, Matthew Shaun; Wang, Fuquan; Zaytsev, Alexander

    2016-01-01

    The Simulation at Point1 project is successfully running traditional ATLAS simulation jobs on the TDAQ HLT resources. The pool of available resources changes dynamically, therefore we need to be very effective in exploiting the available computing cycles. We will present our experience with using the Event Service that provides the event-level granularity of computations. We will show the design decisions and overhead time related to the usage of the Event Service. The improved utilization of the resources will also be presented with the recent development in monitoring, automatic alerting, deployment and GUI.

  9. Evolution of Safeguards over Time: Past, Present, and Projected Facilities, Material, and Budget

    International Nuclear Information System (INIS)

    Kollar, Lenka; Mathews, Caroline E.

    2009-01-01

    This study examines the past trends and evolution of safeguards over time and projects growth through 2030. The report documents the amount of nuclear material and facilities under safeguards from 1970 until present, along with the corresponding budget. Estimates for the future amount of facilities and material under safeguards are made according to non-nuclear-weapons states' (NNWS) plans to build more nuclear capacity and sustain current nuclear infrastructure. Since nuclear energy is seen as a clean and economic option for base load electric power, many countries are seeking to either expand their current nuclear infrastructure, or introduce nuclear power. In order to feed new nuclear power plants and sustain existing ones, more nuclear facilities will need to be built, and thus more nuclear material will be introduced into the safeguards system. The projections in this study conclude that a zero real growth scenario for the IAEA safeguards budget will result in large resource gaps in the near future.

  10. Evolution of Safeguards over Time: Past, Present, and Projected Facilities, Material, and Budget

    Energy Technology Data Exchange (ETDEWEB)

    Kollar, Lenka; Mathews, Caroline E.

    2009-07-01

    This study examines the past trends and evolution of safeguards over time and projects growth through 2030. The report documents the amount of nuclear material and facilities under safeguards from 1970 until present, along with the corresponding budget. Estimates for the future amount of facilities and material under safeguards are made according to non-nuclear-weapons states’ (NNWS) plans to build more nuclear capacity and sustain current nuclear infrastructure. Since nuclear energy is seen as a clean and economic option for base load electric power, many countries are seeking to either expand their current nuclear infrastructure, or introduce nuclear power. In order to feed new nuclear power plants and sustain existing ones, more nuclear facilities will need to be built, and thus more nuclear material will be introduced into the safeguards system. The projections in this study conclude that a zero real growth scenario for the IAEA safeguards budget will result in large resource gaps in the near future.

  11. Computational representation of Alzheimer's disease evolution applied to a cooking activity.

    Science.gov (United States)

    Serna, Audrey; Rialle, Vincent; Pigot, Hélène

    2006-01-01

    This article presents a computational model and a simulation of the decline in activities of daily living performance due to Alzheimer's disease. The disease evolution is simulated with the cognitive architecture ACT-R. Activities are represented according to the retrieval of semantic units in declarative memory and the triggering of rules in procedural memory. The decline due to Alzheimer's disease is simulated through variation of subsymbolic parameters. The model is applied to a cooking activity. A simulation of one hundred subjects shows results similar to those obtained in a standardized assessment with human subjects.

  12. Evolution of perturbed dynamical systems: analytical computation with time independent accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Gurzadyan, A.V. [Russian-Armenian (Slavonic) University, Department of Mathematics and Mathematical Modelling, Yerevan (Armenia); Kocharyan, A.A. [Monash University, School of Physics and Astronomy, Clayton (Australia)

    2016-12-15

    An analytical method for investigating the evolution of dynamical systems with time-independent accuracy is developed for perturbed Hamiltonian systems. Error-free estimation using computer algebra enables the application of the method to complex multi-dimensional Hamiltonian and dissipative systems. It also opens principal opportunities for the qualitative study of chaotic trajectories. The performance of the method is demonstrated on perturbed two-oscillator systems. It can be applied to various non-linear physical and astrophysical systems, e.g. to long-term planetary dynamics. (orig.)

  13. Light Water Reactor-Pressure Vessel Surveillance project computer system

    International Nuclear Information System (INIS)

    Merriman, S.H.

    1980-10-01

    A dedicated process control computer has been implemented for regulating the metallurgical Pressure Vessel Wall Benchmark Facility (PSF) at the Oak Ridge Research Reactor. The purpose of the PSF is to provide reliable standards and methods by which to judge the radiation damage to reactor pressure vessel specimens. Benchmark data gathered from the PSF will be used to improve and standardize procedures for assessing the remaining safe operating lifetime of aging reactors. The computer system controls the pressure vessel specimen environment in the presence of gamma heating so that in-vessel conditions are simulated. Instrumented irradiation capsules, in which the specimens are housed, contain temperature sensors and electrical heaters. The computer system regulates the amount of power delivered to the electrical heaters based on the temperature distribution within the capsules. Time-temperature profiles are recorded along with reactor conditions for later correlation with specimen metallurgical changes

  14. Studying fatigue damage evolution in uni-directional composites using x-ray computed tomography

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    , it will be possible to lower the costs of energy for wind energy based electricity. In the present work, a lab-source x-ray computed tomography system (Zeiss Xradia 520 Versa) has been used in connection with ex-situ fatigue testing of uni-directional composites in order to identify fibre failure during...... comparable x-ray studies) have been used in order to ensure a representative test volume during the ex-situ fatigue testing. Using the ability of x-ray computed tomography to zoom into regions of interest non-destructively, the fatigue damage evolution in a repeatedly ex-situ fatigue loaded test sample has...... improving the fatigue resistance of non-crimp fabric used in the wind turbine industry can be made....

  15. Integration of case study approach, project design and computer ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... computer modeling used as a research method applied in the process ... conclusions discuss the benefits for students who analyzed the ... accounting education process the case study method should not .... providing travel safety information to passengers ... from literature readings with practical problems.

  16. Creative capstone computer projects for post-graduate students of ...

    African Journals Online (AJOL)

    With this in mind, the English Department at the University of Stellenbosch has designed a module in its Honours course that allows post-graduate students the opportunity to develop additional skills in the design and development of multimedia projects that effectively combine the knowledge they have gained during the ...

  17. The Human Genome Project: Biology, Computers, and Privacy.

    Science.gov (United States)

    Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

    This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

  18. Development and application of project management computer system in nuclear power station

    International Nuclear Information System (INIS)

    Chen Junpu

    2000-01-01

    Based on the experience gained in the construction of the Daya Bay and Lingao nuclear power plants, the necessity of using computers for management and their application in nuclear power engineering projects are explained.

  19. Taiwan links up to world's 1st LHC Computing Grid Project

    CERN Multimedia

    2003-01-01

    Taiwan's Academia Sinica was linked up to the Large Hadron Collider (LHC) Computing Grid Project to work jointly with 12 other countries to construct the world's largest and most powerful particle accelerator

  20. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Science.gov (United States)

    Bednar, David; Beerens, Koen; Sebestova, Eva; Bendl, Jaroslav; Khare, Sagar; Chaloupkova, Radka; Prokop, Zbynek; Brezovsky, Jan; Baker, David; Damborsky, Jiri

    2015-11-01

    There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.
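
    The general idea of combining an energy-based prediction with an evolution-based filter when short-listing stabilizing mutations can be sketched as follows; the thresholds, field layout and toy data are assumptions, and this is not the FireProt pipeline.

    ```python
    # Illustrative combination of energy- and evolution-based evidence when
    # short-listing stabilizing mutations. All values are hypothetical.
    candidates = [
        # (mutation, predicted ddG in kcal/mol (negative = stabilizing), conservation 0..1)
        ("A112L", -1.8, 0.10),
        ("G55W",  -2.4, 0.85),   # highly conserved position -> likely functionally important
        ("T148V", -0.3, 0.05),   # predicted effect too small to be worth building
        ("S210F", -1.2, 0.20),
    ]

    DDG_CUTOFF = -1.0            # require a clearly stabilizing prediction
    CONSERVATION_CUTOFF = 0.7    # avoid touching strongly conserved residues

    shortlist = [
        mut for mut, ddg, cons in candidates
        if ddg <= DDG_CUTOFF and cons < CONSERVATION_CUTOFF
    ]
    print(shortlist)   # ['A112L', 'S210F'] -- candidates for a multiple-point mutant
    ```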

  1. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Directory of Open Access Journals (Sweden)

    David Bednar

    2015-11-01

    Full Text Available There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.

  2. Fermilab advanced computer program multi-microprocessor project

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Biel, J.

    1985-06-01

    Fermilab's Advanced Computer Program is constructing a powerful 128 node multi-microprocessor system for data analysis in high-energy physics. The system will use commercial 32-bit microprocessors programmed in Fortran-77. Extensive software supports easy migration of user applications from a uniprocessor environment to the multiprocessor and provides sophisticated program development, debugging, and error handling and recovery tools. This system is designed to be readily copied, providing computing cost effectiveness of below $2200 per VAX 11/780 equivalent. The low cost, commercial availability, compatibility with off-line analysis programs, and high data bandwidths (up to 160 MByte/sec) make the system an ideal choice for applications to on-line triggers as well as an offline data processor

  3. Exponential rise of dynamical complexity in quantum computing through projections.

    Science.gov (United States)

    Burgarth, Daniel Klaus; Facchi, Paolo; Giovannetti, Vittorio; Nakazato, Hiromichi; Pascazio, Saverio; Yuasa, Kazuya

    2014-10-10

    The ability of quantum systems to host exponentially complex dynamics has the potential to revolutionize science and technology. Therefore, much effort has been devoted to developing protocols for computation, communication and metrology which exploit this scaling, despite formidable technical difficulties. Here we show that the mere frequent observation of a small part of a quantum system can turn its dynamics from a very simple one into an exponentially complex one, capable of universal quantum computation. After discussing examples, we go on to show that this effect is generally to be expected: almost any quantum dynamics becomes universal once 'observed' as outlined above. Conversely, we show that any complex quantum dynamics can be 'purified' into a simpler one in larger dimensions. We conclude by demonstrating that even local noise can lead to an exponentially complex dynamics.

  4. Three-Dimensional Computer Graphics Brain-Mapping Project

    Science.gov (United States)

    1988-03-24

    1975-76, one of these brains was hand digitized. It was then reconstructed three dimensionally, using an Evans and Sutherland Picture System 2. This...Yakovlev Collection, we use the Evans and Sutherland Picture System 2 which we have been employing for this purpose for a dozen years. Its virtue is...careful, experimentally designed new protocol (See Figure 20). Most of these heads were imaged with Computed Tomography, thanks to Clint Stiles of Picker

  5. Time evolution of a quenched binary alloy: computer simulation of a three-dimensional model system

    International Nuclear Information System (INIS)

    Marro, J.; Bortz, A.B.; Kalos, M.H.; Lebowitz, J.L.; Sur, A.

    1976-01-01

    Results are presented of computer simulation of the time evolution for a model of a binary alloy, such as ZnAl, following quenching. The model system is a simple cubic lattice the sites of which are occupied either by A or B particles. There is a nearest neighbor interaction favoring segregation into an A rich and a B rich phase at low temperatures, T < T_c. Starting from a random configuration, T ≫ T_c, the system is quenched to and evolves at a temperature T < T_c. The evolution takes place through exchanges between A and B atoms on nearest neighbor sites. The probability of such an exchange is assumed proportional to e^(-βΔU)[1 + e^(-βΔU)]^(-1), where β = (k_B T)^(-1) and ΔU is the change in energy resulting from the exchange. In the simulations either a 30 x 30 x 30 or a 50 x 50 x 50 lattice is used with various fractions of the sites occupied by A particles. The evolution of the Fourier transform of the spherically averaged structure function S(k,t), the energy, and the cluster distribution were computed. Comparison is made with various theories of this process and with some experiments. It is found in particular that the results disagree with the predictions of the linearized Cahn-Hilliard theory of spinodal decomposition. The qualitative form of the results appears to be unaffected if the change in the positions of the atoms takes place via a vacancy mechanism rather than through direct exchanges.
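
    A minimal sketch of one such exchange (Kawasaki-type) Monte Carlo step, using the acceptance probability e^(-βΔU)/[1 + e^(-βΔU)] quoted above, is shown below; the lattice size, coupling constant, minority-atom fraction and bookkeeping are simplified assumptions rather than the original simulation code.

    ```python
    # One Kawasaki-type exchange step on a small 3-D A/B lattice with the
    # acceptance probability exp(-beta*dU) / (1 + exp(-beta*dU)).
    import numpy as np

    rng = np.random.default_rng(0)
    L, J, beta = 30, 1.0, 1.5                    # lattice size, coupling, 1/(k_B T)
    lattice = rng.choice([1, -1], size=(L, L, L), p=[0.1, 0.9])   # A (=1) minority, B (=-1)

    def local_energy(s, x, y, z):
        """Nearest-neighbour interaction energy of site (x, y, z), periodic boundaries."""
        nn = (s[(x+1) % L, y, z] + s[(x-1) % L, y, z] +
              s[x, (y+1) % L, z] + s[x, (y-1) % L, z] +
              s[x, y, (z+1) % L] + s[x, y, (z-1) % L])
        return -J * s[x, y, z] * nn

    def exchange_step(s):
        """Pick a site and a random nearest neighbour and exchange them stochastically."""
        x, y, z = rng.integers(0, L, size=3)
        dx, dy, dz = rng.permutation([rng.choice([-1, 1]), 0, 0])
        xn, yn, zn = (x+dx) % L, (y+dy) % L, (z+dz) % L
        if s[x, y, z] == s[xn, yn, zn]:
            return                                # exchanging identical atoms changes nothing
        before = local_energy(s, x, y, z) + local_energy(s, xn, yn, zn)
        s[x, y, z], s[xn, yn, zn] = s[xn, yn, zn], s[x, y, z]
        after = local_energy(s, x, y, z) + local_energy(s, xn, yn, zn)
        dU = after - before
        if rng.random() >= np.exp(-beta * dU) / (1 + np.exp(-beta * dU)):
            s[x, y, z], s[xn, yn, zn] = s[xn, yn, zn], s[x, y, z]   # reject: swap back

    for _ in range(1000):
        exchange_step(lattice)
    ```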

  6. South Ukraine Nuclear Power Plant. Advanced Computer Information System Project

    International Nuclear Information System (INIS)

    Hord, J.; Afanasiev, N.; Smith, C.; Kudinov, Yu.

    1997-01-01

    The South Ukraine upgrade is the first of many that will take place in the former eastern bloc countries over the next several years. Westron is currently developing a similar system for the Zaporozhe nuclear power plant. In addition, there are eleven other WWER type units in operation in the Ukraine, as well as twenty seven others in operation throughout Eastern and Central Europe and Russia - all potential upgrade projects. (author)

  7. Plancton: an opportunistic distributed computing project based on Docker containers

    Science.gov (United States)

    Concas, Matteo; Berzano, Dario; Bagnasco, Stefano; Lusso, Stefano; Masera, Massimo; Puccio, Maximiliano; Vallero, Sara

    2017-10-01

    The computing power of most modern commodity computers is far from being fully exploited by standard usage patterns. In this work we describe the development and setup of a virtual computing cluster based on Docker containers used as worker nodes. The facility is based on Plancton: a lightweight fire-and-forget background service. Plancton spawns and controls a local pool of Docker containers on a host with free resources, by constantly monitoring its CPU utilisation. It is designed to release the resources allocated opportunistically, whenever another demanding task is run by the host user, according to configurable policies. This is attained by killing a number of running containers. One of the advantages of a thin virtualization layer such as Linux containers is that they can be started almost instantly upon request. We will show how fast the start-up and disposal of containers eventually enables us to implement an opportunistic cluster based on Plancton daemons without a central control node, where the spawned Docker containers behave as job pilots. Finally, we will show how Plancton was configured to run up to 10 000 concurrent opportunistic jobs on the ALICE High-Level Trigger facility, by giving a considerable advantage in terms of management compared to virtual machines.
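
    The opportunistic "spawn containers while the host is idle" pattern described above can be sketched in a few lines. This is not the Plancton code: the image name, thresholds, polling interval and use of psutil and the docker CLI are illustrative assumptions, and a real deployment would also track and kill the containers it starts once the host becomes busy.

    ```python
    # Hedged sketch of an opportunistic container-spawning loop.
    import subprocess
    import time

    import psutil

    IMAGE = "busybox:latest"       # placeholder image standing in for a pilot-job container
    CPU_SPAWN_THRESHOLD = 30.0     # % utilisation below which the host is considered idle
    POLL_SECONDS = 10

    def spawn_worker():
        """Start one detached, self-removing container playing the role of a job pilot."""
        subprocess.run(["docker", "run", "-d", "--rm", IMAGE, "sleep", "300"], check=True)

    while True:
        if psutil.cpu_percent(interval=1) < CPU_SPAWN_THRESHOLD:
            spawn_worker()         # free cycles: add one more opportunistic worker
        time.sleep(POLL_SECONDS)
    ```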

  8. Computers in Education: An Overview. Publication Number One. Software Engineering/Education Cooperative Project.

    Science.gov (United States)

    Collis, Betty; Muir, Walter

    The first of four major sections in this report presents an overview of the background and evolution of computer applications to learning and teaching. It begins with the early attempts toward "automated teaching" of the 1920s, and the "teaching machines" of B. F. Skinner of the 1940s through the 1960s. It then traces the…

  9. Neuroradiology computer-assisted instruction using interactive videodisk: Pilot project

    International Nuclear Information System (INIS)

    Andrews, C.L.; Goldsmith, D.G.; Osborn, A.G.; Stensaas, S.S.; Davidson, H.C.; Quigley, A.C.

    1987-01-01

    The availability of microcomputers, high-resolution monitors, high-level authoring languages, and videodisk technology makes sophisticated neuroradiology instruction a cost-effective possibility. The authors developed a laser videodisk and interactive software to teach normal and pathologic gross and radiologic anatomy of the sellar/juxtasellar region. A spectrum of lesions is presented with information for differential diagnosis included. The exhibit permits conference participants to review the pilot module and experience the self-paced learning and self-evaluation possible with computer-assisted instruction. They also may choose to peruse a "visual database" by instant random access to the videodisk by hand control.

  10. Utilization of Relap 5 computer code for analyzing thermohydraulic projects

    International Nuclear Information System (INIS)

    Silva Filho, E.

    1987-01-01

    This work deals with the design of a scaled test facility of a typical pressurized water reactor plant of the 1300 MW (electric) class. A station blackout has been chosen to investigate the thermohydraulic behaviour of the test facility in comparison to the reactor plant. The computer code RELAP5/MOD1 has been utilized to simulate the blackout and to compare the test facility behaviour with that of the reactor plant. The results demonstrate similar thermohydraulic behaviours of the two systems. (author) [pt

  11. The Dark Side of Software Engineering Evil on Computing Projects

    CERN Document Server

    Rost, Johann

    2010-01-01

    Betrayal! Corruption! Software engineering? Industry experts Johann Rost and Robert L. Glass explore the seamy underbelly of software engineering in this timely report on and analysis of the prevalence of subversion, lying, hacking, and espionage on every level of software project management. Based on the authors' original research and augmented by frank discussion and insights from other well-respected figures, The Dark Side of Software Engineering goes where other management studies fear to tread -- a corporate environment where schedules are fabricated, trust is betrayed, millions of dollar

  12. Lessons from two Dutch projects for the introduction of computers in schools

    NARCIS (Netherlands)

    ten Brummelhuis, A.C.A.; Plomp, T.

    1993-01-01

    The systematic introduction of computers in schools for general secondary education in The Netherlands started in the early 1980s. Initially, the Dutch government experimented in 1983 with a project in 100 lower general secondary schools limited in scope to gain experience with educational computer

  13. A Project-Based Learning Approach to Programmable Logic Design and Computer Architecture

    Science.gov (United States)

    Kellett, C. M.

    2012-01-01

    This paper describes a course in programmable logic design and computer architecture as it is taught at the University of Newcastle, Australia. The course is designed around a major design project and has two supplemental assessment tasks that are also described. The context of the Computer Engineering degree program within which the course is…

  14. High Performance Parallel Processing Project: Industrial computing initiative. Progress reports for fiscal year 1995

    Energy Technology Data Exchange (ETDEWEB)

    Koniges, A.

    1996-02-09

    This project is a package of 11 individual CRADAs plus hardware. This innovative project established a three-year multi-party collaboration that is significantly accelerating the availability of commercial massively parallel processing computing software technology to U.S. government, academic, and industrial end-users. This report contains individual presentations from nine principal investigators along with overall program information.

  15. Image reconstruction from projections and its application in emission computer tomography

    International Nuclear Information System (INIS)

    Kuba, Attila; Csernay, Laszlo

    1989-01-01

    Computer tomography is an imaging technique for producing cross sectional images by reconstruction from projections. Its two main branches are called transmission and emission computer tomography, TCT and ECT, resp. After an overview of the theory and practice of TCT and ECT, the first Hungarian ECT type MB 9300 SPECT consisting of a gamma camera and Ketronic Medax N computer is described, and its applications to radiological patient observations are discussed briefly. (R.P.) 28 refs.; 4 figs
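
    The underlying principle, reconstruction of a cross-section from its projections, can be demonstrated with scikit-image's Radon transform utilities; the phantom, scaling and angle sampling below are arbitrary illustrative choices rather than anything specific to the MB 9300 SPECT system.

    ```python
    # Small demonstration of reconstruction from projections (the principle behind
    # both TCT and ECT) using filtered back-projection from scikit-image.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    image = rescale(shepp_logan_phantom(), scale=0.25)        # a small test cross-section
    theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)

    sinogram = radon(image, theta=theta)                      # simulate the projections
    reconstruction = iradon(sinogram, theta=theta)            # filtered back-projection (ramp filter)

    error = np.sqrt(np.mean((reconstruction - image) ** 2))
    print(f"RMS reconstruction error: {error:.3f}")
    ```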

  16. IMASIS computer-based medical record project: dealing with the human factor.

    Science.gov (United States)

    Martín-Baranera, M; Planas, I; Palau, J; Sanz, F

    1995-01-01

    level, problems to be solved in utilization of the system, errors detected in the systems' database, and the personal interest in participating in the IMASIS project. The questionnaire was also intended to be a tool to monitor IMASIS evolution. Our study showed that medical staff had a lack of information about the current HIS, leading to a poor utilization of some system options. Another major characteristic, related to the above, was the feeling that the project would negatively affect the organization of work at the hospitals. A computer-based medical record was feared to degrade the physician-patient relationship, introduce supplementary administrative burden in clinicians' day-to-day work, unnecessarily slow history taking, and imply too-rigid patterns of work. The most frequent problems in using the current system could be classified into two groups: problems related to lack of agility and consistency in user interface design, and those derived from lack of a common patient identification number. Duplication of medical records was the most frequent error detected by physicians. Analysis of physicians' attitudes towards IMASIS revealed a lack of confidence globally. This was probably the consequence of two current features: a lack of complete information about IMASIS possibilities and problems faced when using the system. To deal with such factors, three types of measures have been planned. First, an effort is to be made to ensure that every physician is able to adequately use the current system and understands long-term benefits of the project. This task will be better accomplished by personal interaction between clinicians and a physician from the Informatics Department than through formal teaching of IMASIS. Secondly, a protocol for evaluating the HIS is being developed and will be systematically applied to detect both database errors and system's design pitfalls. Finally, the IMASIS project has to find a convenient point for starting, to offer short-term re

  17. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Forest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bochev, Pavel B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cameron-Smith, Philip J.. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Easter, Richard C [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Scott M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ghan, Steven J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Xiaohong [Univ. of Wyoming, Laramie, WY (United States); Lowrie, Robert B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ma, Po-lun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sacks, William J. [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Shrivastava, Manish [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Singh, Balwinder [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tautges, Timothy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Taylor, Mark A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Vertenstein, Mariana [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Worley, Patrick H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  18. Evolution of and projections for automated composite material placement equipment in the aerospace industry

    Science.gov (United States)

    McCarville, Douglas A.

    2009-12-01

    As the commercial aircraft industry attempts to improve airplane fuel efficiency by shifting from aluminum to composites (reinforced plastics), there is a concern that composite processing equipment is not mature enough to meet increasing demand and that delivery delays and loss of high tech jobs could result. The research questions focused on the evolution of composite placement machines, improvement of machine functionality by equipment vendors, and the probability of new inventions helping to avoid production shortfalls. An extensive review of the literature found no studies that addressed these issues. Since the early twentieth century, exploratory case study of pivotal technological advances has been an accepted means of performing historic analysis and furthering understanding of rapidly changing marketplaces and industries. This qualitative case study investigated evolution of automated placement equipment by (a) codifying and mapping patent data (e.g., claims and functionality descriptions), (b) triangulating archival data (i.e., trade literature, vender Web sites, and scholarly texts), and (c) interviewing expert witnesses. An industry-level sensitivity model developed by the author showed that expanding the vendor base and increasing the number of performance enhancing inventions will most likely allow the industry to make the transition from aluminum to composites without schedule delays. This study will promote social change by (a) advancing individual and community knowledge (e.g., teaching modules for students, practitioners, and professional society members) and (b) providing an empirical model that will help in the understanding and projection of next generation composite processing equipment demand and productivity output.

  19. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    International Nuclear Information System (INIS)

    Bogdanov, A.V.; Yuzhanin, N.V.; Zolotarev, V.I.; Ezhakova, T.R.

    2017-01-01

    In this article, the problem of supporting scientific projects throughout their lifecycle in a computer center is considered in all its aspects. The Configuration Management system plays a connecting role in the processes related to the provision and support of computer center services. In view of the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  20. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Energy Technology Data Exchange (ETDEWEB)

    Brower, Richard [Boston U.; Christ, Norman [Columbia U.; DeTar, Carleton [Utah U.; Edwards, Robert [Jefferson Lab; Mackenzie, Paul [Fermilab

    2017-10-30

    In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020's. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  1. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Science.gov (United States)

    Brower, Richard; Christ, Norman; DeTar, Carleton; Edwards, Robert; Mackenzie, Paul

    2018-03-01

    In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020's. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  2. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Directory of Open Access Journals (Sweden)

    Brower Richard

    2018-01-01

    Full Text Available In October, 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020’s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  3. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    Science.gov (United States)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    In this article, the problem of supporting scientific projects throughout their lifecycle in a computer center is considered in all its aspects. The Configuration Management system plays a connecting role in the processes related to the provision and support of computer center services. In view of the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  4. Computer-aided assessment in statistics: the CAMPUS project

    Directory of Open Access Journals (Sweden)

    Neville Hunt

    1998-12-01

    Full Text Available The relentless drive for 'efficiency' in higher education, and the consequent increase in workloads, has given university teachers a compelling incentive to investigate alternative forms of assessment. Some forms of assessment with a clear educational value can no longer be entertained because of the burden placed on the teacher. An added concern is plagiarism, which anecdotal evidence would suggest is on the increase yet which is difficult to detect in large modules with more than one assessor. While computer-aided assessment (CAA) has an enthusiastic following, it is not clear to many teachers that it either reduces workloads or reduces the risk of cheating. In an ideal world, most teachers would prefer to give individual attention and personal feedback to each student when marking their work. In this sense CAA must be seen as second best and will therefore be used only if it is seen to offer significant benefits in terms of reduced workloads or increased validity.

  5. Multidisciplinary Aerospace Systems Optimization: Computational AeroSciences (CAS) Project

    Science.gov (United States)

    Kodiyalam, S.; Sobieski, Jaroslaw S. (Technical Monitor)

    2001-01-01

    The report describes a method for performing optimization of a system whose analysis is so expensive that it is impractical to let the optimization code invoke it directly, because excessive computational cost and elapsed time might result. In such a situation it is imperative to let the user control the number of times the analysis is invoked. The reported method achieves this with two techniques in the Design of Experiments category: a uniform dispersal of the trial design points over an n-dimensional hypersphere combined with response surface fitting, and the technique of kriging. Analyses of all the trial designs, whose number may be set by the user, are performed before activation of the optimization code and the results are stored as a database. That code is then executed and refers to the above database. Two applications, one to an airborne laser system and one to an aircraft optimization, illustrate the application of the method.
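
    As a rough illustration of the surrogate-based workflow summarized above (not the reported implementation), the Python sketch below samples trial designs uniformly on an n-dimensional hypersphere, runs a hypothetical expensive_analysis once per trial, fits a simple Gaussian radial-basis response surface as a stand-in for kriging, and then lets the optimizer query only the cheap surrogate. All function names and parameters are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def expensive_analysis(x):
        """Hypothetical stand-in for the costly system analysis."""
        return np.sum((x - 0.3) ** 2) + 0.1 * np.sin(5 * x).sum()

    rng = np.random.default_rng(0)
    n_dim, n_trials, radius = 4, 60, 1.0

    # 1. Uniform dispersal of trial design points over an n-dimensional hypersphere surface.
    g = rng.standard_normal((n_trials, n_dim))
    points = radius * g / np.linalg.norm(g, axis=1, keepdims=True)

    # 2. Run the expensive analysis once per trial design and store the results ("database").
    values = np.array([expensive_analysis(p) for p in points])

    # 3. Fit a Gaussian radial-basis surrogate (a simple stand-in for kriging).
    s = 0.5 * radius
    def phi(a, b):
        d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2 * s ** 2))
    weights = np.linalg.solve(phi(points, points) + 1e-8 * np.eye(n_trials), values)
    surrogate = lambda x: phi(np.atleast_2d(x), points) @ weights

    # 4. The optimizer queries only the cheap surrogate, never the expensive analysis.
    best = min((minimize(lambda x: float(surrogate(x)), p) for p in points[:10]),
               key=lambda r: r.fun)
    print("surrogate optimum near", np.round(best.x, 3))
    ```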

  6. Handheld computers in nursing education: PDA pilot project.

    Science.gov (United States)

    Koeniger-Donohue, Rebecca

    2008-02-01

    Interest in the use and application of handheld technology at undergraduate and graduate nursing programs across the country is growing rapidly. Personal digital assistants (PDAs) are often referred to as a "peripheral brain" because they can save time, decrease errors, and simplify information retrieval at the point of care. In addition, research results support the notion that PDAs enhance nursing clinical education and are an effective student learning resource. However, most nursing programs lack the full range of technological resources to implement and provide ongoing support for handheld technology use by faculty and students. This article describes a 9-month pilot project for the initial use of PDAs by novice faculty and students at Simmons College.

  7. Exploitation of cloud computing in management of construction projects in Slovakia

    Directory of Open Access Journals (Sweden)

    Mandičák Tomáš

    2016-12-01

    Full Text Available Cloud computing is a highly topical issue. It represents a new model for information technology (IT) services based on the exploitation of the Web (the "cloud") and other application platforms, as well as software as a service. In general, the exploitation of cloud computing in construction project management has several advantages, as demonstrated by several research reports. Currently, research quantifying the exploitation of cloud computing in the Slovak construction industry has not yet been carried out. The article discusses the exploitation of cloud computing in construction project management in Slovakia. The main objective of the research is to confirm whether factors such as the size of the construction enterprise, the owner of the construction enterprise and the participant in the construction project have any impact on the level of exploitation of cloud computing in construction project management. This includes confirmation of differences in use between different participants in the construction project, and between construction enterprises broken down by size and shareholders.
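
    The abstract does not say which statistical test was applied; as a hedged sketch of how one might check whether the exploitation level differs by enterprise size, a nonparametric Kruskal-Wallis test on hypothetical Likert-scale survey scores could look like this.

    ```python
    from scipy import stats

    # Hypothetical 1-5 Likert scores of cloud-computing exploitation, grouped by enterprise size.
    small_firms  = [2, 1, 3, 2, 2, 1, 3, 2]
    medium_firms = [3, 3, 2, 4, 3, 2, 3, 4]
    large_firms  = [4, 3, 5, 4, 4, 3, 5, 4]

    # Kruskal-Wallis H-test: do the three groups share the same distribution of scores?
    h, p = stats.kruskal(small_firms, medium_firms, large_firms)
    print(f"H = {h:.2f}, p = {p:.4f}")  # a small p-value would indicate size-dependent exploitation
    ```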

  8. History and evolution of the pharmacophore concept in computer-aided drug design.

    Science.gov (United States)

    Güner, Osman F

    2002-12-01

    With computer-aided drug design established as an integral part of the lead discovery and optimization process, pharmacophores have become a focal point for conceptualizing and understanding receptor-ligand interactions. In the structure-based design process, pharmacophores can be used to align molecules based on the three-dimensional arrangement of chemical features or to develop predictive models (e.g., 3D-QSAR) that correlate with the experimental activities of a given training set. Pharmacophores can be also used as search queries for retrieving potential leads from structural databases, for designing molecules with specific desired attributes, or as fingerprints for assessing similarity and diversity of molecules. This review article presents a historical perspective on the evolution and use of the pharmacophore concept in the pharmaceutical, biotechnology, and fragrances industry with published examples of how the technology has contributed and advanced the field.

  9. Cloud computing task scheduling strategy based on differential evolution and ant colony optimization

    Science.gov (United States)

    Ge, Junwei; Cai, Yu; Fang, Yiqiu

    2018-05-01

    This paper proposes DEACO, a task scheduling strategy based on the combination of Differential Evolution (DE) and Ant Colony Optimization (ACO). Aiming at the single-objective limitation of most cloud computing task scheduling, it combines the shortest task completion time, cost and load balancing. DEACO uses the solution found by DE to initialize the pheromone of ACO, which reduces the time spent accumulating pheromone in the early ACO iterations, and it improves the pheromone updating rule through a load factor. The proposed algorithm is simulated on CloudSim and compared with min-min and ACO. The experimental results show that DEACO is superior in terms of time, cost, and load.
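
    A minimal sketch of the DE-seeds-ACO idea (not the authors' implementation): scipy's differential evolution first finds a good task-to-VM assignment, that assignment biases the initial pheromone matrix, and a simple ant colony loop then refines it against a makespan objective. Task lengths, VM speeds and all algorithm parameters are made up.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    rng = np.random.default_rng(0)
    n_tasks, n_vms = 20, 4
    lengths = rng.uniform(10, 100, n_tasks)      # hypothetical task lengths (MI)
    speeds = rng.uniform(1, 4, n_vms)            # hypothetical VM speeds (MIPS)

    def makespan(assign):
        """Completion time of the slowest VM for a task->VM assignment."""
        load = np.zeros(n_vms)
        for task, vm in enumerate(assign):
            load[vm] += lengths[task] / speeds[vm]
        return load.max()

    # Stage 1: differential evolution over continuous codes, decoded to discrete assignments.
    decode = lambda x: np.clip(x.astype(int), 0, n_vms - 1)
    de = differential_evolution(lambda x: makespan(decode(x)),
                                bounds=[(0, n_vms)] * n_tasks, seed=0, maxiter=50)
    seed_assign = decode(de.x)

    # Stage 2: ACO whose initial pheromone is biased toward the DE solution.
    alpha, beta, rho, n_ants, n_iter = 1.0, 2.0, 0.1, 15, 40
    eta = np.array([[speeds[v] / lengths[t] for v in range(n_vms)] for t in range(n_tasks)])
    tau = np.ones((n_tasks, n_vms))
    tau[np.arange(n_tasks), seed_assign] += 4.0   # DE result seeds the pheromone trail

    best_assign, best_cost = seed_assign, makespan(seed_assign)
    for _ in range(n_iter):
        probs = (tau ** alpha) * (eta ** beta)
        probs /= probs.sum(axis=1, keepdims=True)
        for _ant in range(n_ants):
            assign = np.array([rng.choice(n_vms, p=probs[t]) for t in range(n_tasks)])
            cost = makespan(assign)
            if cost < best_cost:
                best_assign, best_cost = assign, cost
        tau *= (1 - rho)                                         # evaporation
        tau[np.arange(n_tasks), best_assign] += 1.0 / best_cost  # deposit on best-so-far

    print(f"DE seed makespan: {makespan(seed_assign):.1f}, DEACO makespan: {best_cost:.1f}")
    ```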

  10. Evolution of extreme temperature events in short term climate projection for Iberian Peninsula.

    Science.gov (United States)

    Rodriguez, Alfredo; Tarquis, Ana M.; Sanchez, Enrique; Dosio, Alessandro; Ruiz-Ramos, Margarita

    2014-05-01

    Extreme events of maximum and minimum temperatures are a main hazard for agricultural production in the Iberian Peninsula. In this study we analyze projections of their evolution that could be valid for the next decade, represented here by the 30-year period 2004-2034 (target period). Two kinds of data were used: 1) observations from the station network of AEMET (Spanish National Meteorological Agency) for five Spanish locations, and 2) simulated data on a 50 × 50 km horizontal grid derived from the outputs of twelve Regional Climate Models (RCMs) taken from the ENSEMBLES project (van der Linden and Mitchell, 2009), with a bias correction (Dosio and Paruolo, 2011; Dosio et al., 2012) against the observational dataset Spain02 (Herrera et al., 2012). To validate the simulated climate, the available period of observations was compared to a baseline period (1964-1994) of simulated climate for all locations. Then, to analyze the changes for the present/very near future, probabilities of extreme temperature events for 2004-2034 were compared to those of the baseline period. Although only minor changes are expected, small variations in variability may have a significant impact on crop performance. The objective of the work is to evaluate the utility of these short term projections for potential users, for instance insurance companies. References: Dosio A. and Paruolo P., 2011. Bias correction of the ENSEMBLES high-resolution climate change projections for use by impact models: Evaluation on the present climate. Journal of Geophysical Research, 116, D16106, doi:10.1029/2011JD015934. Dosio A., Paruolo P. and Rojas R., 2012. Bias correction of the ENSEMBLES high resolution climate change projections for use by impact models: Analysis of the climate change signal. Journal of Geophysical Research, 117, D17, doi:10.1029/2012JD017968. Herrera et al. (2012) Development and Analysis of a 50 year high

  11. Computability, Gödel's incompleteness theorem, and an inherent limit on the predictability of evolution.

    Science.gov (United States)

    Day, Troy

    2012-04-07

    The process of evolutionary diversification unfolds in a vast genotypic space of potential outcomes. During the past century, there have been remarkable advances in the development of theory for this diversification, and the theory's success rests, in part, on the scope of its applicability. A great deal of this theory focuses on a relatively small subset of the space of potential genotypes, chosen largely based on historical or contemporary patterns, and then predicts the evolutionary dynamics within this pre-defined set. To what extent can such an approach be pushed to a broader perspective that accounts for the potential open-endedness of evolutionary diversification? There have been a number of significant theoretical developments along these lines but the question of how far such theory can be pushed has not been addressed. Here a theorem is proven demonstrating that, because of the digital nature of inheritance, there are inherent limits on the kinds of questions that can be answered using such an approach. In particular, even in extremely simple evolutionary systems, a complete theory accounting for the potential open-endedness of evolution is unattainable unless evolution is progressive. The theorem is closely related to Gödel's incompleteness theorem, and to the halting problem from computability theory.

  12. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Directory of Open Access Journals (Sweden)

    Nicholas V Olijnyk

    Full Text Available This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology

  13. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Science.gov (United States)

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers
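
    A minimal sketch of co-word cluster network analysis, with invented keyword lists standing in for the QC publication metadata: keywords that co-occur in a record are linked, edge weights count co-occurrences, and modularity-based community detection splits the network into candidate themes.

    ```python
    import itertools
    import networkx as nx
    from networkx.algorithms import community

    # Hypothetical author-keyword lists, one list per publication record.
    records = [
        ["quantum-key-distribution", "photons", "decoy-state"],
        ["quantum-key-distribution", "network-protocols", "secret-messages"],
        ["quantum-entanglement", "bell-state", "photons"],
        ["quantum-entanglement", "decoy-state", "unitary-operation"],
        ["network-protocols", "attack-strategies", "secret-messages"],
    ]

    # Build the weighted co-word network: an edge for every within-record keyword pair.
    G = nx.Graph()
    for keywords in records:
        for a, b in itertools.combinations(sorted(set(keywords)), 2):
            if G.has_edge(a, b):
                G[a][b]["weight"] += 1
            else:
                G.add_edge(a, b, weight=1)

    # Modularity-based clustering groups keywords into candidate research themes.
    themes = community.greedy_modularity_communities(G, weight="weight")
    for i, theme in enumerate(themes, 1):
        print(f"theme {i}: {sorted(theme)}")
    ```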

  14. Research Progress in Mathematical Analysis of Map Projection by Computer Algebra

    Directory of Open Access Journals (Sweden)

    BIAN Shaofeng

    2017-10-01

    Full Text Available Map projection is an important component of modern cartography, and it involves many intricate mathematical analysis processes, such as the power series expansions of elliptic functions, differentiation of complex and implicit functions, elliptic integrals and operations on complex numbers. Deriving these results by hand not only consumes much time and energy but also easily introduces mistakes, and sometimes cannot be accomplished at all because of the overwhelming complexity. The research achievements in the mathematical analysis of map projection by computer algebra are systematically reviewed in five respects: the symbolic expressions of the forward and inverse solutions of ellipsoidal latitudes, the direct transformations between map projections with different distortion properties, expressions of the Gauss projection by complex functions, mathematical analysis of the oblique Mercator projection, and the polar chart projection with its transformations. The main problems that need to be further solved in this research field are analyzed. This review should help promote the development of map projection.
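
    One of the tasks listed above, the power-series treatment of ellipsoidal integrals, is easy to reproduce in miniature with a general-purpose computer algebra system. The sketch below (not the authors' code) expands the meridian-arc integrand of an ellipsoid in the eccentricity e and integrates the truncated series term by term, recovering the kind of series used in Gauss-type projections.

    ```python
    import sympy as sp

    phi, e = sp.symbols("phi e", positive=True)

    # Meridian-arc integrand of an ellipsoid with unit semi-major axis:
    # (1 - e^2) * (1 - e^2 * sin^2(phi))^(-3/2)
    integrand = (1 - e**2) * (1 - e**2 * sp.sin(phi)**2) ** sp.Rational(-3, 2)

    # Expand in powers of the eccentricity and integrate the truncated series term by term.
    truncated = sp.series(integrand, e, 0, 7).removeO()
    arc = sp.integrate(sp.expand(truncated), phi)

    # Collect by powers of e; further simplification rewrites the result with sin(2*phi), sin(4*phi), ...
    print(sp.collect(sp.expand(arc), e))
    ```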

  15. Evolution of the ATLAS Distributed Computing during the LHC long shutdown

    CERN Document Server

    Campana, S; The ATLAS collaboration

    2013-01-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the WLCG distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileu...

  16. Evolution of the ATLAS Distributed Computing system during the LHC Long shutdown

    CERN Document Server

    Campana, S; The ATLAS collaboration

    2014-01-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the WLCG distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileu...

  17. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge (CSA06) last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  18. The Evolution of Culture-Climate Interplay in Temporary Multi-Organisations: The Case of Construction Alliancing Projects

    OpenAIRE

    Kusuma, I. C.

    2016-01-01

    Organisational culture has been a long-standing debate in management research. However, in the field of construction project management, it is relatively under-explored. This is mainly due to the different organisational context of Temporary Multi-Organisations (TMOs). This research re-explores the notion of organisational culture in construction projects. Based on Darwin's theory of evolution, this research goes back to the very beginning, illustrating the exact meaning and dynamics of organi...

  19. Computer Technology-Integrated Projects Should Not Supplant Craft Projects in Science Education

    Science.gov (United States)

    Klopp, Tabatha J.; Rule, Audrey C.; Schneider, Jean Suchsland; Boody, Robert M.

    2014-01-01

    The current emphasis on computer technology integration and narrowing of the curriculum has displaced arts and crafts. However, the hands-on, concrete nature of craft work in science modeling enables students to understand difficult concepts and to be engaged and motivated while learning spatial, logical, and sequential thinking skills. Analogy…

  20. Evaluating Students' Perceptions and Attitudes toward Computer-Mediated Project-Based Learning Environment: A Case Study

    Science.gov (United States)

    Seet, Ling Ying Britta; Quek, Choon Lang

    2010-01-01

    This research investigated 68 secondary school students' perceptions of their computer-mediated project-based learning environment and their attitudes towards Project Work (PW) using two instruments--Project Work Classroom Learning Environment Questionnaire (PWCLEQ) and Project Work Related Attitudes Instrument (PWRAI). In this project-based…

  1. Reservoir computer predictions for the Three Meter magnetic field time evolution

    Science.gov (United States)

    Perevalov, A.; Rojas, R.; Lathrop, D. P.; Shani, I.; Hunt, B. R.

    2017-12-01

    The source of the Earth's magnetic field is the turbulent flow of liquid metal in the outer core. Our experiment's goal is to create an Earth-like dynamo, to explore the mechanisms and to understand the dynamics of the magnetic and velocity fields. Since it is a complicated system, prediction of the magnetic field is a challenging problem. We present results of mimicking the Three Meter experiment with a reservoir computer, a deep learning algorithm. The experiment consists of a three-meter diameter outer sphere and a one-meter diameter inner sphere, with the gap filled with liquid sodium. The spheres can rotate at up to 4 and 14 Hz respectively, giving a Reynolds number near 10^8. Two external electromagnets apply magnetic fields, while an array of 31 external and 2 internal Hall sensors measures the resulting induced fields. We use this magnetic probe data to train a reservoir computer to predict the 3M time evolution and mimic waves in the experiment. Surprisingly accurate predictions can be made for several magnetic dipole time scales. This shows that the behavior of such a complicated MHD system can be predicted. We gratefully acknowledge support from NSF EAR-1417148.
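
    A minimal echo-state-network (reservoir computer) sketch in the spirit of the abstract, with synthetic signals standing in for the Hall-probe data: a fixed random reservoir is driven by the measurements, a linear readout is fitted by ridge regression, and the trained model is then run freely to forecast subsequent samples. Reservoir size, signals and hyperparameters are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, n_in, n_res = 4000, 8, 300          # samples, "probe" channels, reservoir size (assumed)

    # Synthetic multichannel signal standing in for the magnetic-probe measurements.
    t = np.arange(T) * 0.01
    u = np.stack([np.sin((k + 1) * 0.7 * t + 0.3 * k) for k in range(n_in)], axis=1)

    # Fixed random reservoir with spectral radius < 1 (echo-state property).
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.normal(0, 1, (n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

    # Drive the reservoir and collect states; the target is the next measurement.
    x, states, targets = np.zeros(n_res), [], []
    for k in range(T - 1):
        x = np.tanh(W @ x + W_in @ u[k])
        states.append(x)
        targets.append(u[k + 1])
    X, Y = np.array(states), np.array(targets)

    # Ridge-regression readout (the only trained part of the reservoir computer).
    lam = 1e-6
    W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ Y).T

    # Free-running forecast: feed the model's own predictions back as input.
    u_k, preds = u[-1], []
    for _ in range(200):
        x = np.tanh(W @ x + W_in @ u_k)
        u_k = W_out @ x
        preds.append(u_k)
    print("forecast shape:", np.array(preds).shape)
    ```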

  2. Evolution of brain-computer interfaces: going beyond classic motor physiology

    Science.gov (United States)

    Leuthardt, Eric C.; Schalk, Gerwin; Roland, Jarod; Rouse, Adam; Moran, Daniel W.

    2010-01-01

    The notion that a computer can decode brain signals to infer the intentions of a human and then enact those intentions directly through a machine is becoming a realistic technical possibility. These types of devices are known as brain-computer interfaces (BCIs). The evolution of these neuroprosthetic technologies could have significant implications for patients with motor disabilities by enhancing their ability to interact and communicate with their environment. The cortical physiology most investigated and used for device control has been brain signals from the primary motor cortex. To date, this classic motor physiology has been an effective substrate for demonstrating the potential efficacy of BCI-based control. However, emerging research now stands to further enhance our understanding of the cortical physiology underpinning human intent and provide further signals for more complex brain-derived control. In this review, the authors report the current status of BCIs and detail the emerging research trends that stand to augment clinical applications in the future. PMID:19569892

  3. Evolution of the ATLAS data and computing model for a Tier2 in the EGI infrastructure

    CERN Document Server

    Fernández Casaní, A; The ATLAS collaboration; González de la Hoz, S; Salt Cairols, J; Fassi, F; Kaci, M; Lamas, A; Oliver, E; Sánchez, J; Sánchez, V

    2012-01-01

    Since the start of the LHC pp collisions in 2010, the ATLAS computing model has moved from a stricter design, where every Tier2 had a liaison and a network dependence on a Tier1, to a more meshed approach where every cloud can be connected. The evolution of the ATLAS data models requires changes in ATLAS Tier2 policy for data replication, dynamic data caching and remote data access. It also requires rethinking the network infrastructure to enable any Tier2 and associated Tier3 to easily connect to any Tier1 or Tier2. Tier2s are becoming more and more important in the ATLAS computing model as they allow more data to be readily accessible for analysis jobs by all users, independently of their geographical location. The Tier2 disk space has been reserved for real, simulated, calibration and alignment, group, and user data. A buffer disk space is needed for input and output data for simulation jobs. Tier2s are going to be used more efficiently. In this way Tier1s and Tier2s are becoming more equivalent for t...

  4. Education as an Agent of Social Evolution: The Educational Projects of Patrick Geddes in Late-Victorian Scotland

    Science.gov (United States)

    Sutherland, Douglas

    2009-01-01

    This paper examines the educational projects of Patrick Geddes in late-Victorian Scotland. Initially a natural scientist, Geddes drew on an eclectic mix of social theory to develop his own ideas on social evolution. For him education was a vital agent of social change which, he believed, had the potential to develop active citizens whose…

  5. The SILCC (SImulating the LifeCycle of molecular Clouds) project - I. Chemical evolution of the supernova-driven ISM

    Czech Academy of Sciences Publication Activity Database

    Walch, S.; Girichidis, P.; Naab, T.; Gatto, A.; Glover, S.C.O.; Wünsch, Richard; Klessen, R.S.; Clark, P.C.; Peters, T.; Derigs, D.; Baczynski, C.

    2015-01-01

    Vol. 454, No. 1 (2015), pp. 238-268, ISSN 0035-8711. R&D Projects: GA ČR GAP209/12/1795. Institutional support: RVO:67985815. Keywords: magnetohydrodynamics * ISM clouds * ISM evolution. Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics. Impact factor: 4.952, year: 2015

  6. Evolution of spatio-temporal drought characteristics: validation, projections and effect of adaptation scenarios

    Science.gov (United States)

    Vidal, J.-P.; Martin, E.; Kitova, N.; Najac, J.; Soubeyroux, J.-M.

    2012-08-01

    Drought events develop in both space and time and they are therefore best described through summary joint spatio-temporal characteristics, such as mean duration, mean affected area and total magnitude. This paper addresses future projections of such characteristics of drought events over France through three main research questions: (1) Are downscaled climate projections able to simulate the spatio-temporal characteristics of meteorological and agricultural droughts in France over a present-day period? (2) How will such characteristics evolve over the 21st century? (3) How can standardized drought indices be used to represent theoretical adaptation scenarios? These questions are addressed using the Isba land surface model, downscaled climate projections from the ARPEGE General Circulation Model under three emissions scenarios, as well as results from a previously performed 50-yr multilevel and multiscale drought reanalysis over France. Spatio-temporal characteristics of meteorological and agricultural drought events are computed using the Standardized Precipitation Index and the Standardized Soil Wetness Index, respectively, for time scales of 3 and 12 months. Results first show that the distributions of joint spatio-temporal characteristics of observed events are well simulated by the downscaled hydroclimate projections over a present-day period. All spatio-temporal characteristics of drought events are then found to increase dramatically over the 21st century, with stronger changes for agricultural droughts. Two theoretical adaptation scenarios are finally built based on hypotheses of adaptation to evolving climate and hydrological normals, either retrospective or prospective. The perceived spatio-temporal characteristics of drought events derived from these theoretical adaptation scenarios show much smaller changes, but they call for more realistic scenarios at both the catchment and national scale in order to accurately assess the combined effect of
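
    As a hedged illustration of the drought index underlying the analysis, the sketch below computes a 3-month Standardized Precipitation Index from a synthetic monthly series: precipitation is aggregated over 3-month windows, a gamma distribution is fitted, the fitted cumulative probabilities are mapped to a standard normal variate, and months below a common threshold are grouped into drought events. The data and the -1.0 threshold are illustrative choices, not values from the study.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=30 * 12)  # hypothetical 30-year record (mm)

    # 3-month accumulation (SPI-3); SPI-12 would use a 12-month window.
    k = 3
    agg = np.convolve(monthly_precip, np.ones(k), mode="valid")

    # Fit a gamma distribution to the accumulations and map probabilities to a standard normal.
    shape, loc, scale = stats.gamma.fit(agg, floc=0)
    cdf = np.clip(stats.gamma.cdf(agg, shape, loc=loc, scale=scale), 1e-6, 1 - 1e-6)
    spi3 = stats.norm.ppf(cdf)

    # Flag months under a common "moderate drought" threshold and measure event durations.
    dry = spi3 < -1.0
    durations, run = [], 0
    for flag in dry:
        if flag:
            run += 1
        elif run:
            durations.append(run)
            run = 0
    if run:
        durations.append(run)
    print(f"drought months: {int(dry.sum())}, mean event duration: {np.mean(durations):.1f} months")
    ```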

  7. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithms improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving an enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the ATLAS Computing Agora (ACA) web platform will be presented as well as some of the specific material developed for some of the projects.

  8. Relevance of East African Drill Cores to Human Evolution: the Case of the Olorgesailie Drilling Project

    Science.gov (United States)

    Potts, R.

    2016-12-01

    Drill cores reaching the local basement of the East African Rift were obtained in 2012 south of the Olorgesailie Basin, Kenya, 20 km from excavations that document key benchmarks in the origin of Homo sapiens. Sediments totaling 216 m were obtained from two drilling locations representing the past 1 million years. The cores were acquired to build a detailed environmental record spatially associated with the transition from Acheulean to Middle Stone Age technology and extensive turnover in mammalian species. The project seeks precise tests of how climate dynamics and tectonic events were linked with these transitions. Core lithology (A.K. Behrensmeyer), geochronology (A. Deino), diatoms (R.B. Owen), phytoliths (R. Kinyanjui), geochemistry (N. Rabideaux, D. Deocampo), among other indicators, show evidence of strong environmental variability in agreement with predicted high-eccentricity modulation of climate during the evolutionary transitions. Increase in hominin mobility, elaboration of symbolic behavior, and concurrent turnover in mammalian species indicating heightened adaptability to unpredictable ecosystems, point to a direct link between the evolutionary transitions and the landscape dynamics reflected in the Olorgesailie drill cores. For paleoanthropologists and Earth scientists, any link between evolutionary transitions and environmental dynamics requires robust evolutionary datasets pertinent to how selection, extinction, population divergence, and other evolutionary processes were impacted by the dynamics uncovered in drill core studies. Fossil and archeological data offer a rich source of data and of robust environment-evolution explanations that must be integrated into efforts by Earth scientists who seek to examine high-resolution climate records of human evolution. Paleoanthropological examples will illustrate the opportunities that exist for connecting evolutionary benchmarks to the data obtained from drilled African muds. Project members: R. Potts, A

  9. The QUANTGRID Project (RO)—Quantum Security in GRID Computing Applications

    Science.gov (United States)

    Dima, M.; Dulea, M.; Petre, M.; Petre, C.; Mitrica, B.; Stoica, M.; Udrea, M.; Sterian, R.; Sterian, P.

    2010-01-01

    The QUANTGRID Project, financed through the National Center for Programme Management (CNMP-Romania), is the first attempt at using Quantum Crypted Communications (QCC) in large scale operations, such as GRID computing, and conceivably in the years ahead in the banking sector and other security-tight communications. In relation with the GRID activities of the Center for Computing & Communications (Nat.'l Inst. Nucl. Phys. - IFIN-HH), the Quantum Optics Lab. (Nat.'l Inst. Plasma and Lasers - INFLPR) and the Physics Dept. (University Polytechnica - UPB), the project will build a demonstrator infrastructure for this technology. The status of the project in its incipient phase is reported, featuring tests for communications in classical security mode: socket-level communications under AES (Advanced Encryption Standard), both with proprietary C++ code. An outline of the planned undertaking of the project is given, highlighting its impact on quantum physics, coherent optics and information technology.

  10. LightKone Project: Lightweight Computation for Networks at the Edge

    OpenAIRE

    Van Roy, Peter; TEKK Tour Digital Wallonia

    2017-01-01

    LightKone combines two recent advances in distributed computing to enable general-purpose computing on edge networks: (1) synchronization-free programming, in which large-scale applications run efficiently on edge networks by using convergent data structures (based on Lasp and Antidote from the previous project SyncFree), tolerating the dynamicity and loose coupling of edge networks; and (2) hybrid gossip, in which communication is made highly resilient on edge networks by combining gossip with classical distributed al...

  11. Computational issues in alternating projection algorithms for fixed-order control design

    DEFF Research Database (Denmark)

    Beran, Eric Bengt; Grigoriadis, K.

    1997-01-01

    Alternating projection algorithms have been introduced recently to solve fixed-order controller design problems described by linear matrix inequalities and non-convex coupling rank constraints. In this work, extensive numerical experimentation using proposed benchmark fixed-order control design examples is used to indicate the computational efficiency of the method. These results indicate that the proposed alternating projections are effective in obtaining low-order controllers for small and medium order problems.
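
    A toy sketch of an alternating projection iteration of the kind referred to above (not one of the benchmark problems): the iterate is projected back and forth between a convex set, here a single affine constraint standing in for the LMI feasible set, and the non-convex set of matrices of rank at most r, handled by SVD truncation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, r = 8, 2                      # matrix size and target rank (assumed)
    A = rng.standard_normal((n, n))  # defines the affine constraint <A, X>_F = b
    b = 3.0

    def project_affine(X):
        """Projection onto the hyperplane {X : <A, X>_F = b} (stand-in for the convex LMI set)."""
        return X - (np.tensordot(A, X) - b) / np.tensordot(A, A) * A

    def project_rank(X):
        """Projection onto {X : rank(X) <= r} by truncating the SVD."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s[r:] = 0.0
        return (U * s) @ Vt

    X = rng.standard_normal((n, n))
    for _ in range(200):                 # alternate between the two sets
        X = project_rank(project_affine(X))

    print("rank:", np.linalg.matrix_rank(X),
          "affine residual:", abs(np.tensordot(A, X) - b))
    ```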

  12. Early-state damage detection, characterization, and evolution using high-resolution computed tomography

    Science.gov (United States)

    Grandin, Robert John

    Safely using materials in high performance applications requires adequately understanding the mechanisms which control the nucleation and evolution of damage. Most of a material's operational life is spent in a state of noncritical damage; in metals, for example, only a small portion of the life falls within the classical Paris-law regime of crack growth. Developing proper structural health and prognosis models requires understanding the behavior of damage in these early stages of the material's life, and this early-stage damage occurs on length scales at which the material may be considered "granular" in the sense that the discrete regions which comprise the whole are large enough to require special consideration. Material performance depends upon the characteristics of the granules themselves as well as the interfaces between granules. As a result, properly studying early-stage damage in complex, granular materials requires a means to characterize changes in the granules and interfaces. The granular scale can range from tenths of microns in ceramics, to single microns in fiber-reinforced composites, to tens of millimeters in concrete. The difficulty of direct study is often overcome by exhaustive testing of macro-scale damage caused by gross material loads and abuse. Such testing, for example by optical or electron microscopy, is destructive and, further, is costly when used to study the evolution of damage within a material, and it often limits the study to a few snapshots. New developments in high-resolution computed tomography (HRCT) provide the necessary spatial resolution to directly image the granule length scale of many materials. Successful application of HRCT to fiber-reinforced composites, however, requires extending HRCT performance beyond current limits. This dissertation discusses improvements made in the field of CT reconstruction which enable resolutions to be pushed to the point of being able to image the fiber-scale damage structures and

  13. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    1992-05-01

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January, 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000 fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult with the authors directly for questions or comments about their papers

  14. Evolution of project-based learning in small groups in environmental engineering courses

    Directory of Open Access Journals (Sweden)

    Jesús M. Requies

    2018-03-01

    Full Text Available This work presents the assessment of the development and evolution of an active methodology (Project-Based Learning, PBL) implemented in the course "Unit Operations in Environmental Engineering", within the bachelor's degree in Environmental Engineering, with the purpose of decreasing the dropout rate in this course. After the initial design and implementation of this methodology during the first academic year (12/13), different modifications were adopted in the following ones (13-14, 14-15 & 15-16) in order to optimize the students' and professor's workload as well as to correct some malfunctions observed in the initial design of the PBL. This active methodology seeks to make students the main architects of their own learning processes. Accordingly, they have to identify their learning needs, which is a highly motivating approach both for their curricular development and for attaining the required learning outcomes in this field of knowledge. The results obtained show that working in small teams (cooperative work) enhances each group member's self-learning capabilities. Moreover, academic marks improve when compared to traditional learning methodologies. Nevertheless, the implementation of more active methodologies, such as project-based learning, in small groups has certain specific characteristics. In this case it has been implemented simultaneously in two different groups of 10 students each. Such small groups are more heterogeneous, since the presence or absence of a couple of highly motivated students can affect the whole group's attitude and academic results.

  15. LHC Computing Grid Project Launches into Action with International Support. A thousand times more computing power by 2006

    CERN Multimedia

    2001-01-01

    The first phase of the LHC Computing Grid project was approved at an extraordinary meeting of the Council on 20 September 2001. CERN is preparing for the unprecedented avalanche of data that will be produced by the Large Hadron Collider experiments. A thousand times more computer power will be needed by 2006! CERN's need for a dramatic advance in computing capacity is urgent. As from 2006, the four giant detectors observing trillions of elementary particle collisions at the LHC will accumulate over ten million Gigabytes of data, equivalent to the contents of about 20 million CD-ROMs, each year of its operation. A thousand times more computing power will be needed than is available to CERN today. The strategy the collaborations have adopted to analyse and store this unprecedented amount of data is the coordinated deployment of Grid technologies at hundreds of institutes which will be able to search out and analyse information from an interconnected worldwide grid of tens of thousands of computers and storag...

  16. The harmonic distortion evolution of current in computers; A evolucao da distorcao harmonica de corrente em computadores

    Energy Technology Data Exchange (ETDEWEB)

    Bollen, Math; Larsson, Anders; Lundmark, Martin [Universidade de Tecnologia de Lulea (LTU) (Sweden); Wahlberg, Mats; Roennberg, Sarah [Skelleftea Kraft (Sweden)

    2010-05-15

    This project performed supply-side measurements on large groups of computers during gaming events between 2002 and 2008, including the current magnitude in each phase and in the neutral conductor, the energy consumption and the harmonic spectrum. The results presented show that harmonic distortion has decreased significantly, while the energy consumption per computer has not increased appreciably.
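
    The quantity tracked in such measurements, the total harmonic distortion of the current, can be estimated from a sampled waveform with a discrete Fourier transform. The sketch below uses a made-up current containing 3rd, 5th and 7th harmonics, roughly the shape drawn by a rectifier front end; it is an illustration, not the project's measurement chain.

    ```python
    import numpy as np

    fs, f0, cycles = 10_000, 50, 10                      # sample rate (Hz), fundamental (Hz), cycles captured
    t = np.arange(int(fs * cycles / f0)) / fs

    # Hypothetical PC-load current: fundamental plus odd harmonics (amplitudes are made up).
    i = (1.0 * np.sin(2 * np.pi * f0 * t) + 0.6 * np.sin(2 * np.pi * 3 * f0 * t)
         + 0.4 * np.sin(2 * np.pi * 5 * f0 * t) + 0.2 * np.sin(2 * np.pi * 7 * f0 * t))

    spec = np.abs(np.fft.rfft(i)) * 2 / len(i)           # single-sided amplitude spectrum
    freqs = np.fft.rfftfreq(len(i), 1 / fs)

    def harmonic_amplitude(h):
        """Amplitude of the h-th harmonic, read from the nearest FFT bin."""
        return spec[np.argmin(np.abs(freqs - h * f0))]

    fundamental = harmonic_amplitude(1)
    thd = np.sqrt(sum(harmonic_amplitude(h) ** 2 for h in range(2, 40))) / fundamental
    print(f"THD_I = {100 * thd:.1f} %")                  # about 74.8 % for the amplitudes above
    ```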

  17. Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab

    Science.gov (United States)

    Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.

    2017-10-01

    The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using tools such as ElasticSearch and Grafana, to help experiments manage their large-scale production workflows. The production effort in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called

  18. Advances in Grid Computing for the FabrIc for Frontier Experiments Project at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Herner, K. [Fermilab; Alba Hernandex, A. F. [Fermilab; Bhat, S. [Fermilab; Box, D. [Fermilab; Boyd, J. [Fermilab; Di Benedetto, V. [Fermilab; Ding, P. [Fermilab; Dykstra, D. [Fermilab; Fattoruso, M. [Fermilab; Garzoglio, G. [Fermilab; Kirby, M. [Fermilab; Kreymer, A. [Fermilab; Levshina, T. [Fermilab; Mazzacane, A. [Fermilab; Mengel, M. [Fermilab; Mhashilkar, P. [Fermilab; Podstavkov, V. [Fermilab; Retzke, K. [Fermilab; Sharma, N. [Fermilab; Teheran, J. [Fermilab

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using tools such as ElasticSearch and Grafana, to help experiments manage their large-scale production workflows. The production effort in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed

  19. KWIKPLAN: a computer program for projecting the annual requirements of nuclear fuel cycle operations

    International Nuclear Information System (INIS)

    Salmon, R.; Kee, C.W.

    1977-06-01

    The computer code KWIKPLAN was written to facilitate the calculation of projected nuclear fuel cycle activities. Using given projections of power generation, the code calculates annual requirements for fuel fabrication, fuel reprocessing, uranium mining, and plutonium use and production. The code uses installed capacity projections and mass flow data for six types of reactors to calculate projected fuel cycle activities and inventories. It calculates fissile uranium and plutonium flows and inventories after allowing for an economy with limited reprocessing capacity and a backlog of unreprocessed fuel. All calculations are made on a quarterly basis; printed and punched output of the projected fuel cycle activities is produced on an annual basis. Since the punched information is used in another code to determine waste inventories, the code punches a table from which the effective average burnup can be calculated for the fuel being reprocessed.
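
    A toy version of the bookkeeping described above, not the KWIKPLAN code and with invented factors: quarterly fabrication and discharge quantities are scaled from installed capacity, discharged fuel feeds a backlog, reprocessing is capped by plant capacity, and the quarterly results are summed to annual figures.

    ```python
    # All factors below are illustrative assumptions, not KWIKPLAN data.
    FAB_T_PER_GWE_YR = 27.0          # fuel fabrication requirement (t heavy metal per GWe-year)
    DISCHARGE_T_PER_GWE_YR = 25.0    # spent fuel discharged (t per GWe-year)
    REPRO_CAP_T_PER_QTR = 300.0      # reprocessing plant capacity per quarter (t)

    # Hypothetical installed capacity projection (GWe), one value per quarter for 3 years.
    installed_gwe = [50, 50, 50, 50, 60, 60, 60, 60, 75, 75, 75, 75]

    backlog = 0.0                    # unreprocessed spent fuel carried between quarters
    annual = {}
    for q, gwe in enumerate(installed_gwe):
        year = q // 4 + 1
        fabricated = gwe * FAB_T_PER_GWE_YR / 4.0
        discharged = gwe * DISCHARGE_T_PER_GWE_YR / 4.0
        backlog += discharged
        reprocessed = min(backlog, REPRO_CAP_T_PER_QTR)   # capacity-limited reprocessing
        backlog -= reprocessed
        totals = annual.setdefault(year, {"fabrication_t": 0.0, "reprocessing_t": 0.0})
        totals["fabrication_t"] += fabricated
        totals["reprocessing_t"] += reprocessed
        totals["end_backlog_t"] = round(backlog, 1)       # end-of-year value after the last quarter

    for year, totals in annual.items():
        print(year, {k: round(v, 1) for k, v in totals.items()})
    ```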

  20. Transitioning the GED[R] Mathematics Test to Computer with and without Accommodations: A Pilot Project

    Science.gov (United States)

    Patterson, Margaret Becker; Higgins, Jennifer; Bozman, Martha; Katz, Michael

    2011-01-01

    We conducted a pilot study to see how the GED Mathematics Test could be administered on computer with embedded accessibility tools. We examined test scores and test-taker experience. Nineteen GED test centers across five states and 216 randomly assigned GED test candidates participated in the project. GED candidates completed two GED mathematics…

  1. The community project COSA: comparison of geo-mechanical computer codes for salt

    International Nuclear Information System (INIS)

    Lowe, M.J.S.; Knowles, N.C.

    1986-01-01

    Two benchmark problems related to waste disposal in salt were tackled by ten European organisations using twelve rock-mechanics finite element computer codes. The two problems represented increasing complexity: first a hypothetical verification case and then the simulation of a laboratory experiment. The project provided a snapshot of the current combined expertise of European organisations in the modelling of salt behaviour.

  2. Resource-constrained project scheduling: computing lower bounds by solving minimum cut problems

    NARCIS (Netherlands)

    Möhring, R.H.; Nesetril, J.; Schulz, A.S.; Stork, F.; Uetz, Marc Jochen

    1999-01-01

    We present a novel approach to compute Lagrangian lower bounds on the objective function value of a wide class of resource-constrained project scheduling problems. The basis is a polynomial-time algorithm to solve the following scheduling problem: Given a set of activities with start-time dependent

  3. A Computer Supported Teamwork Project for People with a Visual Impairment.

    Science.gov (United States)

    Hale, Greg

    2000-01-01

    Discussion of the use of computer supported teamwork (CSTW) in team-based organizations focuses on problems that visually impaired people have reading graphical user interface software via screen reader software. Describes a project that successfully used email for CSTW, and suggests issues needing further research. (LRW)

  4. A Computer-Based Laboratory Project for the Study of Stimulus Generalization and Peak Shift

    Science.gov (United States)

    Derenne, Adam; Loshek, Eevett

    2009-01-01

    This paper describes materials designed for classroom projects on stimulus generalization and peak shift. A computer program (originally written in QuickBASIC) is used for data collection and a Microsoft Excel file with macros organizes the raw data on a spreadsheet and creates generalization gradients. The program is designed for use with human…

  5. The Variation Theorem Applied to H-2+: A Simple Quantum Chemistry Computer Project

    Science.gov (United States)

    Robiette, Alan G.

    1975-01-01

    Describes a student project which requires limited knowledge of Fortran and only minimal computing resources. The results illustrate such important principles of quantum mechanics as the variation theorem and the virial theorem. Presents sample calculations and the subprogram for energy calculations. (GS)
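
    A modern analogue of such a project fits in a few lines of Python: the sketch below evaluates the textbook LCAO variational energy of H2+ in a minimal 1s basis, using the standard closed-form overlap, Coulomb and resonance integrals in atomic units, and locates the minimum of the bonding curve. It illustrates the variation theorem in that the estimate always lies above the exact ground-state energy (about -0.603 hartree near R = 2.0 bohr).

    ```python
    import numpy as np

    # Internuclear separations in bohr; energies in hartree (atomic units).
    R = np.linspace(0.8, 8.0, 400)

    S = np.exp(-R) * (1 + R + R**2 / 3)            # overlap <1sA|1sB>
    J = (1 - (1 + R) * np.exp(-2 * R)) / R         # Coulomb integral <1sA|1/r_B|1sA>
    K = np.exp(-R) * (1 + R)                       # resonance integral <1sA|1/r_A|1sB>

    H_AA = -0.5 - J + 1 / R                        # includes the nuclear repulsion 1/R
    H_AB = -0.5 * S - K + S / R
    E_bond = (H_AA + H_AB) / (1 + S)               # bonding (gerade) combination
    E_anti = (H_AA - H_AB) / (1 - S)               # antibonding (ungerade) combination

    i = np.argmin(E_bond)
    print(f"variational minimum: R = {R[i]:.2f} bohr, E = {E_bond[i]:.4f} hartree")
    # The variation theorem guarantees E_bond stays above the exact energy at every R.
    ```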

  6. The Lower Manhattan Project: A New Approach to Computer-Assisted Learning in History Classrooms.

    Science.gov (United States)

    Crozier, William; Gaffield, Chad

    1990-01-01

    The Lower Manhattan Project, a computer-assisted undergraduate course in U.S. history, enhances student appreciation of the historical process through research and writing. Focuses on the late nineteenth and early twentieth centuries emphasizing massive immigration, rapid industrialization, and the growth of cities. Includes a reading list and…

  7. Optimization of the cumulative risk assessment of pesticides and biocides using computational techniques: Pilot project

    DEFF Research Database (Denmark)

    Jonsdottir, Svava Osk; Reffstrup, Trine Klein; Petersen, Annette

    This pilot project is intended as the first step in developing a computational strategy to assist in refining methods for higher tier cumulative and aggregate risk assessment of exposure to mixtures of pesticides and biocides. For this purpose, physiologically based toxicokinetic (PBTK) models were...

  8. Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP

    Science.gov (United States)

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…

  9. Computational Experiment Approach to Controlled Evolution of Procurement Pattern in Cluster Supply Chain

    Directory of Open Access Journals (Sweden)

    Xiao Xue

    2015-01-01

    Full Text Available Companies have been aware of the benefits of developing Cluster Supply Chains (CSCs), and they are spending a great deal of time and money attempting to develop the new business pattern. Yet the traditional techniques for identifying CSCs have strong theoretical antecedents but seem to have little traction in the field. We believe this is because the standard techniques fail to capture evolution over time and do not provide useful intervention measures for reaching goals. To address these problems, we introduce an agent-based modeling approach to evaluate CSCs. Taking collaborative procurement as the research object, our approach is composed of three parts: model construction, model instantiation, and computational experiment. We use the approach to explore the service charging policy problem in collaborative procurement. Three kinds of service charging policies are compared in the same experiment environment. Finally, “Fixed Cost” is identified as the optimal policy under a stable market environment. The case study can help us understand the workflow of applying the approach and provide valuable decision support applications to industry.
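
    The paper's agent-based model is not specified in the abstract, so the sketch below is only a hypothetical toy version of the computational-experiment idea: the same random demand stream is replayed under two charging policies (a flat fee versus a percentage-of-order fee) and aggregate outcomes are compared. All parameters are invented for illustration.

      import random

      def run_experiment(policy, n_firms=30, periods=100, fixed_fee=40.0, rate=0.05,
                         solo_overhead=0.08, seed=1):
          """Replay one random demand stream under a charging policy.
          Returns (platform revenue, total savings of the member firms)."""
          rng = random.Random(seed)            # same seed -> same demand for both policies
          revenue, savings = 0.0, 0.0
          for _ in range(periods):
              for _ in range(n_firms):
                  order = rng.uniform(200, 2000)                 # one firm-agent's order value
                  charge = fixed_fee if policy == "fixed" else rate * order
                  if charge < solo_overhead * order:             # agent joins only if it saves money
                      revenue += charge
                      savings += solo_overhead * order - charge
          return revenue, savings

      for policy in ("fixed", "percentage"):
          rev, sav = run_experiment(policy)
          print(f"{policy:10s} platform revenue = {rev:8.1f}  member savings = {sav:8.1f}")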

  10. Computer simulation of phase transformation and plastic deformation in IN718 superalloy: Microstructural evolution during precipitation

    International Nuclear Information System (INIS)

    Zhou, N.; Lv, D.C.; Zhang, H.L.; McAllister, D.; Zhang, F.; Mills, M.J.; Wang, Y.

    2014-01-01

    Microstructural evolution during co-precipitation of γ′, γ″ and δ phases from a supersaturated γ matrix during aging of superalloy Inconel 718 (IN718) is investigated by computer simulation using the phase-field method. The precipitation model is quantitative, using as model inputs ab initio calculations of elastic constants, experimental data on lattice parameters, precipitate–matrix orientation relationship, interfacial energy of each individual precipitate phase and interdiffusivities, and a Ni–Nb–Al pseudo-ternary thermodynamic database specifically developed for IN718. In order to simulate statistically representative multiphase microstructures observed in the alloy, the Kim–Kim–Suzuki treatment of interfaces is employed. Simulation results show how alloy composition, lattice misfit, external stress, temperature and time affect precipitate microstructure and variant selection during isothermal aging, without any a priori assumptions about key microstructural features including size, shape, volume fraction and spatial distribution of different types of precipitates and different variants of the same precipitate phase. The shapes of precipitates and their coarsening kinetics are analyzed based on the two-dimensional moment invariant. The various multiphase microstructures generated by the simulations have been used as model inputs in a study to investigate how precipitate microstructure (in particular shape and spatial distribution) influences the strength of IN718
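
    The quantitative multi-phase, multi-variant model described above cannot be reconstructed from the abstract. The sketch below only illustrates the basic ingredient of the phase-field method it builds on: relaxation of a non-conserved order parameter by the Allen-Cahn equation with a double-well free energy, in one dimension and with made-up parameters.

      import numpy as np

      # 1D Allen-Cahn relaxation: dphi/dt = -L * (df/dphi - kappa * d2phi/dx2)
      # with the double-well free energy f(phi) = W * phi^2 * (1 - phi)^2.
      N, dx, dt = 200, 1.0, 0.05          # illustrative grid and time step
      L, W, kappa = 1.0, 1.0, 2.0         # illustrative mobility and energy coefficients

      phi = 0.5 + 0.05 * np.random.default_rng(0).standard_normal(N)  # noisy initial field

      for step in range(4000):
          lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2   # periodic Laplacian
          dfdphi = 2 * W * phi * (1 - phi) * (1 - 2 * phi)               # derivative of the double well
          phi += -L * dt * (dfdphi - kappa * lap)

      print("order parameter range after relaxation:", phi.min(), phi.max())
      # Regions near phi = 0 and phi = 1, separated by diffuse interfaces, develop.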

  11. Rana computatrix to human language: towards a computational neuroethology of language evolution.

    Science.gov (United States)

    Arbib, Michael A

    2003-10-15

    Walter's Machina speculatrix inspired the name Rana computatrix for a family of models of visuomotor coordination in the frog, which contributed to the development of computational neuroethology. We offer here an 'evolutionary' perspective on models in the same tradition for rat, monkey and human. For rat, we show how the frog-like taxon affordance model provides a basis for the spatial navigation mechanisms that involve the hippocampus and other brain regions. For monkey, we recall two models of neural mechanisms for visuomotor coordination. The first, for saccades, shows how interactions between the parietal and frontal cortex augment superior colliculus seen as the homologue of frog tectum. The second, for grasping, continues the theme of parieto-frontal interactions, linking parietal affordances to motor schemas in premotor cortex. It further emphasizes the mirror system for grasping, in which neurons are active both when the monkey executes a specific grasp and when it observes a similar grasp executed by others. The model of human-brain mechanisms is based on the mirror-system hypothesis of the evolution of the language-ready brain, which sees the human Broca's area as an evolved extension of the mirror system for grasping.

  12. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1993-94. OER Report.

    Science.gov (United States)

    Greene, Judy

    Students Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its fourth year of operation. The project operated at two high schools in Brooklyn and one in Manhattan (New York). In the 1993-94 school year, the project served 393 students of…

  13. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    Science.gov (United States)

    Baun, Christian

    2016-01-01

    Clusters usually consist of servers, workstations or personal computers as nodes. But especially for academic purposes like student projects or scientific projects, the cost for purchase and operation can be a challenge. Single board computers cannot compete with the performance or energy-efficiency of higher-value systems, but they are an option to build inexpensive cluster systems. Because of the compact design and modest energy consumption, it is possible to build clusters of single board computers in a way that they are mobile and can be easily transported by the users. This paper describes the construction of such a cluster, useful applications and the performance of the single nodes. Furthermore, the cluster's performance and energy efficiency are analyzed by executing the High Performance Linpack benchmark with different numbers of nodes and different proportions of the system's total main memory utilized.
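
    As a small aside on the benchmark mentioned above: the High Performance Linpack score is conventionally derived from the problem size N and the wall-clock time via the operation count 2/3*N^3 + 2*N^2, and compared with the theoretical peak of the machine. The sketch below just evaluates those formulas for hypothetical numbers; they are not measurements from the paper.

      def hpl_gflops(n, seconds):
          """Linpack performance from problem size and run time (standard HPL metric)."""
          flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
          return flops / seconds / 1e9

      def peak_gflops(nodes, cores_per_node, ghz, flops_per_cycle):
          """Theoretical peak of a homogeneous cluster."""
          return nodes * cores_per_node * ghz * flops_per_cycle

      # Hypothetical single-board-computer cluster: 8 nodes, 4 cores each,
      # 1.2 GHz, 4 FLOPs per cycle -- illustrative values only.
      rmax = hpl_gflops(n=20000, seconds=120.0)
      rpeak = peak_gflops(nodes=8, cores_per_node=4, ghz=1.2, flops_per_cycle=4)
      print(f"Rmax  = {rmax:.1f} GFLOPS")
      print(f"Rpeak = {rpeak:.1f} GFLOPS, efficiency = {100 * rmax / rpeak:.1f}%")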

  14. PREPS2 - a PC-based computer program for performing economic analysis of capital projects

    International Nuclear Information System (INIS)

    Blake, M.W.; Brand, D.O.; Chastain, E.T.; Johnson, E.D.

    1990-01-01

    In these times of increased spending to finance new capacity and to meet clean air act legislation, many electric utilities are giving a high priority to controlling capital expenditures at existing generating facilities. Determining the level of capital expenditure that is economically justified is very difficult; units with higher capacity factors are worth more to the utility, so the utility can more readily justify higher capital expenditures to improve or maintain reliability and heat rate on those units than on units with lower capacity factors. This paper describes a PC-based computer program (PREPS2) which performs an economic analysis of individual capital projects. The program incorporates tables which describe the worth to the system of making improvements in each unit. This computer program is currently being used by the six Southern Company operating companies to evaluate all production capital projects over $50,000. Approximately 500 projects representing about $300 million are being analyzed each year.
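
    PREPS2 itself is not publicly documented beyond this abstract, but the kind of project economics it automates, discounting a stream of benefits from a capital project and reporting net present value and simple payback, can be sketched generically. The cash flows and discount rate below are hypothetical.

      def npv(rate, cashflows):
          """Net present value; cashflows[0] is the year-0 (investment) flow."""
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

      def simple_payback(investment, annual_benefit):
          """Years needed for undiscounted benefits to repay the investment."""
          return investment / annual_benefit

      # Hypothetical capital project at a generating unit: $250k investment,
      # $60k/year benefit for 10 years, 8% discount rate.
      flows = [-250_000.0] + [60_000.0] * 10
      print(f"NPV            = ${npv(0.08, flows):,.0f}")
      print(f"Simple payback = {simple_payback(250_000.0, 60_000.0):.1f} years")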

  15. Software and man-machine interface considerations for a nuclear plant computer replacement and upgrade project

    International Nuclear Information System (INIS)

    Diamond, G.; Robinson, E.

    1984-01-01

    Some of the key software functions and Man-Machine Interface considerations in a computer replacement and upgrade project for a nuclear power plant are described. The project involves the installation of two separate computer systems: an Emergency Response Facilities Computer System (ERFCS) and a Plant Process Computer System (PPCS). These systems employ state-of-the-art computer hardware and software. The ERFCS is a new system intended to provide enhanced functions to meet NRC post-TMI guidelines. The PPCS is intended to replace and upgrade an existing obsolete plant computer system. A general overview of the hardware and software aspects of the replacement and upgrade is presented. The work done to develop the upgraded Man-Machine Interface is described. For the ERFCS, a detailed discussion is presented of the work done to develop logic to evaluate the readiness and performance of safety systems and their supporting functions. The Man-Machine Interface considerations of reporting readiness and performance to the operator are discussed. Finally, the considerations involved in the implementation of this logic in real-time software are discussed. For the PPCS, a detailed discussion is presented of some new features

  16. A reconstruction algorithm for coherent scatter computed tomography based on filtered back-projection

    International Nuclear Information System (INIS)

    Stevendaal, U. van; Schlomka, J.-P.; Harding, A.; Grass, M.

    2003-01-01

    Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter form factor of the investigated object. Reconstruction from coherently scattered x-rays is commonly done using algebraic reconstruction techniques (ART). In this paper, we propose an alternative approach based on filtered back-projection. For the first time, a three-dimensional (3D) filtered back-projection technique using curved 3D back-projection lines is applied to two-dimensional coherent scatter projection data. The proposed algorithm is tested with simulated projection data as well as with projection data acquired with a demonstrator setup similar to a multi-line CT scanner geometry. While yielding comparable image quality as ART reconstruction, the modified 3D filtered back-projection algorithm is about two orders of magnitude faster. In contrast to iterative reconstruction schemes, it has the advantage that subfield-of-view reconstruction becomes feasible. This allows a selective reconstruction of the coherent-scatter form factor for a region of interest. The proposed modified 3D filtered back-projection algorithm is a powerful reconstruction technique to be implemented in a CSCT scanning system. This method gives coherent scatter CT the potential of becoming a competitive modality for medical imaging or nondestructive testing
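
    The modified curved-line 3D back-projection of the paper is specific to coherent-scatter geometry, but the underlying principle, filtered back-projection as a fast non-iterative inverse, can be demonstrated with the ordinary 2D parallel-beam case. The sketch below assumes scikit-image and is a generic illustration, not the authors' algorithm.

      import numpy as np
      from skimage.data import shepp_logan_phantom
      from skimage.transform import radon, iradon

      image = shepp_logan_phantom()                     # standard test object
      theta = np.linspace(0.0, 180.0, 180, endpoint=False)
      sinogram = radon(image, theta=theta)              # parallel-beam projections

      reconstruction = iradon(sinogram, theta=theta)    # FBP with the default ramp filter

      rms = np.sqrt(np.mean((reconstruction - image) ** 2))
      print(f"RMS reconstruction error: {rms:.4f}")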

  17. A project to study SOC evolution after land use change combining chronosequence and gradient methods

    Science.gov (United States)

    Gabarron-Galeote, Miguel A.; van Wesemael, Bas

    2013-04-01

    In recent decades, interest in the global carbon budget has increased enormously, and soils are of great importance in this issue since they contain about twice as much carbon as the atmosphere. Land use change (LUC) can cause a change in land cover and an associated change in soil carbon stocks, so it has a major impact on the balance between inputs and outputs of soil organic carbon (SOC). Improved understanding of land-use impacts on the world's terrestrial carbon balance is thus a necessary part of the global effort to mitigate climate change. The aim of this project is to predict the effects of land use and land management change on SOC stocks, characterizing the soil organic carbon cycle and its relationship to the vegetation cover in croplands abandoned at different times and under different Mediterranean climatic conditions in the south of Spain. The study area is located in the Cordillera Bética Litoral, in the south of Spain. In this area, a climatic gradient can be observed from west to east: from >1,500 mm year^-1 in the Strait of Gibraltar to <250 mm year^-1 at Cabo de Gata. More specifically, the study focuses on three areas with different climatic conditions: Gaucín (1010 mm year^-1), Almogía (576 mm year^-1) and Gérgal (240 mm year^-1). All the experimental plots will be selected by means of the analysis of aerial photographs (1956, 1977, 1984, 1998 and 2009). After this procedure, the three study areas will be composed of experimental plots of the following classes: a) lands with natural vegetation since 1956; b) lands abandoned between 1956 and 1977; c) lands abandoned between 1977 and 1984; d) lands abandoned between 1984 and 1998; e) lands abandoned between 1998 and 2005; f) lands cultivated since 1956. The main expected outcomes of the research project are the characterization of the temporal evolution of SOC in soils, the compilation of experimental areas under different Mediterranean climatic conditions, and the characterization

  18. The Evolution of Successful Satellite Science to Air Quality Application Projects: From Inception to Realization

    Science.gov (United States)

    Soja, A. J.

    2012-12-01

    biomass burning portion of our nation's NEI at the crossroads of the applications 'largest cost-effective unknowns and uncertainties' and the 'best-available science and data'. Here, we will present a diagram tree of the completed evolution of a successful project, which includes the basic science on which this development is based and the succession of the use of satellite data within the applications and user communities.

  19. Scientific computing and algorithms in industrial simulations projects and products of Fraunhofer SCAI

    CERN Document Server

    Schüller, Anton; Schweitzer, Marc

    2017-01-01

    The contributions gathered here provide an overview of current research projects and selected software products of the Fraunhofer Institute for Algorithms and Scientific Computing SCAI. They show the wide range of challenges that scientific computing currently faces, the solutions it offers, and its important role in developing applications for industry. Given the exciting field of applied collaborative research and development it discusses, the book will appeal to scientists, practitioners, and students alike. The Fraunhofer Institute for Algorithms and Scientific Computing SCAI combines excellent research and application-oriented development to provide added value for our partners. SCAI develops numerical techniques, parallel algorithms and specialized software tools to support and optimize industrial simulations. Moreover, it implements custom software solutions for production and logistics, and offers calculations on high-performance computers. Its services and products are based on state-of-the-art metho...

  20. Distributed and grid computing projects with research focus in human health.

    Science.gov (United States)

    Diomidous, Marianna; Zikos, Dimitrios

    2012-01-01

    Distributed systems and grid computing systems are used to connect several computers in order to obtain a higher level of performance for solving a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus on human health. Eleven active projects with more than 2000 Processing Units (PUs) each were found and are presented. The research focus for most of them is molecular biology, specifically understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not in all cases explicitly stated, common targets include research to find a cure for HIV, dengue, Duchenne dystrophy, Parkinson's disease, various types of cancer and influenza; other target diseases include malaria, anthrax and Alzheimer's disease. The need for national initiatives and European collaboration for larger-scale projects is stressed, to raise citizens' awareness and participation and to create a culture of internet volunteering altruism.

  1. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  2. Really Large Scale Computer Graphic Projection Using Lasers and Laser Substitutes

    Science.gov (United States)

    Rother, Paul

    1989-07-01

    This paper reflects on past laser projects that displayed vector-scanned computer graphic images onto very large and irregular surfaces. Since microprocessors and high-powered visible lasers became available, very large scale computer graphics projection has become a reality. Because they do not depend on a focusing lens, lasers easily project onto distant and irregular surfaces and have been used for amusement parks, theatrical performances, concerts, industrial trade shows and dance clubs. Lasers have been used to project onto mountains, buildings, 360° globes, clouds of smoke and water. These methods have proven successful in installations at the Epcot Theme Park in Florida, Stone Mountain Park in Georgia and the 1984 Olympics in Los Angeles, as well as at hundreds of corporate trade shows and thousands of musical performances. With new ColorRay™ technology, the use of costly and fragile lasers is no longer necessary: using fiber optics, the functionality of lasers can be duplicated for new and exciting projection possibilities. ColorRay™ technology has enjoyed worldwide recognition in conjunction with Pink Floyd's and George Michael's worldwide tours.

  3. A computer based approach for Material, Manpower and Equipment management in the Construction Projects

    Science.gov (United States)

    Sasidhar, Jaladanki; Muthu, D.; Venkatasubramanian, C.; Ramakrishnan, K.

    2017-07-01

    The success of any construction project depends on managing resources efficiently so that the project is completed within a reasonable budget and time without compromising quality. Late procurement of materials, failure to deploy adequate labour at the right time and delayed mobilization of machinery all cause delay, loss of quality and, ultimately, increased project cost. It is a known fact that project cost can be controlled by taking corrective action on the mobilization of resources at the right time. This research focuses on integrating management systems with the computer to generate a model that uses an OOM data structure and includes automatic commodity code generation, automatic takeoff execution, intelligent purchase order generation, and integration of design and schedule components to overcome the problem of stock-outs. To overcome problems in equipment management, an inventory management module is suggested, and a data set is analysed comprising the equipment registration number, equipment number, description, date of purchase, manufacturer, equipment price, market value, life of the equipment, and production data (equipment number, date, name of the job, hourly rate, insurance, depreciation cost, taxes, storage cost, interest, and oil, grease and fuel consumption); from this, decision support systems are generated to overcome problems arising from improper management. Labour problems are managed using scheduling and strategic management of human resources. The generated decision support tool mobilizes resources at the right time, helps the project manager finish the project on time and thereby avoid abnormal project cost, and indicates the percentage of improvement that can be achieved; the research also focuses on determining the percentage of delays caused by poor management of materials, manpower and machinery in different types of projects
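
    The paper's OOM-based model is not available from the abstract; as a generic illustration of the inventory-management idea used to avoid stock-outs, the sketch below computes a classical economic order quantity and reorder point for one hypothetical material. All figures are invented.

      import math

      def economic_order_quantity(annual_demand, order_cost, holding_cost):
          """Classical EOQ: order size that minimises ordering plus holding cost."""
          return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

      def reorder_point(daily_demand, lead_time_days, safety_stock):
          """Stock level at which a new order should be placed to avoid a stock-out."""
          return daily_demand * lead_time_days + safety_stock

      # Hypothetical cement requirement for one project site.
      eoq = economic_order_quantity(annual_demand=12_000, order_cost=150.0, holding_cost=2.5)
      rop = reorder_point(daily_demand=40, lead_time_days=7, safety_stock=120)
      print(f"Order about {eoq:.0f} bags per order; reorder when stock falls to {rop} bags")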

  4. Developing a project-based computational physics course grounded in expert practice

    Science.gov (United States)

    Burke, Christopher J.; Atherton, Timothy J.

    2017-04-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  5. Distributed management of scientific projects - An analysis of two computer-conferencing experiments at NASA

    Science.gov (United States)

    Vallee, J.; Gibbs, B.

    1976-01-01

    Between August 1975 and March 1976, two NASA projects with geographically separated participants used a computer-conferencing system developed by the Institute for the Future for portions of their work. Monthly usage statistics for the system were collected in order to examine the group and individual participation figures for all conferences. The conference transcripts were analysed to derive observations about the use of the medium. In addition to the results of these analyses, the attitudes of users and the major components of the costs of computer conferencing are discussed.

  6. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group achieved significant progress towards the goal, producing a set of novel techniques for improving memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.

  7. Fast computation of statistical uncertainty for spatiotemporal distributions estimated directly from dynamic cone beam SPECT projections

    International Nuclear Information System (INIS)

    Reutter, Bryan W.; Gullberg, Grant T.; Huesman, Ronald H.

    2001-01-01

    The estimation of time-activity curves and kinetic model parameters directly from projection data is potentially useful for clinical dynamic single photon emission computed tomography (SPECT) studies, particularly in those clinics that have only single-detector systems and thus are not able to perform rapid tomographic acquisitions. Because the radiopharmaceutical distribution changes while the SPECT gantry rotates, projections at different angles come from different tracer distributions. A dynamic image sequence reconstructed from the inconsistent projections acquired by a slowly rotating gantry can contain artifacts that lead to biases in kinetic parameters estimated from time-activity curves generated by overlaying regions of interest on the images. If cone beam collimators are used and the focal point of the collimators always remains in a particular transaxial plane, additional artifacts can arise in other planes reconstructed using insufficient projection samples [1]. If the projection samples truncate the patient's body, this can result in additional image artifacts. To overcome these sources of bias in conventional image based dynamic data analysis, we and others have been investigating the estimation of time-activity curves and kinetic model parameters directly from dynamic SPECT projection data by modeling the spatial and temporal distribution of the radiopharmaceutical throughout the projected field of view [2-8]. In our previous work we developed a computationally efficient method for fully four-dimensional (4-D) direct estimation of spatiotemporal distributions from dynamic SPECT projection data [5], which extended Formiconi's least squares algorithm for reconstructing temporally static distributions [9]. In addition, we studied the biases that result from modeling various orders of temporal continuity and using various time samplings [5]. In the present work, we address computational issues associated with evaluating the statistical uncertainty of

  8. Enhanced sealing project (ESP): evolution of a full-sized concrete and bentonite shaft seal

    International Nuclear Information System (INIS)

    Dixon, D.A.; Priyanto, D.G.; Martino, J.B.; De Combarieu, M.; Johansson, R.; Korkeakoski, P.; Villagran, J.

    2012-01-01

    Document available in extended abstract form only. A full-scale shaft seal was designed and installed in the 5-m-diameter access shaft at Atomic Energy of Canada's (AECL's) Underground Research Laboratory (URL) at the point where the shaft intersects an ancient water-bearing, low-angle thrust fault in granitic rock. The seal, part of the permanent closure of the URL, consists of a 6-m-thick bentonite-based component sandwiched between upper and lower 3-m-thick concrete components. The bentonite-based component spans the fracture zone and extends approximately 1 m beyond the maximum identified extent of the fracture. This design was adopted in order to limit water from the deeper, saline regions mixing with the fresher, near-surface groundwater regime. The concrete components provide the mechanical confinement, and an in situ compacted 40/60 mixture of bentonite clay and quartz sand provides the sealing component. Construction of the shaft seal was done as part of Canada's Nuclear Legacies Liability Program. However, monitoring the seal evolution was not part of the decommissioning program's mandate. In addition to accomplishing the permanent closure of the URL, this seal's construction provides a unique opportunity to instrument and monitor the evolution of a full-scale shaft seal as well as the recovery of the regional groundwater regime as the facility passively floods above the seal. A jointly funded monitoring project was developed by NWMO (Canada), SKB (Sweden), Posiva (Finland) and ANDRA (France), and since mid-2009 the thermal, hydraulic and mechanical evolution of the seal has been constantly monitored. The evolution of the type of seal being monitored in the ESP is relevant to repository closure planning and to gaining confidence in the functionality of shaft seals. Although constructed in a crystalline rock medium, the results of the ESP are expected to be relevant to the performance of seals in a variety of host rock types. The shaft seal has been

  9. Comparing genomes to computer operating systems in terms of the topology and evolution of their regulatory control networks.

    Science.gov (United States)

    Yan, Koon-Kiu; Fang, Gang; Bhardwaj, Nitin; Alexander, Roger P; Gerstein, Mark

    2010-05-18

    The genome has often been called the operating system (OS) for a living organism. A computer OS is described by a regulatory control network termed the call graph, which is analogous to the transcriptional regulatory network in a cell. To apply our firsthand knowledge of the architecture of software systems to understand cellular design principles, we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology and evolution. We show that both networks have a fundamentally hierarchical layout, but there is a key difference: The transcriptional regulatory network possesses a few global regulators at the top and many targets at the bottom; conversely, the call graph has many regulators controlling a small set of generic functions. This top-heavy organization leads to highly overlapping functional modules in the call graph, in contrast to the relatively independent modules in the regulatory network. We further develop a way to measure evolutionary rates comparably between the two networks and explain this difference in terms of network evolution. The process of biological evolution via random mutation and subsequent selection tightly constrains the evolution of regulatory network hubs. The call graph, however, exhibits rapid evolution of its highly connected generic components, made possible by designers' continual fine-tuning. These findings stem from the design principles of the two systems: robustness for biological systems and cost effectiveness (reuse) for software systems.

  10. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a) computational modules, (b) data-processing scripts, and (c) research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world had contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  11. Projection matrix acquisition for cone-beam computed tomography iterative reconstruction

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Shi, Wenlong; Zhang, Caixin; Gao, Zongzhao

    2017-02-01

    The projection matrix is an essential and time-consuming part of computed tomography (CT) iterative reconstruction. In this article a novel algorithm is proposed to quickly compute the three-dimensional (3D) projection matrix for cone-beam CT (CBCT). The volume to be reconstructed is treated as three orthogonal sets of equally spaced parallel planes rather than as individual voxels. After the intersections of the rays with the voxel surfaces are obtained, the intersection points are compared with the voxel vertices to obtain the indices of the voxels each ray traverses. Instead of considering the ray slope with respect to each voxel, only the positions of two points need to be compared. Finally, computer simulation is used to verify the effectiveness of the algorithm.
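
    The plane-intersection idea in the abstract is close in spirit to the classical Siddon ray-tracing approach for building projection-matrix rows. The 2D sketch below is a generic Siddon-style traversal, not the authors' algorithm: for one ray it returns the pixels crossed and the intersection lengths that would populate one row of the system matrix.

      import numpy as np

      def ray_pixel_lengths(p0, p1, nx, ny, dx=1.0, dy=1.0):
          """Siddon-style traversal of a 2D grid (nx*ny pixels of size dx*dy, lower-left
          corner at the origin).  Returns ((ix, iy), length) pairs, i.e. the non-zero
          entries of one projection-matrix row for the ray p0 -> p1."""
          p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
          d = p1 - p0
          alphas = [0.0, 1.0]                         # parametric start/end of the segment
          for axis, (n, step) in enumerate(((nx, dx), (ny, dy))):
              if abs(d[axis]) > 1e-12:                # crossings with this family of grid lines
                  planes = np.arange(n + 1) * step
                  alphas.extend((planes - p0[axis]) / d[axis])
          alphas = np.unique(np.clip(alphas, 0.0, 1.0))
          seg_len = float(np.linalg.norm(d))
          row = []
          for a0, a1 in zip(alphas[:-1], alphas[1:]):
              mid = p0 + 0.5 * (a0 + a1) * d          # midpoint identifies the pixel crossed
              ix, iy = int(mid[0] // dx), int(mid[1] // dy)
              if a1 > a0 and 0 <= ix < nx and 0 <= iy < ny:
                  row.append(((ix, iy), (a1 - a0) * seg_len))
          return row

      print(ray_pixel_lengths((-1.0, 0.5), (4.0, 2.5), nx=3, ny=3))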

  12. Project of computer program for designing the steel with the assumed CCT diagram

    OpenAIRE

    S. Malara; J. Trzaska; L.A. Dobrzański

    2007-01-01

    Purpose: The aim of this paper was to develop a project of a computer-aided method for designing the chemical composition of steel with the assumed CCT diagram. Design/methodology/approach: The purpose has been achieved in four stages. At the first stage, characteristic points of the CCT diagram have been determined. At the second stage, neural networks have been developed, and next, CCT diagram terms of similarity have been worked out at the third one. In the last one, steel chemical composition optimizat...

  13. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...
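
    To give a flavour of the Python-embedded DSL the abstract refers to, here is the kind of minimal example shown in the ProjectQ documentation (a single qubit put into superposition and measured). The syntax is reproduced from memory and should be checked against the current release.

      from projectq import MainEngine
      from projectq.ops import H, Measure

      eng = MainEngine()              # default backend: the built-in simulator
      qubit = eng.allocate_qubit()    # allocate one qubit

      H | qubit                       # Hadamard gate: equal superposition of |0> and |1>
      Measure | qubit                 # measure in the computational basis
      eng.flush()                     # send the circuit to the backend

      print("Measured:", int(qubit))  # 0 or 1, each with probability 1/2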

  14. Computer literacy enhancement in the Teaching Hospital Olomouc. Part I: project management techniques. Short communication.

    Science.gov (United States)

    Sedlár, Drahomír; Potomková, Jarmila; Rehorová, Jarmila; Seckár, Pavel; Sukopová, Vera

    2003-11-01

    Information explosion and globalization make great demands on keeping pace with the new trends in the healthcare sector. The contemporary level of computer and information literacy among most health care professionals in the Teaching Hospital Olomouc (Czech Republic) is not satisfactory for efficient exploitation of modern information technology in diagnostics, therapy and nursing. The present contribution describes the application of two basic problem solving techniques (brainstorming, SWOT analysis) to develop a project aimed at information literacy enhancement.

  15. Descriptive and Computer Aided Drawing Perspective on an Unfolded Polyhedral Projection Surface

    Science.gov (United States)

    Dzwierzynska, Jolanta

    2017-10-01

    The aim of the present study is to develop a method of direct and practical mapping of perspective onto an unfolded prism polyhedral projection surface. The considered perspective representation is a rectilinear central projection onto a surface composed of several flat elements. In the paper two descriptive methods of drawing perspective are presented: direct and indirect. The graphical mapping of the effects of the representation is realized directly on the unfolded flat projection surface. This is possible thanks to the projective and graphical connection between points displayed on the polyhedral background and their counterparts obtained on the unfolded flat surface. To significantly improve the construction of lines, analytical algorithms are formulated that draw the perspective image of a line segment passing through two points given by their coordinates in a spatial x, y, z coordinate system. Compared to other perspective construction methods used in computer vision and computer-aided design, which rely on information about points, our algorithms utilize data about lines, which occur very often in architectural forms. The possibility of drawing lines in the considered perspective makes it possible to draw an edge perspective image of an architectural object. Changing the base elements of the perspective, such as the horizon height and the station point location, enables drawing the perspective image from different viewing positions. The analytical algorithms for drawing perspective images are formulated in Mathcad software; however, they can be implemented in most computer graphics packages, which can make drawing perspective more efficient and easier. The representation presented in the paper and the way of its direct mapping onto the flat unfolded projection surface can find application in the presentation of architectural space in advertising and art.
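
    The paper's construction maps lines onto an unfolded polyhedral background; as a much simpler point of reference, the sketch below performs an ordinary central projection of a 3D segment onto a single flat picture plane (station point at the origin, picture plane z = d). It is a generic illustration, not the algorithm formulated in the article.

      import numpy as np

      def central_projection(point, d=1.0):
          """Project a 3D point onto the picture plane z = d, with the eye at the origin."""
          x, y, z = point
          if z <= 0:
              raise ValueError("point must lie in front of the station point")
          return np.array([d * x / z, d * y / z])

      def project_segment(p, q, d=1.0, steps=10):
          """Perspective image of segment p-q, sampled point by point."""
          return [central_projection(p + t * (q - p), d) for t in np.linspace(0.0, 1.0, steps)]

      p = np.array([1.0, 0.5, 4.0])      # hypothetical endpoints in the x, y, z system
      q = np.array([-2.0, 1.5, 8.0])
      image = project_segment(p, q, d=2.0)
      print("endpoints on the picture plane:", image[0], image[-1])
      # Since a central projection maps straight segments to straight segments,
      # the sampled image points are collinear and only the endpoints are needed.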

  16. A Framework for Debugging Geoscience Projects in a High Performance Computing Environment

    Science.gov (United States)

    Baxter, C.; Matott, L.

    2012-12-01

    High performance computing (HPC) infrastructure has become ubiquitous in today's world with the emergence of commercial cloud computing and academic supercomputing centers. Teams of geoscientists, hydrologists and engineers can take advantage of this infrastructure to undertake large research projects - for example, linking one or more site-specific environmental models with soft computing algorithms, such as heuristic global search procedures, to perform parameter estimation and predictive uncertainty analysis, and/or design least-cost remediation systems. However, the size, complexity and distributed nature of these projects can make identifying failures in the associated numerical experiments using conventional ad-hoc approaches both time-consuming and ineffective. To address these problems a multi-tiered debugging framework has been developed. The framework allows for quickly isolating and remedying a number of potential experimental failures, including: failures in the HPC scheduler; bugs in the soft computing code; bugs in the modeling code; and permissions and access control errors. The utility of the framework is demonstrated via application to a series of over 200,000 numerical experiments involving a suite of 5 heuristic global search algorithms and 15 mathematical test functions serving as cheap analogues for the simulation-based optimization of pump-and-treat subsurface remediation systems.

  17. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1992-93. OER Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Educational Research.

    Student Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its third year of operation. Project SUCCESS served 460 students of limited English proficiency at two high schools in Brooklyn and one high school in Manhattan (New York City).…

  18. Computational screening of doped αMnO2 catalysts for the oxygen evolution reaction

    DEFF Research Database (Denmark)

    Tripkovic, Vladimir; Hansen, Heine Anton; Vegge, Tejs

    2018-01-01

    Minimizing energy and materials costs for driving the oxygen evolution reaction (OER) is paramount for the commercialization of water electrolysis cells and rechargeable metal-air batteries. Using density functional theory calculations, we analyze the structural stability, catalytic activity...
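
    The calculated activity data are not given in the abstract, but the standard descriptor-based analysis used in such DFT screening studies can be sketched: the four proton-coupled electron-transfer steps of the OER are assigned free-energy changes, and the theoretical overpotential is the largest step minus the 1.23 V equilibrium potential. The ΔG values below are placeholders, not results from the paper.

      # Hypothetical free-energy changes (eV) of the four OER steps at U = 0 V:
      # * + H2O -> *OH,  *OH -> *O,  *O + H2O -> *OOH,  *OOH -> * + O2
      dG = [1.6, 1.5, 1.7, 0.12]           # placeholder values; they must sum to ~4.92 eV
      assert abs(sum(dG) - 4.92) < 0.05    # 4 * 1.23 eV for the overall water-splitting step

      limiting = max(dG)                   # potential-determining step
      overpotential = limiting - 1.23      # theoretical OER overpotential (V)
      print(f"limiting step dG = {limiting:.2f} eV, overpotential = {overpotential:.2f} V")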

  19. Evolution of the U.S. Energy Service Company Industry: Market Size and Project Performance from 1990-2008

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Goldman, Charles A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-05-08

    investment. There is empirical evidence confirming that the industry is responding to customer demand by installing more comprehensive and complex measures—including onsite generation and measures to address deferred maintenance—but this evolution has significant implications for customer project economics, especially at K-12 schools. We found that the median simple payback time has increased from 1.9 to 3.2 years in private sector projects since the early-to-mid 1990s and from 5.2 to 10.5 years in public sector projects for the same time period.

  20. Rigorous project for existing houses. Energy conservation requires evolution; Rigoureus project voor bestaande woningen. Evolutie voor energiebesparing nodig

    Energy Technology Data Exchange (ETDEWEB)

    Clocquet, R. [DHV, Amersfoort (Netherlands); Koene, F. [ECN Efficiency and Infrastructure, Petten (Netherlands)

    2010-05-15

    How can existing terraced houses be renovated in such a way that their total energy use decreases by 75 percent? In the Rigorous project, the Energy research Centre of the Netherlands (ECN), TNO, Delft University of Technology and DHV developed innovative renovation concepts that make such savings feasible by combining constructional measures with installation concepts. On top of that, it is also essential that consumer behavior is addressed.

  1. Evolution of ion-exchange: from Moses to the Manhattan Project to modern times.

    Science.gov (United States)

    Lucy, Charles A

    2003-06-06

    This article explores the history of ion-exchange from records of desalination in the Old Testament and the writings of Aristotle, to the identification of the phenomenon of ion-exchange by two English agricultural chemists, to the invention of suppressed conductivity by Small et al. [Anal. Chem. 54 (1975) 462]. It then focuses on the characteristics of the gradual and continuous evolution of ion chromatography with suppressed conductivity to its current state, with an emphasis on those discoveries that punctuated or revolutionized this evolution.

  2. POISSON project. III. Investigating the evolution of the mass accretion rate

    Science.gov (United States)

    Antoniucci, S.; García López, R.; Nisini, B.; Caratti o Garatti, A.; Giannini, T.; Lorenzetti, D.

    2014-12-01

    Context. As part of the Protostellar Optical-Infrared Spectral Survey On NTT (POISSON) project, we present the results of the analysis of low-resolution near-IR spectroscopic data (0.9-2.4 μm) of two samples of young stellar objects in the Lupus (52 objects) and Serpens (17 objects) star-forming clouds, with masses in the range of 0.1 to 2.0 M⊙ and ages spanning from 10^5 to a few 10^7 yr. Aims: After determining the accretion parameters of the targets by analysing their H I near-IR emission features, we added the results from the Lupus and Serpens clouds to those from previous regions (investigated in POISSON with the same methodology) to obtain a final catalogue (143 objects) of mass accretion rate values (Ṁacc) derived in a homogeneous and consistent fashion. Our final goal is to analyse how Ṁacc correlates with the stellar mass (M∗) and how it evolves in time in the whole POISSON sample. Methods: We derived the accretion luminosity (Lacc) and Ṁacc for Lupus and Serpens objects from the Brγ (Paβ in a few cases) line by using relevant empirical relationships available in the literature that connect the H I line luminosity and Lacc. To minimise the biases that arise from adopting literature data that are based on different evolutionary models and also for self-consistency, we re-derived mass and age for each source of the POISSON samples using the same set of evolutionary tracks. Results: We observe a correlation Ṁacc ∝ M*^2.2 between mass accretion rate and stellar mass, similarly to what has previously been observed in several star-forming regions. We find that the time variation of Ṁacc is roughly consistent with the expected evolution of the accretion rate in viscous disks, with an asymptotic decay that behaves as t^-1.6. However, Ṁacc values are characterised by a large scatter at similar ages and are on average higher than the predictions of viscous models. Conclusions: Although part of the scattering may be related to systematics due to the
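
    The correlation quoted above (Ṁacc ∝ M*^2.2) is the kind of power law that is usually obtained by a straight-line fit in log-log space. The sketch below only demonstrates that procedure on synthetic data with an assumed slope; it does not use the POISSON measurements.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic sample: stellar masses (Msun) and accretion rates following an
      # assumed Mdot ~ M^2.2 law with 0.5 dex of scatter (illustrative only).
      mass = 10 ** rng.uniform(-1.0, 0.3, 150)                  # roughly 0.1 - 2 Msun
      log_mdot = -8.0 + 2.2 * np.log10(mass) + rng.normal(0.0, 0.5, mass.size)

      # Least-squares power-law fit in log-log space.
      slope, intercept = np.polyfit(np.log10(mass), log_mdot, 1)
      print(f"fitted exponent alpha = {slope:.2f}  (input value was 2.2)")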

  3. Evolution of the U.S. energy service company industry: Market size and project performance from 1990–2008

    International Nuclear Information System (INIS)

    Larsen, Peter H.; Goldman, Charles A.; Satchwell, Andrew

    2012-01-01

    The U.S. energy service company (ESCO) industry is an example of a private sector business model where energy savings are delivered to customers primarily through the use of performance-based contracts. This study was conceived as a snapshot of the ESCO industry prior to the economic slowdown and the introduction of federal stimulus funding mandated by enactment of the American Recovery and Reinvestment Act of 2009 (ARRA). This study utilizes two parallel analytic approaches to characterize ESCO industry and market trends in the U.S.: (1) a “top-down” approach involving a survey of individual ESCOs to estimate aggregate industry activity and (2) a “bottom-up” analysis of a database of ∼3250 projects (representing over $8B in project investment) that reports market trends including installed EE retrofit strategies, project installation costs and savings, project payback times, and benefit-cost ratios over time. Despite the onset of a severe economic recession, the U.S. ESCO industry managed to grow at about 7% per year between 2006 and 2008. ESCO industry revenues were about $4.1 billion in 2008 and ESCOs anticipate accelerated growth through 2011 (25% per year). We found that 2484 ESCO projects in our database generated ∼$4.0 billion ($2009) in net, direct economic benefits to their customers. We estimate that the ESCO project database includes about 20% of all U.S. ESCO market activity from 1990–2008. Assuming the net benefits per project are comparable for ESCO projects that are not included in the LBNL database, this would suggest that the ESCO industry has generated ∼$23 billion in net direct economic benefits for customers at projects installed between 1990 and 2008. There is empirical evidence confirming that the industry is evolving by installing more comprehensive and complex measures—including onsite generation and measures to address deferred maintenance—but this evolution has significant implications for customer project economics

  4. A computer graphics pilot project - Spacecraft mission support with an interactive graphics workstation

    Science.gov (United States)

    Hagedorn, John; Ehrner, Marie-Jacqueline; Reese, Jodi; Chang, Kan; Tseng, Irene

    1986-01-01

    The NASA Computer Graphics Pilot Project was undertaken to enhance the quality control, productivity and efficiency of mission support operations at the Goddard Operations Support Computing Facility. The Project evolved into a set of demonstration programs for graphics intensive simulated control room operations, particularly in connection with the complex space missions that began in the 1980s. Complex missions mean more data, and graphic displays are a means of reducing the probability of operator error. Workstations were selected with 1024 x 768 pixel color displays controlled by a custom VLSI chip coupled to an MC68010 chip running UNIX within a shell that permits operations through the medium of mouse-accessed pulldown window menus. The distributed workstations run off a host NAS 8040 computer. Applications of the system for tracking spacecraft orbits and monitoring Shuttle payload handling illustrate the system capabilities, noting the built-in capabilities of shifting the point of view and rotating and zooming in on three-dimensional views of spacecraft.

  5. Correction of computed tomography motion artifacts using pixel-specific back-projection

    International Nuclear Information System (INIS)

    Ritchie, C.J.; Crawford, C.R.; Godwin, J.D.; Kim, Y.; King, K.F.

    1996-01-01

    Cardiac and respiratory motion can cause artifacts in computed tomography scans of the chest. The authors describe a new method for reducing these artifacts called pixel-specific back-projection (PSBP). PSBP reduces artifacts caused by in-plane motion by reconstructing each pixel in a frame of reference that moves with the in-plane motion in the volume being scanned. The motion of the frame of reference is specified by constructing maps that describe the motion of each pixel in the image at the time each projection was measured; these maps are based on measurements of the in-plane motion. PSBP has been tested in computer simulations and with volunteer data. In computer simulations, PSBP removed the structured artifacts caused by motion. In scans of two volunteers, PSBP reduced doubling and streaking in chest scans to a level that made the images clinically useful. PSBP corrections of liver scans were less satisfactory because the motion of the liver is predominantly superior-inferior (S-I). PSBP uses a unique set of motion parameters to describe the motion at each point in the chest as opposed to requiring that the motion be described by a single set of parameters. Therefore, PSBP may be more useful in correcting clinical scans than are other correction techniques previously described

  6. Development of computer assisted learning program using cone beam projection for head radiography

    International Nuclear Information System (INIS)

    Nakazeko, Kazuma; Araki, Misao; Kajiwara, Hironori; Watanabe, Hiroyuki; Kuwayama, Jun; Karube, Shuhei; Hashimoto, Takeyuki; Shinohara, Hiroyuki

    2012-01-01

    We present a computer assisted learning (CAL) program to simulate head radiography. The program provides cone beam projections of a target volume, simulating three-dimensional computed tomography (CT) of a head phantom. The generated image is 512 x 512 x 512 pixels with each pixel 0.6 mm on a side. The imaging geometry, such as X-ray tube orientation and phantom orientation, can be varied. The graphical user interface (GUI) of the CAL program allows the study of the effects of varying the imaging geometry; each simulated projection image is shown quickly in an adjoining window. Simulated images with an assigned geometry were compared with the image obtained using the standard geometry in clinical use. The accuracy of the simulated image was verified through comparison with the image acquired using radiography of the head phantom, subsequently processed with a computed radiography system (CR image). Based on correlation coefficient analysis and visual assessment, it was concluded that the CAL program can satisfactorily simulate the CR image. Therefore, it should be useful for the training of head radiography. (author)

  7. High performance simulation for the Silva project using the tera computer

    Energy Technology Data Exchange (ETDEWEB)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F. [CS Communication and Systemes, 92 - Clamart (France); Boulet, M.; Scheurer, B. [CEA Bruyeres-le-Chatel, 91 - Bruyeres-le-Chatel (France); Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A. [CEA Saclay, 91 - Gif sur Yvette (France)

    2003-07-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues for optimizing the parallelization of the PRODIGE code on TERA. Thus, we discuss advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it has been found fruitful to tune the code in three respects: memory allocation, MPI communications and interconnection network bandwidth usage. We stress the interest of MPI/IO in this context and the benefit obtained for production computations on TERA. Finally, we illustrate our developments and indicate some performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of the laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  8. High performance simulation for the Silva project using the tera computer

    International Nuclear Information System (INIS)

    Bergeaud, V.; La Hargue, J.P.; Mougery, F.; Boulet, M.; Scheurer, B.; Le Fur, J.F.; Comte, M.; Benisti, D.; Lamare, J. de; Petit, A.

    2003-01-01

    In the context of the SILVA Project (Atomic Vapor Laser Isotope Separation), numerical simulation of the plant scale propagation of laser beams through uranium vapour was a great challenge. The PRODIGE code has been developed to achieve this goal. Here we focus on the task of achieving high performance simulation on the TERA computer. We describe the main issues for optimizing the parallelization of the PRODIGE code on TERA. Thus, we discuss advantages and drawbacks of the implemented diagonal parallelization scheme. As a consequence, it has been found fruitful to tune the code in three respects: memory allocation, MPI communications and interconnection network bandwidth usage. We stress the interest of MPI/IO in this context and the benefit obtained for production computations on TERA. Finally, we illustrate our developments and indicate some performance measurements reflecting the good parallelization properties of PRODIGE on the TERA computer. The code is currently used for demonstrating the feasibility of the laser propagation at a plant enrichment level and for preparing the 2003 Menphis experiment. We conclude by emphasizing the contribution of high performance TERA simulation to the project. (authors)

  9. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    Energy Technology Data Exchange (ETDEWEB)

    Amadio, G.; et al.

    2017-11-22

    An intensive R&D and programming effort is required to accomplish the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting the latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading) and high-level parallelism (MPI), leveraging both the multi-core and the many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests in order to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.

  10. Inference of Tumor Evolution during Chemotherapy by Computational Modeling and In Situ Analysis of Genetic and Phenotypic Cellular Diversity

    Directory of Open Access Journals (Sweden)

    Vanessa Almendro

    2014-02-01

    Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and posttreatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

  11. Inference of tumor evolution during chemotherapy by computational modeling and in situ analysis of genetic and phenotypic cellular diversity

    International Nuclear Information System (INIS)

    Almendro, Vanessa; Cheng, Yu-Kang; Randles, Amanda; Itzkovitz, Shalev; Marusyk, Andriy; Ametller, Elisabet; Gonzalez-Farre, Xavier; Muñoz, Montse; Russnes, Hege G.; Helland, Åslaug; Rye, Inga H.; Borresen-Dale, Anne-Lise; Maruyama, Reo; Van Oudenaarden, Alexander; Dowsett, Mitchell; Jones, Robin L.; Reis-Filho, Jorge; Gascon, Pere; Gönen, Mithat; Michor, Franziska; Polyak, Kornelia

    2014-01-01

    Cancer therapy exerts a strong selection pressure that shapes tumor evolution, yet our knowledge of how tumors change during treatment is limited. Here, we report the analysis of cellular heterogeneity for genetic and phenotypic features and their spatial distribution in breast tumors pre- and post-neoadjuvant chemotherapy. We found that intratumor genetic diversity was tumor-subtype specific, and it did not change during treatment in tumors with partial or no response. However, lower pretreatment genetic diversity was significantly associated with pathologic complete response. In contrast, phenotypic diversity was different between pre- and post-treatment samples. We also observed significant changes in the spatial distribution of cells with distinct genetic and phenotypic features. We used these experimental data to develop a stochastic computational model to infer tumor growth patterns and evolutionary dynamics. Our results highlight the importance of integrated analysis of genotypes and phenotypes of single cells in intact tissues to predict tumor evolution.

  12. Conceptual Evolution and Importance of Andragogy towards the Scope Optimization of University Academic Rural Development Programs and Projects

    Directory of Open Access Journals (Sweden)

    José Bernal Azofeifa-Bolaños

    2016-12-01

    This study was carried out with the objective of describing the evolution and importance of andragogical processes in the search for rural profiles committed to university work in the development and implementation of programs and projects. Among its main contributions, it highlights the importance of university coordinators of programs and projects knowing and applying teaching processes designed specifically for adults. Applying this kind of knowledge allows the efficient use of institutional financial resources, particularly by securing the real commitment of the rural adult community to the implementation of field activities and by reaching the expected academic achievements in a shorter term. A successful project experience is described in which andragogical strategies were applied through extension, producing better participation and engagement of rural people with the projects developed by the University. Consequently, the application of these concepts in rural development programs and projects promoted through universities should lay the foundation for regional rural development strategies, with the ultimate goal of finding ways to improve the quality of life of people in particular settings.

  13. Evolution of the cellular communication system: An analysis in the Computational Paradigm

    International Nuclear Information System (INIS)

    Tahir Shah, K.

    1995-03-01

    We discuss the problem of the evolution of the cellular communication system from the RNA world to the progenote to the modern cell. Our method analyses the syntactic structure of molecular fossils in the non-coding regions of DNA within the information-processing gene model developed earlier. We conclude that sequence-specific binding is an ancient communication process with its origin in the RNA world. Moreover, we illustrate our viewpoint using four evolution snapshots, from the first RNA segments, some 4.1 billion years ago, to the first cell, 3.8 billion years ago. (author). 31 refs

  14. Application of the Computer Capacity to the Analysis of Processors Evolution

    OpenAIRE

    Ryabko, Boris; Rakitskiy, Anton

    2017-01-01

    The notion of computer capacity was proposed in 2012, and this quantity has been estimated for computers of different kinds. In this paper we show that, when designing new processors, the manufacturers change the parameters that affect the computer capacity. This allows us to predict the values of parameters of future processors. As the main example we use Intel processors, due to the availability of detailed descriptions of their technical characteristics.
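
    For orientation only: the computer capacity referred to here is defined by analogy with channel capacity. The line below is a hedged sketch of the kind of limit involved, under the assumption that N(T) denotes the number of distinct admissible instruction sequences whose execution takes time T; the paper's exact formulation may differ.

    C(I) = \lim_{T \to \infty} \frac{\log N(T)}{T}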

  15. Peculiarities of organization of project and research activity of students in computer science, physics and technology

    Science.gov (United States)

    Stolyarov, I. V.

    2017-01-01

    The author of this article supervises project and research activities of students in the areas of computer science, physics, engineering and biology, drawing on experience acquired in these fields. His students regularly win competitions and conferences at different levels; for example, three were finalists of Intel ISEF in 2013 in Phoenix (Arizona, USA) and in 2014 in Los Angeles (California, USA). In 2013, A. Makarychev received the "Small Nobel prize" in the Computer Science section and a special award from the sponsor, the company CAST. The scientific topics and methods suggested by the author and developed in joint publications with students from Russia, Germany and Austria have led to patents for inventions and registration certificates from ROSPATENT. The article presents the results of the implementation of specific software and hardware systems in physics, engineering and medicine.

  16. The MELANIE project: from a biopsy to automatic protein map interpretation by computer.

    Science.gov (United States)

    Appel, R D; Hochstrasser, D F; Funk, M; Vargas, J R; Pellegrini, C; Muller, A F; Scherrer, J R

    1991-10-01

    The goals of the MELANIE project are to determine if disease-associated patterns can be detected in high resolution two-dimensional polyacrylamide gel electrophoresis (HR 2D-PAGE) images and if a diagnosis can be established automatically by computer. The ELSIE/MELANIE system is a set of computer programs which automatically detect, quantify, and compare protein spots shown on HR 2D-PAGE images. Classification programs help the physician to find disease-associated patterns from a given set of two-dimensional gel electrophoresis images and to form diagnostic rules. Prototype expert systems that use these rules to establish a diagnosis from new HR 2D-PAGE images have been developed. They successfully diagnosed cirrhosis of the liver and were able to distinguish a variety of cancer types from biopsies known to be cancerous.

  17. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  18. Computation of fragment velocities and projection angles of an anti-aircraft round

    CSIR Research Space (South Africa)

    Snyman, IM

    2014-09-01

    the reference point up to the first part. The first part starts at the last points on the tail end and ends at the beginning of part 2. The calculated mass of each cylinder is also shown. Table 1: The position and characteristics of the cylindrical rings... projection power as Hexal P30 according to Langen and Barth, 1979. See also Section 4 below. • To facilitate the computation (especially the elapsed time of the runs), a rectangular aluminium solid models the mass of the fuse. 3.2 Model Set-up ANSYS...

  19. Computing segmentations directly from x-ray projection data via parametric deformable curves

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen; Dahl, Anders Bjorholm; Hansen, Per Christian

    2018-01-01

    We describe an efficient algorithm that computes a segmented reconstruction directly from x-ray projection data. Our algorithm uses a parametric curve to define the segmentation. Unlike similar approaches which are based on level-sets, our method avoids a pixel or voxel grid; hence the number of unknowns is reduced to the set of points that define the curve, and the attenuation coefficients of the segments. Our current implementation uses a simple closed curve and is capable of separating one object from the background. However, our basic algorithm can be applied to an arbitrary topology and multiple...

  20. Exploring the use of computer-mediated video communication in engineering projects in South Africa

    Directory of Open Access Journals (Sweden)

    Meyer, Izak P.

    2016-08-01

    Globally-expanding organisations that are trying to capitalise on distributed skills are increasingly using virtual project teams to shorten product development time and increase quality. These virtual teams, which are distributed across countries, cultures, and time zones, are required to use faster and better ways of interacting. Past research has shown that virtual teams that use computer-mediated communication (CMC) instead of face-to-face communication are less cohesive because they struggle with mistrust, controlling behaviour, and communication breakdowns. This study aims to determine whether project practitioners in South Africa perceive virtual teams that use videoconferencing as suffering from the same CMC disadvantages described in past research in other environments; and if they do, what the possible causes could be. This paper reports on a survey of 106 project practitioners in South Africa. The results show that these project practitioners prefer face-to-face communication over CMC, and perceive virtual teams using videoconferencing to be less cohesive and to suffer from mistrust and communication breakdowns, but not from increased conflict and power struggles. The perceived shortcomings of videoconferencing might result from virtual teams that use this medium having less time to build interpersonal relationships.

  1. Incomplete projection reconstruction of computed tomography based on the modified discrete algebraic reconstruction technique

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Gao, Zongzhao; Yang, YaFei

    2018-02-01

    Based on the discrete algebraic reconstruction technique (DART), this study proposes and tests an improved algorithm for incomplete projection data that generates a high-quality reconstruction by reducing artifacts and noise in computed tomography. For the incomplete projections, an augmented Lagrangian method based on compressed sensing is first used in the initial reconstruction for the DART segmentation step, yielding higher contrast between boundary and non-boundary pixels. Then, a block-matching 3D filtering operator is used to suppress noise and improve the gray-level distribution of the reconstructed image. Finally, simulation studies on a polychromatic spectrum were performed to test the performance of the new algorithm. The results show a significant improvement in the signal-to-noise ratios (SNRs) and average gradients (AGs) of images reconstructed from incomplete data: the SNRs and AGs of images reconstructed by DART-ALBM were on average 30%-40% and 10% higher, respectively, than those reconstructed by DART. Since the improved DART-ALBM algorithm is more robust for limited-view reconstruction, producing clear image edges and a better gray-level distribution for non-boundary pixels, it has the potential to improve image quality from incomplete or sparse projections.
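
    As a hedged illustration of the DART-style segmentation step that the abstract builds on (not the authors' DART-ALBM code, and with assumed gray levels), the sketch below thresholds a continuous reconstruction to a small set of admissible gray levels and identifies the boundary pixels that a DART iteration would subsequently update.

    import numpy as np

    def dart_segment(recon, gray_levels):
        # Map each pixel to the nearest admissible gray level.
        levels = np.asarray(gray_levels, dtype=float)
        idx = np.argmin(np.abs(recon[..., None] - levels), axis=-1)
        return levels[idx]

    def boundary_mask(segmented):
        # A pixel is a boundary pixel if any 4-neighbour has a different label.
        m = np.zeros(segmented.shape, dtype=bool)
        m[:-1, :] |= segmented[:-1, :] != segmented[1:, :]
        m[1:, :] |= segmented[1:, :] != segmented[:-1, :]
        m[:, :-1] |= segmented[:, :-1] != segmented[:, 1:]
        m[:, 1:] |= segmented[:, 1:] != segmented[:, :-1]
        return m

    recon = np.random.default_rng(1).random((8, 8))          # stand-in reconstruction
    seg = dart_segment(recon, gray_levels=[0.0, 0.5, 1.0])   # assumed material levels
    print(int(boundary_mask(seg).sum()), "boundary pixels")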

  2. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  3. PhyloSim - Monte Carlo simulation of sequence evolution in the R statistical computing environment

    Directory of Open Access Journals (Sweden)

    Massingham Tim

    2011-04-01

    Background: The Monte Carlo simulation of sequence evolution is routinely used to assess the performance of phylogenetic inference methods and sequence alignment algorithms. Progress in the field of molecular evolution fuels the need for more realistic and hence more complex simulations, adapted to particular situations, yet current software makes unreasonable assumptions such as homogeneous substitution dynamics or a uniform distribution of indels across the simulated sequences. This calls for an extensible simulation framework written in a high-level functional language, offering new functionality and making it easy to incorporate further complexity. Results: PhyloSim is an extensible framework for the Monte Carlo simulation of sequence evolution, written in R, using the Gillespie algorithm to integrate the actions of many concurrent processes such as substitutions, insertions and deletions. Uniquely among sequence simulation tools, PhyloSim can simulate arbitrarily complex patterns of rate variation and multiple indel processes, and allows for the incorporation of selective constraints on indel events. User-defined complex patterns of mutation and selection can be easily integrated into simulations, allowing PhyloSim to be adapted to specific needs. Conclusions: Close integration with R and the wide range of features implemented offer unmatched flexibility, making it possible to simulate sequence evolution under a wide range of realistic settings. We believe that PhyloSim will be useful to future studies involving simulated alignments.
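
    PhyloSim itself is an R package; as a hedged, language-agnostic illustration of the Gillespie approach it uses, the Python sketch below simulates substitutions along a single branch under a Jukes-Cantor-like model with a per-site rate mu. Indels, rate variation and selective constraints are omitted, and all names are illustrative.

    import random

    def gillespie_substitutions(seq, mu, branch_length, seed=42):
        # Gillespie simulation: exponential waiting times between substitution
        # events, each event hitting one site chosen uniformly at random.
        rng = random.Random(seed)
        seq = list(seq)
        alphabet = "ACGT"
        t = 0.0
        while True:
            total_rate = mu * len(seq)          # every site can change
            t += rng.expovariate(total_rate)    # waiting time to the next event
            if t > branch_length:
                break
            site = rng.randrange(len(seq))
            seq[site] = rng.choice([b for b in alphabet if b != seq[site]])
        return "".join(seq)

    print(gillespie_substitutions("ACGTACGTACGT", mu=0.5, branch_length=1.0))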

  4. Unconventional computing using evolution-in-nanomaterio: neural networks meet nanoparticle networks

    NARCIS (Netherlands)

    Greff, Klaus; van Damme, Rudolf M.J.; Koutnik, Jan; Broersma, Haitze J.; Mikhal, Julia Olegivna; Lawrence, Celestine Preetham; van der Wiel, Wilfred Gerard; Schmidhuber, Jürgen

    2016-01-01

    Recently published experimental work on evolution-in-materio applied to nanoscale materials shows promising results for future reconfigurable devices. These experiments were performed on disordered nano-particle networks that have no predefined design. The material has been treated as a blackbox,

  5. Method of computer generation and projection recording of microholograms for holographic memory systems: mathematical modelling and experimental implementation

    International Nuclear Information System (INIS)

    Betin, A Yu; Bobrinev, V I; Evtikhiev, N N; Zherdev, A Yu; Zlokazov, E Yu; Lushnikov, D S; Markin, V V; Odinokov, S B; Starikov, S N; Starikov, R S

    2013-01-01

    A method of computer generation and projection recording of microholograms for holographic memory systems is presented; the results of mathematical modelling and experimental implementation of the method are demonstrated. (holographic memory)

  6. Projective methodical system of students training to the course «History of computer science»

    OpenAIRE

    С А Виденин

    2008-01-01

    Components of teachers' readiness for professional activity are described in the article. Projective methods of teaching the course «History of computer science», aimed at improving the professional grounding of students, are considered.

  7. Trans-Amazon Drilling Project (TADP): origins and evolution of the forests, climate, and hydrology of the South American tropics

    Science.gov (United States)

    Baker, P. A.; Fritz, S. C.; Silva, C. G.; Rigsby, C. A.; Absy, M. L.; Almeida, R. P.; Caputo, M.; Chiessi, C. M.; Cruz, F. W.; Dick, C. W.; Feakins, S. J.; Figueiredo, J.; Freeman, K. H.; Hoorn, C.; Jaramillo, C.; Kern, A. K.; Latrubesse, E. M.; Ledru, M. P.; Marzoli, A.; Myrbo, A.; Noren, A.; Piller, W. E.; Ramos, M. I. F.; Ribas, C. C.; Trnadade, R.; West, A. J.; Wahnfried, I.; Willard, D. A.

    2015-12-01

    This article presents the scientific rationale for an ambitious ICDP drilling project to continuously sample Late Cretaceous to modern sediment in four different sedimentary basins that transect the equatorial Amazon of Brazil, from the Andean foreland to the Atlantic Ocean. The goals of this project are to document the evolution of plant biodiversity in the Amazon forests and to relate biotic diversification to changes in the physical environment, including climate, tectonism, and the surface landscape. These goals require long sedimentary records from each of the major sedimentary basins across the heart of the Brazilian Amazon, which can only be obtained by drilling because of the scarcity of Cenozoic outcrops. The proposed drilling will provide the first long, nearly continuous regional records of the Cenozoic history of the forests, their plant diversity, and the associated changes in climate and environment. It also will address fundamental questions about landscape evolution, including the history of Andean uplift and erosion as recorded in Andean foreland basins and the development of west-to-east hydrologic continuity between the Andes, the Amazon lowlands, and the equatorial Atlantic. Because many modern rivers of the Amazon basin flow along the major axes of the old sedimentary basins, we plan to locate drill sites on the margin of large rivers and to access the targeted drill sites by navigation along these rivers.

  8. Trans-Amazon Drilling Project (TADP): origins and evolution of the forests, climate, and hydrology of the South American tropics

    Science.gov (United States)

    Baker, P.A.; Fritz, S.C.; Silva, C.G.; Rigsby, C.A.; Absy, M.L.; Almeida, R.P.; Caputo, Maria C.; Chiessi, C.M.; Cruz, F.W.; Dick, C.W.; Feakins, S.J.; Figueiredo, J.; Freeman, K.H.; Hoorn, C.; Jaramillo, C.A.; Kern, A.; Latrubesse, E.M.; Ledru, M.P.; Marzoli, A.; Myrbo, A.; Noren, A.; Piller, W.E.; Ramos, M.I.F.; Ribas, C.C.; Trinadade, R.; West, A.J.; Wahnfried, I.; Willard, Debra A.

    2015-01-01

    This article presents the scientific rationale for an ambitious ICDP drilling project to continuously sample Late Cretaceous to modern sediment in four different sedimentary basins that transect the equatorial Amazon of Brazil, from the Andean foreland to the Atlantic Ocean. The goals of this project are to document the evolution of plant biodiversity in the Amazon forests and to relate biotic diversification to changes in the physical environment, including climate, tectonism, and the surface landscape. These goals require long sedimentary records from each of the major sedimentary basins across the heart of the Brazilian Amazon, which can only be obtained by drilling because of the scarcity of Cenozoic outcrops. The proposed drilling will provide the first long, nearly continuous regional records of the Cenozoic history of the forests, their plant diversity, and the associated changes in climate and environment. It also will address fundamental questions about landscape evolution, including the history of Andean uplift and erosion as recorded in Andean foreland basins and the development of west-to-east hydrologic continuity between the Andes, the Amazon lowlands, and the equatorial Atlantic. Because many modern rivers of the Amazon basin flow along the major axes of the old sedimentary basins, we plan to locate drill sites on the margin of large rivers and to access the targeted drill sites by navigation along these rivers.

  9. Revealing fatigue damage evolution in unidirectional composites for wind turbine blades using x-ray computed tomography

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Pilgaard

    Thereby, it will be possible to lower the cost of energy for wind energy based electricity. In the presented work, a lab-source x-ray computed tomography equipment (Zeiss Xradia 520 Versa) has been used in connection with ex-situ fatigue testing of uni-directional composites in order to identify fibre...... to other comparable x-ray studies) have been used in order to ensure a representative test volume during the ex-situ fatigue testing. Using the ability of the x-ray computed tomography to zoom into regions of interest, non-destructive, the fatigue damage evolution in a repeating ex-situ fatigue loaded test...... improving the fatigue resistance of non-crimp fabric used in the wind turbine industry can be made....

  10. Evolution of teaching and evaluation methodologies: The experience in the computer programming course at the Universidad Nacional de Colombia

    Directory of Open Access Journals (Sweden)

    Jonatan Gomez Perdomo

    2014-05-01

    In this paper, we present the evolution of a computer-programming course at the Universidad Nacional de Colombia (UNAL). The teaching methodology has evolved from a linear and non-standardized methodology to a flexible, non-linear and student-centered methodology. Our methodology uses an e-learning platform that supports the learning process by offering students and professors custom navigation between the content and material in an interactive way (book chapters, exercises, videos). Moreover, the platform is open access, and approximately 900 students from the university take this course each term. However, our evaluation methodology has evolved from static evaluations based on paper tests to an online process based on computer adaptive testing (CAT) that chooses the questions to ask a student and assigns the student a grade according to the student’s ability.
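
    The abstract does not describe the item-selection rule used by the platform; as a hedged sketch of the core idea of computerized adaptive testing, the code below picks, under an assumed Rasch (one-parameter logistic) model, the item whose Fisher information is largest at the current ability estimate. The item difficulties and the ability value are illustrative.

    import math

    def p_correct(theta, difficulty):
        # Rasch model: probability of a correct answer given ability theta.
        return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

    def item_information(theta, difficulty):
        # Fisher information of a Rasch item is p * (1 - p).
        p = p_correct(theta, difficulty)
        return p * (1.0 - p)

    def next_item(theta, remaining_difficulties):
        return max(remaining_difficulties, key=lambda d: item_information(theta, d))

    item_bank = [-2.0, -1.0, -0.3, 0.4, 1.1, 2.0]   # assumed item difficulties
    theta_hat = 0.2                                  # current ability estimate
    print("ask the item with difficulty", next_item(theta_hat, item_bank))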

  11. The E-MOSAICS project: simulating the formation and co-evolution of galaxies and their star cluster populations

    Science.gov (United States)

    Pfeffer, Joel; Kruijssen, J. M. Diederik; Crain, Robert A.; Bastian, Nate

    2018-04-01

    We introduce the MOdelling Star cluster population Assembly In Cosmological Simulations within EAGLE (E-MOSAICS) project. E-MOSAICS incorporates models describing the formation, evolution, and disruption of star clusters into the EAGLE galaxy formation simulations, enabling the examination of the co-evolution of star clusters and their host galaxies in a fully cosmological context. A fraction of the star formation rate of dense gas is assumed to yield a cluster population; this fraction and the population's initial properties are governed by the physical properties of the natal gas. The subsequent evolution and disruption of the entire cluster population are followed accounting for two-body relaxation, stellar evolution, and gravitational shocks induced by the local tidal field. This introductory paper presents a detailed description of the model and initial results from a suite of 10 simulations of ˜L⋆ galaxies with disc-like morphologies at z = 0. The simulations broadly reproduce key observed characteristics of young star clusters and globular clusters (GCs), without invoking separate formation mechanisms for each population. The simulated GCs are the surviving population of massive clusters formed at early epochs (z ≳ 1-2), when the characteristic pressures and surface densities of star-forming gas were significantly higher than observed in local galaxies. We examine the influence of the star formation and assembly histories of galaxies on their cluster populations, finding that (at similar present-day mass) earlier-forming galaxies foster a more massive and disruption-resilient cluster population, while galaxies with late mergers are capable of forming massive clusters even at late cosmic epochs. We find that the phenomenological treatment of interstellar gas in EAGLE precludes the accurate modelling of cluster disruption in low-density environments, but infer that simulations incorporating an explicitly modelled cold interstellar gas phase will overcome

  12. Computer aided process planning at the Oak Ridge Y-12 plant: a pilot project

    International Nuclear Information System (INIS)

    Hewgley, R.E. Jr.; Prewett, H.P. Jr.

    1979-01-01

    In 1976, a formal needs analysis was conducted in one of the Fabrication Division Shops of all activities from the receipt of an order through final machining. The results indicated deficiencies in process planning activities involving special production work. A pilot program was organized to investigate the benefits of emerging CAM technology and application of GT concepts for machining operations at the Y-12 Plant. The objective of the CAPP Project was to provide computer-assisted process planning for special production machining in the shop. The CAPP team was charged with the specific goal of demonstrating computer-aided process planning within a four-year term. The CAPP charter included a plan with intermediate measurable milestones for achieving its mission. In three years, the CAPP project demonstrated benefits to process planning. A capability to retrieve historical records for similar parts, to review accurately the status of all staff assignments, and to generate detailed machining procedures definitely can impact the way in which a machine shop prepares for new orders. The real payoff is in the hardcopy output (N/C programs, studies, sequence plans, and procedures). 4 figures,

  13. A projection gradient method for computing ground state of spin-2 Bose–Einstein condensates

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hanquan, E-mail: hanquan.wang@gmail.com [School of Statistics and Mathematics, Yunnan University of Finance and Economics, Kunming, Yunnan Province, 650221 (China); Yunnan Tongchang Scientific Computing and Data Mining Research Center, Kunming, Yunnan Province, 650221 (China)

    2014-10-01

    In this paper, a projection gradient method is presented for computing the ground state of spin-2 Bose–Einstein condensates (BEC). We first propose a general projection gradient method for solving an energy functional minimization problem under multiple constraints, in which the energy functional takes real functions as independent variables. We next extend the method to a similar problem in which the energy functional takes complex functions as independent variables. We finally employ the method to find the ground state of spin-2 BEC. The key idea of our method is that, by constructing continuous gradient flows (CGFs), the ground state of spin-2 BEC can be computed as the steady-state solution of such CGFs. We discretize the CGFs by a conservative finite difference method together with a proper treatment of the nonlinear terms. We show that the numerical discretization conserves normalization and magnetization and is energy diminishing. Numerical results for the ground state of spin-2 BEC and its energy are reported to demonstrate the effectiveness of the numerical method.
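
    As a hedged sketch of the kind of continuous gradient flow involved (our notation, not the paper's exact equations), one may write, for the five spinor components psi_m of the spin-2 condensate, a gradient flow of the energy functional that is projected back onto the two constraints after each step:

    \partial_t \psi_m = -\frac{\delta E[\Psi]}{\delta \psi_m^{*}}, \qquad m = -2,\dots,2,

    \int \sum_{m=-2}^{2} |\psi_m|^2 \, \mathrm{d}x = N \quad \text{(normalization)},
    \qquad
    \int \sum_{m=-2}^{2} m\, |\psi_m|^2 \, \mathrm{d}x = M \quad \text{(magnetization)}.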

  14. Computing the distribution of return levels of extreme warm temperatures for future climate projections

    Energy Technology Data Exchange (ETDEWEB)

    Pausader, M.; Parey, S.; Nogaj, M. [EDF/R and D, Chatou Cedex (France); Bernie, D. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-03-15

    In order to take into account uncertainties in future climate projections, there is a growing demand for probabilistic projections of climate change. This paper presents a methodology for producing such a probabilistic analysis of future temperature extremes. The 20- and 100-year return levels are obtained from that of the normalized variable and the changes in mean and standard deviation given by climate models for the desired future periods. Uncertainty in the future change of these extremes is quantified using a multi-model ensemble and a perturbed-physics ensemble. The probability density functions of future return levels are computed at a representative location from the joint probability distribution of mean and standard deviation changes given by the two combined ensembles of models. For the studied location, the 100-year return level at the end of the century is below 41 °C with 80% confidence. Then, as the number of model simulations is too low to compute a reliable distribution, two techniques proposed in the literature (local pattern scaling and ANOVA) have been used to infer the changes in mean and standard deviation for the combinations of RCM and GCM that have not been run. The ANOVA technique leads to better results for the reconstruction of the mean changes, whereas the two methods fail to correctly infer the changes in standard deviation. As the standard deviation change has a major impact on the return level change, there is a need to improve the models and the different techniques regarding variance changes. (orig.)
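
    A hedged sketch of the reconstruction implied by the abstract (our notation, not necessarily the authors'): if z_T is the T-year return level of the normalized variable (X - mu)/sigma, and Delta mu and Delta sigma are the model-projected changes in mean and standard deviation, then the future T-year return level is obtained as

    x_T^{\mathrm{fut}} = (\mu + \Delta\mu) + (\sigma + \Delta\sigma)\, z_T .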

  15. A projection gradient method for computing ground state of spin-2 Bose–Einstein condensates

    International Nuclear Information System (INIS)

    Wang, Hanquan

    2014-01-01

    In this paper, a projection gradient method is presented for computing the ground state of spin-2 Bose–Einstein condensates (BEC). We first propose a general projection gradient method for solving an energy functional minimization problem under multiple constraints, in which the energy functional takes real functions as independent variables. We next extend the method to a similar problem in which the energy functional takes complex functions as independent variables. We finally employ the method to find the ground state of spin-2 BEC. The key idea of our method is that, by constructing continuous gradient flows (CGFs), the ground state of spin-2 BEC can be computed as the steady-state solution of such CGFs. We discretize the CGFs by a conservative finite difference method together with a proper treatment of the nonlinear terms. We show that the numerical discretization conserves normalization and magnetization and is energy diminishing. Numerical results for the ground state of spin-2 BEC and its energy are reported to demonstrate the effectiveness of the numerical method.

  16. Legacy systems: managing evolution through integration in a distributed and object-oriented computing environment.

    Science.gov (United States)

    Lemaitre, D; Sauquet, D; Fofol, I; Tanguy, L; Jean, F C; Degoulet, P

    1995-01-01

    Legacy systems are crucial for organizations since they support key functionalities, but they become obsolete with age and with the appearance of new techniques. Managing their evolution is a key issue in software engineering. This paper presents a strategy developed at Broussais University Hospital in Paris to make a legacy system devoted to the management of health care units evolve towards new, up-to-date software. A two-phase evolution pathway is described. The first phase consists in separating the interface from the data storage and application control and in using a communication channel between the individualized components. The second phase proposes to use an object-oriented DBMS in place of the homegrown system. An application example for the management of hypertensive patients is described.

  17. Projection multiplex recording of computer-synthesised one-dimensional Fourier holograms for holographic memory systems: mathematical and experimental modelling

    Energy Technology Data Exchange (ETDEWEB)

    Betin, A Yu; Bobrinev, V I; Verenikina, N M; Donchenko, S S; Odinokov, S B [Research Institute ‘Radiotronics and Laser Engineering’, Bauman Moscow State Technical University, Moscow (Russian Federation); Evtikhiev, N N; Zlokazov, E Yu; Starikov, S N; Starikov, R S [National Research Nuclear University MEPhI (Moscow Engineering Physics Institute), Moscow (Russian Federation)

    2015-08-31

    A multiplex method of recording computer-synthesised one-dimensional Fourier holograms intended for holographic memory devices is proposed. The method potentially allows increasing the recording density in the previously proposed holographic memory system based on the computer synthesis and projection recording of data page holograms. (holographic memory)

  18. INTEGRATION OF ECONOMIC AND COMPUTER SKILLS AT IMPLEMENTATION OF STUDENTS PROJECT «BUSINESS PLAN PRODUCING IN MICROSOFT WORD»

    Directory of Open Access Journals (Sweden)

    Y.B. Samchinska

    2012-07-01

    The article substantiates the expedience, for students of economic specialities, of a complex student project in Informatics and Computer Science based on the creation of a business plan using modern information technologies, and also presents methodical recommendations for the implementation of this project.

  19. The Students Upgrading through Computer and Career Education Systems Services (Project SUCCESS). 1990-91 Final Evaluation Profile. OREA Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Research, Evaluation, and Assessment.

    An evaluation was done of the New York City Public Schools' Student Upgrading through Computer and Career Education Systems Services Program (Project SUCCESS). Project SUCCESS operated at 3 high schools in Brooklyn and Manhattan (Murry Bergtraum High School, Edward R. Murrow High School, and John Dewey High School). It enrolled limited English…

  20. Evolution of the Milieu Approach for Software Development for the Polymorphous Computing Architecture Program

    National Research Council Canada - National Science Library

    Dandass, Yoginder

    2004-01-01

    A key goal of the DARPA Polymorphous Computing Architectures (PCA) program is to develop reactive closed-loop systems that are capable of being dynamically reconfigured in order to respond to changing mission scenarios...

  1. Evolution of product lifespan and implications for environmental assessment and management: a case study of personal computers in higher education.

    Science.gov (United States)

    Babbitt, Callie W; Kahhat, Ramzy; Williams, Eric; Babbitt, Gregory A

    2009-07-01

    Product lifespan is a fundamental variable in understanding the environmental impacts associated with the life cycle of products. Existing life cycle and materials flow studies of products, almost without exception, consider lifespan to be constant over time. To determine the validity of this assumption, this study provides an empirical documentation of the long-term evolution of personal computer lifespan, using a major U.S. university as a case study. Results indicate that over the period 1985-2000, computer lifespan (purchase to "disposal") decreased steadily from a mean of 10.7 years in 1985 to 5.5 years in 2000. The distribution of lifespan also evolved, becoming narrower over time. Overall, however, lifespan distribution was broader than normally considered in life cycle assessments or materials flow forecasts of electronic waste management for policy. We argue that these results suggest that at least for computers, the assumption of constant lifespan is problematic and that it is important to work toward understanding the dynamics of use patterns. We modify an age-structured model of population dynamics from biology as a modeling approach to describe product life cycles. Lastly, the purchase share and generation of obsolete computers from the higher education sector is estimated using different scenarios for the dynamics of product lifespan.
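
    As a hedged sketch of the age-structured bookkeeping described above (not the authors' model; the survival fractions and purchase numbers below are invented for illustration), each year the installed base ages by one step, a fraction of each age class survives, the remainder is counted as obsolete, and new purchases enter at age zero.

    import numpy as np

    def step(stock_by_age, survival, purchases):
        # Advance the installed base by one year.
        surviving = stock_by_age * survival
        disposed = stock_by_age.sum() - surviving.sum()   # units retired this year
        new_stock = np.empty_like(stock_by_age)
        new_stock[0] = purchases                          # new cohort enters at age 0
        new_stock[1:] = surviving[:-1]                    # everyone else ages one year
        disposed += surviving[-1]                         # units beyond the oldest class retire
        return new_stock, disposed

    survival = np.array([0.98, 0.95, 0.90, 0.75, 0.50, 0.25, 0.10])  # assumed rates
    stock = np.zeros(7)
    for year in range(10):
        stock, disposed = step(stock, survival, purchases=1000.0)
        print(year, round(float(disposed), 1))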

  2. The Environmental and Molecular Sciences Laboratory project -- Continuous evolution in leadership

    International Nuclear Information System (INIS)

    Knutson, D.E.; McClusky, J.K.

    1994-10-01

    The Environmental and Molecular Sciences Laboratory (EMSL) construction project at Pacific Northwest Laboratory (PNL) in Richland, Washington, is a $230M Major Systems Acquisition for the US Department of Energy (DOE). The completed laboratory will be a national user facility that provides unparalleled capabilities for scientists involved in environmental molecular science research. This project, approved for construction by the Secretary of Energy in October 1993, is underway. The United States is embarking on an environmental cleanup effort that dwarfs previous scientific enterprise. Using current best available technology, the projected costs of cleaning up the tens of thousands of toxic waste sites, including DOE sites, is estimated to exceed one trillion dollars. The present state of scientific knowledge regarding the effects of exogenous chemicals on human biology is very limited. Long term environmental research at the molecular level is needed to resolve the concerns, and form the building blocks for a structure of cost effective process improvement and regulatory reform.

  3. The Environmental and Molecular Sciences Laboratory project -- Continuous evolution in leadership

    Energy Technology Data Exchange (ETDEWEB)

    Knutson, D.E.; McClusky, J.K.

    1994-10-01

    The Environmental and Molecular Sciences Laboratory (EMSL) construction project at Pacific Northwest Laboratory (PNL) in Richland, Washington, is a $230M Major Systems Acquisition for the US Department of Energy (DOE). The completed laboratory will be a national user facility that provides unparalleled capabilities for scientists involved in environmental molecular science research. This project, approved for construction by the Secretary of Energy in October 1993, is underway. The United States is embarking on an environmental cleanup effort that dwarfs previous scientific enterprise. Using current best available technology, the projected costs of cleaning up the tens of thousands of toxic waste sites, including DOE sites, is estimated to exceed one trillion dollars. The present state of scientific knowledge regarding the effects of exogenous chemicals on human biology is very limited. Long term environmental research at the molecular level is needed to resolve the concerns, and form the building blocks for a structure of cost effective process improvement and regulatory reform.

  4. WRF4G project: Adaptation of WRF Model to Distributed Computing Infrastructures

    Science.gov (United States)

    Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García Díez, Markel; Blanco Real, Jose C.; Fernández, Jesús

    2013-04-01

    Nowadays, Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCI) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the first objective of this project is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community carries out its research in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcast/forecast, sensitivity studies, etc.). The WRF model is also used as input by many groups in the energy and natural-hazards communities, which will therefore benefit as well. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for long periods of time. This makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the jobs and the data. Thus, the second objective of the project consists of the development of a generic adaptation of WRF for the Grid (WRF4G), to be distributed as open source and to be integrated in the official WRF development cycle. The use of this WRF adaptation should be transparent and useful to carry out any of the previously described studies, and should avoid the problems of the Grid infrastructure. Moreover, it should simplify access to Grid infrastructures for research teams, and also free them from the technical and computational aspects of the use of the Grid. Finally, in order to

  5. Design reality gap issues within an ICT4D project:an assessment of Jigawa State Community Computer Center

    OpenAIRE

    Kanya, Rislana Abdulazeez; Good, Alice

    2013-01-01

    This paper evaluates the Jigawa State Government Community Computer Centre project using the design-reality gap framework. The purpose is to analyse the shortfall between design expectations and implementation realities in order to establish the current situation of the project, and furthermore to analyse whether it will meet the key stakeholders' expectations. The majority of government ICT projects are classified as either failures or partial failures. Our research will underpin a case st...

  6. A directory of computer codes suitable for stress analysis of HLW containers - Compas project

    International Nuclear Information System (INIS)

    1989-01-01

    This document reports the work carried out for the Compas project, which looked at the capabilities of various computer codes for the stress analysis of high-level nuclear-waste containers and overpacks. The report concentrates on codes used by the project partners, but also includes a number of the major commercial finite element codes. The report falls into two parts. The first part describes the capabilities of the codes. This includes details of the solution methods used in the codes, the types of analysis which they can carry out and the interfacing with pre- and post-processing packages. This is the more comprehensive section of the report. The second part looks at the performance of a selection of the codes (those used by the project partners). It looks at how the codes perform on a number of test problems which require calculations typical of those encountered in the design and analysis of high-level waste containers and overpacks.

  7. [Restoration filtering based on projection power spectrum for single-photon emission computed tomography].

    Science.gov (United States)

    Kubo, N

    1995-04-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical "least squares filter" theory. It is necessary to know the object power spectrum and the noise power spectrum. The power spectrum is estimated from the power spectrum of a projection, when the high-frequency power spectrum of a projection is adequately approximated as a polynomial exponential expression. A study of the restoration with the filter based on a projection power spectrum was conducted, and compared with that of the "Butterworth" filtering method (cut-off frequency of 0.15 cycles/pixel), and "Wiener" filtering (signal-to-noise power spectrum ratio was a constant). Normalized mean-squared errors (NMSE) of the phantom, two line sources located in a 99mTc filled cylinder, were used. NMSE of the "Butterworth" filter, "Wiener" filter, and filtering based on a power spectrum were 0.77, 0.83, and 0.76 respectively. Clinically, brain SPECT images utilizing this new restoration filter improved the contrast. Thus, this filter may be useful in diagnosis of SPECT images.
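
    For orientation, a least-squares (Wiener-type) restoration filter has the generic form below, where P_o is the object power spectrum and P_n the noise power spectrum; in the method described above, P_o is estimated from the measured projection power spectrum. This generic form is our hedged sketch, not the paper's exact filter.

    W(f) = \frac{P_o(f)}{P_o(f) + P_n(f)}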

  8. Restoration filtering based on projection power spectrum for single-photon emission computed tomography

    International Nuclear Information System (INIS)

    Kubo, Naoki

    1995-01-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical 'least squares filter' theory. It is necessary to know the object power spectrum and the noise power spectrum. The power spectrum is estimated from the power spectrum of a projection, when the high-frequency power spectrum of a projection is adequately approximated as a polynomial exponential expression. A study of the restoration with the filter based on a projection power spectrum was conducted, and compared with that of the 'Butterworth' filtering method (cut-off frequency of 0.15 cycles/pixel), and 'Wiener' filtering (signal-to-noise power spectrum ratio was a constant). Normalized mean-squared errors (NMSE) of the phantom, two line sources located in a 99mTc-filled cylinder, were used. NMSE of the 'Butterworth' filter, 'Wiener' filter, and filtering based on a power spectrum were 0.77, 0.83, and 0.76 respectively. Clinically, brain SPECT images utilizing this new restoration filter improved the contrast. Thus, this filter may be useful in diagnosis of SPECT images. (author)

  9. Understanding Paleoclimate and Human Evolution Through the Hominin Sites and Paleolakes Drilling Project

    Directory of Open Access Journals (Sweden)

    Kaye Reed

    2009-09-01

    Understanding the evolution of humans and our close relatives is one of the enduring scientific issues of modern times. Since the time of Charles Darwin, scientists have speculated on how and when we evolved and what conditions drove this evolutionary story. The detective work required to address these questions is necessarily interdisciplinary, involving research in anthropology, archaeology, human genetics and genomics, and the earth sciences. In addition to the difficult tasks of finding, describing, and interpreting hominin fossils (the taxonomic tribe which includes Homo sapiens and our close fossil relatives from the last 6 Ma), much of modern geological research associated with paleoanthropology involves understanding the geochronologic and paleoenvironmental context of those fossils. When were they entombed in the sediments? What were the local and regional climatic conditions that early hominins experienced? How did local (watershed scale) and regional climate processes combine with regional tectonic boundary conditions to influence hominin food resources, foraging patterns, and demography? How and when did these conditions vary from humid to dry, or cool to warm? Can the history of those conditions (Vrba, 1988; Potts, 1996) be related to the evolution, diversification, stasis, or extinction of hominin species?

  10. Reconstruction of computed tomographic image from a few x-ray projections by means of accelerative gradient method

    International Nuclear Information System (INIS)

    Kobayashi, Fujio; Yamaguchi, Shoichiro

    1982-01-01

    A method for the reconstruction of computed tomographic images was proposed to reduce the X-ray exposure dose. The method reconstructs images from a small number of X-ray projections by means of an accelerative gradient method. The computational procedures are described. The algorithm is simple, the computation converges quickly, and the required memory capacity is small. Numerical simulation was carried out to confirm the validity of this method. A sample of simple shape was considered, projection data were given, and the images were reconstructed from 6 views. Good results were obtained, and the method is considered to be useful. (Kato, T.)
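
    As a hedged sketch of reconstruction from a small number of projections by a gradient method (plain gradient descent on the least-squares data term; the paper's accelerative variant would add an acceleration step, and the tiny system below is purely illustrative):

    import numpy as np

    def reconstruct(A, b, n_iter=200):
        # Minimise 0.5 * ||A x - b||^2 by gradient descent, with a step size
        # chosen from the spectral norm of A and a non-negativity projection.
        x = np.zeros(A.shape[1])
        step = 1.0 / np.linalg.norm(A, 2) ** 2
        for _ in range(n_iter):
            x -= step * (A.T @ (A @ x - b))   # gradient step
            x = np.clip(x, 0.0, None)         # attenuation coefficients are non-negative
        return x

    # Tiny illustrative system: 6 projection "views" of a 4-pixel object.
    rng = np.random.default_rng(3)
    A = rng.random((6, 4))
    x_true = np.array([0.0, 1.0, 0.5, 0.0])
    print(np.round(reconstruct(A, A @ x_true), 2))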

  11. Computer analysis of protein functional sites projection on exon structure of genes in Metazoa.

    Science.gov (United States)

    Medvedeva, Irina V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2015-01-01

    Studying the relationship between the structural and functional organization of proteins and their coding genes is necessary for an understanding of the evolution of molecular systems and can provide new knowledge for many applications aimed at designing proteins with improved medical and biological properties. It is well known that the functional properties of proteins are determined by their functional sites. Functional sites are usually represented by a small number of amino acid residues that are distantly located from each other in the amino acid sequence. They are highly conserved within their functional group and vary significantly in structure between such groups. Given these facts, the analysis of the general properties of the structural organization of functional sites at the protein level and at the level of the exon-intron structure of the coding gene remains a relevant problem. One approach to this analysis is the projection of the amino acid residue positions of the functional sites, along with the exon boundaries, onto the gene structure. In this paper, we examined the discontinuity of functional sites in the exon-intron structure of genes and the distribution of lengths and phases of the exons encoding functional sites in vertebrate genes. We show that the DNA fragments coding the functional sites were located in the same exon or in nearby exons. This observed tendency of the exons that code functional sites to cluster suggests that such clusters could be considered units of protein evolution. We studied the characteristics of the structure of exon boundaries that do and do not code functional sites in 11 Metazoa species. This is accompanied by a reduced frequency of intercodon gaps (phase 0) in exons encoding functional-site residues, which may be evidence of evolutionary limitations on exon shuffling. These results characterize the features of the coding exon-intron structure that affect the functionality of the encoded protein and

  12. Development of a Project-Based Learning Model for the Computer Aided Design Course (Pengembangan Model Pembelajaran Project Based Learning pada Mata Kuliah Computer Aided Design)

    Directory of Open Access Journals (Sweden)

    Satoto Endar Nayono

    2013-09-01

    One of the key competencies of graduates majoring in Civil Engineering and Planning Education, Faculty of Engineering, Yogyakarta State University (YSU), is the ability to plan buildings. CAD courses aim to train students to translate planning concepts into drawings. One of the obstacles faced in the course is that the concepts and drawings created by the students often do not correspond to the standards used in the field. This study aims to develop a project-based learning model so that the students' drawings correspond more closely to the actual conditions in the field. The study was carried out through the following stages: (1) pre-test, (2) planning of learning, (3) implementation of the project-based learning model, (4) monitoring and evaluation, (5) reflection and revision, (6) implementation of learning in the next cycle, and (7) evaluation of the learning outcomes. The study was conducted over four months in 2012 in the Department of Civil Engineering and Planning Education, Faculty of Engineering, YSU. The subjects of the study were the students who took the Computer Aided Design course. The data were analysed using qualitative description and descriptive statistics. The results of this study were: (1) the implementation of the project-based learning model was proven to improve the learning process and the learning outcomes of students in the CAD course through building-planning drawing tasks for school buildings based on real conditions in the field; the task was given at every meeting and improved on the basis of feedback from the lecturers; and (2) the project-based learning model is easier to implement if it is accompanied by peer tutoring and the PAIKEM learning model.

  13. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    International Nuclear Information System (INIS)

    Bryant, R M; Holloway, F W; Van Arsdall, P J.

    1999-01-01

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-prong approach comprised of a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control point hardware with a supervisory system by providing the services needed for distributed control such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  15. Evolution of environmental impact assessment as applied to watershed modification projects in Canada

    Science.gov (United States)

    Dirschl, Herman J.; Novakowski, Nicholas S.; Sadar, M. Husain

    1993-07-01

    This article reviews the application of environmental impact assessment (EIA) procedures and practices to three watershed modification projects situated in western Canada. These ventures were justified for accelerating regional economic development, and cover the period during which public concerns for protecting the environment rapidly made their way into the national political agenda. An historical account and analysis of the situation, therefore, seems desirable in order to understand the development of EIA processes, practices, and methodologies since the start of construction of the first project in 1961. This study concludes that there has been good progress in predicting and evaluating environmental and related social impacts of watershed modification proposals. However, a number of obstacles need to be overcome before EIA can firmly establish itself as an effective planning tool. These difficulties include jurisdictional confusions and conflicts, division of authority and responsibility in designing and implementing appropriate mitigative and monitoring measures, lack of tested EIA methodologies, and limited availability of qualified human resources. A number of conclusions and suggestions are offered so that future watershed modification proposals may be planned and implemented in a more environmentally sustainable fashion. These include: (1) EIA processes must be completed before irrevocable decisions are made. (2) Any major intrusion into a watershed is likely to impact on some major components of the ecosystem(s). (3) Mitigation costs must form part of the benefit-cost analysis of any project proposal. (4) Interjurisdictional cooperation is imperative where watersheds cross political boundaries. (5) The EIA process is a public process, hence public concerns must be dealt with fairly. (6) The role of science in the EIA process must be at arm's length from project proponents and regulators, and allowed to function in the interest of the protection of the

  16. Symbolic Computations and Exact and Explicit Solutions of Some Nonlinear Evolution Equations in Mathematical Physics

    International Nuclear Information System (INIS)

    Oezis, Turgut; Aslan, Ismail

    2009-01-01

    With the aid of the symbolic computation system Mathematica, several explicit solutions for Fisher's equation and the CKdV equation are constructed by utilizing an auxiliary equation method, the so-called (G'/G)-expansion method, with which new and more general forms of solutions are also constructed. When the parameters are taken as special values, the previously known solutions are recovered. (general)
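
    The sketch below is not the (G'/G)-expansion algorithm itself; it is only a SymPy check that one classical closed-form travelling-wave solution of Fisher's equation (the kind of solution such expansion methods recover) actually satisfies the PDE.

```python
# SymPy check that a classical travelling-wave solution of Fisher's equation
#     u_t = u_xx + u (1 - u)
# satisfies the PDE. This is a verification sketch, not the (G'/G)-expansion
# method described in the record above.
import sympy as sp

x, t = sp.symbols("x t", real=True)

# Ablowitz-Zeppetella-type travelling wave with speed c = 5/sqrt(6).
u = (1 + sp.exp(x / sp.sqrt(6) - sp.Rational(5, 6) * t)) ** -2

residual = sp.diff(u, t) - sp.diff(u, x, 2) - u * (1 - u)
print(sp.simplify(residual))  # prints 0, confirming the solution
```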

  17. Emergence, evolution, intelligence; hydroinformatics : a study of distributed and decentralised computing using intelligent agents

    NARCIS (Netherlands)

    Babovic, V.

    1996-01-01

    The computer-controlled operating environments of such facilities as automated factories, nuclear power plants, telecommunication centres and space stations are continually becoming more complex. The situation is similar, if not even more apparent and urgent, in the case of water. Water is not only

  18. Evolution in Student Perceptions of a Flipped Classroom in a Computer Programming Course

    Science.gov (United States)

    Davenport, Casey E.

    2018-01-01

    The "flipped classroom" pedagogical approach is used for a combined undergraduate and graduate computer programming course in meteorology. Details of how the course was flipped are discussed, as well as how student perceptions of the approach, which were gathered from qualitative feedback collected throughout the semester, evolved.…

  19. Computational model design specification for Phase 1 of the Hanford Environmental Dose Reconstruction Project

    International Nuclear Information System (INIS)

    Napier, B.A.

    1991-07-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since their inception in 1944. The purpose of this report is to outline the basic algorithm and the necessary computer calculations to be used to calculate radiation doses to specific and hypothetical individuals in the vicinity of Hanford. The system design requirements, those things that must be accomplished, are defined. The system design specifications, the techniques by which those requirements are met, are outlined. Included are the basic equations, logic diagrams, and a preliminary definition of the nature of each input distribution. 4 refs., 10 figs., 9 tabs

  20. Computational model design specification for Phase 1 of the Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.

    1991-07-01

    The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since their inception in 1944. The purpose of this report is to outline the basic algorithm and the necessary computer calculations to be used to calculate radiation doses to specific and hypothetical individuals in the vicinity of Hanford. The system design requirements, those things that must be accomplished, are defined. The system design specifications, the techniques by which those requirements are met, are outlined. Included are the basic equations, logic diagrams, and a preliminary definition of the nature of each input distribution. 4 refs., 10 figs., 9 tabs.
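
    The report itself specifies the HEDR algorithms; as a flavour of the kind of pathway calculation such a code performs, here is a minimal sketch of a generic inhalation-dose estimate. All parameter values are hypothetical placeholders and are not HEDR inputs.

```python
# A minimal sketch of a generic pathway calculation of the kind a
# dose-reconstruction code performs: inhalation dose = time-integrated air
# concentration (Bq*s/m^3) x breathing rate (m^3/s) x dose coefficient (Sv/Bq).
# All numbers below are hypothetical placeholders, not HEDR values.

def inhalation_dose_sv(integrated_air_conc_bq_s_per_m3: float,
                       breathing_rate_m3_per_s: float,
                       dose_coefficient_sv_per_bq: float) -> float:
    """Return the committed dose (Sv) for a single radionuclide and pathway."""
    intake_bq = integrated_air_conc_bq_s_per_m3 * breathing_rate_m3_per_s
    return intake_bq * dose_coefficient_sv_per_bq


if __name__ == "__main__":
    dose = inhalation_dose_sv(
        integrated_air_conc_bq_s_per_m3=5.0e4,   # placeholder exposure
        breathing_rate_m3_per_s=2.6e-4,          # roughly an adult light-activity rate
        dose_coefficient_sv_per_bq=7.4e-9,       # placeholder dose coefficient
    )
    print(f"Estimated inhalation dose: {dose:.2e} Sv")
```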

  1. ProjectQ: an open source software framework for quantum computing

    Directory of Open Access Journals (Sweden)

    Damian S. Steiger

    2018-01-01

    Full Text Available We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through simulation and enables running them on actual quantum hardware using a back-end connecting to the IBM Quantum Experience cloud service. Through extension mechanisms, users can provide back-ends to further quantum hardware, and scientists working on quantum compilation can provide plug-ins for additional compilation, optimization, gate synthesis, and layout strategies.
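
    A minimal sketch of the basic usage pattern described above, assuming ProjectQ is installed and using its default simulator back-end (connecting to the IBM Quantum Experience back-end would require additional configuration):

```python
# Minimal ProjectQ sketch: allocate one qubit on the default simulator
# back-end, apply a Hadamard gate with the "gate | qubit" DSL syntax,
# measure, and read the result.
from projectq import MainEngine
from projectq.ops import H, Measure

eng = MainEngine()            # uses the built-in high-performance simulator
qubit = eng.allocate_qubit()  # allocate a single qubit

H | qubit                     # put it into superposition
Measure | qubit               # collapse to 0 or 1

eng.flush()                   # send the circuit to the back-end
print("Measured:", int(qubit))
```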

  2. Computer-aided detection of breast carcinoma in standard mammographic projections with digital mammography

    International Nuclear Information System (INIS)

    Destounis, S.; Hanson, S.

    2007-01-01

    This study retrospectively evaluated the ability of a computer-aided detection system to detect breast carcinoma in multiple standard mammographic projections. Forty-five lesions in 44 patients who were imaged with digital mammography (Selenia®, Hologic, Bedford, MA; Senographe®, GE, Milwaukee, WI) and had computer-aided detection (CAD, Image-checker® V 8.3.15, Hologic/R2, Santa Clara, CA) applied at the time of examination were identified for review; all were subsequently recommended for biopsy, which revealed cancer. These lesions were determined by the study radiologist to be visible in both standard mammographic views (mediolateral oblique, MLO; craniocaudal, CC). For each patient, case data included patient age, tissue density, lesion type, BIRADS® assessment, lesion size, lesion visibility (visible on MLO and/or CC view), ability of CAD to correctly mark the cancerous lesion, number of CAD marks per image, needle core biopsy results and surgical pathologic correlation. For this study cohort, a CAD lesion/case sensitivity of 87% (n = 39) was found, and image sensitivity was 69% (n = 31) for the MLO view and 78% (n = 35) for the CC view. Cases presented with a median of four marks per case (range 0-13). Eighty-four percent (n = 38) of lesions proceeded to excision; initial needle biopsy pathology was upgraded at surgical excision from in situ to invasive disease for 24% (n = 9) of lesions. CAD demonstrated the potential to detect mammographically visible cancers in multiple standard mammographic projections in all categories of lesions in this study cohort. (orig.)
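
    The sensitivity figures quoted above follow directly from the reported counts (45 cancers, of which 39 were marked on at least one view, 31 on the MLO view and 35 on the CC view); a small sketch reproducing them:

```python
# Reproduce the sensitivities quoted in the abstract from the raw counts.
total_lesions = 45
detected = {"lesion/case": 39, "MLO view": 31, "CC view": 35}

for label, n_marked in detected.items():
    sensitivity = n_marked / total_lesions
    print(f"{label:12s}: {n_marked}/{total_lesions} = {sensitivity:.0%}")

# lesion/case : 39/45 = 87%
# MLO view    : 31/45 = 69%
# CC view     : 35/45 = 78%
```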

  3. Investigation on the wake evolution of contra-rotating propeller using RANS computation and SPIV measurement

    Directory of Open Access Journals (Sweden)

    Kwang-Jun Paik

    2015-05-01

    Full Text Available The wake characteristics of a Contra-Rotating Propeller (CRP) were investigated using numerical simulation and flow measurement. The numerical simulation was carried out with a commercial CFD code based on a Reynolds Averaged Navier-Stokes (RANS) equations solver, and the flow measurement was performed with a Stereoscopic Particle Image Velocimetry (SPIV) system. The simulation results were validated through comparison with the experimental results measured around the leading edge of the rudder, to investigate the effect of propeller operation under three conditions: without a propeller, with the forward propeller alone, and with both forward and aft propellers. The evolution of the CRP wake was analyzed through velocity and vorticity contours on three transverse planes and one longitudinal plane based on the CFD results. The trajectories of the propeller tip vortex core in the cases with and without the aft propeller were also compared, and larger wake contraction with the CRP was confirmed.

  4. Test computations on the dynamical evolution of star clusters. [Fluid dynamic method]

    Energy Technology Data Exchange (ETDEWEB)

    Angeletti, L; Giannone, P. (Rome Univ. (Italy))

    1977-01-01

    Test calculations have been carried out on the evolution of star clusters using the fluid-dynamical method devised by Larson (1970). Large systems of stars have been considered, with specific attention to globular clusters. With reference to Larson's analogous 'standard' model, the influence on the results of varying in turn the various free parameters (cluster mass, star mass, tidal radius, mass concentration of the initial model) has been studied. Furthermore, the partial release of some simplifying assumptions regarding the relaxation time and the distribution of the 'target' stars has been considered. The change in structural properties is discussed, and the variation of the evolutionary time scale is outlined. An indicative agreement of the results obtained here with the structural properties of globular clusters deduced from previous theoretical models is pointed out.

  5. Deformation mechanisms and grain size evolution in the Bohemian granulites - a computational study

    Science.gov (United States)

    Maierova, Petra; Lexa, Ondrej; Jeřábek, Petr; Franěk, Jan; Schulmann, Karel

    2015-04-01

    A dominant deformation mechanism in crustal rocks (e.g., dislocation and diffusion creep, grain boundary sliding, solution-precipitation) depends on many parameters such as temperature, major minerals, differential stress, strain rate and grain size. An exemplary sequence of deformation mechanisms was identified in the largest felsic granulite massifs in the southern Moldanubian domain (Bohemian Massif, central European Variscides). These massifs were interpreted to result from collision-related forced diapiric ascent of lower crust and its subsequent lateral spreading at mid-crustal levels. Three types of microstructures were distinguished. The oldest relict microstructure (S1) with large grains (>1000 μm) of feldspar deformed probably by dislocation creep at peak HT eclogite facies conditions. Subsequently at HP granulite-facies conditions, chemically- and deformation- induced recrystallization of feldspar porphyroclasts led to development of a fine-grained microstructure (S2, ~50 μm grain size) indicating deformation via diffusion creep, probably assisted by melt-enhanced grain-boundary sliding. This microstructure was associated with flow in the lower crust and/or its diapiric ascent. The latest microstructure (S3, ~100 μm grain size) is related to the final lateral spreading of retrograde granulites, and shows deformation by dislocation creep at amphibolite-facies conditions. The S2-S3 switch and coarsening was interpreted to be related with a significant decrease in strain rate. From this microstructural sequence it appears that it is the grain size that is critically linked with specific mechanical behavior of these rocks. Thus in this study, we focused on the interplay between grain size and deformation with the aim to numerically simulate and reinterpret the observed microstructural sequence. We tested several different mathematical descriptions of the grain size evolution, each of which gave qualitatively different results. We selected the two most

  6. Computer-Assisted Classification Patterns in Autoimmune Diagnostics: The AIDA Project

    Directory of Open Access Journals (Sweden)

    Amel Benammar Elgaaied

    2016-01-01

    Full Text Available Antinuclear antibodies (ANAs) are significant biomarkers in the diagnosis of autoimmune diseases in humans; the diagnosis is done by means of the Indirect ImmunoFluorescence (IIF) method and performed by analyzing patterns and fluorescence intensity. This paper introduces the AIDA Project (autoimmunity: diagnosis assisted by computer), developed in the framework of an Italy-Tunisia cross-border cooperation, and its preliminary results. A database of interpreted IIF images is being collected through the exchange of images and double reporting, and a Gold Standard database, containing around 1000 double-reported images, has been established. The Gold Standard database is used for optimization of a CAD (Computer Aided Detection) solution and for the assessment of its added value, in order for it to be applied along with an Immunologist as a second Reader in the detection of autoantibodies. This CAD system is able to identify on IIF images the fluorescence intensity and the fluorescence pattern. Preliminary results show that the CAD, used as a second Reader, appeared to perform better than Junior Immunologists and hence may significantly improve their efficacy; compared with two Junior Immunologists, the CAD system showed higher Intensity Accuracy (85.5% versus 66.0% and 66.0%), higher Pattern Accuracy (79.3% versus 48.0% and 66.2%), and higher Mean Class Accuracy (79.4% versus 56.7% and 64.2%).

  7. Computer-Assisted Classification Patterns in Autoimmune Diagnostics: The AIDA Project.

    Science.gov (United States)

    Benammar Elgaaied, Amel; Cascio, Donato; Bruno, Salvatore; Ciaccio, Maria Cristina; Cipolla, Marco; Fauci, Alessandro; Morgante, Rossella; Taormina, Vincenzo; Gorgi, Yousr; Marrakchi Triki, Raja; Ben Ahmed, Melika; Louzir, Hechmi; Yalaoui, Sadok; Imene, Sfar; Issaoui, Yassine; Abidi, Ahmed; Ammar, Myriam; Bedhiafi, Walid; Ben Fraj, Oussama; Bouhaha, Rym; Hamdi, Khouloud; Soumaya, Koudhi; Neili, Bilel; Asma, Gati; Lucchese, Mariano; Catanzaro, Maria; Barbara, Vincenza; Brusca, Ignazio; Fregapane, Maria; Amato, Gaetano; Friscia, Giuseppe; Neila, Trai; Turkia, Souayeh; Youssra, Haouami; Rekik, Raja; Bouokez, Hayet; Vasile Simone, Maria; Fauci, Francesco; Raso, Giuseppe

    2016-01-01

    Antinuclear antibodies (ANAs) are significant biomarkers in the diagnosis of autoimmune diseases in humans; the diagnosis is done by means of the Indirect ImmunoFluorescence (IIF) method and performed by analyzing patterns and fluorescence intensity. This paper introduces the AIDA Project (autoimmunity: diagnosis assisted by computer), developed in the framework of an Italy-Tunisia cross-border cooperation, and its preliminary results. A database of interpreted IIF images is being collected through the exchange of images and double reporting, and a Gold Standard database, containing around 1000 double-reported images, has been established. The Gold Standard database is used for optimization of a CAD (Computer Aided Detection) solution and for the assessment of its added value, in order for it to be applied along with an Immunologist as a second Reader in the detection of autoantibodies. This CAD system is able to identify on IIF images the fluorescence intensity and the fluorescence pattern. Preliminary results show that the CAD, used as a second Reader, appeared to perform better than Junior Immunologists and hence may significantly improve their efficacy; compared with two Junior Immunologists, the CAD system showed higher Intensity Accuracy (85.5% versus 66.0% and 66.0%), higher Pattern Accuracy (79.3% versus 48.0% and 66.2%), and higher Mean Class Accuracy (79.4% versus 56.7% and 64.2%).

  8. The impact of CFD on development test facilities - A National Research Council projection. [computational fluid dynamics]

    Science.gov (United States)

    Korkegi, R. H.

    1983-01-01

    The results of a National Research Council study on the effect that advances in computational fluid dynamics (CFD) will have on conventional aeronautical ground testing are reported. Current CFD capabilities include the depiction of linearized inviscid flows and a boundary layer, initial use of Euler coordinates using supercomputers to automatically generate a grid, research and development on the Reynolds-averaged Navier-Stokes (N-S) equations, and preliminary research on solutions to the full N-S equations. Improvements in the range of CFD usage are dependent on the development of more powerful supercomputers, exceeding even the projected abilities of the NASA Numerical Aerodynamic Simulator (1 BFLOP/sec). Full representation of the Reynolds-averaged N-S equations will require over one million grid points, a computing level predicted to be available in 15 yr. Present capabilities allow identification of data anomalies, confirmation of data accuracy, and assessment of the adequacy of model design in wind tunnel trials. Account can be taken of wall effects and the Reynolds number in any flight regime during simulation. CFD can actually be more accurate than instrumented tests, since all points in a flow can be modeled with CFD, while they cannot all be monitored with instrumentation in a wind tunnel.

  9. COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    International Nuclear Information System (INIS)

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D.L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W.B.; Decyk, V.; Huang, C.K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G.R.; Antonsen, T.; Katsouleas, T.; Morris, B.

    2007-01-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, the calculation of electromagnetic modes and wake fields of cavities, the cooling induced by comoving beams, and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction

  10. Computer-Assisted Classification Patterns in Autoimmune Diagnostics: The AIDA Project

    Science.gov (United States)

    Benammar Elgaaied, Amel; Cascio, Donato; Bruno, Salvatore; Ciaccio, Maria Cristina; Cipolla, Marco; Fauci, Alessandro; Morgante, Rossella; Taormina, Vincenzo; Gorgi, Yousr; Marrakchi Triki, Raja; Ben Ahmed, Melika; Louzir, Hechmi; Yalaoui, Sadok; Imene, Sfar; Issaoui, Yassine; Abidi, Ahmed; Ammar, Myriam; Bedhiafi, Walid; Ben Fraj, Oussama; Bouhaha, Rym; Hamdi, Khouloud; Soumaya, Koudhi; Neili, Bilel; Asma, Gati; Lucchese, Mariano; Catanzaro, Maria; Barbara, Vincenza; Brusca, Ignazio; Fregapane, Maria; Amato, Gaetano; Friscia, Giuseppe; Neila, Trai; Turkia, Souayeh; Youssra, Haouami; Rekik, Raja; Bouokez, Hayet; Vasile Simone, Maria; Fauci, Francesco; Raso, Giuseppe

    2016-01-01

    Antinuclear antibodies (ANAs) are significant biomarkers in the diagnosis of autoimmune diseases in humans; the diagnosis is done by means of the Indirect ImmunoFluorescence (IIF) method and performed by analyzing patterns and fluorescence intensity. This paper introduces the AIDA Project (autoimmunity: diagnosis assisted by computer), developed in the framework of an Italy-Tunisia cross-border cooperation, and its preliminary results. A database of interpreted IIF images is being collected through the exchange of images and double reporting, and a Gold Standard database, containing around 1000 double-reported images, has been established. The Gold Standard database is used for optimization of a CAD (Computer Aided Detection) solution and for the assessment of its added value, in order for it to be applied along with an Immunologist as a second Reader in the detection of autoantibodies. This CAD system is able to identify on IIF images the fluorescence intensity and the fluorescence pattern. Preliminary results show that the CAD, used as a second Reader, appeared to perform better than Junior Immunologists and hence may significantly improve their efficacy; compared with two Junior Immunologists, the CAD system showed higher Intensity Accuracy (85.5% versus 66.0% and 66.0%), higher Pattern Accuracy (79.3% versus 48.0% and 66.2%), and higher Mean Class Accuracy (79.4% versus 56.7% and 64.2%). PMID:27042658

  11. Holographic memory system based on projection recording of computer-generated 1D Fourier holograms.

    Science.gov (United States)

    Betin, A Yu; Bobrinev, V I; Donchenko, S S; Odinokov, S B; Evtikhiev, N N; Starikov, R S; Starikov, S N; Zlokazov, E Yu

    2014-10-01

    Computer generation of holographic structures significantly simplifies the optical scheme used to record microholograms in a holographic memory recording system. Digital holographic synthesis also makes it possible to account for the nonlinear errors of the recording system and thus improve microhologram quality. Multiplexed recording of holograms is a widespread technique for increasing data recording density. In this article we present a holographic memory system based on digital synthesis of amplitude one-dimensional (1D) Fourier transform holograms and on the multiplexed recording of these holograms onto the holographic carrier using an optical projection scheme. 1D Fourier transform holograms are very sensitive to the orientation of the anamorphic optical element (cylindrical lens) that is required for reconstruction of the encoded data object. Multiplexed recording of several holograms with different orientations in an optical projection scheme allowed the data object to be reconstructed from each hologram by rotating the cylindrical lens through the corresponding angle. We also discuss two optical schemes for reading out the recorded holograms: a full-page readout system and a line-by-line readout system. We consider the benefits of both systems and present the results of experimental modeling of non-multiplexed and multiplexed recording and reconstruction of 1D Fourier holograms.

  12. Projection decomposition algorithm for dual-energy computed tomography via deep neural network.

    Science.gov (United States)

    Xu, Yifu; Yan, Bin; Chen, Jian; Zeng, Lei; Li, Lei

    2018-03-15

    Dual-energy computed tomography (DECT) has been widely used to improve the identification of substances from different spectral information. Decomposition of mixed test samples into two materials relies on a well-calibrated material decomposition function. This work aims to establish and validate a data-driven algorithm for estimating the decomposition function. A deep neural network (DNN) consisting of two sub-nets is proposed to solve the projection decomposition problem. The compressing sub-net, essentially a stacked auto-encoder (SAE), learns a compact representation of the energy spectrum. The decomposing sub-net, with a two-layer structure, fits the nonlinear transform between energy projections and basis material thicknesses. The proposed DNN not only delivers images with lower standard deviation and higher quality on both simulated and real data, but also yields the best performance in cases mixed with photon noise. Moreover, the DNN takes only 0.4 s to generate a decomposition solution at a 360 × 512 scale, which is about 200 times faster than the competing algorithms. The DNN model is applicable to decomposition tasks with different dual energies. Experimental results demonstrated the strong function-fitting ability of the DNN. Thus, the deep learning paradigm provides a promising approach to solving the nonlinear problem in DECT.
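
    A schematic PyTorch sketch of the two-sub-net idea described above (a compressing, auto-encoder-style sub-net followed by a small two-layer decomposing sub-net). The layer sizes, activations and the random input are illustrative guesses, not the authors' architecture or data.

```python
# Schematic sketch of the two-sub-net idea: a compressing (auto-encoder style)
# sub-net that learns a compact spectral representation, followed by a small
# two-layer sub-net mapping it to two basis-material thicknesses.
# Layer sizes and activations are illustrative guesses.
import torch
import torch.nn as nn


class ProjectionDecomposer(nn.Module):
    def __init__(self, n_energy_bins: int = 64, latent_dim: int = 8):
        super().__init__()
        # Compressing sub-net (encoder half of a stacked auto-encoder).
        self.encoder = nn.Sequential(
            nn.Linear(n_energy_bins, 32), nn.ReLU(),
            nn.Linear(32, latent_dim), nn.ReLU(),
        )
        # Two-layer decomposing sub-net: latent spectrum -> two thicknesses.
        self.decomposer = nn.Sequential(
            nn.Linear(latent_dim, 16), nn.ReLU(),
            nn.Linear(16, 2),
        )

    def forward(self, dual_energy_projection: torch.Tensor) -> torch.Tensor:
        latent = self.encoder(dual_energy_projection)
        return self.decomposer(latent)


if __name__ == "__main__":
    model = ProjectionDecomposer()
    fake_projections = torch.rand(360 * 512, 64)   # one detector element per row
    thicknesses = model(fake_projections)          # shape: (360*512, 2)
    print(thicknesses.shape)
```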

  13. Computer-aided detection of breast carcinoma in standard mammographic projections with digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Destounis, Stamatia [Elizabeth Wende Breast Care, LLC, Rochester, NY (United States); University of Rochester, School of Medicine and Dentistry, Rochester, NY (United States); Hanson, Sarah; Morgan, Renee; Murphy, Philip; Somerville, Patricia; Seifert, Posy; Andolina, Valerie; Arieno, Andrea; Skolny, Melissa; Logan-Young, Wende [Elizabeth Wende Breast Care, LLC, Rochester, NY (United States)

    2009-06-15

    A retrospective evaluation of the ability of computer-aided detection (CAD) to identify breast carcinoma in standard mammographic projections. Forty-five biopsy-proven lesions in 44 patients imaged digitally with CAD applied at examination were reviewed. Forty-four screening BIRADS® category 1 digital mammography examinations were randomly identified to serve as a comparative normal/control population. Data included patient age; BIRADS® breast density; lesion type, size, and visibility; number, type, and location of CAD marks per image; CAD ability to mark lesions; and needle core and surgical pathologic correlation. A CAD lesion/case sensitivity of 87% (n=39) and image sensitivities of 69% (n=31) for the mediolateral oblique view and 78% (n=35) for the craniocaudal view were found. The average false-positive rate in the 44 normal screening cases was 2.0 (range 1-8), based on 88 reported false-positive CAD marks in those exams. Ninety-eight percent (n=44) of lesions proceeded to excision; initial pathology was upgraded at surgical excision from in situ to invasive disease in 24% (n=9) of lesions. CAD demonstrated potential to detect mammographically visible cancers in standard projections for all lesion types. (orig.)

  14. Projected evolution of circulation types and their temperatures over Central Europe in climate models

    Czech Academy of Sciences Publication Activity Database

    Plavcová, Eva; Kyselý, Jan

    2013-01-01

    Roč. 114, 3-4 (2013), s. 625-634 ISSN 0177-798X R&D Projects: GA ČR GAP209/10/2265 Grant - others: ENSEMBLES: EU-FP6(XE) 505539 Program: FP6 Institutional support: RVO:68378289 Keywords: Regional climate models * Atmospheric circulation * Climate change scenarios * Surface air temperature * ENSEMBLES * Central Europe Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 1.742, year: 2013 http://link.springer.com/article/10.1007%2Fs00704-013-0874-4#page-1

  15. Evolution and Revolution in the Design of Computers Based on Nanoelectronics

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    Today's computers are roughly a factor of one billion less efficient at doing their job than the laws of fundamental physics state that they could be. How much of this efficiency gain will we actually be able to harvest? What are the biggest obstacles to achieving many orders of magnitude improvement in our computing hardware, rather than the roughly factor-of-two improvement we are used to seeing with each new generation of chip? Shrinking components to the nanoscale offers both potential advantages and severe challenges. The transition from classical mechanics to quantum mechanics is a major issue. Others are the problems of defect and fault tolerance: defects are manufacturing mistakes or components that irreversibly break over time, and faults are transient interruptions that occur during operation. Both of these issues become bigger problems as component sizes shrink and the number of components scales up massively. In 1955, John von Neumann showed that a completely general approach to building a ...

  16. The Si elegans project at the interface of experimental and computational Caenorhabditis elegans neurobiology and behavior

    Science.gov (United States)

    Petrushin, Alexey; Ferrara, Lorenzo; Blau, Axel

    2016-12-01

    Objective. In light of recent progress in mapping neural function to behavior, we briefly and selectively review past and present endeavors to reveal and reconstruct nervous system function in Caenorhabditis elegans through simulation. Approach. Rather than presenting an all-encompassing review on the mathematical modeling of C. elegans, this contribution collects snapshots of pathfinding key works and emerging technologies that recent single- and multi-center simulation initiatives are building on. We thereby point out a few general limitations and problems that these undertakings are faced with and discuss how these may be addressed and overcome. Main results. Lessons learned from past and current computational approaches to deciphering and reconstructing information flow in the C. elegans nervous system corroborate the need of refining neural response models and linking them to intra- and extra-environmental interactions to better reflect and understand the actual biological, biochemical and biophysical events that lead to behavior. Together with single-center research efforts, the Si elegans and OpenWorm projects aim at providing the required, in some cases complementary tools for different hardware architectures to support advancement into this direction. Significance. Despite its seeming simplicity, the nervous system of the hermaphroditic nematode C. elegans with just 302 neurons gives rise to a rich behavioral repertoire. Besides controlling vital functions (feeding, defecation, reproduction), it encodes different stimuli-induced as well as autonomous locomotion modalities (crawling, swimming and jumping). For this dichotomy between system simplicity and behavioral complexity, C. elegans has challenged neurobiologists and computational scientists alike. Understanding the underlying mechanisms that lead to a context-modulated functionality of individual neurons would not only advance our knowledge on nervous system function and its failure in pathological

  17. SciDAC-Data, A Project to Enable Data-Driven Modeling of Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, M.; Ding, P.; Aliaga, L.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.

    2016-10-10

    The SciDAC-Data project is a DOE funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab Data Center on the organization, movement, and consumption of High Energy Physics data. The project will analyze the analysis patterns and data organization that have been used by the NOvA, MicroBooNE, MINERvA and other experiments, to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership class exascale computing facilities. We will address the use of the SciDAC-Data distributions acquired from Fermilab Data Center’s analysis workflows and corresponding to around 71,000 HEP jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular we describe in detail how the Sequential Access via Metadata (SAM) data handling system in combination with the dCache/Enstore based data archive facilities have been analyzed to develop the radically different models of the analysis of HEP data. We present how the simulation may be used to analyze the impact of design choices in archive facilities.
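
    As a toy illustration of the kind of caching and queuing question such simulations address, the sketch below runs jobs that request files from a skewed popularity distribution through a fixed-size LRU cache and measures the hit rate. The popularity law, catalogue size and cache size are invented for illustration and are not the Fermilab workload distributions.

```python
# Toy cache simulation: jobs request files from a skewed popularity
# distribution through a fixed-size LRU cache; measure the hit rate.
# All sizes and the popularity law are illustrative only.
import random
from collections import OrderedDict

random.seed(1)

N_FILES = 10_000        # catalogue size (hypothetical)
CACHE_SLOTS = 1_000     # cache capacity in files (hypothetical)
N_REQUESTS = 200_000    # number of file requests issued by jobs

cache = OrderedDict()   # OrderedDict as a simple LRU structure
hits = 0

for _ in range(N_REQUESTS):
    # Heavy-tailed popularity: low-numbered files are requested far more often.
    file_id = min(int(random.paretovariate(1.2)), N_FILES)
    if file_id in cache:
        hits += 1
        cache.move_to_end(file_id)          # mark as recently used
    else:
        cache[file_id] = True
        if len(cache) > CACHE_SLOTS:
            cache.popitem(last=False)       # evict least recently used

print(f"LRU hit rate: {hits / N_REQUESTS:.1%}")
```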

  18. Evolution of the ATLAS PanDA workload management system for exascale computational science

    International Nuclear Information System (INIS)

    Maeno, T; Klimentov, A; Panitkin, S; Schovancova, J; Wenaus, T; Yu, D; De, K; Nilsson, P; Oleynik, D; Petrosyan, A; Vaniachine, A

    2014-01-01

    An important foundation underlying the impressive success of data processing and analysis in the ATLAS experiment [1] at the LHC [2] is the Production and Distributed Analysis (PanDA) workload management system [3]. PanDA was designed specifically for ATLAS and proved to be highly successful in meeting all the distributed computing needs of the experiment. However, the core design of PanDA is not experiment specific. The PanDA workload management system is capable of meeting the needs of other data intensive scientific applications. Alpha-Magnetic Spectrometer [4], an astro-particle experiment on the International Space Station, and the Compact Muon Solenoid [5], an LHC experiment, have successfully evaluated PanDA and are pursuing its adoption. In this paper, a description of the new program of work to develop a generic version of PanDA will be given, as well as the progress in extending PanDA's capabilities to support supercomputers and clouds and to leverage intelligent networking. PanDA has demonstrated at a very large scale the value of automated dynamic brokering of diverse workloads across distributed computing resources. The next generation of PanDA will allow other data-intensive sciences and a wider exascale community employing a variety of computing platforms to benefit from ATLAS' experience and proven tools.

  19. Evolution of the Brain Computing Interface (BCI) and Proposed Electroencephalography (EEG) Signals Based Authentication Model

    Directory of Open Access Journals (Sweden)

    Ramzan Qaseem

    2018-01-01

    Full Text Available With current advancements in the field of Brain Computer Interface (BCI), it is necessary to study how it will affect the other technologies currently in use. In this paper, the authors motivate the need for BCI in the era of the Internet of Things (IoT) and analyze how BCI in the presence of IoT could lead to serious privacy breaches if not protected by new, more secure protocols. Security breaches and hacking have been around for a long time, but we are now more sensitive about data because our lives depend on it. When everything is interconnected through IoT, and considering that we control all interconnected things by means of our brain using BCI, the meaning of a security breach becomes much more serious than in the past. This paper describes the old security methods used for authentication and how they can be compromised. Considering the sensitivity of data in the era of IoT, a new form of authentication is required, one which should incorporate BCI rather than the usual authentication techniques.

  20. Land destruction and redevelopment - the use of computer based landscape evolution models for post-mining landscape reconstruction

    Science.gov (United States)

    Hancock, Greg; Willgoose, Garry

    2017-04-01

    Mining provides essential resources for the global economy as well as considerable employment and economic benefits for the community. Mining is necessary for the modern economy. However, in recent decades the scale and environmental impact of mining have grown in line with the global demand for resources. This requires ever-increasing areas of land to be disturbed. In particular, open-cast mining removes topsoil, disrupts aquifers and removes uneconomic material to depths of many hundreds of metres. Post-mining, this highly disturbed landscape system requires rehabilitation. The first and most important component of this process is to construct an erosionally stable landform which can then integrate ecologically with the surrounding undisturbed landscape. The scale and importance of this process cannot be overstated, as without planned rehabilitation it is likely that a degraded and highly erosional landscape system will result. Here we discuss computer-based landform evolution models which provide essential information on the likely erosional stability of the reconstructed landscape. These models use a digital elevation model to represent the landscape and dynamically adjust the surface in response to erosion and deposition. They provide information on soil erosion rates at the storm-event time scale through to annual time scales. The models can also be run to assess landscape evolution at millennial time scales. They also provide information on the type of erosion (i.e. rilling, gullying) and likely gully depths (and whether they will occur). Importantly, the latest models have vegetation, armouring and pedogenesis submodels incorporated into their formulation. This allows both the surface and subsurface landscape evolution to be assessed. These models have been widely used and have huge benefits for the assessment of reconstructed landscapes as well as other disturbed landscape systems. Here we outline the state of the art.
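
    A minimal sketch of the core idea behind such grid-based models: the landform is a digital elevation model (DEM) whose elevations are updated through time, here by simple linear hillslope diffusion only. Real landform evolution models add stream-power incision, vegetation, armouring and pedogenesis sub-models; the grid, diffusivity and time step below are illustrative values, not calibrated to any site.

```python
# Minimal landscape-evolution sketch: evolve a DEM by linear hillslope
# diffusion dz/dt = D * laplacian(z). Values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

nx, ny = 100, 100          # grid size (cells)
dx = 10.0                  # cell size (m)
D = 0.01                   # diffusivity (m^2/yr), illustrative
dt = 100.0                 # time step (yr); stable since D*dt/dx^2 << 0.25
n_steps = 500

# Start from a rough, spoil-pile-like mound with random surface roughness.
x = np.linspace(-1, 1, nx)
y = np.linspace(-1, 1, ny)
X, Y = np.meshgrid(x, y)
z = 50.0 * np.exp(-(X**2 + Y**2) / 0.2) + rng.normal(0, 0.5, (ny, nx))

for _ in range(n_steps):
    lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
           np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z) / dx**2
    z += D * dt * lap      # diffusive smoothing of the reconstructed landform

print(f"Relief after {n_steps * dt:.0f} yr: {z.max() - z.min():.1f} m")
```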

  1. Forest Management and the Evolution of Project Design in Dynamic Wildland Urban Interface Fire Environments

    Science.gov (United States)

    Conway, S.

    2014-12-01

    The Truckee Ranger District on the Tahoe National Forest, in the heart of the Sierra Nevada Mountains, has a rich history of human activities. Native American influences, Comstock-era logging, fire suppression, development, and recreation have all shaped the natural environment into what it is today. As across much of the national forests in California, the forest conditions that have developed are generally much more homogeneous and less resistant to disturbance from fire, insects, and disease than they might have been without the myriad of human influences. However, in order to improve the resiliency of our forests to stand-replacing disturbances such as high-severity fire, while managing for integrated human values, it is imperative that management evolve to meet those dynamic needs. Recent advances in remote sensing and GIS give land managers more access to forest information and can inform site-specific prescriptions to change undesirable conditions. It is ecologically and politically complex, yet our forests deserve that microscope. This particular presentation will focus on how the Truckee Ranger District began this process of incorporating several values, generated from stakeholder collaboration, into one project's goals, and how those lessons learned informed their most recent project.

  2. Evolution of Project Management in Integration Development Trends of Today’s Russia

    Directory of Open Access Journals (Sweden)

    Lukmanova Inessa

    2017-01-01

    Full Text Available Today’s development trends are affected by many external and internal factors: intensifying globalization, faster innovation cycles, and the negative effects of various bans and of import and export sanctions imposed on certain categories of goods, technologies, capital, etc. In response, the economic systems of the countries taking part in the integration processes seek sustainable development. A radical yet constructive strategy that deepens integration can tackle the problem. Its focus is determined by the need to upgrade the economies of the former Soviet Union, while a mutual interest in realizing the synergistic potential of integration channels common efforts into transcontinental megaprojects. A new type of integration project calls for comprehensive methods of project management. These feature the need for information transparency and for the controlled alignment of the innovation, investment, construction and resource capacities of member countries. With streamlined resource flows, the Eurasian transit will help facilitate and cut the costs of commodity exchange among countries, provided they make synchronized efforts in not just technical and technological but also organizational, economic, legal and IT innovation.

  3. Summary of computational support and general documentation for computer code (GENTREE) used in Office of Nuclear Waste Isolation Pilot Salt Site Selection Project

    International Nuclear Information System (INIS)

    Beatty, J.A.; Younker, J.L.; Rousseau, W.F.; Elayat, H.A.

    1983-01-01

    A Decision Tree Computer Model was adapted for the purposes of a Pilot Salt Site Selection Project conducted by the Office of Nuclear Waste Isolation (ONWI). A deterministic computer model was developed to structure the site selection problem, with submodels reflecting the five major outcome categories (Cost, Safety, Delay, Environment, Community Impact) to be evaluated in the decision process. Time-saving modifications were made in the tree code as part of the effort. In addition, format changes allowed retention of information items that are valuable in directing future research and in isolating key variabilities in the Site Selection Decision Model. The deterministic code was linked to the modified tree code and the entire program was transferred to the ONWI-VAX computer for future use by the ONWI project.
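
    The following is not the GENTREE code; it is only a tiny sketch of the general structure of a multi-attribute site ranking over the five outcome categories named above. The candidate sites, weights and scores are made up for illustration.

```python
# Tiny sketch of a weighted multi-attribute site ranking over the five outcome
# categories named in the record. Weights, sites and scores are hypothetical.
CATEGORIES = ["cost", "safety", "delay", "environment", "community_impact"]
WEIGHTS = {"cost": 0.25, "safety": 0.30, "delay": 0.10,
           "environment": 0.20, "community_impact": 0.15}

# Hypothetical candidate sites, scored 0 (worst) to 1 (best) per category.
sites = {
    "Site A": {"cost": 0.7, "safety": 0.8, "delay": 0.5,
               "environment": 0.6, "community_impact": 0.4},
    "Site B": {"cost": 0.5, "safety": 0.9, "delay": 0.7,
               "environment": 0.8, "community_impact": 0.6},
}


def weighted_score(scores: dict) -> float:
    return sum(WEIGHTS[c] * scores[c] for c in CATEGORIES)


for name, scores in sorted(sites.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.3f}")
```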

  4. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing. The PRIMA Project

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D. [Univ. of Oregon, Eugene, OR (United States). Dept. of Computer and Information Science; Wolf, Felix G. [Wilhelm-Johnen-Strasse, Julich (Germany). Forschungszentrum Julich GmbH

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems present substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to

  5. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing: the PRIMA Project

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D. [Department of Computer and Information Science, University of Oregon; Wolf, Felix G. [Juelich Supercomputing Centre, Forschungszentrum Juelich

    2014-01-31

    The growing number of cores provided by today’s high-end computing systems present substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to accomplish

  6. Mathematical and computational analyses of cracking formation fracture morphology and its evolution in engineering materials and structures

    CERN Document Server

    Sumi, Yoichi

    2014-01-01

    This book is about the pattern formation and the evolution of crack propagation in engineering materials and structures, bridging mathematical analyses of cracks based on singular integral equations, to computational simulation of engineering design. The first two parts of this book focus on elasticity and fracture and provide the basis for discussions on fracture morphology and its numerical simulation, which may lead to a simulation-based fracture control in engineering structures. Several design concepts are discussed for the prevention of fatigue and fracture in engineering structures, including safe-life design, fail-safe design, damage tolerant design. After starting with basic elasticity and fracture theories in parts one and two, this book focuses on the fracture morphology that develops due to the propagation of brittle cracks or fatigue cracks.   In part three, the mathematical analysis of a curved crack is precisely described, based on the perturbation method. The stability theory of interactive ...

  7. Evolution of flowering strategies in Oenothera glazioviana: an integral projection model approach.

    Science.gov (United States)

    Rees, Mark; Rose, Karen E

    2002-07-22

    The timing of reproduction is a key determinant of fitness. Here, we develop parameterized integral projection models of size-related flowering for the monocarpic perennial Oenothera glazioviana and use these to predict the evolutionarily stable strategy (ESS) for flowering. For the most part there is excellent agreement between the model predictions and the results of quantitative field studies. However, the model predicts a much steeper relationship between plant size and the probability of flowering than observed in the field, indicating selection for a 'threshold size' flowering function. Elasticity and sensitivity analysis of population growth rate lambda and net reproductive rate R(0) are used to identify the critical traits that determine fitness and control the ESS for flowering. Using the fitted model we calculate the fitness landscape for invading genotypes and show that this is characterized by a ridge of approximately equal fitness. The implications of these results for the maintenance of genetic variation are discussed.
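
    A minimal numerically discretized integral projection model (IPM) for a monocarpic perennial is sketched below: survival, growth and a size-dependent (logistic) flowering probability combine into a kernel whose dominant eigenvalue approximates the population growth rate lambda. All vital-rate parameters are invented placeholders, not the fitted Oenothera glazioviana values.

```python
# Minimal discretized IPM for a monocarpic perennial: build the kernel
# K(y, x) on a size mesh (midpoint rule) and take its dominant eigenvalue
# as lambda. All vital-rate parameters are invented placeholders.
import numpy as np

n, L, U = 100, -2.0, 6.0
h = (U - L) / n
x = L + h * (np.arange(n) + 0.5)   # mesh midpoints (log size, arbitrary units)


def survival(x):            # size-dependent survival probability
    return 1.0 / (1.0 + np.exp(-(0.5 + 0.8 * x)))


def p_flower(x):            # probability of flowering (logistic in size)
    return 1.0 / (1.0 + np.exp(-(-6.0 + 2.0 * x)))


def growth(y, x):           # density of next-year size y given current size x
    mu, sd = 0.8 + 0.7 * x, 0.8
    return np.exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))


def fecundity(y, x):        # recruits produced by a flowering plant of size x
    seeds = 20.0 * np.exp(0.8 * x)
    recruit_mu, recruit_sd, p_estab = -0.5, 0.6, 0.02
    recruit_pdf = (np.exp(-0.5 * ((y - recruit_mu) / recruit_sd) ** 2)
                   / (recruit_sd * np.sqrt(2 * np.pi)))
    return p_estab * seeds * recruit_pdf


Y, X = np.meshgrid(x, x, indexing="ij")     # rows: next size y, cols: current size x
K = h * ((1 - p_flower(X)) * survival(X) * growth(Y, X) + p_flower(X) * fecundity(Y, X))

lam = np.max(np.real(np.linalg.eigvals(K)))
print(f"Population growth rate lambda ~ {lam:.3f}")
```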

  8. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  9. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, and conversion to RAW format; the samples were then run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  10. Evolution of the ATLAS PanDA Workload Management System for Exascale Computational Science

    OpenAIRE

    Maeno, T; De, K; Klimentov, A; Nilsson, P; Oleynik, D; Panitkin, S; Petrosyan, A; Schovancova, J; Vaniachine, A; Wenaus, T; Yu, D

    2013-01-01

    An important foundation underlying the impressive success of data processing and analysis in the ATLAS experiment [1] at the LHC [2] is the Production and Distributed Analysis (PanDA) workload management system [3]. PanDA was designed specifically for ATLAS and proved to be highly successful in meeting all the distributed computing needs of the experiment. However, the core design of PanDA is not experiment specific. The PanDA workload management system is capable of meeting the needs of othe...

  11. Computer studies of the evolution of planetary and satellite systems. II

    International Nuclear Information System (INIS)

    Barricelli, N.A.; Aashamar, K.

    1980-01-01

    This paper describes two computer experiments carried out with a CDC-Cyber 74 program for computer simulation of a large number of objects in orbit about a central body or primary. The first experiment was started with 125 planets of which the two largest ones had coplanar orbits and masses comparable to those of Jupiter and Saturn, respectively. Their semi-major axes and eccentricities were, however, much larger. The smaller planets had a distribution promoting the formation of an axial meeting area. The experiment gives information relevant to the question of focusing of planetary orbits into a common plane and to the question of the formation and stability of an axial meeting area. Together with the next experiment, it also gives information about the development of commensurabilities (or resonances) with the largest planets. The second experiment started with 55 planets none of them with a mass greater than about 20% of Jupiter's but several of them with orbits close to a common plane. The aim of the experiment was to investigate whether successive captures followed by planetary fusion could lead to the formation of major planets comparable to Jupiter and Saturn, and in similar orbits. Also this experiment gives information relevant to the commensurability problem. (Auth.)
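
    A minimal sketch of the basic ingredient of such experiments is given below: a handful of planets orbiting a central primary, integrated with a leapfrog scheme and including their mutual gravitational attractions. Captures, fusion of colliding planets and the original program's bookkeeping are omitted; the units are AU, years and solar masses (so G = 4*pi^2), and the initial orbits and masses are invented for illustration.

```python
# Minimal leapfrog integration of a few planets around a central primary,
# including mutual attractions. Initial conditions are illustrative only;
# captures and planetary fusion (as in the original experiments) are omitted.
import numpy as np

G = 4.0 * np.pi ** 2        # gravitational constant in AU^3 / (Msun * yr^2)
M_CENTRAL = 1.0             # central body mass (solar masses)

a0 = np.array([1.0, 5.2, 9.5])                 # semi-major axes (AU)
masses = np.array([3.0e-6, 9.5e-4, 2.9e-4])    # planet masses (solar masses)

# Start on circular, coplanar orbits along the x-axis.
pos = np.zeros((3, 2))
vel = np.zeros((3, 2))
pos[:, 0] = a0
vel[:, 1] = np.sqrt(G * M_CENTRAL / a0)


def accelerations(pos, masses):
    r = np.linalg.norm(pos, axis=1, keepdims=True)
    acc = -G * M_CENTRAL * pos / r ** 3                 # pull of the primary
    diff = pos[None, :, :] - pos[:, None, :]            # diff[i, j] = r_j - r_i
    dist = np.linalg.norm(diff, axis=2)
    np.fill_diagonal(dist, np.inf)                      # no self-attraction
    acc += G * np.sum(masses[None, :, None] * diff / dist[:, :, None] ** 3, axis=1)
    return acc


dt, n_steps = 0.01, 20_000                              # 200 yr of evolution
acc = accelerations(pos, masses)
for _ in range(n_steps):                                # kick-drift-kick leapfrog
    vel += 0.5 * dt * acc
    pos += dt * vel
    acc = accelerations(pos, masses)
    vel += 0.5 * dt * acc

print("Final heliocentric distances (AU):", np.round(np.linalg.norm(pos, axis=1), 2))
```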

  12. Computer-automated evolution of an X-band antenna for NASA's Space Technology 5 mission.

    Science.gov (United States)

    Hornby, Gregory S; Lohn, Jason D; Linden, Derek S

    2011-01-01

    Whereas the current practice of designing antennas by hand is severely limited because it is both time and labor intensive and requires a significant amount of domain knowledge, evolutionary algorithms can be used to search the design space and automatically find novel antenna designs that are more effective than would otherwise be developed. Here we present our work in using evolutionary algorithms to automatically design an X-band antenna for NASA's Space Technology 5 (ST5) spacecraft. Two evolutionary algorithms were used: the first uses a vector of real-valued parameters and the second uses a tree-structured generative representation for constructing the antenna. The highest-performance antennas from both algorithms were fabricated and tested and both outperformed a hand-designed antenna produced by the antenna contractor for the mission. Subsequent changes to the spacecraft orbit resulted in a change in requirements for the spacecraft antenna. By adjusting our fitness function we were able to rapidly evolve a new set of antennas for this mission in less than a month. One of these new antenna designs was built, tested, and approved for deployment on the three ST5 spacecraft, which were successfully launched into space on March 22, 2006. This evolved antenna design is the first computer-evolved antenna to be deployed for any application and is the first computer-evolved hardware in space.
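
    A minimal sketch of the first kind of algorithm mentioned above, a real-valued parameter vector evolved by mutation and selection. The fitness function here is a simple stand-in; in the antenna work it would be an electromagnetic simulation scored against the mission's gain and pattern requirements, and all parameter counts and settings below are illustrative.

```python
# Minimal real-valued evolutionary algorithm: mutate a vector of parameters
# and keep the fittest individuals. The fitness function is a stand-in, not
# an antenna simulation; all settings are illustrative.
import random

random.seed(42)

N_PARAMS = 10          # e.g. wire lengths / bend angles of a candidate antenna
POP_SIZE = 40
N_GENERATIONS = 200
MUTATION_SIGMA = 0.1


def fitness(params):
    # Stand-in objective: maximize closeness to an arbitrary target vector.
    target = [0.3 * i for i in range(N_PARAMS)]
    return -sum((p - t) ** 2 for p, t in zip(params, target))


def mutate(params):
    return [p + random.gauss(0.0, MUTATION_SIGMA) for p in params]


# Initial random population.
population = [[random.uniform(-1.0, 3.0) for _ in range(N_PARAMS)]
              for _ in range(POP_SIZE)]

for gen in range(N_GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 4]          # truncation selection
    offspring = [mutate(random.choice(parents)) for _ in range(POP_SIZE - len(parents))]
    population = parents + offspring               # elitist (mu + lambda) style

best = max(population, key=fitness)
print("Best fitness:", round(fitness(best), 4))
```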

  13. LHCb computing in Run II and its evolution towards Run III

    CERN Document Server

    Falabella, Antonio

    2016-01-01

    This contribution reports on the experience of the LHCb computing team during LHC Run 2 and its preparation for Run 3. A brief introduction is also given to LHCbDIRAC, i.e. the tool used to interface with the experiment's distributed computing resources for its data processing and data management operations. Run 2, which started in 2015, has already seen several changes in the data processing workflows of the experiment, most notably the ability to align and calibrate the detector between two different stages of the data processing in the high level trigger farm, eliminating the need for a second-pass processing of the data offline. In addition, a fraction of the data is immediately reconstructed to its final physics format in the high level trigger, and only this format is exported from the experiment site for physics analysis. This concept has been successfully tested and will continue to be used for the rest of Run 2. Furthermore the distributed data processing has been improved with new concepts and techn...

  14. Cloud/Fog Computing System Architecture and Key Technologies for South-North Water Transfer Project Safety

    Directory of Open Access Journals (Sweden)

    Yaoling Fan

    2018-01-01

    Full Text Available In view of the real-time and distributed features of Internet of Things (IoT) safety systems in water conservancy engineering, this study proposed a new safety system architecture for water conservancy engineering based on cloud/fog computing and put forward a method of data reliability detection for the false alarms caused by false abnormal data from the bottom sensors. Designed for the South-North Water Transfer Project (SNWTP), the architecture integrated project safety, water quality safety, and human safety. Using IoT devices, a fog computing layer was constructed between the cloud server and the safety detection devices in water conservancy projects. Technologies such as real-time sensing, intelligent processing, and information interconnection were developed. Therefore, accurate forecasting, accurate positioning, and efficient management were implemented as required for safety prevention in the SNWTP; the safety protection of water conservancy projects was effectively improved; and intelligent water conservancy engineering was advanced.

  15. Assessment of methods for computing the closest point projection, penetration, and gap functions in contact searching problems

    Czech Academy of Sciences Publication Activity Database

    Kopačka, Ján; Gabriel, Dušan; Plešek, Jiří; Ulbin, M.

    2016-01-01

    Vol. 105, No. 11 (2016), pp. 803-833 ISSN 0029-5981 R&D Projects: GA ČR(CZ) GAP101/12/2315; GA MŠk(CZ) ME10114 Institutional support: RVO:61388998 Keywords: closest point projection * local contact search * quadratic elements * Newton's methods * geometric iteration methods * simplex method Subject RIV: JC - Computer Hardware; Software Impact factor: 2.162, year: 2016 http://onlinelibrary.wiley.com/doi/10.1002/nme.4994/abstract

  16. The environmental and molecular sciences laboratory project: Continuous evolution in leadership

    International Nuclear Information System (INIS)

    Knutson, D.E.; McClusky, J.K.

    1995-09-01

    The United States is embarking on an environmental cleanup effort that dwarfs previous scientific enterprise. Using current best available technology, the projected cost of cleaning up the tens of thousands of toxic waste sites, including DOE sites, is estimated to exceed one trillion dollars. That level of expenditure contains no guarantee that the sites can be restored to their original condition, and no consensus on ''how clean is clean enough.'' ''Ultimately, the scientific challenge is to determine as accurately as possible each term in the path that links the source of the contaminant with the particular biological end points or health effects and to understand the mechanisms that connect them. However, the present state of scientific knowledge regarding the effects of exogenous chemicals on human biology is very limited. Understanding the connections at the molecular level is, at best, a blurred picture and often a black box.'' Long-term environmental research at the molecular level is needed to resolve the concerns, and form the building blocks for a structure of cost-effective process improvement and regulatory reform.

  17. The environmental and molecular sciences laboratory project: Continuous evolution in leadership

    Energy Technology Data Exchange (ETDEWEB)

    Knutson, D.E.; McClusky, J.K.

    1995-09-01

    The United States is embarking on an environmental cleanup effort that dwarfs previous scientific enterprise. Using current best available technology, the projected cost of cleaning up the tens of thousands of toxic waste sites, including DOE sites, is estimated to exceed one trillion dollars. That level of expenditure contains no guarantee that the sites can be restored to their original condition, and no consensus on ``how clean is clean enough.`` ``Ultimately, the scientific challenge is to determine as accurately as possible each term in the path that links the source of the contaminant with the particular biological end points or health effects and to understand the mechanisms that connect them. However, the present state of scientific knowledge regarding the effects of exogenous chemicals on human biology is very limited. Understanding the connections at the molecular level is, at best, a blurred picture and often a black box.`` Long-term environmental research at the molecular level is needed to resolve the concerns, and form the building blocks for a structure of cost-effective process improvement and regulatory reform.

  18. Comparison of Computed and Measured Vortex Evolution for a UH-60A Rotor in Forward Flight

    Science.gov (United States)

    Ahmad, Jasim Uddin; Yamauchi, Gloria K.; Kao, David L.

    2013-01-01

    A Computational Fluid Dynamics (CFD) simulation using the Navier-Stokes equations was performed to determine the evolutionary and dynamical characteristics of the vortex flowfield for a highly flexible aeroelastic UH-60A rotor in forward flight. The experimental wake data were acquired using Particle Image Velocimetry (PIV) during a test of the full-scale UH-60A rotor in the National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel. The PIV measurements were made in a stationary cross-flow plane at 90 deg rotor azimuth. The CFD simulation was performed using the OVERFLOW CFD solver loosely coupled with the rotorcraft comprehensive code CAMRAD II. Characteristics of vortices captured in the PIV plane from different blades are compared with CFD calculations. The blade airloads were calculated using two different turbulence models. A limited spatial, temporal, and CFD/comprehensive-code coupling sensitivity analysis was performed in order to verify the unsteady helicopter simulations with a moving rotor grid system.

  19. FastTree: Computing Large Minimum-Evolution Trees with Profiles instead of a Distance Matrix

    Energy Technology Data Exchange (ETDEWEB)

    N. Price, Morgan; S. Dehal, Paramvir; P. Arkin, Adam

    2009-07-31

    Gene families are growing rapidly, but standard methods for inferring phylogenies do not scale to alignments with over 10,000 sequences. We present FastTree, a method for constructing large phylogenies and for estimating their reliability. Instead of storing a distance matrix, FastTree stores sequence profiles of internal nodes in the tree. FastTree uses these profiles to implement neighbor-joining and uses heuristics to quickly identify candidate joins. FastTree then uses nearest-neighbor interchanges to reduce the length of the tree. For an alignment with N sequences, L sites, and an alphabet of a characters, a distance matrix requires O(N^2) space and O(N^2 L) time, but FastTree requires just O(NLa + N sqrt(N)) memory and O(N sqrt(N) log(N) L a) time. To estimate the tree's reliability, FastTree uses local bootstrapping, which gives another 100-fold speedup over a distance matrix. For example, FastTree computed a tree and support values for 158,022 distinct 16S ribosomal RNAs in 17 hours and 2.4 gigabytes of memory. Just computing pairwise Jukes-Cantor distances and storing them, without inferring a tree or bootstrapping, would require 17 hours and 50 gigabytes of memory. In simulations, FastTree was slightly more accurate than neighbor joining, BIONJ, or FastME; on genuine alignments, FastTree's topologies had higher likelihoods. FastTree is available at http://microbesonline.org/fasttree.
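
    The central data structure in the abstract above is the sequence profile that replaces rows of a distance matrix. A toy sketch of that idea follows, assuming a gap-free DNA alphabet and ignoring the distance corrections and join heuristics of the real FastTree: an internal node is summarized by per-site character frequencies, and a distance between two nodes is computed directly from their profiles.

        # Toy profile computation: an internal node is summarized by per-site
        # character frequencies rather than by a row of an N x N distance matrix.
        from collections import Counter

        ALPHABET = "ACGT"

        def profile_from_sequences(seqs):
            prof = []
            for i in range(len(seqs[0])):
                counts = Counter(s[i] for s in seqs)
                total = sum(counts[c] for c in ALPHABET)
                prof.append({c: counts[c] / total for c in ALPHABET})
            return prof

        def profile_distance(p, q):
            # Expected fraction of mismatching characters (uncorrected).
            mismatch = sum(1.0 - sum(cp[c] * cq[c] for c in ALPHABET)
                           for cp, cq in zip(p, q))
            return mismatch / len(p)

        a = profile_from_sequences(["ACGT", "ACGA"])
        b = profile_from_sequences(["TCGT", "TCGG"])
        print(round(profile_distance(a, b), 3))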

  20. A novel forward projection-based metal artifact reduction method for flat-detector computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Prell, Daniel; Kyriakou, Yiannis; Beister, Marcel; Kalender, Willi A [Institute of Medical Physics, University of Erlangen-Nuernberg, Henkestrasse 91, 91052 Erlangen (Germany)], E-mail: daniel.prell@imp.uni-erlangen.de

    2009-11-07

    Metallic implants generate streak-like artifacts in flat-detector computed tomography (FD-CT) reconstructed volumetric images. This study presents a novel method for reducing these disturbing artifacts by inserting discarded information into the original raw data using a three-step correction procedure and working directly with each detector element. Computation times are minimized by completely implementing the correction process on graphics processing units (GPUs). First, the original volume is corrected using a three-dimensional interpolation scheme in the raw data domain, followed by a second reconstruction. This metal artifact-reduced volume is then segmented into three materials, i.e. air, soft-tissue and bone, using a threshold-based algorithm. Subsequently, a forward projection of the obtained tissue-class model substitutes the missing or corrupted attenuation values directly for each flat detector element that contains attenuation values corresponding to metal parts, followed by a final reconstruction. Experiments using tissue-equivalent phantoms showed a significant reduction of metal artifacts (deviations of CT values after correction compared to measurements without metallic inserts reduced typically to below 20 HU, differences in image noise to below 5 HU) caused by the implants and no significant resolution losses even in areas close to the inserts. To cover a variety of different cases, cadaver measurements and clinical images in the knee, head and spine region were used to investigate the effectiveness and applicability of our method. A comparison to a three-dimensional interpolation correction showed that the new approach outperformed interpolation schemes. Correction times are minimized, and initial and corrected images are made available at almost the same time (12.7 s for the initial reconstruction, 46.2 s for the final corrected image compared to 114.1 s and 355.1 s on central processing units (CPUs)).
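
    The three-step correction described above lends itself to a compact orchestration. The sketch below is only a schematic rendering of that pipeline, assuming user-supplied reconstruct and forward_project callables that stand in for an FD-CT toolchain and return data in the same detector layout as the raw data; the segmentation thresholds are illustrative, not the values used in the paper.

        # Schematic orchestration of the three-step metal artifact reduction.
        # reconstruct and forward_project are assumed user-supplied callables
        # standing in for an FD-CT toolchain (e.g. an FDK reconstructor and a
        # ray-driven projector); the HU thresholds are illustrative only.
        import numpy as np

        def mar_correct(rawdata, metal_mask, reconstruct, forward_project):
            # Step 1: interpolate over metal-affected detector elements in the
            # raw data domain and reconstruct a first artifact-reduced volume.
            interpolated = rawdata.copy()
            interpolated[metal_mask] = np.interp(
                np.flatnonzero(metal_mask),
                np.flatnonzero(~metal_mask),
                rawdata[~metal_mask])
            volume = reconstruct(interpolated)

            # Step 2: segment the volume into air, soft tissue and bone using
            # simple thresholds (illustrative HU values).
            tissue_model = np.select(
                [volume < -300, volume < 300], [-1000.0, 0.0], default=1000.0)

            # Step 3: forward-project the tissue-class model and substitute the
            # values only for detector elements that saw metal.
            corrected = rawdata.copy()
            corrected[metal_mask] = forward_project(tissue_model)[metal_mask]
            return reconstruct(corrected)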

  1. Radiotherapy infrastructure and human resources in Switzerland : Present status and projected computations for 2020.

    Science.gov (United States)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-09-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland.
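
    The gap analysis above boils down to simple arithmetic: multiply cancer incidence by the radiotherapy utilization rate to get the number of patients needing treatment, then divide by per-unit and per-staff throughputs. The sketch below uses illustrative placeholder ratios, not the exact ESTRO-QUARTS/IAEA figures applied in the study; only the 2015 patient numbers are taken from the abstract.

        # Back-of-the-envelope version of the capacity computation; all
        # throughput and staffing ratios below are assumed, illustrative values.
        import math

        def radiotherapy_needs(cancer_incidence, rtu_rate,
                               patients_per_unit=450,   # courses per TRT unit per year (assumed)
                               patients_per_ro=250,     # per radiation oncologist (assumed)
                               patients_per_mp=500,     # per medical physicist (assumed)
                               patients_per_rtt=150):   # per radiotherapy technologist (assumed)
            patients = cancer_incidence * rtu_rate
            return {
                "patients_needing_rt": round(patients),
                "trt_units": math.ceil(patients / patients_per_unit),
                "radiation_oncologists": math.ceil(patients / patients_per_ro),
                "medical_physicists": math.ceil(patients / patients_per_mp),
                "rtts": math.ceil(patients / patients_per_rtt),
            }

        # 2015 figures from the abstract: 30,999 of 45,903 patients needed RT.
        print(radiotherapy_needs(45903, 30999 / 45903))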

  2. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    International Nuclear Information System (INIS)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-01-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology ''Quantification of Radiation Therapy Infrastructure and Staffing'' guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO ''Health Economics in Radiation Oncology'' (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.)

  3. A novel forward projection-based metal artifact reduction method for flat-detector computed tomography

    International Nuclear Information System (INIS)

    Prell, Daniel; Kyriakou, Yiannis; Beister, Marcel; Kalender, Willi A

    2009-01-01

    Metallic implants generate streak-like artifacts in flat-detector computed tomography (FD-CT) reconstructed volumetric images. This study presents a novel method for reducing these disturbing artifacts by inserting discarded information into the original raw data using a three-step correction procedure and working directly with each detector element. Computation times are minimized by completely implementing the correction process on graphics processing units (GPUs). First, the original volume is corrected using a three-dimensional interpolation scheme in the raw data domain, followed by a second reconstruction. This metal artifact-reduced volume is then segmented into three materials, i.e. air, soft-tissue and bone, using a threshold-based algorithm. Subsequently, a forward projection of the obtained tissue-class model substitutes the missing or corrupted attenuation values directly for each flat detector element that contains attenuation values corresponding to metal parts, followed by a final reconstruction. Experiments using tissue-equivalent phantoms showed a significant reduction of metal artifacts (deviations of CT values after correction compared to measurements without metallic inserts reduced typically to below 20 HU, differences in image noise to below 5 HU) caused by the implants and no significant resolution losses even in areas close to the inserts. To cover a variety of different cases, cadaver measurements and clinical images in the knee, head and spine region were used to investigate the effectiveness and applicability of our method. A comparison to a three-dimensional interpolation correction showed that the new approach outperformed interpolation schemes. Correction times are minimized, and initial and corrected images are made available at almost the same time (12.7 s for the initial reconstruction, 46.2 s for the final corrected image compared to 114.1 s and 355.1 s on central processing units (CPUs)).

  4. RadWorks Project. ISS REM-to-BIRD-to-HERA: The Evolution of a Technology

    Science.gov (United States)

    McLeod, Catherine D.

    2015-01-01

    The advancement of particle detectors based on technologies developed for use in high-energy physics applications has enabled the development of a completely new generation of compact, low-power active dosimeters and area monitors for use in space radiation environments. One such device, the TimePix, is being developed at CERN and is providing the technology basis for the most recent line of radiation detection devices being developed by the NASA AES RadWorks project. The most fundamental of these devices, an ISS Radiation Environment Monitor (REM), is installed as a USB device on ISS, where it monitors the radiation environment on a perpetual basis. The second generation of this TimePix technology, the BIRD (Battery-operated Independent Radiation Detector), was flown on the NASA EFT-1 flight in December 2014. The data collected by BIRD were the first made available from the Earth's trapped radiation belts in over 40 years. The 3rd generation of this technology, the HERA (Hybrid Electronic Radiation Assessor), is planned to be integrated into the Orion EM-1 and EM-2 vehicles, where it will monitor the radiation environment. For the EM-2 flight, HERA will provide Caution and Warning notification for SPEs as well as real-time dose measurements for crew members. This line of radiation detectors provides much greater information on and characterization of charged particles in the space radiation environment than has been collected in the past, and in the process better informs crew members of radiation-related risks, while being very power- and mass-efficient.

  5. RNAdualPF: software to compute the dual partition function with sample applications in molecular evolution theory.

    Science.gov (United States)

    Garcia-Martin, Juan Antonio; Bayegan, Amir H; Dotu, Ivan; Clote, Peter

    2016-10-19

    -content. Using different inverse folding software, another group had earlier shown that pre-miRNA is mutationally robust, even controlling for compositional bias. Our opposite conclusion suggests a cautionary note that computationally based insights into molecular evolution may heavily depend on the software used. C/C++-software for RNAdualPF is available at http://bioinformatics.bc.edu/clotelab/RNAdualPF .

  6. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    International Nuclear Information System (INIS)

    Iliopoulos, AS; Sun, X; Pitsianis, N; Yin, FF; Ren, L

    2015-01-01

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub
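
    One way to realize the static/dynamic split described above is to bake the projection geometry into a sparse system matrix once per treatment plan and reuse it across iterations, so that each on-board iteration reduces to cheap sparse matrix-vector products. The sketch below illustrates that idea only; it is not the authors' GPU implementation, and the random sparse matrix merely stands in for precomputed ray-casting weights.

        # Static part: a sparse system matrix A (rows = detector pixels,
        # columns = voxels) computed once from the treatment-plan geometry.
        # Dynamic part: per-iteration products with the changing volume/image data.
        import numpy as np
        from scipy.sparse import random as sparse_random

        n_voxels, n_pixels = 64**3, 128 * 128

        A = sparse_random(n_pixels, n_voxels, density=1e-4, format="csr",
                          random_state=0)   # stand-in for ray-casting weights

        def forward_project(volume):
            return A @ volume.ravel()

        def back_project(projection):
            return (A.T @ projection).reshape(64, 64, 64)

        # Only these products are evaluated per iteration.
        volume = np.zeros((64, 64, 64))
        residual = np.ones(n_pixels) - forward_project(volume)
        volume += 0.1 * back_project(residual)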

  7. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States); Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub

  8. Paleoenvironments, Evolution, and Geomicrobiology in a Tropical Pacific Lake: The Lake Towuti Drilling Project (TOWUTI)

    Science.gov (United States)

    Vogel, Hendrik; Russell, James M.; Bijaksana, Satria; Crowe, Sean; Fowle, David; Haffner, Douglas; King, John; Marwoto, Ristiyanti; Melles, Martin; von Rintelen, Thomas; Stevenson, Janelle; Watkinson, Ian; Wattrus, Nigel

    2014-05-01

    Lake Towuti (2.5°S, 121°E) is a 560 km2, 200-m-deep tectonic lake at the downstream end of the Malili lake system, a set of five ancient (1-2 Myr) tectonic lakes in central Sulawesi, Indonesia. Lake Towuti's location in central Indonesia provides a unique opportunity to reconstruct long-term paleoclimate change in a crucially important yet understudied region: the Indo-Pacific warm pool (IPWP), heart of the El Niño-Southern Oscillation. The Malili Lakes have extraordinarily high rates of floral and faunal endemism, and the lakes are surrounded by one of the most diverse tropical forests on Earth. Drilling in Lake Towuti will identify the age and origin of the lake and the environmental and climatic context that shaped the evolution of this unique lacustrine and terrestrial ecosystem. The ultramafic (ophiolitic) rocks and lateritic soils surrounding Lake Towuti provide metal substrates that feed a diverse, exotic microbial community, analogous to the microbial ecosystems that operated in the Archean oceans. Drill core will provide unique insight into long-term changes in this ecosystem, as well as microbial processes operating at depth in the sediment column. High-resolution seismic reflection data (CHIRP and airgun) combined with numerous long sediment piston cores collected from 2007-2013 demonstrate the enormous promise of Lake Towuti for an ICDP drilling campaign. Well-stratified sequences of up to 150 m thickness, uninterrupted by unconformities or erosional truncation, are present in multiple sub-basins within Towuti, providing ideal sites for long-term environmental, climatic, and limnological reconstructions. Multiproxy analyses of our piston cores document a continuous and detailed record of moisture balance variations in Lake Towuti during the past 60 kyr BP. In detail, our datasets show that wet conditions and rainforest ecosystems in central Indonesia persisted during Marine Isotope Stage 3 (MIS3) and the Holocene, and were interrupted by severe

  9. Recent Evolution of the Offline Computing Model of the NOvA Experiment

    Science.gov (United States)

    Habig, Alec; Norman, A.

    2015-12-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. The current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics.

  10. Recent Evolution of the Offline Computing Model of the NOvA Experiment

    International Nuclear Information System (INIS)

    Habig, Alec; Group, Craig; Norman, A.

    2015-01-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a ν μ beam. Over the last few years there has been intense work to streamline the computing infrastructure in preparation for data, which started to flow in from the far detector in Fall 2013. Major accomplishments for this effort include migration to the use of off-site resources through the use of the Open Science Grid and upgrading the file-handling framework from simple disk storage to a tiered system using a comprehensive data management and delivery system to find and access files on either disk or tape storage. NOvA has already produced more than 6.5 million files and more than 1 PB of raw data and Monte Carlo simulation files which are managed under this model. The current system has demonstrated sustained rates of up to 1 TB/hour of file transfer by the data handling system. NOvA pioneered the use of new tools and this paved the way for their use by other Intensity Frontier experiments at Fermilab. Most importantly, the new framework places the experiment's infrastructure on a firm foundation, and is ready to produce the files needed for first physics. (paper)

  11. Evolution of the ATLAS PanDA Workload Management System for Exascale Computational Science

    CERN Document Server

    Maeno, T; The ATLAS collaboration; Klimentov, A; Nilsson, P; Oleynik, D; Panitkin, S; Petrosyan, A; Schovancova, J; Vaniachine, A; Wenaus, T; Yu, D

    2013-01-01

    An important foundation underlying the impressive success of data processing and analysis in the ATLAS experiment [1] at the LHC [2] is the Production and Distributed Analysis (PanDA) workload management system [3]. PanDA was designed specifically for ATLAS and proved to be highly successful in meeting all the distributed computing needs of the experiment. However, the core design of PanDA is not experiment specific. The PanDA workload management system is capable of meeting the needs of other data intensive scientific applications. Alpha-Magnetic Spectrometer [4], an astro-particle experiment on the International Space Station, and the Compact Muon Solenoid [5], an LHC experiment, have successfully evaluated PanDA and are pursuing its adoption. In this paper, a description of the new program of work to develop a generic version of PanDA will be given, as well as the progress in extending PanDA's capabilities to support supercomputers and clouds and to leverage intelligent networking. PanDA has demonstrated a...

  12. Evolution of the ATLAS PanDA Workload Management System for Exascale Computational Science

    CERN Document Server

    Maeno, T; The ATLAS collaboration; Klimentov, A; Nilsson, P; Oleynik, D; Panitkin, S; Petrosyan, A; Schovancova, J; Vaniachine, A; Wenaus, T; Yu, D

    2014-01-01

    An important foundation underlying the impressive success of data processing and analysis in the ATLAS experiment [1] at the LHC [2] is the Production and Distributed Analysis (PanDA) workload management system [3]. PanDA was designed specifically for ATLAS and proved to be highly successful in meeting all the distributed computing needs of the experiment. However, the core design of PanDA is not experiment specific. The PanDA workload management system is capable of meeting the needs of other data intensive scientific applications. Alpha-Magnetic Spectrometer [4], an astro-particle experiment on the International Space Station, and the Compact Muon Solenoid [5], an LHC experiment, have successfully evaluated PanDA and are pursuing its adoption. In this paper, a description of the new program of work to develop a generic version of PanDA will be given, as well as the progress in extending PanDA's capabilities to support supercomputers and clouds and to leverage intelligent networking. PanDA has demonstrated a...

  13. Computed tomographic evolution of post-traumatic subdural hygroma in young adults

    International Nuclear Information System (INIS)

    Masuzawa, T.; Sato, F.

    1984-01-01

    The authors report on two cases of post-traumatic subdural hygroma that were encountered in young adults. Serial computed tomograms were taken immediately following trauma and for more than 4 weeks thereafter. In the case of a 28-year-old man with a skull fracture, an initial CT scan revealed a thin crescentic subdural collection in the right frontal area. A successive CT scan on the 36th postoperative day revealed a developed subdural hygroma, and the CSF-like fluid was surgically evacuated. In the second case, involving an 18-year-old man, a very thin bifrontal subdural collection was found on the initial CT scan, and on the 15th post-traumatic day a CT scan demonstrated a bifrontal subdural hygroma. No surgical treatment was carried out, and the follow-up CT scan on the 29th post-traumatic day demonstrated no change in size. The two young patients were slightly symptomatic during the period involved, and the repeat unenhanced CT scans showed subdural lesions of less than brain density, even in the chronic stage. (orig.)

  14. Responding to change - The evolution of operator training for the PFR liquid metals disposal project

    International Nuclear Information System (INIS)

    Cashmore, Stephen

    2006-01-01

    environmental management practice, UKAEA decided to add a Caesium Removal Plant (CRP) on to the SDP. Neutralized effluent from the SDP would now be pumped through an ion exchange column prior to discharge to the site effluent treatment plant. In conclusion, commissioning and operating the PFR Liquid Metals Disposal Plant was a challenging task. Training and qualifying the operators was part of that challenge. Though lengthy and time intensive, the LMD training process had several positive benefits: 1. The process demonstrated that persons from a semi-skilled background with little or no previous experience, could be trained to operate a relatively complex process plant safely and efficiently; 2. The formally documented progress of each stage of training provided a clearly auditable record that was acceptable to all parties, including the regulators; 3. The cost of implementing the training was more than compensated for by the saving made in not having to employ shift engineers for the LMD project; 4. Once proved, the training methodology lent itself to adaptation for use with similar projects at Dounreay; 5. The range of skills and knowledge, acquired by the operators during their training, together with their experience of formal learning, should assist them with any similar role they may wish to apply themselves to in the future. To date (November 2005) the LMD plant has successfully processed over 1000 te of PFR's liquid metal inventory, improving safety by reducing a major potential hazard. It has also enabled UKAEA to meet the targets set by the Dounreay Near Term Work Plan for decommissioning the site. The operator team has had their SQEP status formally reviewed by the UKAEA ATO Holder, and extended for a further year, demonstrating the ongoing value of the rigorous training programme they undertook initially

  15. Using Computers in K-12 Schools: A Project Presentation and Evaluation.

    Science.gov (United States)

    Jurema, Ana Cristinia L. A.; Lima, Maria Edite Costa; Filho, Merval Jurema

    The challenge facing educators today is not just to use computers at school but to use computer education and "Informatics" (information plus automatics, placing computer education in the broader context of information and technology) to mediate improved social and learning relations in schools. In order to introduce computers into…

  16. Intelligent and interactive computer image of a nuclear power plant: The ImagIn project

    International Nuclear Information System (INIS)

    Haubensack, D.; Malvache, P.; Valleix, P.

    1998-01-01

    The ImagIn project consists of a method and a set of computer tools apt to bring perceptible and assessable improvements in the operational safety of a nuclear plant. Its aim is to design an information system that would maintain a highly detailed computerized representation of a nuclear plant in its initial state and throughout its in-service life. It is not a tool to operate or help operate the nuclear plant, but a tool that manages concurrent operations that modify the plant configuration in a very general way (maintenance, for example). The configuration of the plant, as well as rules and constraints about it, are described in an object-oriented knowledge database, which is built using a generic ImagIn meta-model based on semantic network theory. An inference engine works on this database and is connected to reality through interfaces to operators and sensors on the installation; it constantly verifies in real time the consistency of the database according to its inner rules, and reports any problems to the operators concerned. A special effort is made on the interfaces to provide natural and intuitive tools (using virtual reality, natural language, voice recognition and synthesis). A laboratory application on a fictitious but realistic installation already exists and is used to simulate various tests and scenarios. A real application is being constructed on Siloe, an experimental reactor of the CEA. (author)

  17. Recommendations for computer modeling codes to support the UMTRA groundwater restoration project

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, M.D. [Sandia National Labs., Albuquerque, NM (United States); Khan, M.A. [IT Corp., Albuquerque, NM (United States)

    1996-04-01

    The Uranium Mill Tailings Remedial Action (UMTRA) Project is responsible for the assessment and remedial action at the 24 former uranium mill tailings sites located in the US. The surface restoration phase, which includes containment and stabilization of the abandoned uranium mill tailings piles, has a specific termination date and is nearing completion. Therefore, attention has now turned to the groundwater restoration phase, which began in 1991. Regulated constituents in groundwater whose concentrations or activities exceed maximum contaminant levels (MCLs) or background levels at one or more sites include, but are not limited to, uranium, selenium, arsenic, molybdenum, nitrate, gross alpha, radium-226 and radium-228. The purpose of this report is to recommend computer codes that can be used to assist the UMTRA groundwater restoration effort. The report includes a survey of applicable codes in each of the following areas: (1) groundwater flow and contaminant transport modeling codes, (2) hydrogeochemical modeling codes, (3) pump and treat optimization codes, and (4) decision support tools. Following the survey of the applicable codes, specific codes that can best meet the needs of the UMTRA groundwater restoration program in each of the four areas are recommended.

  18. Block matching sparsity regularization-based image reconstruction for incomplete projection data in computed tomography

    Science.gov (United States)

    Cai, Ailong; Li, Lei; Zheng, Zhizhong; Zhang, Hanming; Wang, Linyuan; Hu, Guoen; Yan, Bin

    2018-02-01

    In medical imaging many conventional regularization methods, such as total variation or total generalized variation, impose strong prior assumptions which can only account for very limited classes of images. A more reasonable sparse representation frame for images is still badly needed. Visually understandable images contain meaningful patterns, and combinations or collections of these patterns can be utilized to form some sparse and redundant representations which promise to facilitate image reconstructions. In this work, we propose and study block matching sparsity regularization (BMSR) and devise an optimization program using BMSR for computed tomography (CT) image reconstruction for an incomplete projection set. The program is built as a constrained optimization, minimizing the L1-norm of the coefficients of the image in the transformed domain subject to data observation and positivity of the image itself. To solve the program efficiently, a practical method based on the proximal point algorithm is developed and analyzed. In order to accelerate the convergence rate, a practical strategy for tuning the BMSR parameter is proposed and applied. The experimental results for various settings, including real CT scanning, have verified the proposed reconstruction method showing promising capabilities over conventional regularization.
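
    The reconstruction program above minimizes the L1 norm of transform coefficients subject to data fidelity and positivity, solved with a proximal point method. The proximal operator of the L1 norm is soft thresholding, and the following is a generic ISTA-style sketch of that scheme under stated assumptions: a plain random matrix stands in for the block-matching transform and CT projector of the paper, and the constrained problem is relaxed to a penalized form.

        # Generic ISTA-style iteration for an L1-regularized, nonnegative
        # least-squares problem; A is a stand-in operator, not the BMSR transform.
        import numpy as np

        def soft_threshold(x, t):
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        def ista(A, b, lam=0.1, n_iter=200):
            step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / Lipschitz constant
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                grad = A.T @ (A @ x - b)               # data-fidelity gradient
                x = soft_threshold(x - step * grad, lam * step)
                x = np.maximum(x, 0.0)                 # positivity of the image
            return x

        rng = np.random.default_rng(0)
        A = rng.standard_normal((40, 100))             # under-determined system
        x_true = np.zeros(100)
        x_true[[3, 50, 97]] = [1.0, 2.0, 1.5]          # sparse nonnegative signal
        x_hat = ista(A, A @ x_true, lam=0.05)
        print("recovered support:", np.flatnonzero(x_hat > 0.1))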

  19. Recommendations for computer modeling codes to support the UMTRA groundwater restoration project

    International Nuclear Information System (INIS)

    Tucker, M.D.; Khan, M.A.

    1996-04-01

    The Uranium Mill Tailings Remedial Action (UMTRA) Project is responsible for the assessment and remedial action at the 24 former uranium mill tailings sites located in the US. The surface restoration phase, which includes containment and stabilization of the abandoned uranium mill tailings piles, has a specific termination date and is nearing completion. Therefore, attention has now turned to the groundwater restoration phase, which began in 1991. Regulated constituents in groundwater whose concentrations or activities exceed maximum contaminant levels (MCLs) or background levels at one or more sites include, but are not limited to, uranium, selenium, arsenic, molybdenum, nitrate, gross alpha, radium-226 and radium-228. The purpose of this report is to recommend computer codes that can be used to assist the UMTRA groundwater restoration effort. The report includes a survey of applicable codes in each of the following areas: (1) groundwater flow and contaminant transport modeling codes, (2) hydrogeochemical modeling codes, (3) pump and treat optimization codes, and (4) decision support tools. Following the survey of the applicable codes, specific codes that can best meet the needs of the UMTRA groundwater restoration program in each of the four areas are recommended.

  20. Creating computer-aided 3D model of spleen and kidney based on Visible Human Project

    International Nuclear Information System (INIS)

    Aldur, Muhammad M.

    2005-01-01

    To investigate the efficacy of a computer-aided 3-dimensional (3D) reconstruction technique for visualization and modeling of gross anatomical structures with an affordable methodology applied to the spleen and kidney. From The Visible Human Project Dataset cryosection images, developed by the National Library of Medicine, the spleen and kidney sections were chosen due to their highly distinct contours. The software used for the reconstruction was Surf Driver 3.5.3 for Mac and Cinema 4D XL version 7.1 for Mac OS X. This study was carried out in May 2004 at the Department of Anatomy, Hacettepe University, Ankara, Turkey. As a result of this study, it was determined that these 2 programs could be used effectively both for 3D modeling of the mentioned organs and for volumetric analyses of these models. It was also seen that it is possible to obtain physical models of these digital gross anatomical structures with the stereolithography technique, by means of the data exchange file format provided by the program, and to present such images as anaglyphs. Surf Driver 3.5.3 for Mac OS and Cinema 4D XL version 7.1 for Mac OS X can be used effectively for reconstruction of gross anatomical structures from serial parallel sections with distinct contours, such as the spleen and kidney, and for the animation of models. These software packages constitute a highly effective way of obtaining volumetric calculations, spatial relations and morphometrical measurements of reconstructed structures. (author)

  1. Computational Experiment Study on Selection Mechanism of Project Delivery Method Based on Complex Factors

    Directory of Open Access Journals (Sweden)

    Xiang Ding

    2014-01-01

    Full Text Available Project delivery planning is a key stage used by the project owner (or project investor) for organizing design, construction, and other operations in a construction project. The main task in this stage is to select an appropriate project delivery method (PDM). In order to analyze the different factors affecting PDM selection, this paper establishes a multiagent model, mainly to show how project complexity, governance strength, and market environment affect the project owner's decision on the PDM. Experiment results show that project owners usually choose the Design-Build method when project complexity lies within a certain range. This paper also points out that the Design-Build method will be the preferred choice when the potential contractors develop quickly. The paper provides owners with methods and suggestions by showing how these factors affect PDM selection, and it may improve project performance.

  2. Protein consensus-based surface engineering (ProCoS): a computer-assisted method for directed protein evolution.

    Science.gov (United States)

    Shivange, Amol V; Hoeffken, Hans Wolfgang; Haefner, Stefan; Schwaneberg, Ulrich

    2016-12-01

    Protein consensus-based surface engineering (ProCoS) is a simple and efficient method for directed protein evolution combining computational analysis and molecular biology tools to engineer protein surfaces. ProCoS is based on the hypothesis that conserved residues originated from a common ancestor and that these residues are crucial for the function of a protein, whereas highly variable regions (situated on the surface of a protein) can be targeted for surface engineering to maximize performance. ProCoS comprises four main steps: (i) identification of conserved and highly variable regions; (ii) protein sequence design by substituting residues in the highly variable regions, and gene synthesis; (iii) in vitro DNA recombination of synthetic genes; and (iv) screening for active variants. ProCoS is a simple method for surface mutagenesis in which multiple sequence alignment is used for selection of surface residues based on a structural model. To demonstrate the technique's utility for directed evolution, the surface of a phytase enzyme from Yersinia mollaretii (Ymphytase) was subjected to ProCoS. Screening just 1050 clones from ProCoS engineering-guided mutant libraries yielded an enzyme with 34 amino acid substitutions. The surface-engineered Ymphytase exhibited 3.8-fold higher pH stability (at pH 2.8 for 3 h) and retained 40% of the enzyme's specific activity (400 U/mg) compared with the wild-type Ymphytase. The pH stability might be attributed to a significantly increased (20 percentage points; from 9% to 29%) number of negatively charged amino acids on the surface of the engineered phytase.
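
    Step (i) of ProCoS ranks alignment positions by conservation. The toy sketch below scores each column of a multiple sequence alignment by the frequency of its most common residue and flags weakly conserved columns as candidate engineering targets; the real workflow additionally maps candidates onto a structural model to keep only surface residues, which is omitted here, and the alignment and threshold are made up for illustration.

        # Toy per-column conservation scoring from a multiple sequence alignment.
        from collections import Counter

        def conservation_scores(alignment):
            n_seqs = len(alignment)
            scores = []
            for i in range(len(alignment[0])):
                counts = Counter(seq[i] for seq in alignment if seq[i] != "-")
                scores.append(counts.most_common(1)[0][1] / n_seqs if counts else 0.0)
            return scores

        msa = ["MKTAYIAK", "MKSAYIVK", "MKTGYIAR", "MKTAYLAK"]   # made-up alignment
        scores = conservation_scores(msa)
        variable = [i for i, s in enumerate(scores) if s < 0.8]  # illustrative cutoff
        print("candidate positions for surface engineering:", variable)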

  3. Teachers' Views about the Use of Tablet Computers Distributed in Schools as Part of the Fatih Project

    Science.gov (United States)

    Gökmen, Ömer Faruk; Duman, Ibrahim; Akgün, Özcan Erkan

    2018-01-01

    The purpose of this study is to investigate teachers' views about the use of tablet computers distributed as a part of the FATIH (Movement for Enhancing Opportunities and Improving Technology) Project. In this study, the case study method, one of the qualitative research methods, was used. The participants were 20 teachers from various fields…

  4. Knowledge-based geographic information systems on the Macintosh computer: a component of the GypsES project

    Science.gov (United States)

    Gregory Elmes; Thomas Millette; Charles B. Yuill

    1991-01-01

    GypsES, a decision-support and expert system for the management of Gypsy Moth, addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...

  5. Implementation of Service Learning and Civic Engagement for Computer Information Systems Students through a Course Project at the Hashemite University

    Science.gov (United States)

    Al-Khasawneh, Ahmad; Hammad, Bashar K.

    2013-01-01

    Service learning methodologies provide information systems students with the opportunity to create and implement systems in real-world, public service-oriented social contexts. This paper presents a case study of integrating a service learning project into an undergraduate Computer Information Systems course titled "Information Systems"…

  6. What Is Our Current Understanding of One-to-One Computer Projects: A Systematic Narrative Research Review

    Science.gov (United States)

    Fleischer, Hakan

    2012-01-01

    The aim of this article is to review cross-disciplinary accumulated empirical research on one-to-one computer projects in school settings as published in peer-reviewed journals between 2005 and 2010, particularly the results of teacher- and pupil-oriented studies. Six hundred and five research articles were screened at the abstract and title…

  7. Collective Efficacy and Its Relationship with Leadership in a Computer-Mediated Project-Based Group Work

    Science.gov (United States)

    Huh, Yeol; Reigeluth, Charles M.; Lee, Dabae

    2014-01-01

    Based on Bandura's work, the four sources of efficacy shaping were examined in regard to frequency and students' perception of importance in a computer-mediated, project-based high school classroom. In a context of group work where there was no designated leader, groups' collective efficacy was examined to determine whether it has any relationship with individual's…

  8. The Effect of a Graph-Oriented Computer-Assisted Project-Based Learning Environment on Argumentation Skills

    Science.gov (United States)

    Hsu, P. -S.; Van Dyke, M.; Chen, Y.; Smith, T. J.

    2015-01-01

    The purpose of this quasi-experimental study was to explore how seventh graders in a suburban school in the United States developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application. A total of 54 students (three classes) comprised this treatment…

  9. Computer Tutors: An Innovative Approach to Computer Literacy. Part I: The Early Stages.

    Science.gov (United States)

    Targ, Joan

    1981-01-01

    In Part I of this two-part article, the author describes the evolution of the Computer Tutor project in Palo Alto, California, and the strategies she incorporated into a successful student-taught computer literacy program. Journal availability: Educational Computer, P.O. Box 535, Cupertino, CA 95015. (Editor/SJL)

  10. Stochastic Coastal/Regional Uncertainty Modelling: a Copernicus marine research project in the framework of Service Evolution

    Science.gov (United States)

    Vervatis, Vassilios; De Mey, Pierre; Ayoub, Nadia; Kailas, Marios; Sofianos, Sarantis

    2017-04-01

    The project entitled Stochastic Coastal/Regional Uncertainty Modelling (SCRUM) aims at strengthening CMEMS in the areas of ocean uncertainty quantification, ensemble consistency verification and ensemble data assimilation. The project has been initiated by the University of Athens and LEGOS/CNRS research teams, in the framework of CMEMS Service Evolution. The work is based on stochastic modelling of ocean physics and biogeochemistry in the Bay of Biscay, on an identical sub-grid configuration of the IBI-MFC system in its latest CMEMS operational version V2. In a first step, we use a perturbed tendencies scheme to generate ensembles describing uncertainties in open ocean and on the shelf, focusing on upper ocean processes. In a second step, we introduce two methodologies (i.e. rank histograms and array modes) aimed at checking the consistency of the above ensembles with respect to TAC data and arrays. Preliminary results highlight that wind uncertainties dominate all other atmosphere-ocean sources of model errors. The ensemble spread in medium-range ensembles is approximately 0.01 m for SSH and 0.15 °C for SST, though these values vary depending on season and cross shelf regions. Ecosystem model uncertainties emerging from perturbations in physics appear to be moderately larger than those perturbing the concentration of the biogeochemical compartments, resulting in total chlorophyll spread at about 0.01 mg.m-3. First consistency results show that the model ensemble and the pseudo-ensemble of OSTIA (L4) observation SSTs appear to exhibit nonzero joint probabilities with each other since error vicinities overlap. Rank histograms show that the model ensemble is initially under-dispersive, though results improve in the context of seasonal-range ensembles.
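
    The rank-histogram consistency check mentioned above can be computed in a few lines: for each observation, count how many ensemble members fall below it, and histogram those ranks. A flat histogram suggests a statistically consistent ensemble, while a U shape suggests under-dispersion. A minimal sketch with synthetic data (not the SCRUM model or OSTIA SSTs) follows.

        # Minimal rank-histogram computation for ensemble consistency checking.
        import numpy as np

        def rank_histogram(ensemble, observations):
            # ensemble: (n_members, n_points); observations: (n_points,)
            n_members = ensemble.shape[0]
            ranks = np.sum(ensemble < observations[None, :], axis=0)
            return np.bincount(ranks, minlength=n_members + 1)

        rng = np.random.default_rng(1)
        members = rng.normal(0.0, 0.5, size=(20, 1000))   # under-dispersive ensemble
        obs = rng.normal(0.0, 1.0, size=1000)
        print(rank_histogram(members, obs))               # expect peaked end bins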

  11. Examination of China’s performance and thematic evolution in quantum cryptography research using quantitative and computational techniques

    Science.gov (United States)

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China’s quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001–2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China’s QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China’s performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China’s performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China’s H-index (a normalized indicator) has surpassed all other countries’ over the last several years. The second phase of analysis shows how China’s main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China’s QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology
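
    Of the indicators tracked in the study above, the H-index is the easiest to restate precisely: the largest h such that at least h publications have at least h citations each. A minimal computation from a list of citation counts (the numbers below are made up) is:

        # H-index from a list of citation counts.
        def h_index(citations):
            cites = sorted(citations, reverse=True)
            # Count ranks whose citation count still meets or exceeds the rank.
            return sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)

        print(h_index([10, 8, 5, 4, 3]))   # -> 4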

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  13. TH-E-17A-05: Optimizing Four Dimensional Cone Beam Computed Tomography Projection Allocation to Respiratory Bins

    International Nuclear Information System (INIS)

    OBrien, R; Shieh, C; Kipritidis, J; Keall, P

    2014-01-01

    Purpose: Four dimensional cone beam computed tomography (4DCBCT) is an emerging image guidance strategy but it can suffer from poor image quality. To avoid repeating scans it is beneficial to make the best use of the imaging data obtained. For conventional 4DCBCT the location and size of respiratory bins is fixed and projections are allocated to the respiratory bin within which it falls. Strictly adhering to this rule is unnecessary and can compromise image quality. In this study we optimize the size and location of respiratory bins and allow projections to be sourced from adjacent phases of the respiratory cycle. Methods: A mathematical optimization framework using mixed integer quadratic programming has been developed that determines when to source projections from adjacent respiratory bins and optimizes the size and location of the bins. The method, which we will call projection sharing, runs in under 2 seconds of CPU time. Five 4DCBCT datasets of stage III-IV lung cancer patients were used to test the algorithm. The standard deviation of the angular separation between projections (SD-A) and the standard deviation in the volume of the reconstructed fiducial gold coil (SD-V) were used as proxies to measure streaking artefacts and motion blur respectively. Results: The SD-A using displacement binning and projection sharing was 30%–50% smaller than conventional phase based binning and 59%–76% smaller than conventional displacement binning indicating more uniformly spaced projections and fewer streaking artefacts. The SD-V was 20–90% smaller when using projection sharing than using conventional phase based binning suggesting more uniform marker segmentation and less motion blur. Conclusion: Image quality was visibly and significantly improved with projection sharing. Projection sharing does not require any modifications to existing hardware and offers a more robust replacement to phase based binning, or, an option if phase based reconstruction is not of a
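
    The SD-A proxy described above is simply the standard deviation of the angular gaps between consecutive projections assigned to a respiratory bin; smaller values mean more uniform angular coverage and fewer streaks. A minimal sketch (ignoring gantry wrap-around, with made-up projection angles) that contrasts an evenly spread bin with a clustered one:

        # Standard deviation of angular separations within one respiratory bin.
        import numpy as np

        def sd_angular_separation(angles_deg):
            angles = np.sort(np.asarray(angles_deg))
            gaps = np.diff(angles)
            return float(np.std(gaps))

        uniform_bin = np.linspace(0, 200, 40)            # evenly spread projections
        clustered_bin = np.concatenate([np.linspace(0, 60, 30),
                                        np.linspace(150, 200, 10)])
        print(sd_angular_separation(uniform_bin), sd_angular_separation(clustered_bin))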

  14. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan

    1993-01-01

    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  15. The ATLAS Distributed Computing project for LHC Run-2 and beyond.

    CERN Document Server

    Di Girolamo, Alessandro; The ATLAS collaboration

    2015-01-01

    The ATLAS Distributed Computing infrastructure has evolved after the first period of LHC data taking in order to cope with the challenges of the upcoming LHC Run2. An increased data rate, the computing demands of Monte Carlo simulation, and new approaches to ATLAS analysis dictated a more dynamic workload management system (ProdSys2) and data management system (Rucio), overcoming the boundaries imposed by the design of the old computing model. In particular, the commissioning of new central computing system components was the core part of the migration toward the flexible computing model. Flexible utilization of opportunistic resources such as HPC, cloud, and volunteer computing is embedded in the new computing model; the data access mechanisms have been enhanced with remote access; and the network topology and performance are deeply integrated into the core of the system. Moreover, a new data management strategy, based on a defined lifetime for each dataset, has been defin...

  16. ANL/Star project: a new architecture for large scale theoretical physics computations

    International Nuclear Information System (INIS)

    Rushton, A.M.

    1985-01-01

    The project reported consists of two phases, each of which has goals of substantial physics content on its own. In Phase 1, we have selected Star Technologies' ST-100 as the array processor for the prototype coupled system and have installed one on a Vax 11/750 host. Our goals with this system are to institute a substantial program in computational physics at Argonne based on the power provided by this system and thereby to gain experience with both the hardware and software architecture of the ST-100. In Phase II, we propose to build a prototype consisting of two coupled array processors with shared memory to prove that this design can achieve high speed and efficiency in a readily extensible and cost-effective manner. This will implement all of the hardware and software modifications necessary to extend this design to as many as 64 (or more) nodes. In our design, we seek to minimize the changes made in the standard system hardware and software; this drastically reduces the effort required by our group to implement such a design and enables us to more readily incorporate the companies' upgrades to the array processor. It should be emphasized that our design is intended as a special purpose system for theoretical calculations; however it can be efficiently applied to a surprisingly broad class of problems. I shall discuss first the architecture of the ST-100 and then the physics program being currently implemented on a single system. Finally the proposed design of the coupled system is presented

  17. ANL/Star project: a new architecture for large scale theoretical physics computations

    Energy Technology Data Exchange (ETDEWEB)

    Rushton, A.M.

    1985-01-01

    The project reported consists of two phases, each of which has goals of substantial physics content on its own. In Phase 1, we have selected Star Technologies' ST-100 as the array processor for the prototype coupled system and have installed one on a Vax 11/750 host. Our goals with this system are to institute a substantial program in computational physics at Argonne based on the power provided by this system and thereby to gain experience with both the hardware and software architecture of the ST-100. In Phase II, we propose to build a prototype consisting of two coupled array processors with shared memory to prove that this design can achieve high speed and efficiency in a readily extensible and cost-effective manner. This will implement all of the hardware and software modifications necessary to extend this design to as many as 64 (or more) nodes. In our design, we seek to minimize the changes made in the standard system hardware and software; this drastically reduces the effort required by our group to implement such a design and enables us to more readily incorporate the companies' upgrades to the array processor. It should be emphasized that our design is intended as a special purpose system for theoretical calculations; however it can be efficiently applied to a surprisingly broad class of problems. I shall discuss first the architecture of the ST-100 and then the physics program being currently implemented on a single system. Finally the proposed design of the coupled system is presented.

  18. Mis-translation of a Computationally Designed Protein Yields an Exceptionally Stable Homodimer: Implications for Protein Engineering and Evolution.

    Energy Technology Data Exchange (ETDEWEB)

    Dantas, Gautam; Watters, Alexander L.; Lunde, Bradley; Eletr, Ziad; Isern, Nancy G.; Roseman, Toby; Lipfert, Jan; Doniach, Sebastian; Tompa, Martin; Kuhlman, Brian; Stoddard, Barry L.; Varani, Gabriele; Baker, David

    2006-10-06

    We recently used computational protein design to create an extremely stable, globular protein, Top7, with a sequence and fold not observed previously in nature. Since Top7 was created in the absence of genetic selection, it provides a rare opportunity to investigate aspects of the cellular protein production and surveillance machinery that are subject to natural selection. Here we show that a portion of the Top7 protein corresponding to the final 49 C-terminal residues is efficiently mistranslated and accumulates at high levels in E. coli. We used circular dichroism spectroscopy, size-exclusion chromatography, small-angle x-ray scattering, analytical ultra-centrifugation, and NMR spectroscopy to show that the resulting CFr protein adopts a compact, extremely-stable, obligate, symmetric, homo-dimeric structure. Based on the solution structure, we engineered an even more stable variant of CFr by disulfide-induced covalent circularisation that should be an excellent platform for design of novel functions. The accumulation of high levels of CFr exposes the high error rate of the protein translation machinery, and the rarity of correspondingly stable fragments in natural proteins implies a stringent evolutionary pressure against protein sub-fragments that can independently fold into stable structures. The symmetric self-association between two identical mistranslated CFr sub-units to generate an extremely stable structure parallels a mechanism for natural protein-fold evolution by modular recombination of stable protein sub-structures.

  19. The Influence of a Game-Making Project on Male and Female Learners' Attitudes to Computing

    Science.gov (United States)

    Robertson, Judy

    2013-01-01

    There is a pressing need for gender inclusive approaches to engage young people in computer science. A recent popular approach has been to harness learners' enthusiasm for computer games to motivate them to learn computer science concepts through game authoring. This article describes a study in which 992 learners across 13 schools took part in a…

  20. Computer-Assisted Language Learning: Current Programs and Projects. ERIC Digest.

    Science.gov (United States)

    Higgins, Chris

    For many years, foreign language teachers have used the computer to provide supplemental exercises in the instruction of foreign languages. In recent years, advances in computer technology have motivated teachers to reassess the computer and consider it a valuable part of daily foreign language learning. Innovative software programs, authoring…

  1. The SIMRAND 1 computer program: Simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    The SIMRAND I Computer Program (Version 5.0 x 0.3), written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles, is described. The SIMRAND I Computer Program comprises eleven modules: a main routine and ten subroutines. Two additional files are used at compile time; one inserts the system or task equations into the source code, while the other inserts the dimension statements and common blocks. The SIMRAND I Computer Program can be run on most microcomputers or mainframe computers with only minor modifications to the computer code.

  2. Magnetohydrodynamics: Parallel computation of the dynamics of thermonuclear and astrophysical plasmas. 1. Annual report of massively parallel computing pilot project 93MPR05

    International Nuclear Information System (INIS)

    1994-08-01

    This is the first annual report of the MPP pilot project 93MPR05. In this pilot project four research groups with different, complementary backgrounds collaborate with the aim to develop new algorithms and codes to simulate the magnetohydrodynamics of thermonuclear and astrophysical plasmas on massively parallel machines. The expected speed-up is required to simulate the dynamics of the hot plasmas of interest which are characterized by very large magnetic Reynolds numbers and, hence, require high spatial and temporal resolutions (for details see section 1). The four research groups that collaborated to produce the results reported here are: The MHD group of Prof. Dr. J.P. Goedbloed at the FOM-Institute for Plasma Physics 'Rijnhuizen' in Nieuwegein, the group of Prof. Dr. H. van der Vorst at the Mathematics Institute of Utrecht University, the group of Prof. Dr. A.G. Hearn at the Astronomical Institute of Utrecht University, and the group of Dr. Ir. H.J.J. te Riele at the CWI in Amsterdam. The full project team met frequently during this first project year to discuss progress reports, current problems, etc. (see section 2). The main results of the first project year are: (1) proof of the scalability of typical linear and nonlinear MHD codes; (2) development and testing of a parallel version of the Arnoldi algorithm; (3) development and testing of alternative methods for solving large non-Hermitian eigenvalue problems; and (4) porting of the 3D nonlinear semi-implicit time evolution code HERA to an MPP system. The steps that were scheduled to reach these intended results are given in section 3. (orig./WL)
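
    The Arnoldi algorithm mentioned in the results builds an orthonormal Krylov basis for a large non-Hermitian matrix, from which approximate (Ritz) eigenvalues are extracted. A minimal serial sketch with NumPy (the parallel MHD implementation in the report is, of course, far more involved):

      import numpy as np

      def arnoldi(A, v0, m):
          """m steps of the Arnoldi iteration: builds V (orthonormal) and H (upper Hessenberg)."""
          n = len(v0)
          V = np.zeros((n, m + 1))
          H = np.zeros((m + 1, m))
          V[:, 0] = v0 / np.linalg.norm(v0)
          for j in range(m):
              w = A @ V[:, j]
              for i in range(j + 1):                 # modified Gram-Schmidt orthogonalization
                  H[i, j] = np.dot(V[:, i], w)
                  w -= H[i, j] * V[:, i]
              H[j + 1, j] = np.linalg.norm(w)
              if H[j + 1, j] < 1e-12:                # invariant subspace found, stop early
                  return V[:, :j + 1], H[:j + 1, :j + 1]
              V[:, j + 1] = w / H[j + 1, j]
          return V, H

      # Toy non-Hermitian example: Ritz values approximate extreme eigenvalues of A.
      rng = np.random.default_rng(0)
      A = rng.standard_normal((200, 200))
      V, H = arnoldi(A, rng.standard_normal(200), 30)
      k = H.shape[1]
      ritz = np.linalg.eigvals(H[:k, :k])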

  3. Magnetohydrodynamics: Parallel computation of the dynamics of thermonuclear and astrophysical plasmas. 1. Annual report of massively parallel computing pilot project 93MPR05

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-08-01

    This is the first annual report of the MPP pilot project 93MPR05. In this pilot project four research groups with different, complementary backgrounds collaborate with the aim to develop new algorithms and codes to simulate the magnetohydrodynamics of thermonuclear and astrophysical plasmas on massively parallel machines. The expected speed-up is required to simulate the dynamics of the hot plasmas of interest which are characterized by very large magnetic Reynolds numbers and, hence, require high spatial and temporal resolutions (for details see section 1). The four research groups that collaborated to produce the results reported here are: The MHD group of Prof. Dr. J.P. Goedbloed at the FOM-Institute for Plasma Physics 'Rijnhuizen' in Nieuwegein, the group of Prof. Dr. H. van der Vorst at the Mathematics Institute of Utrecht University, the group of Prof. Dr. A.G. Hearn at the Astronomical Institute of Utrecht University, and the group of Dr. Ir. H.J.J. te Riele at the CWI in Amsterdam. The full project team met frequently during this first project year to discuss progress reports, current problems, etc. (see section 2). The main results of the first project year are: (1) proof of the scalability of typical linear and nonlinear MHD codes; (2) development and testing of a parallel version of the Arnoldi algorithm; (3) development and testing of alternative methods for solving large non-Hermitian eigenvalue problems; and (4) porting of the 3D nonlinear semi-implicit time evolution code HERA to an MPP system. The steps that were scheduled to reach these intended results are given in section 3. (orig./WL).

  4. The multimedia computer for low-literacy patient education: a pilot project of cancer risk perceptions.

    Science.gov (United States)

    Wofford, J L; Currin, D; Michielutte, R; Wofford, M M

    2001-04-20

    Inadequate reading literacy is a major barrier to better educating patients. Despite its high prevalence, practical solutions for detecting and overcoming low literacy in a busy clinical setting remain elusive. In exploring the potential role for the multimedia computer in improving office-based patient education, we compared the accuracy of information captured from audio-computer interviewing of patients with that obtained from subsequent verbal questioning. The setting was an adult medicine clinic in an urban community health center; participants were a convenience sample of patients awaiting clinic appointments (n = 59). Exclusion criteria included obvious psychoneurologic impairment or primary language other than English. The intervention was a multimedia computer presentation that used audio-computer interviewing with localized imagery and voices to elicit responses to 4 questions on prior computer use and cancer risk perceptions. Three patients refused or were unable to interact with the computer at all, and 3 patients required restarting the presentation from the beginning but ultimately completed the computerized survey. Of the 51 evaluable patients (72.5% African-American, 66.7% female, mean age 47.5 [+/- 18.1]), the mean time in the computer presentation was significantly longer with older age and with no prior computer use but did not differ by gender or race. Despite a high proportion of no prior computer use (60.8%), there was a high rate of agreement (88.7% overall) between audio-computer interviewing and subsequent verbal questioning. Audio-computer interviewing is feasible in this urban community health center. The computer offers a partial solution for overcoming literacy barriers inherent in written patient education materials and provides an efficient means of data collection that can be used to better target patients' educational needs.

  5. Mind, Evolution, and Computers

    OpenAIRE

    Abrahamson, Joseph R.

    1994-01-01

    Science deals with knowledge of the material world based on objective reality. It is under constant attack by those who need magic, that is, concepts based on imagination and desire, with no basis in objective reality. A convenient target for such people is speculation on the machinery and method of operation of the human mind, questions that are still obscure in 1994. In The Emperor's New Mind, Roger Penrose attempts to look beyond objective reality for possible answers, using, in his argume...

  6. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); Zwahlen, Daniel [Kantonsspital Graubuenden, Department of Radiotherapy, Chur (Switzerland); Bodis, Stephan [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); University Hospital Zurich, Department of Radiation Oncology, Zurich (Switzerland)

    2016-09-15

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey, and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.) [German] The aim of this study was to evaluate the current status of the infrastructure and staffing of the
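
    The kind of computation behind such projections can be sketched as follows; the benchmark ratios below are purely illustrative placeholders, not the actual ESTRO-QUARTS or IAEA figures used in the study:

      import math

      def radiotherapy_needs(cancer_incidence, rtu_rate,
                             patients_per_unit=450.0,   # illustrative benchmark values only,
                             patients_per_ro=250.0,     # NOT the ESTRO-QUARTS/IAEA figures
                             patients_per_mp=500.0,
                             patients_per_rtt=110.0):
          """Translate projected cancer incidence into radiotherapy resource requirements."""
          rt_patients = cancer_incidence * rtu_rate
          return {
              "patients_needing_RT": round(rt_patients),
              "TRT_units": math.ceil(rt_patients / patients_per_unit),
              "radiation_oncologists": math.ceil(rt_patients / patients_per_ro),
              "medical_physicists": math.ceil(rt_patients / patients_per_mp),
              "RTTs": math.ceil(rt_patients / patients_per_rtt),
          }

      # Hypothetical 2020 scenario, loosely shaped on the incidence figures quoted above.
      print(radiotherapy_needs(cancer_incidence=50427, rtu_rate=0.675))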

  7. Coordinated Fault-Tolerance for High-Performance Computing Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Panda, Dhabaleswar Kumar [The Ohio State University; Beckman, Pete

    2011-07-28

    With the Coordinated Infrastructure for Fault Tolerance Systems (CIFTS, as the original project came to be called) project, our aim has been to understand and tackle the following broad research questions, the answers to which will help the HEC community analyze and shape the direction of research in the field of fault tolerance and resiliency on future high-end leadership systems. Will availability of global fault information, obtained by fault information exchange between the different HEC software on a system, allow individual system software to better detect, diagnose, and adaptively respond to faults? If fault-awareness is raised throughout the system through fault information exchange, is it possible to get all system software working together to provide a more comprehensive end-to-end fault management on the system? What are the missing fault-tolerance features that widely used HEC system software lacks today that would inhibit such software from taking advantage of systemwide global fault information? What are the practical limitations of a systemwide approach for end-to-end fault management based on fault awareness and coordination? What mechanisms, tools, and technologies are needed to bring about fault awareness and coordination of responses on a leadership-class system? What standards, outreach, and community interaction are needed for adoption of the concept of fault awareness and coordination for fault management on future systems? Keeping our overall objectives in mind, the CIFTS team has taken a parallel fourfold approach. Our central goal was to design and implement a light-weight, scalable infrastructure with a simple, standardized interface to allow communication of fault-related information through the system and facilitate coordinated responses. This work led to the development of the Fault Tolerance Backplane (FTB) publish-subscribe API specification, together with a reference implementation and several experimental implementations on top of
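
    The publish-subscribe pattern at the heart of the Fault Tolerance Backplane can be illustrated in a few lines of Python; this is only a conceptual sketch, not the actual FTB API, whose calls and event schema are defined in the CIFTS specification:

      from collections import defaultdict

      class FaultBackplane:
          """Toy publish-subscribe bus for fault events (conceptual sketch only)."""

          def __init__(self):
              self._subscribers = defaultdict(list)   # event name -> list of callbacks

          def subscribe(self, event_name, callback):
              self._subscribers[event_name].append(callback)

          def publish(self, event_name, payload):
              for callback in self._subscribers[event_name]:
                  callback(event_name, payload)        # hook for a coordinated response

      # Hypothetical usage: a scheduler and a checkpoint library react to the same fault.
      bus = FaultBackplane()
      bus.subscribe("node_failure", lambda e, p: print("scheduler: drain", p["node"]))
      bus.subscribe("node_failure", lambda e, p: print("checkpointer: restart from", p["ckpt"]))
      bus.publish("node_failure", {"node": "n0123", "ckpt": "step_420"})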

  8. A Visual Database System for Image Analysis on Parallel Computers and its Application to the EOS Amazon Project

    Science.gov (United States)

    Shapiro, Linda G.; Tanimoto, Steven L.; Ahrens, James P.

    1996-01-01

    The goal of this task was to create a design and prototype implementation of a database environment that is particularly suited for handling the image, vision and scientific data associated with NASA's EOS Amazon project. The focus was on a data model and query facilities that are designed to execute efficiently on parallel computers. A key feature of the environment is an interface which allows a scientist to specify high-level directives about how query execution should occur.

  9. Computational Screening for Design of Optimal Coating Materials to Suppress Gas Evolution in Li-Ion Battery Cathodes.

    Science.gov (United States)

    Min, Kyoungmin; Seo, Seung-Woo; Choi, Byungjin; Park, Kwangjin; Cho, Eunseog

    2017-05-31

    Ni-rich layered oxides are attractive materials owing to their potentially high capacity for cathode applications. However, when used as cathodes in Li-ion batteries, they contain a large amount of Li residues, which degrade the electrochemical properties because they are the source of gas generation inside the battery. Here, we propose a computational approach to designing optimal coating materials that prevent gas evolution by removing residual Li from the surface of the battery cathode. To discover promising coating materials, the reactions of 16 metal phosphates (MPs) and 45 metal oxides (MOs) with the Li residues, LiOH, and Li2CO3 are examined within a thermodynamic framework. A materials database is constructed according to density functional theory using a hybrid functional, and the reaction products are obtained according to the phases in thermodynamic equilibrium in the phase diagram. In addition, the gravimetric efficiency is calculated to identify coating materials that can eliminate Li residues with a minimal weight of the coating material. Overall, more MP and MO materials react with LiOH than with Li2CO3. Specifically, MPs exhibit better reactivity to both Li residues, whereas MOs react more with LiOH. The reaction products, such as Li-containing phosphates or oxides, are also obtained to identify the phases on the surface of a cathode after coating. On the basis of the Pareto-front analysis, P2O5 could be an optimal material for the reaction with both Li residuals. Finally, the reactivity of the coating materials containing 3d/4d transition metal elements is better than that of materials containing other types of elements.
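
    The Pareto-front analysis referred to above amounts to keeping only the non-dominated candidates when two objectives are maximized simultaneously (illustrated here with hypothetical reactivity and gravimetric-efficiency scores, not the values computed in the study):

      def pareto_front(candidates):
          """Return candidates not dominated by any other (both objectives maximized)."""
          front = []
          for name, obj1, obj2 in candidates:
              dominated = any(
                  o1 >= obj1 and o2 >= obj2 and (o1 > obj1 or o2 > obj2)
                  for _, o1, o2 in candidates
              )
              if not dominated:
                  front.append(name)
          return front

      # Hypothetical (coating material, reactivity score, gravimetric efficiency) triples.
      materials = [("P2O5", 0.9, 0.8), ("Al2O3", 0.4, 0.9), ("AlPO4", 0.7, 0.5), ("ZrO2", 0.3, 0.3)]
      print(pareto_front(materials))   # -> ['P2O5', 'Al2O3']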

  10. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    Science.gov (United States)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs by performing computations using Navier-Stokes equations solution algorithms and permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry, therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  11. CTmod—A toolkit for Monte Carlo simulation of projections including scatter in computed tomography

    Czech Academy of Sciences Publication Activity Database

    Malušek, Alexandr; Sandborg, M.; Alm Carlsson, G.

    2008-01-01

    Vol. 90, No. 2 (2008), pp. 167-178. ISSN 0169-2607. Institutional research plan: CEZ:AV0Z10480505. Keywords: Monte Carlo; computed tomography; cone beam; scatter. Subject RIV: JC - Computer Hardware; Software. Impact factor: 1.220, year: 2008. http://dx.doi.org/10.1016/j.cmpb.2007.12.005

  12. Computer simulating observations of the Lunar physical libration for the Japanese Lunar project ILOM

    Science.gov (United States)

    Petrova, Natalia; Hanada, Hideo

    2010-05-01

    In the frame of the second stage of the Japanese space mission SELENE-2 (Hanada et al. 2009), the project ILOM (In-situ Lunar Orientation Measurement), planned for after 2017, is an instrument for positioning on the Moon. It will be set near the lunar pole and will determine the parameters of lunar physical libration by positioning several tens of stars in the field of view regularly for longer than one year. The presented work is dedicated to the analysis of computer-simulated future observations. It is proposed that for every star crossing the lunar prime meridian, its polar distance will be measured. The methods of optimal star observation are being developed for the future experiment. Equations are constructed to determine the libration angles τ(t), ρ(t), σ(t) on the basis of the observed polar distances p_obs: f_1(τ, ρ, Iσ, p_obs) = 0, f_2(τ, ρ, Iσ, p_obs) = 0, f_3(τ, ρ, Iσ, p_obs) = 0, or f(X) = 0, where f = (f_1, f_2, f_3)^T and X = (τ, ρ, Iσ)^T (1). At the present stage we have developed the software for the selection of stars for these future polar observations. Stars were taken from various stellar catalogues, such as the UCAC2-BSS, Hipparcos, Tycho and FK6. The software reduces the ICRS coordinates of a star to the selenographic system at the epoch of observation (Petrova et al., 2009). For example, for the epochs 2017-2018 more than 50 stars brighter than m = 12 were selected for the northern pole. In total, these stars give about 600 crossings of the prime meridian during one year. Nevertheless, only a few stars (2-5) may be observed in the vicinity of any one moment. This is not enough to provide a sufficient sample to exclude various kinds of errors. The software includes programmes which can determine the moment of transit of a star across the meridian and the theoretical values of the libration angles at these moments. A serious problem arises when we try to solve equations (1) with the purpose of determining the libration angles on the basis of the simulated p_obs. Polar distances
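
    Solving the system f(X) = 0 for the libration angles is a standard root-finding problem. A minimal Newton-iteration sketch on a hypothetical three-equation system (the real f_1, f_2, f_3 of the ILOM reduction are far more elaborate and are not reproduced here):

      import numpy as np

      def newton_solve(f, jacobian, x0, tol=1e-12, max_iter=50):
          """Newton iteration for a square nonlinear system f(x) = 0."""
          x = np.asarray(x0, dtype=float)
          for _ in range(max_iter):
              fx = f(x)
              if np.linalg.norm(fx) < tol:
                  break
              x = x - np.linalg.solve(jacobian(x), fx)
          return x

      # Hypothetical stand-in for (f1, f2, f3)(tau, rho, I*sigma) = 0.
      def f(x):
          tau, rho, isig = x
          return np.array([tau + 0.1 * np.sin(rho) - 0.02,
                           rho - 0.05 * np.cos(isig) - 0.01,
                           isig + 0.2 * tau * rho - 0.03])

      def jacobian(x):
          tau, rho, isig = x
          return np.array([[1.0, 0.1 * np.cos(rho), 0.0],
                           [0.0, 1.0, 0.05 * np.sin(isig)],
                           [0.2 * rho, 0.2 * tau, 1.0]])

      print(newton_solve(f, jacobian, x0=[0.0, 0.0, 0.0]))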

  13. Student and Staff Perceptions of Key Aspects of Computer Science Engineering Capstone Projects

    Science.gov (United States)

    Olarte, Juan José; Dominguez, César; Jaime, Arturo; Garcia-Izquierdo, Francisco José

    2016-01-01

    In carrying out their capstone projects, students use knowledge and skills acquired throughout their degree program to create a product or provide a technical service. An assigned advisor guides the students and supervises the work, and a committee assesses the projects. This study compares student and staff perceptions of key aspects of…

  14. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    Science.gov (United States)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open source solution for grid computing on the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with, at least, one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour and to compare it with the electroencephalographic signals measured in real brains.
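
    A stripped-down version of such a probabilistic cellular-automaton network can be written in a few lines; the update rule and parameter values below are illustrative, not those of the Neurona@Home model itself:

      import numpy as np

      rng = np.random.default_rng(42)
      N, STEPS, P_FIRE = 1000, 100, 0.2        # illustrative sizes and firing probability
      K = 10                                    # random neighbours per node

      # Random network: each node gets K neighbours and an excitatory (+1) / inhibitory (-1) sign.
      neighbours = rng.integers(0, N, size=(N, K))
      sign = rng.choice([-1, 1], size=N, p=[0.2, 0.8])
      active = rng.random(N) < 0.05             # sparse initial activity

      for _ in range(STEPS):
          # Summed excitatory/inhibitory drive arriving at each node from active neighbours.
          drive = np.zeros(N)
          for src in np.flatnonzero(active):
              np.add.at(drive, neighbours[src], sign[src])
          # Probabilistic threshold rule: positive drive may activate a node, otherwise it deactivates.
          active = (drive > 0) & (rng.random(N) < P_FIRE)

      print("fraction active after", STEPS, "steps:", active.mean())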

  15. Tracking evolution of myoglobin stability in cetaceans using experimentally calibrated computational methods that account for generic protein relaxation

    DEFF Research Database (Denmark)

    Holm, Jeppe; Dasmeh, Pouria; Kepp, Kasper Planeta

    2016-01-01

    The evolution of cetaceans (whales, dolphins, and porpoises) from land to water is one of the most spectacular events in mammal evolution. It has been suggested that selection for higher myoglobin stability (ΔG of folding) allowed whales to conquer the deep-diving niche. The stability of multi-si...

  16. Algorithms for computing parsimonious evolutionary scenarios for genome evolution, the last universal common ancestor and dominance of horizontal gene transfer in the evolution of prokaryotes

    Directory of Open Access Journals (Sweden)

    Galperin Michael Y

    2003-01-01

    Abstract. Background: Comparative analysis of sequenced genomes reveals numerous instances of apparent horizontal gene transfer (HGT), at least in prokaryotes, and indicates that lineage-specific gene loss might have been even more common in evolution. This complicates the notion of a species tree, which needs to be re-interpreted as a prevailing evolutionary trend, rather than the full depiction of evolution, and makes reconstruction of ancestral genomes a non-trivial task. Results: We addressed the problem of constructing parsimonious scenarios for individual sets of orthologous genes given a species tree. The orthologous sets were taken from the database of Clusters of Orthologous Groups of proteins (COGs). We show that the phyletic patterns (patterns of presence-absence in completely sequenced genomes) of almost 90% of the COGs are inconsistent with the hypothetical species tree. Algorithms were developed to reconcile the phyletic patterns with the species tree by postulating gene loss, COG emergence and HGT (the latter two classes of events were collectively treated as gene gains). We prove that each of these algorithms produces a parsimonious evolutionary scenario, which can be represented as a mapping of loss and gain events on the species tree. The distribution of the evolutionary events among the tree nodes substantially depends on the underlying assumptions of the reconciliation algorithm, e.g. whether or not independent gene gains (gain after loss after gain) are permitted. Biological considerations suggest that, on average, gene loss might be a more likely event than gene gain. Therefore different gain penalties were used and the resulting series of reconstructed gene sets for the last universal common ancestor (LUCA) of the extant life forms were analysed. The number of genes in the reconstructed LUCA gene sets grows as the gain penalty increases. However, qualitative examination of the LUCA versions reconstructed with different gain penalties
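
    The core of such a reconciliation can be expressed as weighted small parsimony on the species tree: given a COG's presence/absence pattern at the leaves, choose ancestral states minimizing (losses) + g x (gains), where g is the gain penalty. The sketch below is a simplification under stated assumptions (it treats COG emergence as a gain at the root and does not model HGT donors explicitly, unlike the algorithms in the paper):

      INF = float("inf")

      def parsimony_cost(tree, leaf_state, gain_penalty, loss_cost=1.0):
          """Minimal gain/loss cost explaining a gene's phyletic pattern on a rooted tree.

          tree: dict mapping an internal node to its tuple of children (root named "root").
          leaf_state: dict mapping each leaf to 0 (gene absent) or 1 (gene present).
          A parent->child change 0->1 costs gain_penalty; 1->0 costs loss_cost.
          """
          def edge_cost(parent_state, child_state):
              if parent_state == child_state:
                  return 0.0
              return gain_penalty if child_state == 1 else loss_cost

          def cost(node):
              # Returns [best cost if node's state is 0, best cost if it is 1].
              if node not in tree:                       # leaf: state fixed by observation
                  s = leaf_state[node]
                  return [0.0 if s == 0 else INF, INF if s == 0 else 0.0]
              child_costs = [cost(child) for child in tree[node]]
              return [sum(min(c[s] + edge_cost(state, s) for s in (0, 1))
                          for c in child_costs)
                      for state in (0, 1)]

          c0, c1 = cost("root")
          # A 'present' root is counted as one COG emergence, i.e. charged one gain.
          return min(c0, c1 + gain_penalty)

      # Hypothetical tree ((A,B),(C,D)) with the gene observed only in A and C.
      tree = {"root": ("ab", "cd"), "ab": ("A", "B"), "cd": ("C", "D")}
      pattern = {"A": 1, "B": 0, "C": 1, "D": 0}
      for g in (0.5, 1.0, 2.0):
          print("gain penalty", g, "-> minimal cost", parsimony_cost(tree, pattern, g))

    As the gain penalty grows, scenarios with an ancestrally present gene followed by losses become competitive with independent gains, mirroring the trend described for the reconstructed LUCA gene sets.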

  17. The models of the life cycle of a computer system

    Directory of Open Access Journals (Sweden)

    Sorina-Carmen Luca

    2006-01-01

    The paper presents a comparative study of the models of the life cycle of a computer system. The advantages of each model are analyzed, and graphic schemes are presented that point out each stage and step in the evolution of a computer system. Finally, classifications of the methods of designing computer systems are discussed.

  18. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR Staging

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict dru

  19. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Poster

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  20. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict dru

  1. Real-Time Probabilistic Structural Health Management Using Machine Learning and GPU Computing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project seeks to deliver an ultra-efficient, high-fidelity structural health management (SHM) framework using machine learning and graphics processing...

  2. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    or maximizing the system reliability subject to budget constraints. These kinds of optimization problems have been considered in both deterministic and stochastic frameworks in the literature. Recently, the intuitionistic-fuzzy optimization approach was considered as a successful soft computing modelling approach. ... Firstly, a review of existing soft computing approaches to optimization is given. The main section extends the results by considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures, which proved
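
    For orientation, the self-organizing migrating algorithm (SOMA) moves candidate solutions toward the current best ("leader") in perturbed steps. The sketch below is a simplified AllToOne strategy applied to a toy reliability-allocation objective; the objective, penalty and parameter values are illustrative assumptions, not those used in the paper:

      import numpy as np

      rng = np.random.default_rng(1)

      def objective(x, budget=10.0):
          """Toy problem: maximize series-system reliability prod(1 - exp(-a_i * x_i))
          under a budget on sum(x_i); we minimize -reliability plus a quadratic penalty."""
          a = np.array([0.8, 0.5, 1.2])
          reliability = np.prod(1.0 - np.exp(-a * np.clip(x, 0, None)))
          penalty = 10.0 * max(0.0, x.sum() - budget) ** 2
          return -reliability + penalty

      POP, DIM, MIGRATIONS = 20, 3, 60
      PATH_LENGTH, STEP, PRT = 3.0, 0.21, 0.3       # illustrative SOMA control parameters
      pop = rng.uniform(0.0, 10.0, size=(POP, DIM))
      cost = np.array([objective(x) for x in pop])

      for _ in range(MIGRATIONS):
          leader = pop[np.argmin(cost)].copy()
          for i in range(POP):
              best_x, best_c = pop[i], cost[i]
              prt_vector = (rng.random(DIM) < PRT).astype(float)   # perturbation mask
              for t in np.arange(STEP, PATH_LENGTH + STEP, STEP):
                  candidate = pop[i] + (leader - pop[i]) * t * prt_vector
                  c = objective(candidate)
                  if c < best_c:
                      best_x, best_c = candidate, c
              pop[i], cost[i] = best_x, best_c

      print("best allocation:", pop[np.argmin(cost)], "objective:", cost.min())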

  3. From whole-body counting to imaging: The computer aided collimation gamma camera project (CACAO)

    Energy Technology Data Exchange (ETDEWEB)

    Jeanguillaume, C.; Begot, S.; Quartuccio, M.; Douiri, A.; Ballongue, P

    2000-07-01

    Whole-body counting is the method of choice for in vivo detection of contamination. To extend this well established method, the possible advantages of imaging radiocontaminants are examined. The use of the CACAO project is then studied. A comparison of simulated reconstructed images obtained by the CACAO project and by a conventional gamma camera used in nuclear medicine follows. Imaging a radionuclide contaminant with a geometrical sensitivity of 10⁻² seems possible in the near future. (author)

  4. From whole-body counting to imaging: The computer aided collimation gamma camera project (CACAO)

    International Nuclear Information System (INIS)

    Jeanguillaume, C.; Begot, S.; Quartuccio, M.; Douiri, A.; Ballongue, P.

    2000-01-01

    Whole-body counting is the method of choice for in vivo detection of contamination. To extend this well established method, the possible advantages of imaging radiocontaminants are examined. The use of the CACAO project is then studied. A comparison of simulated reconstructed images obtained by the CACAO project and by a conventional gamma camera used in nuclear medicine follows. Imaging a radionuclide contaminant with a geometrical sensitivity of 10⁻² seems possible in the near future. (author)

  5. Projectables

    DEFF Research Database (Denmark)

    Rasmussen, Troels A.; Merritt, Timothy R.

    2017-01-01

    CNC cutting machines have become essential tools for designers and architects enabling rapid prototyping, model-building and production of high quality components. Designers often cut from new materials, discarding the irregularly shaped remains. We introduce ProjecTables, a visual augmented reality system for interactive packing of model parts onto sheet materials. ProjecTables enables designers to (re)use scrap materials for CNC cutting that would have been previously thrown away, at the same time supporting aesthetic choices related to wood grain, avoiding surface blemishes, and other relevant material properties. We conducted evaluations of ProjecTables with design students from Aarhus School of Architecture, demonstrating that participants could quickly and easily place and orient model parts reducing material waste. Contextual interviews and ideation sessions led to a deeper...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview: In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations: Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction: The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder-to-simulate events. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme: The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations: Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview: During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  9. Morphological evolution and internal strain mapping of pomelo peel using X-ray computed tomography and digital volume correlation

    KAUST Repository

    Wang, B.; Pan, B.; Lubineau, Gilles

    2017-01-01

    , and the evolution of both bundles bending and large strain domain from endocarp to mesocarp are explored. Based on the experimental results, the microstructure-related mechanical properties of pomelo peels in response to compressive loading that demonstrates nearly

  10. [Efficiency of computer-based documentation in long-term care--preliminary project].

    Science.gov (United States)

    Lüngen, Markus; Gerber, Andreas; Rupprecht, Christoph; Lauterbach, Karl W

    2008-06-01

    In Germany the documentation of processes in long-term care is mainly paper-based. Planning, realization and evaluation are not supported in an optimal way. In a preliminary study we evaluated the consequences of the introduction of a computer-based documentation system using handheld devices. We interviewed 16 persons before and after introducing the computer-based documentation and assessed costs for the documentation process and administration. The results show that reducing costs is likely. The job satisfaction of the personnel increased, more time could be spent for caring for the residents. We suggest further research to reach conclusive results.

  11. Digital tomosynthesis parallel imaging computational analysis with shift and add and back projection reconstruction algorithms.

    Science.gov (United States)

    Chen, Ying; Balla, Apuroop; Rayford II, Cleveland E; Zhou, Weihua; Fang, Jian; Cong, Linlin

    2010-01-01

    Digital tomosynthesis is a novel technology that has been developed for various clinical applications. Parallel imaging configuration is utilised in a few tomosynthesis imaging areas such as digital chest tomosynthesis. Recently, parallel imaging configurations for breast tomosynthesis have begun to appear as well. In this paper, we present the investigation on computational analysis of impulse response characterisation as the starting point of our research efforts to optimise the parallel imaging configurations. Results suggest that impulse response computational analysis is an effective method to compare and optimise imaging configurations.
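
    For context, the shift-and-add reconstruction named in the title can be sketched in a few lines for an idealized parallel-projection geometry; this is only an illustrative simplification, not the paper's implementation, and real systems use the actual acquisition geometry (and, for the second algorithm in the title, back projection):

      import numpy as np

      def shift_and_add(projections, angles_deg, plane_height, pixel_pitch=1.0):
          """Reconstruct one in-focus plane from 1-D parallel projections (idealized sketch).

          Structures at `plane_height` shift laterally by height * tan(angle) between views;
          shifting each projection back by that amount and averaging brings that plane into
          focus while blurring out-of-plane structures.
          """
          recon = np.zeros_like(projections[0], dtype=float)
          for proj, angle in zip(projections, angles_deg):
              shift_px = int(round(plane_height * np.tan(np.radians(angle)) / pixel_pitch))
              recon += np.roll(proj, -shift_px)
          return recon / len(projections)

      # Hypothetical data: a point-like object at height 20 px seen from 7 angles.
      angles = np.linspace(-15, 15, 7)
      projections = []
      for a in angles:
          p = np.zeros(128)
          p[64 + int(round(20 * np.tan(np.radians(a))))] = 1.0   # projected detector position
          projections.append(p)
      print(np.argmax(shift_and_add(projections, angles, plane_height=20)))   # -> 64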

  12. A projection graphic display for the computer aided analysis of bubble chamber images

    International Nuclear Information System (INIS)

    Solomos, E.

    1979-01-01

    A projection graphic display for aiding the analysis of bubble chamber photographs has been developed by the Instrumentation Group of EF Division at CERN. The display image is generated on a very high brightness cathode ray tube and projected on to the table of the scanning-measuring machines as a superposition to the image of the bubble chamber. The display can send messages to the operator and aid the measurement by indicating directly on the chamber image the tracks which are measured correctly or not. (orig.)

  13. Computers Take Flight: A History of NASA's Pioneering Digital Fly-By-Wire Project

    Science.gov (United States)

    Tomayko, James E.

    2000-01-01

    An overview of the NASA F-8 Fly-By-Wire project is presented. The project made two significant contributions to the new technology: (1) a solid design base of techniques that work and those that do not, and (2) credible evidence of good flying qualities and the ability of such a system to tolerate real faults and to continue operation without degradation. In 1972 the F-8C aircraft used in the program became the first digital fly-by-wire aircraft to operate without a mechanical backup system.

  14. Chirp subbottom profiler data collected in Pamlico Sound on cruise RVRiggs_05_23_24_2012 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  15. Chirp subbottom profiler data collected in Pamlico Sound on cruise EPamSh-2016 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  16. Chirp subbottom profiler data collected in Pamlico Sound on cruise RVRiggs_07_31_2013 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  17. Chirp subbottom profiler data collected in Pamlico Sound on cruise RVRiggs_07_30_2013 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  18. Chirp subbottom profiler data collected in Pamlico Sound on cruise SndPt_05_21_22_2012 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers...

  19. Chirp subbottom profiler data collected in Pamlico Sound on cruise RVRiggs_05_20_22_2014 of RV Riggs for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Edgetech 216 chirp data (SEG-Y format) collected for the Coastal Hydrodynamics and Natural Geologic Evolution (CHaNGE) project, OCE-1130843. Survey area covers the...

  20. From Many-to-One to One-to-Many: The Evolution of Ubiquitous Computing in Education

    Science.gov (United States)

    Chen, Wenli; Lim, Carolyn; Tan, Ashley

    2011-01-01

    Personal, Internet-connected technologies are becoming ubiquitous in the lives of students, and ubiquitous computing initiatives are already expanding in educational contexts. Historically in the field of education, the terms one-to-one (1:1) computing and ubiquitous computing have been interpreted in a number of ways and have at times been used…

  1. Computer based methods for measurement of joint space width: update of an ongoing OMERACT project

    NARCIS (Netherlands)

    Sharp, John T.; Angwin, Jane; Boers, Maarten; Duryea, Jeff; von Ingersleben, Gabriele; Hall, James R.; Kauffman, Joost A.; Landewé, Robert; Langs, Georg; Lukas, Cédric; Maillefert, Jean-Francis; Bernelot Moens, Hein J.; Peloschek, Philipp; Strand, Vibeke; van der Heijde, Désirée

    2007-01-01

    Computer-based methods of measuring joint space width (JSW) could potentially have advantages over scoring joint space narrowing, with regard to increased standardization, sensitivity, and reproducibility. In an early exercise, 4 different methods showed good agreement on measured change in JSW over

  2. Technology survey of computer software as applicable to the MIUS project

    Science.gov (United States)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  3. Grow--a computer subroutine that projects the growth of trees in the Lake States' forests.

    Science.gov (United States)

    Gary J. Brand

    1981-01-01

    A computer subroutine, Grow, has been written in 1977 Standard FORTRAN to implement a distance-independent, individual tree growth model for Lake States' forests. Grow is a small and easy-to-use version of the growth model. All the user has to do is write a calling program to read initial conditions, call Grow, and summarize the results.

  4. Computer programs for display. [magnetic tapes - project planning/NASA programs

    Science.gov (United States)

    1975-01-01

    The developments of an information storage and retrieval system are presented. Computer programs used in the system are described; the programs allow display messages to be placed on disks in an off-line environment permitting a more efficient use of memory. A time table that shows complete and scheduled developments of the system is given.

  5. A prolongation-projection algorithm for computing the finite real variety of an ideal

    NARCIS (Netherlands)

    J.B. Lasserre; M. Laurent (Monique); P. Rostalski

    2009-01-01

    We provide a real algebraic symbolic-numeric algorithm for computing the real variety $V_R(I)$ of an ideal $I$, assuming it is finite while $V_C(I)$ may not be. Our approach uses sets of linear functionals on $R[X]$, vanishing on a given set of polynomials generating $I$ and their

  6. A prolongation-projection algorithm for computing the finite real variety of an ideal

    NARCIS (Netherlands)

    J.B. Lasserre; M. Laurent (Monique); P. Rostalski

    2008-01-01

    We provide a real algebraic symbolic-numeric algorithm for computing the real variety $V_R(I)$ of an ideal $I$, assuming it is finite while $V_C(I)$ may not be. Our approach uses sets of linear functionals on $R[X]$, vanishing on a given set of polynomials generating $I$ and their

  7. Automated patient setup and gating using cone beam computed tomography projections

    DEFF Research Database (Denmark)

    Wan, Hanlin; Bertholet, Jenny; Ge, Jiajia

    2016-01-01

    In radiation therapy, fiducial markers are often implanted near tumors and used for patient positioning and respiratory gating purposes. These markers are then used to manually align the patients by matching the markers in the cone beam computed tomography (CBCT) reconstruction to those...

  8. Information Technology in project-organized electronic and computer technology engineering education

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    1999-01-01

    This paper describes the integration of IT in the education of electronic and computer technology engineers at Institute of Electronic Systems, Aalborg Uni-versity, Denmark. At the Institute Information Technology is an important tool in the aspects of the education as well as for communication...

  9. Bilingual Academic Computer and Technology Oriented Program: Project COM-TECH. Evaluation Section Report. OREA Report.

    Science.gov (United States)

    Berney, Tomi D.; Plotkin, Donna

    Project COM-TECH offered bilingual individualized instruction, using an enrichment approach, to Spanish- and Haitian Creole-speaking students with varying levels of English and native language proficiency and academic preparation. The program provided supplementary instruction in English as a Second Language (ESL); Native Language Arts (NLA); and…

  10. The World Center for Computing's Pilot Videodisc Project for French Language Instruction.

    Science.gov (United States)

    Eastmond, J. Nicholls, Jr.; Mosenthal, Richard

    1985-01-01

    Describes a pilot videodisc project for French language instruction. Unique features include (1) learner control of instruction by a mouse or touch-sensitive screen, (2) extensive cultural interaction, and (3) an elaborate lexicon of word meanings portrayed visually for selected key words. (Author/SED)

  11. Formal and informal computer mediated communication within within design teams for complex building projects

    NARCIS (Netherlands)

    Otter, den A.F.H.J.; Gray, C.; Prins, M.

    2001-01-01

    In this paper the information environment of design teams is discussed because of the use of Internet-based project websites (PWS) to improve the information exchange within design teams. Because design teams heavily depend on informal information exchange and PWS is a tool for formalising

  12. 24 CFR 990.165 - Computation of project expense level (PEL).

    Science.gov (United States)

    2010-04-01

    ... is located; location types include rural, city central metropolitan, and non-city central... ceiling for any property except for New York City Housing Authority projects, which have a $480 PUM... subsidy as provided in Attachment A of their MTW Agreements executed prior to November 18, 2005. PHAs with...

  13. Computer-Aided Systems Engineering for Flight Research Projects Using a Workgroup Database

    Science.gov (United States)

    Mizukami, Masahi

    2004-01-01

    An online systems engineering tool for flight research projects has been developed through the use of a workgroup database. Capabilities are implemented for typical flight research systems engineering needs in document library, configuration control, hazard analysis, hardware database, requirements management, action item tracking, project team information, and technical performance metrics. Repetitive tasks are automated to reduce workload and errors. Current data and documents are instantly available online and can be worked on collaboratively. Existing forms and conventional processes are used, rather than inventing or changing processes to fit the tool. An integrated tool set offers advantages by automatically cross-referencing data, minimizing redundant data entry, and reducing the number of programs that must be learned. With a simplified approach, significant improvements are attained over existing capabilities for minimal cost. By using a workgroup-level database platform, personnel most directly involved in the project can develop, modify, and maintain the system, thereby saving time and money. As a pilot project, the system has been used to support an in-house flight experiment. Options are proposed for developing and deploying this type of tool on a more extensive basis.

  14. Web-Based Dissemination System for the Trusted Computing Exemplar Project

    Science.gov (United States)

    2005-06-01

    [Only fragments of the original report survive in this record: table-of-contents entries "3. Fiasco Microkernel" (p. 6) and "4. Apache Web Server"; the sentence "The next project examined was the Fiasco Microkernel developed by the Dresden University of Technology. This dissemination..."; and truncated reference entries citing http://www.eros-os.org (1999, accessed May 2005) and [5] "The Fiasco Microkernel" (February 2004, http://os.inf.tu-dresden.de/fiasco/).]

  15. A Tire Gasification Senior Design Project That Integrates Laboratory Experiments and Computer Simulation

    Science.gov (United States)

    Weiss, Brian; Castaldi, Marco J.

    2006-01-01

    A reactor to convert waste rubber tires to useful products such as CO and H2, was investigated in a university undergraduate design project. The student worked individually with mentorship from a faculty professor who aided the student with professional critique. The student was able to research the background of the field and conceive of a novel…

  16. Exploiting Open Environmental Data using Linked Data and Cloud Computing: the MELODIES project

    Science.gov (United States)

    Blower, Jon; Gonçalves, Pedro; Caumont, Hervé; Koubarakis, Manolis; Perkins, Bethan

    2015-04-01

    The European Open Data Strategy establishes important new principles that ensure that European public sector data will be released at no cost (or marginal cost), in machine-readable, commonly-understood formats, and with liberal licences enabling wide reuse. These data encompass both scientific data about the environment (from Earth Observation and other fields) and other public sector information, including diverse topics such as demographics, health and crime. Many open geospatial datasets (e.g. land use) are already available through the INSPIRE directive and made available through infrastructures such as the Global Earth Observation System of Systems (GEOSS). The intention of the Open Data Strategy is to stimulate the growth of research and value-adding services that build upon these data streams; however, the potential value inherent in open data, and the benefits that can be gained by combining previously-disparate sources of information, are only just starting to become understood. The MELODIES project (Maximising the Exploitation of Linked Open Data In Enterprise and Science) is developing eight innovative and sustainable services, based upon Open Data, for users in research, government, industry and the general public in a broad range of societal and environmental benefit areas. MELODIES (http://melodiesproject.eu) is a European FP7 project that is coordinated by the University of Reading and has sixteen partners (including nine SMEs) from eight European countries. It started in November 2013 and will run for three years. The project is therefore in its early stages, and we will value the opportunity that this workshop affords to present our plans and interact with the wider Linked Geospatial Data community. The project is developing eight new services[1] covering a range of domains including agriculture, urban ecosystems, land use management, marine information, desertification, crisis management and hydrology. These services will combine Earth

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  18. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    Science.gov (United States)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost-effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning that deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of database entry, the approach used for project scheduling, and problems of resource allocation.
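
    The paper predates modern scripting tools, but the core of a multi-project, multi-resource check of the kind IRUP automated can be sketched briefly. The projects, periods, resources, and procurement limits below are invented for illustration and do not reflect the original system's data model.

      # Minimal sketch of a multi-project resource analysis: aggregate demand per
      # resource and period across projects, then flag shortfalls against a
      # procurement limit. All data below are invented for illustration.
      from collections import defaultdict

      # (project, period, resource, quantity required)
      demands = [
          ("Spacelab-1", "1980-Q1", "flight spares", 4),
          ("Spacelab-1", "1980-Q1", "test racks", 2),
          ("Spacelab-2", "1980-Q1", "flight spares", 3),
          ("Spacelab-2", "1980-Q2", "test racks", 1),
      ]

      # Procurement limit per resource per period (hypothetical).
      limits = {"flight spares": 5, "test racks": 2}

      # Aggregate demand across projects for each (period, resource) pair.
      totals = defaultdict(int)
      for _, period, resource, qty in demands:
          totals[(period, resource)] += qty

      # Flag any period in which aggregate demand exceeds the procurement limit.
      for (period, resource), total in sorted(totals.items()):
          limit = limits[resource]
          status = "OK" if total <= limit else f"SHORTFALL of {total - limit}"
          print(f"{period:8s} {resource:14s} demand={total} limit={limit} -> {status}")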

  19. Discrete Calderon’s projections on parallelepipeds and their application to computing exterior magnetic fields for FRC plasmas

    International Nuclear Information System (INIS)

    Kansa, E.; Shumlak, U.; Tsynkov, S.

    2013-01-01

    Confining dense plasma in a field-reversed configuration (FRC) is considered a promising approach to fusion. Numerical simulation of this process requires setting artificial boundary conditions (ABCs) for the magnetic field because, whereas the plasma itself occupies a bounded region (within the FRC coils), the field extends from this region all the way to infinity. If the plasma is modeled using single fluid magnetohydrodynamics (MHD), then the exterior magnetic field can be considered quasi-static. This field has a scalar potential governed by the Laplace equation. The quasi-static ABC for the magnetic field is obtained using the method of difference potentials, in the form of a discrete Calderon boundary equation with projection on the artificial boundary shaped as a parallelepiped. The Calderon projection itself is computed by convolution with the discrete fundamental solution on the three-dimensional Cartesian grid.
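
    As a point of reference for the quasi-static setting described above, the sketch below evaluates an exterior scalar potential by summing the free-space fundamental solution 1/(4πr) of the Laplace equation over a bounded, synthetic source distribution on a Cartesian grid. It only illustrates the continuous idea behind the exterior problem; it is not the discrete Calderon projection or the method of difference potentials used in the paper, and the source density is invented.

      # Minimal numpy sketch of the quasi-static picture: outside a bounded source
      # region, a potential governed by the Laplace equation can be evaluated by
      # summing the free-space fundamental solution 1/(4*pi*r) over the source.
      # This is NOT the discrete Calderon / difference-potentials construction.
      import numpy as np

      h = 0.1                                   # grid spacing (arbitrary units)
      coords = np.arange(-0.5, 0.5 + h / 2, h)  # small cube holding the "source"
      X, Y, Z = np.meshgrid(coords, coords, coords, indexing="ij")

      # Invented, compactly supported source density inside the cube.
      rho = np.exp(-(X**2 + Y**2 + Z**2) / 0.05)

      def potential(point, X, Y, Z, rho, h):
          """Evaluate phi(point) = sum over cells of rho / (4*pi*|point - y|) * h^3."""
          r = np.sqrt((point[0] - X)**2 + (point[1] - Y)**2 + (point[2] - Z)**2)
          return np.sum(rho / (4.0 * np.pi * r)) * h**3

      # Far from the source the potential approaches total_charge / (4*pi*R),
      # the expected monopole behaviour of the exterior solution.
      total = np.sum(rho) * h**3
      for R in (2.0, 4.0, 8.0):
          phi = potential((R, 0.0, 0.0), X, Y, Z, rho, h)
          print(f"R={R:4.1f}  phi={phi:.5f}  monopole estimate={total / (4 * np.pi * R):.5f}")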

  20. Improving limited-projection-angle fluorescence molecular tomography using a co-registered x-ray computed tomography scan.

    Science.gov (United States)

    Radrich, Karin; Ale, Angelique; Ermolayev, Vladimir; Ntziachristos, Vasilis

    2012-12-01

    We examine the improvement in imaging performance, such as axial resolution and signal localization, when employing limited-projection-angle fluorescence molecular tomography (FMT) together with x-ray computed tomography (XCT) measurements versus stand-alone FMT. For this purpose, we employed living mice bearing a spontaneous lung tumor model and imaged them with FMT and XCT under identical geometrical conditions, using fluorescent probes for cancer targeting. The XCT data were employed as structural prior information to guide the FMT reconstruction. Gold-standard images were obtained from fluorescence images of mouse cryoslices, which provide the ground truth for the fluorescence bio-distribution. Comparing stand-alone FMT images with images reconstructed from hybrid FMT and XCT data, we demonstrate marked improvements in image accuracy. This work relates to currently disseminated FMT systems that use limited-projection scans and can be employed to enhance their performance.
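
    The abstract does not spell out the reconstruction algorithm, but the general idea of using XCT-derived anatomy as a structural prior can be sketched with a one-dimensional toy problem: a smoothness penalty is applied only between neighbouring voxels that lie in the same anatomical segment, so boundaries taken from the XCT scan are respected. The forward matrix, data, and segmentation below are synthetic, and the regularized least-squares formulation is a generic stand-in, not the authors' method.

      # Toy 1-D sketch of structurally guided reconstruction: a Tikhonov penalty is
      # applied only inside segments of a hypothetical XCT segmentation, preserving
      # edges between segments. All data are synthetic.
      import numpy as np

      rng = np.random.default_rng(0)
      n_vox, n_meas = 40, 25

      # Synthetic smooth forward operator and a piecewise-constant "fluorescence" map.
      A = np.exp(-0.5 * ((np.arange(n_meas)[:, None]
                          - np.linspace(0, n_meas, n_vox)[None, :]) / 3.0) ** 2)
      x_true = np.zeros(n_vox)
      x_true[10:20] = 1.0                       # fluorescent inclusion
      segments = np.zeros(n_vox, dtype=int)     # hypothetical XCT segmentation
      segments[10:20] = 1
      b = A @ x_true + 0.01 * rng.standard_normal(n_meas)

      # Difference (smoothness) operator acting only within XCT segments.
      rows = [np.eye(n_vox)[i] - np.eye(n_vox)[i + 1]
              for i in range(n_vox - 1) if segments[i] == segments[i + 1]]
      L = np.array(rows)

      lam = 0.1
      # Solve the stacked least-squares problem [A; sqrt(lam)*L] x ~ [b; 0].
      K = np.vstack([A, np.sqrt(lam) * L])
      rhs = np.concatenate([b, np.zeros(L.shape[0])])
      x_rec = np.linalg.lstsq(K, rhs, rcond=None)[0]

      print("relative reconstruction error:",
            np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))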