WorldWideScience

Sample records for computing hpcwpl final

  1. Final Report: Correctness Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)]

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin-Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.
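
    As a rough illustration of the kind of monitoring such tools perform, the sketch below applies a textbook lockset (Eraser-style) check to a synthetic access trace. It is not the Rice tool, and all names in it are illustrative.

```python
# Illustrative only: a textbook lockset (Eraser-style) data-race check over a
# synthetic access trace. This is NOT the tool developed at Rice; it merely
# sketches the kind of per-location state a lightweight race monitor maintains.

from typing import Dict, List, Set, Tuple

# Each event: (thread_id, address, locks_held_at_access)
Trace = List[Tuple[str, int, Set[str]]]

def find_race_candidates(trace: Trace) -> Set[int]:
    """Return addresses whose candidate lockset becomes empty after being
    touched by more than one thread (a heuristic race warning)."""
    lockset: Dict[int, Set[str]] = {}   # address -> locks consistently held
    owners: Dict[int, Set[str]] = {}    # address -> threads that accessed it
    races: Set[int] = set()
    for thread, addr, held in trace:
        owners.setdefault(addr, set()).add(thread)
        if addr not in lockset:
            lockset[addr] = set(held)   # first access initializes C(v)
        else:
            lockset[addr] &= held       # refine: C(v) := C(v) ∩ locks_held
        if len(owners[addr]) > 1 and not lockset[addr]:
            races.add(addr)             # shared access with no common lock
    return races

if __name__ == "__main__":
    trace = [
        ("T1", 0x10, {"L"}),   # counter consistently guarded by lock L
        ("T2", 0x10, {"L"}),
        ("T1", 0x20, {"L"}),   # flag sometimes accessed without any lock
        ("T2", 0x20, set()),
    ]
    print(find_race_candidates(trace))  # -> {32}: address 0x20 is a race candidate
```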

  2. Summer 1994 Computational Science Workshop. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This report documents the work performed by the University of New Mexico Principal Investigators and Research Assistants while hosting the highly successful Summer 1994 Computational Sciences Workshop in Albuquerque on August 6-11, 1994. Included in this report is a final budget for the workshop, along with a summary of the participants' evaluation of the workshop. The workshop proceedings have been delivered under separate cover. In order to assist in the organization of future workshops, we have also included in this report detailed documentation of the pre- and post-workshop activities associated with this contract. Specifically, we have included a section that documents the advertising performed, along with the manner in which applications were handled. A complete list of the workshop participants is included in this section. Sample letters that were generated while dealing with various commercial entities and departments at the University are also included in a section dealing with workshop logistics. Finally, we have included a section in this report that deals with suggestions for future workshops.

  3. The Magellan Final Report on Cloud Computing

    Energy Technology Data Exchange (ETDEWEB)

    Coghlan, Susan; Yelick, Katherine

    2011-12-21

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, including performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects, such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  4. Computational infrastructure for law enforcement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lades, M.; Kunz, C.; Strikos, I.

    1997-02-01

    This project planned to demonstrate the leverage of enhanced computational infrastructure for law enforcement by demonstrating the face recognition capability at LLNL. The project implemented a face finder module that extended the segmentation capabilities of the existing face recognition system so that it could process different image formats and sizes, and it created a pilot network-accessible image database for the demonstration of face recognition capabilities. The project was funded at $40k (2 man-months) for a feasibility study. It investigated several essential components of a networked face recognition system which could help identify, apprehend, and convict criminals.

  5. Energy efficiency of computer power supply units - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Aebischer, B. [cepe - Centre for Energy Policy and Economics, Swiss Federal Institute of Technology Zuerich, Zuerich (Switzerland)]; Huser, H. [Encontrol GmbH, Niederrohrdorf (Switzerland)]

    2002-11-15

    This final report for the Swiss Federal Office of Energy (SFOE) takes a look at the efficiency of computer power supply units, which decreases rapidly during average computer use. The background and the purpose of the project are examined. The power supplies for personal computers are discussed and the testing arrangement used is described. Efficiency, power-factor and operating points of the units are examined. Potentials for improvement and measures to be taken are discussed. Also, action to be taken by those involved in the design and operation of such power units is proposed. Finally, recommendations for further work are made.

  6. SIAM Conference on Geometric Design and Computing. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-03-11

    The SIAM Conference on Geometric Design and Computing attracted 164 domestic and international researchers from academia, industry, and government. It provided a stimulating forum in which to learn about the latest developments, to discuss exciting new research directions, and to forge stronger ties between theory and applications.

  7. National Computational Infrastructure for Lattice Gauge Theory: Final Report

    International Nuclear Information System (INIS)

    Richard Brower; Norman Christ; Michael Creutz; Paul Mackenzie; John Negele; Claudio Rebbi; David Richards; Stephen Sharpe; Robert Sugar

    2006-01-01

    This is the final report of the Department of Energy SciDAC Grant "National Computational Infrastructure for Lattice Gauge Theory". It describes the software developed under this grant, which enables the effective use of a wide variety of supercomputers for the study of lattice quantum chromodynamics (lattice QCD). It also describes the research on and development of commodity clusters optimized for the study of QCD. Finally, it provides some highlights of research enabled by the infrastructure created under this grant, as well as a full list of the papers resulting from research that made use of this infrastructure.

  8. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy - to create a new Biological Sciences major and graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs

  9. Advanced Certification Program for Computer Graphic Specialists. Final Performance Report.

    Science.gov (United States)

    Parkland Coll., Champaign, IL.

    A pioneer program in computer graphics was implemented at Parkland College (Illinois) to meet the demand for specialized technicians to visualize data generated on high performance computers. In summer 1989, 23 students were accepted into the pilot program. Courses included C programming, calculus and analytic geometry, computer graphics, and…

  10. Computer-Based Self-Instructional Modules. Final Technical Report.

    Science.gov (United States)

    Weinstock, Harold

    Reported is a project involving seven chemists, six mathematicians, and six physicists in the production of computer-based, self-study modules for use in introductory college courses in chemistry, physics, and mathematics. These modules were designed to be used by students and instructors with little or no computer backgrounds, in institutions…

  11. Information-preserving models of physics and computation: Final report

    International Nuclear Information System (INIS)

    1986-01-01

    This research pertains to discrete dynamical systems, as embodied by cellular automata, reversible finite-difference equations, and reversible computation. The research has strengthened the cross-fertilization between physics, computer science and discrete mathematics. It has shown that methods and concepts of physics can be exported to computation. Conversely, fully discrete dynamical systems have been shown to be fruitful for representing physical phenomena usually described with differential equations - cellular automata for fluid dynamics have been the most noted example of such a representation. At the practical level, the fully discrete representation approach suggests innovative uses of computers for scientific computing. The originality of these uses lies in their non-numerical nature: they avoid the inaccuracies of floating-point arithmetic and bypass the need for numerical analysis. 38 refs
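
    As a minimal illustration of the information-preserving, non-numerical computing described above (a generic textbook construction, not taken from the report), the sketch below evolves a second-order reversible cellular automaton with exact Boolean arithmetic and then runs it backward to recover the initial state.

```python
# A minimal sketch of an information-preserving, fully discrete dynamical
# system: a second-order reversible cellular automaton. Updates use only exact
# Boolean/integer operations -- no floating point, hence no round-off -- and
# every step can be undone exactly.

def rule90(state):
    """Elementary CA rule 90 on a ring: each cell becomes the XOR of its neighbors."""
    n = len(state)
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

def step(prev, curr):
    """Second-order (Fredkin-style) update: next = rule90(curr) XOR prev."""
    return curr, [a ^ b for a, b in zip(rule90(curr), prev)]

def unstep(curr, nxt):
    """Exact inverse of step(): recover the previous configuration."""
    return [a ^ b for a, b in zip(rule90(curr), nxt)], curr

if __name__ == "__main__":
    prev = [0] * 16
    curr = [0] * 16
    curr[8] = 1
    p, c = prev, curr
    for _ in range(10):          # evolve forward ten steps
        p, c = step(p, c)
    for _ in range(10):          # run backward and recover the initial state
        p, c = unstep(p, c)
    assert (p, c) == (prev, curr)
    print("reversible evolution recovered the initial state exactly")
```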

  12. CRITICAL ISSUES IN HIGH END COMPUTING - FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Corones, James [Krell Institute]

    2013-09-23

    High-end computing (HEC) has been a driver for advances in science and engineering for the past four decades. Increasingly, HEC has become a significant element in the national security, economic vitality, and competitiveness of the United States. Advances in HEC provide results that cut across traditional disciplinary and organizational boundaries. This program provides opportunities to share information about HEC systems and computational techniques across multiple disciplines and organizations through conferences and exhibitions of HEC advances held in Washington, DC, so that mission agency staff, scientists, and industry can come together with White House, Congressional and Legislative staff in an environment conducive to the sharing of technical information, accomplishments, goals, and plans. A common thread across this series of conferences is the understanding of computational science and applied mathematics techniques across a diverse set of application areas of interest to the Nation. The specific objectives of this program are: Program Objective 1. To provide opportunities to share information about advances in high-end computing systems and computational techniques between mission critical agencies, agency laboratories, academics, and industry. Program Objective 2. To gather pertinent data and address specific topics of wide interest to mission critical agencies. Program Objective 3. To promote a continuing discussion of critical issues in high-end computing. Program Objective 4. To provide a venue where a multidisciplinary scientific audience can discuss the difficulties applying computational science techniques to specific problems and can specify future research that, if successful, will eliminate these problems.

  13. Quantum computing accelerator I/O : LDRD 52750 final report

    International Nuclear Information System (INIS)

    Schroeppel, Richard Crabtree; Modine, Normand Arthur; Ganti, Anand; Pierson, Lyndon George; Tigges, Christopher P.

    2003-01-01

    In a superposition of quantum states, a bit can be in both the states '0' and '1' at the same time. This feature of the quantum bit or qubit has no parallel in classical systems. Currently, quantum computers consisting of 4 to 7 qubits in a 'quantum computing register' have been built. Innovative algorithms suited to quantum computing are now beginning to emerge, applicable to sorting, cryptanalysis, and other applications. A framework for overcoming slightly inaccurate quantum gate interactions and for causing quantum states to survive interactions with the surrounding environment is emerging, called quantum error correction. Thus there is the potential for rapid advances in this field. Although quantum information processing can be applied to secure communication links (quantum cryptography) and to crack conventional cryptosystems, the first few computing applications will likely involve a 'quantum computing accelerator' similar to a 'floating point arithmetic accelerator' interfaced to a conventional von Neumann computer architecture. This research is to develop a roadmap for applying Sandia's capabilities to the solution of some of the problems associated with maintaining quantum information, and with getting data into and out of such a 'quantum computing accelerator'. We propose to focus this work on 'quantum I/O technologies' by applying quantum optics on semiconductor nanostructures to leverage Sandia's expertise in semiconductor microelectronic/photonic fabrication techniques, as well as its expertise in information theory, processing, and algorithms. The work will be guided by understanding of practical requirements of computing and communication architectures. This effort will incorporate ongoing collaboration between 9000, 6000 and 1000 and between junior and senior personnel. Follow-on work to fabricate and evaluate appropriate experimental nano/microstructures will be proposed as a result of this work.
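
    As a minimal illustration of the superposition property described above (standard textbook formalism, not Sandia's roadmap work), the sketch below puts one simulated qubit through a Hadamard gate and samples measurement outcomes.

```python
# A minimal single-qubit state-vector sketch: a Hadamard gate places |0> in an
# equal superposition of |0> and |1>, and repeated sampling via the Born rule
# recovers the 50/50 measurement statistics.

import numpy as np

ket0 = np.array([1.0, 0.0])                     # |0>
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)      # Hadamard gate

psi = H @ ket0                                  # (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2                        # Born rule: outcome probabilities
probs /= probs.sum()                            # guard against rounding error

rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=probs)
print("amplitudes:", psi)                       # ~[0.707, 0.707]
print("empirical P(1):", samples.mean())        # ~0.5
```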

  14. Active system area networks for data intensive computations. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2002-04-01

    The goal of the Active System Area Networks (ASAN) project is to develop hardware and software technologies for the implementation of active system area networks (ASANs). The use of the term "active" refers to the ability of the network interfaces to perform application-specific as well as system level computations in addition to their traditional role of data transfer. This project adopts the view that the network infrastructure should be an active computational entity capable of supporting certain classes of computations that would otherwise be performed on the host CPUs. The result is a unique network-wide programming model where computations are dynamically placed within the host CPUs or the NIs depending upon the quality of service demands and network/CPU resource availability. The project seeks to demonstrate that such an approach is a better match for data intensive network-based applications and that the advent of low-cost powerful embedded processors and configurable hardware makes such an approach economically viable and desirable.

  15. Computer simulation of kinetic properties of plasmas. Final report

    International Nuclear Information System (INIS)

    Denavit, J.

    1982-08-01

    The research was directed toward the development and testing of new numerical methods for particle and hybrid simulation of plasmas, and their application to physical problems of current significance to Magnetic Fusion Energy. This project will terminate on August 31, 1982 and this Final Report describes: (1) the research accomplished since the last renewal on October 1, 1981; and (2) a perspective of the work done since the beginning of the project in February 1972

  16. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University]

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  17. The NIRA computer program package (photonuclear data center). Final report

    International Nuclear Information System (INIS)

    Vander Molen, H.J.; Gerstenberg, H.M.

    1976-02-01

    The Photonuclear Data Center's NIRA library of programs, executable from mass storage on the National Bureau of Standards' central computer facility, is described. Detailed instructions are given (with examples) for the use of the library to analyze, evaluate, synthesize, and produce for publication camera-ready tabular and graphical presentations of digital photonuclear reaction cross-section data. NIRA is the acronym for Nuclear Information Research Associate.

  18. WAMCUT, a computer code for fault tree evaluation. Final report

    International Nuclear Information System (INIS)

    Erdmann, R.C.

    1978-06-01

    WAMCUT is a code in the WAM family which produces the minimum cut sets (MCS) for a given fault tree. The MCS are useful as they provide a qualitative evaluation of a system, as well as a means of determining the probability distribution function for the top event of the tree. The program is very efficient and will produce all the MCS in very little computer time. 22 figures, 4 tables
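
    To make the notion of minimal cut sets concrete, the sketch below computes them for a tiny AND/OR fault tree using a generic textbook expansion-and-minimization approach; it is illustrative only and not the WAMCUT algorithm or implementation.

```python
# Generic minimal-cut-set computation for a small fault tree given as nested
# ('AND' | 'OR', children) tuples over basic-event names. Not WAMCUT itself,
# only an illustration of what such a code produces.

from itertools import product

def cut_sets(node):
    if isinstance(node, str):                       # basic event
        return [frozenset([node])]
    gate, children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                                # any child failure fails the gate
        return [cs for sets in child_sets for cs in sets]
    if gate == "AND":                               # all children must fail
        return [frozenset().union(*combo) for combo in product(*child_sets)]
    raise ValueError(f"unknown gate {gate!r}")

def minimize(sets):
    """Drop any cut set that strictly contains another (non-minimal)."""
    return [s for s in sets if not any(o < s for o in sets)]

if __name__ == "__main__":
    # TOP fails if (A and B) fail, or if C fails, or if (A and C) fail.
    tree = ("OR", [("AND", ["A", "B"]), "C", ("AND", ["A", "C"])])
    mcs = minimize(cut_sets(tree))
    print(sorted(sorted(s) for s in mcs))           # [['A', 'B'], ['C']]
```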

  19. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)]; Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)]; Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)]; Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)]

    2017-06-09

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT
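
    As a concrete, if simplified, picture of forward uncertainty propagation (one of the QUEST activities listed above), the sketch below pushes Monte Carlo samples of uncertain inputs through a toy model; the model and parameter distributions are hypothetical, not drawn from any QUEST application.

```python
# A minimal Monte Carlo forward uncertainty propagation sketch: sample the
# uncertain inputs, push each sample through the model, and summarize the
# resulting output distribution.

import numpy as np

def model(k, q):
    """Toy computational model: a response proportional to q / k."""
    return q / k

rng = np.random.default_rng(42)
n = 100_000
k = rng.lognormal(mean=0.0, sigma=0.1, size=n)   # uncertain material property
q = rng.normal(loc=10.0, scale=1.0, size=n)      # uncertain source term

y = model(k, q)                                  # push samples through the model
print(f"mean = {y.mean():.3f}, std = {y.std():.3f}")
print("95% interval:", np.percentile(y, [2.5, 97.5]))
```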

  20. National Computational Infrastructure for Lattice Gauge Theory: Final report

    International Nuclear Information System (INIS)

    Reed, Daniel A.

    2008-01-01

    In this document we describe work done under the SciDAC-1 Project National Computational Infrastructure for Lattice Gauge Theory. The objective of this project was to construct the computational infrastructure needed to study quantum chromodynamics (QCD). Nearly all high energy and nuclear physicists in the United States working on the numerical study of QCD are involved in the project, as are Brookhaven National Laboratory (BNL), Fermi National Accelerator Laboratory (FNAL), and Thomas Jefferson National Accelerator Facility (JLab). A list of the senior participants is given in Appendix A.2. The project includes the development of community software for the effective use of terascale computers, and the research and development of commodity clusters optimized for the study of QCD. The software developed as part of this effort is publicly available, and is being widely used by physicists in the United States and abroad. The prototype clusters built with SciDAC-1 funds have been used to test the software, and are available to lattice gauge theorists in the United States on a peer-reviewed basis.

  1. URSULA2 computer program. Volume 3. User's manual. Final report

    International Nuclear Information System (INIS)

    Singhal, A.K.

    1980-01-01

    This report is intended to provide documentation for the users of the URSULA2 code so that they can appreciate its important features, such as: code structure, flow chart, grid notations, coding style, usage of secondary storage, and its interconnection with the input preparation program (Reference H3201/4). Subroutines and subprograms have been divided into four functional groups. The functions of all subroutines have been explained, with particular emphasis on the control subroutine (MAIN) and the data input subroutine (BLOCK DATA). Computations for flow situations similar to the reference case can be performed simply by making alterations in BLOCK DATA. Separate guides for the preparation of input data and for the interpretation of program output have been provided. Furthermore, two appendices, one for the URSULA2 listing and the other for the glossary of FORTRAN variables, are included to make this report self-sufficient.

  2. EXPERIMENTS AND COMPUTATIONAL MODELING OF PULVERIZED-COAL IGNITION; FINAL

    International Nuclear Information System (INIS)

    Samuel Owusu-Ofori; John C. Chen

    1999-01-01

    Under typical conditions of pulverized-coal combustion, which is characterized by fine particles heated at very high rates, there is currently a lack of certainty regarding the ignition mechanism of bituminous and lower rank coals, as well as the ignition rate of reaction. Furthermore, there have been no previous studies aimed at examining these factors under various experimental conditions, such as particle size, oxygen concentration, and heating rate. Finally, there is a need to improve current mathematical models of ignition to realistically and accurately depict the particle-to-particle variations that exist within a coal sample. Such a model is needed to extract useful reaction parameters from ignition studies, and to interpret ignition data in a more meaningful way. The authors propose to examine fundamental aspects of coal ignition through (1) experiments to determine the ignition temperature of various coals by direct measurement, and (2) modeling of the ignition process to derive rate constants and to provide a more insightful interpretation of data from ignition experiments. The authors propose to use a novel laser-based ignition experiment to achieve their first objective. Laser-ignition experiments offer the distinct advantage of easy optical access to the particles because of the absence of a furnace or radiating walls, and thus permit direct observation and particle temperature measurement. The ignition temperature of different coals under various experimental conditions can therefore be easily determined by direct measurement using two-color pyrometry. The ignition rate constants, when ignition occurs heterogeneously, and the particle heating rates will both be determined from analyses based on these measurements.
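
    For readers unfamiliar with two-color pyrometry, the standard gray-body Wien-approximation relation below (a general result, not a formula taken from this report) ties the measured intensity ratio at two wavelengths to the particle temperature.

```latex
% Two-color (ratio) pyrometry under the Wien approximation with a gray-body
% assumption (\epsilon_{\lambda_1}\approx\epsilon_{\lambda_2});
% c_2 = hc/k_B \approx 1.4388\times10^{-2}\,\mathrm{m\,K}.
\[
  R \;=\; \frac{I_{\lambda_1}}{I_{\lambda_2}}
    \;\approx\; \left(\frac{\lambda_2}{\lambda_1}\right)^{5}
    \exp\!\left[\frac{c_2}{T}\left(\frac{1}{\lambda_2}-\frac{1}{\lambda_1}\right)\right]
  \quad\Longrightarrow\quad
  T \;=\; \frac{c_2\left(\dfrac{1}{\lambda_2}-\dfrac{1}{\lambda_1}\right)}
               {\ln R \;-\; 5\ln\!\left(\dfrac{\lambda_2}{\lambda_1}\right)} .
\]
```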

  3. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial-scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate the findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming the results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, a concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern

  4. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    Science.gov (United States)

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust that can help ...

  5. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    Science.gov (United States)

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  6. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    Science.gov (United States)

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  7. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  8. 75 FR 32803 - Notice of Issuance of Final Determination Concerning a GTX Mobile+ Hand Held Computer

    Science.gov (United States)

    2010-06-09

    ... shall be published in the Federal Register within 60 days of the date the final determination is issued..., involved various scenarios pertaining to the assembly of a desktop computer in the U.S. and the Netherlands... finished desktop computers depending on the model included an additional floppy drive, CD ROM disk, and...

  9. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    International Nuclear Information System (INIS)

    Walton, S.

    1987-01-01

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  10. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    … Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 [19] paper that documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project, documenting all of the project’s four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has…

  11. Low-cost addition-subtraction sequences for the final exponentiation computation in pairings

    DEFF Research Database (Denmark)

    Guzmán-Trampe, Juan E; Cruz-Cortéz, Nareli; Dominguez Perez, Luis

    2014-01-01

    In this paper, we address the problem of finding low cost addition–subtraction sequences for situations where a doubling step is significantly cheaper than a non-doubling one. One application of this setting appears in the computation of the final exponentiation step of the reduced Tate pairing d...
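
    To illustrate the cost model behind such sequences, the sketch below uses the standard signed-digit (NAF) recoding, which trades generic multiplications for cheap doublings plus occasional subtractions; it is a textbook illustration of the setting, not the optimized sequences constructed in the paper.

```python
# Cost-model sketch: when a doubling (squaring) is much cheaper than a generic
# multiplication, a signed-digit (NAF) recoding of the exponent reduces the
# number of non-doubling operations relative to plain binary exponentiation.

def naf(e):
    """Non-adjacent form of e as a list of digits in {-1, 0, 1}, LSB first."""
    digits = []
    while e > 0:
        if e & 1:
            d = 2 - (e % 4)          # +1 or -1, chosen so the next bit is 0
            e -= d
        else:
            d = 0
        digits.append(d)
        e //= 2
    return digits

def op_counts(e):
    """(#doublings, #non-doublings) for plain binary vs. NAF exponentiation."""
    bits = bin(e)[2:]
    binary = (len(bits) - 1, bits.count("1") - 1)
    nf = naf(e)
    return binary, (len(nf) - 1, sum(1 for d in nf if d) - 1)

if __name__ == "__main__":
    e = 0b1011111011111101111  # an exponent with long runs of ones
    (db, ab), (dn, an) = op_counts(e)
    print(f"binary: {db} doublings + {ab} additions")
    print(f"NAF   : {dn} doublings + {an} additions/subtractions")
```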

  12. Biomedical Computing Technology Information Center (BCTIC): Final progress report, March 1, 1986-September 30, 1986

    International Nuclear Information System (INIS)

    1987-01-01

    During this time, BCTIC packaged and disseminated computing technology and honored all requests made before September 1, 1986. The final month of operation was devoted to completing code requests, returning submitted codes, and sending out notices of BCTIC's termination of services on September 30th. Final BCTIC library listings were distributed to members of the active mailing list. Also included in the library listing are names and addresses of program authors and contributors in order that users may have continued support of their programs. The BCTIC library list is attached

  13. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G′ for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G′, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient
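
    The split into mainly sequential and mainly parallel kernels determines how much of a per-kernel GPU speedup survives at the whole-application level; Amdahl's law makes that explicit (a general relation with illustrative numbers, not measured data from this project).

```latex
% Amdahl's law for a code decomposed into a parallelizable fraction p and a
% sequential fraction 1 - p, with the parallel kernels accelerated by a
% factor s (e.g., a GPU offload); the numbers shown are generic.
\[
  S_{\text{overall}} \;=\; \frac{1}{(1 - p) + \dfrac{p}{s}},
  \qquad\text{e.g.}\quad p = 0.95,\; s = 30
  \;\Longrightarrow\; S_{\text{overall}} \approx 12.2 .
\]
```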

  14. Hyperacute stroke patients and catheter thrombolysis therapy. Correlation between computed tomography perfusion maps and final infarction

    International Nuclear Information System (INIS)

    Naito, Yukari; Tanaka, Shigeko; Inoue, Yuichi; Ota, Shinsuke; Sakaki, Saburo; Kitagaki, Hajime

    2008-01-01

    We investigated the correlation between abnormal perfusion areas by computed tomography perfusion (CTP) study of hyperacute stroke patients and the final infarction areas after intraarterial catheter thrombolysis. CTP study using the box-modulation transfer function (box-MTF) method based on the deconvolution analysis method was performed in 22 hyperacute stroke patients. Ischemic lesions were immediately treated with catheter thrombolysis after CTP study. Among them, nine patients with middle cerebral artery (MCA) occlusion were investigated regarding correlations of the size of the prolonged mean transit time (MTT) area, the decreased cerebral blood volume (CBV) area, and the final infarction area. Using the box-MTF method, the prolonged MTT area was almost identical to the final infarction area in the case of catheter thrombolysis failure. The decreased CBV areas resulted in infarction or hemorrhage, irrespective of the outcome of recanalization after catheter thrombolysis. The prolonged MTT areas, detected by the box-MTF method of CTP in hyperacute stroke patients, included the area of true prolonged MTT and the tracer delay. The prolonged MTT area was almost identical to the final infarction area when recanalization failed. We believe that a tracer delay area also indicates infarction in cases of thrombolysis failure. (author)
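
    For context, deconvolution-based CT perfusion rests on the standard indicator-dilution relations below (general definitions, not specific to the box-MTF implementation studied here): the tissue enhancement curve is the arterial input function convolved with a scaled residue function, and MTT follows from the central volume principle.

```latex
% Standard relations underlying deconvolution-based CT perfusion, with
% C_AIF the arterial input function and R(t) the (dimensionless) residue function.
\[
  C_{\text{tissue}}(t) \;=\; \mathrm{CBF}\int_{0}^{t} C_{\text{AIF}}(\tau)\,R(t-\tau)\,d\tau,
  \qquad
  \mathrm{MTT} \;=\; \frac{\mathrm{CBV}}{\mathrm{CBF}} .
\]
```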

  15. The establishment of computer codes for radiological assessment on LLW final disposal in Taiwan

    International Nuclear Information System (INIS)

    Yang, C.C.; Chen, H.T.; Shih, C.L.; Yeh, C.S.; Tsai, C.M.

    1988-01-01

    For final shallow land disposal of Low Level Waste (LLW) in Taiwan, an effort was initiated to establish evaluation codes to meet the needs of environmental impact analysis. The objective of the computer codes is to set up generic radiological standards for future evaluation against the 10 CFR Part 61 Licensing Requirements for Land Disposal of Radioactive Wastes. In determining long-term influences resulting from radiological impacts of LLW at disposal sites, there are at least three quantifiable impact measures selected for calculation: dose to members of the public (individual and population), occupational exposures, and costs. The computer codes are INTRUDE, INVERSI and INVERSW from NUREG-0782, and OPTIONR and GRWATRR from NUREG-0945. They are installed on both the FACOM-M200 and IBM PC/AT systems of the Institute of Nuclear Energy Research (INER). The systematic analysis of the computer codes depends not only on the data bases supported by NUREG/CR-1759 - Data Base for Radioactive Waste Management, Volume 3, Impact Analysis Methodology Report - but also on the information collected from the different exposure scenarios and pathways. A sensitivity study is also performed to assure long-term stability and security, as needed for determining the performance objectives.

  16. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  17. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  18. Reliability analysis and computation of computer-based safety instrumentation and control used in German nuclear power plant. Final report

    International Nuclear Information System (INIS)

    Ding, Yongjian; Krause, Ulrich; Gu, Chunlei

    2014-01-01

    The trend of technological advancement in the field of safety instrumentation and control (I and C) leads to increasingly frequent use of computer-based (digital) control systems, which consist of distributed computers connected by bus communications and whose functionality is freely programmable by qualified software. The advantages of the new I and C systems over the old hard-wired I and C technology lie, for example, in higher flexibility, cost-effective procurement of spare parts, and higher hardware reliability (through higher integration density, intelligent self-monitoring mechanisms, etc.). On the other hand, skeptics see in the new computer-based I and C technology a higher potential for common cause failures (CCF) and easier manipulation by sabotage (IT security). In this joint research project funded by the Federal Ministry for Economic Affairs and Energy (BMWi) (2011-2014, FJZ 1501405), the Otto-von-Guericke-University Magdeburg and Magdeburg-Stendal University of Applied Sciences are therefore developing suitable methods for demonstrating the reliability of the new instrumentation and control systems, with a focus on the investigation of CCF. The expertise of both institutions shall be extended to this area, providing a scientific contribution to sound reliability judgments of digital safety I and C in domestic and foreign nuclear power plants. First, the state of science and technology will be worked out through the study of national and international standards in the field of functional safety of electrical and I and C systems and the accompanying literature. On the basis of the existing nuclear standards, the deterministic requirements on the structure of the new digital I and C system will be determined. The possible methods of reliability modeling will be analyzed and compared. A suitable method called multi class binomial failure rate (MCFBR), which was successfully used in safety valve applications, will be

  19. Technologies and tools for high-performance distributed computing. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Karonis, Nicholas T.

    2000-05-01

    In this project we studied the practical use of the MPI message-passing interface in advanced distributed computing environments. We built on the existing software infrastructure provided by the Globus Toolkit, the MPICH portable implementation of MPI, and the MPICH-G integration of MPICH with Globus. As a result of this project we have replaced MPICH-G with its successor MPICH-G2, which is also an integration of MPICH with Globus. MPICH-G2 delivers significant improvements in message passing performance when compared to its predecessor MPICH-G and was based on superior software design principles, resulting in a software base in which it was much easier to make the functional extensions and improvements we did. Using Globus services we replaced the default implementation of MPI's collective operations in MPICH-G2 with more efficient multilevel topology-aware collective operations which, in turn, led to the development of a new timing methodology for broadcasts [8]. MPICH-G2 was extended to include client/server functionality from the MPI-2 standard [23] to facilitate remote visualization applications and, through the use of MPI idioms, MPICH-G2 provided application-level control of quality-of-service parameters as well as application-level discovery of underlying Grid-topology information. Finally, MPICH-G2 was successfully used in a number of applications including an award-winning record-setting computation in numerical relativity. In the sections that follow we describe in detail the accomplishments of this project, we present experimental results quantifying the performance improvements, and conclude with a discussion of our applications experiences. This project resulted in a significant increase in the utility of MPICH-G2.
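
    For readers unfamiliar with MPI collectives, the sketch below shows a generic broadcast using mpi4py (assumed installed); it only illustrates the kind of collective operation that MPICH-G2 replaces with topology-aware implementations and is not MPICH-G2-specific code.

```python
# Generic MPI broadcast via mpi4py (assumed installed). Run with, e.g.:
#   mpiexec -n 4 python bcast_demo.py
# A topology-aware implementation would route this same collective along the
# network hierarchy; the application-level call is unchanged.

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

data = {"params": [1.0, 2.0, 3.0]} if rank == 0 else None
data = comm.bcast(data, root=0)          # every rank receives root's object

print(f"rank {rank} received {data}")
```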

  20. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A. [and others]

    1995-12-31

    In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western Codes", VNIIEF performed a neutronics computation series to compare western and VNIIEF codes and assess whether VNIIEF codes are suitable for RBMK type reactor safety assessment computation. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS, EKRAN codes (improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo). Cell, polycell, burnup computation; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA). The computations were performed in the geometry and using the neutron constants presented by the American party; (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both being developed in collaboration with NIKIET. Formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of control and protection system controls (CPS) movement in a core.

  1. Calculation of free-energy differences from computer simulations of initial and final states

    International Nuclear Information System (INIS)

    Hummer, G.; Szabo, A.

    1996-01-01

    A class of simple expressions of increasing accuracy for the free-energy difference between two states is derived based on numerical thermodynamic integration. The implementation of these formulas requires simulations of the initial and final (and possibly a few intermediate) states. They involve higher free-energy derivatives at these states which are related to the moments of the probability distribution of the perturbation. Given a specified number of such derivatives, these integration formulas are optimal in the sense that they are exact to the highest possible order of free-energy perturbation theory. The utility of this approach is illustrated for the hydration free energy of water. This problem provides a quite stringent test because the free energy is a highly nonlinear function of the charge so that even fourth order perturbation theory gives a very poor estimate of the free-energy change. Our results should prove most useful for complex, computationally demanding problems where free-energy differences arise primarily from changes in the electrostatic interactions (e.g., electron transfer, charging of ions, protonation of amino acids in proteins). copyright 1996 American Institute of Physics
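
    For orientation, the standard relations this abstract builds on are thermodynamic integration over a coupling parameter and the cumulant (moment) expansion of the perturbation formula; the paper's optimized endpoint formulas themselves are not reproduced here.

```latex
% Thermodynamic integration over a coupling parameter \lambda, and the
% free-energy perturbation identity with its cumulant expansion, which ties
% free-energy derivatives to moments of the perturbation \Delta U
% (\beta = 1/k_B T).
\[
  \Delta F \;=\; \int_{0}^{1}
     \left\langle \frac{\partial U(\lambda)}{\partial \lambda} \right\rangle_{\lambda} d\lambda,
  \qquad
  \Delta F \;=\; -\beta^{-1}\ln\left\langle e^{-\beta\,\Delta U}\right\rangle_{0}
  \;=\; \langle \Delta U\rangle_{0}
        \;-\; \frac{\beta}{2}\left(\langle \Delta U^{2}\rangle_{0}
              - \langle \Delta U\rangle_{0}^{2}\right)
        \;+\;\cdots
\]
```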

  2. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Maynard, Robert [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-27

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators and combining their respective features into a new visualization toolkit called VTK-m.

  3. Application of Computer Graphics to Graphing in Algebra and Trigonometry. Final Report.

    Science.gov (United States)

    Morris, J. Richard

    This project was designed to improve the graphing competency of students in elementary algebra, intermediate algebra, and trigonometry courses at Virginia Commonwealth University. Computer graphics programs were designed using an Apple II Plus computer and implemented using Pascal. The software package is interactive and gives students control…

  4. 2016 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Runnels, Scott Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bachrach, Harrison Ian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Carlson, Nils [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Collier, Angela [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dumas, William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fankell, Douglas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ferris, Natalie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gonzalez, Francisco [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Griffith, Alec [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Guston, Brandon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kenyon, Connor [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Li, Benson [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mookerjee, Adaleena [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parkinson, Christian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Peck, Hailee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Peters, Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Poondla, Yasvanth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rogers, Brandon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shaffer, Nathaniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Trettel, Andrew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Valaitis, Sonata Mae [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Venzke, Joel Aaron [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Black, Mason [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Demircan, Samet [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Holladay, Robert Tyler [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-22

    The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) To educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) To entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions rely heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it.

  5. 2015 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Runnels, Scott Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Caldwell, Wendy [Arizona State Univ., Mesa, AZ (United States); Brown, Barton Jed [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pederson, Clark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Justin [Univ. of California, Santa Cruz, CA (United States); Burrill, Daniel [Univ. of Vermont, Burlington, VT (United States); Feinblum, David [Univ. of California, Irvine, CA (United States); Hyde, David [SLAC National Accelerator Lab., Menlo Park, CA (United States). Stanford Institute for Materials and Energy Science (SIMES); Levick, Nathan [Univ. of New Mexico, Albuquerque, NM (United States); Lyngaas, Isaac [Florida State Univ., Tallahassee, FL (United States); Maeng, Brad [Univ. of Michigan, Ann Arbor, MI (United States); Reed, Richard LeRoy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sarno-Smith, Lois [Univ. of Michigan, Ann Arbor, MI (United States); Shohet, Gil [Univ. of Illinois, Urbana-Champaign, IL (United States); Skarda, Jinhie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stevens, Josey [Missouri Univ. of Science and Technology, Rolla, MO (United States); Zeppetello, Lucas [Columbia Univ., New York, NY (United States); Grossman-Ponemon, Benjamin [Stanford Univ., CA (United States); Bottini, Joseph Larkin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Loudon, Tyson Shane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); VanGessel, Francis Gilbert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nagaraj, Sriram [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Price, Jacob [Univ. of Washington, Seattle, WA (United States)

    2015-10-15

    The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) To educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) To entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions rely heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it. This report includes both the background for the program and the reports from the students.

  6. Publication and Retrieval of Computational Chemical-Physical Data Via the Semantic Web. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Ostlund, Neil [Chemical Semantics, Inc., Gainesville, FL (United States)

    2017-07-20

    This research showed the feasibility of applying the concepts of the Semantic Web to computational chemistry. We have created the first web portal (www.chemsem.com) that allows data created in quantum chemistry and other such chemistry calculations to be placed on the web in a way that makes the data accessible to scientists in a semantic form never before possible. The semantic-web nature of the portal allows data to be searched, found, and used, an advance over the usual approach of a relational database. The semantic data on our portal has the nature of a Giant Global Graph (GGG) that can be easily merged with related data and searched globally via the SPARQL Protocol and RDF Query Language (SPARQL), which makes global searches for data easier than with traditional methods. Our Semantic Web portal requires that the data be understood by a computer and hence defined by an ontology (vocabulary). This ontology is used by the computer in understanding the data. We have created such an ontology for computational chemistry (purl.org/gc) that encapsulates a broad knowledge of the field of computational chemistry. We refer to this ontology as the Gainesville Core. While it is perhaps the first ontology for computational chemistry and is used by our portal, it is only the start of what must be a long, multi-partner effort to define computational chemistry. In conjunction with the above efforts we have defined a potential new file standard, the Common Standard for eXchange (CSX), for computational chemistry data. The CSX file is the precursor of data in the Resource Description Framework (RDF) form that the semantic web requires. Our portal translates CSX files (as well as other computational chemistry data files) into RDF files that become part of the graph database that the semantic web employs. We propose the CSX file as a convenient way to encapsulate computational chemistry data.
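
    As a hedged illustration of the access pattern described above, the sketch below loads a few RDF triples and runs a SPARQL query over them with the Python rdflib package. The tiny ex: vocabulary is invented for the example and is not the Gainesville Core ontology, and the snippet is a minimal sketch rather than the portal's actual pipeline.

    ```python
    # Minimal sketch of querying semantic (RDF) data with SPARQL via rdflib.
    # The ex: vocabulary is illustrative only, not the Gainesville Core ontology.
    from rdflib import Graph

    turtle_data = """
    @prefix ex: <http://example.org/chem#> .
    ex:calc1 a ex:Calculation ; ex:method "B3LYP" ; ex:totalEnergy -76.42 .
    ex:calc2 a ex:Calculation ; ex:method "MP2"   ; ex:totalEnergy -76.31 .
    """

    g = Graph()
    g.parse(data=turtle_data, format="turtle")

    query = """
    PREFIX ex: <http://example.org/chem#>
    SELECT ?calc ?method ?energy WHERE {
        ?calc a ex:Calculation ;
              ex:method ?method ;
              ex:totalEnergy ?energy .
    }
    """
    for calc, method, energy in g.query(query):
        print(calc, method, energy)
    ```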

  7. Coordinated Fault-Tolerance for High-Performance Computing Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Panda, Dhabaleswar Kumar [The Ohio State University; Beckman, Pete

    2011-07-28

    existing publish-subscribe tools. We enhanced the intrinsic fault-tolerance capabilities of representative implementations of a variety of key HPC software subsystems and integrated them with the FTB. Targeted software subsystems included MPI communication libraries, checkpoint/restart libraries, resource managers and job schedulers, and system monitoring tools. Leveraging the aforementioned infrastructure, as well as developing and utilizing additional tools, we examined issues associated with expanded, end-to-end fault response from both system and application viewpoints. From the standpoint of system operations, we investigated log and root-cause analysis, anomaly detection and fault prediction, and generalized notification mechanisms. Our applications work included libraries for fault-tolerant linear algebra, application frameworks for coupled multiphysics applications, and external frameworks to support monitoring and response for general applications. Our final goal was to engage the high-end computing community to increase awareness of tools and issues around coordinated end-to-end fault management.
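
    The coordination described above is built around a fault-notification backplane that HPC subsystems publish fault events to and subscribe to. The sketch below is a generic, in-process illustration of that publish-subscribe pattern in Python; it is not the FTB API, and the Backplane class and event names are invented for the example.

    ```python
    # Generic publish-subscribe sketch of a fault-notification backplane.
    # Illustrative only: the Backplane class and event names are invented,
    # not the actual FTB interface.
    from collections import defaultdict

    class Backplane:
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, event_type, handler):
            # Register a handler for one class of fault events.
            self._subscribers[event_type].append(handler)

        def publish(self, event_type, payload):
            # Deliver a fault event to every interested subsystem.
            for handler in self._subscribers[event_type]:
                handler(payload)

    ftb = Backplane()
    # A checkpoint/restart library reacts when a communication library reports a dead node.
    ftb.subscribe("node_failure", lambda info: print("checkpointing; lost", info["node"]))
    # A job scheduler listens as well so it can reschedule affected jobs.
    ftb.subscribe("node_failure", lambda info: print("rescheduling jobs from", info["node"]))
    ftb.publish("node_failure", {"node": "n0042"})
    ```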

  8. Final technical position on documentation of computer codes for high-level waste management

    International Nuclear Information System (INIS)

    Silling, S.A.

    1983-06-01

    Guidance is given for the content of documentation of computer codes that are used in support of a license application for high-level waste disposal. The guidelines cover the theoretical basis, programming, and instructions for use of the code.

  9. Computer Hardware, Advanced Mathematics and Model Physics pilot project final report

    International Nuclear Information System (INIS)

    1992-05-01

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) Program was launched in January 1990. A principal objective of the program has been to utilize the emerging capabilities of massively parallel scientific computers in the challenge of regional-scale predictions of decade-to-century climate change. CHAMMP has already demonstrated the feasibility of achieving a 10,000-fold increase in computational throughput for climate modeling in this decade. What we have also recognized, however, is the need for new algorithms and computer software to capitalize on the radically new computing architectures. This report describes the pilot CHAMMP projects at the DOE National Laboratories and the National Center for Atmospheric Research (NCAR). The pilot projects were selected to identify the principal challenges to CHAMMP and to entrain new scientific computing expertise. The success of some of these projects has aided in the definition of the CHAMMP scientific plan. Many of the papers in this report have been or will be submitted for publication in the open literature. Readers are urged to consult the authors directly with questions or comments about their papers.

  10. Computer simulation of transitional process to the final stable Brayton cycle in magnetic refrigeration

    International Nuclear Information System (INIS)

    Numasawa, T.; Hashimoto, T.

    1981-01-01

    The final working cycle in magnetic refrigeration depends largely on the heat transfer coefficient β in the system, the parameter γ of the heat inflow from the outer system into the cycle, and the period τ of the cycle. To make this dependence clear, the time variation of the Brayton cycle with β, γ, and τ has been investigated. The present paper shows the transitional process of this cycle and the dependence of the final cooling temperature of the heat load on β, γ, and τ. (orig.)
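
    The abstract does not give the governing relations, so the lines below are only an illustrative assumption of how such a dependence typically enters: a Newton-type heat-exchange law for the load temperature T, applied over a cycle of period τ, with β setting the coupling to the working material and γ the parasitic inflow from the surroundings. This is a sketch for orientation, not the model used in the paper.

    ```latex
    % Illustrative assumption only (not the paper's model): the load temperature T
    % relaxes at rate \beta toward the magnetic working material temperature T_m(t),
    % while a parasitic heat inflow \gamma pulls it toward the ambient temperature T_a.
    \frac{dT}{dt} = -\beta\,\bigl(T - T_m(t)\bigr) + \gamma\,\bigl(T_a - T\bigr),
    \qquad T_m(t + \tau) = T_m(t).
    ```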

  11. Fourth SIAM conference on mathematical and computational issues in the geosciences: Final program and abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    The conference focused on computational and modeling issues in the geosciences. Of the geosciences, problems associated with phenomena occurring in the earth's subsurface were best represented. Topics in this area included petroleum recovery, ground water contamination and remediation, seismic imaging, parameter estimation, upscaling, geostatistical heterogeneity, reservoir and aquifer characterization, optimal well placement and pumping strategies, and geochemistry. Additional sessions were devoted to the atmosphere, surface water, and oceans. The central mathematical themes included computational algorithms and numerical analysis, parallel computing, mathematical analysis of partial differential equations, statistical and stochastic methods, optimization, inversion, homogenization, and renormalization. The problem areas discussed at this conference are of considerable national importance, given the growing weight of environmental issues, global change, remediation of waste sites, declining domestic energy sources, and an increasing reliance on producing the most out of established oil reservoirs.

  12. Examinations in the Final Year of Transition to Mathematical Methods Computer Algebra System (CAS)

    Science.gov (United States)

    Leigh-Lancaster, David; Les, Magdalena; Evans, Michael

    2010-01-01

    2009 was the final year of parallel implementation for Mathematical Methods Units 3 and 4 and Mathematical Methods (CAS) Units 3 and 4. From 2006-2009 there was a common technology-free short answer examination that covered the same function, algebra, calculus and probability content for both studies with corresponding expectations for key…

  13. A Multidisciplinary Research Team Approach to Computer-Aided Drafting (CAD) System Selection. Final Report.

    Science.gov (United States)

    Franken, Ken; And Others

    A multidisciplinary research team was assembled to review existing computer-aided drafting (CAD) systems for the purpose of enabling staff in the Design Drafting Department at Linn Technical College (Missouri) to select the best system out of the many CAD systems in existence. During the initial stage of the evaluation project, researchers…

  14. Computer-Aided Authoring of Programmed Instruction for Teaching Symbol Recognition. Final Report.

    Science.gov (United States)

    Braby, Richard; And Others

    This description of AUTHOR, a computer program for the automated authoring of programmed texts designed to teach symbol recognition, includes discussions of the learning strategies incorporated in the design of the instructional materials, hardware description and the algorithm for the software, and current and future developments. Appendices…

  15. Final report for Conference Support Grant "From Computational Biophysics to Systems Biology - CBSB12"

    Energy Technology Data Exchange (ETDEWEB)

    Hansmann, Ulrich H.E.

    2012-07-02

    This report summarizes the outcome of the international workshop From Computational Biophysics to Systems Biology (CBSB12) which was held June 3-5, 2012, at the University of Tennessee Conference Center in Knoxville, TN, and supported by DOE through the Conference Support Grant 120174. The purpose of CBSB12 was to provide a forum for the interaction between a data-mining interested systems biology community and a simulation and first-principle oriented computational biophysics/biochemistry community. CBSB12 was the sixth in a series of workshops of the same name organized in recent years, and the second that has been held in the USA. As in previous years, it gave researchers from physics, biology, and computer science an opportunity to acquaint each other with current trends in computational biophysics and systems biology, to explore venues of cooperation, and to establish together a detailed understanding of cells at a molecular level. The conference grant of $10,000 was used to cover registration fees and provide travel fellowships to selected students and postdoctoral scientists. By educating graduate students and providing a forum for young scientists to perform research into the working of cells at a molecular level, the workshop adds to DOE's mission of paving the way to exploit the abilities of living systems to capture, store and utilize energy.

  16. The use of symbolic computation in radiative, energy, and neutron transport calculations. Final report

    International Nuclear Information System (INIS)

    Frankel, J.I.

    1997-01-01

    This investigation used symbolic manipulation in developing analytical methods and general computational strategies for solving both linear and nonlinear, regular and singular, integral and integro-differential equations that appear in radiative and mixed-mode energy transport. Contained in this report are seven papers that present the technical results as individual modules.

  17. EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.

    Science.gov (United States)

    Jarvis, John J.; And Others

    Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…

  18. Eighth SIAM conference on parallel processing for scientific computing: Final program and abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This SIAM conference is the premier forum for developments in parallel numerical algorithms, a field that has seen very lively and fruitful developments over the past decade, and whose health is still robust. Themes for this conference were: combinatorial optimization; data-parallel languages; large-scale parallel applications; message-passing; molecular modeling; parallel I/O; parallel libraries; parallel software tools; parallel compilers; particle simulations; problem-solving environments; and sparse matrix computations.

  19. Final Report on XStack: Software Synthesis for High Productivity ExaScale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Solar-Lezama, Armando [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States). Computer Science and Artificial Intelligence Lab.

    2016-07-12

    The goal of the project was to develop a programming model that would significantly improve productivity in the high-performance computing domain by bringing together three components: (a) automated equivalence checking, (b) sketch-based program synthesis, and (c) autotuning. The report provides an executive summary of the research accomplished through this project. Appended at the end of the report is a paper, published at SC 2014, that describes the key technical accomplishments of the project in more detail.

  20. Efficient Probability of Failure Calculations for QMU using Computational Geometry LDRD 13-0144 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rushdi, Ahmad A. [Univ. of Texas, Austin, TX (United States); Abdelkader, Ahmad [Univ. of Maryland, College Park, MD (United States)

    2015-09-01

    This SAND report summarizes our work on the Sandia National Laboratory LDRD project titled "Efficient Probability of Failure Calculations for QMU using Computational Geometry" (project #165617, proposal #13-0144). Those interested in the technical details are encouraged to read the full published results and to contact the report authors for the status of the software and follow-on projects.

  1. CAD-centric Computation Management System for a Virtual TBM. Final Report

    International Nuclear Information System (INIS)

    Munipalli, Ramakanth; Szema, K.Y.; Huang, P.Y.; Rowell, C.M.; Ying, A.; Abdou, M.

    2011-01-01

    HyPerComp Inc., in research collaboration with TEXCEL, has set out to build a Virtual Test Blanket Module (VTBM) computational system to address the need in contemporary fusion research for simulating the integrated behavior of the blanket, divertor, and plasma-facing components in a fusion environment. Physical phenomena to be considered in a VTBM will include fluid flow, heat transfer, mass transfer, neutronics, structural mechanics, and electromagnetics. We seek to integrate well-established (third-party) simulation software in the various disciplines mentioned above. The integrated modeling process will enable user groups to interoperate using a common modeling platform at various stages of the analysis. Since CAD is at the core of the simulation (as opposed to computational meshes, which are different for each problem), the VTBM will have a well-developed CAD interface governing CAD model editing, cleanup, parameter extraction, model deformation (based on simulation), and CAD-based data interpolation. In Phase I, we built the CAD hub of the proposed VTBM and demonstrated its use in modeling a liquid breeder blanket module with coupled MHD and structural mechanics using HIMAG and ANSYS. A complete graphical user interface of the VTBM was created, which will form the foundation of any future development. Conservative data interpolation via CAD (as opposed to mesh-based transfer) and the regeneration of CAD models based upon computed deflections are among the other highlights of Phase I activity.

  2. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    International Nuclear Information System (INIS)

    Bryant, R M; Holloway, F W; Van Arsdall, P J.

    1999-01-01

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-pronged approach comprising a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years, LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and it reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control-point hardware with a supervisory system by providing the services needed for distributed control, such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  3. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Xipeng [North Carolina State Univ., Raleigh, NC (United States)

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group achieved significant progress toward this goal, producing a set of novel techniques for improving memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.

  4. Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Aspesi, G [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bai, J [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Deese, R [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Shin, L [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-12

    Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit's primary use is for in-memory data exchange in high-performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit's capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.
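
    The core idea Conduit exposes is a hierarchical node addressed with filesystem-like paths. The short Python sketch below imitates that access pattern with a plain class built for this illustration; it is not Conduit's actual API (Conduit provides a C++ conduit::Node plus language bindings).

    ```python
    # Conceptual illustration of hierarchical, path-addressed in-memory data,
    # the access pattern Conduit's Node provides. This is NOT the Conduit API.
    class PathNode:
        def __init__(self):
            self._children = {}

        def __setitem__(self, path, value):
            parts = path.split("/")
            node = self._children
            for key in parts[:-1]:
                node = node.setdefault(key, {})
            node[parts[-1]] = value

        def __getitem__(self, path):
            node = self._children
            for key in path.split("/"):
                node = node[key]
            return node

    mesh = PathNode()
    mesh["coordsets/coords/type"] = "uniform"
    mesh["fields/density/values"] = [1.0, 0.98, 0.95]
    print(mesh["fields/density/values"])
    ```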

  5. Condition monitoring through advanced sensor and computational technology: final report (January 2002 to May 2005).

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung-Taek (Korea Atomic Energy Research Institute, Daejon, Korea); Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from the Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of the processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainty.

  6. Peer-to-peer architectures for exascale computing : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Vorobeychik, Yevgeniy; Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.; Rudish, Donald W.

    2010-09-01

    The goal of this research was to investigate the potential for employing dynamic, decentralized software architectures to achieve reliability in future high-performance computing platforms. These architectures, inspired by peer-to-peer networks such as botnets that already scale to millions of unreliable nodes, hold promise for enabling scientific applications to run usefully on next-generation exascale platforms (~10^18 operations per second). Traditional parallel programming techniques suffer rapid deterioration of performance scaling with growing platform size, as the work of coping with increasingly frequent failures dominates over useful computation. Our studies suggest that new architectures, in which failures are treated as ubiquitous and their effects are considered as simply another controllable source of error in a scientific computation, can remove such obstacles to exascale computing for certain applications. We have developed a simulation framework, as well as a preliminary implementation in a large-scale emulation environment, for exploration of these 'fault-oblivious computing' approaches. High-performance computing (HPC) faces a fundamental problem of increasing total component failure rates due to increasing system sizes, which threaten to degrade system reliability to an unusable level by the time the exascale range is reached (~10^18 operations per second, requiring of order millions of processors). As computer scientists seek a way to scale system software for next-generation exascale machines, it is worth considering peer-to-peer (P2P) architectures that are already capable of supporting 10^6-10^7 unreliable nodes. Exascale platforms will require a different way of looking at systems and software because the machine will likely not be available in its entirety for a meaningful execution time. Realistic estimates of failure rates range from a few times per day to more than once per hour for these
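
    To make the scale of the reliability problem concrete, the back-of-the-envelope sketch below computes the aggregate failure interval for a machine with on the order of a million nodes. The per-node mean time between failures used here (five years) is an assumed illustrative value, not a figure from the report.

    ```python
    # Back-of-the-envelope system failure interval for an exascale-class machine.
    # The per-node MTBF below is an assumed illustrative value.
    node_mtbf_hours = 5 * 365 * 24   # assume each node fails about once every 5 years
    num_nodes = 1_000_000            # ~10^6 nodes, the scale discussed above

    system_mtbf_hours = node_mtbf_hours / num_nodes
    failures_per_hour = 1.0 / system_mtbf_hours

    print(f"system-level mean time between failures: {system_mtbf_hours * 60:.1f} minutes")
    print(f"expected failures per hour: {failures_per_hour:.0f}")
    ```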

  7. Active and passive computed tomography mixed waste focus area final report

    International Nuclear Information System (INIS)

    Becker, G K; Camp, D C; Decman, D J; Jackson, J A; Martz, H E; Roberson, G P.

    1998-01-01

    The Mixed Waste Focus Area (MWFA) Characterization Development Strategy delineates an approach to resolve technology deficiencies associated with the characterization of mixed wastes. The intent of this strategy is to ensure the availability of technologies to support the Department of Energy's (DOE) mixed-waste, low-level or transuranic (TRU) contaminated waste characterization management needs. To this end the MWFA has defined and coordinated characterization development programs to ensure that the data and test results necessary to evaluate the utility of non-destructive assay technologies are available to meet site contact-handled waste management schedules. Requirements used as technology development project benchmarks are based on the National TRU Program Quality Assurance Program Plan. These requirements include the ability to determine total bias and total measurement uncertainty. These parameters must be completely evaluated for the waste types to be processed through a given nondestructive waste assay system, and this evaluation constitutes the foundation of activities undertaken in technology development projects. Once development and testing activities have been completed, Innovative Technology Summary Reports are generated to provide results and conclusions to support EM-30, -40, or -60 end user or customer technology selection. The active and passive computed tomography non-destructive assay system is one of the technologies selected for development by the MWFA. Lawrence Livermore National Laboratory (LLNL) has developed the active and passive computed tomography (A&PCT) nondestructive assay (NDA) technology to identify and accurately quantify all detectable radioisotopes in closed containers of waste. This technology will be applicable to all types of waste regardless of their classification: low-level, transuranic, or mixed. Mixed waste contains radioactivity and hazardous organic species. The scope of our technology is to develop a non-invasive waste-drum scanner that

  8. Advanced Scientific Computing Research Network Requirements: ASCR Network Requirements Review Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Bacon, Charles [Argonne National Lab. (ANL), Argonne, IL (United States); Bell, Greg [ESnet, Berkeley, CA (United States); Canon, Shane [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [ESnet, Berkeley, CA (United States); Dattoria, Vince [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Goodwin, Dave [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Lee, Jason [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hicks, Susan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Holohan, Ed [Argonne National Lab. (ANL), Argonne, IL (United States); Klasky, Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lauzon, Carolyn [Dept. of Energy (DOE), Washington DC (United States). Office of Science. Advanced Scientific Computing Research (ASCR); Rogers, Jim [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Skinner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tierney, Brian [ESnet, Berkeley, CA (United States)

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  9. Shutdown and degradation: Space computers for nuclear application, verification of radiation hardness. Final report

    International Nuclear Information System (INIS)

    Eichhorn, E.; Gerber, V.; Schreyer, P.

    1995-01-01

    (1) Employment of radiation-hard electronics already known from military and space applications. (2) Experience from spaceflight shall be used to investigate nuclear technology areas, for example by using space electronics to prove the range of applications in nuclear radiation environments. (3) Reproduction of a computer developed for telecommunication satellites; proof of radiation hardness by radiation tests. (4) First failure of radiation-tolerant devices with a guaranteed hardness of 100 krad(Si) occurred at 328 krad(Si). (5) Using radiation-hard devices of the same type, applications at doses greater than 1 Mrad(Si) can be expected. Electronic systems are applicable for radiation categories D, C, and the lower part of B for manipulators, vehicles, and underwater robotics. (orig.)

  10. Development of an electrical impedance computed tomographic two-phase flows analyzer. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Ovacik, L.; Jones, O.C.

    1998-08-01

    This report summarizes the work on the research project under this cooperative program between DOE and Hitachi, Ltd. Major advances were made in the computational reconstruction of images from electrical excitation and response data with respect to existing capabilities reported in the literature. A demonstration is provided of the imaging of one or more circular objects within the measurement plane, with a demonstrated linear resolution of six parts in two hundred. At this point it can be said that accurate excitation and measurement of boundary voltages and currents appear adequate to obtain reasonable images of the real conductivity distribution within a body and the outlines of insulating targets suspended within a homogeneous conducting medium. The quality of the images is heavily dependent on the theoretical and numerical implementation of the imaging algorithms. The overall imaging system described has the potential of being both fast and cost-effective in comparison with alternative methods. The methods developed use multiple plate-electrode excitation in conjunction with finite element block decomposition, preconditioned voltage conversion, layer approximation of the third dimension, and post-processing of boundary measurements to obtain optimal boundary excitations. Reasonably accurate imaging of single and multiple targets of differing size, location, and separation is demonstrated, and the resulting images are better than any others found in the literature. Recommendations for future effort include improving the computational algorithms, with emphasis on internal conductivity shape functions and on adaptive development of quadrilateral (2-D) or tetrahedral/hexahedral (3-D) elements that coincide with large discrete zone boundaries in the fields; developing a truly binary model; and completing a fast imaging system. Further, the rudimentary methods shown herein for three-dimensional imaging need improvement.

  11. Interactive Computer-Enhanced Remote Viewing System (ICERVS): Final report, November 1994--September 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface.

  12. Analyses, algorithms, and computations for models of high-temperature superconductivity. Final report

    International Nuclear Information System (INIS)

    Du, Q.

    1997-01-01

    Under the sponsorship of the Department of Energy, the authors have achieved significant progress in the modeling, analysis, and computation of superconducting phenomena. The work so far has focused on mesoscale models as typified by the celebrated Ginzburg-Landau equations; these models are intermediate between the microscopic models (that can be used to understand the basic structure of superconductors and of the atomic and sub-atomic behavior of these materials) and the macroscale, or homogenized, models (that can be of use for the design of devices). The models they have considered include a time-dependent Ginzburg-Landau model, a variable-thickness thin film model, models for high values of the Ginzburg-Landau parameter, models that account for normal inclusions and fluctuations and Josephson effects, and the anisotropic Ginzburg-Landau and Lawrence-Doniach models for layered superconductors, including those with high critical temperatures. In each case, they have developed or refined the models, derived rigorous mathematical results that enhance the state of understanding of the models and their solutions, and developed, analyzed, and implemented finite element algorithms for the approximate solution of the model equations.
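
    For orientation, one widely used nondimensional form of the time-dependent Ginzburg-Landau equation for the complex order parameter ψ, coupled to the magnetic vector potential A and electric potential φ through the Ginzburg-Landau parameter κ, is sketched below. This is a standard textbook scaling given as a reference point; sign and normalization conventions vary between papers, so it is not necessarily the exact form used in the project.

    ```latex
    % A standard nondimensional time-dependent Ginzburg-Landau equation for the
    % order parameter psi (kappa is the Ginzburg-Landau parameter); it is coupled
    % to an equation for the vector potential A driven by the supercurrent.
    \left(\frac{\partial}{\partial t} + i\kappa\phi\right)\psi
      + \left(\frac{i}{\kappa}\nabla + \mathbf{A}\right)^{2}\psi
      + \left(|\psi|^{2} - 1\right)\psi = 0 .
    ```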

  13. Analyses, algorithms, and computations for models of high-temperature superconductivity. Final technical report

    International Nuclear Information System (INIS)

    Gunzburger, M.D.; Peterson, J.S.

    1998-01-01

    Under the sponsorship of the Department of Energy, the authors have achieved significant progress in the modeling, analysis, and computation of superconducting phenomena. Their work has focused on mesoscale models as typified by the celebrated Ginzburg-Landau equations; these models are intermediate between the microscopic models (that can be used to understand the basic structure of superconductors and of the atomic and sub-atomic behavior of these materials) and the macroscale, or homogenized, models (that can be of use for the design of devices). The models the authors have considered include a time-dependent Ginzburg-Landau model, a variable-thickness thin film model, models for high values of the Ginzburg-Landau parameter, models that account for normal inclusions and fluctuations and Josephson effects, and the anisotropic Ginzburg-Landau and Lawrence-Doniach models for layered superconductors, including those with high critical temperatures. In each case, they have developed or refined the models, derived rigorous mathematical results that enhance the state of understanding of the models and their solutions, and developed, analyzed, and implemented finite element algorithms for the approximate solution of the model equations.

  14. Interactive Computer-Enhanced Remote Viewing System (ICERVS): Final report, November 1994--September 1996

    International Nuclear Information System (INIS)

    1997-01-01

    The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface

  15. Final Report: MaRSPlus Sensor System Electrical Cable Management and Distributed Motor Control Computer Interface

    Science.gov (United States)

    Reil, Robin

    2011-01-01

    The success of JPL's Next Generation Imaging Spectrometer (NGIS) in Earth remote sensing has inspired a follow-on instrument project, the MaRSPlus Sensor System (MSS). One of JPL's responsibilities in the MSS project involves updating the documentation from the previous JPL airborne imagers to provide all the information necessary for an outside customer to operate the instrument independently. As part of this documentation update, I created detailed electrical cabling diagrams to provide JPL technicians with clear and concise build instructions and a database to track the status of cables from order to build to delivery. Simultaneously, a distributed motor control system is being developed for potential use on the proposed 2018 Mars rover mission. This system would significantly reduce the mass necessary for rover motor control, making more mass space available to other important spacecraft systems. The current stage of the project consists of a desktop computer talking to a single "cold box" unit containing the electronics to drive a motor. In order to test the electronics, I developed a graphical user interface (GUI) using MATLAB to allow a user to send simple commands to the cold box and display the responses received in a user-friendly format.

  16. Blunt cerebrovascular injury screening with 64-channel multidetector computed tomography: more slices finally cut it.

    Science.gov (United States)

    Paulus, Elena M; Fabian, Timothy C; Savage, Stephanie A; Zarzaur, Ben L; Botta, Vandana; Dutton, Wesley; Croce, Martin A

    2014-02-01

    Aggressive screening to diagnose blunt cerebrovascular injury (BCVI) results in early treatment, leading to improved outcomes and reduced stroke rates. While computed tomographic angiography (CTA) has been widely adopted for BCVI screening, evidence of its diagnostic sensitivity is marginal. Previous work from our institution using 32-channel multidetector CTA in 684 patients demonstrated an inadequate sensitivity of 51% (Ann Surg. 2011;253:444-450). Digital subtraction angiography (DSA) continues to be the reference standard of diagnosis but has significant drawbacks of invasiveness and resource demands. There have been continued advances in CT technology, and this is the first report of an extensive experience with 64-channel multidetector CTA. Patients screened for BCVI using CTA and DSA (reference) at a Level 1 trauma center during the 12-month period ending in May 2012 were identified. Results of CTA and DSA, complications, and strokes were retrospectively reviewed and compared. A total of 594 patients met criteria for BCVI screening and underwent both CTA and DSA. One hundred twenty-eight patients (22% of those screened) had 163 injured vessels: 99 (61%) carotid artery injuries and 64 (39%) vertebral artery injuries. Sixty-four-channel CTA demonstrated an overall sensitivity per vessel of 68% and specificity of 92%. The 52 false-negative findings on CTA were composed of 34 carotid artery injuries and 18 vertebral artery injuries; 32 (62%) were Grade I injuries. Overall, positive predictive value was 36.2%, and negative predictive value was 97.5%. Six procedure-related complications (1%) occurred with DSA, including two iatrogenic dissections and one stroke. Sixty-four-channel CTA demonstrated a significantly improved sensitivity of 68% versus the 51% previously reported for the 32-channel CTA (p = 0.0075). Sixty-two percent of the false-negative findings occurred with low-grade injuries. Considering complications, cost, and resource demand associated with
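
    As a quick consistency check of the figures reported above, the sketch below recomputes the per-vessel sensitivity from the stated counts (163 injured vessels, 52 missed by CTA); the formula is the standard definition of sensitivity, and no numbers beyond those in the abstract are assumed.

    ```python
    # Recompute the per-vessel sensitivity of 64-channel CTA from the reported counts.
    injured_vessels = 163                                 # vessels with BCVI on DSA
    false_negatives = 52                                  # injuries missed by CTA
    true_positives = injured_vessels - false_negatives    # 111 injuries detected

    sensitivity = true_positives / injured_vessels        # TP / (TP + FN)
    print(f"sensitivity = {true_positives}/{injured_vessels} = {sensitivity:.0%}")  # ~68%
    ```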

  17. A qualitative and quantitative laser-based computer-aided flow visualization method. M.S. Thesis, 1992 Final Report

    Science.gov (United States)

    Canacci, Victor A.; Braun, M. Jack

    1994-01-01

    The experimental approach presented here offers a nonintrusive, qualitative, and quantitative evaluation of full-field flow patterns applicable in various geometries in a variety of fluids. This Full Flow Field Tracking (FFFT) Particle Image Velocimetry (PIV) technique, by means of particle tracers illuminated by a laser light sheet, offers an alternative to Laser Doppler Velocimetry (LDV) and to intrusive systems such as hot wire/film anemometry. The method makes the flow patterns obtainable and allows quantitative determination of the velocities, accelerations, and mass flows of an entire flow field. The method uses a computer-based digitizing system attached through an imaging board to a low-luminosity camera. A customized optical train allows the system to become a long-distance microscope (LDM), allowing magnifications of areas of interest ranging up to 100 times. Presented in addition to the method itself are studies in which the flow patterns and velocities were observed and evaluated in three distinct geometries, with three different working fluids. The first study involved pressure and flow analysis of a brush seal in oil. The next application involved studying the velocity and flow patterns in a cowl lip cooling passage of an air-breathing aircraft engine using water as the working fluid. Finally, the method was extended to a study in air to examine the flows in a staggered pin arrangement located on one side of a branched duct.
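
    The quantitative step in a PIV technique of this kind is estimating, for each interrogation window, the particle-pattern displacement between two frames separated by a known time interval and converting it to a velocity. The sketch below illustrates that step with a cross-correlation peak search in Python; the window contents, pixel scale, and frame interval are illustrative assumptions, not values from the thesis.

    ```python
    # Minimal PIV-style displacement estimate for one interrogation window.
    # Window contents, pixel scale, and pulse interval are illustrative assumptions.
    import numpy as np
    from scipy.signal import correlate2d

    rng = np.random.default_rng(0)
    frame_a = rng.random((32, 32))                          # particle pattern at time t
    frame_b = np.roll(frame_a, shift=(3, 1), axis=(0, 1))   # same pattern shifted 3 px down, 1 px right

    # Cross-correlate the two windows and locate the correlation peak.
    corr = correlate2d(frame_b - frame_b.mean(), frame_a - frame_a.mean(), mode="full")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift_px = np.array(peak) - (np.array(frame_a.shape) - 1)   # displacement in pixels

    pixel_size_m = 1.0e-5   # assumed: 10 micrometres per pixel
    dt_s = 1.0e-3           # assumed: 1 ms between laser pulses
    velocity_ms = shift_px * pixel_size_m / dt_s
    print("displacement (px):", shift_px, " velocity (m/s):", velocity_ms)
    ```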

  18. Advanced computational methods for the assessment of reactor core behaviour during reactivity initiated accidents. Final report; Fortschrittliche Rechenmethoden zum Kernverhalten bei Reaktivitaetsstoerfaellen. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Pautz, A.; Perin, Y.; Pasichnyk, I.; Velkov, K.; Zwermann, W.; Seubert, A.; Klein, M.; Gallner, L.; Krzycacz-Hausmann, B.

    2012-05-15

    The document at hand serves as the final report for the reactor safety research project RS1183, "Advanced Computational Methods for the Assessment of Reactor Core Behavior During Reactivity-Initiated Accidents". The work performed in the framework of this project was dedicated to the development, validation, and application of advanced computational methods for the simulation of transients and accidents at nuclear installations. These simulation tools describe in particular the behavior of the reactor core (with respect to neutronics, thermal-hydraulics, and thermal mechanics) at a very high level of detail. The overall goal of this project was the deployment of a modern nuclear computational chain which provides, besides advanced 3D tools for coupled neutronics/thermal-hydraulics full-core calculations, also appropriate tools for the generation of multi-group cross sections and Monte Carlo models for the verification of the individual calculational steps. This computational chain shall primarily be deployed for light water reactors (LWR), but should beyond that also be applicable to innovative reactor concepts. Thus, validation on computational benchmarks and critical experiments was of paramount importance. Finally, appropriate methods for uncertainty and sensitivity analysis were to be integrated into the computational framework, in order to assess and quantify the uncertainties due to insufficient knowledge of data as well as due to methodological aspects.

  19. Design of a Computer-Controlled, Random-Access Slide Projector Interface. Final Report (April 1974 - November 1974).

    Science.gov (United States)

    Kirby, Paul J.; And Others

    The design, development, test, and evaluation of an electronic hardware device interfacing a commercially available slide projector with a plasma panel computer terminal is reported. The interface device allows an instructional computer program to select slides for viewing based upon the lesson student situation parameters of the instructional…

  20. Guide to improving the performance of a manipulator system for nuclear fuel handling through computer controls. Final report

    International Nuclear Information System (INIS)

    Evans, J.M. Jr.; Albus, J.S.; Barbera, A.J.; Rosenthal, R.; Truitt, W.B.

    1975-11-01

    The Office of Developmental Automation and Control Technology of the Institute for Computer Sciences and Technology of the National Bureau of Standards provides advisory services, standards and guidelines on interface and computer control systems, and performance specifications for the procurement and use of computer-controlled manipulators and other computer-based automation systems. These outputs help other agencies and industry apply this technology to increase productivity and improve work quality by removing men from hazardous environments. In FY 74, personnel from the Oak Ridge National Laboratory visited NBS to discuss the feasibility of using computer control techniques to improve the operation of remote-control manipulators in nuclear fuel reprocessing. Subsequent discussions led to an agreement for NBS to develop a conceptual design for such a computer control system for the PaR Model 3000 manipulator in the Thorium Uranium Recycle Facility (TURF) at ORNL. This report provides the required analysis and conceptual design. Complete computer programs are included for testing of computer interfaces and for actual robot control in both point-to-point and continuous-path modes.

  1. XOQDOQ: computer program for the meteorological evaluation of routine effluent releases at nuclear power stations. Final report

    International Nuclear Information System (INIS)

    Sagendorf, J.F.; Goll, J.T.; Sandusky, W.F.

    1982-09-01

    Provided is a user's guide for the US Nuclear Regulatory Commission's (NRC) computer program XOQDOQ, which implements Regulatory Guide 1.111. This NUREG supersedes NUREG-0324, which was published as a draft in September 1977. The program is used by the NRC meteorology staff in their independent meteorological evaluation of routine or anticipated intermittent releases at nuclear power stations. It operates in a batch input mode and has various options a user may select. Relative atmospheric dispersion and deposition factors are computed for 22 specific distances out to 50 miles from the site for each directional sector. From these results, values for 10 distance segments are computed. The user may also select other locations for which atmospheric dispersion and deposition factors are computed. Program features, including required input data and output results, are described. A program listing and test-case data input and resulting output are provided.
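
    For orientation, the sketch below evaluates the familiar sector-averaged Gaussian-plume relative concentration (chi/Q) that Regulatory Guide 1.111-type assessments are built around. It is a textbook form with illustrative input values, not XOQDOQ's actual implementation, which accounts for additional effects such as plume depletion and radioactive decay.

    ```python
    # Sector-averaged Gaussian-plume relative concentration (chi/Q), a textbook
    # form used in routine-release assessments; the input values are illustrative.
    import math

    def sector_averaged_chi_over_q(x_m, u_ms, sigma_z_m, h_m):
        """chi/Q (s/m^3) averaged over a 22.5-degree sector at downwind distance x."""
        return (2.032 / (sigma_z_m * u_ms * x_m)) * math.exp(-h_m**2 / (2.0 * sigma_z_m**2))

    # Illustrative values: 800 m downwind, 2 m/s wind, sigma_z = 30 m, 50 m release height.
    print(f"chi/Q = {sector_averaged_chi_over_q(800.0, 2.0, 30.0, 50.0):.2e} s/m^3")
    ```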

  2. Does Preinterventional Flat-Panel Computer Tomography Pooled Blood Volume Mapping Predict Final Infarct Volume After Mechanical Thrombectomy in Acute Cerebral Artery Occlusion?

    International Nuclear Information System (INIS)

    Wagner, Marlies; Kyriakou, Yiannis; Mesnil de Rochemont, Richard du; Singer, Oliver C.; Berkefeld, Joachim

    2013-01-01

    Purpose: Decreased cerebral blood volume is known to be a predictor of final infarct volume in acute cerebral artery occlusion. To evaluate the predictability of final infarct volume in patients with acute occlusion of the middle cerebral artery (MCA) or the distal internal carotid artery (ICA) and successful endovascular recanalization, pooled blood volume (PBV) was measured using flat-panel detector computed tomography (FPD CT). Materials and Methods: Twenty patients with acute unilateral occlusion of the MCA or distal ICA without demarcated infarction, as proven by CT at admission, and with successful endovascular thrombectomy (Thrombolysis in Cerebral Infarction score, TICI, 2b or 3) were included. Cerebral PBV maps were acquired from each patient immediately before endovascular thrombectomy. Twenty-four hours after recanalization, each patient underwent multislice CT to visualize the final infarct volume. The extent of the areas of decreased PBV was compared with the final infarct volume proven by follow-up CT the next day. Results: In 15 of 20 patients, areas of distinct PBV decrease corresponded to the final infarct volume. In 5 patients, areas of decreased PBV overestimated the final extension of ischemia, probably due to inappropriate timing of data acquisition and misery perfusion. Conclusion: PBV mapping using FPD CT is a promising tool to predict areas of irrecoverable brain parenchyma in acute thromboembolic stroke. Further validation is necessary before routine use in decision making for interventional thrombectomy.

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview: In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations: Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  4. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004: wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefront of research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by the research investigators, working cooperatively in their respective areas of expertise, on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  5. Final Report for 'Implimentation and Evaluation of Multigrid Linear Solvers into Extended Magnetohydrodynamic Codes for Petascale Computing'

    International Nuclear Information System (INIS)

    Vadlamani, Srinath; Kruger, Scott; Austin, Travis

    2008-01-01

    Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel-processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc of the DOE SciDAC TOPS, for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We successfully implemented the multigrid solvers on a fusion test problem that allows for real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
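
    A minimal sketch of the solver configuration described above, using petsc4py to attach a HYPRE BoomerAMG preconditioner to a Krylov solve, is shown below. It assumes a PETSc build with HYPRE enabled and uses a toy 1-D Laplacian in place of the NIMROD operators, so it is an illustration of the approach rather than the project's actual integration.

    ```python
    # Minimal petsc4py sketch: GMRES preconditioned with HYPRE BoomerAMG.
    # Assumes PETSc was built with HYPRE support; the 1-D Laplacian below
    # stands in for the ill-conditioned extended-MHD matrices discussed above.
    from petsc4py import PETSc

    n = 100
    A = PETSc.Mat().createAIJ([n, n], nnz=3)
    for i in range(n):
        A.setValue(i, i, 2.0)
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemble()

    b = A.createVecLeft()
    b.set(1.0)
    x = A.createVecRight()

    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setType("gmres")
    pc = ksp.getPC()
    pc.setType("hypre")            # multigrid preconditioning via HYPRE
    pc.setHYPREType("boomeramg")   # algebraic multigrid
    ksp.setFromOptions()
    ksp.solve(b, x)
    print("iterations:", ksp.getIterationNumber())
    ```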

  6. [Towards computer-aided catalyst design: Three effective core potential studies of C-H activation]. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    Research in the initial grant period focused on computational studies relevant to the selective activation of methane, the prime component of natural gas. Reaction coordinates for methane activation by experimental models were delineated, as well as the bonding and structure of complexes that effect this important reaction. This research, highlighted in the following sections, also provided the impetus for the further development and application of methods for modeling metal-containing catalysts. Sections of the report describe the following: methane activation by multiple-bonded transition metal complexes; computational lanthanide chemistry; and methane activation by non-imido, multiple-bonded ligands.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction: CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme: The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations: Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  8. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. Development of scan analysis techniques employing a small computer. Final report, February 1, 1963--July 31, 1976

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1976-01-01

    During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized

  11. THE PRODUCTION AND EVALUATION OF THREE COMPUTER-BASED ECONOMICS GAMES FOR THE SIXTH GRADE. FINAL REPORT.

    Science.gov (United States)

    WING, RICHARD L.; AND OTHERS

    The purpose of the experiment was to produce and evaluate 3 computer-based economics games as a method of individualizing instruction for grade 6 students. 26 experimental subjects played 2 economics games, while a control group received conventional instruction on similar material. In the Sumerian game, students seated at the typewriter terminals…

  12. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1992-93. OER Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Educational Research.

    Student Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its third year of operation. Project SUCCESS served 460 students of limited English proficiency at two high schools in Brooklyn and one high school in Manhattan (New York City).…

  13. Development of scan analysis techniques employing a small computer. Final report, February 1, 1963--July 31, 1976

    Energy Technology Data Exchange (ETDEWEB)

    Kuhl, D.E.

    1976-08-05

    During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized. (CH)

  14. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    Science.gov (United States)

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer-based resource units were developed in the areas of set theory, relations and functions, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  15. The Students Upgrading through Computer and Career Education Systems Services (Project SUCCESS). 1990-91 Final Evaluation Profile. OREA Report.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY. Office of Research, Evaluation, and Assessment.

    An evaluation was done of the New York City Public Schools' Student Upgrading through Computer and Career Education Systems Services Program (Project SUCCESS). Project SUCCESS operated at 3 high schools in Brooklyn and Manhattan (Murry Bergtraum High School, Edward R. Murrow High School, and John Dewey High School). It enrolled limited English…

  16. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1993-94. OER Report.

    Science.gov (United States)

    Greene, Judy

    Students Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its fourth year of operation. The project operated at two high schools in Brooklyn and one in Manhattan (New York). In the 1993-94 school year, the project served 393 students of…

  17. Computer-Assisted, Programmed Text, and Lecture Modes of Instruction in Three Medical Training Courses: Comparative Evaluation. Final Report.

    Science.gov (United States)

    Deignan, Gerard M.; And Others

    This report contains a comparative analysis of the differential effectiveness of computer-assisted instruction (CAI), programmed instructional text (PIT), and lecture methods of instruction in three medical courses--Medical Laboratory, Radiology, and Dental. The summative evaluation includes (1) multiple regression analyses conducted to predict…

  18. Application of personal computers to enhance operation and management of research reactors. Proceedings of a final research co-ordination meeting

    International Nuclear Information System (INIS)

    1998-02-01

    The on-line use of personal computers (PCs) can be valuable in guiding the research reactor operator in analysing both normal and abnormal situations. PCs can effectively be used for data acquisition and data processing, and for providing information to the operator. Typical areas of on-line applications of PCs in nuclear research reactors include: acquisition and display of data on process parameters; performance evaluation of major equipment and safety related components; fuel management; computation of reactor physics parameters; failed fuel detection and location; inventory of system fluids; training using computer-aided simulation; and operator advice. All these applications require the development of computer programmes and interface hardware. Recognizing this need, the IAEA initiated in 1990 a Co-ordinated Research Programme (CRP) on ''Application of Personal Computers to Enhance Operation and Management of Research Reactors''. The final meeting of the CRP was held from 30 October to 3 November 1995 in Dalat, Viet Nam. This report was written by contributors from Bangladesh, Germany, India, the Republic of Korea, Pakistan, the Philippines, Thailand and Viet Nam. The IAEA staff members responsible for the publication were K. Akhtar and V. Dimic of the Physics Section, Division of Physical and Chemical Sciences.

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  20. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    International Nuclear Information System (INIS)

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered

  1. Opportunities for Russian Nuclear Weapons Institute developing computer-aided design programs for pharmaceutical drug discovery. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-23

    The goal of this study is to determine whether physicists at the Russian Nuclear Weapons Institute can profitably service the need for computer-aided drug design (CADD) programs. The Russian physicists' primary competitive advantages are their ability to write particularly efficient code that works within limited computing power, a history of working with very large, complex modeling systems, extensive knowledge of physics and mathematics, and price competitiveness. Their primary competitive disadvantages are their lack of biology expertise, and cultural and geographic issues. The first phase of the study focused on defining the competitive landscape, primarily through interviews with and literature searches on the key providers of CADD software. The second phase focused on users of CADD technology to determine deficiencies in the current product offerings, to understand what product they most desired, and to define the potential demand for such a product.

  2. FINAL REPORT DE-FG02-04ER41317 Advanced Computation and Chaotic Dynamics for Beams and Accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Cary, John R [U. Colorado

    2014-09-08

    During the year ending in August 2013, we continued to investigate the potential of photonic crystal (PhC) materials for acceleration purposes. We worked to characterize the acceleration ability of simple PhC accelerator structures, as well as to characterize PhC materials to determine whether current fabrication techniques can meet the needs of future accelerating structures. We have also continued to design and optimize PhC accelerator structures, with the ultimate goal of finding a new kind of accelerator structure that could offer significant advantages over current RF acceleration technology. The design and optimization of these structures require high-performance computation, and we continue to work on methods to make such computation faster and more efficient.

  3. Planning Committee for a National Resource for Computation in Chemistry. Final report, October 1, 1974--June 30, 1977

    Energy Technology Data Exchange (ETDEWEB)

    Bigeleisen, Jacob; Berne, Bruce J.; Coton, F. Albert; Scheraga, Harold A.; Simmons, Howard E.; Snyder, Lawrence C.; Wiberg, Kenneth B.; Wipke, W. Todd

    1978-11-01

    The Planning Committee for a National Resource for Computation in Chemistry (NRCC) was charged with the responsibility of formulating recommendations regarding organizational structure for an NRCC including the composition, size, and responsibilities of its policy board, the relationship of such a board to the operating structure of the NRCC, to federal funding agencies, and to user groups; desirable priorities, growth rates, and levels of operations for the first several years; and facilities, access and site requirements for such a Resource. By means of site visits, questionnaires, and a workshop, the Committee sought advice from a wide range of potential users and organizations interested in chemical computation. Chemical kinetics, crystallography, macromolecular science, nonnumerical methods, physical organic chemistry, quantum chemistry, and statistical mechanics are covered.

  4. Revision of the European Ecolabel Criteria for Personal, Notebook and Tablet Computers TECHNICAL REPORT Summary of the final criteria proposals

    OpenAIRE

    DODD NICHOLAS; VIDAL ABARCA GARRIDO CANDELA; WOLF Oliver; GRAULICH Kathrin; BUNKE Dirk; GROSS Rita; LIU Ran; MANHART Andreas; PRAKASH Siddharth

    2015-01-01

    This technical report provides the background information for the revision of the EU Ecolabel criteria for Personal and Notebook Computers. The study has been carried out by the Joint Research Centre with technical support from the Oeko-Institut. The work has been developed for the European Commission's Directorate General for the Environment. The main purpose of this report is to provide a summary of the technical background and rationale for each criterion proposal. This document is compl...

  5. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.; McNamara, Laura A.; Trucano, Timothy Guy

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model

  6. FY05 LDRD Final Report A Computational Design Tool for Microdevices and Components in Pathogen Detection Systems

    Energy Technology Data Exchange (ETDEWEB)

    Trebotich, D

    2006-02-07

    We have developed new algorithms to model complex biological flows in integrated biodetection microdevice components. The proposed work is important because the design strategy for the next-generation Autonomous Pathogen Detection System at LLNL is the microfluidic-based Biobriefcase, being developed under the Chemical and Biological Countermeasures Program in the Homeland Security Organization. This miniaturization strategy introduces a new flow regime to systems where biological flow is already complex and not well understood. Also, design and fabrication of MEMS devices is time-consuming and costly due to the current trial-and-error approach. Furthermore, existing devices, in general, are not optimized. There are several MEMS CAD capabilities currently available, but their computational fluid dynamics modeling capabilities are rudimentary at best. Therefore, we proposed a collaboration to develop computational tools at LLNL which will (1) provide critical understanding of the fundamental flow physics involved in bioMEMS devices, (2) shorten the design and fabrication process, and thus reduce costs, (3) optimize current prototypes and (4) provide a prediction capability for the design of new, more advanced microfluidic systems. Computational expertise was provided by Comp-CASC and UC Davis-DAS. The simulation work was supported by key experiments for guidance and validation at UC Berkeley-BioE.

  7. A computational procedure for the dynamics of flexible beams within multibody systems. Ph.D. Thesis Final Technical Report

    Science.gov (United States)

    Downer, Janice Diane

    1990-01-01

    The dynamic analysis of three-dimensional elastic beams that experience large rotational and large deformational motions is examined. The beam motion is modeled using an inertial reference for the translational displacements and a body-fixed reference for the rotational quantities. Finite strain rod theories are then defined in conjunction with the beam kinematic description, which accounts for the effects of stretching, bending, torsion, and transverse shear deformations. A convected coordinate representation of the Cauchy stress tensor and a conjugate strain definition is introduced to model the beam deformation. To treat the beam dynamics, a two-stage modification of the central difference algorithm is presented to integrate the translational coordinates and the angular velocity vector. The angular orientation is then obtained from the application of an implicit integration algorithm to the Euler parameter/angular velocity kinematical relation. The combined developments of the objective internal force computation with the dynamic solution procedures result in the computational preservation of total energy for undamped systems. The present methodology is also extended to model the dynamics of deployment/retrieval of the flexible members. A moving spatial grid corresponding to the configuration of a deployed rigid beam is employed as a reference for the dynamic variables. A transient integration scheme which accurately accounts for the deforming spatial grid is derived from a space-time finite element discretization of a Hamiltonian variational statement. The computational results of this general deforming finite element beam formulation are compared to reported results for a planar inverse-spaghetti problem.
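
    The abstract above refers to a two-stage central difference scheme for the translational coordinates. The sketch below shows only the generic central-difference (leapfrog) idea for translational degrees of freedom, not the thesis's two-stage scheme or its Euler-parameter rotational update; the force model and parameters are hypothetical.

```python
# Generic central-difference (leapfrog) time stepping for translational DOFs.
# Schematic of the idea referenced in the abstract, not the thesis's two-stage
# scheme or its rotational (Euler-parameter) treatment.
import numpy as np

def central_difference(x0, v0, mass, force, dt, n_steps):
    """Integrate m*x'' = force(x) with a central-difference scheme."""
    x = np.array(x0, dtype=float)
    # half-step velocity starts the leapfrog
    v_half = np.array(v0, dtype=float) + 0.5 * dt * force(x) / mass
    history = [x.copy()]
    for _ in range(n_steps):
        x = x + dt * v_half                      # position update
        v_half = v_half + dt * force(x) / mass   # velocity update at the new position
        history.append(x.copy())
    return np.array(history)

# Example: a linear spring as a crude stand-in for an elastic internal force.
k = 10.0
trajectory = central_difference(x0=[1.0, 0.0, 0.0], v0=[0.0, 0.0, 0.0],
                                mass=2.0, force=lambda x: -k * x,
                                dt=1e-3, n_steps=5000)
print(trajectory[-1])
```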

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride, edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  10. Institutional Computing: Final Report Quantum Effects on Cosmology: Probing Physics Beyond the Standard Model with Big Bang Nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Paris, Mark W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-13

    The current one-year project allocation (w17 burst) supports the continuation of research performed in the two-year Institutional Computing allocation (w14 bigbangnucleosynthesis). The project has supported development and production runs resulting in several publications [1, 2, 3, 4] in peer-reviewed journals and talks. Most significantly, we have recently achieved a significant improvement in code performance. This improvement was essential to the prospect of making further progress on this heretofore unsolved multiphysics problem, which lies at the intersection of nuclear and particle theory and the kinetic theory of energy transport in a system with internal (quantum) degrees of freedom.

  11. Compiling for Application Specific Computational Acceleration in Reconfigurable Architectures Final Report CRADA No. TSB-2033-01

    Energy Technology Data Exchange (ETDEWEB)

    De Supinski, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Caliga, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-28

    The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array ("FPGA")-based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory-accessing technology represents an important step towards making reconfigurable symmetric multi-processor (SMP) architectures a cost-effective solution for large-scale scientific computing.

  12. Improvement of measurements, theoretical computations and evaluations of neutron induced helium production cross sections. Summary report on the third and final research co-ordination meeting

    International Nuclear Information System (INIS)

    Pashchenko, A.B.

    1996-09-01

    The present report contains the Summary of the Third and Final IAEA Research Co-ordination Meeting (RCM) on ''Improvement of Measurements, Theoretical Computations and Evaluations of Neutron Induced Helium Production Cross Sections'' which was hosted by the Tohoku University and held in Sendai, Japan, from 25 to 29 September 1995. This RCM was organized by the IAEA Nuclear Data Section (NDS), with the co-operation and assistance of local organizers from Tohoku University. Summarized are the proceedings and results of the meeting. The List of Participants and meeting Agenda are included. (author)

  13. Final Technical Report: Sparse Grid Scenario Generation and Interior Algorithms for Stochastic Optimization in a Parallel Computing Environment

    Energy Technology Data Exchange (ETDEWEB)

    Mehrotra, Sanjay [Northwestern Univ., Evanston, IL (United States)

    2016-09-07

    The support from this grant resulted in seven published papers and a technical report. Two papers are published in SIAM J. on Optimization [87, 88]; two papers are published in IEEE Transactions on Power Systems [77, 78]; one paper is published in Smart Grid [79]; one paper is published in Computational Optimization and Applications [44]; and one in INFORMS J. on Computing [67]. The works in [44, 67, 87, 88] were funded primarily by this DOE grant. The applied papers in [77, 78, 79] were also supported through a subcontract from the Argonne National Lab. We start by presenting our main research results on the scenario generation problem in Sections 1–2. We present our algorithmic results on interior point methods for convex optimization problems in Section 3. We describe a new 'central' cutting surface algorithm developed for solving large-scale convex programming problems (as is the case with our proposed research) with a semi-infinite number of constraints in Section 4. In Sections 5–6 we present our work on two application problems of interest to DOE.

  14. Evaluative studies in nuclear medicine research: positron computed tomography assessment. Final report, January 1, 1982-December 31, 1982

    International Nuclear Information System (INIS)

    Potchen, E.J.; Harris, G.I.; Gift, D.A.; Reinhard, D.K.; Siebert, J.E.

    1983-02-01

    Results are reported of the final phase of the study effort generally titled Evaluative Studies in Nuclear Medicine Research. The previous work is reviewed and extended to an assessment providing perspectives on medical applications of positron emission tomographic (PET) systems, their technological context, and the related economic and marketing environment. Methodologies developed and used in earlier phases of the study were continued, but specifically extended to include solicitation of opinion from commercial organizations deemed to be potential developers, manufacturers and marketers of PET systems. Several factors which influence the demand for clinical uses of PET are evaluated and discussed. The recent Federal funding of applied research with PET systems is found to be a necessary and encouraging step toward determining whether PET is a powerful research tool limited to research, or whether it also presents major clinical utility. A comprehensive, updated bibliography of current literature related to the development, applications and economic considerations of PET technology is appended.

  15. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  2. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
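
    As a toy illustration of the Monte Carlo risk-estimation idea described above, and not of the TORMIS models themselves (which use documented tornado records, wind-tunnel data and missile impact tests), the sketch below samples hypothetical missile landing points and counts impacts on a target footprint; every distribution and parameter is invented for illustration.

```python
# Toy Monte Carlo estimate of an impact probability, in the spirit of the
# methodology described above.  The sampling distributions and the target
# geometry are purely illustrative; TORMIS uses detailed physical models.
import random

def estimate_impact_probability(n_trials=100_000,
                                target_x=(0.0, 50.0),   # hypothetical footprint (m)
                                target_y=(0.0, 30.0),
                                spread=500.0):          # hypothetical landing scatter (m)
    hits = 0
    for _ in range(n_trials):
        # Sample a landing point for one simulated missile trajectory.
        x = random.gauss(0.0, spread)
        y = random.gauss(0.0, spread)
        if target_x[0] <= x <= target_x[1] and target_y[0] <= y <= target_y[1]:
            hits += 1
    return hits / n_trials

print(f"estimated impact probability: {estimate_impact_probability():.4e}")
```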

  3. Final report and documentation for the security enabled programmable switch for protection of distributed internetworked computers LDRD.

    Energy Technology Data Exchange (ETDEWEB)

    Van Randwyk, Jamie A.; Robertson, Perry J.; Durgin, Nancy Ann; Toole, Timothy J.; Kucera, Brent D.; Campbell, Philip LaRoche; Pierson, Lyndon George

    2010-02-01

    An increasing number of corporate security policies make it desirable to push security closer to the desktop. It is not practical or feasible to place security and monitoring software on all computing devices (e.g. printers, personal digital assistants, copy machines, legacy hardware). We have begun to prototype a hardware and software architecture that will enforce security policies by pushing security functions closer to the end user, whether in the office or home, without interfering with users' desktop environments. We are developing a specialized programmable Ethernet network switch to achieve this. Embodied in this device is the ability to detect and mitigate network attacks that would otherwise disable or compromise the end user's computing nodes. We call this device a 'Secure Programmable Switch' (SPS). The SPS is designed with the ability to be securely reprogrammed in real time to counter rapidly evolving threats such as fast moving worms, etc. This ability to remotely update the functionality of the SPS protection device is cryptographically protected from subversion. With this concept, the user cannot turn off or fail to update virus scanning and personal firewall filtering in the SPS device as he/she could if implemented on the end host. The SPS concept also provides protection to simple/dumb devices such as printers, scanners, legacy hardware, etc. This report also describes the development of a cryptographically protected processor and its internal architecture in which the SPS device is implemented. This processor executes code correctly even if an adversary holds the processor. The processor guarantees both the integrity and the confidentiality of the code: the adversary cannot determine the sequence of instructions, nor can the adversary change the instruction sequence in a goal-oriented way.

  4. Evaluative studies in nuclear medicine research: emission computed tomography assessment. Final report, January 1-December 31, 1981

    International Nuclear Information System (INIS)

    Potchen, E.J.; Harris, G.I.; Gift, D.A.; Reinhard, D.K.; Siebert, J.E.

    1981-12-01

    The report provides information on an assessment of the potential short and long term benefits of emission computed tomography (ECT) in biomedical research and patient care. Work during the past year has been augmented by the development and use of an opinion survey instrument to reach a wider representation of knowledgeable investigators and users of this technology. This survey instrument is reproduced in an appendix. Information derived from analysis of the opinion survey, and used in conjunction with results of independent staff studies of available sources, provides the basis for the discussions given in following sections of PET applications in the brain, of technical factors, and of economic implications. Projections of capital and operating costs on a per study basis were obtained from a computerized, pro forma accounting model and are compared with the survey cost estimates for both research and clinical modes of application. The results of a cash-flow model analysis of the relationship between projected economic benefit of PET research to disease management and the costs associated with such research are presented and discussed

  5. Computational fluid dynamics modeling of two-phase flow in a BWR fuel assembly. Final CRADA Report

    International Nuclear Information System (INIS)

    Tentner, A.

    2009-01-01

    A direct numerical simulation capability for two-phase flows with heat transfer in complex geometries can considerably reduce the hardware development cycle, facilitate optimization and reduce the costs of testing various industrial facilities, such as nuclear power plants, steam generators, steam condensers, liquid cooling systems, heat exchangers, distillers, and boilers. Specifically, the phenomena occurring in a two-phase coolant flow in a BWR (Boiling Water Reactor) fuel assembly include coolant phase changes and multiple flow regimes which directly influence the coolant interaction with the fuel assembly and, ultimately, the reactor performance. Traditionally, the best analysis tools for two-phase flow phenomena inside the BWR fuel assembly have been sub-channel codes. However, the resolution of these codes is too coarse for analyzing the detailed intra-assembly flow patterns, such as flow around a spacer element. Advanced CFD (Computational Fluid Dynamics) codes provide a potential for detailed 3D simulations of coolant flow inside a fuel assembly, including flow around a spacer element, using more fundamental physical models of flow regimes and phase interactions than sub-channel codes. Such models can extend the code applicability to a wider range of situations, which is highly important for increasing efficiency and preventing accidents.

  6. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  7. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  9. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  10. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  13. Use of X-ray computed tomography in core analysis of tight North Sea Chalk. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Mogensen, K.; Stenby, E.H.

    1997-12-01

    This EFP-95 final report summarizes work performed at the Engineering Research Center (IVC-SEP) on the use of CT scanning in core analysis of tight core material from the North Sea. In this work, CT scanning has been applied to chalk material from Danish North Sea oil reservoirs. Results indicate that CT is fast and reliable for prediction of porosity. Typical errors lie within 2-3%. Calculation of fluid saturations requires considerable care from the experimentalists. Saturating a core ought to be performed by pulling vacuum to avoid trapped air bubbles. These bubbles may be produced after subsequent water and gas flooding, thereby ruining the mass balance calculations. Results performed at Stanford University show that if these simple precautions are taken, fluid saturations calculated from CT scanning are generally very accurate. Moreover, it appears that contrast agents need not be added to either phase. Regarding three-phase measurements the results are somewhat disappointing. Previous work using the dual-energy technique has indicated that the accuracy is less than satisfactory, due to an increased noise level at the low-energy setting. More work needs to be done in the future to develop the necessary expertise. The image analysis of the residual oil saturation after a gas flood can be simplified if water is assumed to be immobile during the injection. In that case, the resulting prediction of residual oil saturation is in excellent agreement with measured values. The general conclusion is that CT scanning holds a great potential as an assisting tool in modern core analysis, despite its limitations and the numerous implementation-related problems encountered during this work. (au) EFP-95. 57 refs.
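
    For context on the porosity prediction mentioned above: a commonly used dual-scan relation estimates porosity from the CT numbers of the dry and fully saturated core and of the pure fluids. The report's own working equations are not reproduced here; the sketch below shows only this standard relation, with hypothetical CT numbers.

```python
# Standard dual-scan porosity estimate from CT numbers (Hounsfield units).
# The report's own working equations are not reproduced here; the relation and
# the sample values below are illustrative only.
def ct_porosity(ct_saturated, ct_dry, ct_fluid, ct_air=-1000.0):
    """phi = (CT_sat - CT_dry) / (CT_fluid - CT_air)."""
    return (ct_saturated - ct_dry) / (ct_fluid - ct_air)

# Hypothetical voxel-averaged values for a water-saturated chalk plug:
phi = ct_porosity(ct_saturated=2050.0, ct_dry=1700.0, ct_fluid=0.0)
print(f"estimated porosity: {phi:.3f}")   # ~0.35 for these invented numbers
```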

  14. Development of process control capability through the Browns Ferry Integrated Computer System using Reactor Water Cleanup System as an example. Final report

    International Nuclear Information System (INIS)

    Smith, J.; Mowrey, J.

    1995-12-01

    This report describes the design, development and testing of process controls for selected system operations in the Browns Ferry Nuclear Plant (BFNP) Reactor Water Cleanup System (RWCU) using a Computer Simulation Platform which simulates the RWCU System and the BFNP Integrated Computer System (ICS). This system was designed to demonstrate the feasibility of the soft control (video touch screen) of nuclear plant systems through an operator console. The BFNP Integrated Computer System, which has recently been installed at BFNP Unit 2, was simulated to allow for operator control functions of the modeled RWCU system. The BFNP Unit 2 RWCU system was simulated using the RELAP5 Thermal/Hydraulic Simulation Model, which provided the steady-state and transient RWCU process variables and simulated the response of the system to control system inputs. Descriptions of the hardware and software developed are also included in this report. The testing and acceptance program and results are also detailed in this report. A discussion of potential installation of an actual RWCU process control system in BFNP Unit 2 is included. Finally, this report contains a section on industry issues associated with installation of process control systems in nuclear power plants

  15. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 3: Comparison of observed effects with computer simulated effects on reactor cores from seismic disturbances. Proceedings of a final research co-ordination meeting

    International Nuclear Information System (INIS)

    1996-05-01

    This publication contains the final papers summarizing the validation of the codes on the basis of comparison of observed effects with computer simulated effects on reactor cores from seismic disturbances. Refs, figs tabs

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  18. Final Report for the project titled "Enabling Supernova Computations by Integrated Transport and Provisioning Methods Optimized for Dedicated Channels"

    Energy Technology Data Exchange (ETDEWEB)

    Malathi Veeraraghavan

    2007-10-31

    A high-speed optical circuit network is one that offers users rate-guaranteed connectivity between two endpoints, unlike today's IP-routed Internet in which the rate available to a pair of users fluctuates based on the volume of competing traffic. This particular research project advanced our understanding of circuit networks in two ways. First, transport protocols were developed for circuit networks. In a circuit network, since bandwidth resources are reserved for each circuit on an end-to-end basis (much like how a person reserves a seat on every leg of a multi-segment flight), and the sender is limited to send at the rate of the circuit, there is no possibility of congestion during data transfer. Therefore, no congestion control functions are necessary in a transport protocol designed for circuits. However, error control and flow control are still required, because bits can become corrupted by noise and interference even on highly reliable optical links, and receivers may, due to multitasking or other reasons, fail to drain the receive buffer fast enough to keep up with the sending rate (e.g., if the receiving host is multitasking between receiving a file transfer and some other computation). In this work, we developed two transport protocols for circuits, both of which are described below. Second, this project developed techniques for internetworking different types of connection-oriented networks, which are of two types: circuit-switched or packet-switched. In circuit-switched networks, multiplexing on links is "position based," where "position" refers to the frequency, time slot, and port (fiber), while connection-oriented packet-switched networks use packet header information to demultiplex packets and switch them from node to node. The latter are commonly referred to as virtual circuit networks. Examples of circuit networks are time-division multiplexed Synchronous Optical Network/Synchronous Digital Hierarchy (SONET/SDH) and Wavelength Division
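
    To make the transport-protocol argument above concrete: on a dedicated circuit the sender can transmit at the fixed reserved rate, so only flow control (to avoid overrunning the receiver's buffer) and error control (retransmission of corrupted blocks) are needed. The sketch below is a schematic of that idea only, not either of the two protocols developed in the project; all names and parameters are hypothetical.

```python
# Schematic fixed-rate sender for a dedicated circuit: no congestion control,
# only receiver-window flow control and NACK-driven retransmission.
# Illustration of the idea in the abstract, not the project's actual protocols.
import time
from collections import deque

def send_over_circuit(blocks, circuit_rate_bps, block_size_bits,
                      receiver_window, nacked):
    """blocks: list of block ids; nacked: callable(block_id) -> bool (error report)."""
    interval = block_size_bits / circuit_rate_bps   # fixed pacing, no congestion probing
    queue = deque(blocks)
    in_flight = deque()
    while queue or in_flight:
        # Flow control: never exceed the receiver-advertised window.
        while queue and len(in_flight) < receiver_window:
            in_flight.append(queue.popleft())
            time.sleep(interval)                    # pace at the reserved circuit rate
        # Error control: retransmit blocks the receiver reports as corrupted.
        block = in_flight.popleft()
        if nacked(block):
            queue.appendleft(block)

# Hypothetical usage: 1 Gb/s circuit, 8 kb blocks, 64-block window, no errors.
send_over_circuit(list(range(100)), 1e9, 8000, 64, nacked=lambda b: False)
```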

  19. Computing facilities available to final-year students at 3 UK dental schools in 1997/8: their use, and students' attitudes to information technology.

    Science.gov (United States)

    Grigg, P; Macfarlane, T V; Shearer, A C; Jepson, N J; Stephens, C D

    2001-08-01

    To identify the computer facilities available in 3 dental schools where 3 different approaches to the use of technology-based learning material had been adopted, and to assess dental students' perception of their own computer skills and their attitudes towards information technology. A multicentre cross-sectional questionnaire study. All 181 dental students in their final year of study (1997-8). The overall participation rate was 80%. There were no differences between schools in the students' self-assessment of their IT skills, but only one third regarded themselves as competent in basic skills, and nearly 50% of students in all 3 schools felt that insufficient IT training had been provided to enable them to follow their course without difficulty. There were significant differences between schools in most of the other areas examined, which reflect the different ways in which IT can be used to support the dental course. 1. Students value IT as an educational tool. 2. Their awareness of the relevance of a knowledge of information technology for their future careers remains generally low. 3. There is a need to provide effective instruction in IT skills for those dental students who do not acquire these during secondary education.

  20. Reliability analysis and computation of computer-based safety instrumentation and control used in German nuclear power plant. Final report; Zuverlaessigkeitsuntersuchung und -berechnung rechnerbasierter Sicherheitsleittechnik zum Einsatz in deutschen Kernkraftwerken. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yongjian [Hochschule Magdeburg-Stendal, Magdeburg (Germany). Inst. fuer Elektrotechnik; Krause, Ulrich [Magdeburg Univ. (Germany). Inst. fuer Apparate- und Umwelttechnik; Gu, Chunlei

    2014-08-21

    The trend of technological advancement in the field of safety instrumentation and control (I and C) leads to increasingly frequent use of computer-based (digital) control systems, which consist of distributed computers connected by bus communication and whose functionality is freely programmable via qualified software. The advantages of the new I and C systems over the old, hard-wired technology include higher flexibility, more cost-effective procurement of spare parts, and higher hardware reliability (through higher integration density, intelligent self-monitoring mechanisms, etc.). On the other hand, skeptics see in computer-based I and C a higher potential for common cause failures (CCF) and for easier manipulation by sabotage (IT security). In this joint research project funded by the Federal Ministry for Economic Affairs and Energy (BMWi) (2011-2014, FJZ 1501405), the Otto-von-Guericke-University Magdeburg and the Magdeburg-Stendal University of Applied Sciences are therefore seeking to develop suitable methods for demonstrating the reliability of the new instrumentation and control systems, with the focus on the investigation of CCF. The expertise of both institutions shall be extended into this area, providing a scientific contribution to sound reliability judgments on digital safety I and C in domestic and foreign nuclear power plants. First, the state of science and technology is worked out through the study of national and international standards in the field of functional safety of electrical and I and C systems and the accompanying literature. On the basis of the existing nuclear standards, the deterministic requirements on the structure of the new digital I and C system are determined. The possible methods of reliability modeling are analyzed and compared. A suitable method, called the multi-class binomial failure rate (MCFBR) method, which was successfully used in safety valve applications, will be
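    The abstract names a binomial-failure-rate style method for quantifying common cause failures. The sketch below illustrates, in rough terms, how that family of models assigns rates to CCF events of different multiplicities; the shock rate, coupling probability, and 2-out-of-4 voting logic are made-up example values, not parameters or results from the project.

```python
from math import comb, exp

# Rough illustration of a binomial-failure-rate style CCF model (the family the
# multi-class binomial failure rate method belongs to). Shocks arrive at some
# rate; given a shock, each of m redundant channels fails independently with
# probability p, so exactly k channels fail with binomial probability.

def ccf_multiplicity_rates(m, shock_rate, p):
    """Rate (per year) of shocks that fail exactly k of m redundant channels."""
    return {k: shock_rate * comb(m, k) * p**k * (1.0 - p)**(m - k)
            for k in range(m + 1)}

# Example: a 4-channel digital I&C function with 2-out-of-4 voting, shocks at
# 0.1 per year, conditional channel failure probability 0.2 -- all made-up numbers.
rates = ccf_multiplicity_rates(m=4, shock_rate=0.1, p=0.2)
for k, r in rates.items():
    print(f"shocks failing exactly {k} of 4 channels: {r:.3e} per year")

# 2-out-of-4 voting is defeated once 3 or more channels are lost; the probability
# of at least one such event during a one-year mission time is then roughly:
lam = rates[3] + rates[4]
print(f"P(CCF defeats 2-out-of-4 voting within a year) ~ {1.0 - exp(-lam):.2e}")
```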

  1. Availability Perception And Constraints Of Final Year Students To The Use Of Computer-Based Information Communication Technologies Cb-Icts.

    Directory of Open Access Journals (Sweden)

    Nto

    2015-08-01

    Full Text Available There is no doubt that ICTs are the major focus for the day-to-day running of every society. ICTs create room for quicker and easier access to, and exchange of, information in the world today. The study investigated the availability, attitude and constraints of final year students in the use of computer-based ICTs (CB-ICTs) in Abia State. Data were collected with the use of a well-structured questionnaire and analysed with descriptive statistics. The analysis revealed that the mean age of the respondents was 23 years and that 7 CB-ICTs were available to the respondents to varying degrees. The respondents had a positive attitude (x̄ = 3.11) to the use of CB-ICTs, and the major constraint to their use of CB-ICTs was the poor resource centre where they could access them. Based on the findings, we recommended that resource centres should be built in the institution and, where they already exist, should be well equipped and kept running. Equally, awareness of existing or planned resource centres should be spread widely so that students can take advantage of them and make maximum use of the facilities therein. Internet service providers should also scale up their services in the area so as to provide a more stable internet connection in the institution, enabling the students to use CB-ICTs more effectively for their academic work.

  2. Final Report Extreme Computing and U.S. Competitiveness DOE Award. DE-FG02-11ER26087/DE-SC0008764

    Energy Technology Data Exchange (ETDEWEB)

    Mustain, Christopher J. [Council on Competitiveness, Washington, DC (United States)

    2016-01-13

    The Council has acted on each of the grant deliverables during the funding period. The deliverables are: (1) convening the Council’s High Performance Computing Advisory Committee (HPCAC) on a bi-annual basis; (2) broadening public awareness of high performance computing (HPC) and exascale developments; (3) assessing the industrial applications of extreme computing; and (4) establishing a policy and business case for an exascale economy.

  3. Feasibility of Computer Processing of Technical Information on the Design of Instructional Systems. Final Report for the Period 1 July 1972 through 31 March 1973.

    Science.gov (United States)

    Scheffler, F. L.; And Others

    A feasibility study examined the capability of a computer-based system to handle technical information pertinent to the design of instructional systems. Structured interviews were held to assess the information needs of both researchers and practitioners, and an investigation was conducted of 10 computer-based information storage and retrieval…

  4. Programs for attracting under-represented minority students to graduate school and research careers in computational science. Final report for period October 1, 1995 - September 30, 1997

    Energy Technology Data Exchange (ETDEWEB)

    Turner, James C. Jr.; Mason, Thomas; Guerrieri, Bruno

    1997-10-01

    Programs have been established at Florida A&M University to attract minority students to research careers in mathematics and computational science. The primary goal of the program was to increase the number of such students studying computational science via an interactive multimedia learning environment. One mechanism used for meeting this goal was the development of educational modules. This academic-year program, established within the mathematics department at Florida A&M University, introduced students to computational science projects using high-performance computers. Additional activities were conducted during the summer; these included workshops, meetings, and lectures. Through the exposure this program provided to scientific ideas and research in computational science, students are likely to apply tools from this interdisciplinary field successfully.

  5. Stress-intensity factors for surface cracks in pipes: a computer code for evaluation by use of influence functions. Final report

    International Nuclear Information System (INIS)

    Dedhia, D.D.; Harris, D.O.

    1982-06-01

    A user-oriented computer program for the evaluation of stress intensity factors for cracks in pipes is presented. Stress intensity factors for semi-elliptical, complete circumferential, and long longitudinal cracks can be obtained using this computer program. The code is based on the method of influence functions, which makes it possible to treat arbitrary stresses on the plane of the crack. The stresses on the crack plane can be entered as a mathematical or tabulated function. A user's manual is included in this report, along with background information.
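    The influence-function method the abstract refers to amounts to integrating the crack-plane stress against a geometry-dependent weight function. The sketch below illustrates that idea with the classical centre-crack weight function rather than the pipe-specific influence functions the code uses, so all geometry and numbers are illustrative assumptions.

```python
import numpy as np

# Sketch of the influence-function (weight-function) idea: the stress intensity
# factor follows from integrating an arbitrary crack-plane stress sigma(x) against
# an influence function h(x, a):  K(a) = integral_0^a sigma(x) * h(x, a) dx.
# For illustration the classical centre-crack weight function is used; the actual
# code uses influence functions for semi-elliptical and circumferential pipe cracks.

def influence_function(x, a):
    """Weight function for a centre crack of half-length a under symmetric loading."""
    return (2.0 / np.sqrt(np.pi * a)) * a / np.sqrt(a**2 - x**2)

def stress_intensity_factor(sigma, a, n=4000):
    """Midpoint-rule quadrature; x = a*sin(t) tames the integrable singularity at x = a."""
    dt = (np.pi / 2.0) / n
    t = (np.arange(n) + 0.5) * dt
    x = a * np.sin(t)
    integrand = sigma(x) * influence_function(x, a) * a * np.cos(t)   # dx = a cos(t) dt
    return np.sum(integrand) * dt

# Check against the textbook result K = sigma0*sqrt(pi*a) for uniform crack-face tension.
sigma0, a = 100.0e6, 0.005    # Pa, m -- arbitrary demo values
K = stress_intensity_factor(lambda x: np.full_like(x, sigma0), a)
print(K / 1e6, sigma0 * np.sqrt(np.pi * a) / 1e6)   # both ~12.5 MPa*sqrt(m)
```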

  6. Three-dimensional gyrokinetic particle-in-cell simulation of plasmas on a massively parallel computer: Final report on LDRD Core Competency Project, FY 1991--FY 1993

    International Nuclear Information System (INIS)

    Byers, J.A.; Williams, T.J.; Cohen, B.I.; Dimits, A.M.

    1994-01-01

    One of the programs of the Magnetic Fusion Energy (MFE) Theory and Computations Program is studying the anomalous transport of thermal energy across the field lines in the core of a tokamak. We use the method of gyrokinetic particle-in-cell simulation in this study. For this LDRD project we employed massively parallel processing, new algorithms, and new formal techniques to improve this research. Specifically, we sought to take steps toward: researching experimentally relevant parameters in our simulations, learning parallel computing to have as a resource for our group, and achieving a 100x speedup over the performance of our starting-point Cray-2 simulation code.
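    As a reminder of what a particle-in-cell step involves (charge deposition, field solve, gather, and particle push), here is a toy one-dimensional electrostatic sketch. It only illustrates the PIC method in general, not the gyrokinetic, massively parallel code described in the report; grid size, particle count, and normalized units are arbitrary.

```python
import numpy as np

# Minimal 1-D electrostatic particle-in-cell (PIC) step, in normalized toy units.

def pic_step(x, v, q_over_m, L, n_grid, dt):
    dx = L / n_grid
    # 1) Deposit particles onto the grid (nearest-grid-point weighting).
    idx = np.floor(x / dx).astype(int) % n_grid
    rho = np.bincount(idx, minlength=n_grid).astype(float)
    rho -= rho.mean()                              # neutralizing background
    # 2) Solve Poisson's equation d^2 phi/dx^2 = -rho with an FFT.
    k = 2.0 * np.pi * np.fft.fftfreq(n_grid, d=dx)
    k[0] = 1.0                                     # avoid division by zero for the mean mode
    phi_hat = np.fft.fft(rho) / k**2
    phi_hat[0] = 0.0
    E = -np.real(np.fft.ifft(1j * k * phi_hat))    # E = -d phi/dx
    # 3) Gather the field back to the particles and push them (leapfrog style).
    v = v + q_over_m * E[idx] * dt
    x = (x + v * dt) % L
    return x, v

# Example usage with arbitrary toy parameters.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, 10000)
v = rng.normal(0.0, 1.0, 10000)
for _ in range(100):
    x, v = pic_step(x, v, q_over_m=-1.0, L=2.0 * np.pi, n_grid=64, dt=0.1)
```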

  7. Neuropsychological Assessment and Training of Cognitive Processing Strategies for Reading Recognition and Comprehension: A Computer Assisted Program for Learning Disabled Students. Final Report.

    Science.gov (United States)

    Teeter, Phyllis Anne; Smith, Philip L.

    The final report of the 2-year project describes the development and validation of microcomputer software to help assess reading disabled elementary grade children and to provide basic reading instruction. Accomplishments of the first year included: design of the STAR Neuro-Cognitive Assessment Program which includes a reproduction of…

  8. SBIR PHASE I FINAL REPORT: Adoption of High Performance Computational (HPC) Modeling Software for Widespread Use in the Manufacture of Welded Structures

    Energy Technology Data Exchange (ETDEWEB)

    Brust, Frederick W. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Punch, Edward F. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Kurth, Elizabeth A. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States); Kennedy, James C. [Engineering Mechanics Corporation of Columbus (Emc2), Columbus, OH (United States)

    2013-12-02

    fabrication costs. VFT currently is tied to a commercial solver, which makes it prohibitively expensive for use by SMEs: there is a significant licensing cost for the solver, over and above the relatively minimal cost of VFT itself. Emc2 developed this software code over a number of years in close cooperation with CAT (Peoria, IL), which currently uses this code exclusively for worldwide fabrication, product design and development activities. The use of VFT has allowed CAT to move directly from design to product fabrication and has helped eliminate (to a large extent) new product prototyping and subsequent testing. Additionally, CAT has been able to eliminate or reduce costly one-of-a-kind appliances used to reduce distortion effects due to fabrication. In this context, SMEs can realize the same kind of improved product quality and reduced cost through adoption of the adapted version of VFT for design and subsequent manufacture of new products. Emc2's DOE SBIR Phase I effort successfully adapted VFT so that SMEs have access to this sophisticated and proven methodology that is quick, accurate, cost effective, and available on demand to address weld-simulation and fabrication problems prior to manufacture. The open source code WARP3D, a high performance finite element code mainly used in fracture and damage assessment of structures, was modified so that computational weld problems can be solved efficiently on multiple processors and threads with VFT. The thermal solver for VFT, based on a series of closed-form solution approximations, was enhanced for solution on multiple processors, greatly increasing overall speed. In addition, the graphical user interface (GUI) has been tailored to integrate these solutions with WARP3D. The GUI is used to define all the weld pass descriptions, number of passes, material properties, consumable properties, weld speed, etc. for the structure to be modeled. The GUI was improved to make it user-friendly for engineers who are not experts in finite

  9. CPE--A New Perspective: The Impact of the Technology Revolution. Proceedings of the Computer Performance Evaluation Users Group Meeting (19th, San Francisco, California, October 25-28, 1983). Final Report. Reports on Computer Science and Technology.

    Science.gov (United States)

    Mobray, Deborah, Ed.

    Papers on local area networks (LANs), modelling techniques, software improvement, capacity planning, software engineering, microcomputers and end user computing, cost accounting and chargeback, configuration and performance management, and benchmarking presented at this conference include: (1) "Theoretical Performance Analysis of Virtual…

  10. Evaluation of the TSC Dolphin Computer Assisted Instructional System in the Chapter 1 Program of the District of Columbia Public Schools. Final Report 85-9.

    Science.gov (United States)

    Harris, Carolyn DeMeyer; And Others

    Dolphin is a computer-assisted instruction system used to teach and reinforce skills in reading, language arts, and mathematics. An evaluation of this system was conducted to provide information to TSC Division of Houghton Mifflin regarding its effectiveness and possible modifications to the system. The general design of the evaluation was to…

  11. Student Science Training Program in Mathematics, Physics and Computer Science. Final Report to the National Science Foundation. Artificial Intelligence Memo No. 393.

    Science.gov (United States)

    Abelson, Harold; diSessa, Andy

    During the summer of 1976, the MIT Artificial Intelligence Laboratory sponsored a Student Science Training Program in Mathematics, Physics, and Computer Science for high-ability secondary school students. This report describes, in some detail, the style of the program, the curriculum, and the projects the students undertook. It is hoped that this…

  12. Computer-Based Junior High/Intermediate School Program of Transitional Bilingual Education, Community School District 3, Manhattan. Final Evaluation Report, 1992-93. OREA Report.

    Science.gov (United States)

    Duque, Diana L.

    The Computer-Based Junior High/Intermediate School Program of Transitional Bilingual Education was a federally funded program in its third year of operation in one intermediate school and two junior high schools in Manhattan (New York) in 1992-93. During this period, it served 244 native Spanish-speaking, limited-English-proficient (LEP) students…

  13. The Development and Evaluation of a Teleprocessed Computer-Assisted Instruction Course in the Recognition of Malarial Parasites. Final Report; May 1, 1967 - June 30, 1968.

    Science.gov (United States)

    Mitzel, Harold E.

    A computer-assisted instruction course in the recognition of malarial parasites was developed and evaluated. The course includes stage discrimination, species discrimination, and case histories. Segments developed use COURSEWRITER as an author language and are presented via a display terminal that permits two-way communication with an IBM computer…

  14. Thermal model of laser-induced skin damage: computer program operator's manual. Final report, September 1976--April 1977

    Energy Technology Data Exchange (ETDEWEB)

    Takata, A.N.

    1977-12-01

    A user-oriented description is given of a computer program for predicting temperature rises, irreversible damage, and degree of burns caused to skin by laser exposures. This report describes the parameters necessary to run the program and provides suggested values for the parameters. Input data are described in detail as well as the capabilities and limitations of the program. (Author)
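    The abstract describes a thermal model for laser-induced skin damage. A minimal sketch of the formulation usually used for such models (one-dimensional heat conduction with an absorbed surface flux plus an Arrhenius damage integral) is given below; all parameter values are illustrative placeholders, not the data or model details of the program documented in the report.

```python
import numpy as np

# Minimal sketch: 1-D heat conduction in tissue with an absorbed laser flux at the
# surface, plus an Arrhenius damage integral accumulated at each depth.

def laser_skin_exposure(flux=2.0e4,          # absorbed surface flux, W/m^2
                        t_exposure=1.0,      # exposure duration, s
                        depth=2.0e-3, n=200, dt=1.0e-4,
                        k=0.5, rho=1000.0, c=3600.0,    # tissue thermal properties
                        A=3.1e98, Ea=6.28e5, R=8.314):  # Henriques-type constants
    dz = depth / n
    alpha = k / (rho * c)
    T = np.full(n, 310.15)                   # initial tissue temperature, K
    omega = np.zeros(n)                      # damage integral per node
    for _ in range(int(t_exposure / dt)):
        T_new = T.copy()
        T_new[1:-1] += alpha * dt * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2
        # Surface node: absorbed laser flux plus conduction into the tissue.
        T_new[0] += dt * (flux / (rho * c * dz) + alpha * (T[1] - T[0]) / dz**2)
        T_new[-1] = 310.15                   # deep tissue held at core temperature
        T = T_new
        omega += A * np.exp(-Ea / (R * T)) * dt   # Arrhenius damage accumulation
    return T, omega                          # omega >= 1 is often read as irreversible damage

T, omega = laser_skin_exposure()
print(f"peak surface temperature: {T[0] - 273.15:.1f} C, surface damage integral: {omega[0]:.3g}")
```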

  15. A Methodological Study Evaluating a Pretutorial Computer-Compiled Instructional Program in High School Physics Instruction Initiated from Student-Teacher Selected Instructional Objectives. Final Report.

    Science.gov (United States)

    Leonard, B. Charles; Denton, Jon J.

    A study sought to develop and evaluate an instructional model which utilized the computer to produce individually prescribed instructional guides to account for the idiosyncratic variations among students in physics classes at the secondary school level. The students in the treatment groups were oriented toward the practices of selecting…

  16. Computational and experimental fluid mechanics. Draft version of annex to final report for period January 1st 1993 to December 31st 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-01

    The general purpose of the program has been the development of efficient algorithms, their implementation in codes of Computational Fluid Mechanics (CFD), and the experimental verification of these codes. Flows of both fundamental and applied nature have been investigated, including flows in industrial process equipment, around aerodynamic structures and ships, and flows over bed forms of importance for sediment transport. The experimental work has included the development of improved techniques, emphasizing optical methods. The objectives were realized through a coordinated experimental and theoretical/computational research program, organized in 6 specific projects: 1. CFD methods and algorithms; 2. Special element simulation of ultrafiltration; 3. Turbulent swirling flows; 4. Near-wall models of turbulence and development of experimental techniques; 5. Flow over bed forms; 6. Flow past ship hull. (au)

  17. Planning meeting to form the CMSN Team: Building a unified computational model for the resonant X-ray scattering of strongly correlated materials. Final report

    International Nuclear Information System (INIS)

    van Veenendaal, M.

    2008-01-01

    The planning meeting was held May 21-23, 2008 at Argonne National Laboratory (ANL). The purpose of the meeting was to establish a network for building computational models of resonant elastic and inelastic x-ray scattering. This course of action was recommended by program officer Dale Koelling after the initial submission of a proposal for a Computational Materials Science Network to Basic Energy Sciences. The meeting consisted of talks and discussion. At the end of the meeting three subgroups were formed. After the successful formation of the team, a new proposal was written and funded by BES. Since this was a planning meeting, there were no proceedings; the program and titles of talks are given.

  18. An approach to the efficient assessment of safety and usability of computer based control systems, VeNuS 2. Global final report

    International Nuclear Information System (INIS)

    Nelke, T.; Dlugosch, C.; Olaverri Monreal, C.; Sachse, K.; Thuering, M.

    2015-01-01

    Prior to the use of computer-based instrumentation and control, evidence must be provided of sufficient safety, of suitable development methods, and of the suitability of the man-machine interface. For this purpose, validation methods must be available, if possible supported by appropriate tools. Given the multitude of data that has to be taken into account, it is important to generate technical documentation that supports efficient operation and helps prevent human errors. An approach for the computer-based generation of user manuals for the operation of technical systems was developed in the VeNuS 2 project. A second goal was to develop an approach to evaluating the usability of safety-relevant digital human-machine interfaces (e.g. for the nuclear industry). A software tool has therefore been developed to assess aspects of the usability of user interfaces while taking safety-related priorities into account. Additionally, new or well-known methods for providing evidence of sufficient safety and usability of computer-based systems were developed in prototype form.

  19. Support of theoretical high energy physics research at the Supercomputer Computations Research Institute. Final report, September 30, 1992 - July 31, 1997

    International Nuclear Information System (INIS)

    Bitar, K.M.; Edwards, R.G.; Heller, U.M.; Kennedy, A.D.

    1998-01-01

    The research primarily involved lattice field theory simulations such as Quantum Chromodynamics (QCD) and the Standard Model of electroweak interactions. Among the works completed by the members of the lattice group and their outside collaborators in QCD simulations are extensive hadronic spectrum computations with both Wilson and staggered fermions, and calculations of hadronic matrix elements and wavefunctions. Studies of the QCD β function with two flavors of Wilson fermions, and a study of a possible flavor-parity breaking phase in QCD with two flavors of Wilson fermions, have been completed. Studies of the finite temperature behavior of QCD have also been a major activity within the group. Studies of non-relativistic QCD, both for heavy-heavy mesons and for the heavy quark in heavy-light mesons, have been done. Combining large-N analytic computations within the Higgs sector of the Standard Model with numerical simulations at N = 4 has yielded a computation of the upper bound on the mass of the Higgs particle, as well as the energy scale above which deviations from the Standard Model may be expected. A major research topic during the second half of the grant period was the study of improved lattice actions, designed to diminish finite lattice spacing effects and thus accelerate the approach to the continuum limit. A new exact Local Hybrid Monte Carlo (overrelaxation) algorithm with a tunable overrelaxation parameter has been developed for pure gauge theories, and its characteristics have been investigated. A study of possible instabilities in the global HMC algorithm has been completed.
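    Since the report centers on Monte Carlo algorithms of the hybrid (Hamiltonian) family, a toy sketch of the basic HMC update may be useful context. It samples a one-dimensional Gaussian "action" rather than a lattice gauge theory, and the step size and trajectory length are arbitrary; it is not the Local HMC or overrelaxation algorithm developed under the grant.

```python
import numpy as np

# Toy hybrid Monte Carlo (HMC) sampler for a 1-D Gaussian "action" S(q) = q^2/2:
# leapfrog integration of a fictitious Hamiltonian followed by a Metropolis
# accept/reject step.

def hmc_sample(n_samples=5000, n_leapfrog=20, eps=0.1, rng=np.random.default_rng(1)):
    S = lambda q: 0.5 * q * q
    grad_S = lambda q: q                 # dS/dq for the Gaussian action
    q, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.normal()                 # refresh the fictitious momentum
        q_new, p_new = q, p
        # Leapfrog integration of Hamilton's equations for H = p^2/2 + S(q).
        p_new -= 0.5 * eps * grad_S(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += eps * p_new
            p_new -= eps * grad_S(q_new)
        q_new += eps * p_new
        p_new -= 0.5 * eps * grad_S(q_new)
        # Metropolis accept/reject on the change in the fictitious Hamiltonian.
        dH = (0.5 * p_new**2 + S(q_new)) - (0.5 * p**2 + S(q))
        if dH < 0 or rng.random() < np.exp(-dH):
            q = q_new
        samples.append(q)
    return np.array(samples)

print(hmc_sample().std())   # should be close to 1 for the unit Gaussian target
```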

  20. Final report on LDRD project : elucidating performance of proton-exchange-membrane fuel cells via computational modeling with experimental discovery and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Chao Yang (Pennsylvania State University, University Park, PA); Pasaogullari, Ugur (Pennsylvania State University, University Park, PA); Noble, David R.; Siegel, Nathan P.; Hickner, Michael A.; Chen, Ken Shuang

    2006-11-01

    In this report, we document the accomplishments in our Laboratory Directed Research and Development project in which we employed a technical approach of combining experiments with computational modeling and analyses to elucidate the performance of hydrogen-fed proton exchange membrane fuel cells (PEMFCs). In the first part of this report, we document our focused efforts on understanding water transport in and removal from a hydrogen-fed PEMFC. Using a transparent cell, we directly visualized the evolution and growth of liquid-water droplets at the gas diffusion layer (GDL)/gas flow channel (GFC) interface. We further carried out a detailed experimental study to observe, via direct visualization, the formation, growth, and instability of water droplets at the GDL/GFC interface using a specially-designed apparatus, which simulates the cathode operation of a PEMFC. We developed a simplified model, based on our experimental observation and data, for predicting the onset of water-droplet instability at the GDL/GFC interface. Using a state-of-the-art neutron imaging instrument available at NIST (National Institute of Standard and Technology), we probed liquid-water distribution inside an operating PEMFC under a variety of operating conditions and investigated effects of evaporation due to local heating by waste heat on water removal. Moreover, we developed computational models for analyzing the effects of micro-porous layer on net water transport across the membrane and GDL anisotropy on the temperature and water distributions in the cathode of a PEMFC. We further developed a two-phase model based on the multiphase mixture formulation for predicting the liquid saturation, pressure drop, and flow maldistribution across the PEMFC cathode channels. In the second part of this report, we document our efforts on modeling the electrochemical performance of PEMFCs. We developed a constitutive model for predicting proton conductivity in polymer electrolyte membranes and compared

  1. Impact of energy conservation policy measures on innovation, investment and long-term development of the Swiss economy. Results from the computable induced technical change and energy (CITE) model - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bretschger, L.; Ramer, R.; Schwark, F.

    2010-09-15

    This comprehensive final report for the Swiss Federal Office of Energy (SFOE) presents the results of a study made with the Computable Induced Technical Change and Energy (CITE) model. The authors note that, in the past two centuries, the Swiss economy experienced an unprecedented increase in living standards. At the same time, the stock of various natural resources declined and environmental conditions changed substantially. The sustainability of a low-energy and low-carbon society, as well as an optimum transition to this state, is evaluated. An economic analysis is made, and the CITE and CGE (Computable General Equilibrium) numerical simulation models are discussed. The results obtained are presented and discussed.

  2. Probability of pipe fracture in the primary coolant loop of a PWR plant. Volume 9: PRAISE computer code user's manual. Final report

    International Nuclear Information System (INIS)

    Lim, E.Y.

    1981-08-01

    The PRAISE (Piping Reliability Analysis Including Seismic Events) computer code estimates the influence of earthquakes on the probability of failure at a weld joint in the primary coolant system of a pressurized water reactor. Failure, either a through-wall defect (leak) or a complete pipe severance (a large LOCA), is assumed to be caused by fatigue crack growth of an as-fabricated, interior-surface circumferential defect. These defects are assumed to be two-dimensional and semi-elliptical in shape, and the distribution of initial crack sizes is a function of crack depth and aspect ratio. Crack propagation rates are governed by a Paris-type relationship with separate RMS cyclic stress intensity factors for the depth and length. Both uniform through-wall and radial-gradient thermal stresses are included in the calculation of the stress intensity factors. The failure probabilities are estimated by applying Monte Carlo methods to simulate the life histories of the selected weld joint. In order to maximize computational efficiency, a stratified sampling procedure is used to select the initial crack size. Hydrostatic proof tests, pre-service inspection, and in-service inspection can be simulated. PRAISE treats the inter-arrival times of operating transients either as constant or as exponentially distributed according to observed or postulated rates. Leak rate and leak detection models are also included. The criterion for complete pipe severance is exceedance of a net-section critical stress. Earthquakes of various intensities and arbitrary occurrence times can be modeled. PRAISE presently assumes that exactly one initial defect exists in the weld and that the earthquake of interest is the first earthquake experienced at the reactor.
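    To make the simulation approach the abstract describes concrete, here is a hedged sketch of Monte Carlo fatigue-crack-growth: sample an as-fabricated crack depth, grow it with a Paris-type law, and count the fraction of simulated weld histories that become through-wall defects (leaks). The distributions, Paris constants, stresses, and cycle counts are illustrative placeholders, not PRAISE's models or data, and the per-year growth update is a deliberately coarse integration.

```python
import numpy as np

# Monte Carlo fatigue-crack-growth sketch (illustrative only, not PRAISE).

rng = np.random.default_rng(42)

def simulate_weld(wall=0.025,                 # wall thickness, m
                  years=40, cycles_per_year=1000,
                  C=1.0e-10, m=3.0,           # Paris-law constants, da/dN in m/cycle
                  delta_sigma=100.0e6):       # cyclic stress range, Pa
    a = rng.lognormal(mean=np.log(0.002), sigma=0.5)      # initial crack depth, m
    for _ in range(years):                    # coarse: one growth update per year
        delta_K = delta_sigma * np.sqrt(np.pi * a) / 1.0e6  # MPa*sqrt(m), simple estimate
        a += cycles_per_year * C * delta_K**m               # Paris law: da/dN = C*(dK)^m
        if a >= wall:
            return True                        # crack has grown through-wall: leak
    return False

n_trials = 5000
leaks = sum(simulate_weld() for _ in range(n_trials))
print(f"estimated leak probability over plant life: {leaks / n_trials:.3f}")
```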

  3. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Gurney, Kevin R. [Arizona Univ., Mesa, AZ (United States)

    2015-01-12

    This document constitutes the final report under DOE grant DE-FG-08ER64649. The organization of this document is as follows: first, I will review the original scope of the proposed research. Second, I will present the current draft of a paper nearing submission to Nature Climate Change on the initial results of this funded effort. Finally, I will present the last phase of the research under this grant, which has supported a Ph.D. student. To that end, I will present the graduate student’s proposed research, a portion of which is completed and reflected in the paper nearing submission. This final work phase will be completed in the next 12 months and will likely result in 1-2 additional publications; we consider the results (as exemplified by the current paper) to be of high quality. The continuing results will acknowledge the funding provided by DOE grant DE-FG-08ER64649.

  4. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    DeTar, Carleton [P.I.

    2012-12-10

    This document constitutes the Final Report for award DE-FC02-06ER41446 as required by the Office of Science. It summarizes accomplishments and provides copies of scientific publications with significant contribution from this award.

  5. Final Report, Center for Programming Models for Scalable Parallel Computing: Co-Array Fortran, Grant Number DE-FC02-01ER25505

    Energy Technology Data Exchange (ETDEWEB)

    Robert W. Numrich

    2008-04-22

    The major accomplishment of this project is the production of CafLib, an 'object-oriented' parallel numerical library written in Co-Array Fortran. CafLib contains distributed objects such as block vectors and block matrices, along with procedures, attached to each object, that perform basic linear algebra operations such as matrix multiplication, matrix transpose and LU decomposition. It also contains constructors and destructors for each object that hide the details of data decomposition from the programmer, and it contains collective operations that allow the programmer to calculate global reductions, such as global sums, global minima and global maxima, as well as vector and matrix norms of several kinds. CafLib is designed to be extensible in such a way that programmers can define distributed grid and field objects, based on vector and matrix objects from the library, for finite difference algorithms to solve partial differential equations. A very important extra benefit that resulted from the project is the inclusion of the co-array programming model in the next Fortran standard, called Fortran 2008. It is the first parallel programming model ever included as a standard part of the language. Co-arrays will be a supported feature in all Fortran compilers, and the portability provided by standardization will encourage a large number of programmers to adopt it for new parallel application development. The combination of object-oriented programming in Fortran 2003 with co-arrays in Fortran 2008 provides a very powerful programming model for high-performance scientific computing. Additional benefits from the project, beyond the original goal, include a program to provide access to the co-array model through the Cray compiler as a resource for teaching and research. Several academics, for the first time, included the co-array model as a topic in their courses on parallel computing. A separate collaborative project with LANL and PNNL showed how to
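    To give a flavor of the collective global reductions CafLib provides, here is a deliberately simple analogy written in Python with mpi4py (CafLib itself is in Co-Array Fortran, and this is not its interface). The array contents and process count are arbitrary.

```python
from mpi4py import MPI
import numpy as np

# Analogy only: each rank owns one block of a distributed vector and the ranks
# cooperate in collective reductions (global sum, global max), the same kind of
# operation CafLib exposes for its distributed block vectors and matrices.
# Run with e.g. `mpiexec -n 4 python global_sum.py`.

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

local_block = np.arange(rank * 100, (rank + 1) * 100, dtype=float)

global_sum = comm.allreduce(float(local_block.sum()), op=MPI.SUM)
global_max = comm.allreduce(float(local_block.max()), op=MPI.MAX)

if rank == 0:
    print(f"global sum = {global_sum}, global max = {global_max}")
```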

  6. Experimental, theoretical and computational study of frequency upshift of electromagnetic radiation using plasma techniques. Final technical report, January 14, 1991--January 14, 1995

    International Nuclear Information System (INIS)

    Joshi, C.

    1997-01-01

    The final report for the project is comprised of the PhD thesis of Richard L. Savage, Jr entitled: 'Frequency Upshifting of Electromagnetic Radiation via an Underdense Relativistic Ionization Front.' An underdense, relativistically propagating ionization front has been utilized to upshift the frequency of an impinging electromagnetic wave from 35 GHz to more than 173 GHz in a continuously tunable fashion. The source radiation interacted with the ionization front inside a metallic waveguide. The front, simply a moving boundary between ionized and neutral gas, was created as a short, intense pulse of ionizing laser radiation propagated through the gas-filled waveguide. In 1991, W.B. Mori showed theoretically that large upshifts are possible using underdense ionization fronts (underdense implies that the plasma density is lower than that required to reflect the source radiation), where the source wave is transmitted through the plasma/neutral boundary. The authors have extrapolated Mori's analysis to interactions within a waveguide. This is a new technique for generating high-power, short-pulse, tunable radiation, and has potential applications in areas such as time-resolved microwave spectroscopy, plasma diagnostics, and remote sensing

  7. Accident and safety analyses for the HTR-modul. Partial project 1: Computer codes for system behaviour calculation. Final report. Pt. 1

    International Nuclear Information System (INIS)

    Lohnert, G.; Becker, D.; Dilcher, L.; Doerner, G.; Feltes, W.; Gysler, G.; Haque, H.; Kindt, T.; Kohtz, N.; Lange, L.; Ragoss, H.

    1993-08-01

    The project encompasses the following project tasks and problems: (1) studies relating to complete failure of the main heat transfer system; (2) pebble flow; (3) development of computer codes for detailed calculation of hypothetical accidents: (a) the THERMIX/RZKRIT temperature buildup code (covering, among others, a variation to include exothermal heat sources); (b) the REACT/THERMIX corrosion code (a variation taking into account extremely severe air ingress into the primary loop); (c) the GRECO corrosion code (a variation for treating extremely severe water ingress into the primary loop); (d) the KIND transients code (for treating extremely fast transients during reactivity incidents); (4) limiting devices for safety-relevant quantities; (5) analyses relating to hypothetical accidents: (a) hypothetical air ingress; (b) effects on the fuel particles induced by fast transients. The problems of the various tasks are defined in detail and the main results obtained are explained. The contributions reporting the various project tasks and activities have been prepared for separate retrieval from the database. (orig./HP) [de]

  8. Accident and safety analyses for the HTR-modul. Partial project 1: Computer codes for system behaviour calculation. Final report. Pt. 2

    International Nuclear Information System (INIS)

    Lohnert, G.; Becker, D.; Dilcher, L.; Doerner, G.; Feltes, W.; Gysler, G.; Haque, H.; Kindt, T.; Kohtz, N.; Lange, L.; Ragoss, H.

    1993-08-01

    The project encompasses the following project tasks and problems: (1) studies relating to complete failure of the main heat transfer system; (2) pebble flow; (3) development of computer codes for detailed calculation of hypothetical accidents: (a) the THERMIX/RZKRIT temperature buildup code (covering, among others, a variation to include exothermal heat sources); (b) the REACT/THERMIX corrosion code (a variation taking into account extremely severe air ingress into the primary loop); (c) the GRECO corrosion code (a variation for treating extremely severe water ingress into the primary loop); (d) the KIND transients code (for treating extremely fast transients during reactivity incidents); (4) limiting devices for safety-relevant quantities; (5) analyses relating to hypothetical accidents: (a) hypothetical air ingress; (b) effects on the fuel particles induced by fast transients. The problems of the various tasks are defined in detail and the main results obtained are explained. The contributions reporting the various project tasks and activities have been prepared for separate retrieval from the database. (orig./HP) [de]

  9. Final results of the 'Benchmark on computer simulation of radioactive nuclides production rate and heat generation rate in a spallation target'

    International Nuclear Information System (INIS)

    Janczyszyn, J.; Pohorecki, W.; Domanska, G.; Maiorino, R.J.; David, J.C.; Velarde, F.A.

    2011-01-01

    A benchmark has been organized to assess the computer simulation of nuclide production and heat generation in a spallation lead target. The physical models applied for the calculation of thick lead target activation do not produce satisfactory results for the majority of the analysed nuclides; however, one can observe better or worse quantitative agreement with the experimental results. Analysis of the quality of the calculated results shows the best performance for heavy nuclides (A: 170-190). For intermediate nuclides (A: 60-130) almost all results are underestimated, while for A: 130-170 they are mainly overestimated. The shape of the activity distribution in the target is well reproduced in calculations by all models, but the numerical comparison shows performance similar to that for the whole target. The Isabel model yields the best results. As for the whole-target heating rate, the results from all participants are consistent, and only small differences are observed between results from the physical models. The heating distributions within the target, however, are not entirely consistent. The quantitative comparison of the distributions yielded by different spallation reaction models shows no serious differences for the major part of the target - generally below 10%. However, in the outermost parts of the target front layers and in the part of the target at its end, behind the primary proton range, a spread higher than 40% is obtained.

  10. Final Report DOE Grant No. DE-FG03-01ER54617 Computer Modeling of Microturbulence and Macrostability Properties of Magnetically Confined Plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Jean-Noel Leboeuf

    2004-03-04

    OAK-B135 We have made significant progress during the past grant period in several key areas of the UCLA and national Fusion Theory Program. This body of work includes both fundamental and applied contributions to MHD and turbulence in DIII-D and Electric Tokamak plasmas, and also to Z-pinches, particularly with respect to the effect of flows on these phenomena. We have successfully carried out interpretive and predictive global gyrokinetic particle-in-cell calculations of DIII-D discharges. We have cemented our participation in the gyrokinetic PIC effort of the SciDAC Plasma Microturbulence Project through working membership in the Summit Gyrokinetic PIC Team. We have continued to teach advanced courses at UCLA pertaining to computational plasma physics and to foster interaction with students and junior researchers, and have graduated two Ph.D. students during the past grant period. The research carried out during that time has resulted in many publications in the premier plasma physics and fusion energy sciences journals and in several invited oral communications at major conferences such as Sherwood, Transport Task Force (TTF), the annual meetings of the Division of Plasma Physics of the American Physical Society and of the European Physical Society, and the 2002 IAEA Fusion Energy Conference, FEC 2002. Many of these have been authored and co-authored with experimentalists at DIII-D.

  11. Final Report DOE Grant No. DE-FG03-01ER54617 Computer Modeling of Microturbulence and Macrostability Properties of Magnetically Confined Plasmas

    International Nuclear Information System (INIS)

    Jean-Noel Leboeuf

    2004-01-01

    OAK-B135 We have made significant progress during the past grant period in several key areas of the UCLA and national Fusion Theory Program. This body of work includes both fundamental and applied contributions to MHD and turbulence in DIII-D and Electric Tokamak plasmas, and also to Z-pinches, particularly with respect to the effect of flows on these phenomena. We have successfully carried out interpretive and predictive global gyrokinetic particle-in-cell calculations of DIII-D discharges. We have cemented our participation in the gyrokinetic PIC effort of the SciDAC Plasma Microturbulence Project through working membership in the Summit Gyrokinetic PIC Team. We have continued to teach advanced courses at UCLA pertaining to computational plasma physics and to foster interaction with students and junior researchers, and have graduated two Ph.D. students during the past grant period. The research carried out during that time has resulted in many publications in the premier plasma physics and fusion energy sciences journals and in several invited oral communications at major conferences such as Sherwood, Transport Task Force (TTF), the annual meetings of the Division of Plasma Physics of the American Physical Society and of the European Physical Society, and the 2002 IAEA Fusion Energy Conference, FEC 2002. Many of these have been authored and co-authored with experimentalists at DIII-D.

  12. Uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. Final report, February 16, 1990--December 31, 1994

    International Nuclear Information System (INIS)

    Busch, R.D.

    1995-01-01

    Dr. Robert Busch of the Department of Chemical and Nuclear Engineering was the principal investigator on this project, with technical direction provided by the staff of the Nuclear Criticality Safety Group at Los Alamos. During the period of the contract, he had a number of graduate and undergraduate students working on subtasks. The objective of this work was to develop information on uranium systems to enhance benchmarks for use in the verification of criticality safety computer models. During the first year of the project, most of the work was focused on setting up the SUN SPARC-1 workstation and acquiring the literature describing the critical experiments. By August 1990, the workstation was operational with the current version of TWODANT loaded on the system. The MCNP version 4 tape was made available from Los Alamos late in 1990. Various documents were acquired which provided the initial descriptions of the critical experiments under consideration as benchmarks. The next four years were spent working on various benchmark projects. A number of publications and presentations were made on this material; these are briefly discussed in this report.

  13. Narrative Finality

    Directory of Open Access Journals (Sweden)

    Armine Kotin Mortimer

    1981-01-01

    Full Text Available The clotural device of narration as salvation represents the lack of finality in three novels. In De Beauvoir's Tous les hommes sont mortels an immortal character turns his story to account, but the novel makes a mockery of the historical sense by which men define themselves. In the closing pages of Butor's La Modification, the hero plans to write a book to save himself. Through the thrice-considered portrayal of the Paris-Rome relationship, the ending shows the reader how to bring about closure, but this collective critique written by readers will always be a future book. Simon's La Bataille de Pharsale, the most radical attempt to destroy finality, is an infinite text. No new text can be written. This extreme of perversion guarantees bliss (jouissance). If the ending of De Beauvoir's novel transfers the burden of a non-final world onto a new victim, Butor's non-finality lies in the deferral to a future writing, while Simon's writer is stuck in a writing loop, in which writing has become its own end and hence can have no end. The deconstructive and tragic form of contemporary novels proclaims the loss of belief in a finality inherent in the written text, to the profit of writing itself.

  14. Optimization of the radiological protection of patients undergoing radiography, fluoroscopy and computed tomography. Final report of a coordinated research project in Africa, Asia and eastern Europe

    International Nuclear Information System (INIS)

    2004-12-01

    Although radiography has been an established imaging modality for over a century, continuous developments have led to improvements in technique, resulting in improved image quality at reduced patient dose. If one compares the technique used by Roentgen with the methods used today, one finds that a radiograph can now be obtained at a dose which is smaller by a factor of 100 or more. Nonetheless, some national surveys, particularly in the United Kingdom and in the United States of America in the 1980s and 1990s, have indicated large variations in patient doses for the same diagnostic examination, in some cases by a factor of 20 or more. This arises not only from the various types of equipment and accessories used by the different health care providers, but also from operational factors. The IAEA has a statutory responsibility to establish standards for the protection of people against exposure to ionising radiation and to provide for the worldwide application of those standards. A fundamental requirement of the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources (BSS), issued by the IAEA in cooperation with the FAO, ILO, WHO, PAHO and NEA, is the optimization of the radiological protection of patients undergoing medical exposure. In line with its responsibility for implementing these standards, and under the subprogramme on radiation safety, the IAEA launched a coordinated research project (CRP) in 1995 on radiological protection in diagnostic radiology in some countries of the Eastern European, African and Asian regions. Initially, the CRP addressed radiography only, and it covered wide aspects of the optimisation of radiological protection. Subsequently, the scope of the CRP was extended to fluoroscopy and computed tomography (CT), but it covered primarily situation analysis of patient doses and equipment quality control. It did not cover patient dose reduction aspects in fluoroscopy and CT. The project

  15. Final Report

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik; Nielsen, Peter V.

    This final report for the Hybrid Ventilation Centre at Aalborg University describes the activities and research achievements in the project period from August 2001 to August 2006. The report summarises the work performed and the results achieved, with reference to the articles and reports published...

  16. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Stinis, Panos [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-08-07

    This is the final report for the work conducted at the University of Minnesota (during the period 12/01/12-09/18/14) by PI Panos Stinis as part of the "Collaboratory on Mathematics for Mesoscopic Modeling of Materials" (CM4). CM4 is a multi-institution DOE-funded project whose aim is to conduct basic and applied research in the emerging field of mesoscopic modeling of materials.

  17. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  18. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Jarillo-Herrero, Pablo [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-02-07

    This is the final report of our research program on electronic transport experiments on Topological Insulator (TI) devices, funded by the DOE Office of Basic Energy Sciences. TI-based electronic devices are attractive as platforms for spintronic applications, and for detection of emergent properties such as Majorana excitations, electron-hole condensates, and the topological magneto-electric effect. Most theoretical proposals envision geometries consisting of a planar TI device integrated with materials of distinctly different physical phases (such as ferromagnets and superconductors). Experimental realization of physics tied to the surface states is a challenge due to the ubiquitous presence of bulk carriers in most TI compounds as well as degradation during device fabrication.

  19. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Weissman, Jon B

    2006-04-30

    High performance computational science and engineering simulations have become an increasingly important part of the scientist's problem-solving toolset. A key reason is the development of widely used codes and libraries that support these applications, for example Netlib, a collection of numerical libraries [33]. The term community codes refers to those libraries or applications that have achieved some critical level of acceptance by a user community. Many of these applications are on the high end in terms of required resources: computation, storage, and communication. Recently, there has been considerable interest in putting such applications on-line and packaging them as network services to make them available to a wider user base. Applications such as data mining [22], theorem proving and logic [14], and parallel numerical computation [8][32] are examples of services that are going on-line. Transforming applications into services has been made possible by advances in packaging and interface technologies, including component systems [2][6][13][28][37], proposed communication standards [34], and newer Web technologies such as Web Services [38]. Network services allow users to focus on their application and obtain remote service when needed by simply invoking the service across the network. Users can be assured that the most recent version of the code or service is always provided, and they do not need to install, maintain, and manage significant infrastructure to access the service. For high performance applications in particular, the user is still often required to install a code base (e.g. MPI), and therefore becomes involved with the tedious details of infrastructure management. In the network service model, the service provider is responsible for all of these activities, not the user. The user need not become an expert in high performance computing. An additional advantage of high-end network services is that the user need not have specialized
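    The network service model described above amounts to wrapping a community code behind a remote interface so the provider, not the user, manages the installation. As a toy illustration of that idea (not anything from the report), a numerical routine can be exposed as an HTTP service; the endpoint name and JSON format below are hypothetical.

```python
import numpy as np
from flask import Flask, jsonify, request

# Illustrative sketch of a "network service": a dense linear solve wrapped as an
# on-line service so users invoke it remotely instead of installing and
# maintaining the code base themselves.

app = Flask(__name__)

@app.route("/solve", methods=["POST"])
def solve():
    payload = request.get_json()
    A = np.array(payload["A"], dtype=float)   # coefficient matrix
    b = np.array(payload["b"], dtype=float)   # right-hand side
    x = np.linalg.solve(A, b)                 # the provider runs the computation
    return jsonify({"x": x.tolist()})

if __name__ == "__main__":
    # A client would POST {"A": [[2, 0], [0, 4]], "b": [1, 8]} and get {"x": [0.5, 2.0]}.
    app.run(port=8080)
```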

  20. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    J. K. Blasie; W.F. DeGrado; J.G. Saven; M.J. Therien

    2012-05-24

    The overall objective is to create robust artificial protein modules as scaffolds to control both (a) the conformation of novel cofactors incorporated into the modules, thereby giving the modules a desired functionality, and (b) the organization of these functional modules into ordered macroscopic ensembles, whose macroscopic materials properties derive from the designed microscopic function of the modules. We focus on two specific types of cofactors for imparting functionality in this project: primarily nonlinear optical (NLO) chromophores designed to exhibit extraordinary molecular hyperpolarizabilities, as well as donor-bridge-acceptor cofactors designed to exhibit highly efficient, 'through-bonds' light-induced electron transfer (LIET) over nano-scale distances. The ensembles range from 2-D to 3-D, designed to possess the degree of orientational and positional order necessary to optimize their macroscopic response, the latter ranging from liquid-crystalline or glass-like to long-range periodic. Computational techniques, firmly based in statistical thermodynamics, are utilized for the design of the artificial protein modules, based on robust α-helical bundle motifs, necessarily incorporating the desired conformation, location, and environment of the cofactor. Importantly, this design approach also includes optimization of the interactions between the modules to promote their organization into ordered macroscopic ensembles in 2-D and 3-D via either directed assembly or self-assembly. When long-range periodic order is required, the design can be optimized to result in a specified lattice symmetry. The structure and functionality of the individual modules are fully characterized at the microscopic level, as is that of the ensembles at the macroscopic level, employing modern experimental physical-chemical and computational techniques. These include, for example, multi-dimensional NMR and various pump-probe transient spectroscopies to ultrafast time

  1. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Webb, Robert C. [Texas A&M University]; Kamon, Teruki [Texas A&M University]; Toback, David [Texas A&M University]; Safonov, Alexei [Texas A&M University]; Dutta, Bhaskar [Texas A&M University]; Nanopoulos, Dimitri [Texas A&M University]; Pope, Christopher [Texas A&M University]; White, James [Texas A&M University]

    2013-11-18

    Overview: The High Energy Physics Group at Texas A&M University is submitting this final report for grant number DE-FG02-95ER40917. This grant has supported our wide range of research activities for over a decade. The reports contained here summarize the latest work done by our research team. Task A (Collider Physics Program): CMS & CDF. Profs. T. Kamon, A. Safonov, and D. Toback co-lead the Texas A&M (TAMU) collider program, focusing on the CDF and CMS experiments. Task D (Particle Physics Theory): Our particle physics theory task is the combined effort of Profs. B. Dutta, D. Nanopoulos, and C. Pope. Task E (Underground Physics): LUX & NEXT. Profs. R. Webb and J. White (deceased) lead the xenon-based underground research program, consisting of two main thrusts: first, participation in the LUX two-phase xenon dark matter search experiment; and second, detector R&D primarily aimed at developing future detectors for underground physics (e.g. NEXT and LZ).

  2. FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Juergen Eckert; Anthony K. Cheetham (Principal Investigator)

    2011-03-11

    Hydrogen storage systems based on the readily reversible adsorption of H₂ in porous materials have a number of very attractive properties, with the potential to provide superior performance among the candidate materials currently being investigated, were it not for the fact that the interaction of H₂ with the host material is too weak to permit viable operation at room temperature. Our study has delineated in quantitative detail the structural elements which we believe to be the essential ingredients for the future synthesis of porous materials whose guest-host interactions are intermediate between those found in the carbons and the metal hydrides, i.e. between physisorption and chemisorption, and which will therefore exhibit the H₂ binding energies required for room temperature operation. The ability to produce porous materials with much improved hydrogen binding energies depends critically on a detailed molecular-level analysis of hydrogen binding in such materials. However, characterization of H₂ sorption is almost exclusively carried out by thermodynamic measurements, which give average properties for all the sites occupied by H₂ molecules at a particular loading. We have therefore extensively utilized the most powerful of the few molecular-level experimental probes available for the interactions of hydrogen with porous materials, namely inelastic neutron scattering (INS) spectroscopy of the hindered rotations of the hydrogen molecules adsorbed at various sites, which in turn can be interpreted in a very direct way by computational studies. This technique can relate the spectral signatures of H₂ molecules adsorbed at binding sites with different degrees of interaction. In the course of this project we have synthesized a rather large number of entirely new hybrid materials, which include structural modifications for improved interactions with adsorbed hydrogen. The results of our systematic studies on many porous materials provide detailed

  3. HCI in Mobile and Ubiquitous Computing

    OpenAIRE

    椎尾, 一郎; 安村, 通晃; 福本, 雅明; 伊賀, 聡一郎; 増井, 俊之

    2003-01-01

    This paper provides some perspectives on human-computer interaction in mobile and ubiquitous computing. The review covers an overview of ubiquitous computing, mobile computing and wearable computing. It also summarizes HCI topics in these fields, including real-world oriented interfaces, multi-modal interfaces, context awareness and invisible computers. Finally, we discuss killer applications for the coming ubiquitous computing era.

  4. AIMES Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Katz, Daniel S [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA); Jha, Shantenu [Rutgers Univ., New Brunswick, NJ (United States); Weissman, Jon [Univ. of Minnesota, Minneapolis, MN (United States); Turilli, Matteo [Rutgers Univ., New Brunswick, NJ (United States)

    2017-01-31

    This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined and implemented: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented as software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and the resource layer (bundles), derives a suitable execution strategy for the given skeleton, and enacts its execution by means of pilots on one or more resources, depending on the application requirements and on resource availabilities and capabilities.

  5. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    Final technical report (May 2013) on the foundations of neuromorphic computing, covering work performed from September 2009 to September 2012; approved for public release, distribution unlimited. The report contrasts two design paradigms (few sensors with complex computation versus many sensors with simple computation) and discusses challenges for nano-enabled neuromorphic chips.

  6. Development of computational methods for the safety assessment of gas-cooled high-temperature and supercritical light-water reactors. Final report; Rechenmethoden zur Bewertung der Sicherheit von gasgekuehlten Hochtemperaturreaktoren und superkritischen Leichtwasserreaktoren. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Buchholz, S.; Cron, D. von der; Hristov, H.; Lerchl, G.; Papukchiev, A.; Seubert, A.; Sureda, A.; Weis, J.; Weyermann, F.

    2012-12-15

    This report documents developments and results in the frame of the project RS1191 ''Development of computational methods for the safety assessment of gas-cooled high temperature and supercritical light-water reactors''. The report is structured according to the five work packages: 1. Reactor physics modeling of gas-cooled high temperature reactors; 2. Coupling of reactor physics and 3-D thermal hydraulics for the core barrel; 3. Extension of ATHLET models for application to supercritical reactors (HPLWR); 4. Further development of ATHLET for application to HTR; 5. Further development and validation of ANSYS CFX for application to alternative reactor concepts. Chapter 4 describes the extensions made in TORT-TD related to the simulation of pebble-bed HTR, e.g. spectral zone buckling, Iodine-Xenon dynamics, nuclear decay heat calculation and extension of the cross section interpolation algorithms to higher dimensions. For fast running scoping calculations, a time-dependent 3-D diffusion solver has been implemented in TORT-TD. For the PBMR-268 and PBMR-400 as well as for the HTR-10 reactor, appropriate TORT-TD models have been developed. Few-group nuclear cross sections have been generated using the spectral codes MICROX-2 and DRAGON4. For verification and validation of nuclear cross sections and deterministic reactor models, MCNP models of reactor core and control rod of the HTR-10 have been developed. Comparisons with experimental data have been performed for the HTR-10 first criticality and control rod worth. The development of the coupled 3-D neutron kinetics and thermal hydraulics code system TORT-TD/ATTICA3D is documented in chapter 5. Similar to the couplings with ATHLET and COBRA-TF, the ''internal'' coupling approach has been implemented. Regarding the review of experiments and benchmarks relevant to HTR for validation of the coupled code system, the PBMR-400 benchmarks and the HTR-10 test reactor have been selected
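    The Iodine-Xenon dynamics mentioned above follow the standard I-135/Xe-135 balance equations. As an illustrative sketch only (not code from the project, and using typical textbook constants for U-235 fuel as assumptions rather than project data), the buildup at constant flux can be integrated with a simple explicit scheme:

        # Hedged sketch: standard I-135/Xe-135 balance equations integrated with forward Euler.
        # Yields, decay constants, cross section and flux below are typical textbook values,
        # assumed here for illustration only.
        GAMMA_I, GAMMA_XE = 0.064, 0.003          # fission yields (atoms per fission)
        LAMBDA_I, LAMBDA_XE = 2.87e-5, 2.09e-5    # decay constants (1/s)
        SIGMA_XE = 2.6e-18                        # Xe-135 absorption cross section (cm^2)

        def step(i, xe, fission_rate, phi, dt):
            """One Euler step; fission_rate = Sigma_f * phi (fissions per cm^3 per s)."""
            di = GAMMA_I * fission_rate - LAMBDA_I * i
            dxe = (GAMMA_XE * fission_rate + LAMBDA_I * i
                   - LAMBDA_XE * xe - SIGMA_XE * phi * xe)
            return i + dt * di, xe + dt * dxe

        i = xe = 0.0
        phi = 3e13                  # assumed neutron flux (n/cm^2/s)
        fission_rate = 0.05 * phi   # assumed macroscopic fission rate density
        for _ in range(int(48 * 3600 / 10)):   # 48 h of operation in 10 s steps
            i, xe = step(i, xe, fission_rate, phi, 10.0)
        print(f"I-135: {i:.3e}  Xe-135: {xe:.3e}  atoms/cm^3 after 48 h at constant flux")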

  7. Cosmology Without Finality

    Science.gov (United States)

    Mahootian, F.

    2009-12-01

    The rapid convergence of advancing sensor technology, computational power, and knowledge discovery techniques over the past decade has brought unprecedented volumes of astronomical data together with unprecedented capabilities of data assimilation and analysis. A key result is that a new, data-driven "observational-inductive" framework for scientific inquiry is taking shape and proving viable. The anticipated rise in data flow and processing power will have profound effects, e.g., confirmations and disconfirmations of existing theoretical claims both for and against the big bang model. But beyond enabling new discoveries, can new data-driven frameworks of scientific inquiry reshape the epistemic ideals of science? The history of physics offers a comparison. The Bohr-Einstein debate over the "completeness" of quantum mechanics centered on a question of ideals: what counts as science? We briefly examine lessons from that episode and pose questions about their applicability to cosmology. If the history of 20th century physics is any indication, the abandonment of absolutes (e.g., space, time, simultaneity, continuity, determinacy) can produce fundamental changes in understanding. The classical ideal of science, operative in both physics and cosmology, descends from the European Enlightenment. This ideal has for over 200 years guided science to seek the ultimate order of nature, to pursue the absolute theory, the "theory of everything." But now that we have new models of scientific inquiry powered by new technologies and driven more by data than by theory, it is time, finally, to relinquish dreams of a "final" theory.

  8. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Brian James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Yin, Lin [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stark, David James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-06

    This proposal sought of order 1M core-hours of Institutional Computing time intended to enable computing by a new LANL Postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was “off-cycle,” initiating in June of 2016 with a postdoc hire.

  9. Learning with Ubiquitous Computing

    Science.gov (United States)

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects involving educational "ubicomp." Finally, it explores how ubicomp may and may not change education in both formal and informal settings and…

  10. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Held, Isaac [Princeton Univ., NJ (United States); Balaji, V. [Princeton Univ., NJ (United States); Fueglistaler, Stephan [Princeton Univ., NJ (United States)

    2016-09-19

    We have constructed and analyzed a series of idealized models of tropical convection interacting with large-scale circulations, with 25-50km resolution and with 1-2km cloud resolving resolution, to set the stage for rigorous tests of convection closure schemes in high resolution global climate models. Much of the focus has been on the climatology of tropical cyclogenesis in rotating systems and the related problem of the spontaneous aggregation of convection in non-rotating systems. The PI (Held) will be delivering the honorary Bjerknes lecture at the Fall 2016 AGU meeting in December on this work. We have also provided new analyses of long-standing issues related to the interaction between convection and the large-scale circulation: Kelvin waves in the upper troposphere and lower stratosphere, water vapor transport into the stratosphere, and upper tropospheric temperature trends. The results of these analyses help to improve our understanding of processes, and provide tests for future high resolution global modeling. Our final goal of testing new convection schemes in next-generation global atmospheric models at GFDL has been left for future work, owing both to the complexity uncovered in this work in the idealized model results meant as tests for these models and to computational resource limitations. 11 papers have been published with support from this grant, 2 are in review, and another major summary paper is in preparation.

  11. Computer in radiology

    International Nuclear Information System (INIS)

    Kuesters, H.

    1985-01-01

    With this publication, the author presents the requirements that user-specific software should fulfill to achieve effective practice rationalisation through computer usage, and the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions are answered that were asked by computer-interested radiologists during system presentations. On the one hand there still exists a prejudice against programmes of standard texts, and on the other hand undefined fears that handling a computer is too difficult and that one has to learn a computer language first to be able to work with computers. Finally, it is pointed out that real competitive advantages can be obtained through computer usage. (orig.) [de

  12. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  13. Ethical aspects of final disposal. Final report

    International Nuclear Information System (INIS)

    Baltes, B.; Leder, W.; Achenbach, G.B.; Spaemann, R.; Gerhardt, V.

    2003-01-01

    In fulfilment of this task the Federal Environmental Ministry has commissioned GRS to summarise the current national and international status of ethical aspects of the final disposal of radioactive wastes as part of the project titled ''Final disposal of radioactive wastes as seen from the viewpoint of ethical objectives''. The questions arising from the opinions, positions and publications presented in the report by GRS were to serve as a basis for an expert discussion or an interdisciplinary discussion forum for all concerned with the ethical aspects of a responsible approach to the final disposal of radioactive wastes. In April 2001 GRS held a one-day seminar at which leading ethicists and philosophers offered statements on the questions referred to above and joined in a discussion with experts on issues of final disposal. This report documents the questions that arose ahead of the workshop, the specialist lectures held there and a summary of the discussion results [de

  14. The IAEA co-ordinated research programme on improvement of measurements, theoretical computations and evaluations of neutron induced helium production cross sections. Status report. Prepared at the final CRP meeting in Sendai, Japan 25-29 September 1995

    International Nuclear Information System (INIS)

    Pashchenko, A.B.

    1996-12-01

    The present report describes the results of the IAEA Co-ordinated Research Programme (CRP) on ''Improvements of Measurements, Theoretical Computation and Evaluations of Neutron Induced Helium Production Cross Sections''. Summarized is the progress achieved under the CRP in the following areas: measurements of α-production cross sections for structural materials; theoretical computations of (n,α) cross sections; measurements of activation cross sections; and improvement of experimental methods for (n,α) investigations. The status report also gives short summaries of the work of each laboratory that contributed to the results of the CRP. Attached is the list of program members and participants of CRP meetings. (author). Refs, 2 figs, 1 tab

  15. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
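    For the simplest transmission modality discussed above, the measured data reduce to line integrals of the linear attenuation coefficient. As a hedged illustration (standard textbook relations, not reproduced from the review), the Beer-Lambert law and the resulting parallel-beam projection (the Radon transform) can be written as

        I = I_0 \exp\!\left( -\int_L \mu(x,y)\,\mathrm{d}l \right),
        \qquad
        p_\theta(s) = -\ln\frac{I}{I_0}
                    = \iint \mu(x,y)\,\delta(x\cos\theta + y\sin\theta - s)\,\mathrm{d}x\,\mathrm{d}y .

    Image reconstruction then amounts to inverting this transform, for example by filtered back projection.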

  16. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinear and chaotic systems can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the LSR approach to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally, we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random search, and genetic algorithms, to design autonomous systems that can adapt and respond to environmental conditions.
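    To make the chaos computing idea concrete, the following minimal sketch (an illustration in the spirit of the approach, not code from the paper; the offsets and threshold are commonly quoted example parameters assumed here) encodes the logic inputs as perturbations of the initial state of the logistic map and thresholds one iterate; "reprogramming" the gate amounts to shifting the base offset:

        # Hedged illustration of threshold-based chaos computing with the logistic map.
        def logistic(x):
            return 4.0 * x * (1.0 - x)

        def chaos_gate(i1, i2, base, delta=0.25, threshold=0.75):
            """Encode two logic inputs as perturbations of the initial condition,
            iterate the map once, and threshold the result to read the output bit."""
            x0 = base + delta * (i1 + i2)
            return 1 if logistic(x0) > threshold else 0

        # Changing only the base offset selects a different gate (assumed example values).
        GATE_BASE = {"AND": 0.0, "OR": 0.125, "XOR": 0.25}

        for name, base in GATE_BASE.items():
            table = {(a, b): chaos_gate(a, b, base) for a in (0, 1) for b in (0, 1)}
            print(name, table)

    With these offsets, a single iteration of the map reproduces the AND, OR and XOR truth tables.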

  17. Final report. [Nonlinear magnetohydrodynamics

    International Nuclear Information System (INIS)

    Montgomery, D.C.

    1998-01-01

    This is a final report on the research activities carried out under the above grant at Dartmouth. During the period considered, the grant was identified as being for nonlinear magnetohydrodynamics, considered as the most tractable theoretical framework in which the plasma problems associated with magnetic confinement of fusion plasmas could be studied. During the first part of the grant's lifetime, the author was associated with Los Alamos National Laboratory as a consultant and the work was motivated by the reversed-field pinch. Later, when that program was killed at Los Alamos, the problems became ones that could be motivated by their relation to tokamaks. Throughout the work, the interest was always on questions that were as fundamental as possible, compatible with those motivations. The intent was always to contribute to plasma physics as a science, as well as to the understanding of mission-oriented confined fusion plasmas. Twelve Ph.D. theses were supervised during this period and a comparable number of postdoctoral research associates were temporarily supported. Many of these have gone on to distinguished careers, though few have done so in the context of the controlled fusion program. Their work was a combination of theory and numerical computation, in gradually less and less idealized settings, moving from rectangular periodic boundary conditions in two dimensions, through periodic straight cylinders and eventually, before the grant was withdrawn, to toroids, with a gradually more prominent role for electrical and mechanical boundary conditions. The author never had access to a situation where he could initiate experiments and relate directly to the laboratory data he wanted. Computers were the laboratory. Most of the work was reported in refereed publications in the open literature, copies of which were transmitted one by one to DOE at the time they appeared. The Appendix to this report is a bibliography of published work which was carried out under the

  18. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Schuur, Edward [Northern Arizona Univ., Flagstaff, AZ (United States); Luo, Yiqi [Univ. of Oklahoma, Norman, OK (United States)

    2016-12-01

    This final grant report is a continuation of the final grant report submitted for DE-SC0006982, as the Principal Investigator (Schuur) relocated from the University of Florida to Northern Arizona University. This report summarizes the original project goals and includes new project activities that were completed in the final period of the project.

  19. Survey of Domestic and International R&D Trends in Brain-Computer Interface (BCI) Technology (Research and Development in Brain-Computer Interfacing Technology: A Comprehensive Technical Review). Final Report.

    NARCIS (Netherlands)

    Nam, Chang Soo; Kim, Sung-Phil; Krusienkki, Dean; Nijholt, Antinus

    2015-01-01

    This report, commissioned by the Korean American Scientists and Engineers Association (KSEA) and written with the support of the Korea Federation of Science and Technology Societies (KOFST), surveys research and development trends in the area of brain-computer interfaces (Brain-Computer Interfaces, BCI)

  20. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  1. Development of a computer model for polycrystalline thin-film CuInSe{sub 2} and CdTe solar cells. Final subcontract report, 1 January 1991--31 December 1991

    Energy Technology Data Exchange (ETDEWEB)

    Gray, J.L.; Schwartz, R.J.; Lee, Y.J. [Purdue Univ., Lafayette, IN (United States)

    1992-09-01

    This report describes work to develop an accurate numerical model for CuInSe{sub 2} (CIS) and CdTe-based solar cells capable of running on a personal computer. Such a model will aid researchers in designing and analyzing CIS- and CdTe-based solar cells. ADEPT (A Device Emulation Program and Tool) was used as the basis for this model. An additional objective of this research was to use the models developed to analyze the performance of existing and proposed CIS- and CdTe-based solar cells. The development of accurate numerical models for CIS- and CdTe-based solar cells required the compilation of cell performance data (for use in model verification) and the compilation of measurements of material parameters. The development of the numerical models involved implementing the various physical models appropriate to CIS and CdTe, as well as some common window materials. A version of the model capable of running on an IBM-compatible personal computer was developed (primary code development is on a SUN workstation). A user-friendly interface with pop-up menus is continuing to be developed for release with the IBM-compatible model.
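    Device models of this kind are generally built on the semiconductor drift-diffusion system; as a hedged sketch of the physics assumed to underlie such a solver (standard textbook form, not equations quoted from the report), Poisson's equation is coupled to the carrier current and continuity relations:

        \nabla \cdot (\varepsilon \nabla \psi) = -q\,(p - n + N_D^+ - N_A^-), \qquad
        \mathbf{J}_n = q\mu_n n \mathbf{E} + q D_n \nabla n, \qquad
        \mathbf{J}_p = q\mu_p p \mathbf{E} - q D_p \nabla p,

        \frac{\partial n}{\partial t} = \frac{1}{q}\nabla \cdot \mathbf{J}_n + G - R, \qquad
        \frac{\partial p}{\partial t} = -\frac{1}{q}\nabla \cdot \mathbf{J}_p + G - R,

    discretized on a one-dimensional mesh through the layer stack and solved self-consistently for the electrostatic potential and the carrier densities.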

  2. Virtualized Network Control. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Ghani, Nasir [Univ. of New Mexico, Albuquerque, NM (United States)

    2013-02-01

    This document is the final report for the Virtualized Network Control (VNC) project, which was funded by the United States Department of Energy (DOE) Office of Science. This project was also informally referred to as Advanced Resource Computation for Hybrid Service and TOpology NEtworks (ARCHSTONE). This report provides a summary of the project's activities, tasks, deliverables, and accomplishments. It also provides a summary of the documents, software, and presentations generated as part of this project's activities. Namely, the Appendix contains an archive of the deliverables, documents, and presentations generated as part of this project.

  3. [Experimental nuclear physics]. Final report

    International Nuclear Information System (INIS)

    1991-04-01

    This is the final report of the Nuclear Physics Laboratory of the University of Washington on work supported in part by US Department of Energy contract DE-AC06-81ER40048. It contains chapters on giant dipole resonances in excited nuclei, nucleus-nucleus reactions, astrophysics, polarization in nuclear reactions, fundamental symmetries and interactions, accelerator mass spectrometry (AMS), ultra-relativistic heavy ions, medium energy reactions, work by external users, instrumentation, accelerators and ion sources, and computer systems. An appendix lists Laboratory personnel, a Ph. D. degree granted in the 1990-1991 academic year, and publications. Refs., 41 figs., 7 tabs

  4. [Experimental nuclear physics]. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-04-01

    This is the final report of the Nuclear Physics Laboratory of the University of Washington on work supported in part by US Department of Energy contract DE-AC06-81ER40048. It contains chapters on giant dipole resonances in excited nuclei, nucleus-nucleus reactions, astrophysics, polarization in nuclear reactions, fundamental symmetries and interactions, accelerator mass spectrometry (AMS), ultra-relativistic heavy ions, medium energy reactions, work by external users, instrumentation, accelerators and ion sources, and computer systems. An appendix lists Laboratory personnel, a Ph. D. degree granted in the 1990-1991 academic year, and publications. Refs., 41 figs., 7 tabs.

  5. Partnership in Computational Science

    Energy Technology Data Exchange (ETDEWEB)

    Huray, Paul G.

    1999-02-24

    This is the final report for the "Partnership in Computational Science" (PICS) award in an amount of $500,000 for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the consequent significance of the DOE award in building infrastructure of high performance computing in the Southeast and then describes the work accomplished under this grant and a list of publications resulting from it.

  6. New computer systems

    International Nuclear Information System (INIS)

    Faerber, G.

    1975-01-01

    Process computers have already become indispensable technical aids for monitoring and automation tasks in nuclear power stations. Yet there are still some problems connected with their use whose elimination should be the main objective in the development of new computer systems. In the paper, some of these problems are summarized, new tendencies in hardware development are outlined, and finally some new systems concepts made possible by the hardware development are explained. (orig./AK) [de

  7. Recent computational chemistry

    International Nuclear Information System (INIS)

    Onishi, Taku

    2015-01-01

    Thanks to the earlier development of quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules and design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important in order to clarify unknown quantum mechanisms and discover more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  8. Recent computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Taku [Department of Chemistry for Materials, and The Center of Ultimate Technology on nano-Electronics, Mie University (Japan); Center for Theoretical and Computational Chemistry, Department of Chemistry, University of Oslo (Norway)

    2015-12-31

    Thanks to the earlier development of quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules and design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important in order to clarify unknown quantum mechanisms and discover more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  9. Computer aided product design

    DEFF Research Database (Denmark)

    Constantinou, Leonidas; Bagherpour, Khosrow; Gani, Rafiqul

    1996-01-01

    A general methodology for Computer Aided Product Design (CAPD) with specified property constraints which is capable of solving a large range of problems is presented. The methodology employs the group contribution approach, generates acyclic, cyclic and aromatic compounds of various degrees......-liquid equilibria (LLE), solid-liquid equilibria (SLE) and gas solubility. Finally, a computer program based on the extended methodology has been developed and the results from five case studies highlighting various features of the methodology are presented....
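    As a small illustration of the group contribution approach mentioned above (a hedged sketch, not the CAPD program itself; the increments are representative Joback-type literature values quoted from memory and should be checked against an authoritative table before use):

        # Hedged sketch of a group-contribution property estimate (Joback-type normal boiling point).
        # The group increments below are illustrative assumptions, not values from the record above.
        JOBACK_TB = {"-CH3": 23.58, "-CH2-": 22.88, "-OH": 92.88}   # kelvin increments

        def estimate_tb(groups):
            """Tb ~ 198.2 K + sum over groups of (count * increment), the Joback correlation."""
            return 198.2 + sum(JOBACK_TB[g] * n for g, n in groups.items())

        ethanol = {"-CH3": 1, "-CH2-": 1, "-OH": 1}   # CH3-CH2-OH
        print(f"Estimated Tb(ethanol) = {estimate_tb(ethanol):.1f} K (experimental ~351 K)")

    Candidate molecules are generated by combining such groups subject to valence rules, and the estimated properties are screened against the specified constraints.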

  10. UOP FIN 571 Final Exam Guide New

    OpenAIRE

    ADMIN

    2018-01-01

    UOP FIN 571 Final Exam Guide New Check this A+ tutorial guideline at http://www.fin571assignment.com/fin-571-uop/fin-571-final-exam-guide -latest For more classes visit http://www.fin571assignment.com Question 1 The underlying assumption of the dividend growth model is that a stock is worth: A. An amount computed as the next annual dividend divided by the required rate of return. B. An amount computed as the next annual dividend divided by the ma...

  11. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  12. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  13. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  14. Validation of computer codes and modelling methods for giving proof of nuclear safety of transport and storage of spent VVER-type nuclear fuels. Part 1. Purposes and goals of the project. Final report

    International Nuclear Information System (INIS)

    Buechse, H.; Langowski, A.; Lein, M.; Nagel, R.; Schmidt, H.; Stammel, M.

    1995-01-01

    The report gives the results of investigations on the validation of computer codes used to prove nuclear safety during transport and storage of spent VVER fuel of the NPPs Greifswald and Rheinsberg. Characteristics of typical spent fuel (nuclide concentration, neutron source strength, gamma spectrum, decay heat) - calculated with several codes - and dose rates (e.g. in the surroundings of a loaded spent fuel cask) - based on the different source terms - are presented. Differences and their possible reasons are discussed. The results show that despite the differences in the source terms, all relevant health physics requirements are met for all cases of source term. The validation of the criticality code OMEGA was established by calculation of approximately 200 critical experiments with LWR fuel, including VVER fuel rod arrangements. The mean error of the effective multiplication factor k{sub eff} is -0.01 compared to the experiment for this area of applicability. Thus, the OMEGA error of 2% assumed in earlier works has turned out to be sufficiently conservative. (orig.) [de
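    The quoted mean error of k{sub eff} is simply the average deviation of the calculated multiplication factors from the experimental reference value of 1.0 over the benchmark set; a minimal sketch with synthetic numbers (not the OMEGA validation data):

        # Hedged sketch: mean error (bias) and spread of calculated k_eff over critical benchmarks,
        # where the experimental reference is k_eff = 1.0 by construction. Numbers are illustrative.
        import statistics

        k_calc = [0.992, 0.988, 0.995, 0.990, 0.985, 0.991, 0.989, 0.994]
        errors = [k - 1.0 for k in k_calc]
        print(f"mean error = {statistics.mean(errors):+.4f}")
        print(f"std dev    = {statistics.stdev(errors):.4f}")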

  15. Quantum Computing and the Limits of the Efficiently Computable

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I'll discuss how computational complexity---the study of what can and can't be feasibly computed---has been interacting with physics in interesting and unexpected ways. I'll first give a crash course about computer science's P vs. NP problem, as well as about the capabilities and limits of quantum computers. I'll then touch on speculative models of computation that would go even beyond quantum computers, using (for example) hypothetical nonlinearities in the Schrodinger equation. Finally, I'll discuss BosonSampling ---a proposal for a simple form of quantum computing, which nevertheless seems intractable to simulate using a classical computer---as well as the role of computational complexity in the black hole information puzzle.

  16. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.
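    As one hedged, minimal example of how neuromodulation is often represented at the single-cell level in such models (an illustration only, not a model taken from the review), a modulatory variable can scale the leak conductance of a leaky integrate-and-fire neuron and thereby change its gain:

        # Hedged sketch: gain modulation of a leaky integrate-and-fire neuron. A neuromodulation
        # level in [0, 1] is assumed to reduce the leak conductance and so increase excitability.
        def lif_spike_count(i_input, neuromod, t_end=1.0, dt=1e-4,
                            c_m=1.0, g_leak_max=10.0, v_thresh=1.0, v_reset=0.0):
            g_leak = g_leak_max * (1.0 - 0.5 * neuromod)   # assumed modulatory action on the leak
            v, spikes = 0.0, 0
            for _ in range(int(t_end / dt)):
                v += dt * (-g_leak * v + i_input) / c_m     # forward Euler membrane update
                if v >= v_thresh:
                    v, spikes = v_reset, spikes + 1
            return spikes

        for level in (0.0, 0.5, 1.0):
            print(f"neuromodulation={level:.1f}  spikes/s={lif_spike_count(15.0, level)}")

    Raising the modulatory level lengthens the membrane time constant and raises the firing rate, a simple single-cell analogue of the gain control discussed in the review.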

  17. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...

  18. DOE Utility Matching Program Final Technical Report

    International Nuclear Information System (INIS)

    Haghighat, Alireza

    2002-01-01

    This is the Final report for the DOE Match Grant (DE-FG02-99NE38163) awarded to the Nuclear and Radiological Engineering (NRE) Department, University of Florida, for the period of September 1999 to January 2002. This grant has been instrumental for maintaining high-quality graduate and undergraduate education at the NRE department. The grant has been used for supporting student entry and retention and for upgrading nuclear educational facilities, nuclear instrumentation, computer facilities, and computer codes to better enable the incorporation of experimental experiences and computer simulations related to advanced light water fission reactor engineering and other advanced reactor concepts into the nuclear engineering course curricula

  19. Development of a computing program for prediction of wind power for midsize and wide grid areas. Final report; Entwicklung eines Rechenmodells zur Vorhersage der Windleistung fuer mittlere und grosse Versorgungsgebiete. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Reeder, L.; Rohrig, K.; Ernst, B.; Schorn, P.; Bettels, B.

    2002-06-30

    In co-operation with partners from industry and research, a computer program was developed to predict the output of wind power plants. The prediction tool was to meet three requirements: short computing time, applicability to various grid regions, and high reliability. It gives transmission system operators a means of reducing the amount of control energy needed to ensure the balance between power generation and consumption in their networks. The tool, which predicts up to two days ahead, was developed exemplarily for the northern grid area of the transmission system operator 'E.ON Netz GmbH' (ENE). The wind power prediction is based on numerical weather forecasts from the German weather service (Deutscher Wetterdienst), given for 16 representative sites within the ENE area. The meso-scale model KLIMM (Klima Model Mainz) was used to calculate the meteorological variables near the wind farms connected to the transformer substation belonging to each representative site; for this purpose, KLIMM is fed with the weather forecast for a limited location at the representative site. The transformation of the meteorological variables into the output of the wind power plants at each representative site is done by neural networks, which have been trained with corresponding measurements. Using an existing online model, the total wind power for the whole ENE area is calculated from the individual wind power of the representative sites. The evaluation of the prediction and measurement data from 2001 shows, in comparison with reference models, that the prediction model developed in the project leads to very good results. (orig.)
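    The core regression step, mapping a forecast meteorological variable at a representative site to measured farm output, can be sketched as follows (an illustration with synthetic data and scikit-learn's MLPRegressor, not the project's actual network, features, or training data):

        # Hedged sketch: train a small neural network on synthetic "measurements" relating
        # forecast wind speed (m/s) to farm output (MW), then predict and aggregate sites.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        v = rng.uniform(0, 25, 2000)                    # forecast wind speeds
        p = 50 / (1 + np.exp(-(v - 9)))                 # hypothetical 50 MW power curve
        p[v > 22] = 0                                   # cut-out above storm speeds
        p += rng.normal(0, 1.5, v.size)                 # measurement noise

        model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
        model.fit(v.reshape(-1, 1), p)

        forecast = np.array([[6.0], [11.0], [18.0]])    # forecasts at three sites
        site_mw = model.predict(forecast)
        print(site_mw, site_mw.sum())                   # per-site prediction and simple regional sum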

  20. DIMEC - Final Report

    DEFF Research Database (Denmark)

    Conrad, Finn

    1997-01-01

    Final report of the research project DIMEC - Danish InfoMechatronic Control supported by the Danish Technical Research Council, STVF.

  1. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Glasser, Alan H. [Fusion Theory and Computation Inc., Kingston, WA (United States)

    2018-02-02

    Final technical report on DE-SC0016106. This is the final technical report for a portion of the multi-institutional CEMM project. This report is centered around 3 publications and a seminar presentation, which have been submitted to E-Link.

  2. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  3. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  4. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  5. Models of optical quantum computing

    Directory of Open Access Journals (Sweden)

    Krovi Hari

    2017-03-01

    I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.

  6. Final focus nomenclature

    International Nuclear Information System (INIS)

    Erickson, R.

    1986-01-01

    The formal names and common names for all devices in the final focus system of the SLC are listed. The formal names consist of a device type designator, microprocessor designator, and a four-digit unit number

  7. Final focus test beam

    International Nuclear Information System (INIS)

    1991-03-01

    This report discusses the following: the Final Focus Test Beam Project; optical design; magnets; instrumentation; magnetic measurement and BPM calibration; mechanical alignment and stabilization; vacuum system; power supplies; control system; radiation shielding and personnel protection; infrastructure; and administration

  8. WMO Marine Final Reports

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Final reports of the World Meteorological Organization (WMO) Commission for Marine Meteorology, Commission for Synoptic Meteorology, and Commission for Basic...

  9. Transacsys PLC - Final Results

    CERN Multimedia

    2002-01-01

    Final results from Transacsys PLC. A subsidiary of this company was set up to develop the CERN EDH system into a commercial product, but it incurred too great a financial loss and the project was cancelled (1/2 page).

  10. Final focus nomenclature

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, R.

    1986-08-08

    The formal names and common names for all devices in the final focus system of the SLC are listed. The formal names consist of a device type designator, microprocessor designator, and a four-digit unit number. (LEW)

  11. Data breaches. Final rule.

    Science.gov (United States)

    2008-04-11

    This document adopts, without change, the interim final rule that was published in the Federal Register on June 22, 2007, addressing data breaches of sensitive personal information that is processed or maintained by the Department of Veterans Affairs (VA). This final rule implements certain provisions of the Veterans Benefits, Health Care, and Information Technology Act of 2006. The regulations prescribe the mechanisms for taking action in response to a data breach of sensitive personal information.

  12. Final report: Prototyping a combustion corridor; FINAL

    International Nuclear Information System (INIS)

    Rutland, Christopher J.; Leach, Joshua

    2001-01-01

    The Combustion Corridor is a concept in which researchers in combustion and thermal sciences have unimpeded access to large volumes of remote computational results. This will enable remote, collaborative analysis and visualization of state-of-the-art combustion science results. The Engine Research Center (ERC) at the University of Wisconsin - Madison partnered with Lawrence Berkeley National Laboratory, Argonne National Laboratory, Sandia National Laboratory, and several other universities to build and test the first stages of a combustion corridor. The ERC served two important functions in this partnership. First, we work extensively with combustion simulations, so we were able to provide real-world research data sets for testing the Corridor concepts. Second, the ERC was part of an extension of the high-bandwidth DOE National Laboratory connections to universities.

  13. Quantum computational webs

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2010-01-01

    We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.

  14. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  15. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  16. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  17. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  18. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  19. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  20. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  1. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels...Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized...for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  2. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing makes parallel computing come into people’s lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and Map Reduce respectively. Finally, it compares the MPI and OpenMP models with Map Reduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
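    As a minimal illustration of the Map Reduce pattern discussed above (a sketch using only Python's standard library, not code from the paper), a word count can be split into a parallel map phase and a merging reduce phase, in contrast to the shared-memory (OpenMP-like) and message-passing (MPI) models:

        # Hedged sketch: MapReduce-style word count with a process pool.
        from collections import Counter
        from multiprocessing import Pool

        def map_phase(document):
            # map: emit (word, count) pairs for one input split
            return Counter(document.split())

        if __name__ == "__main__":
            documents = [
                "cloud computing grew out of parallel and distributed computing",
                "openmp targets shared memory while mpi targets message passing",
                "map reduce hides communication behind the map and reduce phases",
            ]
            with Pool(processes=3) as pool:
                partial_counts = pool.map(map_phase, documents)   # map phase in parallel
            total = sum(partial_counts, Counter())                # reduce phase: merge partials
            print(total.most_common(5))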

  3. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  4. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  5. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in the future in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  6. Navier-Stokes computer

    International Nuclear Information System (INIS)

    Hayder, M.E.

    1988-01-01

    A new scientific supercomputer, known as the Navier-Stokes Computer (NSC), has been designed. The NSC is a multi-purpose machine, and for applications in the field of computational fluid dynamics (CFD), this supercomputer is expected to yield a computational speed far exceeding that of present-day supercomputers. This computer has a few very powerful processors (known as nodes) connected by an internodal network. There are three versions of the NSC nodes: micro-, mini- and full-node. The micro-node was developed to prove, to demonstrate and to refine the key architectural features of the NSC. Architectures of the two recent versions of the NSC nodes are presented, with the main focus on the full-node. At a clock speed of 20 MHz, the mini- and the full-node have peak computational speeds of 200 and 640 MFLOPS, respectively. The full-node is the final version of the NSC nodes and an NSC is expected to have 128 full-nodes. To test the suitability of different algorithms on the NSC architecture, an NSC simulator was developed. Some of the existing computational fluid dynamics codes were placed on this simulator to determine important and relevant issues relating to the efficient use of the NSC architecture

  7. Essentials of Computational Electromagnetics

    CERN Document Server

    Sheng, Xin-Qing

    2012-01-01

    Essentials of Computational Electromagnetics provides an in-depth introduction to the three main full-wave numerical methods in computational electromagnetics (CEM); namely, the method of moments (MoM), the finite element method (FEM), and the finite-difference time-domain (FDTD) method. Numerous monographs can be found addressing one of the above three methods. However, few give a broad general overview of essentials embodied in these methods, or were published too early to include recent advances. Furthermore, many existing monographs only present the final numerical results without specifyin

  8. Improved Barriers to Turbine Engine Fragments: Final Annual Report

    National Research Council Canada - National Science Library

    Shockey, Donald

    2002-01-01

    This final annual technical report describes the progress made during year 4 of the SRI International Phase II effort to develop a computational capability for designing lightweight fragment barriers...

  9. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  10. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  11. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users are becoming more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs) such as limited battery lifetime, limited processing capabilities, and limited storage capacity by offloading the execution and workload to other, richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as offloading method and level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
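
    The surveyed frameworks differ in detail, but most offloading decisions reduce to comparing the estimated cost of local execution against remote execution plus data transfer. The sketch below is a minimal illustration of that trade-off, not any specific framework from the paper; the workload size, network bandwidth and processing-rate parameters are invented for the example.

        # Minimal offloading-decision sketch: offload a task when the estimated
        # remote time (upload + remote compute + download) beats local compute time.
        # All numbers here are hypothetical, chosen only to illustrate the trade-off.

        def should_offload(cycles, data_in_bytes, data_out_bytes,
                           local_hz, remote_hz, uplink_bps, downlink_bps):
            t_local = cycles / local_hz
            t_remote = (data_in_bytes * 8 / uplink_bps        # ship input to the cloud
                        + cycles / remote_hz                  # execute remotely
                        + data_out_bytes * 8 / downlink_bps)  # fetch the result
            return t_remote < t_local, t_local, t_remote

        offload, t_loc, t_rem = should_offload(
            cycles=2e9, data_in_bytes=500_000, data_out_bytes=50_000,
            local_hz=1.5e9, remote_hz=12e9, uplink_bps=5e6, downlink_bps=20e6)
        print(f"local {t_loc:.2f} s vs remote {t_rem:.2f} s -> offload={offload}")

    Real frameworks extend this kind of rule with energy models, partitioning at method or thread granularity, and runtime monitoring of network conditions, which is where the issues surveyed in the paper arise.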

  12. AIMES Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Jha, Shantenu [Rutgers Univ., New Brunswick, NJ (United States)

    2017-01-31

    Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications [1]. The survey established, arguably for the first time, the relationship between infrastructure and scientific distributed applications. It examined well known contributors to the complexity associated with infrastructure, such as inconsistent internal and external interfaces, and demonstrated the correlation with application brittleness. It discussed how infrastructure complexity reinforces the challenges inherent in developing distributed applications.

  13. The modification and application of RAMS computer code. Final report

    International Nuclear Information System (INIS)

    McKee, T.B.

    1995-01-01

    The Regional Atmospheric Modeling System (RAMS) has been utilized in its most updated form, version 3a, to simulate a case night from the Atmospheric Studies in COmplex Terrain (ASCOT) experimental program. ASCOT held a wintertime observational campaign during February 1991 to observe the often strong drainage flows which form on the Great Plains and in the canyons embedded within the slope from the Continental Divide to the Great Plains. A high-resolution (500 m grid spacing) simulation of the 4-5 February 1991 case night using the more advanced turbulence closure now available in RAMS 3a allowed greater analysis of the physical processes governing the drainage flows. It is found that shear interactions above and within the drainage flow are important, and are overpredicted with the new scheme at small grid spacing (< ∼1000 m). The implication is that contaminants trapped in nighttime stable flows such as these will be mixed too strongly in the vertical, reducing predicted ground concentrations. The HYPACT code has been added to the capability at LANL, although due to the reduced scope of work, no simulations with HYPACT were performed

  14. Computational modeling of drug-resistant bacteria. Final report

    International Nuclear Information System (INIS)

    2015-01-01

    Initial proposal summary: The evolution of antibiotic-resistant mutants among bacteria (superbugs) is a persistent and growing threat to public health. In many ways, we are engaged in a war with these microorganisms, where the corresponding arms race involves chemical weapons and biological targets. Just as advances in microelectronics, imaging technology and feature recognition software have turned conventional munitions into smart bombs, the long-term objectives of this proposal are to develop highly effective antibiotics using next-generation biomolecular modeling capabilities in tandem with novel subatomic feature detection software. Using model compounds and targets, our design methodology will be validated with correspondingly ultra-high resolution structure-determination methods at premier DOE facilities (single-crystal X-ray diffraction at Argonne National Laboratory, and neutron diffraction at Oak Ridge National Laboratory). The objectives and accomplishments are summarized.

  15. Computational modeling of drug-resistant bacteria. Final report

    Energy Technology Data Exchange (ETDEWEB)

    MacDougall, Preston [Middle Tennessee State Univ., Murfreesboro, TN (United States)

    2015-03-12

    Initial proposal summary: The evolution of antibiotic-resistant mutants among bacteria (superbugs) is a persistent and growing threat to public health. In many ways, we are engaged in a war with these microorganisms, where the corresponding arms race involves chemical weapons and biological targets. Just as advances in microelectronics, imaging technology and feature recognition software have turned conventional munitions into smart bombs, the long-term objectives of this proposal are to develop highly effective antibiotics using next-generation biomolecular modeling capabilities in tandem with novel subatomic feature detection software. Using model compounds and targets, our design methodology will be validated with correspondingly ultra-high resolution structure-determination methods at premier DOE facilities (single-crystal X-ray diffraction at Argonne National Laboratory, and neutron diffraction at Oak Ridge National Laboratory). The objectives and accomplishments are summarized.

  16. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    John Ross

    2003-04-30

    The Final Technical Report summarizes research accomplishments and publications for the period 5/1/99 to 4/30/03 under the grant. Extensive progress was made in the period covered by this report in the areas of chemical kinetics of non-linear systems; spatial structures and reaction-diffusion systems; and the thermodynamic and stochastic theory of electrochemical and general systems.

  17. Regional final energy consumptions

    International Nuclear Information System (INIS)

    2011-01-01

    This report comments on the differences observed between the French regions, and between these regions and national data, in terms of final energy consumption per inhabitant, per GDP unit, and per sector (housing and office building, transport, industry, agriculture). It also comments on the evolution over the last decades and identifies the most recent trends.

  18. Deep inelastic final states

    International Nuclear Information System (INIS)

    Girardi, G.

    1980-11-01

    In these lectures we attempt to describe the final states of deep inelastic scattering as given by QCD. In the first section we briefly comment on the parton model and give the main properties of decay functions which are of interest for the study of semi-inclusive leptoproduction. The second section is devoted to the QCD approach to single-hadron leptoproduction. First we recall basic facts on QCD logs and then derive the evolution equations for the fragmentation functions. For this purpose we make a short detour into e⁺e⁻ annihilation. The rest of the section is a study of the factorization of long-distance effects associated with the initial and final states. We then show how, when one includes next-to-leading QCD corrections, one induces factorization breaking, and describe the double moments useful for testing such effects. The next section contains a review of the QCD jets in the hadronic final state. We begin by introducing the notion of an infrared-safe variable and defining a few useful examples. Distributions in these variables are studied to first order in QCD, with some comments on the resummation of logs encountered in higher orders. Finally, the last section is a 'gallimaufry' of jet studies

  19. The 'final order' problem

    NARCIS (Netherlands)

    Teunter, RH; Haneveld, WKK

    1998-01-01

    When the service department of a company selling machines stops producing and supplying spare parts for certain machines, customers are offered an opportunity to place a so-called final order for these spare parts. We focus on one customer with one machine. The customer plans to use this machine up

  20. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

    Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  1. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  2. Final Report: Performance Engineering Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    This document is a final report about the work performed for cooperative agreement DE-FC02-06ER25764, the Rice University effort of Performance Engineering Research Institute (PERI). PERI was an Enabling Technologies Institute of the Scientific Discovery through Advanced Computing (SciDAC-2) program supported by the Department of Energy's Office of Science Advanced Scientific Computing Research (ASCR) program. The PERI effort at Rice University focused on (1) research and development of tools for measurement and analysis of application program performance, and (2) engagement with SciDAC-2 application teams.

  3. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    de Szoeke, Simon P. [Oregon State Univ., Corvallis, OR (United States)

    2018-03-02

    The investigator and a DOE-supported student [1] performed vertical air velocity and microphysical fall velocity retrievals for VOCALS and CAP-MBL homogeneous clouds, [2] calculated in-cloud and cloud-top dissipation and its diurnal cycle for VOCALS, and [3] compared CAP-MBL Doppler cloud radar scenes with the automated classification of (Remillard et al. 2012).

  4. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease with which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  5. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  6. Cassini's Grand Finale Overview

    Science.gov (United States)

    Spilker, L. J.

    2017-12-01

    After 13 years in orbit, the Cassini-Huygens Mission to Saturn ended in a science-rich blaze of glory. Cassini sent back its final bits of unique science data on September 15, 2017, as it plunged into Saturn's atmosphere, vaporizing and satisfying planetary protection requirements. Cassini's final phase covered roughly ten months and ended with the first-time exploration of the region between the rings and the planet. In late 2016 Cassini transitioned to a series of 20 Ring Grazing orbits with periapses just outside Saturn's F ring, providing close flybys of tiny ring moons, including Pan, Daphnis and Atlas, and high-resolution views of Saturn's A and F rings. A final Titan flyby in late April 2017 propelled Cassini across Saturn's main rings and into its Grand Finale orbits. Over 22 orbits, Cassini repeatedly dove between Saturn's innermost rings and upper atmosphere to answer fundamental questions unattainable earlier in the mission. The last orbit turned the spacecraft into the first Saturn atmosphere probe. The Grand Finale orbits provided the highest-resolution observations of both the rings and Saturn, and in-situ sampling of the ring particle composition, Saturn's atmosphere, plasma, and innermost radiation belts. The gravitational field was measured to unprecedented accuracy, providing information on the interior structure of the planet, winds in the deeper atmosphere, and the mass of the rings. The magnetic field provided insight into the physical nature of the magnetic dynamo and the structure of the internal magnetic field. The ion and neutral mass spectrometer sampled the upper atmosphere for molecules that escape the atmosphere in addition to molecules originating from the rings. The cosmic dust analyzer directly sampled the composition from different parts of the main rings for the first time. Fields and particles instruments directly measured the plasma environment between the rings and planet. Science highlights and new mysteries collected in the Grand

  7. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  8. CMS Is Finally Completed

    CERN Multimedia

    2008-01-01

    Yet another step in the completion of the Large Hadron Collider was taken yesterday morning, as the final element of the Compact Muon Solenoid was lowered nearly 100 meters below ground. After more than eight years of work at the world's most powerful particle accelerator, scientists hope that they will be able to start initial experiments with the LHC before the end of this year.

  9. Catarse e Final Feliz

    Directory of Open Access Journals (Sweden)

    Myriam Ávila

    2001-12-01

    Full Text Available Abstract: It is the certainty that nothing more, or nothing important, can happen after the end of a tale that allows catharsis to take place. While most narratives contain some kind of dénouement, in some of them it happens in an especially satisfying and affirmative way. The fairy tale is one of those narrative forms in which the cathartic effect is extreme and fulfils specific purposes, according to Bruno Bettelheim. Hollywood imitated this form as a strategy of seduction, initiating the tradition of the happy ending in cinema. Starting from the fairy tale Cinderella, in different versions, together with the homonymous Disney animation and two versions of the film Sabrina, a relationship is traced here between catharsis and the happy ending in fairy tales, as well as its use by the culture industry. Keywords: catharsis, fairy tales, Hollywood

  10. Final Project Report

    DEFF Research Database (Denmark)

    Workspace

    2003-01-01

    The primary focus of the WORKSPACE project was to augment the working environment through the development of spatial computing components, initially for members of the design professions, but with wider applicability to a range of work domains. The project interpreted the requirements of the Disappearing Computer to be that of "Augmenting reality", where "Augmented reality" meant: • Augmented user – positioning, visualising. • Augmented environment – panels, tables and site-pack. • Augmented artifacts – RFID, tagging, tracking. • Augmented communications – efficient exchange and integration of the above. The philosophy was to make the computer disappear by both making it large and embedding it into the environment (e.g. furniture). The project has successfully achieved its objectives, and has developed a range of demonstrator prototypes, some of which are in daily use by practitioners within...

  11. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
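
    The core of the CUDA model described above is that a kernel is executed by a grid of thread blocks, with each thread handling the element identified by a global index of the form blockIdx.x * blockDim.x + threadIdx.x. The plain-Python sketch below only emulates that indexing scheme on the CPU to make the mapping concrete; it is not CUDA code, and the block and grid sizes are arbitrary illustration values.

        # CPU emulation of CUDA-style grid/block/thread indexing for a vector add.
        # Real CUDA would launch vector_add<<<grid_dim, block_dim>>>(...) on the GPU;
        # here two nested loops stand in for the hardware scheduler.

        def vector_add_kernel(a, b, out, block_idx, block_dim, thread_idx):
            i = block_idx * block_dim + thread_idx   # global element index
            if i < len(out):                         # guard against the ragged last block
                out[i] = a[i] + b[i]

        n = 1000
        block_dim = 256
        grid_dim = (n + block_dim - 1) // block_dim  # enough blocks to cover n elements

        a = [float(i) for i in range(n)]
        b = [2.0 * i for i in range(n)]
        out = [0.0] * n

        for block_idx in range(grid_dim):             # "blocks" of the grid
            for thread_idx in range(block_dim):       # "threads" within a block
                vector_add_kernel(a, b, out, block_idx, block_dim, thread_idx)

        assert out[10] == a[10] + b[10]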

  12. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp 69-81.

  13. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  14. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and at some practical difficulties in building such a device.

  15. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  16. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  17. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and so requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product, and to introduce the successful application of soft computing techniques to solve many hard problems encountered during the design of embedded hardware. Reconfigurable em...

  18. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  19. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  20. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  1. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to a place in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved in the process of development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described. This process is connected with the evolution of computer and information technologies as well as the increase in the number of tasks for whose effective solution computational thinking is required. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts which are used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  2. DANAERO MW: Final Report

    DEFF Research Database (Denmark)

    Troldborg, Niels; Bak, Christian; Aagaard Madsen, Helge

    This report describes the results of the EUDP funded DANAERO MW II project carried out by DTU Wind Energy (formerly Risø DTU) and the industrial partners, LM Wind Power, Vestas Wind Systems A/S and Siemens Wind Power. An overview of the data available from the project as well as the results from...... analysis of the data is given with the main objective to explore in detail the influence of atmospheric and wake turbulence on MW turbine performance, loading and stability. Finally, validation and demonstration of simulation codes are carried out....

  3. Variation in radiotherapy target volume definition, dose to organs at risk and clinical target volumes using anatomic (computed tomography) versus combined anatomic and molecular imaging (positron emission tomography/computed tomography): intensity-modulated radiotherapy delivered using a tomotherapy Hi Art machine: final results of the VortigERN study.

    Science.gov (United States)

    Chatterjee, S; Frew, J; Mott, J; McCallum, H; Stevenson, P; Maxwell, R; Wilsdon, J; Kelly, C G

    2012-12-01

    Contrast-enhanced computed tomography (CECT) is the current standard for delineating tumours of the head and neck for radiotherapy. Although metabolic imaging with positron emission tomography (PET) has been used in recent years, the studies were non-confirmatory in establishing its routine role in radiotherapy planning in the modern era. This study explored the difference in gross tumour volume and clinical target volume definitions for the primary and nodal volumes when FDG PET/CT was used as compared with CECT in oropharyngeal cancer cases. Twenty patients with oropharyngeal cancers had a PET/CT scan in the treatment position after consent. Target volumes were defined on CECT scans by a consultant clinical oncologist who was blind to the PET scans. After obtaining input from a radiologist, another set of target volumes was outlined on the PET/CT data set. The gross and clinical target volumes as defined on the two data sets were then analysed. The hypotheses of more accurate target delineation, prevention of geographical miss, and comparable overlap volumes between CECT and PET/CT were explored. The study also analysed the volumes of intersection and whether there was any TNM stage migration when PET/CT was used as compared with CECT for planning. In 17 of 20 patients, the TNM stage was not altered when adding FDG PET information to CT. PET information prevented geographical miss in two patients and identified distant metastases in one case. PET/CT gross tumour volumes were smaller than CECT volumes (mean ± standard deviation: 25.16 cm³ ± 35.8 versus 36.56 cm³ ± 44.14), whereas the clinical target volumes (mean ± standard deviation: CECT versus PET/CT 32.48 cm³ ± 36.63 versus 32.21 cm³ ± 37.09; P > 0.86) were not statistically different. Similarity and discordance coefficients were calculated and are reported. PET/CT as compared with CECT could provide more clinically relevant information and prevent geographical miss when used for radiotherapy planning for advanced oropharyngeal
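
    The abstract refers to volumes of intersection and to similarity and discordance coefficients between the CECT- and PET/CT-defined volumes. As an illustration only (the study's own coefficient definitions and software are not given here), overlap indices of this kind are commonly computed from voxel masks as in the sketch below; the voxel sets, and the use of the Dice coefficient with discordance taken as its complement, are assumptions for the example.

        # Illustrative overlap metrics between two delineated volumes represented as
        # sets of voxel indices. The Dice similarity coefficient is one common choice;
        # the "discordance" shown here is simply the non-overlapping fraction.
        # This is a generic sketch, not the coefficients defined in the cited study.

        def overlap_metrics(vol_a, vol_b):
            vol_a, vol_b = set(vol_a), set(vol_b)
            intersection = len(vol_a & vol_b)
            dice = 2.0 * intersection / (len(vol_a) + len(vol_b))
            discordance = 1.0 - dice
            return intersection, dice, discordance

        # Hypothetical voxel index sets standing in for CECT- and PET/CT-based volumes.
        cect_voxels = range(0, 1000)      # 1000 voxels
        petct_voxels = range(200, 1100)   # 900 voxels, partially overlapping

        inter, dice, disc = overlap_metrics(cect_voxels, petct_voxels)
        print(f"intersection={inter} voxels, Dice={dice:.2f}, discordance={disc:.2f}")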

  4. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  5. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  6. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  7. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  8. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part shows the most important computational techniques: finite element formulation and boundary element formulation, and presents the solutions of viscoelastic problems with Abaqus.

  9. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  10. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  11. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary; (3) philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  12. Tsonis final project report

    Energy Technology Data Exchange (ETDEWEB)

    Duane, Greg [University of Wisconsin-Milwaukee, WI (United States); Tsonis, Anastasios [University of Wisconsin-Milwaukee, WI (United States); Kocarev, Ljupco [University of Wisconsin-Milwaukee, WI (United States); Tribbia, Joseph [University of Wisconsin-Milwaukee, WI (United States)

    2014-11-20

    This collaborative research has several components, but the main idea is that when imperfect copies of a given nonlinear dynamical system are coupled, they may synchronize for some set of coupling parameters. This idea is to be tested for several IPCC-like models, each one with its own formulation and representing an “imperfect” copy of the true climate system. By computing the coupling parameters which will lead the models to a synchronized state, a consensus on climate change simulations may be achieved.
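
    The underlying idea, synchronization of coupled imperfect copies of a nonlinear system, can be illustrated on a much smaller scale than an IPCC-class model. The sketch below couples two Lorenz systems whose parameters differ slightly and tracks how the final state mismatch shrinks as the coupling constant is increased; the parameter values, coupling form and integration scheme are illustrative choices, not those used by the project.

        # Two "imperfect copies" of the Lorenz system, nudged toward each other through
        # a simple linear coupling term k*(other - self). For k = 0 the trajectories
        # diverge; for sufficiently large k they approximately synchronize.
        import numpy as np

        def lorenz(state, sigma, rho, beta):
            x, y, z = state
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        def run(k, steps=20000, dt=0.001):
            a = np.array([1.0, 1.0, 1.0])      # copy 1: standard parameters
            b = np.array([1.1, 0.9, 1.2])      # copy 2: perturbed initial state
            for _ in range(steps):
                da = lorenz(a, 10.0, 28.0, 8.0 / 3.0) + k * (b - a)
                db = lorenz(b, 10.2, 27.5, 8.0 / 3.0) + k * (a - b)  # "imperfect" copy
                a, b = a + dt * da, b + dt * db                      # Euler step
            return np.linalg.norm(a - b)       # final state mismatch

        for k in (0.0, 1.0, 5.0):
            print(f"coupling k={k}: final |a-b| = {run(k):.3f}")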

  13. The final cool down

    CERN Multimedia

    On Thursday 29 May, the cool-down of the final sector (sector 4-5) of the LHC began, one week after the start of the cool-down of sector 1-2. It will take five weeks for the sectors to be cooled from room temperature to 5 K and a further two weeks to complete the cool-down to 1.9 K and the commissioning of cryogenic instrumentation, as well as to fine-tune the cryogenic plants and the cooling loops of cryostats. Nearly a year and a half has passed since sector 7-8 was cooled for the first time in January 2007. For Laurent Tavian, AT/CRG Group Leader, reaching the final phase of the cool-down is an important milestone, confirming the basic design of the cryogenic system and the ability to operate complete sectors. “All the sectors have to operate at the same time otherwise we cannot inject the beam into the machine. The stability and reliability of the cryogenic system and its utilities are now very important. That will be the new challenge for the coming months,” he explains. The status of the cool down of ...

  14. Final report for DESC0004031

    Energy Technology Data Exchange (ETDEWEB)

    Kitchin, John [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    2016-08-08

    In this project we aim to develop new multicomponent oxide-based electrocatalysts for the oxygen evolution reaction using combined theoretical and experimental approaches. We use density functional theory to compute the electronic structure and reactivity proxies of model oxide materials. From the understanding generated from these calculations, we synthesize materials and characterize their oxygen evolution activity. We use in situ spectroscopic methods to characterize oxide electrodes under reaction conditions. We also develop new data sharing strategies to facilitate the reuse of our data by others. Our work has several potential impacts of interest to DOE. First, the discovery of new oxygen evolution electrocatalysts directly affects the efficiency of many energy-related processes from hydrogen generation to air separation and electrochemical fuel synthesis. Second, we have identified new ways to promote the oxygen evolution reaction for some materials through the electrolyte. This opens new pathways to improving the efficiency of processes involving oxygen evolution. The ability to characterize electrodes under operating conditions enables new insights into the actual structure and composition of the materials, which we are finding are not the same as the as prepared materials. Finally, DOE has significant need and interest in improving the ability to share data among researchers.

  15. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
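
    The abstract frames simulation review as the application of a maturity assessment framework: a set of criteria, each assessed at some maturity level and compared against the levels the intended use requires. The sketch below is only a schematic rendering of that idea in code; the criteria names and required levels are invented and are not taken from the paper or from the NRC review it discusses.

        # Schematic maturity assessment: each criterion gets an assessed level (0-3)
        # and the framework specifies the minimum level required for the intended use.
        # Criteria names and thresholds are hypothetical illustrations.

        required_levels = {            # the "maturity assessment framework"
            "code verification": 2,
            "solution verification": 2,
            "validation evidence": 3,
            "uncertainty quantification": 2,
        }

        assessed_levels = {            # one simulation's "maturity assessment set"
            "code verification": 3,
            "solution verification": 2,
            "validation evidence": 2,
            "uncertainty quantification": 2,
        }

        shortfalls = {c: (assessed_levels[c], need)
                      for c, need in required_levels.items()
                      if assessed_levels[c] < need}

        if shortfalls:
            print("Review outcome: not yet trusted for this use; shortfalls:", shortfalls)
        else:
            print("Review outcome: maturity adequate for the intended use.")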

  16. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms: Learning Objectives, Preamble, High-Performance Computing, Parallel Computing, Distributed Computing, Cluster Computing, Grid Computing, Cloud Computing, Biocomputing, Mobile Computing, Quantum Computing, Optical Computing, Nanocomputing, Network Computing, Summary, Review Points, Review Questions, Further Reading; Cloud Computing Fundamentals: Learning Objectives, Preamble, Motivation for Cloud Computing, The Need for Cloud Computing, Defining Cloud Computing, NIST Definition of Cloud Computing, Cloud Computing Is a Service, Cloud Computing Is a Platform, 5-4-3 Principles of Cloud Computing, Five Essential Charact

  17. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  18. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  19. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  20. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  1. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., Tera grid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  2. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations and developments in transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of computer emission tomography, ECT, unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single photon emission tomography, is made. (author)

  3. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  4. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  5. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers, modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution is outlined.

  6. Final Scientific EFNUDAT Workshop

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    The Final Scientific EFNUDAT Workshop - organized by the CERN/EN-STI group on behalf of n_TOF Collaboration - will be held at CERN, Geneva (Switzerland) from 30 August to 2 September 2010 inclusive. EFNUDAT website: http://www.efnudat.eu. Topics of interest include: data evaluation; cross section measurements; experimental techniques; uncertainties and covariances; fission properties; current and future facilities. International Advisory Committee: C. Barreau (CENBG, France); T. Belgya (IKI KFKI, Hungary); E. Gonzalez (CIEMAT, Spain); F. Gunsing (CEA, France); F.-J. Hambsch (IRMM, Belgium); A. Junghans (FZD, Germany); R. Nolte (PTB, Germany); S. Pomp (TSL UU, Sweden). Workshop Organizing Committee: Enrico Chiaveri (Chairman); Marco Calviani; Samuel Andriamonje; Eric Berthoumieux; Carlos Guerrero; Roberto Losito; Vasilis Vlachoudis. Workshop Assistant: Géraldine Jean

  7. AIPM Final Report

    Energy Technology Data Exchange (ETDEWEB)

    John Mookken

    2006-06-30

    The final AIPM project report consists of six sections. Each section includes information on the original AIPM project and extension work on the high temperature design. The first section (1) provides an overview of the program and highlights the significant targets to meet at the end of the program. The next section (2) summarizes the significant technical accomplishments by the SEMIKRON AIPM team during the course of the project. Greater technical details are provided in a collection of all the quarterly reports, which can be found in the appendix. Section three (3) presents some of the more significant technical data collected from technology demonstrators. Section four (4) analyzes the manufacturing cost or economic aspects of producing 100,000 units/yr. Section five (5) describes the commercialization efforts of the AIPM technology into the automotive market. The last section (6) recommends follow-on work that will build on the efforts and achievements of the AIPM program.

  8. Chernobyl: the final warning

    International Nuclear Information System (INIS)

    Gale, R.P.; Hauser, Thomas.

    1988-01-01

    Following the Chernobyl accident in 1986, a book was written containing, first, an introduction to the basic principles and development of nuclear power, followed by a brief review of previous nuclear power plant accidents and then a short account of the Chernobyl accident itself. The main text of the book, however, contains the personal story of Dr. Robert Peter Gale, head of the Bone Marrow Transplant Unit at the UCLA Medical Center in Los Angeles, who travelled to Russia six times to help the victims of the Chernobyl accident. The final part of the book discusses the safety of nuclear power and the dangers of the proliferation of nuclear weapons. (U.K.)

  9. Cosmological Final Focus Systems

    International Nuclear Information System (INIS)

    Irwin, J

    2004-01-01

    We develop the many striking parallels between the dynamics of light streams from distant galaxies and particle beams in accelerator final focus systems. Notably the deflections of light by mass clumps are identical to the kicks arising from the long-range beam-beam interactions of two counter-rotating particle beams (known as parasitic crossings). These deflections have sextupolar as well as quadrupolar components. We estimate the strength of such distortions for a variety of circumstances and argue that the sextupolar distortions from clumping within clusters may be observable. This possibility is enhanced by the facts that (1) the sextupolar distortions of background galaxies is a factor of 5 smaller than the quadrupolar distortion, (2) the angular orientation of the sextupolar and quadrupolar distortions from a mass distribution would be correlated, appearing as a slightly curved image, (3) these effects should be spatially clumped on the sky

  10. Multimuon final states

    International Nuclear Information System (INIS)

    Crespo, J.-M.

    1980-04-01

    Multimuon final states have been detected by three experiments in the interactions of the muon beams of CERN (280 GeV) and FNAL (210 GeV) with heavy targets. For the first time, production of J/ψ (3100) by space-like photons has been observed and its dependence on ν, Q² and t compared to vector dominance and photon-gluon fusion models. A clear signal has also been seen for 3μ events above QED tridents (outside the J/ψ mass range) and for 2μ events, which are well described by charm production. An upper limit for the production of the T by high-energy muons has been set.

  11. Stardust Final Conference

    CERN Document Server

    Minisci, Edmondo; Summerer, Leopold; McGinty, Peter

    2018-01-01

    Space debris and asteroid impacts pose a very real, very near-term threat to Earth. In order to help study and mitigate these risks, the Stardust program was formed in 2013. This training and research network was devoted to developing and mastering techniques such as removal, deflection, exploitation, and tracking. This book is a collection of many of the topics addressed at the Final Stardust Conference, describing the latest in asteroid monitoring and how engineering efforts can help us reduce space debris. It is a selection of studies bringing together specialists from universities, research institutions, and industry, tasked with the mission of pushing the boundaries of space research with innovative ideas and visionary concepts. Topics covered by the Symposium: Orbital and Attitude Dynamics Modeling; Long Term Orbit and Attitude Evolution; Particle Cloud Modeling and Simulation; Collision and Impact Modelling and Simulation; Re-entry Modeling and Simulation; Asteroid Origins and Characterization; Orbit and A...

  12. Final technical report

    DEFF Research Database (Denmark)

    Juhl, Thomas Winther; Nielsen, Jakob Skov

    This project entails research with the goal of extending laser cutting of steel-based metals to thicknesses above 20 mm and laser powers in the 10 kW range, with adequate accuracy and economically viable cutting speeds. The technical approach is to develop mirror-based cutting heads with a truly coaxial gas jet chamber and laser beam path from the final focusing mirror. The project consists of three phases. Phase 1: fundamental studies of cutting front mechanisms, beam propagation, nozzle design and chemical reactions in the cut kerf, with special emphasis on high laser powers and thick sections. A cutting nozzle which can be adjusted independently of the laser beam has been developed. The position of the focus relative to the workpiece can be adjusted to cutting applications with relatively large processing windows, i.e. both mild and stainless steels, over a broad thickness range. A built-in auto...

  13. Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Small, R. Justin [Woods Hole Oceanographic Institution, MA (United States); Bryan, Frank [Woods Hole Oceanographic Institution, MA (United States); Tribbia, Joseph [Woods Hole Oceanographic Institution, MA (United States); Park, Sungsu [Woods Hole Oceanographic Institution, MA (United States); Dennis, John [Woods Hole Oceanographic Institution, MA (United States); Saravanan, R. [Woods Hole Oceanographic Institution, MA (United States); Schneider, Niklas [Woods Hole Oceanographic Institution, MA (United States); Kwon, Young-Oh [Woods Hole Oceanographic Institution, MA (United States)

    2015-06-01

    Most climate models are currently run with grid spacings of around 100 km, which, with today’s computing power, allows for long (up to 1000 year) simulations, or ensembles of simulations, to explore climate change and variability. However, this grid spacing does not resolve important components of the weather/climate system such as atmospheric fronts and mesoscale systems, and ocean boundary currents and eddies. The overall aim of this project has been to look at the effect of these small-scale features on the weather/climate system using a suite of high and low resolution climate models, idealized models and observations. This project was only possible due to the highly scalable aspect of the CAM Spectral Element dynamical core, and the significant resources allocated at Yellowstone and NERSC, for which we are grateful.

  14. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Jacquelyn Yanch

    2006-05-22

    This project involved the development of a method for in vivo prompt gamma neutron activation analysis for the investigation of Boron-10 distribution in a rabbit knee. The overall objective of this work was a robust approach for rapid screening of new {sup 10}B-labelled compounds to determine their suitability for use in the treatment of rheumatoid arthritis via Boron Neutron Capture Synovectomy (BNCS). For BNCS it is essential to obtain a compound showing high uptake levels in the synovium and long residence time in the joints. Previously the in vivo uptake behavior of potential compounds was evaluated in the arthritic knee joints of rabbits via extensive dissection studies. These studies are very labor-intensive and involve sacrificing large numbers of animals. An in vivo {sup 10}B screening approach was developed to provide initial evaluation of potential compounds. Only those compounds showing positive uptake and retention characteristics will be evaluated further via dissection studies. No further studies will be performed with compounds showing rapid clearance and/or low synovial uptake. Two approaches to in vivo screening were investigated using both simulation methods and experimentation. Both make use of neutron beams generated at the MIT Research Reactor. The first, Transmission Computed Tomography (TCT) was developed and tested but was eventually rejected due to very limited spatial resolution using existing reactor beams. The second, in vivo prompt gamma neutron activation analysis (IVPGNAA) was much more promising. IVPGNAA was developed using computer simulation and physical measurement coupled with image reconstruction techniques. The method was tested in arthritic New Zealand rabbits previously injected intra-articularly with three boron labeled compounds and shown to be effective in providing information regarding uptake level and residence time of {sup 10}B in the joint.

  15. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  16. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Aristos Aristidou Natureworks); Robert Kean (NatureWorks); Tom Schechinger (IronHorse Farms, Mat); Stuart Birrell (Iowa State); Jill Euken (Wallace Foundation & Iowa State)

    2007-10-01

    The two main objectives of this project were: 1) to develop and test technologies to harvest, transport, store, and separate corn stover to supply a clean raw material to the bioproducts industry, and 2) to engineer fermentation systems to meet performance targets for lactic acid and ethanol manufacturers. Significant progress was made in testing methods to harvest corn stover in a “single pass” harvest mode (collect corn grain and stover at the same time). This is technically feasible on a small scale, but additional equipment refinements will be needed to facilitate cost-effective harvest on a larger scale. Transportation models were developed which indicate that, at a corn stover yield of 2.8 tons/acre and a purchase price of $35/ton of stover, it would be unprofitable to transport stover more than about 25 miles, suggesting the development of many regional collection centers. Collection centers should therefore be located within about 30 miles of the farm to keep transportation costs at an acceptable level. These collection centers could then potentially do some preprocessing (to fractionate or increase bulk density) and/or ship the biomass by rail or barge to the final customers. Wet storage of stover via ensilage was tested, but no clear economic advantages were evident. Wet storage eliminates fire risk, but increases the complexity of component separation and may result in a small loss of carbohydrate content (fermentation potential). A study of possible supplier-producer relationships concluded that a “quasi-vertical” integration model would be best suited for new bioproducts industries based on stover. In this model, the relationship would involve a multiyear supply contract (processor with purchase guarantees, producer group with supply guarantees). Price will likely be fixed or calculated based on some formula (possibly a cost plus). Initial quality requirements will be specified (but subject to refinement). Producers would invest in harvest
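
    A minimal sketch of the breakeven calculation implied by such a transportation model is given below. The linear cost form and the farm-gate and per-ton-mile cost figures are illustrative assumptions, not values from the report; only the $35/ton price and the roughly 25 mile result echo the abstract.

        # Hypothetical breakeven-distance sketch for corn stover transport.
        # The linear cost model, the farm-gate cost, and the cost per ton-mile
        # are assumptions for illustration only.

        def breakeven_distance_miles(price_per_ton, farm_gate_cost_per_ton, cost_per_ton_mile):
            """Distance at which hauling cost consumes the remaining margin per ton."""
            margin = price_per_ton - farm_gate_cost_per_ton
            return margin / cost_per_ton_mile

        if __name__ == "__main__":
            price = 35.0          # $/ton paid for stover (figure from the abstract)
            farm_gate = 22.5      # $/ton harvest and baling cost (assumed)
            per_ton_mile = 0.50   # $/ton-mile trucking cost (assumed)
            d = breakeven_distance_miles(price, farm_gate, per_ton_mile)
            print(f"Breakeven haul distance is about {d:.0f} miles")  # ~25 miles with these inputs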

  17. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
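
    For reference, the maximally entangled two-qubit (EPR/Bell) state behind the correlations discussed above can be written, in standard textbook notation not quoted from the review, as

        |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}} \left( |00\rangle + |11\rangle \right).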

  18. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  19. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two... ... up to a polynomial-time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority...
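
    For orientation only, the sketch below shows ordinary (non-verifiable) additive secret sharing over a prime field; the verifiable secret sharing studied in the thesis adds commitments and dispute handling on top of this basic idea, none of which is shown here. The modulus and share counts are arbitrary choices for the sketch.

        # Toy additive secret sharing over Z_p (illustration only; this is ordinary,
        # non-verifiable secret sharing -- VSS adds commitments so players can
        # verify their shares, which is not shown here).
        import secrets

        P = 2**61 - 1  # a large prime modulus (arbitrary choice for this sketch)

        def share(secret, n):
            """Split `secret` into n shares that sum to it modulo P."""
            shares = [secrets.randbelow(P) for _ in range(n - 1)]
            shares.append((secret - sum(shares)) % P)
            return shares

        def reconstruct(shares):
            return sum(shares) % P

        if __name__ == "__main__":
            s = 123456789
            sh = share(s, 5)
            assert reconstruct(sh) == s
            print("all 5 shares together reconstruct the secret:", reconstruct(sh))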

  20. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  1. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. Achieving this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  2. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Velasco, Mayda [Northwestern University

    2013-11-01

    This work is focused on the design and construction of novel beam diagnostics and instrumentation for charged particle accelerators required for the next generation of linear colliders. Our main interest is in non-invasive techniques. The Northwestern group of Velasco has been a member of the CLIC Test Facility 3 (CTF3) collaboration since 2003, and the beam instrumentation work is developed mostly at this facility. This 4 kW electron beam facility has a 25-170 MeV electron LINAC. CTF3 performed a set of dedicated measurements to finalize the development of our RF-pickup bunch length detectors. The RF-pickup based on mixers was fully commissioned in 2009 and the RF-pickup based on diodes was finished in time for the 2010-11 data taking. The analysis of all the data taken by the summer of 2010 was finished in time and presented at the main conference of the year, LINAC 2010 in Japan.

  3. Acoustic Separation Technology; FINAL

    International Nuclear Information System (INIS)

    Fred Ahrens; Tim Patterson

    2002-01-01

    Today's restrictive environmental regulations encourage paper mills to close their water systems. Closed water systems increase the level of contaminants significantly. Accumulations of solid suspensions are detrimental to both the papermaking process and the final products. To remove these solids, technologies such as flotation using dissolved air (DAF), centrifuging, and screening have been developed. Dissolved Air Flotation systems are commonly used to clarify whitewater. These passive systems use high pressure to dissolve air into whitewater. When the pressure is released, air micro-bubbles form and attach themselves to fibers and particles, which then float to the surface where they are mechanically skimmed off. There is an economic incentive to explore alternatives to the DAF technology to drive down the cost of whitewater processing and minimize the use of chemicals. The installed capital cost for a DAF system is significant and a typical DAF system takes up considerable space. An alternative approach, which is the subject of this project, involves a dual method combining the advantages of chemical flocculation and in-line ultrasonic clarification to efficiently remove flocculated contaminants from a water stream

  4. Computer Security at Nuclear Facilities

    International Nuclear Information System (INIS)

    Cavina, A.

    2013-01-01

    This series of slides presents the IAEA policy concerning the development of recommendations and guidelines for computer security at nuclear facilities. A document of the Nuclear Security Series dedicated to this issue is in the final stage prior to publication. It is the first IAEA document specifically addressing computer security. The document was necessary for three main reasons: first, not all national infrastructures have recognized and standardized computer security; second, existing international guidance is not industry specific and fails to capture some of the key issues; and third, the presence of more or less connected digital systems is increasing in the design of nuclear power plants. The security of computer systems must be based on a graded approach: the assignment of computer systems to different levels and zones should be based on their relevance to safety and security, and the risk assessment process should be allowed to feed back into and influence the graded approach.

  5. Bringing Advanced Computational Techniques to Energy Research

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  6. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’.

  7. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  8. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  9. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  10. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  11. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  12. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface or 'bus' driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed from a top-down approach, helpin

  13. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  14. Schedulability Analysis for Java Finalizers

    DEFF Research Database (Denmark)

    Bøgholm, Thomas; Hansen, Rene Rydhof; Søndergaard, Hans

    2010-01-01

    Java finalizers perform clean-up and finalisation of objects at garbage collection time. In real-time Java profiles the use of finalizers is either discouraged (RTSJ, Ravenscar Java) or even disallowed (JSR-302), mainly because of the unpredictability of finalizers and in particular their impact on the schedulability analysis. In this paper we show that a controlled scoped memory model results in a structured and predictable execution of finalizers, more reminiscent of C++ destructors than Java finalizers. Furthermore, we incorporate finalizers into a (conservative) schedulability analysis for Predictable Java programs. Finally, we extend the SARTS tool for automated schedulability analysis of Java bytecode programs to handle finalizers in a fully automated way.

  15. Computational synthetic geometry

    CERN Document Server

    Bokowski, Jürgen

    1989-01-01

    Computational synthetic geometry deals with methods for realizing abstract geometric objects in concrete vector spaces. This research monograph considers a large class of problems from convexity and discrete geometry including constructing convex polytopes from simplicial complexes, vector geometries from incidence structures and hyperplane arrangements from oriented matroids. It turns out that algorithms for these constructions exist if and only if arbitrary polynomial equations are decidable with respect to the underlying field. Besides such complexity theorems a variety of symbolic algorithms are discussed, and the methods are applied to obtain new mathematical results on convex polytopes, projective configurations and the combinatorics of Grassmann varieties. Finally algebraic varieties characterizing matroids and oriented matroids are introduced providing a new basis for applying computer algebra methods in this field. The necessary background knowledge is reviewed briefly. The text is accessible to stud...

  16. The Antares computing model

    Energy Technology Data Exchange (ETDEWEB)

    Kopper, Claudio, E-mail: claudio.kopper@nikhef.nl [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)

    2013-10-11

    Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.

  17. Optical computer switching network

    Science.gov (United States)

    Clymer, B.; Collins, S. A., Jr.

    1985-01-01

    The design for an optical switching system for minicomputers that uses an optical spatial light modulator such as a Hughes liquid crystal light valve is presented. The switching system is designed to connect 80 minicomputers coupled to the switching system by optical fibers. The system has two major parts: the connection system that connects the data lines by which the computers communicate via a two-dimensional optical matrix array and the control system that controls which computers are connected. The basic system, the matrix-based connecting system, and some of the optical components to be used are described. Finally, the details of the control system are given and illustrated with a discussion of timing.
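
    A toy model of the bookkeeping the control system performs ("which computers are connected") is sketched below. This is purely illustrative: the class, its methods, and the one-connection-per-port rule are assumptions made for the sketch and have nothing to do with the actual liquid crystal light valve hardware described in the paper.

        # Toy crossbar state for an N-port switch (illustration only).
        # A True entry at (i, j) means port i currently transmits to port j.

        class Crossbar:
            def __init__(self, n_ports=80):
                self.n = n_ports
                self.matrix = [[False] * n_ports for _ in range(n_ports)]

            def connect(self, src, dst):
                # Enforce at most one outgoing and one incoming connection per port.
                if any(self.matrix[src]) or any(row[dst] for row in self.matrix):
                    raise ValueError("port already in use")
                self.matrix[src][dst] = True

            def disconnect(self, src, dst):
                self.matrix[src][dst] = False

        xbar = Crossbar()
        xbar.connect(3, 17)   # computer 3 now talks to computer 17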

  18. Summer student final report

    CERN Document Server

    Guzik, Jakub

    2013-01-01

    During my time spent at CERN I worked under the Technology Department of CERN, in the Machine Protection and Electrical Integrity (MPE) Group. The MPE Group supports LHC operations and maintains state of the art technology for magnet circuit protection and interlock systems for the present and future accelerators, magnet test facilities and CERN hosted experiments[1]. As a member of Magnet Powering Interlocks & Software (TE-MPE-MS) section I was involved in three different projects and used not only CERN developed tools like FESA Framework, but also open source C++ frameworks, Google Test and Google Mock. I had a chance to work with Programmable Logic Controllers and real-time devices known as Front End Computers. I was part of a software developer team, and familiarized myself with the Scrum agile software development methodology. The description and results of my work are presented in three parts of this report. Each part describes a separate project created during my participation in the CERN Summer St...

  19. Brain Computer Interfaces for Enhanced Interaction with Mobile Robot Agents

    Science.gov (United States)

    2016-07-27

    Final Report: Brain Computer Interfaces for Enhanced Interaction with Mobile Robot Agents (reporting period 17-Sep-2013 to 16-Sep-2014; distribution unlimited). Brain Computer Interfaces (BCIs) show great potential in allowing humans to interact with computational environments in a...

  20. Computing Cosmic Cataclysms

    Science.gov (United States)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past few years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed.

  1. Computational materials design

    International Nuclear Information System (INIS)

    Snyder, R.L.

    1999-01-01

    Full text: Trial and error experimentation is an extremely expensive route to the development of new materials. The coming age of reduced defense funding will dramatically alter the way in which advanced materials have developed. In the absence of large funding we must concentrate on reducing the time and expense that the R and D of a new material consumes. This may be accomplished through the development of computational materials science. Materials are selected today by comparing the technical requirements to the materials databases. When existing materials cannot meet the requirements we explore new systems to develop a new material using experimental databases like the PDF. After proof of concept, the scaling of the new material to manufacture requires evaluating millions of parameter combinations to optimize the performance of the new device. Historically this process takes 10 to 20 years and requires hundreds of millions of dollars. The development of a focused set of computational tools to predict the final properties of new materials will permit the exploration of new materials systems with only a limited amount of materials characterization. However, to bound computational extrapolations, the experimental formulations and characterization will need to be tightly coupled to the computational tasks. The required experimental data must be obtained by dynamic, in-situ, very rapid characterization. Finally, to evaluate the optimization matrix required to manufacture the new material, very rapid in situ analysis techniques will be essential to intelligently monitor and optimize the formation of a desired microstructure. Techniques and examples for the rapid real-time application of XRPD and optical microscopy will be shown. Recent developments in the cross linking of the world's structural and diffraction databases will be presented as the basis for the future Total Pattern Analysis by XRPD. Copyright (1999) Australian X-ray Analytical Association Inc

  2. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...

  3. Green Cloud Computing: A Literature Survey

    Directory of Open Access Journals (Sweden)

    Laura-Diana Radu

    2017-11-01

    Cloud computing is a dynamic field of information and communication technologies (ICTs), introducing new challenges for environmental protection. Cloud computing technologies have a variety of application domains, since they offer scalability, are reliable and trustworthy, and offer high performance at relatively low cost. The cloud computing revolution is redesigning modern networking, and offering promising environmental protection prospects as well as economic and technological advantages. These technologies have the potential to improve energy efficiency and to reduce carbon footprints and (e-)waste. These features can transform cloud computing into green cloud computing. In this survey, we review the main achievements of green cloud computing. First, an overview of cloud computing is given. Then, recent studies and developments are summarized, and environmental issues are specifically addressed. Finally, future research directions and open problems regarding green cloud computing are presented. This survey is intended to serve as up-to-date guidance for research with respect to green cloud computing.

  4. Quantum computers: Definition and implementations

    International Nuclear Information System (INIS)

    Perez-Delgado, Carlos A.; Kok, Pieter

    2011-01-01

    The DiVincenzo criteria for implementing a quantum computer have been seminal in focusing both experimental and theoretical research in quantum-information processing. These criteria were formulated specifically for the circuit model of quantum computing. However, several new models for quantum computing (paradigms) have been proposed that do not seem to fit the criteria well. Therefore, the question is what are the general criteria for implementing quantum computers. To this end, a formal operational definition of a quantum computer is introduced. It is then shown that, according to this definition, a device is a quantum computer if it obeys the following criteria: Any quantum computer must consist of a quantum memory, with an additional structure that (1) facilitates a controlled quantum evolution of the quantum memory; (2) includes a method for information theoretic cooling of the memory; and (3) provides a readout mechanism for subsets of the quantum memory. The criteria are met when the device is scalable and operates fault tolerantly. We discuss various existing quantum computing paradigms and how they fit within this framework. Finally, we present a decision tree for selecting an avenue toward building a quantum computer. This is intended to help experimentalists determine the most natural paradigm given a particular physical implementation.

  5. World Cup Final

    Science.gov (United States)

    2006-01-01

    On July 9, hundreds of millions of fans worldwide will be glued to their television sets watching the final match of the 2006 FIFA World Cup, played in Berlin's Olympic stadium (Olympiastadion). The stadium was originally built for the 1936 Summer Olympics. The Olympic Stadium seats 76,000; its roof rises 68 meters over the seats and is made up of transparent panels that allow sunlight to stream in during the day. With its 14 spectral bands from the visible to the thermal infrared wavelength region, and its high spatial resolution of 15 to 90 meters (about 50 to 300 feet), ASTER images Earth to map and monitor the changing surface of our planet. ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra satellite. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products. The broad spectral coverage and high spectral resolution of ASTER provides scientists in numerous disciplines with critical information for surface mapping, and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance. The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate. Size: 12.1 by 15.9 kilometers (7.5 by 9.5 miles). Location: 52.5 degrees North latitude, 13.3 degrees East longitude. Orientation: North at top. Image Data: ASTER bands 3, 2, and 1. Original Data Resolution: 15 meters (49.2 feet). Dates Acquired: October 15, 2005.

  6. MTX final report

    Energy Technology Data Exchange (ETDEWEB)

    Hooper, E.B. [ed.; Allen, S.L.; Brown, M.D.; Byers, J.A.; Casper, T.A.; Cohen, B.I.; Cohen, R.H.; Fenstermacher, M.E.; Foote, J.H.; Hoshino, K. [and others

    1994-01-01

    The MTX experiment was proposed in 1986 to apply high frequency microwaves generated by a free-electron laser (FEL) to electron cyclotron resonance heating (ECRH) in a high field, high density tokamak. As the absorption of microwaves at the electron cyclotron resonance requires high frequencies, the opportunity of applying a free-electron laser has appeal as the device is not limited to frequencies in the microwave or long millimeter wavelength regions, in contrast to many other sources. In addition, the FEL is inherently a high power source of microwaves, which would permit single units of 10 MW or more, optimum for reactors. Finally, it was recognized early in the study of the application of the FEL based on the induction linear accelerator, that the nonlinear effects associated with the intense pulses of microwaves naturally generated would offer several unique opportunities to apply ECRH to current drive, MHD control, and other plasma effects. It was consequently decided to adapt the induction accelerator based FEL to heating and controlling the tokamak, and to conduct experiments on the associated physics. To this end, the Alcator C tokamak was moved from the Massachusetts Institute of Technology (MIT) to the Lawrence Livermore National Laboratory where it was installed in Building 431 and operated from March, 1989, until the conclusion of the experiment in October, 1992. The FEL, based on the ETA-11 accelerator and IMP wiggler was brought into operation by the LLNL Electron Beam Group and power injected into the tokamak during an experimental run in the Fall, 1989. Following an upgrade by the MTX group, a second experimental run was made lasting from the Winter, 1992 through the end of the experiment. Significant contributions to the ECRH experiments were made by the Japan Atomic Energy Research Institute (JAERI).

  7. MTX final report

    International Nuclear Information System (INIS)

    Hooper, E.B.; Allen, S.L.; Brown, M.D.; Byers, J.A.; Casper, T.A.; Cohen, B.I.; Cohen, R.H.; Fenstermacher, M.E.; Foote, J.H.; Hoshino, K.

    1994-01-01

    The MTX experiment was proposed in 1986 to apply high frequency microwaves generated by a free-electron laser (FEL) to electron cyclotron resonance heating (ECRH) in a high field, high density tokamak. As the absorption of microwaves at the electron cyclotron resonance requires high frequencies, the opportunity of applying a free-electron laser has appeal as the device is not limited to frequencies in the microwave or long millimeter wavelength regions, in contrast to many other sources. In addition, the FEL is inherently a high power source of microwaves, which would permit single units of 10 MW or more, optimum for reactors. Finally, it was recognized early in the study of the application of the FEL based on the induction linear accelerator, that the nonlinear effects associated with the intense pulses of microwaves naturally generated would offer several unique opportunities to apply ECRH to current drive, MHD control, and other plasma effects. It was consequently decided to adapt the induction accelerator based FEL to heating and controlling the tokamak, and to conduct experiments on the associated physics. To this end, the Alcator C tokamak was moved from the Massachusetts Institute of Technology (MIT) to the Lawrence Livermore National Laboratory where it was installed in Building 431 and operated from March, 1989, until the conclusion of the experiment in October, 1992. The FEL, based on the ETA-11 accelerator and IMP wiggler was brought into operation by the LLNL Electron Beam Group and power injected into the tokamak during an experimental run in the Fall, 1989. Following an upgrade by the MTX group, a second experimental run was made lasting from the Winter, 1992 through the end of the experiment. Significant contributions to the ECRH experiments were made by the Japan Atomic Energy Research Institute (JAERI)

  8. Statistical Computing

    Indian Academy of Sciences (India)

    Sudhakar Kunte. Elements of statistical computing are discussed in this series, including inference and finite population sampling. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0...
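
    The record above is truncated, so the two methods the column describes cannot be recovered here. As an illustration of one classical approach to drawing pseudo-random numbers in [0, 1), a linear congruential generator is sketched below; the constants are the widely used "Numerical Recipes" parameters, chosen for illustration and not taken from the column.

        # Linear congruential generator returning floats in [0, 1).
        # Constants are the well-known "Numerical Recipes" LCG parameters;
        # this is an illustration only, not the method from the column above.

        class LCG:
            M = 2**32
            A = 1664525
            C = 1013904223

            def __init__(self, seed=42):
                self.state = seed % self.M

            def next(self):
                self.state = (self.A * self.state + self.C) % self.M
                return self.state / self.M  # uniform value in [0, 1)

        rng = LCG(seed=2024)
        print([round(rng.next(), 4) for _ in range(5)])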

  9. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  10. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  11. Quantum Computation

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 16; Issue 9. Quantum Computation - Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article Volume 16 Issue 9 September 2011 pp 821-835. Fulltext. Click here to view fulltext PDF. Permanent link:

  12. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  13. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)
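
    The critical event technique mentioned above advances the simulation directly from one collision to the next instead of stepping time in small increments. A minimal one-dimensional sketch of that idea is given below (a single ball bouncing elastically on a fixed floor under gravity); it is not the BOUNCE program itself, which handles many balls and a weighted piston.

        # Minimal event-driven ("critical event") bounce: a ball falling under gravity
        # and rebounding elastically off the floor at y = 0. Instead of small time
        # steps, we jump straight to the next floor impact. Illustration only.
        import math

        G = 9.81  # m/s^2

        def time_to_floor(y, v):
            """Positive root of y + v*t - 0.5*G*t^2 = 0, i.e. the next impact time."""
            return (v + math.sqrt(v * v + 2.0 * G * y)) / G

        y, v, t = 10.0, 0.0, 0.0   # initial height (m), upward velocity (m/s), clock
        for bounce in range(5):
            dt = time_to_floor(y, v)
            t += dt
            v = -(v - G * dt)      # elastic rebound: flip the impact velocity
            y = 0.0
            print(f"bounce {bounce + 1} at t = {t:.3f} s, rebound speed = {v:.3f} m/s")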

  14. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  15. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  16. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  17. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    W. C. Griffith

    2007-01-01

    In this project we provide an example of how to develop multi-tiered models that go across levels of biological organization to provide a framework for relating results of studies of low doses of ionizing radiation. This framework allows us to better understand how to extrapolate laboratory results to policy decisions, and to identify future studies that will increase confidence in policy decisions. In our application of the conceptual model we were able to move across multiple levels of biological assessment for rodents, going from the molecular to the organism level for in vitro and in vivo endpoints, and to relate these to human in vivo organism-level effects. We used the rich literature on the effects of ionizing radiation on the developing brain in our models. The focus of this report is on disrupted neuronal migration due to radiation exposure and the structural and functional implications of these early biological effects. The cellular mechanisms resulting in pathogenesis are most likely due to a combination of the three mechanisms mentioned. For the purposes of a computational model, quantitative studies of low dose radiation effects on migration of neuronal progenitor cells in the cerebral mantle of experimental animals were used. In this project we were able to show how results from studies of low doses of radiation can be used in a multidimensional framework to construct linked models of neurodevelopment using molecular, cellular, tissue, and organ level studies conducted both in vitro and in vivo in rodents. These models could also be linked to behavioral endpoints in rodents, which can be compared to available results in humans. The available data supported modeling to 10 cGy, with limited data available at 5 cGy. We observed gradual but non-linear changes as the doses decreased. For neurodevelopment it appears that the slope of the dose response decreases from 25 cGy to 10 cGy. Future studies of neurodevelopment should be able to better define the dose response in

  18. PRIMA-X Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Daniel [German Research School for Simulation Sciences GmbH, Aachen (Germany); Wolf, Felix [German Research School for Simulation Sciences GmbH, Aachen (Germany)

    2016-02-17

    The PRIMA-X (Performance Retargeting of Instrumentation, Measurement, and Analysis Technologies for Exascale Computing) project is the successor of the DOE PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing) project, which addressed the challenge of creating a core measurement infrastructure that would serve as a common platform for both integrating leading parallel performance systems (notably TAU and Scalasca) and developing next-generation scalable performance tools. The PRIMA-X project shifts the focus away from refactorization of robust performance tools towards a re-targeting of the parallel performance measurement and analysis architecture for extreme scales. The massive concurrency, asynchronous execution dynamics, hardware heterogeneity, and multi-objective prerequisites (performance, power, resilience) that identify exascale systems introduce fundamental constraints on the ability to carry forward existing performance methodologies. In particular, there must be a deemphasis of per-thread observation techniques to significantly reduce the otherwise unsustainable flood of redundant performance data. Instead, it will be necessary to assimilate multi-level resource observations into macroscopic performance views, from which resilient performance metrics can be attributed to the computational features of the application. This requires a scalable framework for node-level and system-wide monitoring and runtime analyses of dynamic performance information. Also, the interest in optimizing parallelism parameters with respect to performance and energy drives the integration of tool capabilities in the exascale environment further. Initially, PRIMA-X was a collaborative project between the University of Oregon (lead institution) and the German Research School for Simulation Sciences (GRS). Because Prof. Wolf, the PI at GRS, accepted a position as full professor at Technische Universität Darmstadt (TU

  19. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Bohdan W. Oppenheim; Rudolf Marloth

    2007-10-26

    Executive Summary: The document contains the Final Technical Report on the Industrial Assessment Center Program at Loyola Marymount University in Los Angeles, covering the contract period of 9/1/2002 to 11/30/2006, under the contract DE-FC36-02GO 12073. The Report describes six required program tasks, as follows: TASK 1 is a summary of the assessments performed over the life of the award: 77 assessments were performed and 595 ARs were recommended, covering a very broad range of manufacturing plants. TASK 2 is a description of the efforts to promote and increase the adoption of assessment recommendations and employ innovative methods to assist in accomplishing these goals. The LMU IAC has been very successful in accomplishing the program goals, including implemented savings of $5,141,895 in energy, $10,045,411 in productivity and $30,719 in waste, for a total of $15,218,025. This represents 44% of the recommended savings of $34,896,392. TASK 3 is a description of the efforts promoting the IAC Program and enhancing recruitment efforts for new clients and expanded geographic coverage. LMU IAC has been very successful recruiting new clients covering Southern California. Every year, the intended number of clients was recruited. TASK 4 describes the educational opportunities, training, and other related activities for IAC students. A total of 38 students graduated from the program, including 2-3 graduate students every semester, and the remainder undergraduate students, mostly from the Mechanical Engineering Department. The students received formal weekly training in energy (75%) and productivity (25%). All students underwent extensive safety training. All students praised the IAC experience very highly. TASK 5 describes the coordination and integration of the Center activities with other Center and IAC Program activities, and DOE programs. LMU IAC worked closely with MIT, SDSU IAC and SFSU IAC, and enthusiastically supported the SEN activities. TASK 6 describes other tasks

  20. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  1. Automated, computer interpreted radioimmunoassay results

    International Nuclear Information System (INIS)

    Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.

    1984-01-01

    90,000 radioimmunoassay results have been interpreted and transcribed automatically using software developed for use on a Hewlett Packard Model 1000 mini-computer system with conventional dot matrix printers. The computer program correlates the results of a combination of assays, interprets them and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient data base for radioassay laboratory results and to produce a computer generated interpretation of these results using an algorithm that produces normal and abnormal interpretations. Their laboratory assays 50,000 patient samples each year using 28 different radioassays. Of these, 85% have been interpreted using the computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalization of reports is still subject to change by the nuclear physician at the time of final review. Automated, computerized interpretations have realized cost savings through reduced personnel and personnel time and provided uniformity of the interpretations among the five physicians. Prior to computerization of interpretations, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians. Turnaround times for reports prior to the automated computer program generally were two to three days, whereas the computerized interpretation system generally allows reports to be issued the day assays are completed

  2. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is given, together with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)
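    To give a flavour of the kind of symbolic manipulation such systems automate, the sketch below computes the non-zero Christoffel symbols of the unit 2-sphere metric. It uses the SymPy library purely for illustration; SymPy is not one of the systems discussed in the record above, and the example is a toy, not a general-relativity workload.

```python
import sympy as sp

# Unit 2-sphere metric ds^2 = dtheta^2 + sin^2(theta) dphi^2 (a toy example).
theta, phi = sp.symbols('theta phi')
x = [theta, phi]
g = sp.Matrix([[1, 0], [0, sp.sin(theta) ** 2]])
ginv = g.inv()

def christoffel(a, b, c):
    """Gamma^a_{bc} = 1/2 g^{ad} (d_b g_{dc} + d_c g_{db} - d_d g_{bc})."""
    return sp.simplify(sum(
        sp.Rational(1, 2) * ginv[a, d] *
        (sp.diff(g[d, c], x[b]) + sp.diff(g[d, b], x[c]) - sp.diff(g[b, c], x[d]))
        for d in range(2)))

for a in range(2):
    for b in range(2):
        for c in range(2):
            gamma = christoffel(a, b, c)
            if gamma != 0:
                print(f"Gamma^{x[a]}_{x[b]}{x[c]} =", gamma)
```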

  3. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  4. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  5. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  6. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  7. Final Performance Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Houldin, Joseph [Delaware Valley Industrial Resource Center, Philadelphia, PA (United States); Saboor, Veronica [Delaware Valley Industrial Resource Center, Philadelphia, PA (United States)

    2016-03-30

    about assessing a company’s technical assets, broadening our view of the business to go beyond what they make or what NAICS code they have…to better understand their capacity, capability, and expertise, and to learn more about THEIR customers. Knowing more about the markets they serve can often provide insight into their level of technical knowledge and sophistication. Finally, in the spirit of realizing the intent of the Accelerator we strove to align and integrate the work and activities supported by the five funding agencies to leverage each effort. To that end, we include in the Integrated Work Plan a graphic that illustrates that integration. What follows is our summary report of the project, aggregated from prior reports.

  8. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  9. Parallel computers and three-dimensional computational electromagnetics

    International Nuclear Information System (INIS)

    Madsen, N.K.

    1994-01-01

    The authors have continued to enhance their ability to use new massively parallel processing computers to solve time-domain electromagnetic problems. New vectorization techniques have improved the performance of their code DSI3D by factors of 5 to 15, depending on the computer used. New radiation boundary conditions and far-field transformations now allow the computation of radar cross-section values for complex objects. A new parallel-data extraction code has been developed that allows the extraction of data subsets from large problems, which have been run on parallel computers, for subsequent post-processing on workstations with enhanced graphics capabilities. A new charged-particle-pushing version of DSI3D is under development. Finally, DSI3D has become a focal point for several new Cooperative Research and Development Agreement activities with industrial companies such as Lockheed Advanced Development Company, Varian, Hughes Electron Dynamics Division, General Atomic, and Cray

  10. IBM Cloud Computing Powering a Smarter Planet

    Science.gov (United States)

    Zhu, Jinzy; Fang, Xing; Guo, Zhe; Niu, Meng Hua; Cao, Fan; Yue, Shuang; Liu, Qin Yu

    With increasing need for intelligent systems supporting the world's businesses, Cloud Computing has emerged as a dominant trend to provide a dynamic infrastructure to make such intelligence possible. The article introduces how to build a smarter planet with cloud computing technology. First, it introduces why we need the cloud and the evolution of cloud technology. Second, it analyzes the value of cloud computing and how to apply cloud technology. Finally, it predicts the future of the cloud in the smarter planet.

  11. Using Computers in Fluids Engineering Education

    Science.gov (United States)

    Benson, Thomas J.

    1998-01-01

    Three approaches for using computers to improve basic fluids engineering education are presented. The use of computational fluid dynamics solutions to fundamental flow problems is discussed. The use of interactive, highly graphical software which operates on either a modern workstation or personal computer is highlighted. And finally, the development of 'textbooks' and teaching aids which are used and distributed on the World Wide Web is described. Arguments for and against this technology as applied to undergraduate education are also discussed.

  12. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization; it can also provide computing services according to actual need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  13. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  14. A primer on the energy efficiency of computing

    Energy Technology Data Exchange (ETDEWEB)

    Koomey, Jonathan G. [Research Fellow, Steyer-Taylor Center for Energy Policy and Finance, Stanford University (United States)

    2015-03-30

    The efficiency of computing at peak output has increased rapidly since the dawn of the computer age. This paper summarizes some of the key factors affecting the efficiency of computing in all usage modes. While there is still great potential for improving the efficiency of computing devices, we will need to alter how we do computing in the next few decades because we are finally approaching the limits of current technologies.

  15. TARGET 2 and Settlement Finality

    Directory of Open Access Journals (Sweden)

    Ivan MANGATCHEV

    2011-03-01

    Full Text Available This article examines how TARGET 2 as a system implements the idea of settlement finality regulated by Directive 98/26/EC of the European Parliament and of the Council of 19 May 1998 on settlement finality in payment and securities settlement systems (Settlement Finality Directive) and Directive 2009/44/EC of the European Parliament and of the Council of 6 May 2009 amending Directive 98/26/EC on settlement finality in payment and securities settlement systems and Directive 2002/47/EC on financial collateral arrangements as regards linked systems and credit claims (Directive 2009/44/EC). As the title of the article suggests, the focus is on TARGET 2 and the finality of settlement in this system.

  16. Computer loss experience and predictions

    Science.gov (United States)

    Parker, Donn B.

    1996-03-01

    crypto without control), Internet abuse (antisocial use of data communications), and international industrial espionage (governments stealing business secrets). A wide variety of safeguards are necessary to deal with these new crimes. The most powerful controls include (1) carefully controlled use of cryptography and digital signatures with good key management and overriding business and government decryption capability and (2) use of tokens such as smart cards to increase the strength of secret passwords for authentication of computer users. Jewelry-type security for small computers--including registration of serial numbers and security inventorying of equipment, software, and connectivity--will be necessary. Other safeguards include automatic monitoring of computer use and detection of unusual activities, segmentation and filtering of networks, special paper and ink for documents, and reduction of paper documents. Finally, international cooperation of governments to create trusted environments for business is essential.

  17. Final Report. Center for Scalable Application Development Software

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  18. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    fighters’ ability to execute the mission.” We run IT systems that: provide medical care, pay the warfighters, manage maintenance ... users • 1,400 applications • 18 facilities • 180 software vendors • 18,000+ copies of executive software products • virtually every type of mainframe and ...

  19. FOCUS: a fire management planning system -- final report

    Science.gov (United States)

    Frederick W. Bratten; James B. Davis; George T. Flatman; Jerold W. Keith; Stanley R. Rapp; Theodore G. Storey

    1981-01-01

    FOCUS (Fire Operational Characteristics Using Simulation) is a computer simulation model for evaluating alternative fire management plans. This final report provides a broad overview of the FOCUS system, describes its two major modules (fire suppression and cost), explains the role of gaming large fires in the system, and outlines the support programs and ways of...

  20. Review of tolerances at the Final Focus Test Beam

    International Nuclear Information System (INIS)

    Bulos, F.; Burke, D.; Helm, R.; Irwin, J.; Roy, G.; Yamamoto, N.

    1991-01-01

    The authors review the tolerances associated with the Final Focus Test Beam (FFTB). The authors have computed the acceptability window of the input beam for orbit jitter, emittance beta functions mismatch, incoming dispersion and coupling; tolerances on magnet alignment, strength and multipole content; and the initial tuneability capture of the line

  1. Review of tolerances at the Final Focus Test Beam

    International Nuclear Information System (INIS)

    Bulos, F.; Burke, D.; Helm, R.; Irwin, J.; Roy, G.; Yamamoto, N.

    1991-05-01

    We review the tolerances associated with the Final Focus Test Beam (FFTB). We have computed the acceptability window of the input beam for orbit jitter, emittance beta functions mismatch, incoming dispersion and coupling; tolerances on magnet alignment, strength and multipole content; and the initial tuneability capture of the line. 2 refs., 1 fig

  2. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  3. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  4. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
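    For readers unfamiliar with the mechanism being modelled, the sketch below is a minimal token-bucket policer of the classic kind this record builds its dynamic model around. The fill rate, bucket depth and packet sizes are illustrative values only and are not taken from the paper.

```python
import time

class TokenBucket:
    """Minimal token-bucket policer: tokens accrue at `rate` per second
    up to `capacity`; a packet conforms only if enough tokens remain."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # token fill rate (e.g. bytes per second)
        self.capacity = capacity  # bucket depth, i.e. the allowed burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_size: float) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed interval, capped at the bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_size:
            self.tokens -= packet_size
            return True   # conforming packet: admit it
        return False      # non-conforming packet: drop or mark it

if __name__ == "__main__":
    tb = TokenBucket(rate=1_000_000, capacity=100_000)  # assumed 1 MB/s rate, 100 kB burst
    for size in (40_000, 40_000, 40_000):               # a short burst of packets
        print(tb.allow(size))
```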

  5. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by different educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves for identifying learner readiness levels and defining the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of the basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process including basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered to be a good sample of planning for any subject, for the unpredicted convergence of visual and technical abilities with linguistic abilities.

  6. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justifies the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region of the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions) are useful tools

  7. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to have the computing assigned to a great number of distributed computers, rather than local computer ...

  8. Final focus system for TLC

    International Nuclear Information System (INIS)

    Oide, K.

    1988-11-01

    A limit of the chromaticity correction for the final focus system of a TeV Linear Collider (TLC) is investigated. As a result, it becomes possible to increase the aperture of the final doublet with a small increase of the horizontal β function. The new optics design uses a final doublet of 0.5 mm half-aperture and 1.4 T pole-tip field. The length of the system is reduced from 400 m to 200 m by several optics changes. Tolerances for various machine errors with this optics are also studied. 5 refs., 7 figs., 2 tabs

  9. Modeling Computer Virus and Its Dynamics

    Directory of Open Access Journals (Sweden)

    Mei Peng

    2013-01-01

    Full Text Available Based on the facts that a computer can be infected by infected computers and exposed computers, and that some of the computers which are in susceptible or exposed status can get immunity through antivirus ability, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, which is a threshold for the computer virus spreading in the internet, is determined. Second, this model has a virus-free equilibrium P0, which means that the infected part of the computers disappears and the virus dies out, and P0 is a globally asymptotically stable equilibrium if R0 < 1. If R0 > 1, then this model has only one viral equilibrium P*, which means that the computer virus persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
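    As a rough illustration of the threshold behaviour described above, the sketch below integrates a generic susceptible–exposed–infected model with a simple Euler scheme. The equations and parameter values are illustrative stand-ins, not the exact system or parameters from the paper.

```python
# Toy S-E-I computer-virus model integrated with Euler steps (illustrative only).
beta, sigma, gamma = 0.5, 0.2, 0.25   # assumed contact, latency and cure rates

def step(s, e, i, dt=0.1):
    """Advance the fractions of susceptible, exposed and infected hosts by dt."""
    ds = -beta * s * i
    de = beta * s * i - sigma * e
    di = sigma * e - gamma * i
    return s + dt * ds, e + dt * de, i + dt * di

s, e, i = 0.99, 0.0, 0.01             # initial fractions of hosts
for _ in range(int(200 / 0.1)):       # simulate 200 time units
    s, e, i = step(s, e, i)

R0 = beta / gamma                     # threshold quantity for this toy system
print(f"R0 = {R0:.2f}, infected fraction at t=200: {i:.4f}")
```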

  10. Finally

    Indian Academy of Sciences (India)

    Broadband in Rural India is not just about connectivity. It is about transforming rural areas of S. Asia.

  11. An introduction to geometric computation

    International Nuclear Information System (INIS)

    Nievergelt, J.

    1991-01-01

    Computational geometry has some appealing features that make it ideal for learning about algorithms and data structures: the problem statements are easily understood, intuitively meaningful, and mathematically rigorous; problem statement, solution, and every step of the construction have natural visual representations that support abstract thinking and help in detecting errors of reasoning; and finally, these algorithms are practical because it is easy to come up with examples where they can be applied. Figs

  12. When cloud computing meets bioinformatics: a review.

    Science.gov (United States)

    Zhou, Shuigeng; Liao, Ruiqi; Guan, Jihong

    2013-10-01

    In the past decades, with the rapid development of high-throughput technologies, biology research has generated an unprecedented amount of data. In order to store and process such a great amount of data, cloud computing and MapReduce were applied to many fields of bioinformatics. In this paper, we first introduce the basic concepts of cloud computing and MapReduce, and their applications in bioinformatics. We then highlight some problems challenging the applications of cloud computing and MapReduce to bioinformatics. Finally, we give a brief guideline for using cloud computing in biology research.
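    As a reminder of the programming model the review surveys, the sketch below counts k-mers in a handful of toy sequencing reads using separate map and reduce phases in plain Python. A real bioinformatics workload would run the same pattern on a framework such as Hadoop or Spark across many machines; the reads and helper names here are invented for illustration.

```python
from collections import defaultdict
from itertools import chain

def mapper(read, k=3):
    """Map phase: emit (k-mer, 1) pairs for one sequencing read."""
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def reducer(pairs):
    """Reduce phase: sum the counts for each k-mer key."""
    counts = defaultdict(int)
    for kmer, n in pairs:
        counts[kmer] += n
    return dict(counts)

reads = ["ACGTAC", "GTACGT", "ACGTTT"]   # toy reads, not real data
kmer_counts = reducer(chain.from_iterable(mapper(r) for r in reads))
print(kmer_counts)
```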

  13. Tracing monadic computations and representing effects

    Directory of Open Access Journals (Sweden)

    Maciej Piróg

    2012-02-01

    Full Text Available In functional programming, monads are supposed to encapsulate computations, effectfully producing the final result, but keeping to themselves the means of acquiring it. For various reasons, we sometimes want to reveal the internals of a computation. To make that possible, in this paper we introduce monad transformers that add the ability to automatically accumulate observations about the course of execution as an effect. We discover that if we treat the resulting trace as the actual result of the computation, we can find new functionality in existing monads, notably when working with non-terminating computations.
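    The paper itself works with Haskell monad transformers; as a loose, language-shifted analogy of the central idea, the Python sketch below threads an accumulated trace of observations alongside each intermediate value and treats the pair as the result of the computation. All names here are invented for illustration.

```python
# Loose analogy of Writer-style tracing: each computation returns (value, trace).
def traced(value, log=()):
    """Wrap a plain value as a traced computation with an (optional) trace."""
    return value, tuple(log)

def bind(m, f):
    """Sequence two traced computations, concatenating their traces."""
    value, log = m
    value2, log2 = f(value)
    return value2, log + log2

def double(x):
    return traced(2 * x, [f"doubled {x}"])

def increment(x):
    return traced(x + 1, [f"incremented {x}"])

result, trace = bind(bind(traced(5), double), increment)
print(result)   # 11
print(trace)    # ('doubled 5', 'incremented 10')
```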

  14. HINTS Puerto Rico: Final Report

    Science.gov (United States)

    This final report describes HINTS implementation in Puerto Rico. The report addresses sampling; staffing, training and management of data collection; calling protocol; findings from the CATI Operations, and sample weights.

  15. Smart roadside initiative : final report.

    Science.gov (United States)

    2015-09-01

    This is the Final Report for the Smart Roadside Initiative (SRI) prototype system deployment project. The SRI prototype was implemented at weigh stations in Grass Lake, Michigan and West Friendship, Maryland. The prototype was developed to integrate ...

  16. Tracking the PhD Students' Daily Computer Use

    Science.gov (United States)

    Sim, Kwong Nui; van der Meer, Jacques

    2015-01-01

    This study investigated PhD students' computer activities in their daily research practice. Software that tracks computer usage (Manic Time) was installed on the computers of nine PhD students, who were at their early, mid and final stage in doing their doctoral research in four different discipline areas (Commerce, Humanities, Health Sciences and…

  17. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for the Wolsong and Qinshan called the I A and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures

  18. Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Reeder, Richard [Stony Brook Univ., NY (United States); Phillips, Brian [Stony Brook Univ., NY (United States)

    2017-10-18

    A variety of calcifying organisms produce a transient or metastable amorphous calcium carbonate (ACC) precursor phase that is assembled and subsequently transformed into a crystalline biomineral, typically calcite or aragonite. The complex shapes, hierarchical structures, and unique physical properties of the biominerals that result from this calcification pathway have stimulated interest in adapting these concepts for the design and creation of bio-inspired functional materials in the laboratory. ACC also forms as a reactive precursor in diverse inorganic systems and is likely to play a much broader role in calcium carbonate formation. Knowledge of the structure, composition, and behavior of this metastable phase is critical for establishing a structural and mechanistic framework for calcium carbonate formation and its role in biogeochemical processes, including carbon cycling. Minor additives, such as magnesium, phosphorus, and organic macromolecules, are known to play important roles in controlling ACC stability, transformation kinetics, and selection of final crystalline polymorph. Molecular water also occurs in many types of ACC and is thought to play a structural role in its stability and transformation behavior. One of the major challenges that remain unresolved is identification of the structural basis for the role of these minor additives and molecular water. The absence of long-range order in ACC, and other amorphous phases, has posed a challenge for study by techniques commonly used for crystalline solids. Preliminary studies in our group show that the combination of two techniques, synchrotron X-ray-based pair distribution function (PDF) analysis and nuclear magnetic resonance (NMR) spectroscopy can provide entirely new insight to structural properties of synthetic ACC over length scales that are most relevant for understanding its transformation properties. Building on preliminary experiments, we propose a systematic study of synthesis, structure, and

  19. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Sinuses Computed tomography (CT) of the sinuses ... CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  20. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  1. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  2. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  3. Final Stage Development of Reactor Console Simulator

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Ridzuan Abdul Mutalib; Zareen Khan Abdul Jalil Khan; Mohd Khairulezwan Abdul Manan; Mohd Sabri Minhat; Nurfarhana Ayuni Joha

    2013-01-01

    The Reactor Console Simulator for the PUSPATI TRIGA Reactor has been under development since the end of 2011 and is now in the final stage of development. It will be an interactive tool for operator training and teaching on the PUSPATI TRIGA Reactor. The behavior and characteristics of the reactor console and of the reactor itself can be evaluated and understood. This simulator will be used as a complement to the actual present reactor console. The implementation of the human system interface (HSI) uses computer screens, a keyboard and a mouse. Multiple screens are used to match the physical layout of the present reactor console. LabVIEW software is used for the user interface and mathematical calculations. Polynomial equations based on control rod calibration data as well as operation parameter records are used to calculate and estimate reactor console parameters. The capabilities in user interface, reactor physics and thermal-hydraulics can be expanded and explored for simulation as well as modeling of a new reactor console, research reactors and nuclear power plants. (author)

  4. Neurons to algorithms LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Rothganger, Fredrick H.; Aimone, James Bradley; Warrender, Christina E.; Trumbo, Derek

    2013-09-01

    Over the last three years the Neurons to Algorithms (N2A) LDRD project team has built infrastructure to discover computational structures in the brain. This consists of a modeling language, a tool that enables model development and simulation in that language, and initial connections with the Neuroinformatics community, a group working toward similar goals. The approach of N2A is to express large complex systems like the brain as populations of discrete part types that have specific structural relationships with each other, along with internal and structural dynamics. Such an evolving mathematical system may be able to capture the essence of neural processing, and ultimately of thought itself. This final report is a cover for the actual products of the project: the N2A Language Specification, the N2A Application, and a journal paper summarizing our methods.

  5. GATE: Improving the computational efficiency

    International Nuclear Information System (INIS)

    Staelens, S.; De Beenhouwer, J.; Kruecker, D.; Maigne, L.; Rannou, F.; Ferrer, L.; D'Asseler, Y.; Buvat, I.; Lemahieu, I.

    2006-01-01

    GATE is a software dedicated to Monte Carlo simulations in Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). An important disadvantage of those simulations is the fundamental burden of computation time. This manuscript describes three different techniques in order to improve the efficiency of those simulations. Firstly, the implementation of variance reduction techniques (VRTs), more specifically the incorporation of geometrical importance sampling, is discussed. After this, the newly designed cluster version of the GATE software is described. The experiments have shown that GATE simulations scale very well on a cluster of homogeneous computers. Finally, an elaboration on the deployment of GATE on the Enabling Grids for E-Science in Europe (EGEE) grid will conclude the description of efficiency enhancement efforts. The three aforementioned methods improve the efficiency of GATE to a large extent and make realistic patient-specific overnight Monte Carlo simulations achievable
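    GATE's geometrical importance sampling is specific to particle transport, but the general variance-reduction idea can be shown on a toy problem: estimate a rare-event probability by sampling from a shifted distribution and reweighting by the likelihood ratio. Everything below (the target tail probability, the shift, the sample counts) is an illustrative assumption, not taken from GATE.

```python
import math
import random

# Toy rare-event problem: estimate P(X > 4) for X ~ N(0, 1) (true value ~3.17e-5).
def naive(n):
    return sum(1 for _ in range(n) if random.gauss(0, 1) > 4) / n

def importance(n, shift=4.0):
    """Sample from N(shift, 1) instead and reweight by the N(0,1)/N(shift,1) density ratio."""
    total = 0.0
    for _ in range(n):
        y = random.gauss(shift, 1)
        if y > 4:
            total += math.exp(-shift * y + shift ** 2 / 2)   # likelihood ratio at y
    return total / n

random.seed(0)
print("naive estimate:     ", naive(100_000))       # usually 0: almost no samples reach the tail
print("importance estimate:", importance(100_000))  # concentrates samples where they matter
```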

  6. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  7. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  8. Computation Directorate 2008 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  9. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  10. Cassini's Grand Finale Science Highlights

    Science.gov (United States)

    Spilker, Linda

    2017-10-01

    After 13 years in orbit, the Cassini-Huygens Mission to Saturn ended in a science-rich blaze of glory. Cassini returned its final bits of unique science data on September 15, 2017, as it plunged into Saturn's atmosphere satisfying planetary protection requirements. Cassini's Grand Finale covered a period of roughly five months and ended with the first time exploration of the region between the rings and planet.The final close flyby of Titan in late April 2017 propelled Cassini across Saturn’s main rings and into its Grand Finale orbits; 22 orbits that repeatedly dove between Saturn’s innermost rings and upper atmosphere making Cassini the first spacecraft to explore this region. The last orbit turned the spacecraft into the first Saturn upper atmospheric probe.The Grand Finale orbits provided highest resolution observations of both the rings and Saturn, and in-situ sampling of the ring particle composition, Saturn's atmosphere, plasma, and innermost radiation belts. The gravitational field was measured to unprecedented accuracy, providing information on the interior structure of the planet, winds in the deeper atmosphere, and mass of the rings. The magnetic field provided insight into the physical nature of the magnetic dynamo and structure of the internal magnetic field. The ion and neutral mass spectrometer sampled the upper atmosphere for molecules that escape the atmosphere in addition to molecules originating from the rings. The cosmic dust analyzer directly sampled the composition from different parts of the main rings for the first time. Fields and particles instruments directly measured the plasma environment between the rings and planet.Science highlights and new mysteries gleaned to date from the Grand Finale orbits will be discussed.The research described in this paper was carried out in part at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration. Copyright 2017

  11. Computation cluster for Monte Carlo calculations

    International Nuclear Information System (INIS)

    Petriska, M.; Vitazek, K.; Farkas, G.; Stacho, M.; Michalek, S.

    2010-01-01

    Two computation clusters based on the Rocks Clusters 5.1 Linux distribution, with Intel Core Duo and Intel Core Quad based computers, were built at the Department of Nuclear Physics and Technology. The clusters were used for Monte Carlo calculations, specifically for MCNP calculations applied in nuclear reactor core simulations. Optimization for computation speed was made on a hardware and software basis. Hardware cluster parameters, such as the size of the memory, network speed, CPU speed, number of processors per computation, and number of processors in one computer, were tested for shortening the calculation time. For software optimization, different Fortran compilers, MPI implementations and CPU multi-core libraries were tested. Finally, the computer cluster was used in finding the weighting functions of neutron ex-core detectors of the VVER-440. (authors)
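    Monte Carlo workloads of the kind described here parallelize almost perfectly, because independent particle histories can be distributed across processors and their tallies combined at the end. The sketch below illustrates that pattern on a trivial problem (estimating π) with Python's multiprocessing module; it is not MCNP, and the batch sizes are arbitrary.

```python
import random
from multiprocessing import Pool

def count_hits(args):
    """Count samples landing inside the unit quarter-circle for one independent batch."""
    n, seed = args
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 < 1.0)

if __name__ == "__main__":
    n_per_task, n_tasks = 1_000_000, 8
    with Pool() as pool:   # one worker per core; a cluster extends the same idea across nodes
        hits = sum(pool.map(count_hits, [(n_per_task, s) for s in range(n_tasks)]))
    print("pi estimate:", 4 * hits / (n_per_task * n_tasks))
```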

  12. Computation cluster for Monte Carlo calculations

    Energy Technology Data Exchange (ETDEWEB)

    Petriska, M.; Vitazek, K.; Farkas, G.; Stacho, M.; Michalek, S. [Dep. Of Nuclear Physics and Technology, Faculty of Electrical Engineering and Information, Technology, Slovak Technical University, Ilkovicova 3, 81219 Bratislava (Slovakia)

    2010-07-01

    Two computation clusters based on the Rocks Clusters 5.1 Linux distribution, with Intel Core Duo and Intel Core Quad based computers, were built at the Department of Nuclear Physics and Technology. The clusters were used for Monte Carlo calculations, specifically for MCNP calculations applied in Nuclear reactor core simulations. Optimization for computation speed was made on a hardware and software basis. Hardware cluster parameters, such as the size of the memory, network speed, CPU speed, number of processors per computation, and number of processors in one computer, were tested for shortening the calculation time. For software optimization, different Fortran compilers, MPI implementations and CPU multi-core libraries were tested. Finally, the computer cluster was used in finding the weighting functions of neutron ex-core detectors of the VVER-440. (authors)

  13. Implementation of cloud computing in higher education

    Science.gov (United States)

    Asniar; Budiawan, R.

    2016-04-01

    Cloud computing research is a new trend in distributed computing, where people have developed service- and SOA (Service Oriented Architecture)-based applications. This technology is very useful to implement, especially in higher education. This research studies the need for and feasibility of cloud computing in higher education, and then proposes a model of cloud computing services for higher education in Indonesia that can be implemented in order to support academic activities. A literature study is used as the research methodology to arrive at a proposed model of cloud computing in higher education. Finally, SaaS and IaaS are the cloud computing services proposed to be implemented in higher education in Indonesia, and a hybrid cloud is the service model that can be recommended.

  14. Use of cloud computing in biomedicine.

    Science.gov (United States)

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.

  15. Final disposal of radioactive wastes

    Energy Technology Data Exchange (ETDEWEB)

    Kroebel, R [Kernforschungszentrum Karlsruhe G.m.b.H. (Germany, F.R.). Projekt Wiederaufarbeitung und Abfallbehandlung; Krause, H [Kernforschungszentrum Karlsruhe G.m.b.H. (Germany, F.R.). Abt. zur Behandlung Radioaktiver Abfaelle

    1978-08-01

    This paper discusses the final disposal possibilities for radioactive wastes in the Federal Republic of Germany and the related questions of waste conditioning, storage methods and safety. The programs in progress in neighbouring CEC countries and in the USA are also mentioned briefly. The authors conclude that the existing final disposal possibilities are sufficiently well known and safe, but that they could be improved still further by future development work. The residual hazard potential of radioactive wastes from fuel reprocessing after about 1000 years of storage is lower than that of known inorganic ore deposits.

  16. Security and privacy in billing services in cloud computing

    OpenAIRE

    Μακρή, Ελένη - Λασκαρίνα

    2013-01-01

    The purpose of this master thesis is to define cloud computing and to introduce its basic principles. Firstly, the history of cloud computing will be briefly discussed, starting from the past and ending up to the current and future situation. Furthermore, the most important characteristics of cloud computing, such as security, privacy and cost, will be analyzed. Moreover the three service and three deployment models of cloud computing will be defined and analyzed with examples. Finally, the a...

  17. A RECIPE FOR LINEAR COLLIDER FINAL FOCUS SYSTEM DESIGN

    International Nuclear Information System (INIS)

    Seryi, Andrei

    2003-01-01

    The design of Final Focus systems for linear colliders is challenging because of the large demagnifications needed to produce nanometer-sized beams at the interaction point. Simple first- and second-order matrix matching have proven insufficient for this task, and minimization of third- and higher-order aberrations is essential. An appropriate strategy is required for the latter to be successful. A recipe for Final Focus design, and a set of computational tools used to implement this approach, are described herein. An example of the use of this procedure is given

  18. Bisphenol A; Final Test Rule

    Science.gov (United States)

    EPA is issuing a final rule, under section 4 of the Toxic Substances Control Act (TSCA), requiring manufacturers and processors of bisphenol A, hereinafter BPA (4,4’-isopropylidenediphenol, CAS No. 80-05-7), to conduct a 90-day inhalation study.

  19. MINIMARS conceptual design: Final report

    International Nuclear Information System (INIS)

    Lee, J.D.

    1986-09-01

    This volume contains the following sections: (1) fueling systems; (2) blanket; (3) alternative blanket concepts; (4) halo scraper/direct converter system study and final conceptual design; (5) heat-transport and power-conversion systems; (6) tritium systems; (7) minimars air detritiation system; (8) appropriate radiological safety design criteria; and (9) cost estimate

  20. SLC Final Performance and Lessons

    International Nuclear Information System (INIS)

    Phinney, Nan

    2000-01-01

    The Stanford Linear Collider (SLC) was the first prototype of a new type of accelerator, the electron-positron linear collider. Many years of dedicated effort were required to understand the physics of this new technology and to develop the techniques for maximizing performance. Key issues were emittance dilution, stability, final beam optimization and background control. Precision, non-invasive diagnostics were required to measure and monitor the beams throughout the machine. Beam-based feedback systems were needed to stabilize energy, trajectory, intensity and the final beam size at the interaction point. A variety of new tuning techniques were developed to correct for residual optical or alignment errors. The final focus system underwent a series of refinements in order to deliver sub-micron size beams. It also took many iterations to understand the sources of backgrounds and develop the methods to control them. The benefit from this accumulated experience was seen in the performance of the SLC during its final run in 1997-98. The luminosity increased by a factor of three to 3×10^30 and the 350,000 Z data sample delivered was nearly double that from all previous runs combined

  1. Final storage of radioactive waste

    International Nuclear Information System (INIS)

    Ziehm, Cornelia

    2015-01-01

    As explained in the present article, operators of nuclear power plants are responsible for the safe final disposal of the radioactive wastes they produce on the strength of the polluter pays principle. To shift the burden of responsibility for safe disposal to society as a whole would violate this principle and is therefore not possible. The polluter pays principle follows from more general principles of the fair distribution of benefits and burdens. Instances of its implementation are to be found in the national Atomic Energy Law as well as in the European Radioactive Waste and Spent Fuel Management Directive. The polluters in this case are in particular responsible for financing the installation and operation of final disposal sites. The reserves accumulated so far for the decommissioning and dismantling of nuclear power plants and disposal of radioactive wastes, including the installation and operation of final disposal sites, should be transferred to a public-law fund. This fund should be supplemented by the polluters to cover further foreseeable costs not covered by the reserves accumulated so far, including a realistic cost increase factor, appropriate risk reserves as well as the costs of the site selection procedure and a share in the costs for the safe closure of the final disposal sites of Morsleben and Asse II. This would merely be implementing in the sphere of atomic law what has long been standard practice in other areas of environmental law involving environmental hazards.

  2. Videoprocessing with the MSX-computer

    International Nuclear Information System (INIS)

    Vliet, G.J. van.

    1988-01-01

    This report deals with the processing of video images with a Philips MSX-2 computer and is directed specifically at the processing of the video signals of the beamviewers. The final purpose is to create an extra control function which may be used in tuning the beam. This control function is established by mixing the video signals with a reference image from the computer. 7 figs.

  3. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  4. Cloud Computing Security: A Survey

    Directory of Open Access Journals (Sweden)

    Issa M. Khalil

    2014-02-01

    Full Text Available Cloud computing is an emerging technology paradigm that migrates current technological and computing concepts into utility-like solutions similar to electricity and water systems. Clouds bring out a wide range of benefits including configurable computing resources, economic savings, and service flexibility. However, security and privacy concerns are shown to be the primary obstacles to a wide adoption of clouds. The new concepts that clouds introduce, such as multi-tenancy, resource sharing and outsourcing, create new challenges to the security community. Addressing these challenges requires, in addition to the ability to cultivate and tune the security measures developed for traditional computing systems, proposing new security policies, models, and protocols to address the unique cloud security challenges. In this work, we provide a comprehensive study of cloud computing security and privacy concerns. We identify cloud vulnerabilities, classify known security threats and attacks, and present the state-of-the-art practices to control the vulnerabilities, neutralize the threats, and calibrate the attacks. Additionally, we investigate and identify the limitations of the current solutions and provide insights into future security perspectives. Finally, we provide a cloud security framework in which we present the various lines of defense and identify the dependency levels among them. We identify 28 cloud security threats which we classify into five categories. We also present nine general cloud attacks along with various attack incidents, and provide an effectiveness analysis of the proposed countermeasures.

  5. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  6. Experience of final examination for master's degree in optical engineering

    Science.gov (United States)

    Ivanova, Tatiana; Ezhova, Kseniia; Bakholdin, Alexey; Tolstoba, Nadezhda; Romanova, Galina

    2015-10-01

    At the end of a master program it is necessary to measure students' knowledge and competences. The master's thesis is one way, but it measures deep knowledge in a quite narrow area. Another way to measure is an additional final examination that includes topics from the most important courses. In the Applied and Computer Optics Department of ITMO University such an examination includes theoretical questions and practical tasks from several courses in one examination. The theoretical section of the examination is written and the second section is practical. The practical section takes place in a laboratory with real equipment or with computer simulation. In the paper, examples of tasks for master programs and results of the examination are presented.

  7. Photovoltaic subsystem marketing and distribution model: programming manual. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1982-07-01

    Complete documentation of the marketing and distribution (M and D) computer model is provided. The purpose is to estimate the costs of selling and transporting photovoltaic solar energy products from the manufacturer to the final customer. The model adjusts for inflation and regional differences in marketing and distribution costs. The model consists of three major components: the marketing submodel, the distribution submodel, and the financial submodel. The computer program is explained, including the input requirements, output reports, subprograms and operating environment. The program specifications discuss maintaining the validity of the data and potential improvements. An example for a photovoltaic concentrator collector demonstrates the application of the model.
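
    To make the adjustment step concrete, the following minimal Python sketch escalates nominal marketing and distribution costs by an inflation factor and a regional cost factor. It illustrates the general idea only; the function names, default rates and example figures are assumptions, not values taken from the report.

      # Minimal sketch (not the report's actual model): adjusting nominal
      # marketing and distribution costs for inflation and region.

      def escalate(cost_base_year, annual_inflation, years):
          """Escalate a base-year cost by compound inflation."""
          return cost_base_year * (1.0 + annual_inflation) ** years

      def delivered_cost(marketing_cost, distribution_cost,
                         regional_factor=1.0, annual_inflation=0.03, years=0):
          """Combine marketing and distribution submodel outputs, scaled by a
          regional cost factor and escalated to the target year."""
          nominal = (marketing_cost + distribution_cost) * regional_factor
          return escalate(nominal, annual_inflation, years)

      # Example: $120/unit marketing, $80/unit distribution, a 10% higher-cost
      # region, escalated 5 years at 3% inflation.
      print(round(delivered_cost(120.0, 80.0, 1.10, 0.03, 5), 2))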

  8. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  9. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  10. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...
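
    As a concrete illustration of the benchmarking concept described above, the short Python sketch below times the same reference workload for two alternative implementations and reports their ratio. The workload and both implementations are purely illustrative and are not taken from the BONFIRE tool set.

      # Minimal sketch of the benchmarking idea: run the same reference
      # workload on two "solutions" and compare wall-clock times.
      import time

      def benchmark(fn, repeats=5):
          """Return the best wall-clock time over several runs of fn()."""
          best = float("inf")
          for _ in range(repeats):
              start = time.perf_counter()
              fn()
              best = min(best, time.perf_counter() - start)
          return best

      def workload_sum_genexpr():
          return sum(i * i for i in range(200_000))

      def workload_explicit_loop():
          total = 0
          for i in range(200_000):
              total += i * i
          return total

      t_a = benchmark(workload_sum_genexpr)
      t_b = benchmark(workload_explicit_loop)
      print(f"solution A: {t_a:.4f}s, solution B: {t_b:.4f}s, ratio {t_b / t_a:.2f}")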

  11. Final Empirical Test Case Specification

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    This document includes the empirical specification for the IEA task of evaluating building energy simulation computer programs for Double Skin Facade (DSF) constructions. There are two approaches involved in this procedure: one is the comparative approach and the other is the empirical one....

  12. DOE Matching Grant Program; FINAL

    International Nuclear Information System (INIS)

    Dr Marvin Adams

    2002-01-01

    OAK 270 - The DOE Matching Grant Program provided $50,000.00 to the Dept. of N.E. at TAMU, matching a gift of $50,000.00 from TXU Electric. The $100,000.00 total was spent on scholarships, departmental labs, and computing network

  13. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  14. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  15. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Later years brought new areas of interest concerning technical informatics related to soft computing and some more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  16. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  18. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  19. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  20. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  1. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    NREL uses computational modeling in its bioenergy research. Plant cell walls are the source of biofuels and biomaterials, and NREL's modeling investigates their properties. Quantum mechanical models are used to study chemical and electronic properties and processes to reduce barriers.

  2. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  3. Final disposal of radioactive waste

    Directory of Open Access Journals (Sweden)

    Freiesleben H.

    2013-06-01

    Full Text Available In this paper the origin and properties of radioactive waste as well as its classification scheme (low-level waste – LLW, intermediate-level waste – ILW, high-level waste – HLW are presented. The various options for conditioning of waste of different levels of radioactivity are reviewed. The composition, radiotoxicity and reprocessing of spent fuel and their effect on storage and options for final disposal are discussed. The current situation of final waste disposal in a selected number of countries is mentioned. Also, the role of the International Atomic Energy Agency with regard to the development and monitoring of international safety standards for both spent nuclear fuel and radioactive waste management is described.

  4. Plasma lenses for SLAC Final Focus Test facility

    International Nuclear Information System (INIS)

    Betz, D.; Cline, D.; Joshi, C.; Rajagopalan, S.; Rosenzweig, J.; Su, J.J.; Williams, R.; Chen, P.; Gundersen, M.; Katsouleas, T.; Norem, J.

    1991-01-01

    A collaborative group of accelerator and plasma physicists and engineers has formed with an interest in exploring the use of plasma lenses to meet the needs of future colliders. Analytic and computational models of plasma lenses are briefly reviewed and several design examples for the SLAC Final Focus Test Beam are presented. The examples include discrete, thick, and adiabatic lenses. A potential plasma source with desirable lens characteristics is presented

  5. A Monte Carlo program for generating hadronic final states

    International Nuclear Information System (INIS)

    Angelini, L.; Pellicoro, M.; Nitti, L.; Preparata, G.; Valenti, G.

    1991-01-01

    FIRST is a computer program to generate final states from high energy hadronic interactions using the Monte Carlo technique. It is based on a theoretical model in which the high degree of universality in such interactions is related to the existence of highly excited quark-antiquark bound states, called fire-strings. The program handles the decay of both fire-strings and unstable particles produced in the intermediate states. (orig.)

  6. NONLINEAR DYNAMICAL SYSTEMS - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Philip Holmes

    2005-12-31

    This document is the final report on the work completed on DE-FG02-95ER25238 since the start of the second renewal period: Jan 1, 2001. It supplements the annual reports submitted in 2001 and 2002. In the renewal proposal I envisaged work in three main areas: (1) analytical and topological tools for studying flows and maps; (2) low-dimensional models of fluid flow; and (3) models of animal locomotion. I describe the progress made on each project.

  7. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited controversies that last to this day. Along with the development of computer systems technology, computer viruses find new ways to spread themselves through a variety of existing communications media. This paper discusses several topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the state of computer viruses at this time; and ...

  8. Brain architecture: a design for natural computation.

    Science.gov (United States)

    Kaiser, Marcus

    2007-12-15

    Fifty years ago, John von Neumann compared the architecture of the brain with that of the computers he invented and which are still in use today. In those days, the organization of computers was based on concepts of brain organization. Here, we give an update on current results on the global organization of neural systems. For neural systems, we outline how the spatial and topological architecture of neuronal and cortical networks facilitates robustness against failures, fast processing and balanced network activation. Finally, we discuss mechanisms of self-organization for such architectures. After all, the organization of the brain might again inspire computer architecture.

  9. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  10. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will be, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in terms of price and infrastructure. It is the transition from the computer as a product to a service offered to consumers and delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  11. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
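
    The sketch below illustrates the routing idea in the abstract: when a link in the first network is reported defective, traffic is sent through the second, independent network instead. The topology, fault list and breadth-first routing are hypothetical simplifications for illustration, not the patented implementation.

      # Minimal sketch (hypothetical topology): when a link in the first data
      # communications network is identified as defective, route the message
      # through the second, independent network instead.
      from collections import deque

      def bfs_path(adjacency, src, dst):
          """Breadth-first search for a path of compute nodes from src to dst."""
          queue, seen = deque([[src]]), {src}
          while queue:
              path = queue.popleft()
              if path[-1] == dst:
                  return path
              for nxt in adjacency.get(path[-1], ()):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(path + [nxt])
          return None

      # Two independent networks over the same four compute nodes.
      network_a = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}      # chain
      network_b = {0: [2, 3], 1: [3], 2: [0], 3: [0, 1]}      # different wiring

      defective_links_a = {(1, 2), (2, 1)}                     # fault report

      def route(src, dst):
          path = bfs_path(network_a, src, dst)
          if path and not any((u, v) in defective_links_a
                              for u, v in zip(path, path[1:])):
              return "A", path
          return "B", bfs_path(network_b, src, dst)            # fall back to net B

      print(route(0, 3))   # routed around the defective link via network B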

  12. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  13. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  14. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Giannone, Carlos A.

    1999-01-01

    This chapter covers the capture and observation of images in computers, and the hardware and software used: personal computers, networks and workstations. The use of special filters determines the image quality

  15. Synchrotron Imaging Computations on the Grid without the Computing Element

    International Nuclear Information System (INIS)

    Curri, A; Pugliese, R; Borghes, R; Kourousias, G

    2011-01-01

    Besides the heavy use of the Grid in the Synchrotron Radiation Facility (SRF) Elettra, additional special requirements from the beamlines had to be satisfied through a novel solution that we present in this work. In the traditional Grid Computing paradigm the computations are performed on the Worker Nodes of the grid element known as the Computing Element. A Grid middleware extension that our team has been working on is that of the Instrument Element. In general it is used to Grid-enable instrumentation, and it can be seen as a neighbouring concept to that of the traditional Control Systems. As a further extension we demonstrate the Instrument Element as the steering mechanism for a series of computations. In our deployment it interfaces a Control System that manages a series of computationally demanding Scientific Imaging tasks in an online manner. The instrument control in Elettra is done through a suitable Distributed Control System, a common approach in the SRF community. The applications that we present are for a beamline working in medical imaging. The solution resulted in a substantial improvement of a Computed Tomography workflow. The near-real-time requirements could not have been easily satisfied by our Grid's middleware (gLite) due to the various latencies that often occurred during the job submission and queuing phases. Moreover the required deployment of a set of TANGO devices could not have been done in a standard gLite WN. Besides the avoidance of certain core Grid components, the Grid Security infrastructure has been utilised in the final solution.

  16. Touchable Computing: Computing-Inspired Bio-Detection.

    Science.gov (United States)

    Chen, Yifan; Shi, Shaolong; Yao, Xin; Nakano, Tadashi

    2017-12-01

    We propose a new computing-inspired bio-detection framework called touchable computing (TouchComp). Under the rubric of TouchComp, the best solution is the cancer to be detected, the parameter space is the tissue region at high risk of malignancy, and the agents are the nanorobots loaded with contrast medium molecules for tracking purpose. Subsequently, the cancer detection procedure (CDP) can be interpreted from the computational optimization perspective: a population of externally steerable agents (i.e., nanorobots) locate the optimal solution (i.e., cancer) by moving through the parameter space (i.e., tissue under screening), whose landscape (i.e., a prescribed feature of tissue environment) may be altered by these agents but the location of the best solution remains unchanged. One can then infer the landscape by observing the movement of agents by applying the "seeing-is-sensing" principle. The term "touchable" emphasizes the framework's similarity to controlling by touching the screen with a finger, where the external field for controlling and tracking acts as the finger. Given this analogy, we aim to answer the following profound question: can we look to the fertile field of computational optimization algorithms for solutions to achieve effective cancer detection that are fast, accurate, and robust? Along this line of thought, we consider the classical particle swarm optimization (PSO) as an example and propose the PSO-inspired CDP, which differs from the standard PSO by taking into account realistic in vivo propagation and controlling of nanorobots. Finally, we present comprehensive numerical examples to demonstrate the effectiveness of the PSO-inspired CDP for different blood flow velocity profiles caused by tumor-induced angiogenesis. The proposed TouchComp bio-detection framework may be regarded as one form of natural computing that employs natural materials to compute.
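
    For readers unfamiliar with the classical PSO algorithm that the PSO-inspired CDP builds on, the following minimal Python sketch runs textbook PSO on a toy two-dimensional landscape. The landscape, constants and swarm size are illustrative assumptions; the sketch does not model the paper's in vivo nanorobot propagation or control.

      # Minimal particle swarm optimization sketch on a toy 2-D landscape.
      import random

      def fitness(x, y):
          # Toy landscape whose optimum (the "target") is at (3, -2).
          return -((x - 3.0) ** 2 + (y + 2.0) ** 2)

      W, C1, C2 = 0.7, 1.5, 1.5            # inertia and acceleration constants
      swarm = [{"pos": [random.uniform(-10, 10), random.uniform(-10, 10)],
                "vel": [0.0, 0.0]} for _ in range(20)]
      for p in swarm:
          p["best"] = list(p["pos"])
      global_best = list(max((p["pos"] for p in swarm), key=lambda q: fitness(*q)))

      for _ in range(100):
          for p in swarm:
              for d in range(2):
                  r1, r2 = random.random(), random.random()
                  p["vel"][d] = (W * p["vel"][d]
                                 + C1 * r1 * (p["best"][d] - p["pos"][d])
                                 + C2 * r2 * (global_best[d] - p["pos"][d]))
                  p["pos"][d] += p["vel"][d]
              if fitness(*p["pos"]) > fitness(*p["best"]):
                  p["best"] = list(p["pos"])
              if fitness(*p["best"]) > fitness(*global_best):
                  global_best = list(p["best"])

      print("estimated optimum:", [round(v, 2) for v in global_best])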

  17. Exterior insulating shutter final prototype design. Final report, Phase II

    Energy Technology Data Exchange (ETDEWEB)

    Dike, G.A.; Kinney, L.F.

    1982-12-01

    The final prototype shutter described uses sliding panels composed of inch-thick Thermax sandwiched between 60 mil thick ultraviolet-resistant plastic on the outside and 20 mil styrene on the inside. The shutter system was shown to have an effective R-value of 6, using ASHRAE procedures to convert from still-air conditions to 15 mph wind conditions in a simulated cold environment. Tests were performed for cyclical operation, vulnerability to ice and wind, thermal performance, and air infiltration. Marketing efforts are described. Cost effectiveness is determined via present value analysis. (LEW)

  18. Metalcasting competitiveness research. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Piwonka, T.S.

    1994-08-01

    This report comprises eleven separate reports: prediction of non- metallic particle distribution, electromagnetic separation of inclusions from molten Al alloy, clean steel castings, waste stream identification and treatment, elastic wave lithotripsy for removal of ceramic from investment castings, metal penetration in sand molds, mold-metal interface gas composition, improved Alloy 718, specifications for iron oxide additions to no-bake sands, criteria functions for defect prediction, and computer-aided cooling curve analysis.

  19. Laser fusion study. Final report

    International Nuclear Information System (INIS)

    1975-06-01

    The following appendices are included: (1) sensor performance calculation techniques, (2) focus sensing, (3) purchased item data, (4) pointing and focusing configuration tradeoff studies, (5) false start centering sensor, (6) RCA application notes on quad detection, (7) elliptical flex pivot analysis, (8) servo mirror cross coupling, (9) optical misalignment analysis, (10) stress induced birefringent quarter-wave retarder, (11) data bulletin on Incramute damping alloy, (12) the utilization of stepping motors, and (13) computer program listing for stepper motor load simulation

  20. Computation for LHC experiments: a worldwide computing grid

    International Nuclear Information System (INIS)

    Fairouz, Malek

    2010-01-01

    In normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. The processing of all the consequent experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 bytes per second and a recording capacity of a few tens of 10^15 bytes each year. In order to meet this challenge, a computing network implying the dispatch and sharing of tasks has been set up. The W-LCG grid (Worldwide LHC Computing Grid) is made up of 4 tiers. Tier 0 is the computer center at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and for dispatching it to the 11 Tier 1 centers. A Tier 1 center is typically a national center; it is responsible for making a copy of the raw data and for processing it in order to recover relevant data with a physical meaning and to transfer the results to the 150 Tier 2 centers. A Tier 2 center is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of simulations. Tier 3 centers are at the level of the laboratories; they provide a complementary and local resource to Tier 2 in terms of data analysis. (A.C.)

  1. 9th Symposium on Computational Statistics

    CERN Document Server

    Mildner, Vesna

    1990-01-01

    Although no-one is, probably, too enthused about the idea, it is a fact that the development of most empirical sciences to a great extent depends on the development of data analysis methods and techniques, which, due to the necessity of applying computers for that purpose, means that it practically depends on the advancement and orientation of computational statistics. Every other year the International Association for Statistical Computing sponsors the organization of meetings of individuals professionally involved in computational statistics. Since these meetings attract professionals from all over the world, they are a good sample for the estimation of trends in this area, which some believe is statistics proper while others claim it is computer science. It seems, though, that an increasing number of colleagues treat it as an independent scientific or at least technical discipline. This volume contains six invited papers, 41 contributed papers and, finally, two papers which are, formally, softwa...

  2. Relativistic quantum chemistry on quantum computers

    DEFF Research Database (Denmark)

    Veis, L.; Visnak, J.; Fleig, T.

    2012-01-01

    The past few years have witnessed a remarkable interest in the application of quantum computing for solving problems in quantum chemistry more efficiently than classical computers allow. Very recently, proof-of-principle experimental realizations have been reported. However, so far only the nonrelativistic regime (i.e., the Schrödinger equation) has been explored, while it is well known that relativistic effects can be very important in chemistry. We present a quantum algorithm for relativistic computations of molecular energies. We show how to efficiently solve the eigenproblem of the Dirac-Coulomb Hamiltonian on a quantum computer and demonstrate the functionality of the proposed procedure by numerical simulations of computations of the spin-orbit splitting in the SbH molecule. Finally, we propose quantum circuits with three qubits and nine or ten controlled-NOT (CNOT) gates, which implement a proof...

  3. Structure problems in the analog computation

    International Nuclear Information System (INIS)

    Braffort, P.L.

    1957-01-01

    Recent mathematical developments have shown the importance of elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are put in evidence and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as adder triangles, integrators, phase inverters and function generators, are also studied. The analog method gives only functions of the variable time as the results of its computations, but the course of computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structures as analog computation. The structure analysis permits fruitful comparisons between the several domains of applied mathematics and suggests new important domains of application for the analog method. (M.P.)

  4. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

    In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Therefore, in manufacturing, statistical and computational techniques have found several applications, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  5. Dense image correspondences for computer vision

    CERN Document Server

    Liu, Ce

    2016-01-01

    This book describes the fundamental building-block of many new computer vision systems: dense and robust correspondence estimation. Dense correspondence estimation techniques are now successfully being used to solve a wide range of computer vision problems, very different from the traditional applications such techniques were originally developed to solve. This book introduces the techniques used for establishing correspondences between challenging image pairs, the novel features used to make these techniques robust, and the many problems dense correspondences are now being used to solve. The book provides information to anyone attempting to utilize dense correspondences in order to solve new or existing computer vision problems. The editors describe how to solve many computer vision problems by using dense correspondence estimation. Finally, it surveys resources, code, and data necessary for expediting the development of effective correspondence-based computer vision systems.

  6. Electromagnetic Compatibility Design of the Computer Circuits

    Science.gov (United States)

    Zitai, Hong

    2018-02-01

    Computers and the Internet have gradually penetrated every aspect of people's daily work. But with the improvement of electronic equipment as well as electrical systems, the electromagnetic environment has become much more complex. Electromagnetic interference has become an important factor hindering the normal operation of electronic equipment. In order to analyse the electromagnetic compatibility of computer circuits, this paper starts from computer electromagnetics and the concept of electromagnetic compatibility. Then, through the analysis of the main electromagnetic compatibility problems of computer circuits and systems, we show how to design computer circuits with electromagnetic compatibility in mind. Finally, the basic contents and methods of EMC testing are expounded in order to ensure the electromagnetic compatibility of equipment.

  7. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-01-01

    Finally, we take a step further by developing a novel feature selection method suitable for defining a computational framework capable of analyzing the genomic content of enhancers and reporting cell-line specific predictive signatures.

  8. Staging with computed tomography of patients with colon cancer

    DEFF Research Database (Denmark)

    Malmstrom, M. L.; Brisling, S.; Klausen, T. W.

    2018-01-01

    Purpose Accurate staging of colonic cancer is important for patient stratification. We aimed to correlate the diagnostic accuracy of preoperative computed tomography (CT) with final histopathology as reference standard. Methods Data was collected retrospectively on 615 consecutive patients operated...

  9. Computation of water hammer protection of modernized pumping station

    Directory of Open Access Journals (Sweden)

    Himr Daniel

    2014-03-01

    Finally, the pump trip was performed to verify whether the system worked correctly. The test showed that pressure pulsations are lower (better) than the computation predicted. This discrepancy was further analysed.

  10. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  11. Multi-Point Combustion System: Final Report

    Science.gov (United States)

    Goeke, Jerry; Pack, Spencer; Zink, Gregory; Ryon, Jason

    2014-01-01

    A low-NOx emission combustor concept has been developed for NASA's Environmentally Responsible Aircraft (ERA) program to meet N+2 emissions goals for a 70,000 lb thrust engine application. These goals include 75 percent reduction of LTO NOx from CAEP6 standards without increasing CO, UHC, or smoke from that of current state of the art. An additional key factor in this work is to improve lean combustion stability over that of previous work performed on similar technology in the early 2000s. The purpose of this paper is to present the final report for the NASA contract. This work included the design, analysis, and test of a multi-point combustion system. All design work was based on the results of Computational Fluid Dynamics modeling with the end results tested on a medium pressure combustion rig at the UC and a medium pressure combustion rig at GRC. The theories behind the designs, results of analysis, and experimental test data will be discussed in this report. The combustion system consists of five radially staged rows of injectors, where ten small scale injectors are used in place of a single traditional nozzle. Major accomplishments of the current work include the design of a Multipoint Lean Direct Injection (MLDI) array and associated air blast and pilot fuel injectors, which is expected to meet or exceed the goal of a 75 percent reduction in LTO NOx from CAEP6 standards. This design incorporates a reduced number of injectors over previous multipoint designs, simplified and lightweight components, and a very compact combustor section. Additional outcomes of the program are validation that the design of these combustion systems can be aided by the use of Computational Fluid Dynamics to predict and reduce emissions. Furthermore, the staging of fuel through the individually controlled radially staged injector rows successfully demonstrated improved low power operability as well as improvements in emissions over previous multipoint designs. Additional comparison

  12. Spacecraft fabrication and test MODIL. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Saito, T.T.

    1994-05-01

    This report covers the period from October 1992 through the close of the project. FY 92 closed out with the successful briefing to industry and with many potential and important initiatives in the spacecraft arena. Due to the funding uncertainties, we were directed to proceed as if our funding would be approximately the same as FY 92 ($2M), but not to make any major new commitments. However, the MODIL's FY 93 funding was reduced to $810K and we were directed to concentrate on the cryocooler area. The cryocooler effort completed its demonstration project. The final meetings with the cryocooler fabricators were very encouraging as we witnessed the enthusiastic reception of technology to help them reduce fabrication uncertainties. Support of the USAF Phillips Laboratory cryocooler program was continued including kick-off meetings for the Prototype Spacecraft Cryocooler (PSC). Under Phillips Laboratory support, Gill Cruz visited British Aerospace and Lucas Aerospace in the United Kingdom to assess their manufacturing capabilities. In the Automated Spacecraft & Assembly Project (ASAP), contracts were pursued for the analysis by four Brilliant Eyes prime contractors to provide a proprietary snap shot of their current status of Integrated Product Development. In the materials and structure thrust the final analysis was completed of the samples made under the contract ("Partial Automation of Matched Metal Net Shape Molding of Continuous Fiber Composites") to SPARTA. The Precision Technologies thrust funded the Jet Propulsion Laboratory to prepare a plan to develop a Computer Aided Alignment capability to significantly reduce the time for alignment and even possibly provide real time and remote alignment capability of systems in flight.

  13. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE database computer system

  14. Functional Programming in Computer Science

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Loren James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Marion Kei [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-19

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
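
    To illustrate the lambda-calculus foundation mentioned above, the short Python sketch below encodes Church numerals as higher-order functions and adds two of them by function application alone. It is a classroom-style illustration, unrelated to the internship's STG compiler work.

      # Illustrative sketch: the lambda calculus encodes data purely as
      # functions. Church numerals in Python show how numbers and addition
      # reduce to function application.
      def zero(f):
          return lambda x: x

      def successor(n):
          return lambda f: lambda x: f(n(f)(x))

      def add(m, n):
          return lambda f: lambda x: m(f)(n(f)(x))

      def to_int(church):
          """Convert a Church numeral to a Python int by counting applications."""
          return church(lambda k: k + 1)(0)

      one = successor(zero)
      two = successor(one)
      print(to_int(add(two, two)))   # prints 4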

  15. Basic concepts in computational physics

    CERN Document Server

    Stickler, Benjamin A

    2016-01-01

    This new edition is a concise introduction to the basic methods of computational physics. Readers will discover the benefits of numerical methods for solving complex mathematical problems and for the direct simulation of physical processes. The book is divided into two main parts: deterministic methods and stochastic methods in computational physics. Based on concrete problems, the first part discusses numerical differentiation and integration, as well as the treatment of ordinary differential equations. This is extended by a brief introduction to the numerics of partial differential equations. The second part deals with the generation of random numbers, summarizes the basics of stochastics, and subsequently introduces Monte-Carlo (MC) methods. Specific emphasis is on Markov chain MC algorithms. The final two chapters discuss data analysis and stochastic optimization. All this is again motivated and augmented by applications from physics. In addition, the book offers a number of appendices to provide the read...
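
    As a minimal example of the Monte-Carlo methods introduced in the book's second part, the sketch below estimates pi by uniform sampling of the unit square. The sample size and seed are arbitrary choices for illustration.

      # Monte-Carlo sketch: estimate pi by sampling uniform random points in
      # the unit square and counting those inside the quarter circle.
      import random

      def estimate_pi(n_samples=100_000, seed=42):
          rng = random.Random(seed)
          inside = 0
          for _ in range(n_samples):
              x, y = rng.random(), rng.random()
              if x * x + y * y <= 1.0:
                  inside += 1
          return 4.0 * inside / n_samples

      print(estimate_pi())   # approaches 3.14159... as n_samples grows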

  16. Open Compute Project at CERN

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Open Compute Project, OCP ( http://www.opencompute.org/), was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at lowest possible cost. The technologies are released as open hardware design, with the goal to develop servers and data centers following the model traditionally associated with open source software projects. We have been following the OCP project for some time and decided to buy two OCP twin servers in 2013 to get some hands-on experience. The servers have been tested and compared with our standard hardware regularly acquired through large tenders. In this presentation we will give some relevant results from this testing and also discuss some of the more important differences that can matter for a larger deployment at CERN. Finally it will outline the details for a possible project for a larger deployment of OCP hardware for production use at CERN.

  17. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  18. LDRD 149045 final report distinguishing documents.

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Scott A.

    2010-09-01

    This LDRD 149045 final report describes work that Sandians Scott A. Mitchell, Randall Laviolette, Shawn Martin, Warren Davis, Cindy Philips and Danny Dunlavy performed in 2010. Prof. Afra Zomorodian provided insight. This was a small late-start LDRD. Several other ongoing efforts were leveraged, including the Networks Grand Challenge LDRD and the Computational Topology CSRF project, and some of the leveraged work is described here. We proposed a sentence mining technique that exploited both the distribution and the order of parts-of-speech (POS) in sentences in English language documents. The ultimate goal was to be able to discover 'call-to-action' framing documents hidden within a corpus of mostly expository documents, even if the documents were all on the same topic and used the same vocabulary. Using POS was novel. We also took a novel approach to analyzing POS. We used the hypothesis that English follows a dynamical system and the POS are trajectories from one state to another. We analyzed the sequences of POS using support vector machines and the cycles of POS using computational homology. We discovered that the POS were a very weak signal and did not support our hypothesis well. Our original goal appeared to be unobtainable with our original approach. We turned our attention to study an aspect of a more traditional approach to distinguishing documents. Latent Dirichlet Allocation (LDA) turns documents into bags-of-words and then into mixture-model points. A distance function is used to cluster groups of points to discover relatedness between documents. We performed a geometric and algebraic analysis of the most popular distance functions and made some significant and surprising discoveries, described in a separate technical report.

  19. Quantum computers in phase space

    International Nuclear Information System (INIS)

    Miquel, Cesar; Paz, Juan Pablo; Saraceno, Marcos

    2002-01-01

    We represent both the states and the evolution of a quantum computer in phase space using the discrete Wigner function. We study properties of the phase space representation of quantum algorithms: apart from analyzing important examples, such as the Fourier transform and Grover's search, we examine the conditions for the existence of a direct correspondence between quantum and classical evolutions in phase space. Finally, we describe how to measure directly the Wigner function in a given phase-space point by means of a tomographic method that, itself, can be interpreted as a simple quantum algorithm

  20. Python and computer vision

    Energy Technology Data Exchange (ETDEWEB)

    Doak, J. E. (Justin E.); Prasad, Lakshman

    2002-01-01

    This paper discusses the use of Python in a computer vision (CV) project. We begin by providing background information on the specific approach to CV employed by the project. This includes a brief discussion of Constrained Delaunay Triangulation (CDT), the Chordal Axis Transform (CAT), shape feature extraction and syntactic characterization, and normalization of strings representing objects. (The terms 'object' and 'blob' are used interchangeably, both referring to an entity extracted from an image.) The rest of the paper focuses on the use of Python in three critical areas: (1) interactions with a MySQL database, (2) rapid prototyping of algorithms, and (3) gluing together all components of the project including existing C and C++ modules. For (1), we provide a schema definition and discuss how the various tables interact to represent objects in the database as tree structures. (2) focuses on an algorithm to create a hierarchical representation of an object, given its string representation, and an algorithm to match unknown objects against objects in a database. Finally, (3) discusses the use of Boost Python to interact with the pre-existing C and C++ code that creates the CDTs and CATs, performs shape feature extraction and syntactic characterization, and normalizes object strings. The paper concludes with a vision of the future use of Python for the CV project.
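
    A hypothetical sketch of the database interaction pattern described in (1) is given below. It uses Python's built-in sqlite3 module as a stand-in for the project's MySQL database, and the table, columns and shape strings are invented for illustration rather than taken from the actual schema.

      # Hypothetical sketch of storing normalized object strings and matching
      # an unknown object against them (sqlite3 used as a stand-in for MySQL).
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""CREATE TABLE objects (
                          id INTEGER PRIMARY KEY,
                          label TEXT,
                          shape_string TEXT)""")   # normalized syntactic string

      known_objects = [("wrench", "CAT:ab|ba|aab"), ("gear", "CAT:abab|bb")]
      conn.executemany("INSERT INTO objects (label, shape_string) VALUES (?, ?)",
                       known_objects)

      def match(unknown_shape_string):
          """Return labels of stored objects whose normalized string matches."""
          rows = conn.execute("SELECT label FROM objects WHERE shape_string = ?",
                              (unknown_shape_string,))
          return [label for (label,) in rows]

      print(match("CAT:abab|bb"))   # ['gear']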

  1. Final amplifier design and mercury

    International Nuclear Information System (INIS)

    Rose, E.A.; Hanson, D.E.

    1991-01-01

    The final amplifier for the Mercury KrF excimer facility is being designed. The design exercise involves extensive modeling to predict amplifier performance. Models of the pulsed-power system (including a Child-Langmuir diode with closure), electron-beam energy deposition, KrF laser kinetics, amplified spontaneous emission (ASE), and time-dependent laser extraction in the presence of ASE are presented as a design package. The design exercise indicates that the energy objective of Phase I -- 100 joules -- will be met

  2. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  3. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  4. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  5. Symbiotic Cognitive Computing

    OpenAIRE

    Farrell, Robert G.; Lenchner, Jonathan; Kephjart, Jeffrey O.; Webb, Alan M.; Muller, MIchael J.; Erikson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes 5 key principles of symbiotic cognitive computing.  We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.  

  6. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from the Scopus database from Elsevier covering the time period of 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
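
    The co-authorship network construction described above can be sketched in a few lines of Python: authors become nodes and every pair of co-authors on a paper is joined by an edge weighted by the number of joint papers. The paper list below is invented for illustration; the study itself used Scopus records.

      # Minimal co-authorship network sketch with made-up author names.
      from itertools import combinations
      from collections import defaultdict

      papers = [
          ["Kim", "Lee", "Park"],
          ["Kim", "Chen"],
          ["Lee", "Park", "Chen"],
      ]

      edges = defaultdict(int)          # (author_a, author_b) -> joint papers
      for authors in papers:
          for a, b in combinations(sorted(set(authors)), 2):
              edges[(a, b)] += 1

      degree = defaultdict(int)         # number of distinct co-authors
      for a, b in edges:
          degree[a] += 1
          degree[b] += 1

      print(dict(edges))
      print(sorted(degree.items(), key=lambda kv: -kv[1]))   # rank by degree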

  7. 14 CFR 1214.1105 - Final ranking.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Final ranking. 1214.1105 Section 1214.1105... Recruitment and Selection Program § 1214.1105 Final ranking. Final rankings will be based on a combination of... preference will be included in this final ranking in accordance with applicable regulations. ...

  8. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
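
    As a hedged illustration of why decomposition helps, the sketch below computes the reliability of a small series-parallel system by recursively combining component reliabilities instead of enumerating all component states. The component values are arbitrary examples, not taken from the paper.

      # Series-parallel reliability by composition instead of state enumeration.
      def series(*reliabilities):
          """All components must work: multiply reliabilities."""
          r = 1.0
          for p in reliabilities:
              r *= p
          return r

      def parallel(*reliabilities):
          """At least one component must work: 1 - product of failure probs."""
          q = 1.0
          for p in reliabilities:
              q *= (1.0 - p)
          return 1.0 - q

      # A small system: two redundant pumps in parallel, in series with a valve.
      pump_a, pump_b, valve = 0.90, 0.85, 0.99
      print(series(parallel(pump_a, pump_b), valve))   # approximately 0.975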

  9. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  10. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950s, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically is probably the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies.

  11. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
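
    The round-off behaviour covered in the third chapter can be observed directly in any language with binary floating point; the short Python sketch below (not taken from the book) shows how repeated addition of 0.1 drifts away from the exact decimal result:

        # Round-off error: 0.1 has no exact binary floating-point representation,
        # so an accumulated sum drifts away from the exact decimal value.
        total = 0.0
        for _ in range(10):
            total += 0.1

        print(total)                      # 0.9999999999999999, not 1.0
        print(total == 1.0)               # False
        print(abs(total - 1.0) < 1e-9)    # compare with a tolerance instead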

  12. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  13. Final disposition of MTR fuel

    International Nuclear Information System (INIS)

    Jonnson, Erik B.

    1996-01-01

    The final disposition of power reactor fuel has been investigated for a long time and some promising solutions to the problem have been shown. Research reactor fuels are normally not compatible with the zirconium-clad power reactor fuel and thus cannot rely on the same disposal methods. MTR fuels are typically Al-clad UAlₓ or U₃Si₂, HEU and LEU respectively, with an essentially higher remaining enrichment than the corresponding power reactor fuel after full utilization of the uranium. The problems arising when evaluating the conditions at the final repository are the high corrosion rate of aluminum and uranium metal and the risk of secondary criticality due to the high content of fissionable material in the fully burnt MTR fuel. The newly adopted US policy to take back Foreign Research Reactor Spent Fuel of US origin for a period of ten years has given the research reactor community a reasonable time to evaluate different possibilities for solving the back end of the fuel cycle. The problem is, however, complicated and requires a solid engagement from the research reactor community. The task would be a suitable continuation of the RERTR program, as it involves both the development of new fuel types and the collection of data for the safe long-term disposal of the spent MTR fuel. (author)

  14. Interim and final storage casks

    International Nuclear Information System (INIS)

    Stumpfrock, L.; Kockelmann, H.

    2012-01-01

    The disposal of radioactive waste is a huge societal challenge in Germany and all over the world. As is well known, the search for a site for a final repository for high-level waste in Germany is not yet complete. Therefore, interim storage facilities for radioactive waste were built at plant sites in Germany. The waste is stored in these facilities in appropriate storage and transport casks until transport to a final repository can be carried out. The storage and transport casks are licensed according to traffic law for use in public space and according to nuclear law for handling in the storage facility. Taking into account the activity of the waste to be stored, different containers are in use, so experience is available from their licensing and operation in interim storage facilities. The large volume of radioactive waste to be disposed of after the shutdown of power generation in nuclear power stations makes it necessary for large quantities of licensed storage and transport casks to be provided soon.

  15. Space tug applications. Final report

    International Nuclear Information System (INIS)

    1996-01-01

    This article is the final report of the conceptual design effort for a 'space tug'. It includes preliminary efforts, mission analysis, configuration analysis, impact analysis, and conclusions. Of the several concepts evaluated, the nuclear bimodal tug was one of the top candidates, with the two options being the NEBA-1 and NEBA-3 systems. Several potential tug benefits were identified during the mission analysis. The tug enables delivery of large (>3,500 kg) payloads to the outer planets, and it increases the GSO delivery capability by 20% relative to current systems. By providing end-of-life disposal, the tug can be used to extend the life of existing space assets. It can also be used to reboost satellites that were not delivered to their final orbit by the launch system. A specific mission model is the key to validating the tug concept. Once a mission model is established, mission analysis can be used to determine more precise propellant quantities and burn times. In addition, the specific payloads can be evaluated for mass and volume compatibility with the launch systems. Results of the economic analysis will depend on the total years of operations and the number of missions in the mission model. The mission applications evaluated during this phase drove the need for large propellant quantities and thus did not allow the payloads to step down to smaller and less expensive launch systems.

  16. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  17. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  18. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  19. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  20. The LiveWire Project final report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, C.D.; Nelson, T.T. [Enova Technology, San Diego, CA (United States); Kelly, J.C.; Dominguez, H.A. [Paragon Consulting Services, La Verne, CA (United States)

    1997-10-01

    Utilities across the US have begun pilot testing a variety of hardware and software products to develop a two-way communications system between themselves and their customers. Their purpose is to reduce utility operating costs and to provide new and improved services for customers in light of pending changes in the electric industry being brought about by deregulation. A consortium including utilities, national labs, consultants, and contractors, with the support of the Department of Energy (DOE) and the Electric Power Research Institute (EPRI), initiated a project that utilized a hybrid fiber-coax (HFC) wide-area network integrated with a CEBus-based local area network within the customer's home. The system combined energy consumption data taken within the home with home automation features to provide a suite of energy management services for residential customers. The information was transferred via the Internet through the HFC network and presented to the customer on their personal computer. This final project report discusses the design, prototype testing, and system deployment planning of the energy management system.
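
    Purely as a hypothetical illustration of the kind of service described (in-home consumption readings summarized for presentation to the customer), the sketch below aggregates interval readings into daily totals; it does not reflect the actual CEBus/HFC implementation or the data formats used in the project.

        # Hypothetical illustration only: summarize in-home interval readings (kWh)
        # into per-day totals for display to the customer. The data format is
        # invented and is not the LiveWire system's actual implementation.
        from collections import defaultdict
        from datetime import datetime

        readings = [
            ("1997-06-01T00:15", 0.42),
            ("1997-06-01T00:30", 0.38),
            ("1997-06-02T00:15", 0.51),
        ]

        daily = defaultdict(float)
        for timestamp, kwh in readings:
            day = datetime.fromisoformat(timestamp).date()
            daily[day] += kwh

        for day, total in sorted(daily.items()):
            print(day, round(total, 2), "kWh")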