WorldWideScience

Sample records for high computational power

  1. High Performance Computing - Power Application Programming Interface Specification.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
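
    For readers unfamiliar with what such a layered, portable interface might look like in practice, the sketch below shows one possible shape for a power measurement and control abstraction. It is only an illustration: the class and method names are invented here and do not reproduce the actual Power API specification.

```python
# Hypothetical sketch of a portable power measurement/control interface,
# loosely inspired by the layered design described above. All class and
# method names here are illustrative assumptions, not the actual Power API.
from abc import ABC, abstractmethod


class PowerObject(ABC):
    """One measurable/controllable object (node, socket, memory, ...)."""

    @abstractmethod
    def read_power_watts(self) -> float:
        """Return the current power draw in watts."""

    @abstractmethod
    def set_power_cap_watts(self, cap: float) -> None:
        """Request an upper bound on power draw."""


class CpuSocket(PowerObject):
    """Example backend: a CPU socket exposing vendor counters (stub values)."""

    def __init__(self, socket_id: int):
        self.socket_id = socket_id
        self._cap = None

    def read_power_watts(self) -> float:
        return 95.0  # placeholder; a real backend would read hardware counters

    def set_power_cap_watts(self, cap: float) -> None:
        self._cap = cap  # placeholder; a real backend would program the cap


if __name__ == "__main__":
    sockets = [CpuSocket(0), CpuSocket(1)]
    for s in sockets:
        s.set_power_cap_watts(80.0)
        print(f"socket {s.socket_id}: {s.read_power_watts():.1f} W")
```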

  2. Power/energy use cases for high performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hammond, Steven [National Renewable Energy Lab. (NREL), Golden, CO (United States); Elmore, Ryan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Munch, Kristin [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    Power and Energy have been identified as a first order challenge for future extreme scale high performance computing (HPC) systems. In practice the breakthroughs will need to be provided by the hardware vendors, but making the best use of those solutions in an HPC environment will likely require periodic tuning by facility operators and software components. This document describes the actions and interactions needed to maximize power resources. It strives to cover the entire operational space that an HPC system occupies. The descriptions are presented as formal use cases, as documented in the Unified Modeling Language Specification [1]. The document is intended to provide a common understanding to the HPC community of the necessary management and control capabilities. Assuming a common understanding can be achieved, the next step will be to develop a set of Application Programming Interfaces (APIs) that hardware vendors and software developers could utilize to steer power consumption.

  3. High performance computing in power and energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Khaitan, Siddhartha Kumar [Iowa State Univ., Ames, IA (United States); Gupta, Anshul (eds.) [IBM Watson Research Center, Yorktown Heights, NY (United States)

    2013-07-01

    The twin challenge of meeting global energy demands in the face of growing economies and populations while restricting greenhouse gas emissions is one of the most daunting humanity has ever faced. Smart electrical generation and distribution infrastructure will play a crucial role in meeting these challenges. We will need the capability to handle the large volumes of data generated by power system components such as PMUs, DFRs and other data acquisition devices, as well as the capacity to process these data at high resolution via multi-scale and multi-period simulations, cascading and security analysis, interaction between hybrid systems (electric, transport, gas, oil, coal, etc.) and so on, in order to extract meaningful information in real time and ensure a secure, reliable and stable power grid. Advanced research on the development and implementation of market-ready, leading-edge, high-speed enabling technologies and algorithms for solving real-time, dynamic, resource-critical problems will be required for dynamic security analysis targeted towards successful implementation of Smart Grid initiatives. This book aims to bring together some of the latest research developments, as well as thoughts on future research directions, for high performance computing applications in electric power systems planning, operations, security, markets, and grid integration of alternative energy sources.

  4. Designing high power targets with computational fluid dynamics (CFD)

    International Nuclear Information System (INIS)

    Covrig, S. D.

    2013-01-01

    High power liquid hydrogen (LH2) targets, up to 850 W, have been widely used at Jefferson Lab for the 6 GeV physics program. The typical luminosity loss of a 20 cm long LH2 target was 20% for a beam current of 100 μA rastered on a square of side 2 mm on the target. The 35 cm long, 2500 W LH2 target for the Qweak experiment had a luminosity loss of 0.8% at 180 μA beam rastered on a square of side 4 mm at the target. The Qweak target was the highest power liquid hydrogen target in the world and had the lowest noise figure. The Qweak target was the first one designed with CFD at Jefferson Lab. A CFD facility is being established at Jefferson Lab to design, build and test a new generation of low noise, high power targets.

  5. Designing high power targets with computational fluid dynamics (CFD)

    Energy Technology Data Exchange (ETDEWEB)

    Covrig, S. D. [Thomas Jefferson National Laboratory, Newport News, VA 23606 (United States)

    2013-11-07

    High power liquid hydrogen (LH2) targets, up to 850 W, have been widely used at Jefferson Lab for the 6 GeV physics program. The typical luminosity loss of a 20 cm long LH2 target was 20% for a beam current of 100 μA rastered on a square of side 2 mm on the target. The 35 cm long, 2500 W LH2 target for the Qweak experiment had a luminosity loss of 0.8% at 180 μA beam rastered on a square of side 4 mm at the target. The Qweak target was the highest power liquid hydrogen target in the world and had the lowest noise figure. The Qweak target was the first one designed with CFD at Jefferson Lab. A CFD facility is being established at Jefferson Lab to design, build and test a new generation of low noise, high power targets.

  6. High Performance Computing - Power Application Programming Interface Specification Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ward, H. Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Younge, Andrew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  7. Computer-aided analysis of power-electronic systems simulation of a high-voltage power converter

    International Nuclear Information System (INIS)

    Bordry, F.; Isch, H.W.; Proudlock, P.

    1987-01-01

    In the study of semiconductor devices, simulation methods play an important role in both the design of systems and the analysis of their operation. The authors describe a new and efficient computer-aided package program for general power-electronic systems. The main difficulty when taking non-linear elements such as semiconductors into account lies in determining the existence and the relations of the elementary sequences defined by the conduction or non-conduction of these components. The method does not require a priori knowledge of the state sequences of the semiconductors or of the commutation instants, but only the circuit structure, its parameters and the commands to the controlled switches. The simulation program automatically computes both transient and steady-state waveforms for any circuit configuration. The simulation of a high-voltage power converter is presented, for both its steady-state and transient overload conditions. This 100 kV power converter (4 MW) will feed two klystrons in parallel.

  8. Ultra-low power high precision magnetotelluric receiver array based customized computer and wireless sensor network

    Science.gov (United States)

    Chen, R.; Xi, X.; Zhao, X.; He, L.; Yao, H.; Shen, R.

    2016-12-01

    Dense 3D magnetotelluric (MT) data acquisition has the benefit of suppressing static shift and topography effects and can achieve high-precision, high-resolution inversion of underground structure. The method may play an important role in mineral, geothermal, and hydrocarbon exploration. For large-scale 3D MT data acquisition, the power consumption of an MT signal receiver must be reduced greatly, while a sensor network is used to monitor the data quality of the deployed receivers. We adopted a series of technologies to realize this goal. First, we designed a low-power embedded computer that couples tightly with the other parts of the MT receiver and supports a wireless sensor network; its power consumption is less than 1 watt. We then designed a 4-channel data acquisition subsystem that supports 24-bit analog-to-digital conversion, GPS synchronization, and real-time digital signal processing. Furthermore, we developed the power supply and power management subsystem for the receiver. Finally, we developed software for data acquisition, calibration, the wireless sensor network, and testing. The software running on a personal computer can monitor and control over 100 MT receivers in the field for data acquisition and quality control. The total power consumption of the receiver is about 2 watts at full operation, and the standby power consumption is less than 0.1 watt. Our testing showed that the receiver can acquire good-quality data on the ground with an electric dipole length of 3 m. Over 100 MT receivers were built and used for large-scale geothermal exploration in China with great success.

  9. Analysis of Application Power and Schedule Composition in a High Performance Computing Environment

    Energy Technology Data Exchange (ETDEWEB)

    Elmore, Ryan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gruchalla, Kenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Phillips, Caleb [National Renewable Energy Lab. (NREL), Golden, CO (United States); Purkayastha, Avi [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wunder, Nick [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-05

    As the capacity of high performance computing (HPC) systems continues to grow, small changes in energy management have the potential to produce significant energy savings. In this paper, we employ an extensive informatics system for aggregating and analyzing real-time performance and power use data to evaluate energy footprints of jobs running in an HPC data center. We look at the effects of algorithmic choices for a given job on the resulting energy footprints, analyze application-specific power consumption, and summarize average power use in the aggregate. All of these views reveal meaningful power variance between classes of applications as well as between chosen methods for a given job. Using these data, we discuss energy-aware cost-saving strategies based on reordering the HPC job schedule. Using historical job and power data, we present a hypothetical job schedule reordering that: (1) reduces the facility's peak power draw and (2) manages power in conjunction with a large-scale photovoltaic array. Lastly, we leverage these data to understand the practical limits on predicting key power use metrics at the time of submission.
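
    As a rough illustration of the schedule-reordering idea, the sketch below packs jobs into scheduling steps so that the most power-hungry jobs are spread out, which tends to flatten the peak facility draw. The job list, power figures and step size are invented for the example; the paper's analysis is based on real historical job and power data.

```python
# Toy illustration of schedule reordering to reduce peak facility power draw.
# Jobs and the per-step capacity are invented for the example.
from typing import List, Tuple

Job = Tuple[str, float]  # (name, average power draw in kW)


def greedy_reorder(jobs: List[Job], slots_per_step: int) -> List[List[Job]]:
    """Pack jobs into scheduling steps, always filling the currently
    least-loaded step, which tends to flatten the power profile."""
    steps: List[List[Job]] = [[] for _ in range(len(jobs) // slots_per_step + 1)]
    for job in sorted(jobs, key=lambda j: j[1], reverse=True):
        # choose the step with the lowest accumulated power that still has room
        target = min(
            (s for s in steps if len(s) < slots_per_step),
            key=lambda s: sum(p for _, p in s),
        )
        target.append(job)
    return [s for s in steps if s]


jobs = [("cfd", 120.0), ("md", 80.0), ("viz", 20.0), ("qmc", 110.0),
        ("ml", 60.0), ("post", 15.0)]
for i, step in enumerate(greedy_reorder(jobs, slots_per_step=2)):
    print(f"step {i}: {step}, total {sum(p for _, p in step):.0f} kW")
```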

  10. High Performance Computing - Power Application Programming Interface Specification Version 1.4

    Energy Technology Data Exchange (ETDEWEB)

    Laros III, James H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); DeBonis, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grant, Ryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Levenhagen, Michael J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Olivier, Stephen Lecler [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pedretti, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  11. High degree utilization of computers for design of nuclear power plants

    International Nuclear Information System (INIS)

    Masui, Takao; Sawada, Takashi

    1992-01-01

    Nuclear power plants are large-scale systems in which many technologies are combined and from which a high level of safety is demanded. Their design therefore requires a thorough grasp of plant behavior and accurate design evaluations covering a wide range of operational conditions, and the most advanced computers of each era have been indispensable tools for this analysis and evaluation. Computers are used both for design, analysis and evaluation and for design support, and they are also applied to operation control. The paper explains the use of computers for the core design, thermal-hydraulic design, core structure design, safety analysis and structural analysis of PWR plants; for the nuclear design, safety analysis and heat flow analysis of FBR plants; for design support; and for operation control. (K.I.)

  12. Predicting the Noise of High Power Fluid Targets Using Computational Fluid Dynamics

    Science.gov (United States)

    Moore, Michael; Covrig Dusa, Silviu

    The 2.5 kW liquid hydrogen (LH2) target used in the Qweak parity violation experiment is the highest power LH2 target in the world and the first to be designed with Computational Fluid Dynamics (CFD) at Jefferson Lab. The Qweak experiment determined the weak charge of the proton by measuring the parity-violating elastic scattering asymmetry of longitudinally polarized electrons from unpolarized liquid hydrogen at small momentum transfer (Q² = 0.025 GeV²). The target satisfied its design goals, and the CFD predictions were benchmarked with the Qweak target data. This work is an essential component in future designs of very high power, low noise targets such as MOLLER (5 kW, target noise asymmetry contribution < 25 ppm) and MESA (4.5 kW).

  13. A computer control system for the PNC high power cw electron linac. Concept and hardware

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, T.; Hirano, K.; Takei, Hayanori; Nomura, Masahiro; Tani, S. [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center; Kato, Y.; Ishikawa, Y.

    1998-06-01

    Design and construction of a high power cw (continuous wave) electron linac for studying the feasibility of nuclear waste transmutation was started in 1989 at PNC. The PNC accelerator (10 MeV, 20 mA average current, 4 ms pulse width, 50 Hz repetition) is a dedicated machine for developing the high-current acceleration technology needed in the future. The computer control system is responsible for accelerator control and for supporting experiments during high power operation. Its key features are simultaneous measurement of the accelerator status and modular software and hardware that can be easily modified or expanded. A high speed network (SCRAMNet, approximately 15 MB/s), Ethernet, and front-end processors (digital signal processors) are employed for high speed data taking and control. The system is built from standard modules, with the man-machine interface implemented in software. Thanks to the graphical user interface and object-oriented programming, programming and maintenance in the software development environment require little effort. (author)

  14. Power plant process computer

    International Nuclear Information System (INIS)

    Koch, R.

    1982-01-01

    The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks that are on-line with respect to real-time requirements but do not involve closed-loop control. The general scope of tasks is: alarm annunciation on CRTs, data logging, data recording for post-trip reviews and plant behaviour analysis, nuclear data computation, and graphic displays. Process computers are additionally used for dedicated tasks such as the aeroball measuring system and the turbine stress evaluator. Further applications are personal dose supervision and access monitoring. (orig.)

  15. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming; Claudel, Christian

    2017-01-01

    This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM-compliant radio module. It can be interfaced with fixed traffic sensors or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low bandwidth, high computational performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. Radio monitoring circuitry is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring, or Kalman-filter-based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.

  16. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming

    2017-02-02

    This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM-compliant radio module. It can be interfaced with fixed traffic sensors or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low bandwidth, high computational performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. Radio monitoring circuitry is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring, or Kalman-filter-based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.

  17. Application of a Statistical Linear Time-Varying System Model of High Grazing Angle Sea Clutter for Computing Interference Power

    Science.gov (United States)

    2017-12-08

    Report excerpt: a statistical linear time-varying system model of high grazing angle sea clutter is applied to compute interference power. One of the sinc factors is approximated by the Dirichlet kernel to facilitate computation of the interference-power integral, the resultant autocorrelation is obtained by substitution of intermediate results, and the Python code used to generate the report's figures is included.
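
    The sinc/Dirichlet substitution mentioned in the excerpt can be checked numerically; the sketch below compares the two factors for small angular arguments. The array size and argument range are arbitrary choices for the demonstration and are not taken from the report.

```python
# Numerical sanity check (not taken from the report): for small angles the
# Dirichlet kernel sin(N*x/2) / (N*sin(x/2)) is closely matched by the sinc
# factor sin(N*x/2) / (N*x/2), which is the kind of substitution the report
# uses to simplify the interference-power integral.
import numpy as np

N = 64                               # hypothetical number of terms in the kernel
x = np.linspace(1e-4, 0.2, 500)      # small positive angular arguments (radians)

dirichlet = np.sin(N * x / 2) / (N * np.sin(x / 2))
sinc_term = np.sinc(N * x / (2 * np.pi))   # np.sinc(u) = sin(pi*u)/(pi*u)

print("max |Dirichlet - sinc| on this range:",
      np.max(np.abs(dirichlet - sinc_term)))
```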

  18. Computer control of the high-voltage power supply for the DIII-D electron cyclotron heating system

    International Nuclear Information System (INIS)

    Clow, D.D.; Kellman, D.H.

    1992-01-01

    This paper reports on the DIII-D Electron Cyclotron Heating (ECH) high voltage power supply, which is controlled by a computer. Operational control is input via keyboard and mouse, and the computer/power supply interface is accomplished with a Computer Assisted Monitoring and Control (CAMAC) system. User-friendly tools allow the design and layout of simulated control panels on the computer screen. Panel controls and indicators can be changed, added or deleted, and simple editing of user-specific processes can quickly modify control and fault logic. Databases can be defined, and control panel functions are easily referred to various data channels. User-specific processes are written and linked using Fortran, to manage control and data acquisition through CAMAC. The resulting control system has significant advantages over the hardware it emulates: changes in logic, layout, and function are quickly and easily incorporated; data storage, retrieval, and processing are flexible and simply accomplished; physical components subject to wear and degradation are minimized. In addition, the system can be expanded to multiplex control of several power supplies, each with its own database, through a single computer console.

  19. Computer control of the high-voltage power supply for the DIII-D Electron Cyclotron Heating system

    International Nuclear Information System (INIS)

    Clow, D.D.; Kellman, D.H.

    1991-10-01

    The DIII-D Electron Cyclotron Heating (ECH) high voltage power supply is controlled by a computer. Operational control is input via keyboard and mouse, and the computer/power supply interface is accomplished with a Computer Assisted Monitoring and Control (CAMAC) system. User-friendly tools allow the design and layout of simulated control panels on the computer screen. Panel controls and indicators can be changed, added or deleted, and simple editing of user-specific processes can quickly modify control and fault logic. Databases can be defined, and control panel functions are easily referred to various data channels. User-specific processes are written and linked using Fortran, to manage control and data acquisition through CAMAC. The resulting control system has significant advantages over the hardware it emulates: changes in logic, layout, and function are quickly and easily incorporated; data storage, retrieval, and processing are flexible and simply accomplished; physical components subject to wear and degradation are minimized. In addition, the system can be expanded to multiplex control of several power supplies, each with its own database, through a single computer and console. 5 refs., 4 figs., 1 tab

  20. A computer study of radionuclide production in high power accelerators for medical and industrial applications

    Science.gov (United States)

    Van Riper, K. A.; Mashnik, S. G.; Wilson, W. B.

    2001-05-01

    Methods for radionuclide production calculation in a high power proton accelerator have been developed and applied to study production of 22 isotopes by high-energy protons and neutrons. These methods are readily applicable to accelerator and reactor environments other than the particular model we considered, and to the production of other radioactive and stable isotopes. We have also developed methods for evaluating cross sections from a wide variety of sources into a single cross section set and have produced an evaluated library covering about a third of all natural elements. These methods also are applicable to an expanded set of reactions. A 684 page detailed report on this study, with 37 tables and 264 color figures, is available on the Web at http://t2.lanl.gov/publications/publications.html, or, if not accessible, in hard copy from the authors.
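
    As context for what a radionuclide production calculation involves at its simplest, the sketch below evaluates the standard thin-target activation formula. The cross section, flux, half-life and target mass are invented illustrative numbers, not values from the report, which relies on evaluated cross-section libraries and detailed transport calculations.

```python
# Back-of-the-envelope activation estimate using the standard thin-target
# production formula A(t) = R * (1 - exp(-lambda * t)) with R = N_t * sigma * phi.
# All numbers below are invented for illustration.
import math

N_A = 6.022e23
mass_g = 1.0                 # grams of target material (hypothetical)
molar_mass = 100.0           # g/mol (hypothetical)
sigma_cm2 = 50e-27           # 50 mb production cross section (hypothetical)
phi = 1e14                   # particles / (cm^2 s), incident flux (hypothetical)
half_life_s = 6.0 * 3600     # 6 h product half-life (hypothetical)

N_t = mass_g / molar_mass * N_A
lam = math.log(2) / half_life_s
R = N_t * sigma_cm2 * phi                      # production rate, atoms/s

for hours in (1, 6, 24, 72):
    t = hours * 3600
    activity_bq = R * (1 - math.exp(-lam * t))  # activity at end of irradiation
    print(f"{hours:3d} h irradiation: {activity_bq/3.7e10:.2f} Ci")
```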

  1. Computer study of isotope production for medical and industrial applications in high power accelerators

    Science.gov (United States)

    Mashnik, S. G.; Wilson, W. B.; Van Riper, K. A.

    2001-07-01

    Methods for radionuclide production calculation in a high power proton accelerator have been developed and applied to study production of 22 isotopes. These methods are readily applicable both to accelerator and reactor environments and to the production of other radioactive and stable isotopes. We have also developed methods for evaluating cross sections from a wide variety of sources into a single cross section set and have produced an evaluated library covering about a third of all natural elements that may be expanded to other reactions. A 684 page detailed report on this study, with 37 tables and 264 color figures, is available on the Web at http://t2.lanl.gov/publications/.

  2. High-power klystrons

    Science.gov (United States)

    Siambis, John G.; True, Richard B.; Symons, R. S.

    1994-05-01

    Novel emerging applications in advanced linear collider accelerators, ionospheric and atmospheric sensing and modification, and a wide spectrum of industrial processing applications have resulted in microwave tube requirements that call for further development of high power klystrons in the range from S-band to X-band. In the present paper we review recent progress in high power klystron development and discuss some of the issues and scaling laws for successful design. We also discuss recent progress in electron guns with potential grading electrodes for high voltage with short and long pulse operation, via computer simulations obtained from the code DEMEOS, as well as preliminary experimental results. We present designs for high power beam collectors.

  3. Power-efficient computer architectures recent advances

    CERN Document Server

    Själander, Magnus; Kaxiras, Stefanos

    2014-01-01

    As Moore's Law and Dennard scaling trends have slowed, the challenges of building high-performance computer architectures while maintaining acceptable power efficiency levels have heightened. Over the past ten years, architecture techniques for power efficiency have shifted from primarily focusing on module-level efficiencies, toward more holistic design styles based on parallelism and heterogeneity. This work highlights and synthesizes recent techniques and trends in power-efficient computer architecture. Table of Contents: Introduction / Voltage and Frequency Management / Heterogeneity and Sp

  4. Quasi-optical converters for high-power gyrotrons: a brief review of physical models, numerical methods and computer codes

    International Nuclear Information System (INIS)

    Sabchevski, S; Zhelyazkov, I; Benova, E; Atanassov, V; Dankov, P; Thumm, M; Arnold, A; Jin, J; Rzesnicki, T

    2006-01-01

    Quasi-optical (QO) mode converters are used to transform electromagnetic waves of complex structure and polarization generated in gyrotron cavities into a linearly polarized, Gaussian-like beam suitable for transmission. The efficiency of this conversion, as well as the maintenance of a low level of diffraction losses, is crucial for the implementation of powerful gyrotrons as radiation sources for electron-cyclotron-resonance heating of fusion plasmas. The use of adequate physical models, efficient numerical schemes and up-to-date computer codes may provide the high accuracy necessary for the design and analysis of these devices. In this review, we briefly sketch the most commonly used QO converters, the mathematical basis on which they are treated and the basic features of the numerical schemes used. Further on, we discuss the applicability of several commercially available and free software packages, their advantages and drawbacks, for solving QO related problems.

  5. Does high-power computed tomography scanning equipment affect the operation of pacemakers?

    International Nuclear Information System (INIS)

    Yamaji, Satoshi; Imai, Shinobu; Saito, Fumio; Yagi, Hiroshi; Kushiro, Toshio; Uchiyama, Takahisa

    2006-01-01

    Computed tomography (CT) is widely used in clinical practice, but there has not been a detailed report of its effect on the functioning of pacemakers. During CT, ECGs were recorded in 11 patients with pacemakers, and the electromagnetic field in the CT room was also measured. The effect of CT on a pacemaker was also investigated in a human body model with and without shielding by rubber or lead. Transient malfunctions of pacemakers during CT occurred in 6 of 11 patients. The model showed that malfunctioning of the pacemaker was induced by CT scanning and was prevented by lead but not by rubber. The alternating electrical field was 150 V/m on the CT scanning line, which was lower than the level influencing pacemaker functions. The alternating magnetic field was 15 μT on the CT scanning line, which was also lower than the level influencing pacemaker functions. Malfunctions of the pacemaker during CT may be caused by the diagnostic X-rays and, although they are transient, the possibility of lethal arrhythmia cannot be ignored. (author)

  6. Computer Aided Modeling and Analysis of Five-Phase PMBLDC Motor Drive for Low Power High Torque Application

    Directory of Open Access Journals (Sweden)

    M. A. Inayathullaah

    2014-01-01

    In order to achieve high torque at low power with high efficiency, a new five-phase permanent magnet brushless DC (PMBLDC) motor design was analyzed and optimized. A similar three-phase motor having the same D/L ratio (inner diameter (D) to stator length (L)) is compared with the designed five-phase PMBLDC motor for maximum torque and torque ripple. Maxwell software was used to build a finite element simulation model of the motor. The complicated internal magnetic field distribution and dynamic performance simulations were obtained at different positions. No-load and load characteristics of the five-phase PMBLDC motor were simulated, and the power consumption of materials was computed. The conformity of the final simulation results indicates that this method can provide a theoretical basis for further optimal design of this new type of motor and its drive, so as to improve the starting torque and reduce the torque ripple of the motor.

  7. Power throttling of collections of computing elements

    Science.gov (United States)

    Bellofatto, Ralph E [Ridgefield, CT; Coteus, Paul W [Yorktown Heights, NY; Crumley, Paul G [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Gooding, Thomas M [Rochester, MN; Haring, Rudolf A [Cortlandt Manor, NY; Megerian, Mark G [Rochester, MN; Ohmacht, Martin [Yorktown Heights, NY; Reed, Don D [Mantorville, MN; Swetz, Richard A [Mahopac, NY; Takken, Todd [Brewster, NY

    2011-08-16

    An apparatus and method for controlling power usage in a computer includes a plurality of computers communicating with a local control device, and a power source supplying power to the local control device and the computer. A plurality of sensors communicate with the computer for ascertaining power usage of the computer, and a system control device communicates with the computer for controlling power usage of the computer.

  8. Re-Form: FPGA-Powered True Codesign Flow for High-Performance Computing In The Post-Moore Era

    Energy Technology Data Exchange (ETDEWEB)

    Cappello, Franck; Yoshii, Kazutomo; Finkel, Hal; Cong, Jason

    2016-11-14

    Multicore scaling will end soon because of practical power limits. Dark silicon is becoming a major issue even more than the end of Moore’s law. In the post-Moore era, the energy efficiency of computing will be a major concern. FPGAs could be a key to maximizing the energy efficiency. In this paper we address severe challenges in the adoption of FPGA in HPC and describe “Re-form,” an FPGA-powered codesign flow.

  9. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    Science.gov (United States)

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438

  10. High power communication satellites power systems study

    International Nuclear Information System (INIS)

    Josloff, A.T.; Peterson, J.R.

    1994-01-01

    This paper discusses a DOE-funded study to evaluate the commercial attractiveness of high power communication satellites and assess the attributes of both conventional photovoltaic and reactor power systems. This study brings together a preeminent US Industry/Russian team to cooperate on the role of high power communication satellites in the rapidly expanding communications revolution. These high power satellites play a vital role in assuring availability of universally accessible, wide bandwidth communications for high definition TV, supercomputer networks and other services. Satellites are ideally suited to provide the wide bandwidths and data rates required and are unique in the ability to provide services directly to the users. As new or relocated markets arise, satellites offer a flexibility that conventional distribution services cannot match, and it is no longer necessary to be near population centers to take advantage of the telecommunication revolution. The geopolitical implications of these substantially enhanced communications capabilities will be significant.

  11. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    Science.gov (United States)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) A comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  12. Use of computers at nuclear power plants

    International Nuclear Information System (INIS)

    Sen'kin, V.I.; Ozhigano, Yu.V.

    1974-01-01

    Applications of information and control computers in reactor control systems in Great Britain, the Federal Republic of Germany, France, Canada, and the USA are surveyed. To increase computer reliability, effective means were designed for emergency operation and automatic computerized control, and highly reliable micromodule designs were developed. Numerical data handling was addressed along with the development of methods and circuits for converting analog values to numerical values, in accordance with modern requirements. Some data are presented on computer reliability in nuclear power plants that are operating, proposed, or under construction. It is concluded that information and computational computers are finding increasingly wide use in foreign nuclear power stations. Their speed, their ability to monitor a large number of parameters, and their increasing reliability are accelerating the introduction of computers in nuclear power and broadening their functions. (V.P.)

  13. High power communication satellites power systems study

    Science.gov (United States)

    Josloff, Allan T.; Peterson, Jerry R.

    1995-01-01

    This paper discusses a planned study to evaluate the commercial attractiveness of high power communication satellites and assess the attributes of both conventional photovoltaic and reactor power systems. These high power satellites can play a vital role in assuring availability of universally accessible, wide bandwidth communications for high definition TV, supercomputer networks and other services. Satellites are ideally suited to provide the wide bandwidths and data rates required and are unique in the ability to provide services directly to the users. As new or relocated markets arise, satellites offer a flexibility that conventional distribution services cannot match, and it is no longer necessary to be near population centers to take advantage of the telecommunication revolution. The geopolitical implications of these substantially enhanced communications capabilities can be significant.

  14. Computing power on the move

    CERN Multimedia

    Joannah Caborn Wengler

    2012-01-01

    You might sit right next to your computer as you work, use the GRID’s computing power sitting in another part of the world or share CPU time with the Cloud: actual and virtual machines communicate and exchange information, and the place where they are located is a detail of only marginal importance. CERN’s new remote computer centre will open in Hungary in 2013.   Artist's impression of the new Wigner Data Centre. (Image: Wigner). CERN’s computing department has been aiming to minimise human contact with the machines for a while now. “The problem is that people going in creates dust, and simply touching things may cause damage,” explains Wayne Salter, Leader of the IT Computing Facilities Group. A first remote centre on the other side of Geneva was opened in June 2010 and a new one will open in Hungary next year. “Once the centre in Budapest is running, we will not be going there to operate it. As far as possible, w...

  15. Computer controlled high voltage system

    Energy Technology Data Exchange (ETDEWEB)

    Kunov, B; Georgiev, G; Dimitrov, L [and others

    1996-12-31

    A multichannel computer controlled high-voltage power supply system is developed. The basic technical parameters of the system are: output voltage -100-3000 V, output current - 0-3 mA, maximum number of channels in one crate - 78. 3 refs.

  16. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  17. Computer aided method of low voltage power distribution networks protection system against lightning and electromagnetic pulse generated by high altitude nuclear burst

    International Nuclear Information System (INIS)

    Laroubine, J.

    1989-01-01

    Lightning creates an electromagnetic field that produces a long-duration, high-energy current pulse on low voltage power distribution networks. A high altitude nuclear burst, on the other hand, generates an electromagnetic pulse that causes fast and intense interference. We describe here the specifications of a passive filter that can reject these interferences. We used a computer-aided simulation method to create a prototype. Experimental results confirm the validity of the model used for simulation. [fr]

  18. High average power solid state laser power conditioning system

    International Nuclear Information System (INIS)

    Steinkraus, R.F.

    1987-01-01

    The power conditioning system for the High Average Power Laser program at Lawrence Livermore National Laboratory (LLNL) is described. The system has been operational for two years. It is high voltage, high power, fault protected, and solid state. The power conditioning system drives flashlamps that pump solid state lasers. The flashlamps are driven by silicon-controlled rectifier (SCR) switched, resonantly charged (LC) discharge pulse forming networks (PFNs). The system uses fiber optics for control and diagnostics. Energy and thermal diagnostics are monitored by computers.

  19. CSTI High Capacity Power

    International Nuclear Information System (INIS)

    Winter, J.M.

    1989-01-01

    The SP-100 program was established in 1983 by DOD, DOE, and NASA as a joint program to develop the technology necessary for space nuclear power systems for military and civil application. During FY-86 and 87, the NASA SP-100 Advanced Technology Program was devised to maintain the momentum of promising technology advancement efforts started during Phase 1 of SP-100 and to strengthen, in key areas, the chances for successful development and growth capability of space nuclear reactor power systems for future space applications. In FY-88, the Advanced Technology Program was incorporated into NASA's new Civil Space Technology Initiative (CSTI). The CSTI Program was established to provide the foundation for technology development in automation and robotics, information, propulsion, and power. The CSTI High Capacity Power Program builds on the technology efforts of the SP-100 program, incorporates the previous NASA SP-100 Advanced Technology project, and provides a bridge to NASA Project Pathfinder. The elements of CSTI High Capacity Power development include Conversion Systems, Thermal Management, Power Management, System Diagnostics, and Environmental Interactions. Technology advancement in all areas, including materials, is required to assure the high reliability and 7 to 10 year lifetime demanded for future space nuclear power systems. The overall program will develop and demonstrate the technology base required to provide a wide range of modular power systems as well as allowing mission independence from solar and orbital attitude requirements. Several recent advancements in CSTI High Capacity power development will be discussed

  20. Process computers automate CERN power supply installations

    International Nuclear Information System (INIS)

    Ullrich, H.; Martin, A.

    1974-01-01

    Higher standards of performance and reliability in the power plants of large particle accelerators necessitate increasing use of automation. The CERN (European Nuclear Research Centre) in Geneva started to employ process computers for plant automation at an early stage in its history. The great complexity and extent of the plants for high-energy physics first led to the setting-up of decentralized automatic systems which are now being increasingly combined into one interconnected automation system. One of these automatic systems controls and monitors the extensive power supply installations for the main ring magnets in the experimental zones. (orig.) [de

  1. SWITCHING POWER FAN CONTROL OF COMPUTER

    Directory of Open Access Journals (Sweden)

    Oleksandr I. Popovskyi

    2010-10-01

    The relevance of the material presented in the article stems from the extensive use of high-performance computers in building modern information systems, including those of the NAPS of Ukraine. Most computers at the NAPS of Ukraine use Intel Pentium processors running at 600 MHz to 3 GHz and release a lot of heat, which requires installing 2-3 additional fans in the system unit. The fans always run at full power, which leads to rapid wear and a high noise level (up to 50 dB). In order to meet ergonomic requirements, it is proposed to install in the computer system unit an additional fan control unit that allows independent control of each fan. The solution has been applied in creating Internet-based information systems for research planning at the National Academy of Pedagogical Sciences of Ukraine.
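
    A minimal sketch of the kind of per-fan control logic such a unit enables is shown below; the temperature thresholds, duty-cycle range and sensor stub are assumptions for illustration only.

```python
# Illustrative sketch of independent fan control: map a measured temperature
# to a PWM duty cycle instead of running every fan at 100%. Thresholds and the
# read_temperature() stub are assumptions, not details from the article.
def duty_cycle(temp_c: float, t_min: float = 35.0, t_max: float = 70.0,
               d_min: float = 0.25, d_max: float = 1.0) -> float:
    """Linear ramp: idle duty below t_min, full speed above t_max."""
    if temp_c <= t_min:
        return d_min
    if temp_c >= t_max:
        return d_max
    return d_min + (d_max - d_min) * (temp_c - t_min) / (t_max - t_min)


def read_temperature(sensor_id: int) -> float:
    """Stub for a per-zone temperature sensor."""
    return 48.0


for fan_id in range(3):                       # one duty value per fan
    t = read_temperature(fan_id)
    print(f"fan {fan_id}: {t:.1f} C -> {duty_cycle(t)*100:.0f}% PWM")
```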

  2. Computational and experimental progress on laser-activated gas avalanche switches for broadband, high-power electromagnetic pulse generation

    International Nuclear Information System (INIS)

    Mayhall, D.J.; Yee, J.H.; Villa, F.

    1991-01-01

    This paper discusses the gas avalanche switch, a proposed high-voltage, picosecond-speed switch. The basic switch consists of pulse-charged electrodes immersed in a high-pressure gas. An avalanche discharge is induced in the gas between the electrodes by ionization from a picosecond-scale laser pulse. The avalanching electrons move toward the anode, causing the applied voltage to collapse in picoseconds. This voltage collapse, if rapid enough, generates electromagnetic waves. A two-dimensional (2D) finite difference computer code solves Maxwell's equations for transverse magnetic modes for rectilinear electrodes between parallel plate conductors, along with electron conservation equations for continuity, momentum, and energy. Collision frequencies for ionization and for momentum and energy transfer to neutral molecules are assumed to scale linearly with neutral pressure. Electrode charging and laser-driven electron deposition are assumed to be instantaneous. Code calculations are done for a pulse generator geometry consisting of a 0.7 mm wide by 0.8 mm high, beveled, rectangular center electrode between grounded parallel plates at 2 mm spacing in air.
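
    For orientation, the sketch below implements a minimal 2D finite-difference time-domain update for the transverse magnetic field components on a uniform grid, the same class of scheme the paper's code is built on. The grid, time step and Gaussian source are illustrative choices, and the electron fluid equations coupled in the actual code are omitted.

```python
# Minimal 2-D finite-difference time-domain (FDTD) update for the transverse
# magnetic components (Ez, Hx, Hy) with perfectly conducting outer boundaries.
# Grid size, source, and constants are illustrative; the paper's code also
# couples electron continuity/momentum/energy equations, omitted here.
import numpy as np

c0, eps0, mu0 = 299792458.0, 8.8541878128e-12, 4e-7 * np.pi
nx, ny = 200, 200
dx = dy = 1e-4                                    # 0.1 mm cells
dt = 0.99 / (c0 * np.sqrt(1/dx**2 + 1/dy**2))     # Courant-stable time step

Ez = np.zeros((nx, ny))
Hx = np.zeros((nx, ny - 1))
Hy = np.zeros((nx - 1, ny))

for n in range(300):
    Hx -= dt / (mu0 * dy) * (Ez[:, 1:] - Ez[:, :-1])
    Hy += dt / (mu0 * dx) * (Ez[1:, :] - Ez[:-1, :])
    Ez[1:-1, 1:-1] += dt / eps0 * (
        (Hy[1:, 1:-1] - Hy[:-1, 1:-1]) / dx
        - (Hx[1:-1, 1:] - Hx[1:-1, :-1]) / dy
    )
    Ez[nx // 2, ny // 2] += np.exp(-((n - 60) / 20.0) ** 2)  # soft Gaussian source

print("peak |Ez| after 300 steps:", np.abs(Ez).max())
```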

  3. Resonant High Power Combiners

    CERN Document Server

    Langlois, Michel; Peillex-Delphe, Guy

    2005-01-01

    Particle accelerators need radio frequency sources. Above 300 MHz, the amplifiers used have mostly been high power klystrons developed for this sole purpose. As with military equipment, users are drawn to buy "off the shelf" components rather than dedicated devices. IOTs have replaced most klystrons in TV transmitters and are finding their way into particle accelerators. They are less bulky, easier to replace, and more efficient at reduced power. They are also far less powerful. What is the benefit of very compact sources if huge 3 dB couplers are needed to combine the power? To alleviate this drawback, we investigated a resonant combiner, operating in the TM010 mode, able to combine 3 to 5 IOTs. Our IOTs being able to deliver 80 kW CW apiece, the combined power would reach 400 kW minus the minor insertion loss. Values for matching and insertion loss are given. The behavior of the system in case of IOT failure is analyzed.

  4. High power microwaves

    CERN Document Server

    Benford, James; Schamiloglu, Edl

    2016-01-01

    Following in the footsteps of its popular predecessors, High Power Microwaves, Third Edition continues to provide a wide-angle, integrated view of the field of high power microwaves (HPMs). This third edition includes significant updates in every chapter as well as a new chapter on beamless systems that covers nonlinear transmission lines. Written by an experimentalist, a theorist, and an applied theorist, respectively, the book offers complementary perspectives on different source types. The authors address: * How HPM relates historically and technically to the conventional microwave field * The possible applications for HPM and the key criteria that HPM devices have to meet in order to be applied * How high power sources work, including their performance capabilities and limitations * The broad fundamental issues to be addressed in the future for a wide variety of source types The book is accessible to several audiences. Researchers currently in the field can widen their understanding of HPM. Present or pot...

  5. Computing in high energy physics

    International Nuclear Information System (INIS)

    Watase, Yoshiyuki

    1991-01-01

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors

  6. High energy physics computing in Japan

    International Nuclear Information System (INIS)

    Watase, Yoshiyuki

    1989-01-01

    A brief overview of the computing provision for high energy physics in Japan is presented. Most of the computing power for high energy physics is concentrated in KEK. Here there are two large scale systems: one providing a general computing service including vector processing and the other dedicated to TRISTAN experiments. Each university group has a smaller sized mainframe or VAX system to facilitate both their local computing needs and the remote use of the KEK computers through a network. The large computer system for the TRISTAN experiments is described. An overview of a prospective future large facility is also given. (orig.)

  7. Framework Resources Multiply Computing Power

    Science.gov (United States)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  8. A computational modeling approach of the jet-like acoustic streaming and heat generation induced by low frequency high power ultrasonic horn reactors.

    Science.gov (United States)

    Trujillo, Francisco Javier; Knoerzer, Kai

    2011-11-01

    High power ultrasound reactors have gained a lot of interest in the food industry given the effects that can arise from ultrasonic-induced cavitation in liquid foods. However, most of the new food processing developments have been based on empirical approaches. Thus, there is a need for mathematical models which help to understand, optimize, and scale up ultrasonic reactors. In this work, a computational fluid dynamics (CFD) model was developed to predict the acoustic streaming and induced heat generated by an ultrasonic horn reactor. In the model it is assumed that the horn tip is a fluid inlet, where a turbulent jet flow is injected into the vessel. The hydrodynamic momentum rate of the incoming jet is assumed to be equal to the total acoustic momentum rate emitted by the acoustic power source. CFD velocity predictions show excellent agreement with the experimental data for power densities W₀/V ≥ 25 kW m⁻³. This model successfully describes the hydrodynamic fields (streaming) generated by low-frequency, high-power ultrasound. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
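
    A back-of-the-envelope version of the stated momentum-balance closure is sketched below: the jet momentum rate ρAv² at the horn tip is set equal to the acoustic momentum rate, taken here as P/c as for a plane wave. The acoustic power and tip diameter are illustrative assumptions, as is the plane-wave relation itself; it is not claimed to be the paper's exact formulation.

```python
# Sketch of the inlet boundary condition implied by the model assumption above:
# equate the jet's hydrodynamic momentum rate (rho * A * v^2) to the acoustic
# momentum rate emitted by the source, taken here as P/c for a plane wave.
import math

P_ac = 100.0          # acoustic power delivered to the liquid, W (hypothetical)
rho = 1000.0          # water density, kg/m^3
c = 1482.0            # speed of sound in water, m/s
d_tip = 0.02          # horn tip diameter, m (hypothetical)

A = math.pi * d_tip**2 / 4
F = P_ac / c                        # acoustic momentum rate ~ radiation force, N
v_inlet = math.sqrt(F / (rho * A))  # jet inlet velocity for the CFD model, m/s

print(f"momentum rate {F*1e3:.1f} mN -> inlet velocity {v_inlet:.3f} m/s")
```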

  9. Computer-aided design of the RF-cavity for a high-power S-band klystron

    Science.gov (United States)

    Kant, D.; Bandyopadhyay, A. K.; Pal, D.; Meena, R.; Nangru, S. C.; Joshi, L. M.

    2012-08-01

    This article describes the computer-aided design of the RF-cavity for a S-band klystron operating at 2856 MHz. State-of-the-art electromagnetic simulation tools SUPERFISH, CST Microwave studio, HFSS and MAGIC have been used for cavity design. After finalising the geometrical details of the cavity through simulation, it has been fabricated and characterised through cold testing. Detailed results of the computer-aided simulation and cold measurements are presented in this article.
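
    Before any of the simulation tools are brought in, a first-cut cavity size can be estimated from the ideal pillbox TM010 resonance, f = 2.405·c/(2πR); the short sketch below evaluates this for the 2856 MHz operating frequency quoted in the article. The real reentrant klystron cavity will of course differ from this idealization, which is exactly why the detailed electromagnetic simulation and cold testing are needed.

```python
# First-cut estimate (not from the article): the TM010 resonance of an ideal
# pillbox cavity, f = j01 * c / (2 * pi * R), gives a starting radius for a
# 2856 MHz cavity before the reentrant geometry is refined in the EM codes.
import math

c = 299792458.0          # speed of light, m/s
j01 = 2.405              # first zero of the Bessel function J0
f_target = 2.856e9       # Hz, S-band operating frequency from the article

R = j01 * c / (2 * math.pi * f_target)
print(f"ideal pillbox radius for TM010 at 2856 MHz: {R*1000:.1f} mm")
```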

  10. Computer-aided power systems analysis

    CERN Document Server

    Kusic, George

    2008-01-01

    Computer applications yield more insight into system behavior than is possible by using hand calculations on system elements. Computer-Aided Power Systems Analysis: Second Edition is a state-of-the-art presentation of basic principles and software for power systems in steady-state operation. Originally published in 1985, this revised edition explores power systems from the point of view of the central control facility. It covers the elements of transmission networks, bus reference frame, network fault and contingency calculations, power flow on transmission networks, generator base power setti
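
    As a small taste of the steady-state network calculations such software performs, the sketch below solves a lossless DC power flow on a three-bus example. The network data are invented for illustration, and the DC approximation is far simpler than the full AC power-flow methods covered in the book.

```python
# Small DC power-flow sketch: a lossless model solves B' * theta = P for bus
# angles, with bus 0 as the slack. The 3-bus network and injections are invented.
import numpy as np

# line list: (from_bus, to_bus, reactance in per unit)
lines = [(0, 1, 0.1), (1, 2, 0.2), (0, 2, 0.25)]
P = np.array([0.0, -1.0, 0.6])   # net injections (pu); bus 0 is the slack

n = 3
B = np.zeros((n, n))
for i, j, x in lines:
    b = 1.0 / x
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])   # slack angle fixed at 0

for i, j, x in lines:
    print(f"flow {i}->{j}: {(theta[i] - theta[j]) / x:+.3f} pu")
```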

  11. Switching power converters medium and high power

    CERN Document Server

    Neacsu, Dorin O

    2013-01-01

    An examination of all of the multidisciplinary aspects of medium- and high-power converter systems, including basic power electronics, digital control and hardware, sensors, analog preprocessing of signals, protection devices and fault management, and pulse-width-modulation (PWM) algorithms, Switching Power Converters: Medium and High Power, Second Edition discusses the actual use of industrial technology and its related subassemblies and components, covering facets of implementation otherwise overlooked by theoretical textbooks. The updated Second Edition contains many new figures, as well as

  12. Balancing computation and communication power in power constrained clusters

    Science.gov (United States)

    Piga, Leonardo; Paul, Indrani; Huang, Wei

    2018-05-29

    Systems, apparatuses, and methods for balancing computation and communication power in power constrained environments. A data processing cluster with a plurality of compute nodes may perform parallel processing of a workload in a power constrained environment. Nodes that finish tasks early may be power-gated based on one or more conditions. In some scenarios, a node may predict a wait duration and go into a reduced power consumption state if the wait duration is predicted to be greater than a threshold. The power saved by power-gating one or more nodes may be reassigned for use by other nodes. A cluster agent may be configured to reassign the unused power to the active nodes to expedite workload processing.
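
    A toy sketch of the decision rule summarised above: nodes predicting a long wait are power-gated and the freed budget is redistributed to the still-active nodes. The threshold, power figures and even division of the freed budget are illustrative assumptions, not the patented implementation.

      # Toy cluster-agent sketch: gate nodes whose predicted wait exceeds a
      # threshold, then hand their unused power budget to the active nodes so
      # they can finish the workload sooner. All numbers are assumptions.
      GATE_THRESHOLD_S = 2.0     # predicted wait above this -> enter low-power state
      IDLE_POWER_W = 15.0        # power drawn by a power-gated node (assumed)
      ACTIVE_POWER_W = 150.0     # nominal per-node budget (assumed)

      nodes = {                  # node -> predicted wait until next task, seconds
          "n0": 0.0, "n1": 0.5, "n2": 4.0, "n3": 7.5,
      }

      gated = {n for n, wait in nodes.items() if wait > GATE_THRESHOLD_S}
      active = set(nodes) - gated

      # Power freed by gating is redistributed evenly among the active nodes.
      freed = len(gated) * (ACTIVE_POWER_W - IDLE_POWER_W)
      boost_per_active = freed / len(active) if active else 0.0

      for n in sorted(nodes):
          state = "gated" if n in gated else "active"
          budget = IDLE_POWER_W if n in gated else ACTIVE_POWER_W + boost_per_active
          print(f"{n}: {state:6s} budget {budget:6.1f} W")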

  13. Wind power systems. Applications of computational intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lingfeng [Toledo Univ., OH (United States). Dept. of Electrical Engineering and Computer Science; Singh, Chanan [Texas A and M Univ., College Station, TX (United States). Electrical and Computer Engineering Dept.; Kusiak, Andrew (eds.) [Iowa Univ., Iowa City, IA (United States). Mechanical and Industrial Engineering Dept.

    2010-07-01

    Renewable energy sources such as wind power have attracted much attention because they are environmentally friendly, do not produce carbon dioxide and other emissions, and can enhance a nation's energy security. For example, increasingly significant amounts of wind power are being integrated into conventional power grids. Therefore, it is necessary to address various important and challenging issues related to wind power systems, which are significantly different from traditional generation systems. This book is a resource for engineers, practitioners, and decision-makers interested in studying or using the power of computational intelligence based algorithms in handling various important problems in wind power systems at the levels of power generation, transmission, and distribution. Researchers have been developing biologically-inspired algorithms in a wide variety of complex large-scale engineering domains. Distinguished from the traditional analytical methods, the new methods usually accomplish the task through their computationally efficient mechanisms. Computational intelligence methods such as evolutionary computation, neural networks, and fuzzy systems have attracted much attention in electric power systems. Meanwhile, modern electric power systems are becoming more and more complex in order to meet the demands of the growing electricity market. In particular, grid complexity is continuously increased by the integration of intermittent wind power as well as the current restructuring efforts in the electricity industry. Quite often, the traditional analytical methods become less efficient or even unable to handle this increased complexity. As a result, it is natural to apply computational intelligence as a powerful tool to deal with various important and pressing problems in current wind power systems. This book presents the state-of-the-art development in the field of computational intelligence applied to wind power systems by reviewing the most up

  14. Automated System Tests High-Power MOSFET's

    Science.gov (United States)

    Huston, Steven W.; Wendt, Isabel O.

    1994-01-01

    Computer-controlled system tests metal-oxide/semiconductor field-effect transistors (MOSFET's) at high voltages and currents. Measures seven parameters characterizing performance of MOSFET, with a view toward obtaining early indication that MOSFET is defective. Use of test system prior to installation of power MOSFET in high-power circuit saves time and money.

  15. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  16. High performance multi-scale and multi-physics computation of nuclear power plant subjected to strong earthquake. An Overview

    International Nuclear Information System (INIS)

    Yoshimura, Shinobu; Kawai, Hiroshi; Sugimoto, Shin'ichiro; Hori, Muneo; Nakajima, Norihiro; Kobayashi, Kei

    2010-01-01

    Recently the importance of nuclear energy has been recognized again due to serious concerns about global warming and energy security. In parallel, it is one of the critical issues to verify the safety capability of ageing nuclear power plants (NPPs) subjected to strong earthquakes. Since 2007, we have been developing a multi-scale and multi-physics based numerical simulator for quantitatively predicting the actual quake-proof capability of ageing NPPs under operation or just after plant trip when subjected to a strong earthquake. In this paper, we describe an overview of the simulator with some preliminary results. (author)

  17. High-performance computing using FPGAs

    CERN Document Server

    Benkrid, Khaled

    2013-01-01

    This book is concerned with the emerging field of High Performance Reconfigurable Computing (HPRC), which aims to harness the high performance and relative low power of reconfigurable hardware–in the form of Field Programmable Gate Arrays (FPGAs)–in High Performance Computing (HPC) applications. It presents the latest developments in this field from the applications, architecture, and tools and methodologies points of view. We hope that this work will form a reference for existing researchers in the field, and entice new researchers and developers to join the HPRC community. The book includes: Thirteen application chapters which present the most important application areas tackled by high performance reconfigurable computers, namely: financial computing, bioinformatics and computational biology, data search and processing, stencil computation (e.g. computational fluid dynamics and seismic modeling), cryptanalysis, astronomical N-body simulation, and circuit simulation. Seven architecture chapters which...

  18. Architectural analysis for wirelessly powered computing platforms

    NARCIS (Netherlands)

    Kapoor, A.; Pineda de Gyvez, J.

    2013-01-01

    We present a design framework for wirelessly powered generic computing platforms that takes into account various system parameters in response to a time-varying energy source. These parameters are the charging profile of the energy source, computing speed (fclk), digital supply voltage (VDD), energy

  19. A Low-Power High-Speed Spintronics-Based Neuromorphic Computing System Using Real Time Tracking Method

    DEFF Research Database (Denmark)

    Farkhani, Hooman; Tohidi, Mohammad; Farkhani, Sadaf

    2018-01-01

    In spintronic-based neuromorphic computing systems (NCS), the switching of magnetic moment in a magnetic tunnel junction (MTJ) is used to mimic neuron firing. However, the stochastic switching behavior of the MTJ and process variation effects lead to a significant increase in stimulation time...... of such NCSs. Moreover, current NCSs need an extra phase to read the MTJ state after stimulation, which is in contrast with real neuron functionality in the human body. In this paper, the read circuit is replaced with a proposed real-time sensing (RTS) circuit. The RTS circuit tracks the MTJ state during...... stimulation phase. As soon as switching happens, the RTS circuit terminates the MTJ current and stimulates the post neuron. Hence, the RTS circuit not only improves the energy consumption and speed, but also makes the operation of the NCS similar to real neuron functionality. The simulation results in 65-nm CMOS
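
    The energy argument above can be illustrated with a purely behavioural sketch (not a circuit model): because the MTJ switching time is stochastic, a fixed-duration stimulation must cover the slow tail of the distribution, whereas terminating the current at the detected switching instant pays only for the actual switching time. The current, voltage, timing and switching-time distribution below are assumed for illustration.

      # Behavioural sketch of why early termination of the MTJ write current
      # saves energy compared with a fixed worst-case stimulation window.
      # All numbers and the switching-time model are illustrative assumptions.
      import random

      random.seed(1)
      I_WRITE = 100e-6        # write current, A (assumed)
      V_MTJ = 0.4             # voltage across the MTJ branch, V (assumed)
      T_FIXED = 20e-9         # worst-case fixed stimulation time, s (assumed)

      def switching_time():
          # crude stochastic model: median of a few nanoseconds, heavy spread
          return random.lognormvariate(mu=-19.0, sigma=0.5)

      trials = 10000
      e_fixed = trials * V_MTJ * I_WRITE * T_FIXED
      e_rts = sum(V_MTJ * I_WRITE * min(switching_time(), T_FIXED) for _ in range(trials))

      print(f"fixed-duration energy : {e_fixed * 1e12:.1f} pJ")
      print(f"early-termination     : {e_rts * 1e12:.1f} pJ "
            f"({100 * (1 - e_rts / e_fixed):.0f}% saved in this toy model)")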

  20. High Power Density Motors

    Science.gov (United States)

    Kascak, Daniel J.

    2004-01-01

    With the growing concerns of global warming, the need for pollution-free vehicles is ever increasing. Pollution-free flight is one of NASA's goals for the 21st Century. One method of approaching that goal is hydrogen-fueled aircraft that use fuel cells or turbo-generators to develop electric power that can drive electric motors that turn the aircraft's propulsive fans or propellers. Hydrogen fuel would likely be carried as a liquid, stored in tanks at its boiling point of 20.5 K (-422.5 F). Conventional electric motors, however, are far too heavy (for a given horsepower) to use on aircraft. Fortunately the liquid hydrogen fuel can provide essentially free refrigeration that can be used to cool the windings of motors before the hydrogen is used for fuel. Either High Temperature Superconductors (HTS) or high purity metals such as copper or aluminum may be used in the motor windings. Superconductors have essentially zero electrical resistance to steady current. The electrical resistance of high purity aluminum or copper near liquid hydrogen temperature can be 1/100th or less of the room temperature resistance. These conductors could provide higher motor efficiency than normal room-temperature motors achieve. But much more importantly, these conductors can carry ten to a hundred times more current than copper conductors do in normal motors operating at room temperature. This is a consequence of the low electrical resistance and of good heat transfer coefficients in boiling LH2. Thus the conductors can produce higher magnetic field strengths and consequently higher motor torque and power. Designs, analysis and actual cryogenic motor tests show that such cryogenic motors could produce three or more times as much power per unit weight as turbine engines can, whereas conventional motors produce only 1/5 as much power per weight as turbine engines. This summer work has been done with Litz wire to maximize the current density. The current is limited by the amount of heat it

  1. High-power electronics

    CERN Document Server

    Kapitsa, Petr Leonidovich

    1966-01-01

    High-Power Electronics, Volume 2 presents the electronic processes in devices of the magnetron type and electromagnetic oscillations in different systems. This book explores the problems of electronic energetics. Organized into 11 chapters, this volume begins with an overview of the motion of electrons in a flat model of the magnetron, taking into account the in-phase wave and the reverse wave. This text then examines the processes of transmission of electromagnetic waves of various polarization and the wave reflection from grids made of periodically distributed infinite metal conductors. Other

  2. High Power Vanadate lasers

    CSIR Research Space (South Africa)

    Strauss

    2006-07-01

    Full Text Available. Laser Research Institute, University of Stellenbosch (www.laser-research.co.za). High Power Vanadate lasers. H.J. Strauss, Dr. C. Bollig, R.C. Botha, Prof. H.M. von Bergmann, Dr. J.P. Burger. Aims: 1) to develop new techniques to mount laser crystals, 2) compare the lasing properties...

  3. High power coaxial ubitron

    Science.gov (United States)

    Balkcum, Adam J.

    In the ubitron, also known as the free electron laser, high power coherent radiation is generated from the interaction of an undulating electron beam with an electromagnetic signal and a static periodic magnetic wiggler field. These devices have experimentally produced high power spanning the microwave to x-ray regimes. Potential applications range from microwave radar to the study of solid state material properties. In this dissertation, the efficient production of high power microwaves (HPM) is investigated for a ubitron employing a coaxial circuit and wiggler. Designs for the particular applications of an advanced high gradient linear accelerator driver and a directed energy source are presented. The coaxial ubitron is inherently suited for the production of HPM. It utilizes an annular electron beam to drive the low loss, RF breakdown resistant TE01 mode of a large coaxial circuit. The device's large cross-sectional area greatly reduces RF wall heat loading and the current density loading at the cathode required to produce the moderate energy (500 keV) but high current (1-10 kA) annular electron beam. Focusing and wiggling of the beam is achieved using coaxial annular periodic permanent magnet (PPM) stacks without a solenoidal guide magnetic field. This wiggler configuration is compact, efficient and can propagate the multi-kiloampere electron beams required for many HPM applications. The coaxial PPM ubitron in a traveling wave amplifier, cavity oscillator and klystron configuration is investigated using linear theory and simulation codes. A condition for the dc electron beam stability in the coaxial wiggler is derived and verified using the 2-1/2 dimensional particle-in-cell code, MAGIC. New linear theories for the cavity start-oscillation current and gain in a klystron are derived. A self-consistent nonlinear theory for the ubitron-TWT and a new nonlinear theory for the ubitron oscillator are presented. These form the basis for simulation codes which, along

  4. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  5. High-End Scientific Computing

    Science.gov (United States)

    EPA uses high-end scientific computing, geospatial services and remote sensing/imagery analysis to support EPA's mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions to meet staff needs in these areas.

  6. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Sarah; Devenish, Robin [Nuclear Physics Laboratory, Oxford University (United Kingdom)

    1989-07-15

    Computing in high energy physics has changed over the years from being something one did on a slide-rule, through early computers, then a necessary evil to the position today where computers permeate all aspects of the subject from control of the apparatus to theoretical lattice gauge calculations. The state of the art, as well as new trends and hopes, were reflected in this year's 'Computing In High Energy Physics' conference held in the dreamy setting of Oxford's spires. The conference aimed to give a comprehensive overview, entailing a heavy schedule of 35 plenary talks plus 48 contributed papers in two afternoons of parallel sessions. In addition to high energy physics computing, a number of papers were given by experts in computing science, in line with the conference's aim – 'to bring together high energy physicists and computer scientists'.

  7. Computing in high energy physics

    International Nuclear Information System (INIS)

    Smith, Sarah; Devenish, Robin

    1989-01-01

    Computing in high energy physics has changed over the years from being something one did on a slide-rule, through early computers, then a necessary evil to the position today where computers permeate all aspects of the subject from control of the apparatus to theoretical lattice gauge calculations. The state of the art, as well as new trends and hopes, were reflected in this year's 'Computing In High Energy Physics' conference held in the dreamy setting of Oxford's spires. The conference aimed to give a comprehensive overview, entailing a heavy schedule of 35 plenary talks plus 48 contributed papers in two afternoons of parallel sessions. In addition to high energy physics computing, a number of papers were given by experts in computing science, in line with the conference's aim – 'to bring together high energy physicists and computer scientists'

  8. Computer Architecture Techniques for Power-Efficiency

    CERN Document Server

    Kaxiras, Stefanos

    2008-01-01

    In the last few years, power dissipation has become an important design constraint, on par with performance, in the design of new computer systems. Whereas in the past, the primary job of the computer architect was to translate improvements in operating frequency and transistor count into performance, now power efficiency must be taken into account at every step of the design process. While for some time, architects have been successful in delivering 40% to 50% annual improvement in processor performance, costs that were previously brushed aside eventually caught up. The most critical of these

  9. High Performance Computing Multicast

    Science.gov (United States)

    2012-02-01

  10. Plant computer system in nuclear power station

    International Nuclear Information System (INIS)

    Kato, Shinji; Fukuchi, Hiroshi

    1991-01-01

    In nuclear power stations, a centrally concentrated monitoring system has been adopted, and large quantities of information and operational equipment are concentrated in the central control rooms, which therefore become the important place of communication between the plant and the operators. More recently, owing to the increase in unit capacity, the strengthening of safety, the problems of the man-machine interface and so on, it has become important to concentrate information and to automate and simplify machinery and equipment in order to improve the operational environment, reliability and so on. As an example of the relation between nuclear power stations and computer systems, to which attention has recently been paid as the man-machine interface, the Tsuruga Power Station of the Japan Atomic Power Co. is described. The No. 2 plant in the Tsuruga Power Station is a PWR plant with 1160 MWe output and a home-built standardized plant; accordingly, the computer system adopted there is explained. The fundamental concept of the central control board, the process computer system, the design policy, basic system configuration, reliability and maintenance, CRT display, and the computer system for the No. 1 BWR 357 MW plant are reported. (K.I.)

  11. INSPIRED High School Computing Academies

    Science.gov (United States)

    Doerschuk, Peggy; Liu, Jiangjiang; Mann, Judith

    2011-01-01

    If we are to attract more women and minorities to computing we must engage students at an early age. As part of its mission to increase participation of women and underrepresented minorities in computing, the Increasing Student Participation in Research Development Program (INSPIRED) conducts computing academies for high school students. The…

  12. Nuclear power flies high

    International Nuclear Information System (INIS)

    Friedman, S.T.

    1983-01-01

    Nuclear power in aircraft, rockets and satellites is discussed. No nuclear-powered rockets or aircraft have ever flown, but ground tests were successful. Nuclear reactors are used in the Soviet Cosmos series of satellites, but only one American satellite, the SNAP-10A, contained a reactor. Radioisotope thermoelectric generators, many of which use plutonium-238, have powered more than 20 satellites launched into deep space by the U.S.A.

  13. Cloud Computing and the Power to Choose

    Science.gov (United States)

    Bristow, Rob; Dodds, Ted; Northam, Richard; Plugge, Leo

    2010-01-01

    Some of the most significant changes in information technology are those that have given the individual user greater power to choose. The first of these changes was the development of the personal computer. The PC liberated the individual user from the limitations of the mainframe and minicomputers and from the rules and regulations of centralized…

  14. High Efficiency Power Converter for Low Voltage High Power Applications

    DEFF Research Database (Denmark)

    Nymand, Morten

    The topic of this thesis is the design of high efficiency power electronic dc-to-dc converters for high-power, low-input-voltage to high-output-voltage applications. These converters are increasingly required for emerging sustainable energy systems such as fuel cell, battery or photovoltaic based......, and remote power generation for light towers, camper vans, boats, beacons, and buoys etc. A review of the current state-of-the-art is presented. The best performing converters achieve moderately high peak efficiencies at high input voltage and medium power level. However, system dimensioning and cost are often

  15. Wirelessly powered sensor networks and computational RFID

    CERN Document Server

    2013-01-01

    The Wireless Identification and Sensing Platform (WISP) is the first of a new class of RF-powered sensing and computing systems.  Rather than being powered by batteries, these sensor systems are powered by radio waves that are either deliberately broadcast or ambient.  Enabled by ongoing exponential improvements in the energy efficiency of microelectronics, RF-powered sensing and computing is rapidly moving along a trajectory from impossible (in the recent past), to feasible (today), toward practical and commonplace (in the near future). This book is a collection of key papers on RF-powered sensing and computing systems including the WISP.  Several of the papers grew out of the WISP Challenge, a program in which Intel Corporation donated WISPs to academic applicants who proposed compelling WISP-based projects.  The book also includes papers presented at the first WISP Summit, a workshop held in Berkeley, CA in association with the ACM Sensys conference, as well as other relevant papers. The book provides ...

  16. High Power Orbit Transfer Vehicle

    National Research Council Canada - National Science Library

    Gulczinski, Frank

    2003-01-01

    ... from Virginia Tech University and Aerophysics, Inc. to examine propulsion requirements for a high-power orbit transfer vehicle using thin-film voltaic solar array technologies under development by the Space Vehicles Directorate (dubbed PowerSail...

  17. High Efficiency Power Converter for Low Voltage High Power Applications

    DEFF Research Database (Denmark)

    Nymand, Morten

    The topic of this thesis is the design of high efficiency power electronic dc-to-dc converters for high-power, low-input-voltage to high-output-voltage applications. These converters are increasingly required for emerging sustainable energy systems such as fuel cell, battery or photovoltaic based...

  18. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    Science.gov (United States)

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  19. High power excimer laser

    International Nuclear Information System (INIS)

    Oesterlin, P.; Muckenheim, W.; Basting, D.

    1988-01-01

    Excimer lasers emitting more than 200 W output power are not commercially available. A significant increase requires new technological efforts with respect to both the gas circulation and the discharge system. The authors report how a research project has yielded a laser which emits 0.5 kW at 308 nm when UV-preionized and operated at a repetition rate of 300 Hz. The laser, which is capable of operating at 500 Hz, can be equipped with an x-ray preionization module. After completion of this project, 1 kW of output power will be available

  20. Computing in high energy physics

    International Nuclear Information System (INIS)

    Hertzberger, L.O.; Hoogland, W.

    1986-01-01

    This book deals with advanced computing applications in physics, and in particular in high energy physics environments. The main subjects covered are networking; vector and parallel processing; and embedded systems. Also examined are topics such as operating systems, future computer architectures and commercial computer products. The book presents solutions that are foreseen as coping, in the future, with computing problems in experimental and theoretical High Energy Physics. In the experimental environment the large amounts of data to be processed offer special problems on-line as well as off-line. For on-line data reduction, embedded special purpose computers, which are often used for trigger applications, are applied. For off-line processing, parallel computers such as emulator farms and the cosmic cube may be employed. The analysis of these topics is therefore a main feature of this volume

  1. High average power supercontinuum sources

    Indian Academy of Sciences (India)

    The physical mechanisms and basic experimental techniques for the creation of high average spectral power supercontinuum sources are briefly reviewed. We focus on the use of high-power ytterbium-doped fibre lasers as pump sources, and the use of highly nonlinear photonic crystal fibres as the nonlinear medium.

  2. Future Computing Platforms for Science in a Power Constrained Era

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Eulisse, Giulio; Elmer, Peter; Knight, Robert

    2015-01-01

    Power consumption will be a key constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics (HEP). This makes performance-per-watt a crucial metric for selecting cost-efficient computing solutions. For this paper, we have done a wide survey of current and emerging architectures becoming available on the market including x86-64 variants, ARMv7 32-bit, ARMv8 64-bit, Many-Core and GPU solutions, as well as newer System-on-Chip (SoC) solutions. We compare performance and energy efficiency using an evolving set of standardized HEP-related benchmarks and power measurement techniques we have been developing. We evaluate the potential for use of such computing solutions in the context of DHTC systems, such as the Worldwide LHC Computing Grid (WLCG). (paper)
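
    A minimal sketch of how a performance-per-watt figure of merit can be formed from a benchmark run and sampled power readings, in the spirit of the comparison described above; the workload size, wall time and power samples are invented, and the actual survey relies on standardized HEP-related benchmarks rather than this toy calculation.

      # Sketch: turn a benchmark result and sampled power readings into
      # throughput, energy-to-solution and performance-per-watt figures.
      samples_w = [61.2, 63.0, 64.1, 62.8, 60.9, 63.5]   # power meter samples, W (invented)
      events_processed = 1.2e6                            # benchmark workload size (invented)
      wall_time_s = 540.0                                 # time-to-solution, s (invented)

      avg_power_w = sum(samples_w) / len(samples_w)
      throughput = events_processed / wall_time_s         # events per second
      energy_to_solution_j = avg_power_w * wall_time_s
      perf_per_watt = throughput / avg_power_w            # events / (s * W) = events / J

      print(f"throughput          : {throughput:.1f} events/s")
      print(f"average power       : {avg_power_w:.1f} W")
      print(f"energy-to-solution  : {energy_to_solution_j / 1000:.1f} kJ")
      print(f"performance per watt: {perf_per_watt:.2f} events/J")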

  3. High performance computing in linear control

    International Nuclear Information System (INIS)

    Datta, B.N.

    1993-01-01

    Remarkable progress has been made in both theory and applications of all important areas of control. The theory is rich and very sophisticated. Some beautiful applications of control theory are presently being made in aerospace, biomedical engineering, industrial engineering, robotics, economics, power systems, etc. Unfortunately, the same assessment of progress does not hold in general for computations in control theory. Control Theory is lagging behind other areas of science and engineering in this respect. Nowadays there is a revolution going on in the world of high performance scientific computing. Many powerful computers with vector and parallel processing have been built and have been available in recent years. These supercomputers offer very high speed in computations. Highly efficient software, based on powerful algorithms, has been developed for use on these advanced computers, and has also contributed to increased performance. While workers in many areas of science and engineering have taken great advantage of these hardware and software developments, control scientists and engineers, unfortunately, have not been able to take much advantage of these developments

  4. Fault tolerant embedded computers and power electronics for nuclear robotics

    International Nuclear Information System (INIS)

    Giraud, A.; Robiolle, M.

    1995-01-01

    For the requirements of the nuclear industries, it is necessary to use embedded rad-tolerant electronics and to ensure high-level safety. In this paper, we first describe a computer architecture called MICADO designed for the French nuclear industry. We then present ongoing projects in our industry. A special point is made on power electronics for remote-operated and legged robots. (authors). 7 refs., 2 figs

  5. Fault tolerant embedded computers and power electronics for nuclear robotics

    Energy Technology Data Exchange (ETDEWEB)

    Giraud, A.; Robiolle, M.

    1995-12-31

    For the requirements of the nuclear industries, it is necessary to use embedded rad-tolerant electronics and to ensure high-level safety. In this paper, we first describe a computer architecture called MICADO designed for the French nuclear industry. We then present ongoing projects in our industry. A special point is made on power electronics for remote-operated and legged robots. (authors). 7 refs., 2 figs.

  6. High-powered manoeuvres

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This week, CERN received the latest new transformers for the SPS. Stored in pairs in 24-tonne steel containers, these transformers will replace the old models, which have been in place since 1981.     The transformers arrive at SPS's access point 4 (BA 4). During LS1, the TE-EPC Group will be replacing all of the transformers for the main converters of the SPS. This renewal campaign is being carried out as part of the accelerator consolidation programme, which began at the start of April and will come to an end in November. It involves 80 transformers: 64 with a power of 2.6 megavolt-amperes (MVA) for the dipole magnets, and 16 with 1.9 MVA for the quadrupoles. These new transformers were manufactured by an Italian company and are being installed outside the six access points of the SPS by the EN-HE Group, using CERN's 220-tonne crane. They will contribute to the upgrade of the SPS, which should thus continue to operate as the injector for the LHC until 2040....

  7. Autonomously managed high power systems

    International Nuclear Information System (INIS)

    Weeks, D.J.; Bechtel, R.T.

    1985-01-01

    The need for autonomous power management capabilities will increase as the power levels of spacecraft increase into the multi-100 kW range. The quantity of labor-intensive ground and crew support consumed by the 9 kW Skylab cannot be afforded in support of a 75-300 kW Space Station or high power earth orbital and interplanetary spacecraft. Marshall Space Flight Center is managing a program to develop the necessary technologies for high power system autonomous management. To date a reference electrical power system and automation approaches have been defined. A test facility for evaluation and verification of management algorithms and hardware has been designed, with the first of its three power channels nearing completion

  8. Reviewing computer capabilities in nuclear power plants

    International Nuclear Information System (INIS)

    1990-06-01

    The OSART programme of the IAEA has become an effective vehicle for promoting international co-operation for the enhancement of plant operational safety. In order to maintain consistency in the OSART reviews, OSART Guidelines have been developed which are intended to ensure that the reviewing process is comprehensive. Computer technology is an area in which rapid development is taking place and new applications may be computerized to further enhance safety and the effectiveness of the plant. Supplementary guidance and reference material is needed to help attain comprehensiveness and consistency in OSART reviews. This document is devoted to the utilization of on-site and off-site computers in such a way that the safe operation of the plant is supported. In addition to the main text, there are several annexes illustrating adequate practices as found at various operating nuclear power plants. Refs, figs and tabs

  9. EURISOL High Power Targets

    CERN Document Server

    Kadi, Y; Lindroos, M; Ridikas, D; Stora, T; Tecchio, L; CERN. Geneva. BE Department

    2009-01-01

    Modern Nuclear Physics requires access to higher yields of rare isotopes, which relies on further development of the In-flight and Isotope Separation On-Line (ISOL) production methods. The limits of the In-flight method will be explored at the next generation facilities FAIR in Germany, RIKEN in Japan and FRIB in the USA. The ISOL method will be explored at facilities including ISAC-TRIUMF in Canada, SPIRAL-2 in France, SPES in Italy, ISOLDE at CERN and eventually at the very ambitious multi-MW EURISOL facility. ISOL and in-flight facilities are complementary entities. While in-flight facilities excel in the production of very short lived radioisotopes independently of their chemical nature, ISOL facilities provide high Radioisotope Beam (RIB) intensities and excellent beam quality for 70 elements. Both production schemes are opening vast and rich fields of nuclear physics research. In this article we will introduce the targets planned for the EURISOL facility and highlight some of the technical and safety cha...

  10. Computational engineering applied to the concentrating solar power technology

    International Nuclear Information System (INIS)

    Giannuzzi, Giuseppe Mauro; Miliozzi, Adio

    2006-01-01

    Solar power plants based on parabolic-trough collectors present innumerable thermo-structural problems related on the one hand to the high temperatures of the heat transfer fluid, and on the other to the need of highly precise aiming and structural resistance. Devising an engineering response to these problems implies analysing generally unconventional solutions. At present, computational engineering is the principal investigating tool; it speeds the design of prototype installations and significantly reduces the necessary but costly experimental programmes [it]

  11. Applications of high power microwaves

    International Nuclear Information System (INIS)

    Benford, J.; Swegle, J.

    1993-01-01

    The authors address a number of applications for HPM technology. There is a strong symbiotic relationship between a developing technology and its emerging applications. New technologies can generate new applications. Conversely, applications can demand development of new technological capability. High-power microwave generating systems come with size and weight penalties and problems associated with the x-radiation and collection of the electron beam. Acceptance of these difficulties requires the identification of a set of applications for which high-power operation is either demanded or results in significant improvements in performance. The authors identify the following applications, and discuss their requirements and operational issues: (1) High-energy RF acceleration; (2) Atmospheric modification (both to produce artificial ionospheric mirrors for radio waves and to save the ozone layer); (3) Radar; (4) Electronic warfare; and (5) Laser pumping. In addition, they discuss several applications requiring higher average power that border on HPM: power beaming and plasma heating

  12. Modular High Voltage Power Supply

    Energy Technology Data Exchange (ETDEWEB)

    Newell, Matthew R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-18

    The goal of this project is to develop a modular high voltage power supply that will meet the needs of safeguards applications and provide a modular plug and play supply for use with standard electronic racks.

  13. Modelling, simulation and computer-aided design (CAD) of gyrotrons for novel applications in the high-power terahertz science and technologies

    Science.gov (United States)

    Sabchevski, S.; Idehara, T.; Damyanova, M.; Zhelyazkov, I.; Balabanova, E.; Vasileva, E.

    2018-03-01

    Gyrotrons are the most powerful sources of CW coherent radiation in the sub-THz and THz frequency bands. In recent years, they have demonstrated a remarkable potential for bridging the so-called THz-gap in the electromagnetic spectrum and opened the road to many novel applications of the terahertz waves. Among them are various advanced spectroscopic techniques (e.g., ESR and DNP-NMR), plasma physics and fusion research, materials processing and characterization, imaging and inspection, new medical technologies and biological studies. In this paper, we review briefly the current status of the research in this broad field and present our problem-oriented software packages developed recently for numerical analysis, computer-aided design (CAD) and optimization of gyrotrons.

  14. High power klystrons for efficient reliable high power amplifiers

    Science.gov (United States)

    Levin, M.

    1980-11-01

    This report covers the design of reliable high efficiency, high power klystrons which may be used in both existing and proposed troposcatter radio systems. High Power (10 kW) klystron designs were generated in C-band (4.4 GHz to 5.0 GHz), S-band (2.5 GHz to 2.7 GHz), and L-band or UHF frequencies (755 MHz to 985 MHz). The tubes were designed for power supply compatibility and use with a vapor/liquid phase heat exchanger. Four (4) S-band tubes were developed in the course of this program along with two (2) matching focusing solenoids and two (2) heat exchangers. These tubes use five (5) tuners with counters which are attached to the focusing solenoids. A reliability mathematical model of the tube and heat exchanger system was also generated.

  15. Trend of computer-based console for nuclear power plants

    International Nuclear Information System (INIS)

    Wajima, Tsunetaka; Serizawa, Michiya

    1975-01-01

    The amount of information to be watched by the operators in the central operation room increased with the increase in the capacity of nuclear power generation plants, and the necessity of computer-based consoles, in which information is consolidated and the interface between the operators and the plants is rationalized by introducing CRT displays and process computers, came to be recognized. The integrated monitoring and controlling system is explained briefly by taking Dungeness B Nuclear Power Station in Britain as a typical example. This power station comprises two AGRs, and these two plants can be controlled from one central control room, each by one man. Three computers, including one stand-by, are installed. Each computer has a core memory of 16 K words (24 bits/word), and 4 magnetic drums of 256 K words are installed as the external memory. The peripheral equipment comprises 12 CRT displays, 6 typewriters, and a high speed tape reader and tape punch for each plant. The display and recording of plant data, the analysis, display and recording of alarms, the control of the plants including the reactors, and post-incident recording are assigned to the computers. At Hitachi Ltd. in Japan, the introduction of color CRTs, the development of operating consoles, new data-accessing methods, and consoles for maintenance management are in progress. (Kako, I.)

  16. Computer-assisted power plant management

    International Nuclear Information System (INIS)

    Boettcher, D.

    1990-01-01

    Operating a power plant and keeping it operational is ensured by a multiplicity of technical management subtasks which are cross-referenced and based on an extensive inventory of descriptive and operational plant data. These data stocks are still registered in an isolated mode and managed and updated manually, which is a labor-intensive, error-prone procedure. In this situation, the introduction of a computer-assisted plant management system, whose core is a database of assured quality common to all activities, and which contains standardized processing aids fully planned for the subtasks occurring in the plant, is likely to achieve a considerable improvement in the quality of plant management and to relieve the staff of administrative activities. (orig.) [de]

  17. New high power linacs and beam physics

    International Nuclear Information System (INIS)

    Wangler, T.P.; Gray, E.R.; Nath, S.; Crandall, K.R.; Hasegawa, K.

    1997-01-01

    New high-power proton linacs must be designed to control beam loss, which can lead to radioactivation of the accelerator. The threat of beam loss is increased significantly by the formation of beam halo. Numerical simulation studies have identified the space-charge interactions, especially those that occur in rms mismatched beams, as a major concern for halo growth. The maximum-amplitude predictions of the simulation codes must be subjected to independent tests to confirm the validity of the results. Consequently, the authors compare predictions from the particle-core halo models with computer simulations to test their understanding of the halo mechanisms that are incorporated in the computer codes. They present and discuss scaling laws that provide guidance for high-power linac design
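
    The particle-core picture referred to above can be sketched in a few lines: the core radius obeys an envelope equation, and a test particle moves in the core's space-charge field, gaining amplitude when the core is rms mismatched. The equations below are the standard particle-core toy model for a uniform focusing channel; the parameter values and the crude integrator are illustrative assumptions, not the production simulation codes or halo models discussed in the abstract.

      # Toy particle-core model: integrate the core envelope R and a test
      # particle x; rms mismatch of the core drives growth of the particle
      # amplitude (the halo mechanism). Parameters are illustrative.
      k0 = 1.0        # zero-current focusing strength (assumed)
      K = 0.5         # space-charge perveance (assumed)
      eps = 0.3       # emittance term (assumed)

      # matched radius: k0^2 R - K/R - eps^2/R^3 = 0, found by a crude scan
      R_m = min((r * 1e-3 for r in range(1, 5000)),
                key=lambda r: abs(k0**2 * r - K / r - eps**2 / r**3))

      mismatch = 1.5                  # injected core radius / matched radius
      R, dR = mismatch * R_m, 0.0     # rms-mismatched core
      x, dx = 0.9 * R, 0.0            # test particle initially inside the core

      ds, steps, x_max = 1e-3, 200000, 0.0
      for _ in range(steps):          # simple semi-implicit Euler integration
          ddR = -k0**2 * R + K / R + eps**2 / R**3
          F = K * x / R**2 if abs(x) <= R else K / x   # space-charge force on the particle
          ddx = -k0**2 * x + F
          dR += ddR * ds; R += dR * ds
          dx += ddx * ds; x += dx * ds
          x_max = max(x_max, abs(x))

      print(f"matched radius ~ {R_m:.3f}, max particle excursion ~ {x_max / R_m:.2f} x matched radius")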

  18. Pulsed high-power beams

    International Nuclear Information System (INIS)

    Reginato, L.L.; Birx, D.L.

    1988-01-01

    The marriage of induction linac technology with nonlinear magnetic modulators has produced some unique capabilities. It is now possible to produce short-pulse electron beams with average currents measured in amperes, at gradients approaching 1-MeV/m, and with power efficiencies exceeding 50%. This paper reports on a 70-MeV, 3-kA induction accelerator (ETA II) constructed at the Lawrence Livermore National Laboratory that incorporates the pulse technology concepts that have evolved over the past several years. The ETA II is a linear induction accelerator and provides a test facility for demonstration of the high-average-power components and high-brightness sources used in such accelerators. The pulse drive of the accelerator is based on state-of-the-art magnetic pulse compressors with very high peak-power capability, repetition rates exceeding 1 kHz, and excellent reliability

  19. Computer-aided engineering in High Energy Physics

    International Nuclear Information System (INIS)

    Bachy, G.; Hauviller, C.; Messerli, R.; Mottier, M.

    1988-01-01

    Computing, a standard tool for a long time in the High Energy Physics community, is being slowly introduced at CERN in the mechanical engineering field. The first major application was structural analysis, followed by Computer-Aided Design (CAD). Development work is now progressing towards Computer-Aided Engineering around a powerful database. This paper gives examples of the power of this approach applied to engineering for accelerators and detectors

  20. Advanced Output Coupling for High Power Gyrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Read, Michael [Calabazas Creek Research, Inc., San Mateo, CA (United States); Ives, Robert Lawrence [Calabazas Creek Research, Inc., San Mateo, CA (United States); Marsden, David [Calabazas Creek Research, Inc., San Mateo, CA (United States); Collins, George [Calabazas Creek Research, Inc., San Mateo, CA (United States); Temkin, Richard [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Guss, William [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Lohr, John [General Atomics, La Jolla, CA (United States); Neilson, Jeffrey [Lexam Research, Redwood City, CA (United States); Bui, Thuc [Calabazas Creek Research, Inc., San Mateo, CA (United States)

    2016-11-28

    The Phase II program developed an internal RF coupler that transforms the whispering gallery RF mode produced in gyrotron cavities to an HE11 waveguide mode propagating in corrugated waveguide. This power is extracted from the vacuum using a broadband, chemical vapor deposited (CVD) diamond, Brewster angle window capable of transmitting more than 1.5 MW CW of RF power over a broad range of frequencies. This coupling system eliminates the Mirror Optical Units now required to externally couple Gaussian output power into corrugated waveguide, significantly reducing system cost and increasing efficiency. The program simulated the performance using a broad range of advanced computer codes to optimize the design. Both a direct coupler and Brewster angle window were built and tested at low and high power. Test results confirmed the performance of both devices and demonstrated they are capable of achieving the required performance for scientific, defense, industrial, and medical applications.

  1. High power laser exciter accelerators

    International Nuclear Information System (INIS)

    Martin, T.H.

    1975-01-01

    Recent developments in untriggered oil and water switching now permit the construction of compact, high energy density pulsed power sources for laser excitation. These accelerators, developed principally for electron beam fusion studies, appear adaptable to laser excitation and will provide electron beams of 10¹³ to 10¹⁴ W in the next several years. The accelerators proposed for e-beam fusion essentially concentrate the available power from the outside edge of a disk into the central region where the electron beam is formed. One of the main problem areas, that of power flow at the vacuum diode insulator, is greatly alleviated by the multiplicity of electron beams that are allowable for laser excitation. A proposal is made whereby the disk-shaped pulsed power sections are stacked vertically to form a series of radially flowing electron beams to excite the laser gas volume. (auth)

  2. High power fast ramping power supplies

    Energy Technology Data Exchange (ETDEWEB)

    Marneris,I.; Bajon, E.; Bonati, R.; Sandberg, J.; Roser, T.; Tsoupas, N.

    2009-05-04

    Hundred-megawatt-level fast ramping power converters to drive proton and heavy ion machines are under research and development at accelerator facilities around the world. This is a leading edge technology. There are several topologies to achieve this power level. Their advantages and related issues will be discussed.

  3. Application of computational intelligence in emerging power systems

    African Journals Online (AJOL)

    ... in the electrical engineering applications. This paper highlights the application of computational intelligence methods to power system problems. Various types of CI methods, which are widely used in power systems, are also discussed in brief. Keywords: Power systems, computational intelligence, artificial intelligence.

  4. High-power, high-efficiency FELs

    International Nuclear Information System (INIS)

    Sessler, A.M.

    1989-04-01

    High power, high efficiency FELs require tapering, as the particles lose energy, so as to maintain resonance between the electromagnetic wave and the particles. They also require focusing of the particles (usually done with curved pole faces) and focusing of the electromagnetic wave (i.e., optical guiding). In addition, one must avoid transverse beam instabilities (primarily resistive wall) and longitudinal instabilities (i.e., sidebands). 18 refs., 7 figs., 3 tabs
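
    The resonance condition behind tapering can be made concrete with a short sketch: for a planar wiggler the resonant wavelength is lambda_s = lambda_w (1 + a_w^2/2) / (2 gamma^2), so as particles lose energy the wiggler parameter (or period) must be reduced along the device to stay resonant. The numerical values below are assumptions chosen only to illustrate the scaling.

      # Sketch of FEL tapering: keep the resonance condition satisfied while
      # gamma drops, by reducing the wiggler parameter a_w. Values are invented.
      import math

      lambda_w = 0.03          # wiggler period, m (assumed)
      a_w0 = 1.0               # initial wiggler parameter (assumed)
      gamma0 = 100.0           # initial beam energy in units of m_e c^2 (assumed)

      lambda_s = lambda_w / (2 * gamma0**2) * (1 + a_w0**2 / 2)   # fixed signal wavelength

      def a_w_required(gamma):
          """Wiggler parameter that keeps the beam resonant at lambda_s for energy gamma."""
          val = 2 * (2 * gamma**2 * lambda_s / lambda_w - 1)
          return math.sqrt(val) if val > 0 else 0.0   # 0.0 -> resonance no longer reachable

      for frac in (1.00, 0.95, 0.90, 0.85):           # fraction of initial energy remaining
          g = frac * gamma0
          print(f"gamma = {g:6.1f}  ->  a_w = {a_w_required(g):.3f}")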

  5. High voltage power network construction

    CERN Document Server

    Harker, Keith

    2018-01-01

    This book examines the key requirements, considerations, complexities and constraints relevant to the task of high voltage power network construction, from design, finance, contracts and project management to installation and commissioning, with the aim of providing an overview of the holistic end to end construction task in a single volume.

  6. High Temperature, High Power Piezoelectric Composite Transducers

    Science.gov (United States)

    Lee, Hyeong Jae; Zhang, Shujun; Bar-Cohen, Yoseph; Sherrit, Stewart

    2014-01-01

    Piezoelectric composites are a class of functional materials consisting of piezoelectric active materials and non-piezoelectric passive polymers, mechanically attached together to form different connectivities. These composites have several advantages compared to conventional piezoelectric ceramics and polymers, including improved electromechanical properties, mechanical flexibility and the ability to tailor properties by using several different connectivity patterns. These advantages have led to the improvement of overall transducer performance, such as transducer sensitivity and bandwidth, resulting in rapid implementation of piezoelectric composites in medical imaging ultrasounds and other acoustic transducers. Recently, new piezoelectric composite transducers have been developed with optimized composite components that have improved thermal stability and mechanical quality factors, making them promising candidates for high temperature, high power transducer applications, such as therapeutic ultrasound, high power ultrasonic wirebonding, high temperature non-destructive testing, and downhole energy harvesting. This paper will present recent developments of piezoelectric composite technology for high temperature and high power applications. The concerns and limitations of using piezoelectric composites will also be discussed, and the expected future research directions will be outlined. PMID:25111242

  7. High Power Electron Accelerator Prototype

    CERN Document Server

    Tkachenko, Vadim; Cheskidov, Vladimir; Korobeynikov, G I; Kuznetsov, Gennady I; Lukin, A N; Makarov, Ivan; Ostreiko, Gennady; Panfilov, Alexander; Sidorov, Alexey; Tarnetsky, Vladimir V; Tiunov, Michael A

    2005-01-01

    In recent times, new powerful industrial electron accelerators have appeared on the market. This has increased interest in radiation technologies using high energy X-rays due to their high penetration ability. However, because of the low efficiency of X-ray conversion for electrons with energy below 5 MeV, the intensity of X-rays required for some industrial applications can be achieved only when the beam power exceeds 300 kW. The report describes a project of the industrial electron accelerator ILU-12 for electron energy up to 5 MeV and beam power up to 300 kW, specially designed for use in industrial applications. In the first stage of the work we plan to use the existing generator designed for the ILU-8 accelerator. It is based on the GI-50A triode and provides pulse power of up to 1.5-2 MW and average power of up to 20-30 kW. The report presents the basic concepts of the project and its current status.

  8. Associative Memory Computing Power and Its Simulation

    CERN Document Server

    Volpi, G; The ATLAS collaboration

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASICs chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a data base of 130000 pre-calculated patterns and large numbers of chips can be easily assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 micro seconds. The simulation of such a parallelized system is an extremely complex task if executed in commercial computers based on normal CPUs. The algorithm performance is limited, due to the lack of parallelism, and in addition the memory requirement is very large. In fact the AM chip uses a content addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...
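
    The content-addressable-memory idea described above can be emulated in a few lines: a query is broadcast to every stored pattern and all comparisons conceptually happen at once, so retrieval time in hardware does not depend on the bank size. The tiny pattern bank, the four-layer event and the don't-care convention below are invented for illustration and only loosely mimic FTK-style pattern matching.

      # Toy emulation of a CAM pattern bank: every stored pattern is compared
      # against the query; in hardware all comparisons fire simultaneously,
      # here a comprehension stands in for that parallel compare.
      DONT_CARE = None

      # pattern bank: tuples of coarse hit ids, one entry per detector layer (toy data)
      bank = [
          (3, 7, DONT_CARE, 12),
          (3, 8, 10, 12),
          (4, 7, 11, DONT_CARE),
          (5, 9, 11, 13),
      ]

      def matches(pattern, event):
          return all(p == e or p is DONT_CARE for p, e in zip(pattern, event))

      event = (3, 7, 9, 12)   # hits seen in the four layers for this event
      fired = [i for i, pat in enumerate(bank) if matches(pat, event)]
      print(f"patterns fired: {fired}")   # -> [0]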

  9. Associative Memory computing power and its simulation

    CERN Document Server

    Ancu, L S; The ATLAS collaboration; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASICs chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a data base of 130000 pre-calculated patterns and large numbers of chips can be easily assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 micro seconds. The simulation of such a parallelized system is an extremely complex task if executed in commercial computers based on normal CPUs. The algorithm performance is limited, due to the lack of parallelism, and in addition the memory requirement is very large. In fact the AM chip uses a content addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...

  10. Power-Efficient Computing: Experiences from the COSA Project

    Directory of Open Access Journals (Sweden)

    Daniele Cesini

    2017-01-01

    Full Text Available Energy consumption is today one of the most relevant issues in operating HPC systems for scientific applications. The use of unconventional computing systems is therefore of great interest for several scientific communities looking for a better tradeoff between time-to-solution and energy-to-solution. In this context, the performance assessment of processors with a high ratio of performance per watt is necessary to understand how to realize energy-efficient computing systems for scientific applications, using this class of processors. Computing On SOC Architecture (COSA) is a three-year project (2015–2017) funded by the Scientific Commission V of the Italian Institute for Nuclear Physics (INFN), which aims to investigate the performance and the total cost of ownership offered by computing systems based on commodity low-power Systems on Chip (SoCs) and high energy-efficient systems based on GP-GPUs. In this work, we present the results of the project analyzing the performance of several scientific applications on several GPU- and SoC-based systems. We also describe the methodology we have used to measure energy performance and the tools we have implemented to monitor the power drained by applications while running.
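
    The time-to-solution versus energy-to-solution trade-off discussed above boils down to integrating a sampled power trace over a run. The sketch below does this with a trapezoidal rule for two hypothetical systems (a slow low-power SoC and a fast GPU node); all traces and times are invented and do not come from the COSA measurements.

      # Sketch: energy-to-solution from a sampled power trace, compared for two
      # hypothetical systems running the same kernel. Data are invented.
      def energy_to_solution(times_s, power_w):
          """Trapezoidal integral of a sampled power trace -> energy in joules."""
          return sum(0.5 * (power_w[i] + power_w[i + 1]) * (times_s[i + 1] - times_s[i])
                     for i in range(len(times_s) - 1))

      # (timestamp, power) samples for the same scientific kernel on two systems
      soc_t,  soc_p  = [0, 60, 120, 180, 240], [9.5, 10.2, 10.0, 10.4, 9.8]     # SoC, slower
      node_t, node_p = [0, 20, 40, 60],        [210.0, 235.0, 240.0, 220.0]     # GPU node, faster

      for name, t, p in (("low-power SoC", soc_t, soc_p), ("GPU node", node_t, node_p)):
          e = energy_to_solution(t, p)
          print(f"{name:14s} time-to-solution {t[-1]:4d} s  energy-to-solution {e / 1000:.1f} kJ")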

  11. High speed computer assisted tomography

    International Nuclear Information System (INIS)

    Maydan, D.; Shepp, L.A.

    1980-01-01

    X-ray generation and detection apparatus for use in a computer assisted tomography system which permits relatively high speed scanning. A large x-ray tube having a circular anode (3) surrounds the patient area. A movable electron gun (8) orbits adjacent to the anode. The anode directs into the patient area x-rays which are delimited into a fan beam by a pair of collimating rings (21). After passing through the patient, x-rays are detected by an array (22) of movable detectors. Detector subarrays (23) are synchronously movable out of the x-ray plane to permit the passage of the fan beam

  12. Computer program analyzes and monitors electrical power systems (POSIMO)

    Science.gov (United States)

    Jaeger, K.

    1972-01-01

    Requirements to monitor and/or simulate electric power distribution, power balance, and charge budget are discussed. Computer program to analyze power system and generate set of characteristic power system data is described. Application to status indicators to denote different exclusive conditions is presented.
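
    A minimal sketch of the kind of power-balance and charge-budget bookkeeping such a program performs: for each mission segment, compare generated and consumed power and track the battery state of charge. Segment data, battery capacity and charging efficiency are assumptions for illustration, not POSIMO's actual model.

      # Toy power-balance / charge-budget simulation over two orbit segments.
      segments = [                       # (name, duration_h, generated_W, load_W), invented
          ("sunlight", 0.9, 1200.0, 800.0),
          ("eclipse",  0.6, 0.0,    650.0),
      ]
      battery_wh, battery_cap_wh, charge_eff = 500.0, 900.0, 0.9

      for name, hours, gen, load in segments:
          balance = gen - load                       # positive -> charging
          if balance >= 0:
              battery_wh = min(battery_cap_wh, battery_wh + balance * hours * charge_eff)
          else:
              battery_wh += balance * hours          # discharge (negative balance)
          status = "DEFICIT" if battery_wh < 0 else "ok"
          print(f"{name:9s} balance {balance:+7.1f} W  battery {battery_wh:6.1f} Wh  {status}")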

  13. High-Degree Neurons Feed Cortical Computations.

    Directory of Open Access Journals (Sweden)

    Nicholas M Timme

    2016-05-01

    Full Text Available Recent work has shown that functional connectivity among cortical neurons is highly varied, with a small percentage of neurons having many more connections than others. Also, recent theoretical developments now make it possible to quantify how neurons modify information from the connections they receive. Therefore, it is now possible to investigate how information modification, or computation, depends on the number of connections a neuron receives (in-degree) or sends out (out-degree). To do this, we recorded the simultaneous spiking activity of hundreds of neurons in cortico-hippocampal slice cultures using a high-density 512-electrode array. This preparation and recording method combination produced large numbers of neurons recorded at temporal and spatial resolutions that are not currently available in any in vivo recording system. We utilized transfer entropy (a well-established method for detecting linear and nonlinear interactions in time series) and the partial information decomposition (a powerful, recently developed tool for dissecting multivariate information processing into distinct parts) to quantify computation between neurons where information flows converged. We found that computations did not occur equally in all neurons throughout the networks. Surprisingly, neurons that computed large amounts of information tended to receive connections from high out-degree neurons. However, the in-degree of a neuron was not related to the amount of information it computed. To gain insight into these findings, we developed a simple feedforward network model. We found that a degree-modified Hebbian wiring rule best reproduced the pattern of computation and degree correlation results seen in the real data. Interestingly, this rule also maximized signal propagation in the presence of network-wide correlations, suggesting a mechanism by which cortex could deal with common random background input. These are the first results to show that the extent to
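
    For readers unfamiliar with transfer entropy, the toy Python estimator below computes it for binary spike trains with one-step histories; it is a simple plug-in estimator written for clarity, not the authors' analysis pipeline, and the synthetic spike trains are assumptions.

      # Toy plug-in transfer-entropy estimator for 0/1 spike trains, history length 1.
      import numpy as np
      from collections import Counter

      def transfer_entropy(source, target):
          """TE(source -> target) in bits."""
          src, tgt = np.asarray(source), np.asarray(target)
          trips = list(zip(tgt[1:], tgt[:-1], src[:-1]))   # (y_t, y_{t-1}, x_{t-1})
          n = len(trips)
          p_xyz = Counter(trips)
          p_yz = Counter((y1, x) for _, y1, x in trips)
          p_yy = Counter((y, y1) for y, y1, _ in trips)
          p_y1 = Counter(y1 for _, y1, _ in trips)
          te = 0.0
          for (y, y1, x), c in p_xyz.items():
              p_joint = c / n
              p_cond_full = c / p_yz[(y1, x)]               # p(y_t | y_{t-1}, x_{t-1})
              p_cond_past = p_yy[(y, y1)] / p_y1[y1]        # p(y_t | y_{t-1})
              te += p_joint * np.log2(p_cond_full / p_cond_past)
          return te

      rng = np.random.default_rng(1)
      x = rng.integers(0, 2, 10_000)
      y = np.roll(x, 1)                 # y copies x with a one-step delay
      y[0] = 0
      print(transfer_entropy(x, y))     # close to 1 bit
      print(transfer_entropy(y, x))     # close to 0 bits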

  14. DOE research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-12-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models whose execution is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex; consequently, it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  15. Computer simulation at high pressure

    International Nuclear Information System (INIS)

    Alder, B.J.

    1977-11-01

    The use of either the Monte Carlo or molecular dynamics method to generate equations-of-state data for various materials at high pressure is discussed. Particular emphasis is given to phase diagrams, such as the generation of various types of critical lines for mixtures, melting, structural and electronic transitions in solids, two-phase ionic fluid systems of astrophysical interest, as well as a brief aside on possible eutectic behavior in the interior of the earth. Then the application of the molecular dynamics method to predict transport coefficients and the neutron scattering function is discussed with a view as to what special features high pressure brings out. Lastly, an analysis by these computational methods of the measured intensity and frequency spectrum of depolarized light, and also of the deviation of the dielectric measurements from the constancy of the Clausius-Mossotti function, is given that leads to predictions of how the electronic structure of an atom distorts with pressure

  16. Fundamentals of power integrity for computer platforms and systems

    CERN Document Server

    DiBene, Joseph T

    2014-01-01

    An all-encompassing text that focuses on the fundamentals of power integrity. Power integrity is the study of power distribution from the source to the load and the system level issues that can occur across it. For computer systems, these issues can range from inside the silicon to across the board and may egress into other parts of the platform, including thermal, EMI, and mechanical. With a focus on computer systems and silicon level power delivery, this book sheds light on the fundamentals of power integrity, utilizing the author's extensive background in the power integrity industry and un

  17. GRID : unlimited computing power on your desktop Conference MT17

    CERN Multimedia

    2001-01-01

    The Computational GRID is an analogy to the electrical power grid for computing resources. It decouples the provision of computing, data, and networking from its use, and allows large-scale pooling and sharing of resources distributed world-wide. Every computer, from a desktop to a mainframe or supercomputer, can provide computing power or data for the GRID. The final objective is to plug your computer into the wall and have direct access to huge computing resources immediately, just like plugging in a lamp to get instant light. The GRID will facilitate world-wide scientific collaborations on an unprecedented scale. It will provide transparent access to major distributed resources of computer power, data, information, and collaborations.

  18. High Performance Spaceflight Computing (HPSC)

    Data.gov (United States)

    National Aeronautics and Space Administration — Space-based computing has not kept up with the needs of current and future NASA missions. We are developing a next-generation flight computing system that addresses...

  19. High energy physics and grid computing

    International Nuclear Information System (INIS)

    Yu Chuansong

    2004-01-01

    The status of the new generation computing environment of the high energy physics experiments is introduced briefly in this paper. The development of the high energy physics experiments and the new computing requirements of the experiments are presented. The blueprint of the new generation computing environment of the LHC experiments, the history of Grid computing, the R&D status of high energy physics grid computing technology, and the network bandwidth needed by the high energy physics grid and its development are described. The grid computing research in the Chinese high energy physics community is introduced at last. (authors)

  20. Energy efficiency of computer power supply units - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Aebischer, B. [cepe - Centre for Energy Policy and Economics, Swiss Federal Institute of Technology Zuerich, Zuerich (Switzerland); Huser, H. [Encontrol GmbH, Niederrohrdorf (Switzerland)

    2002-11-15

    This final report for the Swiss Federal Office of Energy (SFOE) takes a look at the efficiency of computer power supply units, which decreases rapidly during average computer use. The background and the purpose of the project are examined. The power supplies for personal computers are discussed and the testing arrangement used is described. Efficiency, power-factor and operating points of the units are examined. Potentials for improvement and measures to be taken are discussed. Also, action to be taken by those involved in the design and operation of such power units is proposed. Finally, recommendations for further work are made.

  1. IBM Cloud Computing Powering a Smarter Planet

    Science.gov (United States)

    Zhu, Jinzy; Fang, Xing; Guo, Zhe; Niu, Meng Hua; Cao, Fan; Yue, Shuang; Liu, Qin Yu

    With the increasing need for intelligent systems supporting the world's businesses, Cloud Computing has emerged as a dominant trend to provide a dynamic infrastructure to make such intelligence possible. The article introduces how to build a smarter planet with cloud computing technology. First, it explains why we need the cloud and traces the evolution of cloud technology. Second, it analyzes the value of cloud computing and how to apply cloud technology. Finally, it predicts the future of cloud in the smarter planet.

  2. Computer system for nuclear power plant parameter display

    International Nuclear Information System (INIS)

    Stritar, A.; Klobuchar, M.

    1990-01-01

    The computer system for efficient, cheap and simple presentation of data on the screen of the personal computer is described. The display is in alphanumerical or graphical form. The system can be used for the man-machine interface in the process monitoring system of the nuclear power plant. It represents the third level of the new process computer system of the Nuclear Power Plant Krsko. (author)

  3. Proceedings of national symposium on computer applications in power plants

    International Nuclear Information System (INIS)

    1992-01-01

    The National Symposium on Computer Applications in Power Plants was organized to help promote exchange of views among scientists and engineers engaged in design, engineering, operation and maintenance of computer based systems in nuclear power plants, conventional power plants, heavy water plants, nuclear fuel cycle facilities and allied industries. About one hundred papers were presented at the Symposium. Those falling within the subject scope of INIS have been processed separately. (author)

  4. Computer-based control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Kalashnikov, V.K.; Shugam, R.A.; Ol'shevsky, Yu.N.

    1975-01-01

    Computer-based control systems of nuclear power plants may be classified into those using computers for data acquisition only, those using computers for data acquisition and data processing, and those using computers for process control. In the present paper a brief review is given of the functions performed by the above-mentioned systems, their applications in different nuclear power plants, and some of their characteristics. The trend towards hierarchic systems using control computers with reserves already becomes clear when considering the control systems applied in the Canadian nuclear power plants, which were among the first to be equipped with process computers. The control system now under development for the large Soviet reactors of the WWER type will also be based on the use of control computers. That part of the system concerned with controlling the reactor assembly is described in detail

  5. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. This book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  6. Computing and cognition in future power-plant operations

    International Nuclear Information System (INIS)

    Kisner, R.A.; Sheridan, T.B.

    1983-01-01

    The intent of this paper is to speculate on the nature of future interactions between people and computers in the operation of power plants. In particular, the authors offer a taxonomy for examining the differing functions of operators in interacting with the plant and its computers, and the differing functions of the computers in interacting with the plant and its operators

  8. High power microwave source development

    Science.gov (United States)

    Benford, James N.; Miller, Gabriel; Potter, Seth; Ashby, Steve; Smith, Richard R.

    1995-05-01

    The requirements of this project have been to: (1) improve and expand the sources available in the facility for testing purposes and (2) perform specific tasks under the direction of the Defense Nuclear Agency concerning the applications of high power microwaves (HPM). In this project the HPM application was power beaming. The requirements of this program were met in the following way: (1) We demonstrated that a compact linear induction accelerator can drive HPM sources at repetition rates in excess of 100 Hz at peak microwave powers of a gigawatt. This was done for the relativistic magnetron. Since the conclusion of this contract, such specifications have also been demonstrated for the relativistic klystron under Ballistic Missile Defense Organization funding. (2) We demonstrated an L-band relativistic magnetron. This device has been used both on our single pulse machines, CAMEL and CAMEL X, and the repetitive system CLIA. (3) We demonstrated that phase locking of sources together in large numbers is a feasible technology and showed the generation of multigigawatt S-band radiation in an array of relativistic magnetrons.

  9. High performance computing in Windows Azure cloud

    OpenAIRE

    Ambruš, Dejan

    2013-01-01

    High performance, security, availability, scalability, flexibility and lower maintenance costs have contributed essentially to the growing popularity of cloud computing in all spheres of life, especially in business. In fact, cloud computing offers even more than this. With the usage of virtual computing clusters, a runtime environment for high performance computing can also be efficiently implemented in a cloud. There are many advantages but also some disadvantages of cloud computing, some ...

  10. High-power pulsed lasers

    International Nuclear Information System (INIS)

    Holzrichter, J.F.

    1980-01-01

    The ideas that led to the successful construction and operation of large multibeam fusion lasers at the Lawrence Livermore Laboratory are reviewed. These lasers are based on the use of Nd:glass laser materials. However, most of the concepts are applicable to any laser being designed for fusion experimentation. This report is a summary of lectures given by the author at the 20th Scottish University Summer School in Physics, on Laser Plasma Interaction. This report includes basic concepts of the laser plasma system, a discussion of lasers that are useful for short-pulse, high-power operation, laser design constraints, optical diagnostics, and system organization

  11. High energy physics and cloud computing

    International Nuclear Information System (INIS)

    Cheng Yaodong; Liu Baoxu; Sun Gongxing; Chen Gang

    2011-01-01

    High Energy Physics (HEP) has been a strong promoter of computing technology, for example the WWW (World Wide Web) and grid computing. In the new era of cloud computing, HEP still has a strong demand, and major international high energy physics laboratories have launched a number of projects to research cloud computing technologies and applications. This paper describes the current developments in cloud computing and its applications in high energy physics. Some ongoing projects in the institutes of high energy physics, Chinese Academy of Sciences, including cloud storage, virtual computing clusters, and the BESⅢ elastic cloud, are also described briefly in the paper. (authors)

  12. The computer simulation of the resonant network for the B-factory model power supply

    International Nuclear Information System (INIS)

    Zhou, W.; Endo, K.

    1993-07-01

    A high repetition rate model power supply and the resonant magnet network are simulated with a computer in order to check and improve the design of the power supply for the B-factory booster. We focus on the transient behavior of the power supply and the resonant magnet network. The results of the simulation are given. (author)

  13. Abstraction Power in Computer Science Education

    DEFF Research Database (Denmark)

    Bennedsen, Jens Benned; Caspersen, Michael Edelgaard

    2006-01-01

    The paper is a discussion of the hypothesis that a person’s abstraction power (or ability) has a positive influence on their ability to program.

  14. Fault analysis and strategy of high pulsed power supply for high power laser

    International Nuclear Information System (INIS)

    Liu Kefu; Qin Shihong; Li Jin; Pan Yuan; Yao Zonggan; Zheng Wanguo; Guo Liangfu; Zhou Peizhang; Li Yizheng; Chen Dehuai

    2001-01-01

    According to the requirements of driving flash-lamps, a high pulsed power supply (PPS) based on capacitors as energy storage elements is designed. The authors analyze in detail the faults of the high pulsed power supply for a high power laser, such as a capacitor internal short-circuit, a main bus breakdown to ground, or a flashlamp sudden short or break. The fault current and voltage waveforms were given by circuit simulations. Based on the analysis and computation, a protection strategy using a fast fuse and ZnO was put forward, which can reduce damage to the PPS to the lowest extent and protect personnel and collateral property from all threats. Preliminary experiments demonstrated that the design of the PPS can satisfy the project requirements

  15. Parallel Computing: Some Activities in High Energy Physics

    Science.gov (United States)

    Willers, Ian

    This paper examines some activities in High Energy Physics that utilise parallel computing. The topic includes all computing from the proposed SIMD front end detectors, the farming applications, high-powered RISC processors and the large machines in the computer centers. We start by looking at the motivation behind using parallelism for general purpose computing. The developments around farming are then described, from its simplest form to the more complex system at Fermilab. Finally, there is a list of some developments that are happening close to the experiments.
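
    The farming pattern mentioned above reduces to a master handing independent events to a pool of workers and collecting the results; the Python sketch below illustrates the idea, with a placeholder event payload and worker function that stand in for any experiment's real reconstruction code.

      # Minimal task farm: independent "events" are distributed to worker processes.
      from multiprocessing import Pool
      import math

      def process_event(event_id):
          """Stand-in for per-event processing: any CPU-bound, independent work."""
          return event_id, sum(math.sin(i) for i in range(10_000))

      if __name__ == "__main__":
          events = range(64)
          with Pool(processes=4) as farm:            # 4 worker "nodes"
              for event_id, result in farm.imap_unordered(process_event, events):
                  print(f"event {event_id:3d} -> {result:+.4f}")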

  16. Computational Power of Symmetry-Protected Topological Phases.

    Science.gov (United States)

    Stephen, David T; Wang, Dong-Sheng; Prakash, Abhishodh; Wei, Tzu-Chieh; Raussendorf, Robert

    2017-07-07

    We consider ground states of quantum spin chains with symmetry-protected topological (SPT) order as resources for measurement-based quantum computation (MBQC). We show that, for a wide range of SPT phases, the computational power of ground states is uniform throughout each phase. This computational power, defined as the Lie group of executable gates in MBQC, is determined by the same algebraic information that labels the SPT phase itself. We prove that these Lie groups always contain a full set of single-qubit gates, thereby affirming the long-standing conjecture that general SPT phases can serve as computationally useful phases of matter.

  17. High-Performance Computing Paradigm and Infrastructure

    CERN Document Server

    Yang, Laurence T

    2006-01-01

    With hyperthreading in Intel processors, hypertransport links in next generation AMD processors, multi-core silicon in today's high-end microprocessors from IBM and emerging grid computing, parallel and distributed computers have moved into the mainstream

  18. Micromagnetics on high-performance workstation and mobile computational platforms

    Science.gov (United States)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing units, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite element method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.

  19. GPU-based high-performance computing for radiation therapy

    International Nuclear Information System (INIS)

    Jia, Xun; Jiang, Steve B; Ziegenhein, Peter

    2014-01-01

    Recent developments in radiation therapy demand high computational power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous amount of study has been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented. (topical review)
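
    The offload pattern underlying most of the surveyed work can be sketched in a few lines of Python: the same array arithmetic runs on a GPU through CuPy when one is available and falls back to NumPy on the CPU otherwise. The 'dose' computation here is a toy exponential attenuation chosen only to exercise a voxel grid, not a clinical dose engine.

      # Toy voxel computation that runs on GPU (CuPy) if present, else CPU (NumPy).
      import numpy as np

      try:
          import cupy as xp          # GPU backend, if CUDA and CuPy are installed
          GPU = True
      except ImportError:
          xp = np                    # CPU fallback
          GPU = False

      def toy_dose(shape=(256, 256, 256), mu=0.02):
          """Deposit a depth-attenuated 'dose' on a voxel grid (illustrative only)."""
          z = xp.arange(shape[2], dtype=xp.float32)
          depth_term = xp.exp(-mu * z)                   # attenuation with depth
          grid = xp.ones(shape, dtype=xp.float32) * depth_term
          return float(grid.sum())   # works for NumPy and CuPy results alike

      print(f"backend = {'GPU (CuPy)' if GPU else 'CPU (NumPy)'}, total = {toy_dose():.3e}")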

  20. High-average-power solid state lasers

    International Nuclear Information System (INIS)

    Summers, M.A.

    1989-01-01

    In 1987, a broad-based, aggressive R&D program was initiated, aimed at developing the technologies necessary to make possible the use of solid state lasers that are capable of delivering medium- to high-average power in new and demanding applications. Efforts were focused along the following major lines: development of laser and nonlinear optical materials, and of coatings for parasitic suppression and evanescent wave control; development of computational design tools; verification of computational models on thoroughly instrumented test beds; and applications of selected aspects of this technology to specific missions. In the laser materials area, efforts were directed towards producing strong, low-loss laser glasses and large, high quality garnet crystals. The crystal program consisted of computational and experimental efforts aimed at understanding the physics, thermodynamics, and chemistry of large garnet crystal growth. The laser experimental efforts were directed at understanding thermally induced wave front aberrations in zig-zag slabs, understanding fluid mechanics, heat transfer, and optical interactions in gas-cooled slabs, and conducting critical test-bed experiments with various electro-optic switch geometries. 113 refs., 99 figs., 18 tabs

  1. Optics assembly for high power laser tools

    Science.gov (United States)

    Fraze, Jason D.; Faircloth, Brian O.; Zediker, Mark S.

    2016-06-07

    There is provided a high power laser rotational optical assembly for use with, or in, high power laser tools for performing high power laser operations. In particular, the optical assembly finds applications in performing high power laser operations on, and in, remote and difficult to access locations. The optical assembly has rotational seals and bearing configurations to avoid contamination of the laser beam path and optics.

  2. Turning a $10 Computer into a Powerful DIY Data Logger

    Science.gov (United States)

    Schilperoort, B.

    2017-12-01

    Due to the rapid advance of consumer electronics, much more powerful and cheaper options are available for DIY projects. The $10 'Raspberry Pi Zero W' computer, with capabilities like WiFi, Bluetooth, HDMI video output, and a large, cheap memory, can be used for data logging purposes. The computer has a range of input and output pins on the board, with which virtually every type of digital sensor communication is possible. With an extra component, analog measurements can also be made. An extra option is the addition of a camera, which can be connected straight to the board. However, due to the relatively high power consumption (0.5 - 0.7 Watt), the 'Zero W' is not optimal for off-the-grid locations. For ease of use, the collected data can be downloaded over a local WiFi network using your smartphone or a laptop. No extra software or skills are needed; it is as simple as visiting a webpage and pressing download, making data collection a quick and easy task. With simple step-by-step instructions you can set up your own data logger, to collect data from sensors ranging from simple temperature and water level measurements, to sonic anemometers.
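
    A minimal logging loop of the kind described might look like the Python sketch below; the read_temperature() function, file name and sampling interval are placeholders to be replaced by the driver call for whatever sensor is actually attached to the Pi's pins.

      # Minimal CSV data logger; swap read_temperature() for a real sensor driver.
      import csv
      import random
      import time
      from datetime import datetime

      def read_temperature():
          """Placeholder sensor read; replace with your sensor library call."""
          return 20.0 + random.uniform(-0.5, 0.5)

      LOGFILE = "templog.csv"        # later served over the local WiFi network
      INTERVAL_S = 60                # one sample per minute

      with open(LOGFILE, "a", newline="") as f:
          writer = csv.writer(f)
          while True:
              writer.writerow([datetime.now().isoformat(timespec="seconds"),
                               f"{read_temperature():.2f}"])
              f.flush()              # keep data on disk between samples
              time.sleep(INTERVAL_S)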

  3. Process computers automate CERN power supply installations

    CERN Document Server

    Ullrich, H

    1974-01-01

    Computerized automation systems are being used at CERN, Geneva, to improve the capacity, operational reliability and flexibility of the power supply installations for main ring magnets in the experimental zones of particle accelerators. A detailed account of the technological problem involved is followed in the article by a description of the system configuration, the program system and field experience already gathered in similar schemes. (1 refs).

  4. Saving Energy and Money: A Lesson in Computer Power Management

    Science.gov (United States)

    Lazaros, Edward J.; Hua, David

    2012-01-01

    In this activity, students will develop an understanding of the economic impact of technology by estimating the cost savings of power management strategies in the classroom. Students will learn how to adjust computer display settings to influence the impact that the computer has on the financial burden to the school. They will use mathematics to…

  5. High accuracy ion optics computing

    International Nuclear Information System (INIS)

    Amos, R.J.; Evans, G.A.; Smith, R.

    1986-01-01

    Computer simulation of focused ion beams for surface analysis of materials by SIMS, or for microfabrication by ion beam lithography plays an important role in the design of low energy ion beam transport and optical systems. Many computer packages currently available, are limited in their applications, being inaccurate or inappropriate for a number of practical purposes. This work describes an efficient and accurate computer programme which has been developed and tested for use on medium sized machines. The programme is written in Algol 68 and models the behaviour of a beam of charged particles through an electrostatic system. A variable grid finite difference method is used with a unique data structure, to calculate the electric potential in an axially symmetric region, for arbitrary shaped boundaries. Emphasis has been placed upon finding an economic method of solving the resulting set of sparse linear equations in the calculation of the electric field and several of these are described. Applications include individual ion lenses, extraction optics for ions in surface analytical instruments and the design of columns for ion beam lithography. Computational results have been compared with analytical calculations and with some data obtained from individual einzel lenses. (author)

  6. High power factor hybrid rectifier

    African Journals Online (AJOL)

    eobe

    increase in the number of electrical loads that some kind of ... components in the AC power system. Thus, suppl ... al output power; assuring reliability in ... distribution systems. This can be ...... Thesis - California Institute of Technology, Capitulo.

  7. Computational Error Estimate for the Power Series Solution of Odes ...

    African Journals Online (AJOL)

    This paper compares the error estimation of the power series solution with the recursive Tau method for solving ordinary differential equations. From the computational viewpoint, the power series using the zeros of a Chebyshev polynomial is effective, accurate and easy to use. Keywords: Lanczos Tau method, Chebyshev polynomial, ...
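
    As a concrete illustration of the approach being compared, the Python sketch below builds the truncated power-series solution of y' = y, y(0) = 1 by coefficient recursion and samples its error at the zeros of a Chebyshev polynomial; the test equation, truncation order and sample points are illustrative choices, not the paper's examples.

      # Truncated power-series solution of y' = y, error sampled at Chebyshev zeros.
      import numpy as np

      N = 12                                         # truncation order
      a = np.empty(N + 1)                            # series coefficients a_k
      a[0] = 1.0                                     # y(0) = 1
      for k in range(N):
          a[k + 1] = a[k] / (k + 1)                  # recursion from y' = y

      # Zeros of the Chebyshev polynomial T_{N+1} on [-1, 1] as sample points.
      k = np.arange(N + 1)
      x = np.cos((2 * k + 1) * np.pi / (2 * (N + 1)))

      series = np.polyval(a[::-1], x)                # evaluate the truncated series
      error = np.abs(series - np.exp(x))             # exact solution is exp(x)
      print(f"max error at Chebyshev zeros (order {N}): {error.max():.2e}")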

  8. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
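
    The symmetrical-components step at the heart of such a program reduces to the Fortescue transform; the Python sketch below (with invented, slightly unbalanced phase voltages, and no claim to reproduce SCAP itself) shows the mapping from phase quantities to zero, positive and negative sequence components.

      # Fortescue transform: phase phasors (a, b, c) -> sequence components (0, 1, 2).
      import numpy as np

      A = np.exp(2j * np.pi / 3)                     # 120-degree rotation operator

      def phase_to_sequence(va, vb, vc):
          """Return (zero, positive, negative) sequence components."""
          T = np.array([[1, 1,    1   ],
                        [1, A,    A**2],
                        [1, A**2, A   ]]) / 3.0
          return T @ np.array([va, vb, vc])

      # A slightly unbalanced set of per-unit phase voltages.
      va = 1.00 * np.exp(1j * 0.0)
      vb = 0.95 * np.exp(-1j * 2 * np.pi / 3)
      vc = 1.05 * np.exp(+1j * 2 * np.pi / 3)

      for name, v in zip(("zero", "positive", "negative"), phase_to_sequence(va, vb, vc)):
          print(f"{name:8s}: |V| = {abs(v):.4f} pu, angle = {np.degrees(np.angle(v)):+7.2f} deg")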

  9. A solar powered wireless computer mouse: industrial design concepts

    NARCIS (Netherlands)

    Reich, N.H.; Veefkind, M.; van Sark, W.G.J.H.M.; Alsema, E.A.; Turkenburg, W.C.; Silvester, S.

    2009-01-01

    A solar powered wireless computer mouse (SPM) was chosen to serve as a case study for the evaluation and optimization of industrial design processes of photovoltaic (PV) powered consumer systems. As the design process requires expert knowledge in various technical fields, we assessed and compared

  10. Computers in experimental nuclear power facilities

    International Nuclear Information System (INIS)

    Jukl, M.

    1982-01-01

    The CIS 3000 information system, used for monitoring the operating modes of large technological equipment, is described. The CIS system consists of two ADT computers, an external drum store, an analog input side, a bivalent input side, 4 control consoles with monitors and acoustic signalling, a print-out area with typewriters and punching machines, and linear recorders. Various applications of the installed CIS configuration are described, as is the general-purpose program for processing measured values into a protocol. The program operates in conversational mode. Different processing variants are shown on the display monitor. (M.D.)

  11. Low Power Computing in Distributed Systems

    Science.gov (United States)

    2006-04-01

    It has been adopted in embedded systems such as the Stargate from Crossbow [15] and PASTA. The current consumption of the Stargate board is measured by an Agilent 34401A digital multimeter connected to a PC for data collection. (The remainder of this record is figure residue from the source: floating point operation vs. integer operation; power supply, digital multimeter, Stargate board with XScale processor; library math function vs. ...)

  12. High-level language computer architecture

    CERN Document Server

    Chu, Yaohan

    1975-01-01

    High-Level Language Computer Architecture offers a tutorial on high-level language computer architecture, including von Neumann architecture and syntax-oriented architecture as well as direct and indirect execution architecture. Design concepts of Japanese-language data processing systems are discussed, along with the architecture of stack machines and the SYMBOL computer system. The conceptual design of a direct high-level language processor is also described.Comprised of seven chapters, this book first presents a classification of high-level language computer architecture according to the pr

  13. High power ubitron-klystron

    International Nuclear Information System (INIS)

    Balkcum, A.J.; McDermott, D.B.; Luhmann, N.C. Jr.

    1997-01-01

    A coaxial ubitron is being considered as the rf driver for the Next Linear Collider (NLC). Prior simulation of a traveling-wave ubitron using a self-consistent code found that 200 MW of power and 53 dB of gain could be achieved with 37% efficiency. In a ubitron-klystron, a series of cavities is used to obtain an even tighter electron bunch for higher efficiency. A small-signal theory of the ubitron-klystron shows that gain scales with the square of the cavity separation distance. A linear stability theory has also been developed. Verification of the stability theory has been achieved using the 2-1/2-D PIC code MAGIC and the particle-tracing code. Saturation characteristics of the amplifier will be presented using both MAGIC and a simpler self-consistent slow-timescale code currently under development. The ubitron can also operate as a compact, highly efficient oscillator. Cavities only two wiggler periods in length have yielded up to 40% rf conversion efficiency in simulation. An initial oscillator design for directed energy applications will also be presented

  14. Axial power deviation control strategy and computer simulation for Daya Bay Nuclear Power Station

    International Nuclear Information System (INIS)

    Liao Yehong; Zhou Xiaoling; Xiao Min

    2004-01-01

    Daya Bay Nuclear Power Station has a very tight operation diagram, especially at its right side. Therefore the successful control of axial power deviation for a PWR is crucial to nuclear safety. After analyzing the effects of various core characteristics on the axial power distribution, several axial power deviation control strategies have been proposed to comply with different power-varying operation scenarios. Application and computer simulation of the strategies have shown that our predictions of the axial power deviation evolution are comparable to the measured values, and that our control strategies are effective. Engineering experience shows that the application of our methodology can accurately predict the transient of axial power deviation, and it has therefore become a useful tool for reactor operation and safety control. This paper presents the axial power control characteristics, reactor operation strategy research, computer simulation, and comparison to measurement results in Daya Bay Nuclear Power Station. (author)

  15. Simplified High-Power Inverter

    Science.gov (United States)

    Edwards, D. B.; Rippel, W. E.

    1984-01-01

    Solid-state inverter simplified by use of single gate-turnoff device (GTO) to commutate multiple silicon controlled rectifiers (SCR's). By eliminating conventional commutation circuitry, GTO reduces cost, size and weight. GTO commutation applicable to inverters of greater than 1-kilowatt capacity. Applications include emergency power, load leveling, drives for traction and stationary polyphase motors, and photovoltaic-power conditioning.

  16. Review of Power System Stability with High Wind Power Penetration

    DEFF Research Database (Denmark)

    Hu, Rui; Hu, Weihao; Chen, Zhe

    2015-01-01

    This paper presents an overview of research on power system stability with high wind power penetration, including analyzing methods and improvement approaches. Power system stability issues can be classified diversely according to different considerations. Each classified issue has special analyzing methods and stability improvement approaches. With increasing wind power penetration, system balancing and the reduced inertia may pose a big threat to the stable operation of power systems. To mitigate or eliminate the wind impacts in high wind penetration systems, although the practical and reliable choices currently are strong outside connections or sufficient reserve capacity constructions, many novel theories and approaches have been invented to investigate the stability issues, looking forward to extra-high penetration or totally renewable resource based power systems. These analyzing...

  17. An optimized junctionless GAA MOSFET design based on multi-objective computation for high-performance ultra-low power devices

    International Nuclear Information System (INIS)

    Bendib, T.; Djeffal, F.; Meguellati, M.

    2014-01-01

    An analytical investigation has been proposed to study the subthreshold behavior of the junctionless gate-all-around (JLGAA) MOSFET for nanoscale CMOS analog applications. Based on a 2-D analytical analysis, a new subthreshold swing model for short-channel JLGAA MOSFETs is developed. The analysis has been used to calculate the subthreshold swing and to compare the performance of the investigated design and the conventional GAA MOSFET, where the comparison of device architectures shows that the JLGAA MOSFET exhibits a superior performance with respect to the conventional inversion-mode GAA MOSFET in terms of the fabrication process and the electrical behavior in the subthreshold domain. The analytical models have been validated by 2-D numerical simulations. The proposed analytical models are used to formulate the objective functions. The overall objective function is formulated by means of a weighted sum approach to search for the optimal electrical and dimensional device parameters in order to obtain the better scaling capability and electrical performance of the device for ultra-low power applications. (semiconductor devices)

  18. Transitions in the computational power of thermal states for measurement-based quantum computation

    International Nuclear Information System (INIS)

    Barrett, Sean D.; Bartlett, Stephen D.; Jennings, David; Doherty, Andrew C.; Rudolph, Terry

    2009-01-01

    We show that the usefulness of the thermal state of a specific spin-lattice model for measurement-based quantum computing exhibits a transition between two distinct 'phases' - one in which every state is a universal resource for quantum computation, and another in which any local measurement sequence can be simulated efficiently on a classical computer. Remarkably, this transition in computational power does not coincide with any phase transition, classical or quantum, in the underlying spin-lattice model.

  19. Electronic DC transformer with high power density

    NARCIS (Netherlands)

    Pavlovský, M.

    2006-01-01

    This thesis is concerned with the possibilities of increasing the power density of high-power dc-dc converters with galvanic isolation. Three cornerstones for reaching high power densities are identified as: size reduction of passive components, reduction of losses particularly in active components

  20. High power CW linac in PNC

    International Nuclear Information System (INIS)

    Toyama, S.; Wang, Y.L.; Emoto, T.

    1994-01-01

    Power Reactor and Nuclear Fuel Development Corporation (PNC) is developing a high power electron linac for various applications. The electron beam is accelerated in CW operation to reach a maximum beam current of 100 mA and an energy of 10 MeV. Crucial components such as a high power L-band klystron and high power traveling wave resonant ring (TWRR) accelerator guides were designed and manufactured, and their performance was examined. The design and the results from the recent high power RF tests are described in this paper. (author)

  1. Aeroelastic modelling without the need for excessive computing power

    Energy Technology Data Exchange (ETDEWEB)

    Infield, D. [Loughborough Univ., Centre for Renewable Energy Systems Technology, Dept. of Electronic and Electrical Engineering, Loughborough (United Kingdom)

    1996-09-01

    The aeroelastic model presented here was developed specifically to represent a wind turbine manufactured by Northern Power Systems which features a passive pitch control mechanism. It was considered that this particular turbine, which also has low solidity flexible blades and is free yawing, would provide a stringent test of modelling approaches. It was believed that blade element aerodynamic modelling would not be adequate to properly describe the combination of yawed flow, dynamic inflow and unsteady aerodynamics; consequently a wake modelling approach was adopted. In order to keep computation time limited, a highly simplified, semi-free wake approach (developed in previous work) was used. A similarly simple structural model was adopted, with up to only six degrees of freedom in total. In order to take account of blade (flapwise) flexibility a simple finite element sub-model is used. Good quality data from the turbine has recently been collected and it is hoped to undertake model validation in the near future. (au)

  2. High-performance computing — an overview

    Science.gov (United States)

    Marksteiner, Peter

    1996-08-01

    An overview of high-performance computing (HPC) is given. Different types of computer architectures used in HPC are discussed: vector supercomputers, high-performance RISC processors, various parallel computers like symmetric multiprocessors, workstation clusters, massively parallel processors. Software tools and programming techniques used in HPC are reviewed: vectorizing compilers, optimization and vector tuning, optimization for RISC processors; parallel programming techniques like shared-memory parallelism, message passing and data parallelism; and numerical libraries.

  3. High Power Density Power Electronic Converters for Large Wind Turbines

    DEFF Research Database (Denmark)

    Senturk, Osman Selcuk

    In large wind turbines (in the MW and multi-MW ranges), which are extensively utilized in wind power plants, full-scale medium voltage (MV) multi-level (ML) voltage source converters (VSCs) are nowadays preferably employed for interfacing these wind turbines with electricity grids. For these VSCs, high power density is required due to limited turbine nacelle space. Also, high reliability is required since the maintenance cost of these remotely located wind turbines is quite high and these turbines operate under harsh operating conditions. In order to select a high power density and reliability VSC solution for wind turbines, first, the VSC topology and the switch technology to be employed should be specified such that the highest possible power density and reliability are attained. Then, this qualitative approach should be complemented with the power density and reliability...

  4. Computer network for electric power control systems. Chubu denryoku (kabu) denryoku keito seigyoyo computer network

    Energy Technology Data Exchange (ETDEWEB)

    Tsuneizumi, T. (Chubu Electric Power Co. Inc., Nagoya (Japan)); Shimomura, S.; Miyamura, N. (Fuji Electric Co. Ltd., Tokyo (Japan))

    1992-06-03

    A computer network for electric power control systems was developed that applies the Open Systems Interconnection (OSI) international standard for communications protocols. In structuring the OSI network, the session layer is accessed directly from the operation functions when high-speed, small-capacity information is transmitted. File transfer, access and control, having a function of collectively transferring large-capacity data, is applied when low-speed, large-capacity information is transmitted. A verification test of the real-time computer network (RCN) mounting regulation was conducted according to a verification model using a mini-computer, and results satisfying practical performance requirements were obtained. For the application interface, kernel, health check and two-route transmission functions were provided as connection control functions, as were a transmission verification function and a late-arrival discarding function. For the system mounting pattern, a dualized communication server (CS) structure was adopted. The hardware structure may include a system with the CS function contained in a host computer, or a separate installation system. 5 figs., 6 tabs.

  5. High Power Fiber Laser Test Bed

    Data.gov (United States)

    Federal Laboratory Consortium — This facility, unique within DoD, power-combines numerous cutting-edge fiber-coupled laser diode modules (FCLDM) to integrate pumping of high power rare earth-doped...

  6. Practical computer analysis of switch mode power supplies

    CERN Document Server

    Bennett, Johnny C

    2006-01-01

    When designing switch-mode power supplies (SMPSs), engineers need much more than simple "recipes" for analysis. Such plug-and-go instructions are not at all helpful for simulating larger and more complex circuits and systems. Offering more than merely a "cookbook," Practical Computer Analysis of Switch Mode Power Supplies provides a thorough understanding of the essential requirements for analyzing SMPS performance characteristics. It demonstrates the power of the circuit averaging technique when used with powerful computer circuit simulation programs. The book begins with SMPS fundamentals and the basics of circuit averaging models, reviewing most basic topologies and explaining all of their various modes of operation and control. The author then discusses the general analysis requirements of power supplies and how to develop the general types of SMPS models, demonstrating the use of SPICE for analysis. He examines the basic first-order analyses generally associated with SMPS performance along with more pra...
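
    The circuit-averaging idea the book builds on can be hinted at in a few lines of Python: the switching cell of an ideal buck converter is replaced by its duty-cycle average, leaving smooth state equations that a general ODE solver can integrate. The component values below are arbitrary examples, not taken from the book.

      # Averaged (duty-cycle) model of an ideal buck converter, integrated with SciPy.
      from scipy.integrate import solve_ivp

      VIN, L, C, R, D = 12.0, 100e-6, 220e-6, 5.0, 0.5   # example buck parameters

      def averaged_buck(t, x):
          """State equations for x = [inductor current, capacitor voltage]."""
          i_l, v_c = x
          di_dt = (D * VIN - v_c) / L        # averaged switch-node voltage is D*Vin
          dv_dt = (i_l - v_c / R) / C
          return [di_dt, dv_dt]

      sol = solve_ivp(averaged_buck, (0.0, 20e-3), [0.0, 0.0], max_step=1e-5)
      print(f"steady-state output ~ {sol.y[1, -1]:.2f} V (ideal D*Vin = {D * VIN:.2f} V)")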

  7. High Power Wireless Transfer : For Charging High Power Batteries

    OpenAIRE

    Gill, Himmat

    2017-01-01

    Wireless power transfer (WPT) is developing with the emergence of new technologies that have made it possible to transfer electricity over certain distances without any physical contact, offering significant benefits to modern automation systems, medical applications, consumer electronics, and especially electric vehicle systems. The goal of this study is to provide a brief review of existing compensation topologies for the loosely coupled transformer. The technique used to simulate a co...

  8. Associative Memory computing power and its simulation.

    CERN Document Server

    Volpi, G; The ATLAS collaboration

    2014-01-01

    The associative memory (AM) chip is an ASIC device specifically designed to perform “pattern matching” at very high speed and with parallel access to memory locations. The most extensive use of such a device will be the ATLAS Fast Tracker (FTK) processor, where more than 8000 chips will be installed in 128 VME boards specifically designed for high throughput in order to exploit the chip's features. Each AM chip will store a database of about 130,000 pre-calculated patterns, allowing FTK to use about 1 billion patterns for the whole system, with any data inquiry broadcast to all memory elements simultaneously within the same clock cycle (10 ns); thus data retrieval time is independent of the database size. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS FTK processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 μs. The simulation of such a parallelized system is an extremely complex task when executed in comm...

  9. High Power laser power conditioning system new discharge circuit research

    CERN Document Server

    Li Yi; Peng Han Sheng; Zhou Pei Zhang; Zheng Wan Guo; Guo Lang Fu; Chen Li Hua; Chen De Hui; Lai Gui You; Luan Yong Ping

    2002-01-01

    The new discharge circuit of the power conditioning system for a high power laser is studied. The theoretical model of the main discharge circuit is established. The pre-ionization circuit is studied experimentally. In addition, the explosion energy of the new large xenon lamp is successfully measured. The conclusions have been applied to the 4 x 2 amplifier system

  10. A High Performance COTS Based Computer Architecture

    Science.gov (United States)

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation-hardened components and COTS components is so important that COTS components are very attractive for use in mass- and power-constrained systems. However, using COTS components in space is not straightforward, as one must account for the effects of the space environment on the COTS components' behavior. In the frame of the ESA-funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS-based architecture for high performance processing. The rest of the paper is organized as follows: in the first section we start by recapitulating the interests and constraints of using COTS components for space applications; then we briefly describe existing fault mitigation architectures and present our solution for fault mitigation based on a component called the SmartIO; in the last part of the paper we describe the prototyping activities executed during the HiP CBC project.

  11. Computer science of the high performance; Informatica del alto rendimiento

    Energy Technology Data Exchange (ETDEWEB)

    Moraleda, A.

    2008-07-01

    High performance computing is taking shape as a powerful accelerator of the innovation process, drastically reducing the waiting times for access to results and findings in a growing number of processes and activities as complex and important as medicine, genetics, pharmacology, the environment, natural resources management or the simulation of complex processes in a wide variety of industries. (Author)

  12. Computer proficiency questionnaire: assessing low and high computer proficient seniors.

    Science.gov (United States)

    Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-06-01

    Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults.
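
    For reference, the reliability statistic quoted above (Cronbach's alpha) can be computed from an item-response matrix as in the Python sketch below; the simulated responses are illustrative and unrelated to the actual CPQ data.

      # Cronbach's alpha from an (n_respondents x n_items) score matrix.
      import numpy as np

      def cronbach_alpha(items):
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1.0 - item_vars / total_var)

      rng = np.random.default_rng(0)
      ability = rng.normal(size=(200, 1))                       # latent proficiency
      responses = ability + 0.5 * rng.normal(size=(200, 10))    # 10 correlated items
      print(f"alpha = {cronbach_alpha(responses):.2f}")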

  13. Hot Chips and Hot Interconnects for High End Computing Systems

    Science.gov (United States)

    Saini, Subhash

    2005-01-01

    I will discuss several processors: 1. The Cray proprietary processor used in the Cray X1; 2. The IBM Power 3 and Power 4 used in the IBM SP 3 and IBM SP 4 systems; 3. The Intel Itanium and Xeon, used in the SGI Altix systems and clusters respectively; 4. The IBM System-on-a-Chip used in the IBM BlueGene/L; 5. The HP Alpha EV68 processor used in the DOE ASCI Q cluster; 6. The SPARC64 V processor, which is used in the Fujitsu PRIMEPOWER HPC2500; 7. An NEC proprietary processor, which is used in the NEC SX-6/7; 8. The Power 4+ processor, which is used in the Hitachi SR11000; 9. The NEC proprietary processor used in the Earth Simulator. The IBM POWER5 and Red Storm Computing Systems will also be discussed. The architectures of these processors will first be presented, followed by interconnection networks and a description of high-end computer systems based on these processors and networks. The performance of various hardware/programming model combinations will then be compared, based on the latest NAS Parallel Benchmark results (MPI, OpenMP/HPF, and hybrid MPI + OpenMP). The tutorial will conclude with a discussion of general trends in the field of high performance computing (quantum computing, DNA computing, cellular engineering, and neural networks).

  14. Superconducting high frequency high power resonators

    International Nuclear Information System (INIS)

    Hobbis, C.; Vardiman, R.; Weinman, L.

    1974-01-01

    A niobium superconducting quarter-wave helical resonator has been designed and built. The resonator has been electron-beam welded and electropolished to produce a smooth, flaw-free surface. This has been followed by an anodization to produce a 1000 Å layer of Nb2O5. At the resonant frequency of approximately 15 MHz the unloaded Q was approximately 4.6x10^6 with minimal dielectric support. With the resonator open to the helium bath to provide cooling, and rigidly supported by a teflon cylinder, 350 V of power were transferred at a doubly loaded Q of 3500. The extrapolation of the results to a Q_DL of 1000 meets the power handling criterion of one kilowatt for the intended application. (author)

  15. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
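
    The grouping step can be illustrated with a small Python sketch: threads whose lists of calling-instruction addresses agree collapse into one group, so a thread stuck somewhere unusual stands out as a small group. The addresses and thread IDs below are invented for illustration.

      # Group threads by their (hypothetical) lists of calling-instruction addresses.
      from collections import defaultdict

      thread_stacks = {
          0: (0x4008A0, 0x400C10, 0x401F30),
          1: (0x4008A0, 0x400C10, 0x401F30),
          2: (0x4008A0, 0x400C10, 0x401F30),
          3: (0x4008A0, 0x400C10, 0x403D44),   # the odd one out: a likely defective thread
      }

      groups = defaultdict(list)
      for tid, stack in thread_stacks.items():
          groups[stack].append(tid)

      for stack, tids in sorted(groups.items(), key=lambda kv: len(kv[1]), reverse=True):
          frames = " -> ".join(hex(a) for a in stack)
          print(f"{len(tids):2d} thread(s) at {frames}: {tids}")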

  16. Future trends in power plant process computer techniques

    International Nuclear Information System (INIS)

    Dettloff, K.

    1975-01-01

    The development of new concepts in process computer technique has advanced in great steps. The steps fall into three sections: hardware, software, and the application concept. In hardware, new computers with new peripherals, such as colour layer equipment, have been developed. In software, a decisive step has been made in the 'automation software' sector. Through these components, a step forward has also been made on the question of incorporating the process computer into the structure of the whole power plant control technique. (orig./LH) [de

  17. Federal High End Computing (HEC) Information Portal

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This portal provides information about opportunities to engage in U.S. Federal government high performance computing activities, including supercomputer use,...

  18. Embedded High Performance Scalable Computing Systems

    National Research Council Canada - National Science Library

    Ngo, David

    2003-01-01

    The Embedded High Performance Scalable Computing Systems (EHPSCS) program is a cooperative agreement between Sanders, A Lockheed Martin Company and DARPA that ran for three years, from Apr 1995 - Apr 1998...

  19. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer aided modeling and simulation. It guides through the complex problems, presenting them on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using engineering tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on the graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of the graphical documentation of a power system. In the fourth chapter, the application of software tools in the project management in power systems ...

  20. Dynamic stability calculations for power grids employing a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, K

    1982-06-01

    The aim of dynamic contingency calculations in power systems is to estimate the effects of assumed disturbances, such as loss of generation. Due to the large dimensions of the problem these simulations require considerable computing time and costs, to the effect that they are at present only used at the planning stage but not for routine checks in power control stations. In view of the homogeneity of the problem, where a multitude of identical generator models with different parameters are to be integrated simultaneously, the use of a parallel computer looks very attractive. The results of this study employing a prototype parallel computer (SMS 201) are presented. It consists of up to 128 identical microcomputers bus-connected to a control computer. Each of the modules is programmed to simulate a node of the power grid. Generators with their associated controls are represented by models of 13 states each. Passive nodes are complemented by 'phantom' generators, so that the whole power grid is homogeneous, thus removing the need for load-flow iterations. Programming of the microcomputers is essentially performed in FORTRAN.
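
    As a rough illustration of the per-node parallel integration scheme described above, the sketch below steps a set of toy generator models in parallel and exchanges a coupling term between steps. The 2-state swing-equation model, its parameters, and the coupling function are simplifying assumptions, not the 13-state models or the SMS 201 architecture of the original study.

```python
# Per-node parallel integration sketch: each worker advances one generator
# model a single time step; a coordinator exchanges the coupling terms between
# steps. All numerical values are illustrative assumptions.
import math
from multiprocessing import Pool

H, D, OMEGA_S = 5.0, 1.0, 2 * math.pi * 50  # inertia, damping, synchronous speed
DT = 0.01                                   # integration step, s

def step_node(args):
    """Advance one generator (delta, omega) by one explicit-Euler step."""
    (delta, omega), p_mech, p_elec = args
    d_delta = omega - OMEGA_S
    d_omega = (OMEGA_S / (2 * H)) * (p_mech - p_elec - D * (omega - OMEGA_S))
    return (delta + DT * d_delta, omega + DT * d_omega)

def electrical_power(deltas, k=1.5):
    """Toy network coupling: power exchanged with the 'average' machine."""
    mean = sum(deltas) / len(deltas)
    return [k * math.sin(d - mean) for d in deltas]

if __name__ == "__main__":
    n_nodes = 8
    states = [(0.1 * i, OMEGA_S) for i in range(n_nodes)]
    p_mech = [0.8] * n_nodes
    with Pool(processes=4) as pool:
        for _ in range(100):
            p_elec = electrical_power([s[0] for s in states])
            states = pool.map(step_node, zip(states, p_mech, p_elec))
    print("final rotor angles:", [round(s[0], 3) for s in states])
```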

  1. EMI Evaluation on Wireless Computer Devices in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Jae Ki; Ji, Yeong Hwa; Sung, Chan Ho

    2011-01-01

    Wireless computer devices, for example mice and keyboards, are widely used in various industries. However, I and C (instrumentation and control) equipment in nuclear power plants is very susceptible to EMI (electro-magnetic interference), and there are concerns about EMI-induced transients caused by wireless computer devices, which emit electromagnetic waves for communication. In this paper, industrial practices and nuclear-related international standards are reviewed to identify the requirements for wireless devices. In addition, the EMI intensity of some commercially available wireless devices is measured and evaluated to verify their electromagnetic compatibility. Finally, we suggest an appropriate method of using wireless computer devices in nuclear power plant control rooms to improve the working environment of operators

  2. Practical application of computer graphics in nuclear power plant engineering

    International Nuclear Information System (INIS)

    Machiba, Hiroshi; Kawamura, Hirobumi; Sasaki, Norio

    1992-01-01

    A nuclear power plant is composed of a vast amount of equipment, piping, and so on, and six or seven years are required to complete the design and engineering from the initial planning stage to the time of commercial operation. Furthermore, operating plants must be continually maintained and improved over a long period. Computer graphics were first applied to the composite arrangement design of nuclear power plants in the form of 3-dimensional CAD. Subsequently, as the introduction of CAE has progressed, a huge assortment of information has been accumulated in databases, and measures have been sought that would permit the convenient utilization of this information. Using computer graphics technologies, the interface between the user and such databases has recently been improved. In response to the growth in environmental consciousness, photo-realistic simulations for the artistic design of interiors and overviews showing harmony with the surroundings have been achieved through the application of computer graphics. (author)

  3. High Powered Rocketry: Design, Construction, and Launching Experience and Analysis

    Science.gov (United States)

    Paulson, Pryce; Curtis, Jarret; Bartel, Evan; Cyr, Waycen Owens; Lamsal, Chiranjivi

    2018-01-01

    In this study, the nuts and bolts of designing and building a high powered rocket have been presented. A computer simulation program called RockSim was used to design the rocket. Simulation results are consistent with time variations of altitude, velocity, and acceleration obtained in the actual flight. The actual drag coefficient was determined…

  4. Cyber Security on Nuclear Power Plant's Computer Systems

    International Nuclear Information System (INIS)

    Shin, Ick Hyun

    2010-01-01

    Computer systems are used in many different fields of industry, and most of us take great advantage of them. Because of the effectiveness and performance of computer systems, we have become highly dependent on them; but the more dependent we are, the greater the risk we face when a computer system becomes unavailable, inaccessible or uncontrollable. SCADA (Supervisory Control and Data Acquisition) systems are broadly used for critical infrastructure such as transportation, electricity and water management, and if a SCADA system is vulnerable to cyber attack, the result can be a national disaster. Especially if a nuclear power plant's main control systems are attacked by cyber terrorists, the consequences may be severe: the release of radioactive material could be the terrorists' main purpose without the use of physical force. In this paper, different types of cyber attacks are described, and a possible structure of an NPP's computer network system is presented. The paper also discusses possible ways in which an NPP's computer system could be attacked, along with some suggestions for protection against cyber attacks

  5. Powering the High-Luminosity Triplets

    Science.gov (United States)

    Ballarino, A.; Burnet, J. P.

    The powering of the magnets in the LHC High-Luminosity Triplets requires production and transfer of more than 150 kA of DC current. High precision power converters will be adopted, and novel High Temperature Superconducting (HTS) current leads and MgB2 based transfer lines will provide the electrical link between the power converters and the magnets. This chapter gives an overview of the systems conceived in the framework of the LHC High-Luminosity upgrade for feeding the superconducting magnet circuits. The focus is on requirements, challenges and novel developments.

  6. A new computational method for reactive power market clearing

    International Nuclear Information System (INIS)

    Zhang, T.; Elkasrawy, A.; Venkatesh, B.

    2009-01-01

    After the deregulation of electricity markets, ancillary services such as reactive power supply are priced separately. However, unlike real power supply, procedures for costing and pricing reactive power supply are still evolving, and spot markets for reactive power do not yet exist. Further, traditional formulations proposed for clearing reactive power markets use a non-linear mixed integer programming formulation that is difficult to solve. This paper proposes a new reactive power supply market clearing scheme. The novelty of this formulation lies in the pricing scheme, which rewards transformers for tap shifting while participating in this market. The proposed model is a non-linear mixed integer programming problem. A significant portion of the manuscript is devoted to the development of a new successive mixed integer linear programming (MILP) technique to solve this formulation. The successive MILP method is computationally robust and fast. The IEEE 6-bus and 300-bus systems are used to test the proposed method. These tests serve to demonstrate the computational speed and rigor of the proposed method. (author)
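
    The following sketch illustrates the general shape of a successive MILP scheme: a non-linear constraint is re-linearized around the previous solution and a small mixed-integer program is re-solved until the awarded quantity stops changing. The toy voltage model, prices, tap range, and the use of the PuLP solver are assumptions for illustration; this is not the paper's reactive power market formulation.

```python
# Successive-MILP skeleton on a toy single-bus problem: buy reactive power q
# and choose an integer transformer tap t to lift the bus voltage above V_MIN.
# The logarithmic voltage response is an invented stand-in for the non-linear
# power-flow relationship.
import math
import pulp

C_Q, C_TAP = 1.0, 0.1                 # $/MVAr offered, $/tap step (toy prices)
V_MIN, V_BASE, K_TAP, K_Q = 1.0, 0.95, 0.004, 0.02

def solve_linearized(q_k):
    prob = pulp.LpProblem("reactive_clearing", pulp.LpMinimize)
    q = pulp.LpVariable("q", lowBound=0, upBound=50)
    t = pulp.LpVariable("tap", lowBound=-5, upBound=5, cat="Integer")
    t_pos = pulp.LpVariable("tap_up", lowBound=0)
    t_neg = pulp.LpVariable("tap_down", lowBound=0)
    prob += C_Q * q + C_TAP * (t_pos + t_neg)          # procurement cost
    prob += t == t_pos - t_neg
    # first-order expansion of K_Q*log(1+q) around the previous solution q_k
    slope = K_Q / (1 + q_k)
    lin = K_Q * math.log(1 + q_k) + slope * (q - q_k)
    prob += V_BASE + K_TAP * t + lin >= V_MIN
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return q.value(), t.value()

q_prev = 0.0
for it in range(20):
    q_new, tap = solve_linearized(q_prev)
    if abs(q_new - q_prev) < 1e-4:     # award has stopped changing
        break
    q_prev = q_new
print(f"converged after {it + 1} iterations: q = {q_new:.3f} MVAr, tap = {tap:.0f}")
```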

  7. High current and high power superconducting rectifiers

    International Nuclear Information System (INIS)

    Kate, H.H.J. ten; Bunk, P.B.; Klundert, L.J.M. van de; Britton, R.B.

    1981-01-01

    Results on three experimental superconducting rectifiers are reported. Two of them are 1 kA low frequency flux pumps, one thermally and magnetically switched. The third is a low-current high-frequency magnetically switched rectifier which can use the mains directly. (author)

  8. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power in the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the
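
    As a concrete illustration of the FLOPS metric discussed above, the snippet below estimates a machine's floating-point throughput from a dense matrix multiplication. The matrix size and the conventional 2*n^3 operation count are assumptions; the result reflects the installed BLAS library and core count rather than a formal benchmark such as LINPACK.

```python
# Rough FLOPS estimate via dense matrix multiplication (illustrative, not a
# formal benchmark). Results depend on the BLAS build and number of cores.
import time
import numpy as np

n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b                      # dense GEMM, typically multi-threaded BLAS
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed     # conventional multiply-add count for GEMM
print(f"{flops / 1e9:.1f} GFLOP/s in {elapsed:.3f} s (checksum {c[0, 0]:.3f})")
```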

  9. Chaos in high-power high-frequency gyrotrons

    International Nuclear Information System (INIS)

    Airila, M.

    2004-01-01

    Gyrotron interaction is a complex nonlinear dynamical process, which may turn chaotic in certain circumstances. The emergence of chaos renders dynamical systems unpredictable and causes bandwidth broadening of signals. Such effects would jeopardize the prospect of advanced gyrotrons in fusion. Therefore, it is important to be aware of the possibility of chaos in gyrotrons. There are three different chaos scenarios closely related to the development of high-power gyrotrons: First, the onset of chaos in electron trajectories would lead to difficulties in the design and efficient operation of depressed potential collectors, which are used for efficiency enhancement. Second, the radio-frequency signal could turn chaotic, decreasing the output power and the spectral purity of the output signal. As a result, mode conversion, transmission, and absorption efficiencies would be reduced. Third, spatio-temporal chaos in the resonator field structure can set a limit for the use of large-diameter interaction cavities and high-order TE modes (large azimuthal index) allowing higher generated power. In this thesis, the issues above are addressed with numerical modeling. It is found that chaos in electron residual energies is practically absent in the parameter region corresponding to high efficiency. Accordingly, depressed collectors are a feasible solution also in advanced high-power gyrotrons. A new method is presented for straightforward numerical solution of the one-dimensional self-consistent time-dependent gyrotron equations, and the method is generalized to two dimensions. In 1D, a chart of gyrotron oscillations is calculated. It is shown that the regions of stationary oscillations, automodulation, and chaos have a complicated topology in the plane of generalized gyrotron variables. The threshold current for chaotic oscillations exceeds typical operating currents by a factor of ten. However, reflection of the output signal may significantly lower the threshold. 2D

  10. Design and Characterization of High Power Targets for RIB Generation

    International Nuclear Information System (INIS)

    Zhang, Y.

    2001-01-01

    In this article, thermal modeling techniques are used to simulate ISOL targets irradiated with high power proton beams. Beam scattering effects, nuclear reactions and beam power deposition distributions in the target were computed with the Monte Carlo simulation code, GEANT4. The power density information was subsequently used as input to the finite element thermal analysis code, ANSYS, for extracting temperature distribution information for a variety of target materials. The principal objective of the studies was to evaluate techniques for more uniformly distributing beam-deposited heat over the volumes of targets to levels compatible with their irradiation at the highest practical primary-beam power, and to use the preferred technique to design high power ISOL targets. The results suggest that radiation cooling, in combination with primary beam manipulation, can be used to control temperatures in practically sized targets to levels commensurate with irradiation by 1 GeV, 100 kW proton beams

  11. High power ultrashort pulse lasers

    International Nuclear Information System (INIS)

    Perry, M.D.

    1994-01-01

    Small scale terawatt and soon even petawatt (1000 terawatt) class laser systems are made possible by application of the chirped-pulse amplification technique to solid-state lasers combined with the availability of broad bandwidth materials. These lasers make possible a new class of high gradient accelerators based on the large electric fields associated with intense laser-plasma interactions or from the intense laser field directly. Here, we concentrate on the laser technology to produce these intense pulses. Application of the smallest of these systems to the production of high brightness electron sources is also introduced

  12. Optical interconnection networks for high-performance computing systems

    International Nuclear Information System (INIS)

    Biberman, Aleksandr; Bergman, Keren

    2012-01-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate such feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers. (review article)

  13. Evolution of Very High Frequency Power Supplies

    DEFF Research Database (Denmark)

    Knott, Arnold; Andersen, Toke Meyer; Kamby, Peter

    2013-01-01

    The ongoing demand for smaller and lighter power supplies is driving the motivation to increase the switching frequencies of power converters. Drastic increases however come along with new challenges, namely the increase of switching losses in all components. The application of power circuits used in radio frequency transmission equipment helps to overcome those. However those circuits were not designed to meet the same requirements as power converters. This paper summarizes the contributions in recent years in application of very high frequency (VHF) technologies in power electronics, shows results of the recent advances and describes the remaining challenges. The presented results include a self-oscillating gate-drive, air core inductor optimizations, an offline LED driver with a power density of 8.9 W/cm3 and a 120 MHz, 9 W DC powered LED driver with 89 % efficiency as well as a bidirectional VHF...

  14. The NASA CSTI High Capacity Power Project

    International Nuclear Information System (INIS)

    Winter, J.; Dudenhoefer, J.; Juhasz, A.; Schwarze, G.; Patterson, R.; Ferguson, D.; Schmitz, P.; Vandersande, J.

    1992-01-01

    This paper describes the elements of NASA's CSTI High Capacity Power Project, which include Systems Analysis, Stirling Power Conversion, Thermoelectric Power Conversion, Thermal Management, Power Management, Systems Diagnostics, Environmental Interactions, and Material/Structural Development. Technology advancement in all elements is required to provide the growth capability, high reliability and 7 to 10 year lifetime demanded for future space nuclear power systems. The overall project will develop and demonstrate the technology base required to provide a wide range of modular power systems compatible with the SP-100 reactor which facilitates operation during lunar and planetary day/night cycles as well as allowing spacecraft operation at any attitude or distance from the sun. Significant accomplishments in all of the project elements will be presented, along with revised goals and project timelines recently developed

  15. Improvement of nuclear power plant monitor and control equipment. Computer application backfitting

    International Nuclear Information System (INIS)

    Hayakawa, H.; Kawamura, A.; Suto, O.; Kinoshita, Y.; Toda, Y.

    1985-01-01

    This paper describes the application of advanced computer technology to existing Japanese Boiling Water Reactor (BWR) nuclear power plants for backfitting. First we review the background and objectives of backfitting. Features of backfitting, such as the restrictions and constraints imposed by the existing equipment, are discussed, and ways to overcome these restrictions through the introduction of new technology, such as highly efficient multiplexed data transmission and compact, space-saving computer systems, are described. The role of the computer system in a reliable NPS is described, drawing on a wide spectrum of TOSHIBA backfitting computer system application experience. (author)

  16. HIGH AVERAGE POWER OPTICAL FEL AMPLIFIERS

    International Nuclear Information System (INIS)

    2005-01-01

    Historically, the first demonstration of the optical FEL was in an amplifier configuration at Stanford University [1]. There were other notable instances of amplifying a seed laser, such as the LLNL PALADIN amplifier [2] and the BNL ATF High-Gain Harmonic Generation FEL [3]. However, for the most part FELs are operated as oscillators or self-amplified spontaneous emission devices. Yet, in wavelength regimes where a conventional laser seed can be used, the FEL can be used as an amplifier. One promising application is very high average power generation, for instance FELs with average power of 100 kW or more. The high electron beam power, high brightness and high efficiency that can be achieved with photoinjectors and superconducting Energy Recovery Linacs (ERL) combine well with the high-gain FEL amplifier to produce unprecedented average power FELs. This combination has a number of advantages. In particular, we show that for a given FEL power, an FEL amplifier can introduce lower energy spread in the beam as compared to a traditional oscillator. This property gives the ERL-based FEL amplifier a great wall-plug to optical power efficiency advantage. The optics for an amplifier are simple and compact. In addition to the general features of the high average power FEL amplifier, we will look at a 100 kW class FEL amplifier that is being designed to operate on the 0.5 ampere Energy Recovery Linac under construction at Brookhaven National Laboratory's Collider-Accelerator Department

  17. Application of a High-Power Reversible Converter in a Hybrid Traction Power Supply System

    Directory of Open Access Journals (Sweden)

    Gang Zhang

    2017-03-01

    A high-power reversible converter can achieve a variety of functions, such as recovering regenerative braking energy, expanding traction power capacity, and improving the alternating current (AC) grid power factor. A new hybrid traction power supply scheme, which consists of a high-power reversible converter and two 12-pulse diode rectifiers, is proposed. A droop control method based on load current feed-forward is adopted to realize the load distribution between the reversible converter and the existing 12-pulse diode rectifiers. The direct current (DC) short-circuit characteristics of the reversible converter are studied, and the relationship between the peak fault current and the circuit parameters is obtained from theoretical calculations and validated by computer simulation. The first two sets of 2 MW reversible converters have been successfully applied in Beijing Metro Line 10, the proposed hybrid application scheme and coordinated control strategy are verified, and an average energy saving of 11.15% is achieved.
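
    A minimal sketch of a droop characteristic with load-current feed-forward, the control idea named above, is shown below. The no-load voltage, droop gain, and feed-forward gain are invented illustration values, not the parameters of the Beijing Metro Line 10 installation.

```python
# Droop with load-current feed-forward for sharing a DC traction load between a
# reversible converter and diode rectifiers. All values are assumed for
# illustration only.
V_NO_LOAD = 836.0   # V, nominal DC bus no-load voltage (assumed)
K_DROOP   = 0.02    # V per A of converter current (assumed)
K_FF      = 0.6     # target share of the measured feeder load (assumed)

def converter_voltage_ref(i_converter, i_load_measured):
    """DC voltage reference: droop on own current, shifted by load feed-forward."""
    return V_NO_LOAD - K_DROOP * (i_converter - K_FF * i_load_measured)

# Example: heavy traction load of 3000 A, converter currently supplying 1200 A;
# the feed-forward term raises the reference so the converter picks up its share.
print(f"{converter_voltage_ref(1200.0, 3000.0):.1f} V reference")
```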

  18. Fast Performance Computing Model for Smart Distributed Power Systems

    Directory of Open Access Journals (Sweden)

    Umair Younas

    2017-06-01

    Plug-in Electric Vehicles (PEVs) are becoming a more prominent solution compared to fossil-fuel car technology due to their significant role in Greenhouse Gas (GHG) reduction, flexible storage, and ancillary service provision as a Distributed Generation (DG) resource in Vehicle to Grid (V2G) regulation mode. However, large-scale penetration of PEVs and the growing demand of energy-intensive Data Centers (DCs) bring undesirable peaks in electricity demand and hence impose a supply-demand imbalance and threaten the reliability of the wholesale and retail power market. In order to overcome the aforementioned challenges, the proposed research considers a smart Distributed Power System (DPS) comprising conventional sources, renewable energy, V2G regulation, and flexible storage energy resources. Moreover, price and incentive based Demand Response (DR) programs are implemented to sustain the balance between net demand and available generating resources in the DPS. In addition, we adapted a novel strategy to implement the computationally intensive jobs of the proposed DPS model, including incoming load profiles, V2G regulation, battery State of Charge (SOC) indication, and fast computation in the decision-based automated DR algorithm, using the Fast Performance Computing resources of DCs. In response, the DPS provides economical and stable power to DCs under strict power quality constraints. Finally, the improved results are verified using a case study of ISO California integrated with hybrid generation.

  19. Use of computer codes to improve nuclear power plant operation

    International Nuclear Information System (INIS)

    Misak, J.; Polak, V.; Filo, J.; Gatas, J.

    1985-01-01

    For safety and economic reasons, the scope for carrying out experiments on operational nuclear power plants (NPPs) is very limited and any changes in technical equipment and operating parameters or conditions have to be supported by theoretical calculations. In the Nuclear Power Plant Scientific Research Institute (NIIAEhS), computer codes are systematically used to analyse actual operating events, assess safety aspects of changes in equipment and operating conditions, optimize the conditions, preparation and analysis of NPP startup trials and review and amend operating instructions. In addition, calculation codes are gradually being introduced into power plant computer systems to perform real time processing of the parameters being measured. The paper describes a number of specific examples of the use of calculation codes for the thermohydraulic analysis of operating and accident conditions aimed at improving the operation of WWER-440 units at the Jaslovske Bohunice V-1 and V-2 nuclear power plants. These examples confirm that computer calculations are an effective way of solving operating problems and of further increasing the level of safety and economic efficiency of NPP operation. (author)

  20. ACIGA's high optical power test facility

    International Nuclear Information System (INIS)

    Ju, L; Aoun, M; Barriga, P

    2004-01-01

    Advanced laser interferometer detectors utilizing more than 100 W of laser power and with ∼10⁶ W circulating laser power present many technological problems. The Australian Consortium for Interferometric Gravitational Astronomy (ACIGA) is developing a high power research facility in Gingin, north of Perth, Western Australia, which will test techniques for the next generation interferometers. In particular it will test thermal lensing compensation and control strategies for optical cavities in which optical spring effects and parametric instabilities may present major difficulties

  1. Grid computing in high energy physics

    CERN Document Server

    Avery, P

    2004-01-01

    Over the next two decades, major high energy physics (HEP) experiments, particularly at the Large Hadron Collider, will face unprecedented challenges to achieving their scientific potential. These challenges arise primarily from the rapidly increasing size and complexity of HEP datasets that will be collected and the enormous computational, storage and networking resources that will be deployed by global collaborations in order to process, distribute and analyze them. Coupling such vast information technology resources to globally distributed collaborations of several thousand physicists requires extremely capable computing infrastructures supporting several key areas: (1) computing (providing sufficient computational and storage resources for all processing, simulation and analysis tasks undertaken by the collaborations); (2) networking (deploying high speed networks to transport data quickly between institutions around the world); (3) software (supporting simple and transparent access to data and software r...

  2. Computing in high-energy physics

    International Nuclear Information System (INIS)

    Mount, Richard P.

    2016-01-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software

  3. Computing in high-energy physics

    Science.gov (United States)

    Mount, Richard P.

    2016-04-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  4. Computing trends using graphic processor in high energy physics

    CERN Document Server

    Niculescu, Mihai

    2011-01-01

    One of the main challenges in High Energy Physics is to perform fast analysis of large amounts of experimental and simulated data. At LHC-CERN one p-p event is approximately 1 MB in size. The time taken to analyze the data and obtain results quickly depends on high computational power. The main advantage of using GPU (Graphics Processing Unit) programming over traditional CPU programming is that graphics cards bring a lot of computing power at a very low price. Today a huge number of applications (scientific, financial, etc.) have begun to be ported to or developed for GPUs, including Monte Carlo tools and data analysis tools for High Energy Physics. In this paper, we present the current status and trends in HEP computing using GPUs.

  5. Power Conversion Study for High Temperature Gas-Cooled Reactors

    International Nuclear Information System (INIS)

    Chang Oh; Richard Moore; Robert Barner

    2005-01-01

    The Idaho National Laboratory (INL) is investigating a Brayton cycle efficiency improvement for a high temperature gas-cooled reactor (HTGR) as part of the Generation-IV nuclear engineering research initiative. There are some technical issues to be resolved before the selection of the final design of the high temperature gas-cooled reactor, called the Next Generation Nuclear Plant (NGNP), which is to be built at the INEEL by the year 2017. The technical issues include the selection of the working fluid, direct versus indirect cycle, power cycle type, the optimum number of intercoolers, and others. In this paper, we investigated a number of working fluids for the power conversion loop, direct versus indirect cycle, the effect of intercoolers, and other thermal hydraulics issues; part of the results obtained is presented here. The HYSYS computer code was used along with a computer model developed in Visual Basic.
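
    For orientation, the snippet below performs the kind of back-of-the-envelope Brayton cycle efficiency scan that underlies a power conversion study like the one above, for a simple non-recuperated helium cycle. The component efficiencies and temperatures are generic assumptions, not NGNP design values, and real HTGR cycles with recuperation and intercooling reach considerably higher efficiencies.

```python
# Simple (non-recuperated) Brayton cycle efficiency vs pressure ratio for an
# ideal-gas helium working fluid. All parameter values are assumed.
GAMMA = 5.0 / 3.0                 # helium, monatomic ideal gas
ETA_COMP, ETA_TURB = 0.89, 0.92   # assumed isentropic efficiencies

def brayton_efficiency(pressure_ratio, t_inlet_k=1123.0, t_ambient_k=303.0):
    """Thermal efficiency of a simple Brayton cycle with component losses."""
    exp = (GAMMA - 1.0) / GAMMA
    t2s = t_ambient_k * pressure_ratio**exp          # isentropic compressor exit
    t2 = t_ambient_k + (t2s - t_ambient_k) / ETA_COMP
    t4s = t_inlet_k / pressure_ratio**exp            # isentropic turbine exit
    t4 = t_inlet_k - ETA_TURB * (t_inlet_k - t4s)
    w_net = (t_inlet_k - t4) - (t2 - t_ambient_k)    # per unit cp*mdot
    q_in = t_inlet_k - t2
    return w_net / q_in

for r in (1.5, 2.0, 2.5, 3.0):
    print(f"pressure ratio {r:.1f}: efficiency {brayton_efficiency(r):.3f}")
```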

  6. Parallel computing for event reconstruction in high-energy physics

    International Nuclear Information System (INIS)

    Wolbers, S.

    1993-01-01

    Parallel computing has been recognized as a solution to large computing problems. In High Energy Physics, offline event reconstruction of detector data is a very large computing problem that has been solved with parallel computing techniques. A review is given of the parallel programming package CPS (Cooperative Processes Software), developed and used at Fermilab for offline reconstruction of Terabytes of data requiring the delivery of hundreds of Vax-Years per experiment. The Fermilab UNIX farms, consisting of 180 Silicon Graphics workstations and 144 IBM RS6000 workstations, are used to provide the computing power for the experiments. Fermilab has had a long history of providing production parallel computing, starting with the ACP (Advanced Computer Project) Farms in 1986. The Fermilab UNIX Farms have been in production for over 2 years with 24 hour/day service to experimental user groups. Additional tools for managing, controlling and monitoring these large systems will be described. Possible future directions for parallel computing in High Energy Physics will be given

  7. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

    Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications net...

  8. DURIP: High Performance Computing in Biomathematics Applications

    Science.gov (United States)

    2017-05-10

    The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of...

  9. Power Consumption Evaluation of Distributed Computing Network Considering Traffic Locality

    Science.gov (United States)

    Ogawa, Yukio; Hasegawa, Go; Murata, Masayuki

    When computing resources are consolidated in a few huge data centers, a massive amount of data is transferred to each data center over a wide area network (WAN). This results in increased power consumption in the WAN. A distributed computing network (DCN), such as a content delivery network, can reduce the traffic from/to the data center, thereby decreasing the power consumed in the WAN. In this paper, we focus on the energy-saving aspect of the DCN and evaluate its effectiveness, especially considering traffic locality, i.e., the amount of traffic related to the geographical vicinity. We first formulate the problem of optimizing the DCN power consumption and describe the DCN in detail. Then, numerical evaluations show that, when there is strong traffic locality and the router has ideal energy proportionality, the system's power consumption is reduced to about 50% of the power consumed in the case where a DCN is not used; moreover, this advantage becomes even larger (up to about 30%) when the data center is located farthest from the center of the network topology.
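
    The toy model below captures the qualitative effect described above: the larger the fraction of traffic that stays near its source, the lower the WAN power, assuming energy roughly proportional to bits times hops. The hop counts and per-bit energy are invented illustration values, not the paper's formulation or parameters.

```python
# Toy comparison of WAN power with and without traffic locality. All constants
# are assumed for illustration.
E_PER_BIT_HOP = 10e-9      # J per bit per router hop (assumed)
HOPS_TO_CENTRAL_DC = 6     # average hops to the central data center (assumed)
HOPS_TO_LOCAL_NODE = 2     # average hops to a nearby DCN node (assumed)

def wan_power(bitrate_bps, locality):
    """WAN power when a fraction `locality` of traffic stays near its source."""
    local = locality * bitrate_bps * HOPS_TO_LOCAL_NODE * E_PER_BIT_HOP
    remote = (1 - locality) * bitrate_bps * HOPS_TO_CENTRAL_DC * E_PER_BIT_HOP
    return local + remote

rate = 100e9  # 100 Gbit/s aggregate user traffic (assumed)
for loc in (0.0, 0.5, 0.9):
    ratio = wan_power(rate, loc) / wan_power(rate, 0.0)
    print(f"locality {loc:.1f}: relative WAN power {ratio:.2f}")
```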

  10. High average-power induction linacs

    International Nuclear Information System (INIS)

    Prono, D.S.; Barrett, D.; Bowles, E.; Caporaso, G.J.; Chen, Yu-Jiuan; Clark, J.C.; Coffield, F.; Newton, M.A.; Nexsen, W.; Ravenscroft, D.; Turner, W.C.; Watson, J.A.

    1989-01-01

    Induction linear accelerators (LIAs) are inherently capable of accelerating several thousand amperes of ∼ 50-ns duration pulses to > 100 MeV. In this paper the authors report progress and status in the areas of duty factor and stray power management. These technologies are vital if LIAs are to attain high average power operation. 13 figs

  11. High average-power induction linacs

    International Nuclear Information System (INIS)

    Prono, D.S.; Barrett, D.; Bowles, E.

    1989-01-01

    Induction linear accelerators (LIAs) are inherently capable of accelerating several thousand amperes of approximately 50-ns duration pulses to > 100 MeV. In this paper we report progress and status in the areas of duty factor and stray power management. These technologies are vital if LIAs are to attain high average power operation. 13 figs

  12. Driver Circuit For High-Power MOSFET's

    Science.gov (United States)

    Letzer, Kevin A.

    1991-01-01

    Driver circuit generates rapid-voltage-transition pulses needed to switch high-power metal oxide/semiconductor field-effect transistor (MOSFET) modules rapidly between full "on" and full "off". Rapid switching reduces time of overlap between appreciable current through and appreciable voltage across such modules, thereby increasing power efficiency.

  13. Computer based training simulator for Hunterston Nuclear Power Station

    International Nuclear Information System (INIS)

    Bowden, R.S.M.; Hacking, D.

    1978-01-01

    For reasons which are stated, the Hunterston-B nuclear power station automatic control system includes a manual over-ride facility. It is therefore essential for the station engineers to be trained to recognise and control all feasible modes of plant and logic malfunction. A training simulator has been built which consists of a replica of the shutdown monitoring panel in the Central Control Room and is controlled by a mini-computer. This paper highlights the computer aspects of the simulator and relevant derived experience, under the following headings: engineering background; shutdown sequence equipment; simulator equipment; features; software; testing; maintenance. (U.K.)

  14. On Computational Power of Quantum Read-Once Branching Programs

    Directory of Open Access Journals (Sweden)

    Farid Ablayev

    2011-03-01

    In this paper we review our current results concerning the computational power of quantum read-once branching programs. First of all, based on the circuit presentation of quantum branching programs and our variant of the quantum fingerprinting technique, we show that any Boolean function with a linear polynomial presentation can be computed by a quantum read-once branching program using a relatively small (usually logarithmic in the size of the input) number of qubits. Then we show that the described class of Boolean functions is closed under polynomial projections.

  15. ICAN: High power neutral beam generation

    International Nuclear Information System (INIS)

    Moustaizis, S.D.; Lalousis, P.; Perrakis, K.; Auvray, P.; Larour, J.; Ducret, J.E.; Balcou, P.

    2015-01-01

    In recent years there has been increasing interest in the development of alternative high power negative ion sources for Tokamak applications. The proposed new neutral beam device presents a number of advantages with respect to the current density, the acceleration voltage, the relatively compact dimensions of the negative ion source, and the coupling of a high power laser beam for photo-neutralization of the negative ion beam. Here we numerically investigate, using a multi-fluid 1-D code, the acceleration and the extraction of a high power ion beam from a Magnetically Insulated Diode (MID). The diode configuration will be coupled to a high power device capable of extracting a current of up to a few kA with an accelerating voltage of up to the MeV level. An efficiency of up to 92% in the coupling of the laser beam is required in order to obtain a high power, up to GW, neutral beam. The new high energy, high average power, high efficiency (up to 30%) ICAN fiber laser is proposed for both the plasma generation and the photo-neutralizer configuration. (authors)

  16. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    , immersive environment. The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL supported research teams from Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale, immersive facilities and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, and the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool, was demonstrated. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.

  17. Dimensioning storage and computing clusters for efficient High Throughput Computing

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Scientific experiments are producing huge amounts of data, and they continue increasing the size of their datasets and the total volume of data. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of Scientific Data Centres has shifted from coping efficiently with PetaByte scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centres is of crucial importance to cope with all the activities demanded by the experiments, both online (data acceptance) and offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful s...

  18. GRID computing for experimental high energy physics

    International Nuclear Information System (INIS)

    Moloney, G.R.; Martin, L.; Seviour, E.; Taylor, G.N.; Moorhead, G.F.

    2002-01-01

    Full text: The Large Hadron Collider (LHC), to be completed at the CERN laboratory in 2006, will generate 11 petabytes of data per year. The processing of this large data stream requires a large, distributed computing infrastructure. A recent innovation in high performance distributed computing, the GRID, has been identified as an important tool in data analysis for the LHC. GRID computing has actual and potential application in many fields which require computationally intensive analysis of large, shared data sets. The Australian experimental High Energy Physics community has formed partnerships with the High Performance Computing community to establish a GRID node at the University of Melbourne. Through Australian membership of the ATLAS experiment at the LHC, Australian researchers have an opportunity to be involved in the European DataGRID project. This presentation will include an introduction to the GRID, and its application to experimental High Energy Physics. We will present the results of our studies, including participation in the first LHC data challenge

  19. Low Power Design with High-Level Power Estimation and Power-Aware Synthesis

    CERN Document Server

    Ahuja, Sumit; Shukla, Sandeep Kumar

    2012-01-01

    Low-power ASIC/FPGA based designs are important due to the need for extended battery life, reduced form factor, and lower packaging and cooling costs for electronic devices. These products require fast turnaround time because of the increasing demand for handheld electronic devices such as cell-phones, PDAs and high performance machines for data centers. To achieve short time to market, design flows must facilitate a much shortened time-to-product requirement. High-level modeling, architectural exploration and direct synthesis of design from high level description enable this design process. This book presents novel research techniques, algorithms, methodologies and experimental results for high level power estimation and power aware high-level synthesis. Readers will learn to apply such techniques to enable design flows resulting in shorter time to market and successful low power ASIC/FPGA design. Integrates power estimation and reduction for high level synthesis, with low-power, high-level design; Shows spec...

  20. High power density carbonate fuel cell

    Energy Technology Data Exchange (ETDEWEB)

    Yuh, C.; Johnsen, R.; Doyon, J.; Allen, J. [Energy Research Corp., Danbury, CT (United States)]

    1996-12-31

    Carbonate fuel cell is a highly efficient and environmentally clean source of power generation, and many organizations worldwide are actively pursuing the development of the technology. Field demonstration of a multi-MW size power plant was initiated in 1996, a step toward commercialization before the turn of the century. Energy Research Corporation (ERC) is planning to introduce a 2.85 MW commercial fuel cell power plant with an efficiency of 58%, which is quite attractive for distributed power generation. However, to further expand its competitive edge over alternative systems and to achieve wider market penetration, ERC is exploring advanced carbonate fuel cells having significantly higher power densities. A more compact power plant would also stimulate interest in new markets such as ships and submarines where space limitations exist. The activities focused on reducing cell polarization and internal resistance as well as on advanced thin cell components.

  1. High-performance computing for airborne applications

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Manuzatto, Andrea; Fairbanks, Tom; Dallmann, Nicholas; Desgeorges, Rose

    2010-01-01

    Recently, there have been attempts to move common satellite tasks to unmanned aerial vehicles (UAVs). UAVs are significantly cheaper to buy than satellites and easier to deploy on an as-needed basis. The more benign radiation environment also allows for an aggressive adoption of state-of-the-art commercial computational devices, which increases the amount of data that can be collected. There are a number of commercial computing devices currently available that are well-suited to high-performance computing. These devices range from specialized computational devices, such as field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), to traditional computing platforms, such as microprocessors. Even though the radiation environment is relatively benign, these devices could be susceptible to single-event effects. In this paper, we present radiation data for high-performance computing devices in an accelerated neutron environment. These devices include a multi-core digital signal processor, two field-programmable gate arrays, and a microprocessor. From these results, we found that all of these devices are suitable for many airplane environments without reliability problems.

  2. High Performance Computing in Science and Engineering '08 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2009-01-01

    The discussions and plans on all scientific, advisory, and political levels to realize an even larger “European Supercomputer” in Germany, where the hardware costs alone will be hundreds of millions Euro – much more than in the past – are getting closer to realization. As part of the strategy, the three national supercomputing centres HLRS (Stuttgart), NIC/JSC (Jülich) and LRZ (Munich) have formed the Gauss Centre for Supercomputing (GCS) as a new virtual organization enabled by an agreement between the Federal Ministry of Education and Research (BMBF) and the state ministries for research of Baden-Württemberg, Bayern, and Nordrhein-Westfalen. Already today, the GCS provides the most powerful high-performance computing infrastructure in Europe. Through GCS, HLRS participates in the European project PRACE (Partnership for Advanced Computing in Europe) and extends its reach to all European member countries. These activities align well with the activities of HLRS in the European HPC infrastructur...

  3. Development of distributed computer systems for future nuclear power plants

    International Nuclear Information System (INIS)

    Yan, G.; L'Archeveque, J.V.R.

    1978-01-01

    Dual computers have been used for direct digital control in CANDU power reactors since 1963. However, as reactor plants have grown in size and complexity, some drawbacks of centralized control have appeared, for example the surprisingly large amount of cabling required for information transmission. Dramatic changes in the costs of components and a desire to improve system performance have stimulated a broad-based research and development effort in distributed systems. This paper outlines work in this area

  4. High Voltage Power Transmission for Wind Energy

    Science.gov (United States)

    Kim, Young il

    The high wind speeds and wide available area at sea have recently increased interest in offshore wind farms in the U.S.A. As offshore wind farms become larger and are placed further from the shore, power transmission to the onshore grid becomes a key issue. Power transmission from an offshore wind farm, where better wind conditions and a larger installation area than at an onshore site are available, requires the use of submarine cable systems, and an underground power cable system presents unique design and installation challenges not found in the overhead power cable environment. This paper presents an analysis of the benefits and drawbacks of three different transmission solutions, HVAC, LCC HVDC and VSC HVDC, for the grid connection of offshore wind farms, and also analyzes the electrical characteristics of underground cables. In particular, the losses of the HV (high voltage) subsea transmission cables were evaluated by Brakelmann's theory, taking into account the distributions of current and temperature.
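
    The snippet below gives a rough per-kilometre loss estimate for a three-phase HVAC submarine cable, illustrating the kind of calculation referred to above. It is a simplified take (temperature-corrected ohmic loss plus charging current), not Brakelmann's full method, and the conductor data, temperature, and capacitance are generic assumptions for a 150 kV class XLPE cable.

```python
# Simplified HVAC submarine cable loss estimate. All cable data are assumed.
import math

R20_PER_KM = 0.0366   # ohm/km DC resistance of an 800 mm^2 copper core (assumed)
ALPHA_CU = 0.00393    # 1/K temperature coefficient of copper
C_PER_KM = 0.18e-6    # F/km cable capacitance (assumed)
U_LINE = 150e3        # V line-to-line voltage (assumed)
FREQ = 50.0           # Hz

def conductor_loss_kw_per_km(i_load_a, conductor_temp_c=70.0):
    """Ohmic loss of all three phases per km at the given conductor temperature."""
    r = R20_PER_KM * (1.0 + ALPHA_CU * (conductor_temp_c - 20.0))
    return 3.0 * i_load_a**2 * r / 1e3

def charging_current_a_per_km():
    """Capacitive charging current drawn per km of cable (per phase)."""
    return 2.0 * math.pi * FREQ * C_PER_KM * U_LINE / math.sqrt(3.0)

print(f"ohmic loss at 600 A: {conductor_loss_kw_per_km(600.0):.1f} kW/km")
print(f"charging current:    {charging_current_a_per_km():.2f} A/km")
```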

  5. Gingin High Optical Power Test Facility

    International Nuclear Information System (INIS)

    Zhao, C; Blair, D G; Barrigo, P

    2006-01-01

    The Australian Consortium for Gravitational Wave Astronomy (ACIGA) in collaboration with LIGO is developing a high optical power research facility at the AIGO site, Gingin, Western Australia. Research at the facility will provide solutions to the problems that advanced gravitational wave detectors will encounter with extremely high optical power. The problems include thermal lensing and parametric instabilities. This article will present the status of the facility and the plan for the future experiments

  6. Computer techniques for experimental work in GDR nuclear power plants with WWER

    International Nuclear Information System (INIS)

    Stemmler, G.

    1985-01-01

    Nuclear power plant units with WWER are being increasingly equipped with high-performance, programmable process control computers. There are, however, essential reasons for further advancing the development of computer-aided measuring systems, in particular for experimental work. A special structure of such systems, based on a division between relatively rigid data registration and primary processing on the one hand, and further processing using an advanced programming language on the other, has proved useful in the GDR. (author)

  7. Application of Nearly Linear Solvers to Electric Power System Computation

    Science.gov (United States)

    Grant, Lisa L.

    To meet the future needs of the electric power system, improvements need to be made in the areas of power system algorithms, simulation, and modeling, specifically to achieve a time frame that is useful to industry. If power system time-domain simulations could run in real-time, then system operators would have situational awareness to implement online control and avoid cascading failures, significantly improving power system reliability. Several power system applications rely on the solution of a very large linear system. As the demands on power systems continue to grow, there is a greater computational complexity involved in solving these large linear systems within reasonable time. This project expands on the current work in fast linear solvers, developed for solving symmetric and diagonally dominant linear systems, in order to produce power system specific methods that can be solved in nearly-linear run times. The work explores a new theoretical method that is based on ideas in graph theory and combinatorics. The technique builds a chain of progressively smaller approximate systems with preconditioners based on the system's low stretch spanning tree. The method is compared to traditional linear solvers and shown to reduce the time and iterations required for an accurate solution, especially as the system size increases. A simulation validation is performed, comparing the solution capabilities of the chain method to LU factorization, which is the standard linear solver for power flow. The chain method was successfully demonstrated to produce accurate solutions for power flow simulation on a number of IEEE test cases, and a discussion on how to further improve the method's speed and accuracy is included.
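
    To illustrate the kind of comparison described above, the sketch below solves a diagonally dominant grid-Laplacian system with both a direct sparse LU factorization and a preconditioned conjugate-gradient iteration. The incomplete-LU preconditioner is only a stand-in; the low-stretch spanning-tree chain method itself is not implemented here.

```python
# Direct vs preconditioned iterative solve of a symmetric diagonally dominant
# system (2-D Dirichlet Laplacian), as a stand-in for the comparison above.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 100                                   # 100 x 100 grid -> 10,000 unknowns
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
T = sp.diags([off, main, off], [-1, 0, 1], format="csc")
I = sp.identity(n, format="csc")
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()   # SDD 2-D Laplacian
b = np.random.rand(n * n)

# Direct solve (the power-flow baseline in the comparison above)
x_lu = spla.splu(A).solve(b)

# Iterative solve with an incomplete-LU preconditioner as a stand-in
ilu = spla.spilu(A, drop_tol=1e-4)
M = spla.LinearOperator(A.shape, ilu.solve)
x_cg, info = spla.cg(A, b, M=M)

print("CG converged:", info == 0,
      " max difference vs LU:", np.abs(x_cg - x_lu).max())
```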

  8. Inverter design for high frequency power distribution

    Science.gov (United States)

    King, R. J.

    1985-01-01

    A class of simple resonantly commutated inverters is investigated for use in a high power (100 kW - 1000 kW), high frequency (10 kHz - 20 kHz) AC power distribution system. The Mapham inverter is found to provide a unique combination of large thyristor turn-off angle and good utilization factor, much better than an alternative 'current-fed' inverter. The effects of loading the Mapham inverter entirely with rectifier loads are investigated by simulation and with an experimental 3 kW 20 kHz inverter. This inverter is found to be well suited to a power system with heavy rectifier loading.

  9. Assessing Power Monitoring Approaches for Energy and Power Analysis of Computers

    OpenAIRE

    El Mehdi Diouria, Mohammed; Dolz Zaragozá, Manuel Francisco; Glückc, Olivier; Lefèvre, Laurent; Alonso, Pedro; Catalán Pallarés, Sandra; Mayo, Rafael; Quintana Ortí, Enrique S.

    2014-01-01

    Large-scale distributed systems (e.g., datacenters, HPC systems, clouds, large-scale networks, etc.) consume and will consume enormous amounts of energy. Therefore, accurately monitoring the power dissipation and energy consumption of these systems has become unavoidable. The main novelty of this contribution is the analysis and evaluation of different external and internal power monitoring devices tested using two different computing systems, a server and a desktop machine. Furthermore, we prov...

  10. The ongoing investigation of high performance parallel computing in HEP

    CERN Document Server

    Peach, Kenneth J; Böck, R K; Dobinson, Robert W; Hansroul, M; Norton, Alan Robert; Willers, Ian Malcolm; Baud, J P; Carminati, F; Gagliardi, F; McIntosh, E; Metcalf, M; Robertson, L; CERN. Geneva. Detector Research and Development Committee

    1993-01-01

    Past and current exploitation of parallel computing in High Energy Physics is summarized and a list of R & D projects in this area is presented. The applicability of new parallel hardware and software to physics problems is investigated, in the light of the requirements for computing power of LHC experiments and the current trends in the computer industry. Four main themes are discussed (possibilities for a finer grain of parallelism; fine-grain communication mechanism; usable parallel programming environment; different programming models and architectures, using standard commercial products). Parallel computing technology is potentially of interest for offline and vital for real time applications in LHC. A substantial investment in applications development and evaluation of state of the art hardware and software products is needed. A solid development environment is required at an early stage, before mainline LHC program development begins.

  11. CUDA/GPU Technology : Parallel Programming For High Performance Scientific Computing

    OpenAIRE

    YUHENDRA; KUZE, Hiroaki; JOSAPHAT, Tetuko Sri Sumantyo

    2009-01-01

    Graphics processing units (GPUs), originally designed for computer video cards, have emerged as the most powerful chips in high-performance workstations. In terms of high performance computation capabilities, graphics processing units (GPUs) lead to much more powerful performance than conventional CPUs by means of parallel processing. In 2007, the birth of Compute Unified Device Architecture (CUDA) and CUDA-enabled GPUs by NVIDIA Corporation brought a revolution in the general purpose GPU a...

  12. Careful determination of inservice inspection of piping by computer analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Lim, H. T.; Lee, S. L.; Lee, J. P.; Kim, B. C.

    1992-01-01

    Stress analysis has been performed using the computer program ANSYS on the pressurizer surge line in order to predict the possibility of crack generation due to thermal stratification phenomena in pipes connected to the reactor coolant system of nuclear power plants. The area most vulnerable to crack generation has been identified through analysis of fatigue due to thermal stress in the pressurizer surge line. This kind of result can be helpful in choosing the locations requiring intensive care during inservice inspection of nuclear power plants.

  13. High-performance computing in seismology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    The scientific, technical, and economic importance of the issues discussed here presents a clear agenda for future research in computational seismology. In this way these problems will drive advances in high-performance computing in the field of seismology. There is a broad community that will benefit from this work, including the petroleum industry, research geophysicists, engineers concerned with seismic hazard mitigation, and governments charged with enforcing a comprehensive test ban treaty. These advances may also lead to new applications for seismological research. The recent application of high-resolution seismic imaging of the shallow subsurface for the environmental remediation industry is an example of this activity. This report makes the following recommendations: (1) focused efforts to develop validated documented software for seismological computations should be supported, with special emphasis on scalable algorithms for parallel processors; (2) the education of seismologists in high-performance computing technologies and methodologies should be improved; (3) collaborations between seismologists and computational scientists and engineers should be increased; (4) the infrastructure for archiving, disseminating, and processing large volumes of seismological data should be improved.

  14. Modeling high-power RF accelerator cavities with SPICE

    International Nuclear Information System (INIS)

    Humphries, S. Jr.

    1992-01-01

    The dynamical interactions between RF accelerator cavities and high-power beams can be treated on personal computers using a lumped circuit element model and the SPICE circuit analysis code. Applications include studies of wake potentials, two-beam accelerators, microwave sources, and transverse mode damping. This report describes the construction of analogs for TM_mn0 modes and the creation of SPICE input for cylindrical cavities. The models were used to study continuous generation of kA electron beam pulses from a vacuum cavity driven by a high-power RF source.

  15. Soft computing for fault diagnosis in power plants

    International Nuclear Information System (INIS)

    Ciftcioglu, O.; Turkcan, E.

    1998-01-01

    Considering the advancements in AI technology, a new concept known as soft computing has arisen. It can be defined as the processing of uncertain information with AI methods, referring explicitly to methods using neural networks, fuzzy logic and evolutionary algorithms. In this respect, soft computing is a new dimension in information processing technology where linguistic information can also be processed, in contrast with the classical stochastic and deterministic treatments of data. On one hand it can process uncertain/incomplete information and on the other hand it can deal with the non-linearity of large-scale systems, where uncertainty is particularly relevant with respect to linguistic information and incompleteness is related to fault tolerance in fault diagnosis. In this perspective, the potential role of soft computing in power plant operation is presented. (author)

  16. HEDPIN: a computer program to estimate pinwise power density

    International Nuclear Information System (INIS)

    Cappiello, M.W.

    1976-05-01

    A description is given of the digital computer program, HEDPIN. This program, modeled after a previously developed program, POWPIN, provides a means of estimating the pinwise power density distribution in fast reactor triangular pitched pin bundles. The capability also exists for computing any reaction rate of interest at the respective pin positions within an assembly. HEDPIN was developed in support of FTR fuel and test management as well as fast reactor core design and core characterization planning and analysis. The results of a test devised to check out HEDPIN's computational method are given, and the realm of application is discussed. Nearly all programming is in FORTRAN IV. Variable dimensioning is employed to make efficient use of core memory and maintain short running time for small problems. Input instructions, sample problem, and a program listing are also given

  17. High performance parallel computers for science

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1989-01-01

    This paper reports that Fermilab's Advanced Computer Program (ACP) has been developing cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 Mflops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter and intra crate communication. The crates are connected in a hypercube. Site oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop system is under construction

  18. Small high cooling power space cooler

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, T. V.; Raab, J.; Durand, D.; Tward, E. [Northrop Grumman Aerospace Systems Redondo Beach, Ca, 90278 (United States)

    2014-01-29

    The small High Efficiency pulse tube Cooler (HEC), which has been produced and flown on a number of space infrared instruments, was originally designed to provide cooling of 10 W @ 95 K. It achieved its goal with >50% margin when limited by the 180 W output ac power of its flight electronics. It has also been produced in 2 stage configurations, typically for simultaneous cooling of focal planes to temperatures as low as 35 K and optics at higher temperatures. The need for even higher cooling power in such a low mass cryocooler is motivated by the advent of large focal plane arrays. With the current availability at NGAS of much larger power cryocooler flight electronics, reliable long term operation in space with much larger cooling powers is now possible with the flight proven 4 kg HEC mechanical cooler. Even though the single stage cooler design can be re-qualified for those larger input powers without design change, we redesigned both the linear and coaxial version passive pulse tube cold heads to re-optimize them for high power cooling at temperatures above 130 K while rejecting heat to 300 K. Small changes to the regenerator packing, the re-optimization of the tuned inertance and no change to the compressor resulted in the increased performance at 150 K. The cooler operating at 290 W input power achieves 35 W @ 150 K, corresponding to a specific cooling power at 150 K of 8.25 W/W and a very high specific power of 72.5 W/kg. At these powers the cooler still maintains large stroke, thermal and current margins. In this paper we will present the measured data and the changes to this flight proven cooler that were made to achieve this increased performance.
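
    The headline figures of merit quoted above follow directly from the reported numbers (290 W of input power, 35 W of cooling at 150 K, and a 4 kg cooler mass); a quick arithmetic check:

        # Quick check of the reported cooler figures of merit.
        input_power_w = 290.0    # electrical input power
        cooling_power_w = 35.0   # cooling power delivered at 150 K
        cooler_mass_kg = 4.0     # mass of the HEC mechanical cooler

        specific_cooling_power = input_power_w / cooling_power_w  # W of input per W of cooling
        specific_power = input_power_w / cooler_mass_kg           # W of input per kg of cooler

        # Roughly 8.3 W/W, consistent with the quoted 8.25 W/W within rounding, and 72.5 W/kg.
        print(f"Specific cooling power at 150 K: {specific_cooling_power:.2f} W/W")
        print(f"Specific power: {specific_power:.1f} W/kg")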

  19. Grid Computing in High Energy Physics

    International Nuclear Information System (INIS)

    Avery, Paul

    2004-01-01

    Over the next two decades, major high energy physics (HEP) experiments, particularly at the Large Hadron Collider, will face unprecedented challenges to achieving their scientific potential. These challenges arise primarily from the rapidly increasing size and complexity of HEP datasets that will be collected and the enormous computational, storage and networking resources that will be deployed by global collaborations in order to process, distribute and analyze them. Coupling such vast information technology resources to globally distributed collaborations of several thousand physicists requires extremely capable computing infrastructures supporting several key areas: (1) computing (providing sufficient computational and storage resources for all processing, simulation and analysis tasks undertaken by the collaborations); (2) networking (deploying high speed networks to transport data quickly between institutions around the world); (3) software (supporting simple and transparent access to data and software resources, regardless of location); (4) collaboration (providing tools that allow members full and fair access to all collaboration resources and enable distributed teams to work effectively, irrespective of location); and (5) education, training and outreach (providing resources and mechanisms for training students and for communicating important information to the public). It is believed that computing infrastructures based on Data Grids and optical networks can meet these challenges and can offer data intensive enterprises in high energy physics and elsewhere a comprehensive, scalable framework for collaboration and resource sharing. A number of Data Grid projects have been underway since 1999. Interestingly, the most exciting and far ranging of these projects are led by collaborations of high energy physicists, computer scientists and scientists from other disciplines in support of experiments with massive, near-term data needs. I review progress in this

  20. High performance computing on vector systems

    CERN Document Server

    Roller, Sabine

    2008-01-01

    Presents the developments in high-performance computing and simulation on modern supercomputer architectures. This book covers trends in hardware and software development in general and specifically the vector-based systems and heterogeneous architectures. It presents innovative fields like coupled multi-physics or multi-scale simulations.

  1. Applications of parallel computer architectures to the real-time simulation of nuclear power systems

    International Nuclear Information System (INIS)

    Doster, J.M.; Sills, E.D.

    1988-01-01

    In this paper the authors report on efforts to utilize parallel computer architectures for the thermal-hydraulic simulation of nuclear power systems and current research efforts toward the development of advanced reactor operator aids and control systems based on this new technology. Many aspects of reactor thermal-hydraulic calculations are inherently parallel, and the computationally intensive portions of these calculations can be effectively implemented on modern computers. Timing studies indicate faster-than-real-time, high-fidelity physics models can be developed when the computational algorithms are designed to take advantage of the computer's architecture. These capabilities allow for the development of novel control systems and advanced reactor operator aids. Coupled with an integral real-time data acquisition system, evolving parallel computer architectures can provide operators and control room designers improved control and protection capabilities. Research efforts are currently under way in this area.

  2. Toward High-Power Klystrons With RF Power Conversion Efficiency on the Order of 90%

    CERN Document Server

    Baikov, Andrey Yu; Syratchev, Igor

    2015-01-01

    The increase in efficiency of RF power generation for future large accelerators is considered a high priority issue. The vast majority of the existing commercial high-power RF klystrons operates in the electronic efficiency range between 40% and 55%. Only a few klystrons available on the market are capable of operating with 65% efficiency or above. In this paper, a new method to achieve 90% RF power conversion efficiency in a klystron amplifier is presented. The essential part of this method is a new bunching technique - bunching with bunch core oscillations. Computer simulations confirm that the RF production efficiency above 90% can be reached with this new bunching method. The results of a preliminary study of an L-band, 20-MW peak RF power multibeam klystron for Compact Linear Collider with the efficiency above 85% are presented.

  3. Analysis of three-phase power-supply systems using computer-aided design programs

    International Nuclear Information System (INIS)

    Oberst, E.F.

    1977-01-01

    A major concern of every designer of large, three-phase power-supply systems is the protection of system components from overvoltage transients. At present, three computer-aided circuit design programs are available in the Magnetic Fusion Energy (MFE) National Computer Center that can be used to analyze three-phase power systems: MINI SCEPTRE, SPICE I, and SPICE II. These programs have been used at Lawrence Livermore Laboratory (LLL) to analyze the operation of a 200-kV dc, 20-A acceleration power supply for the High Voltage Test Stand. Various overvoltage conditions are simulated and the effectiveness of system protective devices is observed. The simulated overvoltage conditions include such things as circuit breaker openings, pulsed loading, and commutation voltage surges in the rectifiers. These examples are used to illustrate the use of the computer-aided, circuit-design programs discussed in this paper

  4. A High Performance VLSI Computer Architecture For Computer Graphics

    Science.gov (United States)

    Chin, Chi-Yuan; Lin, Wen-Tai

    1988-10-01

    A VLSI computer architecture, consisting of multiple processors, is presented in this paper to satisfy modern computer graphics demands, e.g. high resolution, realistic animation, real-time display, etc. All processors share a global memory which is partitioned into multiple banks. Through a crossbar network, data from one memory bank can be broadcast to many processors. Processors are physically interconnected through a hyper-crossbar network (a crossbar-like network). By programming the network, the topology of communication links among processors can be reconfigured to satisfy specific dataflows of different applications. Each processor consists of a controller, arithmetic operators, local memory, a local crossbar network, and I/O ports to communicate with other processors, memory banks, and a system controller. Operations in each processor are characterized into two modes, i.e. object domain and space domain, to fully utilize the data-independency characteristics of graphics processing. Special graphics features such as 3D-to-2D conversion, shadow generation, texturing, and reflection can be easily handled. With the current high density interconnection (MI) technology, it is feasible to implement a 64-processor system to achieve 2.5 billion operations per second, a performance needed in most advanced graphics applications.

  5. Reducing power consumption during execution of an application on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-06-05

    Methods, apparatus, and products are disclosed for reducing power consumption during execution of an application on a plurality of compute nodes that include: executing, by each compute node, an application, the application including power consumption directives corresponding to one or more portions of the application; identifying, by each compute node, the power consumption directives included within the application during execution of the portions of the application corresponding to those identified power consumption directives; and reducing power, by each compute node, to one or more components of that compute node according to the identified power consumption directives during execution of the portions of the application corresponding to those identified power consumption directives.
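
    As a rough illustration of the idea (the directive format and the power-control call below are hypothetical stand-ins, not the patented implementation), each compute node identifies the power consumption directives attached to a portion of the application and throttles the named components while that portion executes:

        # Minimal sketch of directive-driven power reduction on a compute node.
        # The directive format and set_component_power_cap() are hypothetical;
        # a real system would hook into the node's power-management hardware.

        def set_component_power_cap(component, level):
            print(f"[node] setting {component} power to {level}")  # stand-in for a firmware call

        def run_portion(name):
            print(f"[node] executing portion {name}")              # stand-in for real work

        application = [
            {"portion": "setup",           "directives": {}},
            {"portion": "collective_wait", "directives": {"cpu": "low", "memory": "low"}},
            {"portion": "compute_kernel",  "directives": {}},
        ]

        for step in application:
            # Identify the power consumption directives for this portion ...
            for component, level in step["directives"].items():
                set_component_power_cap(component, level)          # ... and reduce power accordingly
            run_portion(step["portion"])
            # Restore nominal power once the directed portion completes.
            for component in step["directives"]:
                set_component_power_cap(component, "nominal")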

  6. Profiling an application for power consumption during execution on a compute node

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Peters, Amanda E; Ratterman, Joseph D; Smith, Brian E

    2013-09-17

    Methods, apparatus, and products are disclosed for profiling an application for power consumption during execution on a compute node that include: receiving an application for execution on a compute node; identifying a hardware power consumption profile for the compute node, the hardware power consumption profile specifying power consumption for compute node hardware during performance of various processing operations; determining a power consumption profile for the application in dependence upon the application and the hardware power consumption profile for the compute node; and reporting the power consumption profile for the application.
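
    Conceptually (the per-operation power figures below are hypothetical placeholders), the application's power consumption profile is obtained by weighting a hardware power consumption profile with the mix of processing operations observed while the application runs, roughly as sketched here:

        # Minimal sketch: derive an application's power-consumption profile from a
        # hardware power profile (watts per operation type; values are hypothetical)
        # and the mix of operations observed while the application executes.

        hardware_power_profile = {      # W drawn by node hardware per operation type
            "compute": 95.0,
            "memory":  35.0,
            "network": 20.0,
            "idle":    60.0,
        }

        observed_time_share = {         # fraction of runtime spent in each operation type
            "compute": 0.55,
            "memory":  0.25,
            "network": 0.10,
            "idle":    0.10,
        }

        application_power_profile = {
            op: hardware_power_profile[op] * share
            for op, share in observed_time_share.items()
        }
        average_power = sum(application_power_profile.values())

        # "Reporting" step: print the per-operation contributions and the total.
        for op, watts in application_power_profile.items():
            print(f"{op:8s} {watts:6.1f} W")
        print(f"average  {average_power:6.1f} W")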

  7. Profiling an application for power consumption during execution on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.; Peters, Amanda E.; Ratterman, Joseph D.; Smith, Brian E.

    2012-08-21

    Methods, apparatus, and products are disclosed for profiling an application for power consumption during execution on a compute node that include: receiving an application for execution on a compute node; identifying a hardware power consumption profile for the compute node, the hardware power consumption profile specifying power consumption for compute node hardware during performance of various processing operations; determining a power consumption profile for the application in dependence upon the application and the hardware power consumption profile for the compute node; and reporting the power consumption profile for the application.

  8. High performance computing in science and engineering '09: transactions of the High Performance Computing Center, Stuttgart (HLRS) 2009

    National Research Council Canada - National Science Library

    Nagel, Wolfgang E; Kröner, Dietmar; Resch, Michael

    2010-01-01

    ...), NIC/JSC (Jülich), and LRZ (Munich). As part of that strategic initiative, NIC/JSC already installed in May 2009 the first phase of the GCS HPC Tier-0 resources, an IBM Blue Gene/P with roughly 300,000 cores, this time in Jülich. With that, the GCS provides the most powerful high-performance computing infrastructure in Europe alread...

  9. High power neutral beam injection in LHD

    International Nuclear Information System (INIS)

    Tsumori, K.; Takeiri, Y.; Nagaoka, K.

    2005-01-01

    The results of high power injection with the neutral beam injection (NBI) system for the large helical device (LHD) are reported. The system consists of three beam-lines, and two hydrogen negative ion (H⁻ ion) sources are installed in each beam-line. In order to improve the injection power, a new beam accelerator with a multi-slot grounded grid (MSGG) has been developed and applied to one of the beam-lines. Using this accelerator, maximum powers of 5.7 MW were achieved in 2003 and 2004, and a maximum energy of 189 keV was reached. The power and energy exceeded the design values of the individual beam-line for LHD. The other beam-lines also increased their injection power up to about 4 MW, and a total injection power of 13.1 MW was achieved with the three beam-lines in 2003. Although the accelerator had an advantage in high power beam injection, it involved a drawback in the beam focal condition. The disadvantage was resolved by modifying the aperture shapes of the steering grid. (author)

  10. Nuclear power reactor analysis, methods, algorithms and computer programs

    International Nuclear Information System (INIS)

    Matausek, M.V

    1981-01-01

    Full text: For a developing country buying its first nuclear power plants from a foreign supplier, disregarding the type and scope of the contract, there is a certain number of activities which have to be performed by local staff and domestic organizations. This particularly applies to the choice of the nuclear fuel cycle strategy and the choice of the type and size of the reactors, to bid parameters specification, bid evaluation and final safety analysis report evaluation, as well as to in-core fuel management activities. In the Nuclear Engineering Department of the Boris Kidric Institute of Nuclear Sciences (NET IBK) continual work is going on related to the following topics: cross section and resonance integral calculations, spectrum calculations, generation of group constants, lattice and cell problems, criticality and global power distribution search, fuel burnup analysis, in-core fuel management procedures, cost analysis and power plant economics, safety and accident analysis, shielding problems and environmental impact studies, etc. The present paper gives the details of the methods developed and the results achieved, with particular emphasis on the NET IBK computer program package for the needs of planning, construction and operation of nuclear power plants. The main problems encountered so far were related to a small working team, lack of large and powerful computers, absence of reliable basic nuclear data and shortage of experimental and empirical results for testing theoretical models. Some of these difficulties have been overcome thanks to bilateral and multilateral cooperation with developed countries, mostly through the IAEA. It is the authors' opinion, however, that mutual cooperation of developing countries, having similar problems and similar goals, could lead to significant results. Some activities of this kind are suggested and discussed. (author)

  11. High Flux Isotope Reactor power upgrade status

    International Nuclear Information System (INIS)

    Rothrock, R.B.; Hale, R.E.; Cheverton, R.D.

    1997-01-01

    A return to 100-MW operation is being planned for the High Flux Isotope Reactor (HFIR). Recent improvements in fuel element manufacturing procedures and inspection equipment will be exploited to reduce hot spot and hot streak factors sufficiently to permit the power upgrade without an increase in primary coolant pressure. Fresh fuel elements already fabricated for future use are being evaluated individually for power upgrade potential based on their measured coolant channel dimensions

  12. A Power Efficient Exaflop Computer Design for Global Cloud System Resolving Climate Models.

    Science.gov (United States)

    Wehner, M. F.; Oliker, L.; Shalf, J.

    2008-12-01

    Exascale computers would allow routine ensemble modeling of the global climate system at the cloud system resolving scale. Power and cost requirements of traditional architecture systems are likely to delay such capability for many years. We present an alternative route to the exascale using embedded processor technology to design a system optimized for ultra high resolution climate modeling. These power efficient processors, used in consumer electronic devices such as mobile phones, portable music players, cameras, etc., can be tailored to the specific needs of scientific computing. We project that a system capable of integrating a kilometer scale climate model a thousand times faster than real time could be designed and built in a five year time scale for US$75M with a power consumption of 3MW. This is cheaper, more power efficient and sooner than any other existing technology.

  13. CHEP95: Computing in high energy physics. Abstracts

    International Nuclear Information System (INIS)

    1995-01-01

    These proceedings cover the technical papers on computation in High Energy Physics, including computer codes, computer devices, control systems, simulations, data acquisition systems. New approaches on computer architectures are also discussed

  14. Recent computer applications in boiling water reactor power plants

    International Nuclear Information System (INIS)

    Hiraga, Shoji; Joge, Toshio; Kiyokawa, Kazuhiro; Kato, Kanji; Nigawara, Seiitsu

    1976-01-01

    Process computers in boiling water reactor power plants have become important equipment for the calculation of core and plant performance and for data logging. The scope of their application grows larger every year. Here, two systems are introduced: a plant diagnostic system and a computerized control panel. The plant diagnostic system consists of the part processing the signals from a plant, the operation part, mainly composed of a computer, which diagnoses the operating conditions of each system component using the input signals, and the result display (CRT or typewriter). The concept of the indications on control panels in nuclear power plants is changing from ''Plant parameters are to be indicated on panel meters as much as possible'' to ''Only the data required for operation are to be indicated.'' Thus the computerized control panel is attracting attention, in which a process computer for processing the operating information and a CRT display are introduced. The experimental model of that panel comprises an operator's console and a chief watchman's console. Its functions are dialogic data access and the automatic selection of preferential information. (Wakatsuki, Y.)

  15. High-Precision Computation and Mathematical Physics

    International Nuclear Information System (INIS)

    Bailey, David H.; Borwein, Jonathan M.

    2008-01-01

    At the present time, IEEE 64-bit floating-point arithmetic is sufficiently accurate for most scientific applications. However, for a rapidly growing body of important scientific computing applications, a higher level of numeric precision is required. Such calculations are facilitated by high-precision software packages that include high-level language translation modules to minimize the conversion effort. This paper presents a survey of recent applications of these techniques and provides some analysis of their numerical requirements. These applications include supernova simulations, climate modeling, planetary orbit calculations, Coulomb n-body atomic systems, scattering amplitudes of quarks, gluons and bosons, nonlinear oscillator theory, Ising theory, quantum field theory and experimental mathematics. We conclude that high-precision arithmetic facilities are now an indispensable component of a modern large-scale scientific computing environment.
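
    As a small illustration of going beyond IEEE 64-bit precision (Python's standard decimal module is used here purely as an example; the applications surveyed rely on dedicated high-precision packages), the sketch below carries a simple computation to 50 significant digits:

        # Illustration of computing beyond IEEE double precision using Python's
        # standard-library decimal module (50 significant digits here).
        from decimal import Decimal, getcontext

        getcontext().prec = 50

        # Compare 64-bit floating point with 50-digit decimal arithmetic for 1/7.
        double_result = 1.0 / 7.0
        decimal_result = Decimal(1) / Decimal(7)

        print(f"IEEE double : {double_result:.17f}")   # accurate to roughly 16 digits
        print(f"50-digit    : {decimal_result}")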

  16. Advanced High Voltage Power Device Concepts

    CERN Document Server

    Baliga, B Jayant

    2012-01-01

    Advanced High Voltage Power Device Concepts describes devices utilized in power transmission and distribution equipment, and for very high power motor control in electric trains and steel-mills. Since these devices must be capable of supporting more than 5000-volts in the blocking mode, this book covers operation of devices rated at 5,000-V, 10,000-V and 20,000-V. Advanced concepts (the MCT, the BRT, and the EST) that enable MOS-gated control of power thyristor structures are described and analyzed in detail. In addition, detailed analyses of the silicon IGBT, as well as the silicon carbide MOSFET and IGBT, are provided for comparison purposes. Throughout the book, analytical models are generated to give a better understanding of the physics of operation for all the structures. This book provides readers with: The first comprehensive treatment of high voltage (over 5000-volts) power devices suitable for the power distribution, traction, and motor-control markets; Analytical formulations for all the device ...

  17. Optimizing the design of very high power, high performance converters

    International Nuclear Information System (INIS)

    Edwards, R.J.; Tiagha, E.A.; Ganetis, G.; Nawrocky, R.J.

    1980-01-01

    This paper describes how various technologies are used to achieve the desired performance in a high current magnet power converter system. It is hoped that the discussions of the design approaches taken will be applicable to other power supply systems where stringent requirements in stability, accuracy and reliability must be met

  18. Dimensioning storage and computing clusters for efficient high throughput computing

    International Nuclear Information System (INIS)

    Accion, E; Bria, A; Bernabeu, G; Caubet, M; Delfino, M; Espinal, X; Merino, G; Lopez, F; Martinez, F; Planas, E

    2012-01-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and total volume of data continue to increase. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with PetaByte scale storage to delivering quality data processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both the online (data acceptance) and the offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.
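
    A back-of-the-envelope dimensioning check in the spirit of the paper (all figures below are hypothetical placeholders, not the authors' numbers) is to verify that the aggregate storage throughput can keep every job slot busy:

        # Hypothetical dimensioning check: can the storage service feed all job slots?
        job_slots = 4000                  # concurrent batch jobs in the cluster
        mb_per_s_per_job = 5.0            # average read rate a single job sustains
        disk_servers = 60                 # storage nodes
        mb_per_s_per_server = 400.0       # usable throughput per storage node

        required = job_slots * mb_per_s_per_job
        available = disk_servers * mb_per_s_per_server

        print(f"required  : {required / 1000:.1f} GB/s")
        print(f"available : {available / 1000:.1f} GB/s")
        if available < required:
            print("storage is the bottleneck; jobs will waste CPU cycles waiting for data")
        else:
            print(f"headroom  : {(available - required) / 1000:.1f} GB/s")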

  19. The Jefferson Lab High Power Light Source

    Energy Technology Data Exchange (ETDEWEB)

    James R. Boyce

    2006-01-01

    Jefferson Lab has designed, built and operated two high average power free-electron lasers (FEL) using superconducting RF (SRF) technology and energy recovery techniques. Between 1999 and 2001 Jefferson Lab operated the IR Demo FEL. This device produced over 2 kW in the mid-infrared, in addition to producing world record average powers in the visible (50 W), ultraviolet (10 W) and terahertz range (50 W) for tunable, short-pulse (< ps) light. This FEL was the first high power demonstration of an accelerator configuration that is being exploited for a number of new accelerator-driven light source facilities that are currently under design or construction. The driver accelerator for the IR Demo FEL uses an Energy Recovered Linac (ERL) configuration that improves the energy efficiency and lowers both the capital and operating cost of such devices by recovering most of the power in the spent electron beam after optical power is extracted from the beam. The IR Demo FEL was decommissioned in late 2001 to make way for an upgraded FEL extending the IR power to over 10 kW and the ultraviolet power to over 1 kW. The FEL Upgrade achieved 10 kW of average power in the mid-IR (6 microns) in July of 2004, and its IR operation currently is being extended down to 1 micron. In addition, we have demonstrated the capability of on/off cycling and recovering over a megawatt of electron beam power without diminishing machine performance. A complementary UV FEL will come on-line within the next year. This paper presents a summary of the FEL characteristics, user community accomplishments with the IR Demo, and planned user experiments.

  20. High power infrared QCLs: advances and applications

    Science.gov (United States)

    Patel, C. Kumar N.

    2012-01-01

    QCLs are becoming the most important sources of laser radiation in the midwave infrared (MWIR) and longwave infrared (LWIR) regions because of their size, weight, power and reliability advantages over other laser sources in the same spectral regions. The availability of multiwatt RT operation QCLs from 3.5 μm to >16 μm with wall plug efficiency of 10% or higher is hastening the replacement of traditional sources such as OPOs and OPSELs in many applications. QCLs can replace CO2 lasers in many low power applications. Of the two leading groups in improvements in QCL performance, Pranalytica is the commercial organization that has been supplying the highest performance QCLs to various customers for over four years. Using a new QCL design concept, the non-resonant extraction [1], we have achieved CW/RT power of >4.7 W and WPE of >17% in the 4.4 μm - 5.0 μm region. In the LWIR region, we have recently demonstrated QCLs with CW/RT power exceeding 1 W with WPE of nearly 10% in the 7.0 μm - 10.0 μm region. In general, high power CW/RT operation requires the use of TECs to maintain QCLs at appropriate operating temperatures. However, TECs consume additional electrical power, which is not desirable for handheld, battery-operated applications, where system power conversion efficiency is more important than just the QCL chip level power conversion efficiency. In high duty cycle pulsed (quasi-CW) mode, the QCLs can be operated without TECs and have produced nearly the same average power as that available in CW mode with TECs. Multiwatt average powers are obtained even in ambient T>70°C, with true efficiency of electrical power-to-optical power conversion being above 10%. Because of the availability of QCLs with multiwatt power outputs and a wavelength range covering a spectral region from ~3.5 μm to >16 μm, QCLs have found instantaneous acceptance for insertion into a multitude of defense and homeland security applications, including laser sources for infrared

  1. Concept of a computer network architecture for complete automation of nuclear power plants

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ray, A.

    1990-01-01

    The state of the art in automation of nuclear power plants has been largely limited to computerized data acquisition, monitoring, display, and recording of process signals. Complete automation of nuclear power plants, which would include plant operations, control, and management, fault diagnosis, and system reconfiguration with efficient and reliable man/machine interactions, has been projected as a realistic goal. This paper presents the concept of a computer network architecture that would use a high-speed optical data highway to integrate diverse, interacting, and spatially distributed functions that are essential for a fully automated nuclear power plant

  2. The NASA CSTI High Capacity Power Program

    International Nuclear Information System (INIS)

    Winter, J.M.

    1991-09-01

    The SP-100 program was established in 1983 by DOD, DOE, and NASA as a joint program to develop the technology necessary for space nuclear power systems for military and civil applications. During 1986 and 1987, the NASA Advanced Technology Program was responsible for maintaining the momentum of promising technology advancement efforts started during Phase 1 of SP-100 and to strengthen, in key areas, the chances for successful development and growth capability of space nuclear reactor power systems for future space applications. In 1988, the NASA Advanced Technology Program was incorporated into NASA's new Civil Space Technology Initiative (CSTI). The CSTI program was established to provide the foundation for technology development in automation and robotics, information, propulsion, and power. The CSTI High Capacity Power Program builds on the technology efforts of the SP-100 program, incorporates the previous NASA advanced technology project, and provides a bridge to the NASA exploration technology programs. The elements of CSTI high capacity power development include conversion systems: Stirling and thermoelectric, thermal management, power management, system diagnostics, and environmental interactions. Technology advancement in all areas, including materials, is required to provide the growth capability, high reliability, and 7 to 10 year lifetime demanded for future space nuclear power systems. The overall program will develop and demonstrate the technology base required to provide a wide range of modular power systems while minimizing the impact of day/night operations as well as attitudes and distance from the Sun. Significant accomplishments in all of the program elements will be discussed, along with revised goals and project timelines recently developed

  3. High-power VCSELs for smart munitions

    Science.gov (United States)

    Geske, Jon; MacDougal, Michael; Cole, Garrett; Snyder, Donald

    2006-08-01

    The next generation of low-cost smart munitions will be capable of autonomously detecting and identifying targets aided partly by the ability to image targets with compact and robust scanning rangefinder and LADAR capabilities. These imaging systems will utilize arrays of high performance, low-cost semiconductor diode lasers capable of achieving high peak powers in pulses ranging from 5 to 25 nanoseconds in duration. Aerius Photonics is developing high-power Vertical-Cavity Surface-Emitting Lasers (VCSELs) to meet the needs of these smart munitions applications. The authors will report the results of Aerius' development program in which peak pulsed powers exceeding 60 Watts were demonstrated from single VCSEL emitters. These compact packaged emitters achieved pulse energies in excess of 1.5 micro-joules with multi kilo-hertz pulse repetition frequencies. The progress of the ongoing effort toward extending this performance to arrays of VCSEL emitters and toward further improving laser slope efficiency will be reported.

  4. High power all solid state VUV lasers

    International Nuclear Information System (INIS)

    Zhang, Shen-jin; Cui, Da-fu; Zhang, Feng-feng; Xu, Zhi; Wang, Zhi-min; Yang, Feng; Zong, Nan; Tu, Wei; Chen, Ying; Xu, Hong-yan; Xu, Feng-liang; Peng, Qin-jun; Wang, Xiao-yang; Chen, Chuang-tian; Xu, Zu-yan

    2014-01-01

    Highlights: • Polarization and pulse repetition rate adjustable ps 177.3 nm laser was developed. • Wavelength tunable ns, ps and fs VUV lasers were developed. • High power ns 177.3 nm laser with narrow linewidth was investigated. - Abstract: We report an investigation of high power all solid state vacuum ultraviolet (VUV) lasers based on nonlinear frequency conversion with the KBe2BO3F2 (KBBF) nonlinear crystal. Several all solid state VUV lasers have been developed in our group, including a polarization and pulse repetition rate adjustable picosecond 177.3 nm VUV laser, wavelength tunable nanosecond, picosecond and femtosecond VUV lasers, and a high power ns 177.3 nm laser with narrow linewidth. These VUV lasers have the advantages of being compact, accurate and precise

  5. High Power High Efficiency Diode Laser Stack for Processing

    Science.gov (United States)

    Gu, Yuanyuan; Lu, Hui; Fu, Yueming; Cui, Yan

    2018-03-01

    High-power diode lasers based on GaAs semiconductor bars are well established as reliable and highly efficient laser sources. Because diode lasers are simple in structure, small in size, long-lived and low in cost, they are widely used in industrial processing, such as heat treating, welding, hardening, cladding and so on. Their rectangular beam patterns, which are well suited to producing fine beads with less power, also make practical applications possible. At this power level, they have many important applications, such as surgery, welding of polymers, soldering, coatings and surface treatment of metals. But there are some applications which require much higher power and brightness, e.g. hardening, keyhole welding, cutting and metal welding. In addition, high power diode lasers also have important applications in the military field. So all developed countries have attached great importance to high-power diode laser systems and their applications. This is mainly due to their low performance. In this paper we will introduce the structure and the principle of the high power diode stack.

  6. High to ultra-high power electrical energy storage.

    Science.gov (United States)

    Sherrill, Stefanie A; Banerjee, Parag; Rubloff, Gary W; Lee, Sang Bok

    2011-12-14

    High power electrical energy storage systems are becoming critical devices for advanced energy storage technology. This is true in part due to their high rate capabilities and moderate energy densities which allow them to capture power efficiently from evanescent, renewable energy sources. High power systems include both electrochemical capacitors and electrostatic capacitors. These devices have fast charging and discharging rates, supplying energy within seconds or less. Recent research has focused on increasing power and energy density of the devices using advanced materials and novel architectural design. An increase in understanding of structure-property relationships in nanomaterials and interfaces and the ability to control nanostructures precisely has led to an immense improvement in the performance characteristics of these devices. In this review, we discuss the recent advances for both electrochemical and electrostatic capacitors as high power electrical energy storage systems, and propose directions and challenges for the future. We assess the opportunities in nanostructure-based high power electrical energy storage devices and include electrochemical and electrostatic capacitors for their potential to open the door to a new regime of power energy.

  7. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  8. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  9. Powersail High Power Propulsion System Design Study

    Science.gov (United States)

    Gulczinski, Frank S., III

    2000-11-01

    A desire by the United States Air Force to exploit the space environment has led to a need for increased on-orbit electrical power availability. To enable this, the Air Force Research Laboratory Space Vehicles Directorate (AFRL/VS) is developing Powersail: a two-phased program to demonstrate high power (100 kW to 1 MW) capability in space using a deployable, flexible solar array connected to the host spacecraft using a slack umbilical. The first phase will be a proof-of-concept demonstration at 50 kW, followed by the second phase, an operational system at full power. In support of this program, the AFRL Propulsion Directorate's Spacecraft Propulsion Branch (AFRL/PRS) at Edwards AFB has commissioned a design study of the Powersail High Power Propulsion System. The purpose of this study, the results of which are summarized in this paper, is to perform mission and design trades to identify potential full-power applications (both near-Earth and interplanetary) and the corresponding propulsion system requirements and design. The design study shall further identify a suitable low power demonstration flight that maximizes risk reduction for the fully operational system. This propulsion system is expected to be threefold: (1) primary propulsion for moving the entire vehicle, (2) a propulsion unit that maintains the solar array position relative to the host spacecraft, and (3) control propulsion for maintaining proper orientation for the flexible solar array.

  10. Reactor G1: high power experiments

    International Nuclear Information System (INIS)

    Laage, F. de; Teste du Baillet, A.; Veyssiere, A.; Wanner, G.

    1957-01-01

    The experiments carried out in the starting-up programme of the reactor G1 comprised a series of tests at high power, which allowed the following points to be studied: 1- Effect of poisoning by Xenon (absolute value, evolution). 2- Temperature coefficients of the uranium and graphite for a temperature distribution corresponding to heating by fission. 3- Effect of the pressure (due to the cooling system) on the reactivity. 4- Calibration of the security rods as a function of their position in the pile (1). 5- Temperature distribution of the graphite, the sheathing, the uranium and the air leaving the canals, in a pile running normally at high power. 6- Neutron flux distribution in a pile running normally at high power. 7- Determination of the power by nuclear and thermodynamic methods. These experiments have been carried out under two very different pile conditions. From the 1st to the 15th of August 1956, a series of power increases, followed by periods of stabilisation, were induced in a pile containing uranium only, in 457 canals, amounting to about 34 tons of fuel. A knowledge of the efficiency of the control rods in such a pile has made it possible to measure with good accuracy the principal effects at high temperatures, that is, to deal with points 1, 2, 3, 5. Flux charts giving information on the variations of the material Laplacian and extrapolation lengths in the reflector have been drawn up. Finally the thermodynamic power has been measured under good conditions, in spite of some installation difficulties. On September 16, the pile had its final charge of 100 tons. All the canals were loaded, 1,234 with uranium and 53 (i.e. exactly 4 per cent of the total number) with thorium uniformly distributed in a square lattice of 100 cm side. Since technical difficulties prevented the calibration of the control rods, the measurements were limited to the determination of the thermodynamic power and the temperature distributions (points 5 and 7). This report will

  11. Compact high-power terahertz radiation source

    Directory of Open Access Journals (Sweden)

    G. A. Krafft

    2004-06-01

    Full Text Available: In this paper a new type of THz radiation source, based on recirculating an electron beam through a high gradient superconducting radio frequency cavity, and using this beam to drive a standard electromagnetic undulator on the return leg, is discussed. Because the beam is recirculated and not stored, short bunches may be produced that radiate coherently in the undulator, yielding exceptionally high average THz power for relatively low average beam power. Deceleration from the coherent emission, and the detuning it causes, limits the charge-per-bunch possible in such a device.

  12. High power RF transmission line component development

    International Nuclear Information System (INIS)

    Hong, B. G.; Hwang, C. K.; Bae, Y. D.; Yoon, J. S.; Wang, S. J.; Gu, S. H.; Yang, J. R.; Hahm, Y. S.; Oh, G. S.; Lee, J. R.; Lee, W. I.; Park, S. H.; Kang, M. S.; Oh, S. H.; Lee, W.I.

    1999-12-01

    We developed the liquid stub and phase shifter which are the key high RF power transmission line components. They show reliable operation characteristics and increased insulation capability, and reduced the size by using liquid (silicon oil, dielectric constant ε=2.72) instead of gas for insulating dielectric material. They do not have finger stock for the electric contact so the local temperature rise due to irregular contact and RF breakdown due to scratch in conductor are prevented. They can be utilized in broadcasting, radar facility which require high RF power transmission. Moreover, they are key components in RF heating system for fusion reactor. (author)

  13. High power RF transmission line component development

    Energy Technology Data Exchange (ETDEWEB)

    Hong, B. G.; Hwang, C. K.; Bae, Y. D.; Yoon, J. S.; Wang, S. J.; Gu, S. H.; Yang, J. R.; Hahm, Y. S.; Oh, G. S.; Lee, J. R.; Lee, W. I.; Park, S. H.; Kang, M. S.; Oh, S. H.; Lee, W.I

    1999-12-01

    We developed the liquid stub and phase shifter which are the key high RF power transmission line components. They show reliable operation characteristics and increased insulation capability, and reduced the size by using liquid (silicon oil, dielectric constant ε=2.72) instead of gas for insulating dielectric material. They do not have finger stock for the electric contact so the local temperature rise due to irregular contact and RF breakdown due to scratch in conductor are prevented. They can be utilized in broadcasting, radar facility which require high RF power transmission. Moreover, they are key components in RF heating system for fusion reactor. (author)

  14. High voltage superconducting switch for power application

    International Nuclear Information System (INIS)

    Mawardi, O.; Ferendeci, A.; Gattozzi, A.

    1983-01-01

    This paper reports the development of a novel interrupter which meets the requirements of a high voltage direct current (HVDC) power switch and at the same time doubles as a current limiter. The basic concept of the interrupter makes use of a fast superconducting, high capacity (SHIC) switch that carries the full load current while in the superconducting state and reverts to the normal resistive state when triggered. Typical design parameters are examined for the case of a HVDC transmission line handling 2.5 kA at 150 kV DC. The result is a power switch with superior performance and smaller size than the ones reported to date

  15. Nanoelectromechanical Switches for Low-Power Digital Computing

    Directory of Open Access Journals (Sweden)

    Alexis Peschot

    2015-08-01

    Full Text Available: The need for more energy-efficient solid-state switches beyond complementary metal-oxide-semiconductor (CMOS) transistors has become a major concern as the power consumption of electronic integrated circuits (ICs) steadily increases with technology scaling. Nano-Electro-Mechanical (NEM) relays control current flow by nanometer-scale motion to make or break physical contact between electrodes, and offer advantages over transistors for low-power digital logic applications: virtually zero leakage current for negligible static power consumption; the ability to operate with very small voltage signals for low dynamic power consumption; and robustness against harsh environments such as extreme temperatures. Therefore, NEM logic switches (relays) have been investigated by several research groups during the past decade. Circuit simulations calibrated to experimental data indicate that scaled relay technology can overcome the energy-efficiency limit of CMOS technology. This paper reviews recent progress toward this goal, providing an overview of the different relay designs and experimental results achieved by various research groups, as well as of relay-based IC design principles. Remaining challenges for realizing the promise of nano-mechanical computing, and ongoing efforts to address these, are discussed.
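
    As a rough illustration of why small voltage swings matter for dynamic power (switching energy scales roughly as C·V² per event; the capacitance and voltage values below are hypothetical):

        # Rough illustration: dynamic switching energy scales roughly as C * V^2.
        # All values below are hypothetical placeholders, not measurements.
        def switching_energy_joules(capacitance_f, voltage_v):
            return capacitance_f * voltage_v ** 2

        load_capacitance = 1e-15          # 1 fF load (hypothetical)
        cases = {"1.0 V swing": 1.0, "0.1 V swing": 0.1}

        for label, v in cases.items():
            e = switching_energy_joules(load_capacitance, v)
            print(f"{label}: {e:.2e} J per switching event")
        # A 10x smaller voltage swing gives roughly 100x lower dynamic energy per event.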

  16. Advances in Very High Frequency Power Conversion

    DEFF Research Database (Denmark)

    Kovacevic, Milovan

    Resonant and quasi-resonant converters operated at frequencies above 30 MHz have attracted special attention in the last two decades. Compared to conventional converters operated at ~100 kHz, they offer significant advantages: smaller volume and weight, lower cost, and faster transient performance... Excellent performance and small size of magnetic components and capacitors at very high frequencies, along with constant advances in the performance of power semiconductor devices, suggest a sizable shift of the consumer power supplies market into this area in the near future. To operate dc-dc converter power... method provides low complexity and low gate loss simultaneously. A direct design synthesis method is provided for resonant SEPIC converters employing this technique. Most experimental prototypes were developed using low cost, commercially available power semiconductors. Due to very fast transient...

  17. Monitoring SLAC High Performance UNIX Computing Systems

    International Nuclear Information System (INIS)

    Lettsome, Annette K.

    2005-01-01

    Knowledge of the effectiveness and efficiency of computers is important when working with high performance systems. The monitoring of such systems is advantageous in order to foresee possible misfortunes or system failures. Ganglia is a software system designed for high performance computing systems to retrieve specific monitoring information. An alternative storage facility for Ganglia's collected data is needed since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process taken in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.
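
    The general pattern described here (polling Ganglia's gmond XML feed and writing the metrics into a relational database by script) can be sketched as follows; the host, port and table schema are assumptions, and sqlite3 stands in for MySQL so that the example stays self-contained:

        # Sketch of polling a Ganglia gmond XML feed and storing metrics in a SQL table.
        # gmond conventionally listens on TCP port 8649; the host, port and schema here
        # are assumptions, and sqlite3 stands in for the MySQL backend described above.
        import socket
        import sqlite3
        import xml.etree.ElementTree as ET

        def fetch_gmond_xml(host="localhost", port=8649):
            """Read the full XML dump that gmond sends on connection."""
            with socket.create_connection((host, port), timeout=5) as sock:
                chunks = []
                while True:
                    data = sock.recv(4096)
                    if not data:
                        break
                    chunks.append(data)
            return b"".join(chunks).decode("utf-8", errors="replace")

        def store_metrics(xml_text, db_path="ganglia.db"):
            """Insert one row per (host, metric, value) into a simple table."""
            conn = sqlite3.connect(db_path)
            conn.execute("CREATE TABLE IF NOT EXISTS metrics (host TEXT, name TEXT, value TEXT)")
            root = ET.fromstring(xml_text)
            for host in root.iter("HOST"):
                for metric in host.iter("METRIC"):
                    conn.execute("INSERT INTO metrics VALUES (?, ?, ?)",
                                 (host.get("NAME"), metric.get("NAME"), metric.get("VAL")))
            conn.commit()
            conn.close()

        if __name__ == "__main__":
            store_metrics(fetch_gmond_xml())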

  18. High Average Power, High Energy Short Pulse Fiber Laser System

    Energy Technology Data Exchange (ETDEWEB)

    Messerly, M J

    2007-11-13

    Recently, continuous wave fiber laser systems with output powers in excess of 500 W with good beam quality have been demonstrated [1]. High energy, ultrafast, chirped pulse fiber laser systems have achieved record output energies of 1 mJ [2]. However, these high-energy systems have not been scaled beyond a few watts of average output power. Fiber laser systems are attractive for many applications because they offer the promise of high efficiency, compact, robust systems that are turnkey. Applications such as cutting, drilling and materials processing, front end systems for high energy pulsed lasers (such as petawatts) and laser based sources of high spatial coherence, high flux x-rays all require high energy short pulses, and two of these three applications also require high average power. The challenge in creating a high energy chirped pulse fiber laser system is to find a way to scale the output energy while avoiding nonlinear effects and maintaining good beam quality in the amplifier fiber. To this end, our 3-year LDRD program sought to demonstrate a high energy, high average power fiber laser system. This work included exploring designs of large mode area optical fiber amplifiers for high energy systems as well as understanding the issues associated with chirped pulse amplification in optical fiber amplifier systems.

  19. A Multilevel Introspective Dynamic Optimization System For Holistic Power-Aware Computing

    DEFF Research Database (Denmark)

    Venkatachalam, Vasanth; Probst, Christian; Franz, Michael

    2005-01-01

    Power consumption is rapidly becoming the dominant limiting factor for further improvements in computer design. Curiously, this applies both at the “high-end” of workstations and servers and the “low end” of handheld devices and embedded computers. At the high-end, the challenge lies in dealing with exponentially growing power densities. At the low-end, there is a demand to make mobile devices more powerful and longer lasting, but battery technology is not improving at the same rate that power consumption is rising. Traditional power-management research is fragmented; techniques are being developed at specific levels, without fully exploring their synergy with other levels. Most software techniques target either operating systems or compilers but do not explore the interaction between the two layers. These techniques also have not fully explored the potential of virtual machines for power management, including that of applications and the virtual machine itself. We believe this introspective, holistic approach enables more informed power-management decisions.

  20. High Performance Computing Operations Review Report

    Energy Technology Data Exchange (ETDEWEB)

    Cupps, Kimberly C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-19

    The High Performance Computing Operations Review (HPCOR) meeting—requested by the ASC and ASCR program headquarters at DOE—was held November 5 and 6, 2013, at the Marriott Hotel in San Francisco, CA. The purpose of the review was to discuss the processes and practices for HPC integration and its related software and facilities. Experiences and lessons learned from the most recent systems deployed were covered in order to benefit the deployment of new systems.

  1. High Available COTS Based Computer for Space

    Science.gov (United States)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

    The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application's boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures to fulfill the availability and reliability demands as well as the increase in required data processing power. In contrast to these increased quality demands, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems was not always possible because of the obsolescence of EEE parts, insufficient IO capabilities, or the fact that available data processing systems did not provide the required scalability and performance.

  2. High power switches for ion induction linacs

    International Nuclear Information System (INIS)

    Humphries, S.; Savage, M.; Saylor, W.B.

    1985-01-01

    The success of linear induction ion accelerators for accelerator inertial fusion (AIF) applications depends largely on innovations in pulsed power technology. There are tight constraints on the accuracy of accelerating voltage waveforms to maintain a low momentum spread. Furthermore, the non-relativistic ion beams may be subject to a klystronlike interaction with the accelerating cavities, leading to enhanced momentum spread. In this paper, we describe a novel high power switch with a demonstrated ability to interrupt 300 A at 20 kV in less than 60 ns. The switch may allow the replacement of pulse modulators in linear induction accelerators with hard tube pulsers. A power system based on a hard tube pulser could solve the longitudinal instability problem while maintaining high energy transfer efficiency. The problem of longitudinal beam control in ion induction linacs is reviewed in Section 2. Section 3 describes the principles of the plasma flow switch. Experimental results are summarized in Section 4

  3. High power switches for ion induction linacs

    International Nuclear Information System (INIS)

    Humphries, S. Jr.; Savage, M.; Saylor, W.B.

    1985-01-01

    The success of linear induction ion accelerators for accelerator inertial fusion (AIF) applications depends largely on innovations in pulsed power technology. There are tight constraints on the accuracy of accelerating voltage waveforms to maintain a low momentum spread. Furthermore, the non-relativistic ion beams may be subject to a klystron-like interaction with the accelerating cavities leading to enhanced momentum spread. In this paper, the authors describe a novel high power switch with a demonstrated ability to interrupt 300 A at 20 kV in less than 60 ns. The switch may allow the replacement of pulse modulators in linear induction accelerators with hard tube pulsers. A power system based on a hard tube pulser could solve the longitudinal instability problem while maintaining high energy transfer efficiency. The problem of longitudinal beam control in ion induction linacs is reviewed in Section 2. Section 3 describes the principles of the plasma flow switch. Experimental results are summarized in Section 4

  4. High power RF oscillator with Marx generators

    International Nuclear Information System (INIS)

    Murase, Hiroshi; Hayashi, Izumi

    1980-01-01

    A method to maintain RF oscillation by using many Marx generators was proposed and studied experimentally. Many charging circuits were connected to an oscillator circuit, and successive pulsed charging was performed. This successive charging amplified and maintained the RF oscillation. The use of vacuum gaps and high power silicon diodes improved the RF current cut-off characteristics of the circuit. The efficiency of the pulsed charging from the Marx generators to a condenser was investigated theoretically. The theoretical result showed a maximum efficiency of 0.98. The practical efficiency obtained by using the proposed circuit with a high power oscillator was in the range 0.50 to 0.56. The effective output power of the RF pulses obtained was 11 MW. The maximum holding time of the RF pulses was about 21 microseconds. (Kato, T.)

  5. Operation of Power Grids with High Penetration of Wind Power

    Science.gov (United States)

    Al-Awami, Ali Taleb

    The integration of wind power into the power grid poses many challenges due to its highly uncertain nature. This dissertation involves two main components related to the operation of power grids with high penetration of wind energy: wind-thermal stochastic dispatch and wind-thermal coordinated bidding in short-term electricity markets. In the first part, a stochastic dispatch (SD) algorithm is proposed that takes into account the stochastic nature of the wind power output. The uncertainty associated with wind power output given the forecast is characterized using conditional probability density functions (CPDF). Several functions are examined to characterize wind uncertainty including Beta, Weibull, Extreme Value, Generalized Extreme Value, and Mixed Gaussian distributions. The unique characteristics of the Mixed Gaussian distribution are then utilized to facilitate the speed of convergence of the SD algorithm. A case study is carried out to evaluate the effectiveness of the proposed algorithm. Then, the SD algorithm is extended to simultaneously optimize the system operating costs and emissions. A modified multi-objective particle swarm optimization algorithm is suggested to identify the Pareto-optimal solutions defined by the two conflicting objectives. A sensitivity analysis is carried out to study the effect of changing load level and imbalance cost factors on the Pareto front. In the second part of this dissertation, coordinated trading of wind and thermal energy is proposed to mitigate risks due to those uncertainties. The problem of wind-thermal coordinated trading is formulated as a mixed-integer stochastic linear program. The objective is to obtain the optimal tradeoff bidding strategy that maximizes the total expected profits while controlling trading risks. For risk control, a weighted term of the conditional value at risk (CVaR) is included in the objective function. The CVaR aims to maximize the expected profits of the least profitable scenarios, thus
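
    To make the risk term concrete, the sketch below computes the empirical conditional value at risk (CVaR) of a set of profit scenarios and forms a weighted expected-profit-plus-CVaR objective of the kind described above. The scenario values, the confidence level alpha, and the weighting factor beta are illustrative assumptions, not parameters taken from the dissertation.

      import numpy as np

      def cvar(profits, alpha=0.95):
          """Empirical CVaR: mean profit of the worst (1 - alpha) fraction of scenarios."""
          profits = np.sort(np.asarray(profits, dtype=float))   # ascending: worst scenarios first
          n_tail = max(1, int(np.ceil((1.0 - alpha) * len(profits))))
          return profits[:n_tail].mean()

      # Hypothetical, equally probable profit scenarios (arbitrary currency units).
      scenarios = np.array([120.0, 95.0, 150.0, -40.0, 60.0, 10.0, 180.0, -15.0])

      beta = 0.5  # assumed risk-aversion weight on the CVaR term
      objective = (1 - beta) * scenarios.mean() + beta * cvar(scenarios, alpha=0.95)
      print(f"expected profit = {scenarios.mean():.1f}, CVaR = {cvar(scenarios):.1f}, objective = {objective:.1f}")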

  6. Optical engineering for high power laser applications

    International Nuclear Information System (INIS)

    Novaro, M.

    1993-01-01

    Laser facilities for Inertial Confinement Fusion (I.C.F.) experiments require laser and X-ray optics able to withstand short pulse conditions. After a brief review of high power laser system arrangements and of the characteristics of their optics, the authors will present some X-ray optical developments

  7. Development of a high power femtosecond laser

    CSIR Research Space (South Africa)

    Neethling, PH

    2010-10-01

    Full Text Available The Laser Research Institute and the CSIR National Laser Centre are developing a high power femtosecond laser system in a joint project with a phased approach. The laser system consists of an fs oscillator and a regenerative amplifier. An OPCPA...

  8. Targets for high power neutral beams

    International Nuclear Information System (INIS)

    Kim, J.

    1980-01-01

    Stopping high-power, long-pulse beams is fast becoming an engineering challenge, particularly in neutral beam injectors for heating magnetically confined plasmas. A brief review of neutral beam target technology is presented along with heat transfer calculations for some selected target designs

  9. Grid computing in high-energy physics

    International Nuclear Information System (INIS)

    Bischof, R.; Kuhn, D.; Kneringer, E.

    2003-01-01

    Full text: Future high energy physics experiments are characterized by an enormous amount of data delivered by the large detectors presently under construction, e.g. at the Large Hadron Collider, and by a large number of scientists (several thousand) requiring simultaneous access to the resulting experimental data. Since it seems unrealistic to provide the necessary computing and storage resources at one single place (e.g. CERN), the concept of grid computing, i.e. the use of distributed resources, will be chosen. The DataGrid project (under the leadership of CERN) develops, based on the Globus toolkit, the software necessary for computation and analysis of shared large-scale databases in a grid structure. The high energy physics group in Innsbruck participates with several resources in the DataGrid test bed. In this presentation our experience as grid users and resource providers is summarized. In cooperation with the local IT center (ZID) we installed a flexible grid system which uses PCs (at the moment 162) in students' labs during nights, weekends and holidays, and which is especially used to compare different systems (local resource managers, other grid software e.g. from the NorduGrid project) and to supply a test bed for the future Austrian Grid (AGrid). (author)

  10. Reduced filamentation in high power semiconductor lasers

    DEFF Research Database (Denmark)

    Skovgaard, Peter M. W.; McInerney, John; O'Brien, Peter

    1999-01-01

    High brightness semiconductor lasers have applications in fields ranging from material processing to medicine. The main difficulty associated with high brightness is that high optical power densities cause damage to the laser facet and thus require large apertures. This, in turn, results in spatio-temporal instabilities such as filamentation, which degrades spatial coherence and brightness. We first evaluate the performance of existing designs with a “top-hat” shaped transverse current density profile. The unstable nature of highly excited semiconductor material results in a run-away process where small modulations...

  11. People power – Computer games in the classroom

    Directory of Open Access Journals (Sweden)

    Ivan Hilliard

    2014-03-01

    Full Text Available This article presents a case study in the use of the computer simulation game People Power, developed by the International Center on Nonviolent Conflict. The principal objective of the activity was to offer students an opportunity to understand the dynamics of social conflicts, in a format not possible in a traditional classroom setting. Due to the game's complexity, it was decided to play it in a day-long (8 hour) workshop format. A computer lab was prepared several weeks beforehand, which meant that each team of four students had access to a number of computers, being able to have the game open on several monitors at the same time, playing on one while using the others to constantly revise information as their strategy and tactics evolved. At the end of the workshop, and after handing in a group report, the 24 participants (6 groups) were asked to complete a short survey of the activity. The survey was divided into three areas: the game itself, skill development, and the workshop organization. Results showed a strong relationship between the activity and the course content, skills and competencies development, and practical know-how and leadership, as well as a strong feeling that it works well as a learning tool and is enjoyable. DOI: 10.18870/hlrc.v4i1.200

  12. A solar powered wireless computer mouse. Industrial design concepts

    Energy Technology Data Exchange (ETDEWEB)

    Reich, N.H.; Van Sark, W.G.J.H.M.; Alsema, E.A.; Turkenburg, W.C. [Department of Science, Technology and Society, Copernicus Institute, Utrecht University, Heidelberglaan 2, 3584 CS Utrecht (Netherlands); Veefkind, M.; Silvester, S. [Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628 CE Delft (Netherlands)

    2009-02-15

    A solar powered wireless computer mouse (SPM) was chosen to serve as a case study for the evaluation and optimization of industrial design processes of photovoltaic (PV) powered consumer systems. As the design process requires expert knowledge in various technical fields, we assessed and compared the following: appropriate selection of integrated PV type, battery capacity and type, possible electronic circuitries for PV-battery coupling, and material properties concerning mechanical incorporation of PV into the encasing. Besides technical requirements, ergonomic aspects and design aesthetics with respect to good 'sun-harvesting' properties influenced the design process. This is particularly important as simulations show users can positively influence energy balances by 'sun-bathing' the PV mouse. A total of 15 SPM prototypes were manufactured and tested by actual users. Although user satisfaction proved the SPM concept to be feasible, future research still needs to address user acceptance related to product dimensions and user willingness to pro-actively 'sun-bathe' PV powered products in greater detail. (author)

  13. High Efficiency Reversible Fuel Cell Power Converter

    DEFF Research Database (Denmark)

    Pittini, Riccardo

    ... as well as different dc-ac and dc-dc converter topologies are presented and analyzed. A new ac-dc topology for high efficiency data center applications is proposed, and an efficiency characterization based on the fuel cell stack I-V characteristic curve is presented. The second part discusses the main converter components. Wide bandgap power semiconductors are introduced due to their superior performance in comparison to traditional silicon power devices. The analysis presents a study based on switching loss measurements performed on Si IGBTs, SiC JFETs, SiC MOSFETs and their respective gate drivers...

  14. High Power RF Test Facility at the SNS

    CERN Document Server

    Kang, Yoon W; Campisi, Isidoro E; Champion, Mark; Crofford, Mark; Davis, Kirk; Drury, Michael A; Fuja, Ray E; Gurd, Pamela; Kasemir, Kay-Uwe; McCarthy, Michael P; Powers, Tom; Shajedul Hasan, S M; Stirbet, Mircea; Stout, Daniel; Tang, Johnny Y; Vassioutchenko, Alexandre V; Wezensky, Mark

    2005-01-01

    An RF Test Facility has been completed in the SNS project at ORNL to support test and conditioning operation of RF subsystems and components. The system consists of two transmitters for two klystrons powered by a common high voltage pulsed converter modulator that can provide power to two independent RF systems. The waveguides are configured with WR2100 and WR1150 sizes for the presently used frequencies: 402.5 MHz and 805 MHz. Both the 402.5 MHz and 805 MHz systems have circulator-protected klystrons that can be powered by the modulator, which is capable of delivering 11 MW peak and 1 MW average power. The facility has been equipped with computer control for various RF processing and complete dual frequency operation. More than forty 805 MHz fundamental power couplers for the SNS superconducting linac (SCL) cavities have been RF conditioned in this facility. The facility provides more than 1000 ft2 of floor area for various test setups. The facility also has a shielded cave area that can support high power tests of normal conducti...

  15. Design and development of high voltage high power operational ...

    Indian Academy of Sciences (India)

    ... address this challenge: (a) designing a discrete power opamp with high ... the use of high-impedance feedback networks, thus minimizing their output loading ... Spice simulation is done for the circuit and results are given in figures 4a–c.

  16. Voltage generators of high voltage high power accelerators

    International Nuclear Information System (INIS)

    Svinin, M.P.

    1981-01-01

    High voltage electron accelerators are widely used in modern radiation installations for industrial purposes. In the near future their power may be increased further, which makes it possible to raise the efficiency of known radiation processes and to master new power-consuming production processes in industry. Improvement of HV generators by increasing their power and efficiency is one of the many scientific and engineering problems whose successful solution will provide further development of these accelerators and their technical parameters. The subject is discussed in detail. (author)

  17. VLab: A Science Gateway for Distributed First Principles Calculations in Heterogeneous High Performance Computing Systems

    Science.gov (United States)

    da Silveira, Pedro Rodrigo Castro

    2014-01-01

    This thesis describes the development and deployment of a cyberinfrastructure for distributed high-throughput computations of materials properties at high pressures and/or temperatures--the Virtual Laboratory for Earth and Planetary Materials--VLab. VLab was developed to leverage the aggregated computational power of grid systems to solve…

  18. High impact data visualization with Power View, Power Map, and Power BI

    CERN Document Server

    Aspin, Adam

    2014-01-01

    High Impact Data Visualization with Power View, Power Map, and Power BI helps you take business intelligence delivery to a new level that is interactive, engaging, even fun, all while driving commercial success through sound decision-making. Learn to harness the power of Microsoft's flagship, self-service business intelligence suite to deliver compelling and interactive insight with remarkable ease. Learn the essential techniques needed to enhance the look and feel of reports and dashboards so that you can seize your audience's attention and provide them with clear and accurate information. Al

  19. A High Power Linear Solid State Pulser

    International Nuclear Information System (INIS)

    Boris Yen; Brent Davis; Rex Booth

    1999-01-01

    Particle Accelerators require high voltage and often high power. Typically the high voltage/power generation utilizes a topology with an extra energy store and a switching means to extract that stored energy. The switches may be active or passive devices. Active switches are hard or soft vacuum tubes, or semiconductors. When required voltages exceed tens of kilovolts, numerous semiconductors are stacked to withstand that potential. Such topologies can use large numbers of critical parts that, when in series, compromise the system reliability and performance. This paper describes a modular, linear, solid state amplifier which uses a parallel array of semiconductors, coupled with transmission line transformers. Such a design can provide output signals with voltages exceeding 10 kV (into 50 ohms), and with rise and fall times (10–90% amplitude) that are less than 1 ns. This compact solid state amplifier is modular, and has both hot-swap and soft fail capabilities

  20. High prices on electric power now again?

    International Nuclear Information System (INIS)

    Doorman, Gerard

    2003-01-01

    Deregulation of the electric power market has yielded low prices for the consumers throughout the 1990s. Consumption has now increased considerably, but little new production has been added. This results in high prices in dry years, but to understand this one must understand price formation in the Nordic spot market. The high prices are a powerful signal to the consumers to reduce consumption, but they are also a signal to the producers to seize any opportunity to increase production. However, the construction of new dams etc. stirs up the environmentalists. Ordinary consumers may protect themselves against high prices by signing fixed-price contracts. For those who can tolerate price fluctuations, spot prices are a better alternative than the standard contract with variable price

  1. High average power linear induction accelerator development

    International Nuclear Information System (INIS)

    Bayless, J.R.; Adler, R.J.

    1987-07-01

    There is increasing interest in linear induction accelerators (LIAs) for applications including free electron lasers, high power microwave generators and other types of radiation sources. Lawrence Livermore National Laboratory has developed LIA technology in combination with magnetic pulse compression techniques to achieve very impressive performance levels. In this paper we will briefly discuss the LIA concept and describe our development program. Our goals are to improve the reliability and reduce the cost of LIA systems. An accelerator is presently under construction to demonstrate these improvements at an energy of 1.6 MeV in 2 kA, 65 ns beam pulses at an average beam power of approximately 30 kW. The unique features of this system are a low cost accelerator design and an SCR-switched, magnetically compressed, pulse power system. 4 refs., 7 figs

  2. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    CERN Document Server

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2014-01-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  3. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    Science.gov (United States)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2015-05-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).

  4. Heterogeneous High Throughput Scientific Computing with APM X-Gene and Intel Xeon Phi

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Muzaffar, Shahzad; Knight, Robert

    2015-01-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. We report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG). (paper)
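
    As a concrete illustration of the performance-per-watt metric mentioned in these records, the snippet below simply divides a measured benchmark throughput by the average power drawn during the run. The event rates and power figures are invented placeholders, not measurements from the papers.

      # Hypothetical benchmark results for two platforms (events/s and average watts).
      measurements = {
          "xeon_phi_node": {"events_per_s": 410.0, "avg_power_w": 245.0},
          "x_gene_node":   {"events_per_s": 95.0,  "avg_power_w": 42.0},
      }

      for name, m in measurements.items():
          perf_per_watt = m["events_per_s"] / m["avg_power_w"]
          print(f"{name}: {perf_per_watt:.2f} events/s per watt")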

  5. Budget-based power consumption for application execution on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J; Inglett, Todd A; Ratterman, Joseph D

    2012-10-23

    Methods, apparatus, and products are disclosed for budget-based power consumption for application execution on a plurality of compute nodes that include: assigning an execution priority to each of one or more applications; executing, on the plurality of compute nodes, the applications according to the execution priorities assigned to the applications at an initial power level provided to the compute nodes until a predetermined power consumption threshold is reached; and applying, upon reaching the predetermined power consumption threshold, one or more power conservation actions to reduce power consumption of the plurality of compute nodes during execution of the applications.
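
    A minimal sketch of the control flow described above is given below: applications are ordered by an assigned execution priority, run at an initial power level, and once a simulated energy budget threshold is crossed a power-conservation action is applied. All names, the threshold value, and the way "power conservation" is modeled are illustrative assumptions rather than the patented mechanism.

      from dataclasses import dataclass

      @dataclass
      class App:
          name: str
          priority: int      # higher value = scheduled first (assumed convention)
          work_units: int    # remaining units of work

      def run(apps, initial_power_w=200.0, reduced_power_w=120.0, budget_j=50_000.0, step_s=1.0):
          """Toy budget-based execution: switch to a conservation power level once the budget is hit."""
          power_w = initial_power_w
          consumed_j = 0.0
          for app in sorted(apps, key=lambda a: a.priority, reverse=True):
              while app.work_units > 0:
                  app.work_units -= 1                 # one unit of work per time step
                  consumed_j += power_w * step_s      # energy used during this step
                  if consumed_j >= budget_j and power_w != reduced_power_w:
                      power_w = reduced_power_w       # power conservation action (e.g. frequency scaling)
                      print(f"budget reached while running {app.name}; reducing power to {power_w} W")
          return consumed_j

      total = run([App("cfd_solver", 10, 300), App("postprocess", 1, 100)])
      print(f"total energy consumed: {total:.0f} J")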

  6. High power gyrotrons: a close perspective

    International Nuclear Information System (INIS)

    Kartikeyan, M.V.

    2012-01-01

    Gyrotrons and their variants, popularly known as gyro-devices, are millimetric wave sources that provide very high powers, from long pulse to continuous wave (CW) operation, for various technological, scientific and industrial applications. From their conception (the monotron version) in the late fifties until their successful development for various applications, these devices have come a long way technologically and made an irreversible impact on both users and developers. The possible applications of high power millimeter and sub-millimeter waves from gyrotrons and their variants span a wide range of technologies. The plasma physics community has already taken advantage of the recent advances of gyrotrons in the areas of RF plasma production, heating, non-inductive current drive, plasma stabilization and active plasma diagnostics for magnetic confinement thermonuclear fusion research, such as lower hybrid current drive (LHCD) (8 GHz), electron cyclotron resonance heating (ECRH) (28-170-220 GHz), electron cyclotron current drive (ECCD), collective Thomson scattering (CTS), heat-wave propagation experiments, and space-power grid (SPG) applications. Other important applications of gyrotrons are electron cyclotron resonance (ECR) discharges for the generation of multi-charged ions and soft X-rays, as well as industrial materials processing and plasma chemistry. Submillimeter wave gyrotrons are employed in high frequency, broadband electron paramagnetic resonance (EPR) spectroscopy. Additional future applications await the development of novel high power gyro-amplifiers and devices for high resolution radar ranging and imaging in atmospheric and planetary science, deep space and specialized satellite communications, RF drivers for next-generation high-gradient linear accelerators (supercolliders), and high resolution Doppler radar.

  7. Visible high power fiber coupled diode lasers

    Science.gov (United States)

    Köhler, Bernd; Drovs, Simon; Stoiber, Michael; Dürsch, Sascha; Kissel, Heiko; Könning, Tobias; Biesenbach, Jens; König, Harald; Lell, Alfred; Stojetz, Bernhard; Löffler, Andreas; Strauß, Uwe

    2018-02-01

    In this paper we report on further development of fiber coupled high-power diode lasers in the visible spectral range. New visible laser modules presented in this paper include the use of multi single emitter arrays @ 450 nm leading to a 120 W fiber coupled unit with a beam quality of 44 mm x mrad, as well as very compact modules with multi-W output power from 405 nm to 640 nm. However, as these lasers are based on single emitters, power scaling quickly leads to bulky laser units with a lot of optical components to be aligned. We also report on a new approach based on 450 nm diode laser bars, which dramatically reduces size and alignment effort. These activities were performed within the German government-funded project "BlauLas": a maximum output power of 80 W per bar has been demonstrated @ 450 nm. We show results of a 200 μm NA0.22 fiber coupled 35 W source @ 450 nm, which has been reduced in size by a factor of 25 compared to standard single emitter approach. In addition, we will present a 200 μm NA0.22 fiber coupled laser unit with an output power of 135 W.

  8. The path toward HEP High Performance Computing

    International Nuclear Information System (INIS)

    Apostolakis, John; Brun, René; Gheata, Andrei; Wenzel, Sandro; Carminati, Federico

    2014-01-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves between 10% and 50% of peak performance. Although several successful attempts have been made to port selected codes to GPUs, no major HEP code suite has a 'High Performance' implementation. With LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best usage of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the Root and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on the development of a high-performance prototype for particle transport. Achieving a good concurrency level on the emerging parallel architectures without a complete redesign of the framework can only be done by parallelizing at event level, or with a much larger effort at track level. Apart from the shareable data structures, this typically implies a multiplication factor in terms of memory consumption compared to the single-threaded version, together with sub-optimal handling of event processing tails. Besides this, the low level instruction pipelining of modern processors cannot be used efficiently to speed up the program. We have implemented a framework that allows scheduling vectors of particles to an arbitrary number of computing resources in a fine-grain parallel approach. The talk will review the current optimisation activities within the SFT group with a particular emphasis on the development perspectives towards a simulation framework able to profit
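
    The idea of dispatching vectors of particles to workers can be illustrated with a toy scheduler that groups particles into fixed-size "baskets" and maps them onto a thread pool. The basket size, the worker count, and the trivial "transport" step are assumptions made purely for illustration and are unrelated to the actual Geant-V implementation.

      from concurrent.futures import ThreadPoolExecutor
      import math

      def transport(basket):
          """Stand-in for a particle transport step applied to one vector of particles."""
          return [math.sqrt(energy) for energy in basket]   # placeholder computation

      def schedule(particles, basket_size=16, workers=4):
          # Group particles into fixed-size vectors so each worker operates on one basket at a time.
          baskets = [particles[i:i + basket_size] for i in range(0, len(particles), basket_size)]
          with ThreadPoolExecutor(max_workers=workers) as pool:
              results = list(pool.map(transport, baskets))
          return [value for basket in results for value in basket]

      energies = [float(i) for i in range(1, 101)]   # hypothetical particle energies
      print(len(schedule(energies)), "particles transported")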

  9. A computational study of high entropy alloys

    Science.gov (United States)

    Wang, Yang; Gao, Michael; Widom, Michael; Hawk, Jeff

    2013-03-01

    As a new class of advanced materials, high-entropy alloys (HEAs) exhibit a wide variety of excellent materials properties, including high strength, reasonable ductility with appreciable work-hardening, corrosion and oxidation resistance, wear resistance, and outstanding diffusion-barrier performance, especially at elevated and high temperatures. In this talk, we will explain our computational approach to the study of HEAs that employs the Korringa-Kohn-Rostoker coherent potential approximation (KKR-CPA) method. The KKR-CPA method uses Green's function technique within the framework of multiple scattering theory and is uniquely designed for the theoretical investigation of random alloys from the first principles. The application of the KKR-CPA method will be discussed as it pertains to the study of structural and mechanical properties of HEAs. In particular, computational results will be presented for AlxCoCrCuFeNi (x = 0, 0.3, 0.5, 0.8, 1.0, 1.3, 2.0, 2.8, and 3.0), and these results will be compared with experimental information from the literature.

  10. Computer simulation of high energy displacement cascades

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1990-01-01

    A methodology developed for modeling many aspects of high energy displacement cascades with molecular level computer simulations is reviewed. The initial damage state is modeled in the binary collision approximation (using the MARLOWE computer code), and the subsequent disposition of the defects within a cascade is modeled with a Monte Carlo annealing simulation (the ALSOME code). There are few adjustable parameters, and none are set to physically unreasonable values. The basic configurations of the simulated high energy cascades in copper, i.e., the number, size and shape of damage regions, compare well with observations, as do the measured numbers of residual defects and the fractions of freely migrating defects. The success of these simulations is somewhat remarkable, given the relatively simple models of defects and their interactions that are employed. The reason for this success is that the behavior of the defects is very strongly influenced by their initial spatial distributions, which the binary collision approximation adequately models. The MARLOWE/ALSOME system, with input from molecular dynamics and experiments, provides a framework for investigating the influence of high energy cascades on microstructure evolution. (author)

  11. High power VCSELs for miniature optical sensors

    Science.gov (United States)

    Geske, Jon; Wang, Chad; MacDougal, Michael; Stahl, Ron; Follman, David; Garrett, Henry; Meyrath, Todd; Snyder, Don; Golden, Eric; Wagener, Jeff; Foley, Jason

    2010-02-01

    Recent advances in Vertical-cavity Surface-emitting Laser (VCSEL) efficiency and packaging have opened up alternative applications for VCSELs that leverage their inherent advantages over light emitting diodes and edge-emitting lasers (EELs), such as low-divergence symmetric emission, wavelength stability, and inherent 2-D array fabrication. Improvements in reproducible highly efficient VCSELs have allowed VCSELs to be considered for high power and high brightness applications. In this talk, Aerius will discuss recent advances with Aerius' VCSELs and application of these VCSELs to miniature optical sensors such as rangefinders and illuminators.

  12. Usage of super high speed computer for clarification of complex phenomena

    International Nuclear Information System (INIS)

    Sekiguchi, Tomotsugu; Sato, Mitsuhisa; Nakata, Hideki; Tatebe, Osami; Takagi, Hiromitsu

    1999-01-01

    This study aims at the construction of an efficient application environment for super high speed computer systems, suited to parallel distributed systems and easily portable to different computer systems and different numbers of processors, by conducting research and development on the super high speed computing technology required for the elucidation of complicated phenomena in the nuclear power field by computational science methods. In order to realize such an environment, the Electrotechnical Laboratory has developed Ninf, a network numerical information library. The Ninf system can provide a global network infrastructure for worldwide computing with high performance over a wide-range distributed network. (G.K.)

  13. High-resolution computer-aided moire

    Science.gov (United States)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.

    1991-12-01

    This paper presents a high resolution computer-assisted moire technique for the measurement of displacements and strains at the microscopic level. The detection of micro-displacements using a moire grid and the problem associated with the recovery of the displacement field from the sampled values of the grid intensity are discussed. A two-dimensional Fourier transform method for the extraction of displacements from the image of the moire grid is outlined. An example of application of the technique to the measurement of strains and stresses in the vicinity of the crack tip in a compact tension specimen is given.
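
    The two-dimensional Fourier transform step mentioned above can be sketched as follows: the grid image is transformed, a sideband around the carrier frequency is isolated, and the phase of the inverse transform yields a quantity proportional to the displacement field. The synthetic grid, the carrier frequency, and the filter width below are invented for illustration and do not reproduce the authors' procedure.

      import numpy as np

      # Synthetic moire grid: a carrier of spatial frequency f0 cycles/image whose phase is
      # modulated by a displacement field u(x, y) expressed in units of the grid pitch.
      n, f0 = 256, 16
      y, x = np.mgrid[0:n, 0:n]
      u_true = 0.5 * np.sin(2 * np.pi * y / n)                    # assumed displacement pattern
      intensity = 1 + np.cos(2 * np.pi * f0 * x / n + 2 * np.pi * u_true)

      # Keep only the +f0 sideband of the 2-D spectrum and take the phase of its inverse transform.
      spectrum = np.fft.fftshift(np.fft.fft2(intensity))
      mask = np.zeros_like(spectrum)
      cx = n // 2 + f0                                            # carrier column after fftshift
      mask[:, cx - 4:cx + 5] = 1.0                                # band-pass width chosen arbitrarily
      analytic = np.fft.ifft2(np.fft.ifftshift(spectrum * mask))

      phase = np.unwrap(np.angle(analytic), axis=1)               # phase along the carrier direction
      u_recovered = (phase - 2 * np.pi * f0 * x / n) / (2 * np.pi)
      print("max recovery error (grid pitches):", np.abs(u_recovered - u_true).max())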

  14. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Baker, Ann E [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; White, Julia C [ORNL

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools

  15. 8. High power laser and ignition facilities

    International Nuclear Information System (INIS)

    Bayramian, A.J.; Beach, R.J.; Bibeau, C.

    2002-01-01

    This document gives a review of the various high power laser projects and ignition facilities in the world: the Mercury laser system and Electra (USA), the krypton fluoride (KrF) laser and the HALNA (high average power laser for nuclear-fusion application) project (Japan), the Shenguang series, the Xingguang facility and the TIL (technical integration line) facility (China), the Vulcan peta-watt interaction facility (UK), the Megajoule project and its feasibility phase, the LIL (laser integration line) facility (France), the Asterix IV/PALS high power laser facility (Czech Republic), and the Phelix project (Germany). In Japan the 100 TW Petawatt Module Laser, constructed in 1997, is being upgraded to the world's biggest petawatt laser. Experiments have been performed with the single-pulse large aperture e-beam-pumped Garpun laser (Russia) and with the high-current-density El-1 KrF laser installation (Russia) to investigate Al-Be foil transmittance and stability to multiple e-beam irradiations. An article is dedicated to a comparison of debris shield impacts for 2 experiments at NIF (national ignition facility). (A.C.)

  16. High Power UV LED Industrial Curing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Karlicek, Robert, F., Jr; Sargent, Robert

    2012-05-14

    UV curing is a green technology that is largely underutilized because UV radiation sources like Hg lamps are unreliable and difficult to use. High power UV LEDs are now efficient enough to replace Hg lamps, and offer significantly improved performance relative to Hg lamps. In this study, a modular, scalable high power UV LED curing system was designed and tested, performing well in industrial coating evaluations. In order to achieve mechanical form factors similar to commercial Hg lamp systems, a new patent-pending design was employed enabling high irradiance at long working distances. While high power UV LEDs are currently only available at longer UVA wavelengths, rapid progress on UVC LEDs and the development of new formulations designed specifically for use with UV LED sources will converge to drive more rapid adoption of UV curing technology. An assessment of the environmental impact of replacing Hg lamp systems with UV LED systems was performed. Since UV curing is used in only a small portion of the industrial printing, painting and coating markets, the ease of use of UV LED systems should increase the use of UV curing technology. Even a small penetration of the significant number of industrial applications still using oven curing and drying will lead to significant reductions in energy consumption and reductions in the emission of greenhouse gases and solvent emissions.

  17. High power microwave emission and diagnostics of microsecond electron beams

    Energy Technology Data Exchange (ETDEWEB)

    Gilgenbach, R; Hochman, J M; Jayness, R; Rintamaki, J I; Lau, Y Y; Luginsland, J; Lash, J S [Univ. of Michigan, Ann Arbor, MI (United States). Intense Electron Beam Interaction Lab.; Spencer, T A [Air Force Phillips Lab., Kirtland AFB, NM (United States)

    1997-12-31

    Experiments were performed to generate high power, long-pulse microwaves by the gyrotron mechanism in rectangular cross-section interaction cavities. Long-pulse electron beams are generated by MELBA (Michigan Electron Long Beam Accelerator), which operates with parameters: -0.8 MV, 1-10 kA, and 0.5-1 microsecond pulse length. Microwave power levels are in the megawatt range. Polarization control is being studied by adjustment of the solenoidal magnetic field. Initial results show polarization power ratios up to a factor of 15. Electron beam dynamics (V⊥/V∥) are being measured by radiation darkening on glass plates. Computer modeling utilizes the MAGIC Code for electromagnetic waves and a single electron orbit code that includes a distribution of angles. (author). 4 figs., 4 refs.

  18. Computer-based control of nuclear power information systems at international level

    International Nuclear Information System (INIS)

    Boniface, Ekechukwu; Okonkwo, Obi

    2011-01-01

    In most highly industrialized countries of the world, information plays a major role in anti-nuclear campaigns. Information and discussions on nuclear power need critical and objective analysis before structured information is presented to the public, to avoid biased anti-nuclear information on one side and neglect of the great risks in nuclear power on the other. This research develops a computer-based information system for the control of nuclear power at the international level. The system is to provide easy and fast information highways for the following: (1) low regulatory dose and activity limits as levels of high danger for individuals and the public; (2) provision of relevant technical or scientific education among the information carriers in the nuclear power countries. The research is a fact-oriented investigation of radioactivity. It also deals with fact-oriented education about nuclear accidents and safety. A standard procedure for the dissemination of the latest findings, using technical and scientific experts in nuclear technology, is developed. The information highway clearly analyzes the factual information about radiation risk and nuclear energy. Radiation cannot be removed from our environment. The necessity of radiation utilization makes nuclear energy a two-edged sword. It is therefore possible to use a computer-based information system to project and disseminate expert knowledge about nuclear technology positively, and also to use it to direct the public on the safety and control of nuclear energy. The computer-based information highway for nuclear energy technology is to assist in scientific research and technological development at the international level. (author)

  19. High speed micromachining with high power UV laser

    Science.gov (United States)

    Patel, Rajesh S.; Bovatsek, James M.

    2013-03-01

    Increasing demand for creating fine features with high accuracy in the manufacturing of electronic mobile devices has fueled growth for lasers in manufacturing. High power, high repetition rate ultraviolet (UV) lasers provide an opportunity to implement a cost-effective, high quality, high throughput micromachining process in a 24/7 manufacturing environment. The energy available per pulse and the pulse repetition frequency (PRF) of diode pumped solid state (DPSS) nanosecond UV lasers have increased steadily over the years. Efficient use of the available energy from a laser is important to generate accurate fine features at a high speed with high quality. To achieve maximum material removal and minimal thermal damage for any laser micromachining application, use of the optimal process parameters including energy density or fluence (J/cm²), pulse width, and repetition rate is important. In this study we present the new high power, high PRF Quasar® 355-40 laser from Spectra-Physics with TimeShift™ technology for unique software-adjustable pulse width, pulse splitting, and pulse shaping capabilities. The benefits of these features for micromachining include improved throughput and quality. A specific example and results of silicon scribing are described to demonstrate the processing benefits of the Quasar's available power, PRF, and TimeShift technology.
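
    For orientation, fluence in J/cm² follows directly from pulse energy and focused spot size; the sketch below computes it from average power, repetition rate, and an assumed Gaussian spot diameter. The numerical values are placeholders chosen for illustration, not the Quasar's published specifications.

      import math

      # Assumed example operating point (not vendor specifications).
      avg_power_w = 40.0          # average output power
      prf_hz = 200e3              # pulse repetition frequency
      spot_diameter_um = 20.0     # focused 1/e^2 spot diameter

      pulse_energy_j = avg_power_w / prf_hz
      spot_area_cm2 = math.pi * (spot_diameter_um * 1e-4 / 2) ** 2
      fluence_j_cm2 = pulse_energy_j / spot_area_cm2

      print(f"pulse energy = {pulse_energy_j * 1e6:.0f} uJ, fluence = {fluence_j_cm2:.1f} J/cm^2")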

  20. The high-power iodine laser

    Science.gov (United States)

    Brederlow, G.; Fill, E.; Witte, K. J.

    The book provides a description of the present state of the art concerning the iodine laser, giving particular attention to the design and operation of pulsed high-power iodine lasers. The basic features of the laser are examined, taking into account aspects of spontaneous emission lifetime, hyperfine structure, line broadening and line shifts, stimulated emission cross sections, the influence of magnetic fields, sublevel relaxation, the photodissociation of alkyl iodides, flashlamp technology, excitation in a direct discharge, chemical excitation, and questions regarding the chemical kinetics of the photodissociation iodine laser. The principles of high-power operation are considered along with aspects of beam quality and losses, the design and layout of an iodine laser system, the scalability and prospects of the iodine laser, and the design of the single-beam Asterix III laser.

  1. Industrial Applications of High Power Ultrasonics

    Science.gov (United States)

    Patist, Alex; Bates, Darren

    Since the change of the millennium, high-power ultrasound has become an alternative food processing technology applicable to large-scale commercial applications such as emulsification, homogenization, extraction, crystallization, dewatering, low-temperature pasteurization, degassing, defoaming, activation and inactivation of enzymes, particle size reduction, extrusion, and viscosity alteration. This new focus can be attributed to significant improvements in equipment design and efficiency during the late 1990s. Like most innovative food processing technologies, high-power ultrasonics is not an off-the-shelf technology, and thus requires careful development and scale-up for each and every application. The objective of this chapter is to present examples of ultrasonic applications that have been successful at the commercialization stage, advantages, and limitations, as well as key learnings from scaling up an innovative food technology in general.

  2. High power, repetitive stacked Blumlein pulse generators

    Energy Technology Data Exchange (ETDEWEB)

    Davanloo, F; Borovina, D L; Korioth, J L; Krause, R K; Collins, C B [Univ. of Texas at Dallas, Richardson, TX (United States). Center for Quantum Electronics; Agee, F J [US Air Force Phillips Lab., Kirtland AFB, NM (United States); Kingsley, L E [US Army CECOM, Ft. Monmouth, NJ (United States)

    1997-12-31

    The repetitive stacked Blumlein pulse power generators developed at the University of Texas at Dallas consist of several triaxial Blumleins stacked in series at one end. The lines are charged in parallel and synchronously commuted with a single switch at the other end. In this way, relatively low charging voltages are multiplied to give a high discharge voltage across an arbitrary load. Extensive characterization of these novel pulsers has been performed over the past few years. Results indicate that they are capable of producing high power waveforms with rise times and repetition rates in the range of 0.5-50 ns and 1-300 Hz, respectively, using a conventional thyratron, spark gap, or photoconductive switch. The progress in the development and use of stacked Blumlein pulse generators is reviewed. The technology and the characteristics of these novel pulsers driving flash x-ray diodes are discussed. (author). 4 figs., 5 refs.
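
    The voltage multiplication of the stacked arrangement can be summarized by a simple idealized relation: with N Blumleins charged in parallel and discharged in series, the matched-load output approaches N times the charging voltage, losses and mismatch ignored. The numbers in the sketch below are illustrative only and are not taken from the paper.

      # Idealized stacked-Blumlein output: N lines charged in parallel to v_charge and
      # discharged in series give roughly N * v_charge into a matched load (losses ignored).
      n_lines = 10            # assumed number of stacked Blumleins
      v_charge_kv = 50.0      # assumed parallel charging voltage
      v_out_kv = n_lines * v_charge_kv
      print(f"idealized matched-load output: {v_out_kv:.0f} kV")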

  3. Power Supplies for High Energy Particle Accelerators

    Science.gov (United States)

    Dey, Pranab Kumar

    2016-06-01

    The on-going research and development projects with the Large Hadron Collider at CERN, Geneva, Switzerland, have generated enormous enthusiasm and interest in the ultimate findings on the 'God's Particle'. This paper attempts to unfold the power supply requirements and the methodology adopted to meet the stringent demands of such high energy particle accelerators during the initial stages of the search for the ultimate particles. An attempt has also been made to highlight the present status of power supply requirements in some high energy accelerators, with the view that precautionary measures can be drawn from earlier experience during design and development, which will be of help for the proposed third generation synchrotron to be installed in India at huge cost.

  4. High stability, high current DC-power supplies

    International Nuclear Information System (INIS)

    Hosono, K.; Hatanaka, K.; Itahashi, T.

    1995-01-01

    Improvements of the power supplies and the control system of the AVF cyclotron, which is used as an injector to the ring cyclotron, and of the transport system to the ring cyclotron were made in order to obtain a higher quality and more stable beam. The power supply of the main coil of the AVF cyclotron was replaced with a new one. The old DCCTs (zero-flux current transformers) used for the power supplies of the trim coils of the AVF cyclotron were replaced with new DCCTs to improve stability. The potentiometers used for the reference voltages in the other power supplies of the AVF cyclotron and the transport system were replaced by a temperature-controlled DAC method for numerical value settings. This paper presents the results of the improvements. (author)

  5. Low Cost, Low Power, High Sensitivity Magnetometer

    Science.gov (United States)

    2008-12-01

    ... which are used to measure the small magnetic signals from the brain. Other types of vector magnetometers are fluxgate, coil-based, and magnetoresistance ... concentrator with the magnetometer currently used in Army multimodal sensor systems, the Brown fluxgate. One sees the MEMS fluxgate magnetometer is ... Guedes, A.; et al., 2008: Hybrid ... (A.S. Edelstein, James E. Burnette, Greg A. Fischer, M.G. ...)

  6. Department of Energy research in utilization of high-performance computers

    International Nuclear Information System (INIS)

    Buzbee, B.L.; Worlton, W.J.; Michael, G.; Rodrigue, G.

    1980-08-01

    Department of Energy (DOE) and other Government research laboratories depend on high-performance computer systems to accomplish their programmatic goals. As the most powerful computer systems become available, they are acquired by these laboratories so that advances can be made in their disciplines. These advances are often the result of added sophistication to numerical models, the execution of which is made possible by high-performance computer systems. However, high-performance computer systems have become increasingly complex, and consequently it has become increasingly difficult to realize their potential performance. The result is a need for research on issues related to the utilization of these systems. This report gives a brief description of high-performance computers, and then addresses the use of and future needs for high-performance computers within DOE, the growing complexity of applications within DOE, and areas of high-performance computer systems warranting research. 1 figure

  7. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  8. Axial power difference control strategy and computer simulation for GNPS during stretch-out and power decrease

    International Nuclear Information System (INIS)

    Liao Yehong; Xiao Min; Li Xianfeng; Zhu Minhong

    2004-01-01

    Successful control of the axial power difference for PWR is crucial to nuclear safety. After analyzing various elements' effect on the axial power distribution, different axial power deviation control strategies have been proposed to comply with different power decrease scenarios. Application of the strategy to computer simulation shows that our prediction of axial power deviation evolution is comparable to the measurement value, and that our control strategy is effective
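
    As a point of reference, the axial power difference (axial offset) is commonly defined from the power produced in the top and bottom halves of the core; the toy calculation below uses that common textbook definition with made-up power fractions, since the paper itself gives no numerical data here.

      def axial_offset(p_top, p_bottom):
          """Axial power difference as commonly defined: (P_top - P_bottom) / (P_top + P_bottom)."""
          return (p_top - p_bottom) / (p_top + p_bottom)

      # Hypothetical relative powers of the top and bottom core halves during a power decrease.
      print(f"AO = {100 * axial_offset(0.47, 0.53):+.1f} %")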

  9. Gate Drive For High Speed, High Power IGBTs

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, M.N.; Cassel, R.L.; de Lamare, J.E.; Pappas, G.C.; /SLAC

    2007-06-18

    A new gate drive for high-voltage, high-power IGBTs has been developed for the SLAC NLC (Next Linear Collider) Solid State Induction Modulator. This paper describes the design and implementation of a driver that allows an IGBT module rated at 800 A/3300 V to switch up to 3000 A at 2200 V in 3 µs with a rate of current rise of more than 10000 A/µs, while still being short circuit protected. Issues regarding fast turn on, high de-saturation voltage detection, and low short circuit peak current will be presented. A novel approach is also used to counter the effect of unequal current sharing between parallel chips inside most high-power IGBT modules. It effectively reduces the collector-emitter peak current, and thus protects the IGBT from being destroyed during soft short circuit conditions at high di/dt.

  10. Gate Drive For High Speed, High Power IGBTs

    International Nuclear Information System (INIS)

    Nguyen, M.N.; Cassel, R.L.; de Lamare, J.E.; Pappas, G.C.; SLAC

    2007-01-01

    A new gate drive for high-voltage, high-power IGBTs has been developed for the SLAC NLC (Next Linear Collider) Solid State Induction Modulator. This paper describes the design and implementation of a driver that allows an IGBT module rated at 800 A/3300 V to switch up to 3000 A at 2200 V in 3 μs with a rate of current rise of more than 10,000 A/μs, while still being short-circuit protected. Issues regarding fast turn-on, high de-saturation voltage detection, and low short-circuit peak current will be presented. A novel approach is also used to counter the effect of unequal current sharing between parallel chips inside most high-power IGBT modules. It effectively reduces the collector-emitter peak current, and thus protects the IGBT from being destroyed during soft short-circuit conditions at high di/dt.

  11. Simple, parallel, high-performance virtual machines for extreme computations

    International Nuclear Information System (INIS)

    Chokoufe Nejad, Bijan; Ohl, Thorsten; Reuter, Jurgen

    2014-11-01

    We introduce a high-performance virtual machine (VM) written in a numerically fast language like Fortran or C to evaluate very large expressions. We discuss the general concept of how to perform computations in terms of a VM and present specifically a VM that is able to compute tree-level cross sections for any number of external legs, given the corresponding byte code from the optimal matrix element generator, O'Mega. Furthermore, this approach allows one to formulate the parallel computation of a single phase space point in a simple and obvious way. We analyze the scaling behaviour with multiple threads as well as the benefits and drawbacks introduced by this method. Our implementation of a VM can run faster than the corresponding native, compiled code for certain processes and compilers, especially for very high multiplicities, and in general has runtimes of the same order of magnitude. By avoiding the tedious compile and link steps, which may fail for source code files of gigabyte sizes, new processes or complex higher order corrections that are currently out of reach could be evaluated with a VM given enough computing power.
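
    As an illustration of the byte-code-interpreter idea, a stack-based VM evaluating an arithmetic expression might look like the minimal sketch below; the instruction set is invented for illustration and is not O'Mega's byte-code format.

```python
# Minimal stack-based virtual machine for arithmetic expressions.
# The instruction set (PUSH/ADD/MUL) is invented for illustration;
# it is not the O'Mega byte-code format.

def run_vm(bytecode, constants):
    """Evaluate a list of (opcode, operand) instructions on a stack."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":          # push a constant onto the stack
            stack.append(constants[arg])
        elif op == "ADD":         # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":         # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack.pop()

# Evaluate (c0 + c1) * c2 without compiling any source code.
constants = [1.5, 2.5, 3.0]
program = [("PUSH", 0), ("PUSH", 1), ("ADD", None), ("PUSH", 2), ("MUL", None)]
print(run_vm(program, constants))   # 12.0
```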

  12. The computational power of astrocyte mediated synaptic plasticity

    Directory of Open Access Journals (Sweden)

    Rogier eMin

    2012-11-01

    Research in the last two decades has made clear that astrocytes play a crucial role in the brain beyond their functions in energy metabolism and homeostasis. Many studies have shown that astrocytes can dynamically modulate neuronal excitability and synaptic plasticity, and might participate in higher brain functions like learning and memory. With the plethora of astrocyte-mediated signaling processes described in the literature today, the current challenge is to identify which of these processes happen under what physiological conditions, and how this shapes information processing and, ultimately, behavior. Answering these questions will require a combination of advanced physiological, genetic and behavioral experiments. Additionally, mathematical modeling will prove crucial for testing predictions on the possible functions of astrocytes in neuronal networks, and for generating novel ideas as to how astrocytes can contribute to the complexity of the brain. Here, we aim to provide an outline of how astrocytes can interact with neurons. We do this by reviewing recent experimental literature on astrocyte-neuron interactions, discussing the dynamic effects of astrocytes on neuronal excitability and short- and long-term synaptic plasticity. Finally, we outline the potential computational functions that astrocyte-neuron interactions can serve in the brain. We discuss how astrocytes could govern metaplasticity in the brain, how they might organize the clustering of synaptic inputs, and how they could function as memory elements for neuronal activity. We conclude that astrocytes can enhance the computational power of neuronal networks in previously unexpected ways.

  13. Computer-based systems for nuclear power stations

    International Nuclear Information System (INIS)

    Humble, P.J.; Welbourne, D.; Belcher, G.

    1995-01-01

    The published intentions of vendors are for extensive touch-screen control and computer-based protection. The software features needed for acceptance in the UK are indicated. The defence in depth needed is analyzed. Current practice in aircraft flight control systems and the software methods available are discussed. Software partitioning and mathematically formal methods are appropriate for the structures and simple logic needed for nuclear power applications. The potential for claims of diversity and independence between two computer-based subsystems of a protection system is discussed. Features needed to meet a single failure criterion applied to software are discussed. Conclusions are given on the main factors which a design should allow for. The work reported was done for the Health and Safety Executive of the UK (HSE), and acknowledgement is given to them, to NNC Ltd and to GEC-Marconi Avionics Ltd for permission to publish. The opinions and recommendations expressed are those of the authors and do not necessarily reflect those of HSE. (Author)

  14. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods of modeling complex power networks with short circuits in the networks are described. The methods are implemented in integrated computation programs for short circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with a limited on-line memory capacity (M = 4030 for the computer).
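
    The core of such a short-circuit computation can be sketched with the classical bus-impedance-matrix method; the three-bus network and all impedance values below are hypothetical, chosen only to illustrate I_fault = V_prefault / Z_kk.

```python
import numpy as np

# Illustrative 3-bus example (all impedances in per unit; values are made up).
# Classical Zbus fault analysis: assemble Ybus including the generator
# reactances to ground, invert it to get Zbus, and the three-phase fault
# current at bus k is I_fault = V_prefault / Zbus[k, k].

lines = [(0, 1, 0.10j), (1, 2, 0.20j), (0, 2, 0.25j)]   # (from, to, series z)
generators = {0: 0.15j, 2: 0.20j}                        # bus -> source reactance

n = 3
Ybus = np.zeros((n, n), dtype=complex)
for i, j, z in lines:
    y = 1.0 / z
    Ybus[i, i] += y
    Ybus[j, j] += y
    Ybus[i, j] -= y
    Ybus[j, i] -= y
for k, z in generators.items():
    Ybus[k, k] += 1.0 / z       # branch from bus k to the reference node

Zbus = np.linalg.inv(Ybus)
v_prefault = 1.0                # p.u.
for k in range(n):
    i_fault = v_prefault / Zbus[k, k]
    print(f"bus {k}: |I_fault| = {abs(i_fault):.2f} p.u.")
```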

  15. Computer simulations of discharges from a lignite power plant complex

    International Nuclear Information System (INIS)

    Koukouliou, V.; Horyna, J.; Perez-Sanchez, D.

    2008-01-01

    This paper describes work carried out within the IAEA EMRAS program NORM working group to test the predictions of three computer models against measured radionuclide concentrations resulting from discharges from a lignite power plant complex. This complex consists of two power plants with a total of five discharge stacks, situated approximately 2-5 kilometres from a city of approximately 10,000 inhabitants. Monthly measurements of mean wind speed and direction, dust loading, and ²³⁸U activities in fallout samples, as well as mean annual values of ²³²Th activity at the nearest city sampling sites, were available for the study. The models used in the study were PC-CREAM (a detailed impact assessment model), and COMPLY and CROM (screening models). In applying the models to this scenario it was noted that the meteorological data provided were not ideal for testing, and that a number of assumptions had to be made, particularly for the simpler models. However, taking the gaps and uncertainties in the data into account, the model predictions from PC-CREAM were generally in good agreement with the measured data, and the results from different models were also generally consistent with each other. However, the COMPLY predictions were generally lower than those from PC-CREAM. This is of concern, as the aim of a screening model (COMPLY) is to provide conservative estimates of contaminant concentrations. Further investigation of this problem is required. The general implications of the results for further model development are discussed. (author)

  16. Design of EAST LHCD high power supply feedback control system based on PLC

    International Nuclear Information System (INIS)

    Hu Huaichuan; Shan Jiafang

    2009-01-01

    The design of the EAST LHCD -35 kV/5.6 MW high power supply feedback control system based on a PLC is described. An industrial computer and a PLC are used to control the high power supply. A PID algorithm is adopted to achieve feedback control of the high power supply voltage. The operating system is the real-time operating system QNX. Good control properties and reliable protective properties of the feedback control system are demonstrated by the experimental results. (authors)
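
    The voltage loop described is a standard discrete PID; a minimal sketch follows, with placeholder gains and sample time rather than the actual EAST settings.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Placeholder gains and a 10 ms control period (illustrative, not EAST values).
controller = PID(kp=0.8, ki=2.0, kd=0.01, dt=0.01)
command = controller.update(setpoint=-35.0e3, measurement=-33.5e3)  # volts
print(command)
```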

  17. High-power planar dielectric waveguide lasers

    International Nuclear Information System (INIS)

    Shepherd, D.P.; Hettrick, S.J.; Li, C.; Mackenzie, J.I.; Beach, R.J.; Mitchell, S.C.; Meissner, H.E.

    2001-01-01

    The advantages and potential hazards of using a planar waveguide as the host in a high-power diode-pumped laser system are described. The techniques discussed include the use of proximity-coupled diodes, double-clad waveguides, unstable resonators, tapers, and integrated passive Q switches. Laser devices are described based on Yb³⁺-, Nd³⁺-, and Tm³⁺-doped YAG, and monolithic and highly compact waveguide lasers with outputs greater than 10 W are demonstrated. The prospects for scaling to the 100 W level and for further integration of devices for added functionality in a monolithic laser system are discussed. (author)

  18. Computation of the Mutual Inductance between Air-Cored Coils of Wireless Power Transformer

    International Nuclear Information System (INIS)

    Anele, A O; Hamam, Y; Djouani, K; Chassagne, L; Alayli, Y; Linares, J

    2015-01-01

    Wireless power transfer is a modern technology which allows the transfer of electric power between the air-cored coils of its transformer via high-frequency magnetic fields. However, due to coil separation distance and misalignment, maximum power transfer is not guaranteed. Based on a more efficient and general model available in the literature, rederived mathematical models for evaluating the mutual inductance between circular coils with and without lateral and angular misalignment are presented. Rather than presenting results only numerically, the computed results are implemented graphically using MATLAB codes. The results are compared with published ones and clarification regarding the errors made is presented. In conclusion, this study shows that the power transfer efficiency of the system can be improved if a higher frequency alternating current is supplied to the primary coil, the reactive parts of the coils are compensated with capacitors, and ferrite cores are added to the coils. (paper)
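
    For the perfectly aligned (coaxial) case the mutual inductance of two circular filaments reduces to the classical elliptic-integral expression; a sketch of that special case is given below (the misaligned formulas treated in the paper are more involved, and the example geometry is illustrative only).

```python
import numpy as np
from scipy.special import ellipk, ellipe

MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

def mutual_inductance_coaxial(r1, r2, d):
    """Mutual inductance (H) of two coaxial circular filaments of radii r1, r2
    separated axially by d, using the classical elliptic-integral formula.
    Note: scipy's ellipk/ellipe take the parameter m = k**2."""
    k2 = 4.0 * r1 * r2 / ((r1 + r2) ** 2 + d ** 2)
    k = np.sqrt(k2)
    return MU0 * np.sqrt(r1 * r2) * ((2.0 / k - k) * ellipk(k2) - (2.0 / k) * ellipe(k2))

# Example: two 10 cm radius coils, 5 cm apart (illustrative geometry).
print(mutual_inductance_coaxial(0.10, 0.10, 0.05))  # roughly 1.1e-7 H
```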

  19. User manual for PACTOLUS: a code for computing power costs

    International Nuclear Information System (INIS)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results with the updated version of PACTOLUS. 11 figures, 2 tables
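
    The discounted-cash-flow principle stated above, that the present worth of revenues equals the present worth of expenses, can be illustrated with a simplified levelized busbar cost calculation; the plant figures and flat cash-flow profile below are placeholders, not PACTOLUS defaults.

```python
def levelized_busbar_cost(capital, annual_om, annual_fuel, annual_kwh, rate, life):
    """Levelized power cost (mills/kWh): discounted lifetime expenses divided
    by discounted lifetime generation, so that the present worth of revenues
    at this price equals the present worth of expenses."""
    pv_expenses = capital
    pv_energy = 0.0
    for year in range(1, life + 1):
        discount = (1.0 + rate) ** year
        pv_expenses += (annual_om + annual_fuel) / discount
        pv_energy += annual_kwh / discount
    return 1000.0 * pv_expenses / pv_energy   # dollars -> mills

# Placeholder plant: $2.0e9 capital, $60M O&M/yr, $40M fuel/yr,
# 7.9e9 kWh/yr (1 GW at 90% capacity factor), 10% discount rate, 30-year life.
print(levelized_busbar_cost(2.0e9, 6.0e7, 4.0e7, 7.9e9, 0.10, 30))  # ~40 mills/kWh
```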

  20. User manual for PACTOLUS: a code for computing power costs.

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Bloomster, C.H.

    1979-02-01

    PACTOLUS is a computer code for calculating the cost of generating electricity. Through appropriate definition of the input data, PACTOLUS can calculate the cost of generating electricity from a wide variety of power plants, including nuclear, fossil, geothermal, solar, and other types of advanced energy systems. The purpose of PACTOLUS is to develop cash flows and calculate the unit busbar power cost (mills/kWh) over the entire life of a power plant. The cash flow information is calculated by two principal models: the Fuel Model and the Discounted Cash Flow Model. The Fuel Model is an engineering cost model which calculates the cash flow for the fuel cycle costs over the project lifetime based on input data defining the fuel material requirements, the unit costs of fuel materials and processes, the process lead and lag times, and the schedule of the capacity factor for the plant. For nuclear plants, the Fuel Model calculates the cash flow for the entire nuclear fuel cycle. For fossil plants, the Fuel Model calculates the cash flow for the fossil fuel purchases. The Discounted Cash Flow Model combines the fuel costs generated by the Fuel Model with input data on the capital costs, capital structure, licensing time, construction time, rates of return on capital, tax rates, operating costs, and depreciation method of the plant to calculate the cash flow for the entire lifetime of the project. The financial and tax structure for both investor-owned utilities and municipal utilities can be simulated through varying the rates of return on equity and debt, the debt-equity ratios, and tax rates. The Discounted Cash Flow Model uses the principle that the present worth of the revenues will be equal to the present worth of the expenses including the return on investment over the economic life of the project. This manual explains how to prepare the input data, execute cases, and interpret the output results. (RWR)

  1. Careful Determination of Inservice Inspection of piping by Computer Analysis in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, H. T.; Lee, S. L.; Lee, J. P.; Kim, B. C.

    1992-01-01

    Stress analysis has been performed using the computer program ANSYS on the pressurizer surge line, in accordance with ASME Sec. III, in order to predict the possibility of fatigue failure due to thermal stratification phenomena in pipes connected to the reactor coolant system of nuclear power plants. The areas most vulnerable to crack initiation have been identified by the analysis of fatigue due to thermal stress in the pressurizer surge line. This kind of result can be helpful in choosing the locations requiring intensive attention during in-service inspection of nuclear power plants.

  2. Thermohydraulic analysis of nuclear power plant accidents by computer codes

    International Nuclear Information System (INIS)

    Petelin, S.; Stritar, A.; Istenic, R.; Gregoric, M.; Jerele, A.; Mavko, B.

    1982-01-01

    The RELAP4/MOD6, BRUCH-D-06, CONTEMPT-LT-28, RELAP5/MOD1 and COBRA-4-1 codes were successfully implemented on the CYBER 172 computer in Ljubljana. Input models of NPP Krsko for the first three codes were prepared. Because of the high computing cost, only one analysis of a double-ended guillotine break of the cold leg of NPP Krsko has been performed with the RELAP4 code. The BRUCH code is easier and cheaper to use, and several analyses have been performed with it. A sensitivity study was performed with CONTEMPT-LT-28 for a double-ended pump suction break. These codes are intended to be used as a basis for independent safety analyses. (author)

  3. High Energy Density Sciences with High Power Lasers at SACLA

    Science.gov (United States)

    Kodama, Ryosuke

    2013-10-01

    One of the interesting topics in high energy density science with high power lasers is the creation of extremely high pressures in material. Pressures of more than 0.1 TPa correspond to an energy density comparable to chemical bonding energies, so dramatic changes in chemical reactions can be expected. At pressures of more than a TPa, most materials would melt on the shock Hugoniot curve. However, if the temperature is less than 1 eV, i.e. lower than the melting point, at pressures of more than a TPa, novel solid states of matter can be created through a pressure-induced phase transition. Carbon is one of the most interesting materials: at pressures of more than a TPa the diamond structure changes to BC, and to a cubic structure at more than 3 TPa. To create such novel states of matter, several kinds of isentropic-like compression techniques are being developed with high power lasers. To explore this 'Tera-Pascal Science' we now have a new tool, an x-ray free electron laser, in addition to high power lasers. The XFEL will clarify the details of the HED states and also efficiently create hot dense matter. We have started a new project on high energy density sciences using an XFEL (SACLA) in Japan, the HERMES (High Energy density Revolution of Matter in Extreme States) project.

  4. High-power LEDs for plant cultivation

    Science.gov (United States)

    Tamulaitis, Gintautas; Duchovskis, Pavelas; Bliznikas, Zenius; Breive, Kestutis; Ulinskaite, Raimonda; Brazaityte, Ausra; Novickovas, Algirdas; Zukauskas, Arturas; Shur, Michael S.

    2004-10-01

    We report on a high-power solid-state lighting facility for cultivation of greenhouse vegetables and on the results of a study of the control of photosynthetic activity and growth morphology of radish and lettuce imposed by variation of the spectral composition of illumination. Experimental lighting modules (useful area of 0.22 m²) were designed based on 4 types of high-power light-emitting diodes (LEDs) with emission peaked in the red at the wavelengths of 660 nm and 640 nm (predominantly absorbed by chlorophyll a and b for photosynthesis, respectively), in the blue at 455 nm (phototropic function), and in the far-red at 735 nm (important for photomorphology). Morphological characteristics and chlorophyll and phytohormone concentrations were compared for radish and lettuce grown in phytotron chambers under LED-based illumination with different spectral compositions and under illumination by high pressure sodium lamps with an equivalent photosynthetic photon flux density. Well-balanced solid-state lighting was found to enhance production of green mass and to ensure healthy morphogenesis of plants compared with those grown under conventional lighting. We observed that plant morphology and the concentrations of morphologically active phytohormones are strongly affected by the spectral composition of light in the red region. Commercial application of LED-based illumination for large-scale plant cultivation is discussed. This technology is favorable from the point of view of energy consumption, controllable growth, and food safety, but is hindered by the high cost of the LEDs. Large-scale manufacturing of high-power red AlInGaP-based LEDs emitting at 650 nm and a further decrease of the photon price for LEDs emitting in the vicinity of the absorption peak of chlorophylls have to be achieved to promote horticulture applications.

  5. [Restoration filtering based on projection power spectrum for single-photon emission computed tomography].

    Science.gov (United States)

    Kubo, N

    1995-04-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical "least squares filter" theory, which requires knowledge of the object power spectrum and the noise power spectrum. The object power spectrum is estimated from the power spectrum of a projection, whose high-frequency part is adequately approximated by a polynomial-exponential expression. A study of restoration with the filter based on a projection power spectrum was conducted and compared with "Butterworth" filtering (cut-off frequency of 0.15 cycles/pixel) and "Wiener" filtering (with a constant signal-to-noise power spectrum ratio). Normalized mean-squared errors (NMSE) were computed for a phantom consisting of two line sources in a 99mTc-filled cylinder. The NMSE values for the "Butterworth" filter, the "Wiener" filter, and the filtering based on a projection power spectrum were 0.77, 0.83, and 0.76, respectively. Clinically, brain SPECT images processed with this new restoration filter showed improved contrast. Thus, this filter may be useful in the diagnosis of SPECT images.
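
    A minimal frequency-domain sketch of such a least-squares (Wiener-type) restoration filter is shown below, using a placeholder Gaussian MTF and a constant noise-to-signal ratio rather than the projection-estimated spectra used in the paper.

```python
import numpy as np

def restoration_filter(projection, mtf, noise_to_signal):
    """Wiener-type (least-squares) restoration in the frequency domain:
    F_rest = MTF / (MTF**2 + N/S).  The MTF and noise-to-signal spectrum
    are placeholders; the paper estimates them from the projection itself."""
    proj_ft = np.fft.fft2(projection)
    filt = mtf / (mtf ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(proj_ft * filt))

# Placeholder 64x64 noisy projection, Gaussian MTF, constant noise-to-signal ratio.
n = 64
fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
mtf = np.exp(-(fx ** 2 + fy ** 2) / (2 * 0.15 ** 2))
projection = np.random.poisson(100.0, size=(n, n)).astype(float)
restored = restoration_filter(projection, mtf, noise_to_signal=0.05)
print(restored.shape)
```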

  6. Restoration filtering based on projection power spectrum for single-photon emission computed tomography

    International Nuclear Information System (INIS)

    Kubo, Naoki

    1995-01-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical 'least squares filter' theory, which requires knowledge of the object power spectrum and the noise power spectrum. The object power spectrum is estimated from the power spectrum of a projection, whose high-frequency part is adequately approximated by a polynomial-exponential expression. A study of restoration with the filter based on a projection power spectrum was conducted and compared with 'Butterworth' filtering (cut-off frequency of 0.15 cycles/pixel) and 'Wiener' filtering (with a constant signal-to-noise power spectrum ratio). Normalized mean-squared errors (NMSE) were computed for a phantom consisting of two line sources in a 99mTc-filled cylinder. The NMSE values for the 'Butterworth' filter, the 'Wiener' filter, and the filtering based on a projection power spectrum were 0.77, 0.83, and 0.76, respectively. Clinically, brain SPECT images processed with this new restoration filter showed improved contrast. Thus, this filter may be useful in the diagnosis of SPECT images. (author)

  7. Electromagnetic Modeling of Human Body Using High Performance Computing

    Science.gov (United States)

    Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada

    Realistic simulation of electromagnetic wave propagation in the actual human body can expedite the investigation of wirelessly powering implanted devices from external sources. The parallel electromagnetics code suite ACE3P, developed at SLAC National Accelerator Laboratory, is based on the finite element method for high-fidelity accelerator simulation and can be enhanced to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom has been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.

  8. Computer code validation by high temperature chemistry

    International Nuclear Information System (INIS)

    Alexander, C.A.; Ogden, J.S.

    1988-01-01

    At least five of the computer codes utilized in the analysis of severe fuel damage-type events are directly dependent upon, or can be verified by, high temperature chemistry. These codes are ORIGEN, CORSOR, CORCON, VICTORIA, and VANESA. With the exception of CORCON and VANESA, it is necessary that verification experiments be performed on real irradiated fuel. For ORIGEN, the familiar Knudsen effusion cell is the best choice: a small piece of known mass and known burn-up is selected and volatilized completely into the mass spectrometer. The mass spectrometer is used in the integral mode to integrate the entire signal from preselected radionuclides, and from this integrated signal the total mass of the respective nuclides can be determined. For CORSOR and VICTORIA, flowing high-pressure hydrogen/steam must pass over the irradiated fuel and then enter the mass spectrometer. For these experiments, a high-pressure, high-temperature molecular beam inlet must be employed. Finally, in support of VANESA-CORCON, the very highest temperature and molten fuels must be contained and analyzed. Results from all types of experiments will be discussed and their applicability to present and future code development will also be covered.

  9. Industrial Applications of High Average Power FELS

    CERN Document Server

    Shinn, Michelle D

    2005-01-01

    The use of lasers for material processing continues to expand, and the annual sales of such lasers exceed $1 B (US). Large-scale (many m²) processing of materials requires the economical production of laser powers of tens of kilowatts, and therefore such processes are not yet commercial, although they have been demonstrated. The development of FELs based on superconducting RF (SRF) linac technology provides a scaleable path to laser outputs above 50 kW in the IR, rendering these applications economically viable, since the cost/photon drops as the output power increases. This approach also enables high average power ~1 kW output in the UV spectrum. Such FELs will provide quasi-cw (PRFs in the tens of MHz), ultrafast (pulse width ~1 ps) output with very high beam quality. This talk will provide an overview of applications tests by our facility's users, such as pulsed laser deposition, laser ablation, and laser surface modification, as well as present plans that will be tested with our upgraded FELs. These upg...

  10. Software for computers in safety systems of nuclear power plants

    International Nuclear Information System (INIS)

    Gallagher, J.M.

    1983-01-01

    The application of distributed digital processing techniques to the protection systems of nuclear power plants provides a means to significantly improve the functional capability of the protection system with respect to the operability and availability of the power plant. A major factor in the realization of this improvement is the development and maintenance of essentially error-free software. A joint program for the development of principles for the design, testing and documentation of software to achieve this goal is presented. Results from two separate experiences in the application of these principles in terms of detected software errors are summarized. The low number of errors detected during the verification testing phase demonstrates the effectiveness of the design and documentation principles in the realization of highly reliable software. (author)

  11. Industrial application of high power disk lasers

    Science.gov (United States)

    Brockmann, Rüdiger; Havrilla, David

    2008-02-01

    Laser welding has become one of the fastest growing areas for industrial laser applications. The increasing cost effectiveness of the laser process is enabled by the development of new, highly efficient laser sources, such as the Disk laser, coupled with a decreasing cost per Watt. TRUMPF introduced the Disk laser several years ago, and today it has become the most reliable laser tool on the market. Its excellent beam quality and output powers of up to 10 kW enable applications in the automotive industry as well as in thick-plate welding, such as heavy construction and shipbuilding. This paper gives an overview of the most recent developments of the TRUMPF Disk laser and its industrial applications, including cutting, welding, remote welding and hybrid welding. Future prospects regarding increased power and further improved productivity and economics are presented.

  12. High-field, high-density tokamak power reactor

    International Nuclear Information System (INIS)

    Cohn, D.R.; Cook, D.L.; Hay, R.D.; Kaplan, D.; Kreischer, K.; Lidskii, L.M.; Stephany, W.; Williams, J.E.C.; Jassby, D.L.; Okabayashi, M.

    1977-11-01

    A conceptual design of a compact (R₀ = 6.0 m), high-power-density (average P_f = 7.7 MW/m³) tokamak demonstration power reactor has been developed. High magnetic field (B_t = 7.4 T) and moderate elongation (b/a = 1.6) permit operation at the high density (n(0) ≈ 5 × 10¹⁴ cm⁻³) needed for ignition in a relatively small plasma, with a spatially averaged toroidal beta of only 4%. A unique design for the Nb₃Sn toroidal-field magnet system reduces the stress in the high-field trunk region and allows modularization for simpler disassembly. The modest value of toroidal beta permits a simple, modularized plasma-shaping coil system located inside the TF coil trunk. Heating of the dense central plasma is attained by the use of ripple-assisted injection of 120-keV D⁰ beams. The ripple-coil system also affords dynamic control of the plasma temperature during the burn period. A FLIBE-lithium blanket is designed especially for high-power-density operation in a high-field environment, and gives an overall tritium breeding ratio of 1.05 in the slowly pumped lithium.

  13. The JLab high power ERL light source

    International Nuclear Information System (INIS)

    Neil, G.R.; Behre, C.; Benson, S.V.

    2006-01-01

    A new THz/IR/UV photon source at Jefferson Lab is the first of a new generation of light sources based on an Energy-Recovered (superconducting) Linac (ERL). The machine has a 160 MeV electron beam and an average current of 10 mA in 75 MHz repetition-rate, hundred-femtosecond bunches. These electron bunches pass through a magnetic chicane and therefore emit synchrotron radiation. For wavelengths longer than the electron bunch the electrons radiate coherently a broadband THz ~half-cycle pulse whose average brightness is >5 orders of magnitude higher than synchrotron IR sources. Previous measurements showed 20 W of average power extracted [Carr, et al., Nature 420 (2002) 153]. The new facility offers simultaneous synchrotron light from the visible through the FIR along with broadband THz production of 100 fs pulses with >200 W of average power. The FELs also provide record-breaking laser power [Neil, et al., Phys. Rev. Lett. 84 (2000) 662]: up to 10 kW of average power in the IR from 1 to 14 μm in 400 fs pulses at up to 74.85 MHz repetition rates, and soon will produce similar pulses of 300-1000 nm light at up to 3 kW of average power from the UV FEL. These ultrashort pulses are ideal for maximizing the interaction with material surfaces. The optical beams are Gaussian with nearly perfect beam quality. See www.jlab.org/FEL for details of the operating characteristics; a wide variety of pulse train configurations are feasible, from 10 ms long at high repetition rates to continuous operation. The THz and IR system has been commissioned. The UV system is to follow in 2005. The light is transported to user laboratories for basic and applied research. Additional lasers synchronized to the FEL are also available. Past activities have included production of carbon nanotubes, studies of vibrational relaxation of interstitial hydrogen in silicon, pulsed laser deposition and ablation, nitriding of metals, and energy flow in proteins. This paper will present the status of the system and

  14. The JLab high power ERL light source

    Energy Technology Data Exchange (ETDEWEB)

    G.R. Neil; C. Behre; S.V. Benson; M. Bevins; G. Biallas; J. Boyce; J. Coleman; L.A. Dillon-Townes; D. Douglas; H.F. Dylla; R. Evans; A. Grippo; D. Gruber; J. Gubeli; D. Hardy; C. Hernandez-Garcia; K. Jordan; M.J. Kelley; L. Merminga; J. Mammosser; W. Moore; N. Nishimori; E. Pozdeyev; J. Preble; R. Rimmer; Michelle D. Shinn; T. Siggins; C. Tennant; R. Walker; G.P. Williams and S. Zhang

    2005-03-19

    A new THz/IR/UV photon source at Jefferson Lab is the first of a new generation of light sources based on an Energy-Recovered (superconducting) Linac (ERL). The machine has a 160 MeV electron beam and an average current of 10 mA in 75 MHz repetition-rate, hundred-femtosecond bunches. These electron bunches pass through a magnetic chicane and therefore emit synchrotron radiation. For wavelengths longer than the electron bunch the electrons radiate coherently a broadband THz ~half-cycle pulse whose average brightness is >5 orders of magnitude higher than synchrotron IR sources. Previous measurements showed 20 W of average power extracted [1]. The new facility offers simultaneous synchrotron light from the visible through the FIR along with broadband THz production of 100 fs pulses with >200 W of average power. The FELs also provide record-breaking laser power [2]: up to 10 kW of average power in the IR from 1 to 14 microns in 400 fs pulses at up to 74.85 MHz repetition rates, and soon will produce similar pulses of 300-1000 nm light at up to 3 kW of average power from the UV FEL. These ultrashort pulses are ideal for maximizing the interaction with material surfaces. The optical beams are Gaussian with nearly perfect beam quality. See www.jlab.org/FEL for details of the operating characteristics; a wide variety of pulse train configurations are feasible, from 10 microseconds long at high repetition rates to continuous operation. The THz and IR system has been commissioned. The UV system is to follow in 2005. The light is transported to user laboratories for basic and applied research. Additional lasers synchronized to the FEL are also available. Past activities have included production of carbon nanotubes, studies of vibrational relaxation of interstitial hydrogen in silicon, pulsed laser deposition and ablation, nitriding of metals, and energy flow in proteins. This paper will present the status of the system and discuss some of the discoveries we have made.

  15. Reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Cambridge, MA; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.
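
    The idea can be sketched as follows; the power-control call is a hypothetical placeholder (actual mechanisms such as DVFS are platform specific), and in this simplified sketch power is restored after the collective completes rather than exactly as claimed in the patent.

```python
from mpi4py import MPI

def set_node_power_state(level):
    """Hypothetical placeholder for a hardware power-control interface
    (e.g. lowering CPU/DRAM frequency); real mechanisms are platform specific."""
    pass

def blocking_allreduce_with_power_cap(comm, local_value):
    # Each node reaches the blocking operation asynchronously; reduce power
    # while waiting, perform the collective, then restore full power.
    set_node_power_state("low")
    total = comm.allreduce(local_value, op=MPI.SUM)   # blocking collective
    set_node_power_state("high")
    return total

if __name__ == "__main__":
    comm = MPI.COMM_WORLD
    print(blocking_allreduce_with_power_cap(comm, comm.Get_rank()))
```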

  16. Reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-01-10

    Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.

  17. Computer simulations of high pressure systems

    International Nuclear Information System (INIS)

    Wilkins, M.L.

    1977-01-01

    Numerical methods are capable of solving very difficult problems in solid mechanics and gas dynamics. In the design of engineering structures, critical decisions are possible if the behavior of materials is correctly described in the calculation. Problems of current interest require accurate analysis of stress-strain fields that range from very small elastic displacement to very large plastic deformation. A finite difference program is described that solves problems over this range and in two and three space-dimensions and time. A series of experiments and calculations serve to establish confidence in the plasticity formulation. The program can be used to design high pressure systems where plastic flow occurs. The purpose is to identify material properties, strength and elongation, that meet the operating requirements. An objective is to be able to perform destructive testing on a computer rather than on the engineering structure. Examples of topical interest are given

  18. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state of the art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation, ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems, the book allows readers to compare the performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook for assessing the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  19. The path toward HEP High Performance Computing

    CERN Document Server

    Apostolakis, John; Carminati, Federico; Gheata, Andrei; Wenzel, Sandro

    2014-01-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves between 10% and 50% of peak performance. Although several successful attempts have been made to port selected codes to GPUs, no major HEP code suite has a 'High Performance' implementation. With the LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and has to try to make the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the Root and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on th...

  20. What Physicists Should Know About High Performance Computing - Circa 2002

    Science.gov (United States)

    Frederick, Donald

    2002-08-01

    High Performance Computing (HPC) is a dynamic, cross-disciplinary field that traditionally has involved applied mathematicians, computer scientists, and others, primarily from the disciplines that have been major users of HPC resources: physics, chemistry, and engineering, with increasing use by those in the life sciences. There is a technological dynamic that is powered by economic as well as technical innovations and developments. This talk will discuss practical ideas to be considered when developing numerical applications for research purposes. Even with the rapid pace of development in the field, the author believes that these concepts will not become obsolete for a while, and will be of use to scientists who either are considering, or who have already started down, the HPC path. These principles will be applied in particular to current parallel HPC systems, but there will also be references of value to desktop users. The talk will cover such topics as: computing hardware basics, single-CPU optimization, compilers, timing, numerical libraries, debugging and profiling tools, and the emergence of Computational Grids.

  1. Application of high power microwave vacuum electron devices

    International Nuclear Information System (INIS)

    Ding Yaogen; Liu Pukun; Zhang Zhaochuan; Wang Yong; Shen Bin

    2011-01-01

    High power microwave vacuum electron devices can work at high frequency and at high peak and average power. They have been widely used in military and civil microwave electronic systems, such as radar, communication, countermeasures, TV broadcast, particle accelerators, plasma heating devices for fusion, microwave sensing and microwave heating. In scientific research, high power microwave vacuum electron devices are used mainly in high energy particle accelerators and fusion research; the devices include high peak power klystrons, CW and long-pulse high power klystrons, multi-beam klystrons, and high power gyrotrons. In the national economy, they are used mainly in weather and navigation radar, medical and radiation accelerators, and TV broadcast and communication systems; the devices include high power pulse and CW klystrons, extended interaction klystrons, traveling wave tubes (TWT), magnetrons and inductive output tubes (IOT). The state of the art, common technology problems and trends of high power microwave vacuum electron devices are introduced in this paper. (authors)

  2. The numerical computation of seismic fragility of base-isolated Nuclear Power Plants buildings

    International Nuclear Information System (INIS)

    Perotti, Federico; Domaneschi, Marco; De Grandis, Silvia

    2013-01-01

    Highlights: • Seismic fragility of structural components in base-isolated NPPs is computed. • Dynamic integration, Response Surface, FORM and Monte Carlo simulation are adopted. • A refined approach for modeling the non-linear behavior of isolators is proposed. • Beyond-design conditions are addressed. • The preliminary design of the isolated IRIS is the application of the procedure. -- Abstract: The research work described here is devoted to the development of a numerical procedure for the computation of seismic fragilities for equipment and structural components in Nuclear Power Plants; in particular, reference is made in the present paper to the case of isolated buildings. The proposed procedure for fragility computation makes use of the Response Surface Methodology to model the influence of the random variables on the dynamic response. To account for stochastic loading, the latter is computed by means of a simulation procedure. Given the response surface, the Monte Carlo method is used to compute the failure probability. The procedure is here applied to the preliminary design of the Nuclear Power Plant reactor building within the International Reactor Innovative and Secure (IRIS) international project; the building is equipped with a base isolation system based on the introduction of High Damping Rubber Bearing elements showing a markedly non-linear mechanical behavior. The fragility analysis is performed assuming that the isolation devices become the critical elements in terms of seismic risk and that, once base isolation is introduced, the dynamic behavior of the building can be captured by low-dimensional numerical models.
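
    The last step of the procedure, Monte Carlo evaluation of the failure probability on a fitted response surface, can be sketched as follows; the quadratic surface and the capacity threshold below are made up for illustration and do not represent the IRIS isolators.

```python
import numpy as np

rng = np.random.default_rng(0)

def displacement_demand(pga, x):
    """Made-up quadratic response surface: isolator displacement demand (m) as
    a function of peak ground acceleration (g) and standard-normal variables x."""
    return 0.05 + 0.30 * pga + 0.10 * pga ** 2 + 0.08 * x[:, 0] + 0.04 * x[:, 1] * pga

def failure_probability(pga, capacity=0.45, n_samples=100_000):
    """Monte Carlo estimate of P(demand > capacity) at a given PGA level."""
    x = rng.standard_normal((n_samples, 2))
    return float(np.mean(displacement_demand(pga, x) > capacity))

# Evaluating this over a range of PGA levels traces out a fragility curve.
for pga in (0.3, 0.6, 0.9, 1.2):
    print(f"PGA = {pga:.1f} g  ->  P(failure) = {failure_probability(pga):.3f}")
```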

  3. Interactive simulation of nuclear power systems using a dedicated minicomputer - computer graphics facility

    International Nuclear Information System (INIS)

    Tye, C.; Sezgen, A.O.

    1980-01-01

    The design of control systems and operational procedures for large-scale nuclear power plant poses a difficult optimization problem requiring considerable computational effort. Plant dynamic simulation using digital minicomputers offers the prospect of relatively low-cost computing and, when combined with graphical input/output, provides a powerful tool for studying such problems. The paper discusses the results obtained from a simulation study carried out at the Computer Graphics Unit of the University of Manchester using a typical station control model for an Advanced Gas-cooled Reactor. Particular emphasis is placed on the use of computer graphics for information display, parameter and control system optimization, and techniques for using graphical input to define and/or modify the control system topology. Experience gained from this study has shown that a relatively modest minicomputer system can be used for simulating large-scale dynamic systems and that highly interactive computer graphics can be used to advantage to relieve the designer of many of the tedious aspects of simulation, leaving him free to concentrate on the more creative aspects of his work. (author)

  4. Novel miniature high power ring filter

    International Nuclear Information System (INIS)

    Huang Huifen; Mao Junfa; Luo Zhihua

    2005-01-01

    The power handling capability of high temperature superconducting (HTS) filters is limited due to current concentration at the edges of the superconducting films. This problem can be overcome by using a ring resonator, which is free of edge currents and reduces the current concentration. However, this kind of filter has a large size. In order to reduce the cost and size and increase the power handling capability, a HTS photonic bandgap (PBG) structure filter is developed in this paper. The proposed passband filter with PBG structure exhibits a center frequency of 12.23 GHz, a steep roll-off (about 35 dB/GHz), a -3 dB bandwidth of 0.045 GHz, and a low insertion loss (about -0.5 dB), and can handle input power up to 1 W (this value was limited by the measurement instrument used in the experiment). The size is reduced by 25% and the insertion loss by 37.5%, and a steeper roll-off is also obtained, compared with published literature.

  5. High-power converters and AC drives

    CERN Document Server

    Wu, Bin

    2017-01-01

    This new edition reflects recent technological advancements in the MV drive industry, such as advanced multilevel converters and drive configurations. It includes three new chapters: Control of Synchronous Motor Drives, Transformerless MV Drives, and Matrix Converter Fed Drives. In addition, there are extensively revised chapters on Multilevel Voltage Source Inverters and Voltage Source Inverter-Fed Drives. The book includes a systematic analysis of a variety of high-power multilevel converters, illustrates important concepts with simulations and experiments, introduces various megawatt drives produced by world-leading drive manufacturers, and addresses practical problems and their mitigation methods.

  6. Compulsator, a high power compensated pulsed alternator

    International Nuclear Information System (INIS)

    Weldon, W.F.; Bird, W.L.; Driga, M.D.; Rylander, H.G.; Tolk, K.M.; Woodson, H.H.

    1983-01-01

    This chapter describes a pulsed power supply utilizing inertial energy storage as a possible replacement for large capacitor banks. The compulsator overcomes many of the limitations of the pulsed homopolar generators previously developed by the Center for Electromechanics and elsewhere, in that it offers high voltage (tens of kV) and consequently faster pulse rise times, is self-commutating, and offers the possibility of generating repetitive pulses. The compulsator converts rotational inertial energy directly into electrical energy, utilizing the principles of both magnetic induction and flux compression. The theory of operation, a prototype compulsator design, and advanced compulsator designs are discussed.

  7. Cost optimisation studies of high power accelerators

    Energy Technology Data Exchange (ETDEWEB)

    McAdams, R.; Nightingale, M.P.S.; Godden, D. [AEA Technology, Oxon (United Kingdom)] [and others

    1995-10-01

    Cost optimisation studies are carried out for an accelerator-based neutron source consisting of a series of linear accelerators. The characteristics of the lowest-cost design for a machine of given beam current and energy, such as power and length, are found to depend on the lifetime envisaged for it. For a fixed neutron yield it is preferable to have a low-current, high-energy machine. The benefits of superconducting technology are also investigated. A Separated Orbit Cyclotron (SOC) has the potential to reduce capital and operating costs, and initial estimates for the transverse and longitudinal current limits of such machines are made.

  8. Problem-Oriented Simulation Packages and Computational Infrastructure for Numerical Studies of Powerful Gyrotrons

    International Nuclear Information System (INIS)

    Damyanova, M; Sabchevski, S; Vasileva, E; Balabanova, E; Zhelyazkov, I; Dankov, P; Malinov, P

    2016-01-01

    Powerful gyrotrons are necessary as sources of strong microwaves for electron cyclotron resonance heating (ECRH) and electron cyclotron current drive (ECCD) of magnetically confined plasmas in various reactors (most notably ITER) for controlled thermonuclear fusion. Adequate physical models and efficient problem-oriented software packages are essential tools for numerical studies, analysis, optimization and computer-aided design (CAD) of such high-performance gyrotrons operating in a CW mode and delivering output power of the order of 1-2 MW. In this report we present the current status of our simulation tools (physical models, numerical codes, pre- and post-processing programs, etc.) as well as the computational infrastructure on which they are being developed, maintained and executed. (paper)

  9. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  10. Highly parallel machines and future of scientific computing

    International Nuclear Information System (INIS)

    Singh, G.S.

    1992-01-01

    The computing requirements of large-scale scientific computing have always been ahead of what the state-of-the-art hardware of the day could supply in the form of supercomputers. For any single-processor system, the limit to growth in computing power was recognized a few years ago. Now, with the advent of parallel computing systems, the availability of machines with the required computing power seems a reality. In this paper the author tries to visualize the future of large-scale scientific computing in the penultimate decade of the present century. The author summarizes trends in parallel computers and emphasizes the need for a better programming environment and software tools for optimal performance. The paper concludes with a critique of parallel architectures, software tools and algorithms. (author). 10 refs., 2 tabs

  11. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  12. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  13. Computing an operating parameter of a unified power flow controller

    Science.gov (United States)

    Wilson, David G.; Robinett, III, Rush D.

    2017-12-26

    A Unified Power Flow Controller described herein comprises a sensor that outputs at least one sensed condition, a processor that receives the at least one sensed condition, a memory that comprises control logic that is executable by the processor; and power electronics that comprise power storage, wherein the processor causes the power electronics to selectively cause the power storage to act as one of a power generator or a load based at least in part upon the at least one sensed condition output by the sensor and the control logic, and wherein at least one operating parameter of the power electronics is designed to facilitate maximal transmittal of electrical power generated at a variable power generation system to a grid system while meeting power constraints set forth by the electrical power grid.
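
    The decision logic described in the claim can be caricatured as a simple threshold rule on a sensed condition; the quantity, thresholds and mode names below are illustrative only and are not the patented control logic.

```python
def storage_mode(sensed_frequency_hz, nominal_hz=60.0, deadband_hz=0.05):
    """Illustrative threshold rule: when the grid frequency sags, the storage
    acts as a generator (discharges); when it rises, the storage acts as a load
    (charges); inside the deadband it idles.  Not the patented control logic."""
    error = sensed_frequency_hz - nominal_hz
    if error < -deadband_hz:
        return "generator"   # inject stored power into the grid
    if error > deadband_hz:
        return "load"        # absorb excess power from the grid
    return "idle"

for f in (59.90, 59.98, 60.08):
    print(f, "->", storage_mode(f))
```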

  14. Computing an operating parameter of a unified power flow controller

    Science.gov (United States)

    Wilson, David G; Robinett, III, Rush D

    2015-01-06

    A Unified Power Flow Controller described herein comprises a sensor that outputs at least one sensed condition, a processor that receives the at least one sensed condition, a memory that comprises control logic that is executable by the processor; and power electronics that comprise power storage, wherein the processor causes the power electronics to selectively cause the power storage to act as one of a power generator or a load based at least in part upon the at least one sensed condition output by the sensor and the control logic, and wherein at least one operating parameter of the power electronics is designed to facilitate maximal transmittal of electrical power generated at a variable power generation system to a grid system while meeting power constraints set forth by the electrical power grid.

  15. Modeling and numerical techniques for high-speed digital simulation of nuclear power plants

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.

    1987-01-01

    Conventional computing methods are contrasted with newly developed high-speed and low-cost computing techniques for simulating normal and accidental transients in nuclear power plants. Six principles are formulated for cost-effective high-fidelity simulation with emphasis on modeling of transient two-phase flow coolant dynamics in nuclear reactors. Available computing architectures are characterized. It is shown that the combination of the newly developed modeling and computing principles with the use of existing special-purpose peripheral processors is capable of achieving low-cost and high-speed simulation with high fidelity and outstanding user convenience, suitable for detailed reactor plant response analyses

  16. Computational Analysis of Powered Lift Augmentation for the LEAPTech Distributed Electric Propulsion Wing

    Science.gov (United States)

    Deere, Karen A.; Viken, Sally A.; Carter, Melissa B.; Viken, Jeffrey K.; Wiese, Michael R.; Farr, Norma L.

    2017-01-01

    A computational study of a distributed electric propulsion wing with a 40deg flap deflection has been completed using FUN3D. Two lift-augmentation power conditions were compared with the power-off configuration on the high-lift wing (40deg flap) at a 73 mph freestream flow and for a range of angles of attack from -5 degrees to 14 degrees. The computational study also included investigating the benefit to powered-lift performance of corotating versus counter-rotating propeller spin direction. The results indicate a large benefit in lift coefficient, over the entire range of angle of attack studied, from using corotating propellers that all spin counter to the wingtip vortex. For the landing condition, 73 mph, the unpowered 40deg flap configuration achieved a maximum lift coefficient of 2.3. With high-lift blowing the maximum lift coefficient increased to 5.61. Therefore, the lift augmentation is a factor of 2.4. Taking advantage of the full-span lift augmentation at similar performance means that a wing powered with the distributed electric propulsion system requires only 42 percent of the wing area of the unpowered wing. This technology will allow wings to be 'cruise optimized', meaning that they will be able to fly closer to maximum lift-to-drag conditions at the design cruise speed of the aircraft.
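
    A quick arithmetic check of the figures quoted in this record (a worked example using only the numbers given in the abstract):

```python
# Worked check of the lift-augmentation figures quoted in the abstract above.
cl_max_unpowered = 2.3    # 40 deg flap, power off, 73 mph
cl_max_blown     = 5.61   # 40 deg flap with high-lift blowing

print(f"lift augmentation factor ~ {cl_max_blown / cl_max_unpowered:.1f}")     # ~2.4

# At a fixed landing speed and weight, required wing area scales as 1/CLmax,
# so the blown wing needs roughly this fraction of the unpowered wing area:
print(f"required wing area fraction ~ {cl_max_unpowered / cl_max_blown:.0%}")  # ~41-42 %
```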

  17. A new generation drilling rig: hydraulically powered and computer controlled

    Energy Technology Data Exchange (ETDEWEB)

    Laurent, M.; Angman, P.; Oveson, D. [Tesco Corp., Calgary, AB (Canada)]

    1999-11-01

    The development, testing and operation of a new generation of hydraulically powered and computer controlled drilling rig that incorporates a number of features enhancing functionality and productivity is described. The rig features modular construction, a large heated common drilling machinery room, and permanently mounted draw works which, along with the permanently installed top drive, significantly reduce rig-up/rig-down time. Also featured are closed and open hydraulic systems and a unique hydraulic distribution manifold. All functions are controlled through a programmable logic controller (PLC), providing almost unlimited interlocks and calculations to increase rig safety and efficiency. Simplified diagnostic routines, remote monitoring and troubleshooting are also part of the system. To date, two rigs are in operation. The performance of both rigs has been rated as 'very good'. Few operational problems have been experienced; downtime has averaged 0.61 per cent since August 1998, when the first of the two rigs went into operation. The most important future application for this rig is with the casing drilling process, which eliminates the need for drill pipe and tripping and also reduces drilling time lost to unscheduled events such as reaming, fishing and taking kicks while tripping. 1 tab., 6 figs.

  18. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    International Nuclear Information System (INIS)

    Mike Bockelie; Dave Swensen; Martin Denison

    2002-01-01

    This is the fifth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, our efforts have become focused on developing an improved workbench for simulating a gasifier based Vision 21 energyplex. To provide for interoperability of models developed under Vision 21 and other DOE programs, discussions have been held with DOE and other organizations developing plant simulator tools to review the possibility of establishing a common software interface or protocol to use when developing component models. A component model that employs the CCA protocol has successfully been interfaced to our CCA enabled workbench. To investigate the software protocol issue, DOE has selected a gasifier based Vision 21 energyplex configuration for use in testing and evaluating the impacts of different software interface methods. A Memo of Understanding with the Cooperative Research Centre for Coal in Sustainable Development (CCSD) in Australia has been completed that will enable collaborative research efforts on gasification issues. Preliminary results have been obtained for a CFD model of a pilot scale, entrained flow gasifier. A paper was presented at the Vision 21 Program Review Meeting at NETL (Morgantown) that summarized our accomplishments for Year One and plans for Year Two and Year Three

  19. Computer based aids for operator support in nuclear power plants

    International Nuclear Information System (INIS)

    1990-04-01

    In the framework of the Agency's programme on nuclear safety, a survey was carried out based on a questionnaire to collect information on computer based aids for operator support in nuclear power plants in Member States. The intention was to put together a state-of-the-art report in which different systems under development or already implemented would be described. This activity was also supported by an INSAG (International Nuclear Safety Advisory Group) recommendation. Two consultants' meetings were convened and their work is reflected in the two sections of the technical document. The first section, produced during the first meeting, is devoted to providing some general background material on the overall usability of Computerized Operator Decision Aids (CODAs), their advantages and shortcomings. During this first meeting, the first draft of the questionnaire was also produced. The second section presents the evaluation of the 40 questionnaires received from 11 Member States and comprises a short description of each system and some statistical and comparative observations. The ultimate goal of this activity was to inform Member States, particularly those considering implementation of a CODA, about the status of related developments elsewhere. 8 refs, 10 figs, 4 tabs

  20. High burnup models in computer code FAIR

    Energy Technology Data Exchange (ETDEWEB)

    Dutta, B K; Swami Prasad, P; Kushwaha, H S; Mahajan, S C; Kakodar, A [Bhabha Atomic Research Centre, Bombay (India)]

    1997-08-01

    An advanced fuel analysis code FAIR has been developed for analyzing the behavior of fuel rods of water cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins of both collapsible clad, as in PHWR, and free standing clad, as in LWR. The main emphasis in the development of this code is on evaluating fuel performance at extended burnups and modelling of the fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling the fission gas release, three different models are implemented, namely a physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly, the pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled by using the model RADAR. For modelling pellet clad interaction (PCMI)/stress corrosion cracking (SCC) induced failure of the sheath, the necessary routines are provided in FAIR. The validation of the code FAIR is based on the analysis of fuel rods of the EPRI project 'Light water reactor fuel rod modelling code evaluation' and also the analytical simulation of the threshold power ramp criteria of fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models on the fuel analysis parameters. The satisfactory performance of FAIR may be concluded from these case studies. (author). 12 refs, 5 figs.

  1. High burnup models in computer code FAIR

    International Nuclear Information System (INIS)

    Dutta, B.K.; Swami Prasad, P.; Kushwaha, H.S.; Mahajan, S.C.; Kakodar, A.

    1997-01-01

    An advanced fuel analysis code FAIR has been developed for analyzing the behavior of fuel rods of water cooled reactors under severe power transients and high burnups. The code is capable of analyzing fuel pins of both collapsible clad, as in PHWR, and free standing clad, as in LWR. The main emphasis in the development of this code is on evaluating fuel performance at extended burnups and modelling of the fuel rods for advanced fuel cycles. For this purpose, a number of suitable models have been incorporated in FAIR. For modelling the fission gas release, three different models are implemented, namely a physically based mechanistic model, the standard ANS 5.4 model and the Halden model. Similarly, the pellet thermal conductivity can be modelled by the MATPRO equation, the SIMFUEL relation or the Halden equation. The flux distribution across the pellet is modelled by using the model RADAR. For modelling pellet clad interaction (PCMI)/stress corrosion cracking (SCC) induced failure of the sheath, the necessary routines are provided in FAIR. The validation of the code FAIR is based on the analysis of fuel rods of the EPRI project 'Light water reactor fuel rod modelling code evaluation' and also the analytical simulation of the threshold power ramp criteria of fuel rods of pressurized heavy water reactors. In the present work, a study is carried out by analysing three CRP-FUMEX rods to show the effect of various combinations of fission gas release models and pellet conductivity models on the fuel analysis parameters. The satisfactory performance of FAIR may be concluded from these case studies. (author). 12 refs, 5 figs

  2. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Yier [Univ. of Central Florida, Orlando, FL (United States)]

    2017-07-14

    As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified, data are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both the private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project address these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduce HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We test HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.

  3. High-Power Ion Thruster Technology

    Science.gov (United States)

    Beattie, J. R.; Matossian, J. N.

    1996-01-01

    Performance data are presented for the NASA/Hughes 30-cm-diam 'common' thruster operated over the power range from 600 W to 4.6 kW. At the 4.6-kW power level, the thruster produces 172 mN of thrust at a specific impulse of just under 4000 s. Xenon pressure and temperature measurements are presented for a 6.4-mm-diam hollow cathode operated at emission currents ranging from 5 to 30 A and flow rates of 4 sccm and 8 sccm. Highly reproducible results show that the cathode temperature is a linear function of emission current, ranging from approx. 1000 C to 1150 C over this same current range. Laser-induced fluorescence (LIF) measurements obtained from a 30-cm-diam thruster are presented, suggesting that LIF could be a valuable diagnostic for real-time assessment of accelerator-grid erosion. Calibration results of laminar-thin-film (LTF) erosion badges with bulk molybdenum are presented for 300-eV xenon, krypton, and argon sputtering ions. Facility-pressure effects on the charge-exchange ion current collected by 8-cm-diam and 30-cm-diam thrusters operated on xenon propellant are presented to show that accel current is nearly independent of facility pressure at low pressures, but increases rapidly under high-background-pressure conditions.

  4. Development and application of project management computer system in nuclear power station

    International Nuclear Information System (INIS)

    Chen Junpu

    2000-01-01

    Based on experience from the construction of the Daya Bay and Lingao nuclear power plants, the necessity of using computers for project management and their application in nuclear power engineering projects are explained

  5. Reducing power consumption while performing collective operations on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J [Rochester, MN]; Blocksome, Michael A [Rochester, MN]; Peters, Amanda E [Rochester, MN]; Ratterman, Joseph D [Rochester, MN]; Smith, Brian E [Rochester, MN]

    2011-10-18

    Methods, apparatus, and products are disclosed for reducing power consumption while performing collective operations on a plurality of compute nodes that include: receiving, by each compute node, instructions to perform a type of collective operation; selecting, by each compute node from a plurality of collective operations for the collective operation type, a particular collective operation in dependence upon power consumption characteristics for each of the plurality of collective operations; and executing, by each compute node, the selected collective operation.
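
    The selection step described in this record can be illustrated with a short sketch: given a requested collective type, each node picks the implementation variant with the lowest estimated power draw. The table of power characteristics below is a made-up placeholder, not data from the patent.

```python
# Illustrative sketch of power-aware collective selection; the numbers are placeholders.
POWER_CHARACTERISTICS_W = {
    # collective type -> {implementation variant: estimated power draw in watts}
    "allreduce": {"recursive_doubling": 42.0, "ring": 35.5, "tree": 38.2},
    "broadcast": {"binomial_tree": 20.1, "scatter_allgather": 24.7},
}

def select_collective(collective_type: str) -> str:
    """Return the variant of the requested collective with the lowest power estimate."""
    variants = POWER_CHARACTERISTICS_W[collective_type]
    return min(variants, key=variants.get)

print(select_collective("allreduce"))   # -> "ring"
```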

  6. Personal computers in high energy physics

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1987-01-01

    The role of personal computers within HEP is expanding as their capabilities increase and their cost decreases. Already they offer greater flexibility than many low-cost graphics terminals for a comparable cost and in addition they can significantly increase the productivity of physicists and programmers. This talk will discuss existing uses for personal computers and explore possible future directions for their integration into the overall computing environment. (orig.)

  7. Computing Confidence Bounds for Power and Sample Size of the General Linear Univariate Model

    OpenAIRE

    Taylor, Douglas J.; Muller, Keith E.

    1995-01-01

    The power of a test, the probability of rejecting the null hypothesis in favor of an alternative, may be computed using estimates of one or more distributional parameters. Statisticians frequently fix mean values and calculate power or sample size using a variance estimate from an existing study. Hence computed power becomes a random variable for a fixed sample size. Likewise, the sample size necessary to achieve a fixed power varies randomly. Standard statistical practice requires reporting ...
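
    A minimal sketch of the idea in this record: power computed from an estimated variance is itself uncertain, so a confidence interval on the variance can be propagated into a confidence interval on power. A one-sample t-test is shown for simplicity (the paper treats the general linear univariate model), and all numbers are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def t_test_power(effect, sigma, n, alpha=0.05):
    """Power of a two-sided one-sample t-test for a mean shift `effect`."""
    df = n - 1
    ncp = effect / (sigma / np.sqrt(n))          # noncentrality parameter
    tcrit = stats.t.ppf(1 - alpha / 2, df)
    return (1 - stats.nct.cdf(tcrit, df, ncp)) + stats.nct.cdf(-tcrit, df, ncp)

s2, df0 = 4.0, 20        # variance estimate and its degrees of freedom (assumed)
effect, n = 1.2, 30      # planned effect size and sample size (assumed)

# Chi-square confidence interval for the true variance, then plug into power.
var_lo = df0 * s2 / stats.chi2.ppf(0.975, df0)
var_hi = df0 * s2 / stats.chi2.ppf(0.025, df0)

print("power at the variance point estimate:",
      round(t_test_power(effect, np.sqrt(s2), n), 3))
print("power bounds:", round(t_test_power(effect, np.sqrt(var_hi), n), 3),
      "to", round(t_test_power(effect, np.sqrt(var_lo), n), 3))
```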

  8. WORK SYSTEM ANALYSIS OF POWER SUPPLY IN OPTIMIZING ELECTRICITY ON PERSONAL COMPUTER (PC

    Directory of Open Access Journals (Sweden)

    Sudarmaji Sudarmaji

    2017-12-01

    Full Text Available The DC power supply is the energy source that allows a computer to operate. It converts AC input at 110 volts/60 Hz or 220 volts/50 Hz into DC outputs of +3.3 volts, +5 volts and +12 volts, and it must deliver a stable DC supply so that the system can run well. Some components run on voltages provided by onboard regulators; for example, RIMM modules require 2.5 volts while AGP cards require 1.5 volts, both supplied by the regulator on the motherboard. In addition to supplying power, the power supply keeps the computer from starting until its output voltages are within a predetermined range. Power Good is a special test signal sent to the motherboard as an active signal, usually indicated by a green light when the power button is pressed. The output of the power supply is direct current (DC), with ratings ranging from 200 watts, 250 watts, 300 watts, 350 watts and 400 watts up to 600 watts; computers with Intel Pentium 4 processors and above use 380 watts to 450 watts. Keywords: Power Supply, Computer, DC, Power Good, and volt

  9. Development of superconductor electronics technology for high-end computing

    Energy Technology Data Exchange (ETDEWEB)

    Silver, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kleinsasser, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kerber, G [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Herr, Q [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Dorojevets, M [Department of Electrical and Computer Engineering, SUNY-Stony Brook, NY 11794-2350 (United States); Bunyk, P [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Abelson, L [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States)

    2003-12-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm⁻², 1.25 μm junction process, incorporated new CAD tools into our methodology, and demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s⁻¹, both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density.

  10. Development of superconductor electronics technology for high-end computing

    International Nuclear Information System (INIS)

    Silver, A; Kleinsasser, A; Kerber, G; Herr, Q; Dorojevets, M; Bunyk, P; Abelson, L

    2003-01-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm⁻², 1.25 μm junction process, incorporated new CAD tools into our methodology, and demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s⁻¹, both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density.

  11. 14 CFR 101.25 - Operating limitations for Class 2-High Power Rockets and Class 3-Advanced High Power Rockets.

    Science.gov (United States)

    2010-01-01

    Operating limitations for Class 2-High Power Rockets and Class 3-Advanced High Power Rockets. 14 CFR 101.25, Aeronautics and Space, Operating Rules for Moored Balloons, Kites, Amateur Rockets and Unmanned Free Balloons, Amateur Rockets. When operating...

  12. Optimal Operation of Plug-In Electric Vehicles in Power Systems with High Wind Power Penetrations

    DEFF Research Database (Denmark)

    Hu, Weihao; Su, Chi; Chen, Zhe

    2013-01-01

    The Danish power system has a large penetration of wind power. The wind fluctuation causes a high variation in the power generation, which must be balanced by other sources. Battery storage based Plug-In Electric Vehicles (PEV) may be a possible solution to balance the wind power variations in power systems with high wind power penetrations. In this paper, the integration of plug-in electric vehicles in power systems with high wind power penetrations is proposed and discussed. Optimal operation strategies of PEV in the spot market are proposed in order to decrease the energy cost for PEV owners. Furthermore, the application of battery storage based aggregated PEV is analyzed as a regulation services provider in the power system with high wind power penetrations. The western Danish power system, where the total share of annual wind power production is more than 27% of the electrical energy...

  13. Computational aspects in high intensity ultrasonic surgery planning.

    Science.gov (United States)

    Pulkkinen, A; Hynynen, K

    2010-01-01

    Therapeutic ultrasound treatment planning is discussed and computational aspects regarding it are reviewed. Nonlinear ultrasound simulations were solved with a combined frequency domain Rayleigh and KZK model. Ultrasonic simulations were combined with thermal simulations and were used to compute heating of muscle tissue in vivo for four different focused ultrasound transducers. The simulations were compared with measurements and good agreement was found for large F-number transducers. However, at F# 1.9 the simulated rate of temperature rise was approximately a factor of 2 higher than the measured one. The power levels used with the F# 1 transducer were too low to show any nonlinearity. The simulations were used to investigate the importance of nonlinearities generated in the coupling water, and also the importance of including skin in the simulations. Ignoring either of these in the model would lead to larger errors. Most notably, the nonlinearities generated in the water can enhance the focal temperature by more than 100%. The simulations also demonstrated that pulsed high power sonications may provide an opportunity to significantly (up to a factor of 3) reduce the treatment time. In conclusion, nonlinear propagation can play an important role in shaping the energy distribution during a focused ultrasound treatment and it should not be ignored in planning. However, the current simulation methods are accurate only with relatively large F-numbers and better models need to be developed for sharply focused transducers. Copyright 2009 Elsevier Ltd. All rights reserved.

  14. NINJA: Java for High Performance Numerical Computing

    Directory of Open Access Journals (Sweden)

    José E. Moreira

    2002-01-01

    Full Text Available When Java was first introduced, there was a perception that its many benefits came at a significant performance cost. In the particularly performance-sensitive field of numerical computing, initial measurements indicated a hundred-fold performance disadvantage between Java and more established languages such as Fortran and C. Although much progress has been made, and Java now can be competitive with C/C++ in many important situations, significant performance challenges remain. Existing Java virtual machines are not yet capable of performing the advanced loop transformations and automatic parallelization that are now common in state-of-the-art Fortran compilers. Java also has difficulties in implementing complex arithmetic efficiently. These performance deficiencies can be attacked with a combination of class libraries (packages, in Java) that implement truly multidimensional arrays and complex numbers, and new compiler techniques that exploit the properties of these class libraries to enable other, more conventional, optimizations. Two compiler techniques, versioning and semantic expansion, can be leveraged to allow fully automatic optimization and parallelization of Java code. Our measurements with the NINJA prototype Java environment show that Java can be competitive in performance with highly optimized and tuned Fortran code.

  15. High resolution computed tomography of positron emitters

    International Nuclear Information System (INIS)

    Derenzo, S.E.; Budinger, T.F.; Cahoon, J.L.; Huesman, R.H.; Jackson, H.G.

    1976-10-01

    High resolution computed transaxial radionuclide tomography has been performed on phantoms containing positron-emitting isotopes. The imaging system consisted of two opposing groups of eight NaI(Tl) crystals 8 mm x 30 mm x 50 mm deep and the phantoms were rotated to measure coincident events along 8960 projection integrals as they would be measured by a 280-crystal ring system now under construction. The spatial resolution in the reconstructed images is 7.5 mm FWHM at the center of the ring and approximately 11 mm FWHM at a radius of 10 cm. We present measurements of imaging and background rates under various operating conditions. Based on these measurements, the full 280-crystal system will image 10,000 events per sec with 400 μCi in a section 1 cm thick and 20 cm in diameter. We show that 1.5 million events are sufficient to reliably image 3.5-mm hot spots with 14-mm center-to-center spacing and isolated 9-mm diameter cold spots in phantoms 15 to 20 cm in diameter

  16. Concept for high speed computer printer

    Science.gov (United States)

    Stephens, J. W.

    1970-01-01

    Printer uses Kerr cell as light shutter for controlling the print on photosensitive paper. Applied to output data transfer, the information transfer rate of graphic computer printers could be increased to speeds approaching the data transfer rate of computer central processors (5000 to 10,000 lines per minute).

  17. High power diode laser remelting of metals

    International Nuclear Information System (INIS)

    Chmelickova, H; Tomastik, J; Ctvrtlik, R; Supik, J; Nemecek, S; Misek, M

    2014-01-01

    This article is focused on the laser surface remelting of steel samples with a predefined overlapping of the laser spots. The goal of our experimental work was to evaluate microstructure and hardness both in the overlapped zone and in single-pass zones for three kinds of ferrous metals with different carbon content: cast iron, non-alloy structural steel and tool steel. A high power fibre coupled diode laser Laserline LDF 3600-100 was used with a robot-guided processing head equipped with a laser beam homogenizer that creates a rectangular beam shape with uniform intensity distribution. Each sample was treated with identical process parameters - laser power, beam diameter, focus position, speed of motion and 40% spot overlap. Dimensions and structures of the remelted zone, the zone of partial melting, the heat affected zone and the base material were detected and measured by means of laser scanning and optical microscopes. The hardness progression along the vertical axis of the overlapped zone, from the remelted surface layer to the base material, was measured and compared with the hardness of the single spots. The greatest hardness increase was found for cast iron, the least for structural steel. The experimental results will be used to optimize processing parameters for each tested material separately.

  18. High resolving power spectrometer for beam analysis

    International Nuclear Information System (INIS)

    Moshammer, H.W.; Spencer, J.E.

    1992-03-01

    We describe a system designed to analyze the high energy, closely spaced bunches from individual RF pulses. Neither a large solid angle nor momentum range is required, so this allows characteristics that appear useful for other applications such as ion beam lithography. The spectrometer is a compact, double-focusing QBQ design whose symmetry allows the Quads to range between F or D with a correspondingly large range of magnifications, dispersion and resolving power. This flexibility ensures the possibility of spatially separating all of the bunches along the focal plane with minimal transverse kicks and bending angle for differing input conditions. The symmetry of the system allows a simple geometric interpretation of the resolving power in terms of thin lenses and ray optics. We discuss the optics and the hardware that is proposed to measure emittance, energy, energy spread and bunch length for each bunch in an RF pulse train for small bunch separations. We also discuss how to use such measurements for feedback and feedforward control of these bunch characteristics as well as maintain their stability. 2 refs

  19. Improved Collectors for High Power Gyrotrons

    International Nuclear Information System (INIS)

    Ives, R. Lawrence; Singh, Amarjit; Read, Michael; Borchard, Philipp; Neilson, Jeff

    2009-01-01

    High power gyrotrons are used for electron cyclotron heating, current drive and parasitic mode suppression in tokamaks for fusion energy research. These devices are crucial for the successful operation of many research programs around the world, including the ITER program currently being constructed in France. Recent gyrotron failures resulted from cyclic fatigue of the copper material used to fabricate the collectors. The technique used to collect the spent beam power is common in many gyrotrons produced around the world, and there is serious concern that these tubes may also be at risk from cyclic fatigue. This program addresses the cause of the collector failure. The Phase I program successfully demonstrated the feasibility of a mode of operation that eliminates the cyclic operation that caused the failure. It also demonstrated that new material can provide increased lifetime under cyclic operation, potentially by more than one order of magnitude. The Phase II program will complete that research and develop a collector that eliminates the fatigue failures. Such a design would find application around the world.

  20. Analysis and control of high power synchronous rectifier

    Energy Technology Data Exchange (ETDEWEB)

    Singh Tejinder.

    1993-01-01

    The description, steady state/dynamic analysis and control design of a high power synchronous rectifier is presented. The proposed rectifier system exploits selective harmonic elimination modulation techniques to minimize filtering requirements, and overcomes the dc voltage limitations of prior art equipment. A detailed derivation of the optimum pulse width modulation switching patterns in the low frequency range for high power applications is presented. A general mathematical model of the rectifier is established which is non-linear and time-invariant. Reference frame transformation and small signal linearization techniques are used to obtain closed form solutions from the mathematical model. The modelling procedure is verified by computer simulation. The closed loop design of the synchronous rectifier based on a phase and amplitude control strategy is investigated. The transfer functions derived from this analysis are used for the design of the regulators. The steady-state and dynamic results predicted by computer simulation are verified by PECAN. A systematic design procedure is developed and a detailed design example of a 1 MVA rectifier system is presented. 23 refs., 33 figs.

  1. Series-Tuned High Efficiency RF-Power Amplifiers

    DEFF Research Database (Denmark)

    Vidkjær, Jens

    2008-01-01

    An approach to high efficiency RF-power amplifier design is presented. It addresses simultaneously efficiency optimization and peak voltage limitations when transistors are pushed towards their power limits.

  2. High Power Flex-Propellant Arcjet Performance

    Science.gov (United States)

    Litchford, Ron J.

    2011-01-01

    implied nearly frozen flow in the nozzle and yielded performance ranges of 800-1100 sec for hydrogen and 400-600 sec for ammonia. Inferred thrust-to-power ratios were in the range of 30-10 lbf/MWe for hydrogen and 60-20 lbf/MWe for ammonia. Successful completion of this test series represents a fundamental milestone in the progression of high power arcjet technology, and it is hoped that the results may serve as a reliable touchstone for the future development of MW-class regeneratively-cooled flex-propellant plasma rockets.

  3. Splitting of high power, cw proton beams

    Directory of Open Access Journals (Sweden)

    Alberto Facco

    2007-09-01

    Full Text Available A simple method for splitting a high power, continuous wave (cw) proton beam into two or more branches with low losses has been developed in the framework of the EURISOL (European Isotope Separation On-Line Radioactive Ion Beam Facility) design study. The aim of the system is to deliver up to 4 MW of H⁻ beam to the main radioactive ion beam production target, and up to 100 kW of proton beams to three more targets, simultaneously. A three-step method is used, which includes magnetic neutralization of a fraction of the main H⁻ beam, magnetic splitting of H⁻ and H⁰, and stripping of H⁰ to H⁺. The method allows slow raising and individual fine adjustment of the beam intensity in each branch.

  4. Survey on modern pulsed high power lasers

    International Nuclear Information System (INIS)

    Witte, K.J.

    1985-01-01

    The requirements to be met by lasers for particle acceleration are partially similar to those already known for fusion lasers. The power level wanted in both cases is up to 100 TW or even more. The pulse durations favourable for laser accelerators are in the range from 1 ps to 1000 ps, whereas fusion lasers require several ns. The energy range for laser accelerators is thus correspondingly smaller than that for fusion lasers: 1-100 kJ versus several 100 kJ. The design criteria of lasers meeting these requirements are discussed in the following. The CO₂, iodine, Nd:glass and excimer lasers are treated in detail. The high repetition rate aspect is not specifically addressed, since rates far above 1 Hz are completely out of reach for the present generation of lasers; moreover, such rates are not needed for a demonstration of principle. (orig./HSI)

  5. QED studies using high-power lasers

    International Nuclear Information System (INIS)

    Mattias Marklund

    2010-01-01

    Complete text of publication follows. The advent of extreme lasers, with which intensities above 10²² W/cm² will be reached on a routine basis, will give us opportunities to probe new aspects of quantum electrodynamics. In particular, the non-trivial properties of the quantum vacuum can be investigated as we reach previously unattainable laser intensities. Effects such as vacuum birefringence and pair production in strong fields could thus be probed. The prospect of obtaining new insights into the non-perturbative structure of quantum field theories shows that the next generation of laser facilities can be an important tool for fundamental physical studies. Here we aim at giving a brief overview of such aspects of high-power laser physics.

  6. High-power laser diodes with high polarization purity

    Science.gov (United States)

    Rosenkrantz, Etai; Yanson, Dan; Peleg, Ophir; Blonder, Moshe; Rappaport, Noam; Klumel, Genady

    2017-02-01

    Fiber-coupled laser diode modules employ power scaling of single emitters for fiber laser pumping. To this end, techniques such as geometrical, spectral and polarization beam combining (PBC) are used. For PBC, linear polarization with a high degree of purity is important, as any non-perfectly polarized light leads to losses and heating. Furthermore, PBC is typically performed in a collimated portion of the beams, which also cancels the angular dependence of the PBC element, e.g., beam-splitter. However, we discovered that single emitters have variable degrees of polarization, which depend on both the operating current and the far-field divergence. We present data to show angle-resolved polarization measurements that correlate with the ignition of high-order modes in the slow-axis emission of the emitter. We demonstrate that the ultimate laser brightness includes not only the standard parameters such as power, emitting area and beam divergence, but also the degree of polarization (DoP), which is a strong function of the latter. Improved slow-axis divergence, therefore, contributes not only to high brightness but also to high beam combining efficiency through polarization.

  7. Fast Computation and Assessment Methods in Power System Analysis

    Science.gov (United States)

    Nagata, Masaki

    Power system analysis is essential for efficient and reliable power system operation and control. Recently, online security assessment systems have become important, as more efficient use of power networks is eagerly required. In this article, fast power system analysis techniques such as contingency screening, parallel processing and intelligent systems applications are briefly surveyed from the viewpoint of their application to online dynamic security assessment.
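
    The contingency-screening techniques surveyed in this record typically rest on fast approximate network solutions such as the DC power flow. The sketch below solves a DC power flow for a small illustrative 3-bus network; the network data are assumptions, not taken from the article.

```python
import numpy as np

# Line list: (from_bus, to_bus, susceptance in p.u.); bus 0 is the slack bus.
lines = [(0, 1, 10.0), (1, 2, 8.0), (0, 2, 5.0)]
n_bus = 3
p_injection = np.array([0.0, -0.6, -0.4])   # p.u. net injections (loads at buses 1 and 2)

# Build the susceptance matrix and solve B' * theta = P for the non-slack buses.
B = np.zeros((n_bus, n_bus))
for i, j, b in lines:
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b
theta = np.zeros(n_bus)
theta[1:] = np.linalg.solve(B[1:, 1:], p_injection[1:])

# Line flows -- the quantities checked against limits during contingency screening.
for i, j, b in lines:
    print(f"flow {i}->{j}: {b * (theta[i] - theta[j]):+.3f} p.u.")
```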

  8. Computational Thinking and Practice - A Generic Approach to Computing in Danish High Schools

    DEFF Research Database (Denmark)

    Caspersen, Michael E.; Nowack, Palle

    2014-01-01

    Internationally, there is a growing awareness on the necessity of providing relevant computing education in schools, particularly high schools. We present a new and generic approach to Computing in Danish High Schools based on a conceptual framework derived from ideas related to computational thi...

  9. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern

  10. Low Power system Design techniques for mobile computers

    NARCIS (Netherlands)

    Havinga, Paul J.M.; Smit, Gerardus Johannes Maria

    1997-01-01

    Portable products are being used increasingly. Because these systems are battery powered, reducing power consumption is vital. In this report we give the properties of low power design and techniques to exploit them in the architecture of the system. We focus on: minimizing capacitance, avoiding ...
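
    The dominant term these techniques target is the dynamic switching power, commonly modeled as P ≈ α·C·V²·f. A small worked example with assumed, illustrative numbers:

```python
# Worked example of the switching-power model P_dyn ~ alpha * C * V^2 * f.
def dynamic_power(alpha, c_farads, v_volts, f_hz):
    return alpha * c_farads * v_volts**2 * f_hz

base = dynamic_power(alpha=0.2, c_farads=2e-9, v_volts=3.3, f_hz=100e6)
lowv = dynamic_power(alpha=0.2, c_farads=2e-9, v_volts=1.8, f_hz=100e6)
print(f"baseline: {base:.3f} W, reduced supply voltage: {lowv:.3f} W")
# The quadratic dependence on V is why voltage scaling and capacitance
# minimization dominate low-power design.
```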

  11. Computational Efficiency of Economic MPC for Power Systems Operation

    DEFF Research Database (Denmark)

    Standardi, Laura; Poulsen, Niels Kjølstad; Jørgensen, John Bagterp

    2013-01-01

    In this work, we propose an Economic Model Predictive Control (MPC) strategy to operate power systems that consist of independent power units. The controller balances the power supply and demand, minimizing production costs. The control problem is formulated as a linear program that is solved...
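
    A small linear-program sketch of the economic dispatch idea described in this record: two independent power units, minimize production cost subject to meeting demand and respecting capacity limits. The unit data are illustrative assumptions, and the receding-horizon (MPC) aspect is omitted for brevity.

```python
from scipy.optimize import linprog

cost = [30.0, 50.0]            # production cost in $/MWh for unit 1 and unit 2
capacity = [(0, 80), (0, 60)]  # MW bounds per unit
demand = 100.0                 # MW of demand to be balanced

# Minimize cost @ p subject to p1 + p2 = demand and the capacity bounds.
res = linprog(c=cost, A_eq=[[1.0, 1.0]], b_eq=[demand], bounds=capacity)
print("dispatch [MW]:", res.x, "cost [$/h]:", res.fun)
# Expected: the cheap unit runs at its 80 MW limit, the expensive unit covers the rest.
```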

  12. High technology supporting nuclear power industry in CRIEPI

    International Nuclear Information System (INIS)

    Ueda, Nobuyuki

    2009-01-01

    As the central research institute of the electric power industry, the Central Research Institute of Electric Power Industry (CRIEPI) has carried out R and D on a broad range of topics such as power generation, power transmission, power distribution, power application and energy economics and society, aiming to develop prospective and advanced technologies, fundamental reinforcing technologies and next-generation core technologies. To realize a low-carbon society and cope with growing global environmental issues, nuclear power is highly recommended as a large-scale power source with low carbon emissions. At the start of this serial explanation of advanced technologies, R and D in the electric power industry is outlined. (T. Tanaka)

  13. A gateway for phylogenetic analysis powered by grid computing featuring GARLI 2.0.

    Science.gov (United States)

    Bazinet, Adam L; Zwickl, Derrick J; Cummings, Michael P

    2014-09-01

    We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  14. Peregrine System | High-Performance Computing | NREL

    Science.gov (United States)

    Peregrine provides several classes of nodes that users access. Login Nodes: Peregrine has four login nodes, each of which has Intel E5 processors; in addition to the /scratch file systems, the /mss file system is mounted on all login nodes. Compute Nodes: Peregrine has 2592 ...

  15. Switching transients in high-frequency high-power converters using power MOSFET's

    Science.gov (United States)

    Sloane, T. H.; Owen, H. A., Jr.; Wilson, T. G.

    1979-01-01

    The use of MOSFETs in a high-frequency high-power dc-to-dc converter is investigated. Consideration is given to the phenomena associated with the paralleling of MOSFETs and to the effect of stray circuit inductances on the converter circuit performance. Analytical relationships between various time constants during the turning-on and turning-off intervals are derived which provide estimates of plateau and peak levels during these intervals.

  16. Bringing together high energy physicist and computer scientist

    International Nuclear Information System (INIS)

    Bock, R.K.

    1989-01-01

    The Oxford Conference on Computing in High Energy Physics approached the physics and computing issues with the question, ''Can computer science help?'' always in mind. This summary is a personal recollection of what I considered to be the highlights of the conference: the parts which contributed to my own learning experience. It can be used as a general introduction to the following papers, or as a brief overview of the current state of computer science within high energy physics. (orig.)

  17. Integrated computer network high-speed parallel interface

    International Nuclear Information System (INIS)

    Frank, R.B.

    1979-03-01

    As the number and variety of computers within Los Alamos Scientific Laboratory's Central Computer Facility grows, the need for a standard, high-speed intercomputer interface has become more apparent. This report details the development of a High-Speed Parallel Interface from conceptual through implementation stages to meet current and future needs for large-scale network computing within the Integrated Computer Network. 4 figures

  18. Utilizing the Double-Precision Floating-Point Computing Power of GPUs for RSA Acceleration

    Directory of Open Access Journals (Sweden)

    Jiankuo Dong

    2017-01-01

    Full Text Available Asymmetric cryptographic algorithm (e.g., RSA and Elliptic Curve Cryptography) implementations on Graphics Processing Units (GPUs) have been researched for over a decade. The basic idea of most previous contributions is to exploit the highly parallel GPU architecture and port the integer-based algorithms from general-purpose CPUs to GPUs, to offer high performance. However, the great potential of GPUs for cryptographic computing, in particular through their more powerful floating-point instructions, has not yet been comprehensively investigated. In this paper, we fully exploit the floating-point computing power of GPUs by various designs, including a floating-point-based Montgomery multiplication/exponentiation algorithm and a Chinese Remainder Theorem (CRT) implementation on the GPU. For practical usage of the proposed algorithm, a new method is used to convert the input/output between octet strings and floating-point numbers, fully utilizing GPUs and further improving the overall performance by about 5%. The performance of RSA-2048/3072/4096 decryption on an NVIDIA GeForce GTX TITAN reaches 42,211/12,151/5,790 operations per second, respectively, which is 13 times the performance of the previous fastest floating-point-based implementation (published in Eurocrypt 2009). The RSA-4096 decryption surpasses the existing fastest integer-based result by 23%.
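
    A plain-Python sketch of the CRT decomposition mentioned in this record: RSA decryption is split into two half-size exponentiations modulo p and q and then recombined. The toy primes are for illustration only; the paper's floating-point GPU kernels for the modular arithmetic are not reproduced here.

```python
# Toy RSA-CRT decryption (illustrative parameters, not the paper's implementation).
p, q = 61, 53                       # toy primes
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

dp, dq = d % (p - 1), d % (q - 1)   # CRT exponents
q_inv = pow(q, -1, p)               # q^{-1} mod p

def rsa_crt_decrypt(c: int) -> int:
    m1 = pow(c, dp, p)              # half-size exponentiation mod p
    m2 = pow(c, dq, q)              # half-size exponentiation mod q
    h = (q_inv * (m1 - m2)) % p     # Garner recombination
    return m2 + h * q

message = 42
cipher = pow(message, e, n)
assert rsa_crt_decrypt(cipher) == message
print("recovered:", rsa_crt_decrypt(cipher))
```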

  19. Computer aided construction engineering system for nuclear power plants

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Nakajima, Akira; Miyahara, Ryohei; Miura, Jun

    1990-01-01

    The construction CAE system for nuclear power plants is a tool for a construction work simulator (procedure, process and management simulation) connected to 3D-CAD (three-dimensional plant layout planning CAD). The data used for the construction work simulation are registered in a database as the smallest installation units taken from the 3D-CAD model. The construction work simulator comprises: an automatic installation procedure decision system, with which a construction planner decides the installation procedure on a high-performance graphic workstation, based on the 3D-CAD model and empirical procedure logic; a dialogue system for optimizing the installation procedure by making effective use of the graphic functions; an evaluation system for synthetically evaluating workability, personnel planning and so on by adding simulation of human behavior based on these procedures; a schedule system which carries out work process simulation based on the above; a database system that supports effective planning; and a project management system. By these means, plant construction of high quality is expected. (K.I.)

  20. Power transistor module for high current applications

    International Nuclear Information System (INIS)

    Cilyo, F.F.

    1975-01-01

    One of the parts needed for the control system of the 400-GeV accelerator at Fermilab was a power transistor with a safe operating area of 1800A at 50V, dc current gain of 100,000 and 20 kHz bandwidth. Since the commercially available discrete devices and power hybrid packages did not meet these requirements, a power transistor module was developed which performed satisfactorily. By connecting 13 power transistors in parallel, with due consideration for network and heat dissipation problems, and by driving these 13 with another power transistor, a super power transistor is made, having an equivalent current, power, and safe operating area capability of 13 transistors. For higher capabilities, additional modules can be conveniently added. (auth)
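
    Quick arithmetic behind the module described in this record, using the figures quoted in the abstract (the equal current-sharing per device is an idealizing assumption):

```python
total_current_a = 1800.0     # safe-operating-area current from the abstract
parallel_devices = 13        # output transistors connected in parallel
overall_gain = 100_000       # dc current gain of the composite device

print(f"ideal share per output transistor: {total_current_a / parallel_devices:.0f} A")
print(f"base drive needed at the module input: {1e3 * total_current_a / overall_gain:.0f} mA")
# ~138 A per paralleled device and only ~18 mA of input drive for 1800 A out,
# which is what the driver-plus-13-parallel-output arrangement provides.
```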

  1. High power accelerator for environmental application

    International Nuclear Information System (INIS)

    Han, B.; Kim, J.K.; Kim, Y.R.; Kim, S.M.

    2011-01-01

    The problems of environmental damage and degradation of natural resources are receiving increasing attention throughout the world. The increased population, higher living standards, increased urbanization and enhanced industrial activities of humankind are all leading to degradation of the environment. Increasing urbanization has been accompanied by significant environmental pollution; given the seriousness of the situation and the future risk of crises, there is an urgent need to develop efficient technologies, including economical treatment methods. Therefore, cost-effective treatment of stack gases, wastewater and sludge containing refractory pollutants with electron beams is actively studied at EB TECH Co. Electron beam treatment of such hazardous wastes relies on the decomposition of pollutants as a result of their reactions with highly reactive species formed by radiolysis. However, to have advantages over existing processes, the electron beam process should be cost-effective and reliable in operation. Therefore, high power accelerators (400 kW~1 MW) have been developed for environmental applications, and they show a decrease in the cost of construction and operation of electron beam plants. As another way to reduce the treatment cost, radical reactions combined with other processes are introduced; the synergistic effect of combined methods, such as electron beam treatment with catalytic systems, biological treatment and physico-chemical adsorption, also shows an improvement of the effect of electron beam treatment. (author)

  2. High power accelerator for environmental application

    Energy Technology Data Exchange (ETDEWEB)

    Han, B.; Kim, J. K.; Kim, Y. R.; Kim, S. M. [EB-TECH Co., Ltd., Yuseong-gu Daejeon (Korea, Republic of)

    2011-07-01

    The problems of environmental damage and degradation of natural resources are receiving increasing attention throughout the world. The increased population, higher living standards, increased urbanization and enhanced industrial activities of humankind are all leading to degradation of the environment. Increasing urbanization has been accompanied by significant environmental pollution. Given the seriousness of the situation and the future risk of crises, there is an urgent need to develop efficient technologies, including economical treatment methods. Therefore, cost-effective treatment of stack gases, wastewater and sludge containing refractory pollutants with electron beams is actively studied at EB TECH Co. Electron beam treatment of such hazardous wastes relies on the decomposition of pollutants as a result of their reactions with the highly reactive species formed by radiolysis. However, to have advantages over existing processes, the electron beam process should be cost-effective and reliable in operation. Therefore high power accelerators (400 kW~1 MW) have been developed for environmental applications, and they show a decrease in the cost of construction and operation of an electron beam plant. As another way to reduce the treatment cost, radical reactions combined with other processes are introduced, and the synergistic effect of combined methods such as electron beam treatment with catalytic systems, biological treatment, physico-chemical adsorption and others also shows an improvement of the effect of electron beam treatment. (author)

  3. High power accelerators and wastewater treatment

    International Nuclear Information System (INIS)

    Han, B.; Kim, J.K.; Kim, Y.R.; Kim, S.M.; Makaov, I.E.; Ponomarev, A.V.

    2006-01-01

    The problems of environmental damage and degradation of natural resources are receiving increasing attention throughout the world. The increased population, higher living standards, increased urbanization and enhanced industrial activities of humankind are all leading to degradation of the environment. Increasing urbanization has been accompanied by significant water pollution. Given the seriousness of the situation and the future risk of crises, there is an urgent need to develop water-efficient technologies, including economical treatment methods for wastewater and polluted water. Therefore, cost-effective treatment of municipal and industrial wastewater containing refractory pollutants with electron beams is actively studied at EB TECH Co. Electron beam treatment of wastewater relies on the decomposition of pollutants as a result of their reactions with the highly reactive species formed by water radiolysis (hydrated electrons, OH free radicals and H atoms). However, to have advantages over existing processes, the electron beam process should be cost-effective and reliable in operation. Therefore high power accelerators (400 kW~1 MW) have been developed for environmental applications, and they show a decrease in the cost of construction and operation of an electron beam plant. As another way to reduce the cost of wastewater treatment, radical reactions combined with other processes are introduced, and the synergistic effect of combined methods such as electron beam treatment with ozonation, biological treatment, physico-chemical adsorption and others also shows an improvement of the effect of electron beam treatment for wastewater purification. (author)

  4. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    Science.gov (United States)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) the progress so far in harmonising the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially growing data collections.

  5. High power diode lasers converted to the visible

    DEFF Research Database (Denmark)

    Jensen, Ole Bjarlin; Hansen, Anders Kragh; Andersen, Peter E.

    2017-01-01

    High power diode lasers have in recent years become available in many wavelength regions. However, some spectral regions are not well covered. In particular, the visible spectral range is lacking high power diode lasers with good spatial quality. In this paper, we highlight some of our recent results in nonlinear frequency conversion of high power near infrared diode lasers to the visible spectral region.

  6. Modeling of the dynamics of wind to power conversion including high wind speed behavior

    DEFF Research Database (Denmark)

    Litong-Palima, Marisciel; Bjerge, Martin Huus; Cutululis, Nicolaos Antonio

    2016-01-01

    This paper proposes and validates an efficient, generic and computationally simple dynamic model for the conversion of the wind speed at hub height into the electrical power of a wind turbine. The proposed wind turbine model was developed as a first step to simulate wind power time series for power system studies. This paper focuses on describing and validating the single wind turbine model, and therefore describes neither wind speed modeling nor the aggregation of contributions from a whole wind farm or a power system area. The state of the art is to use static power curves for the purpose of power system studies, but the idea of the proposed wind turbine model is to include the main dynamic effects in order to have a better representation of the fluctuations in the output power and of the fast power ramping, especially because of high wind speed shutdowns of the wind turbine. The high wind ...
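
    A minimal sketch of the kind of model described above is given below, assuming a cubic static power curve, a high-wind cut-out with hysteresis and a first-order lag to represent the conversion dynamics; the cut-in, rated, cut-out and restart speeds, the rated power and the time constant are placeholder values, not those of the validated model.

```python
# Minimal sketch of a dynamic wind-to-power conversion model (assumed form,
# not the validated model from the paper): static power curve + high-wind
# cut-out with hysteresis + first-order lag on the electrical power output.
import numpy as np

P_RATED = 2.0e6                          # W, assumed rated power
V_IN, V_RATED, V_OUT = 4.0, 12.0, 25.0   # m/s, assumed cut-in/rated/cut-out
V_RESTART = 20.0                         # m/s, assumed restart speed after shutdown
TAU = 5.0                                # s, assumed conversion time constant

def static_curve(v):
    if v < V_IN or v >= V_OUT:
        return 0.0
    if v >= V_RATED:
        return P_RATED
    return P_RATED * ((v - V_IN) / (V_RATED - V_IN)) ** 3

def simulate(wind_speed, dt=1.0):
    p, shutdown, out = 0.0, False, []
    for v in wind_speed:
        # high-wind shutdown with hysteresis
        if v >= V_OUT:
            shutdown = True
        elif shutdown and v <= V_RESTART:
            shutdown = False
        target = 0.0 if shutdown else static_curve(v)
        # first-order lag approximates the main conversion dynamics
        p += dt / TAU * (target - p)
        out.append(p)
    return np.array(out)

if __name__ == "__main__":
    t = np.arange(0, 600, 1.0)
    v = 14 + 12 * np.sin(2 * np.pi * t / 300)   # synthetic hub-height wind speed
    print(simulate(v)[:5])
```

    The lag plus hysteresis is what distinguishes this from a purely static power-curve lookup: output power ramps smoothly and drops to zero through a shutdown/restart cycle at high wind speeds.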

  7. Application of control computer system TESLA RPP-16 in the Bohunice nuclear power plant

    International Nuclear Information System (INIS)

    Spetko, V.

    1976-01-01

    The reasons are given for the installation of a computer at the A-1 nuclear power plant in Czechoslovakia with regard to applied research. The configuration, placement, and software of the computer system are described. The programmes are written in the SAM and FORTRAN-IV languages. The knowledge acquired in the course of tests and the prospects for the future installation of computer control equipment in the A-1 nuclear power plant are described. (J.P.)

  8. High RF Power Production for CLIC

    CERN Document Server

    Syratchev, I; Adli, E; Taborelli, M

    2007-01-01

    The CLIC Power Extraction and Transfer Structure (PETS) is a passive microwave device in which bunches of the drive beam interact with the impedance of the periodically loaded waveguide and excite preferentially the synchronous mode. The RF power produced (several hundred MW) is collected at the downstream end of the structure by means of the Power Extractor and delivered to the main linac structure. The PETS geometry is a result of multiple compromises between beam stability and main linac RF power needs. Another requirement is to provide local RF power termination in case of accelerating structure failure (ON/OFF capability). Surface electric and magnetic fields, power extraction method, HOM damping, ON/OFF capability and fabrication technology were all evaluated to provide a reliable design

  9. Nonmedical application of computed tomography to power capacitor quality assessment

    International Nuclear Information System (INIS)

    Kruger, R.P.

    1981-01-01

    Present research and development efforts at Los Alamos Scientific Laboratory require the design and use of high-efficiency rapid-discharge energy storage capacitors for laser isotope separation and plasma physics programs. In these applications, capacitors are subjected to electrical, mechanical, thermal, and other environmental stresses. These stresses cause the dielectric constant to change due to gasification from arcing at nonsoldering connections, which produce a chemical reduction of the dielectric material. This effectively limits the lifetime of the capacitor. The programs mentioned above require capacitors with a multikilohertz frequency response at a current of tens of kiloamperes and a voltage of at least 100 kV. The lifetime of such capacitors should exceed 10^10 charge/discharge cycles. Such capacitors do not presently exist. The exploration of new capacitor designs will require the use of both electrical functional tests and tests that show the changes in internal physical structure as the capacitor is repeatedly stressed by the charge/discharge cycle. The integration of electrical and structural tests throughout the life cycle of a candidate capacitor makes it imperative that the structural integrity tests be nondestructive. Computed tomography (CT) makes this integration possible. The work reported here is the result of a pilot project designed to show the potential use of CT for this application. This work includes visualization of material defects using both a layered sequence of conventional tomographic slices and orthogonal multiangular pseudoradiographs generated from these slices

  10. High-performance computing for structural mechanics and earthquake/tsunami engineering

    CERN Document Server

    Hori, Muneo; Ohsaki, Makoto

    2016-01-01

    Huge earthquakes and tsunamis have caused serious damage to important structures such as civil infrastructure elements, buildings and power plants around the globe. To quantitatively evaluate such damage processes and to design effective prevention and mitigation measures, the latest high-performance computational mechanics technologies, which include terascale to petascale computers, can offer powerful tools. The phenomena covered in this book include seismic wave propagation in the crust and soil, seismic response of infrastructure elements such as tunnels considering soil-structure interactions, seismic response of high-rise buildings, seismic response of nuclear power plants, tsunami run-up over coastal towns and tsunami inundation considering fluid-structure interactions. The book provides all necessary information for addressing these phenomena, ranging from the fundamentals of high-performance computing for finite element methods, key algorithms of accurate dynamic structural analysis, fluid flows ...

  11. Engineering approach to model and compute electric power markets settlements

    International Nuclear Information System (INIS)

    Kumar, J.; Petrov, V.

    2006-01-01

    Back-office accounting settlement activities are an important part of market operations in Independent System Operator (ISO) organizations. A potential way to measure ISO market design correctness is to analyze how well market price signals create incentives or penalties for creating an efficient market to achieve market design goals. Market settlement rules are an important tool for implementing price signals which are fed back to participants via the settlement activities of the ISO. ISOs are currently faced with the challenge of high volumes of data resulting from the increasing size of markets and ever-changing market designs, as well as the growing complexity of wholesale energy settlement business rules. This paper analyzed the problem and presented a practical engineering solution using an approach based on mathematical formulation and modeling of large scale calculations. The paper also presented critical comments on various differences in settlement design approaches to electrical power market design, as well as further areas of development. The paper provided a brief introduction to wholesale energy market settlement systems and discussed problem formulation. An actual settlement implementation framework and a discussion of the results and conclusions were also presented. It was concluded that a proper engineering approach to this domain can yield satisfying results by formalizing wholesale energy settlements. Significant improvements were observed in the initial preparation phase, scoping and effort estimation, implementation and testing. 5 refs., 2 figs
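
    As a toy illustration of the kind of settlement rule such systems formalize, the sketch below prices the difference between metered and scheduled energy at the interval's locational marginal price; the rule, field names and numbers are hypothetical and do not represent any particular ISO's business rules.

```python
# Toy imbalance-settlement calculation (hypothetical rule, illustrative only).
from dataclasses import dataclass

@dataclass
class IntervalRecord:
    participant: str
    scheduled_mwh: float    # day-ahead scheduled energy
    metered_mwh: float      # actual metered energy
    lmp_usd_per_mwh: float  # locational marginal price for the interval

def settle(records):
    """Credit (positive) or charge (negative) each participant for its
    deviation from schedule, priced at the interval LMP."""
    totals = {}
    for r in records:
        imbalance = r.metered_mwh - r.scheduled_mwh
        totals[r.participant] = totals.get(r.participant, 0.0) + imbalance * r.lmp_usd_per_mwh
    return totals

if __name__ == "__main__":
    data = [
        IntervalRecord("GenA", scheduled_mwh=100.0, metered_mwh=98.5, lmp_usd_per_mwh=42.0),
        IntervalRecord("GenA", scheduled_mwh=100.0, metered_mwh=101.0, lmp_usd_per_mwh=55.0),
        IntervalRecord("LoadB", scheduled_mwh=-80.0, metered_mwh=-83.0, lmp_usd_per_mwh=42.0),
    ]
    print(settle(data))  # net settlement amount per participant
```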

  12. Physical-resource requirements and the power of quantum computation

    International Nuclear Information System (INIS)

    Caves, Carlton M; Deutsch, Ivan H; Blume-Kohout, Robin

    2004-01-01

    The primary resource for quantum computation is the Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places a fundamental constraint on the systems that are suitable for scalable quantum computation. To be scalable, the number of degrees of freedom in the computer must grow nearly linearly with the number of qubits in an equivalent qubit-based quantum computer. These considerations rule out quantum computers based on a single particle, a single atom, or a single molecule consisting of a fixed number of atoms or on classical waves manipulated using the transformations of linear optics
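
    To make the scaling argument concrete, the following back-of-the-envelope comparison (added here for illustration, not quoted from the paper) contrasts the resources of a qubit register with those of a single-particle encoding of the same Hilbert-space dimension.

```latex
% n qubits span a Hilbert space of dimension D = 2^n, so the number of
% physical degrees of freedom grows only linearly in n = log_2 D.
% Encoding the same dimension in a single particle (one mode or level per
% basis state, as in single-photon linear optics) requires 2^n distinct
% modes, i.e. resources exponential in n.
\[
  D = 2^{n}, \qquad
  \underbrace{n = \log_{2} D}_{\text{qubit register}}
  \quad \text{vs.} \quad
  \underbrace{2^{n}\ \text{modes}}_{\text{single-particle encoding}}
\]
```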

  13. HIGH PERFORMANCE PHOTOGRAMMETRIC PROCESSING ON COMPUTER CLUSTERS

    Directory of Open Access Journals (Sweden)

    V. N. Adrov

    2012-07-01

    Most CPU-consuming tasks in photogrammetric processing can be done in parallel. The algorithms take independent bits as input and produce independent bits as output. The independence of bits comes from the nature of such algorithms, since images, stereopairs or small image block parts can be processed independently. Many photogrammetric algorithms are fully automatic and do not require human intervention. Photogrammetric workstations can perform tie point measurements, DTM calculations, orthophoto construction, mosaicking and many other service operations in parallel using distributed calculations. Distributed calculations save time, reducing computations that would take several days to several hours. Modern trends in computer technology show an increase in the number of CPU cores in workstations, speed increases in local networks and, as a result, a drop in the price of supercomputers or computer clusters that can contain hundreds or even thousands of computing nodes. Common distributed processing in DPW is usually targeted at interactive work with a limited number of CPU cores and is not optimized for centralized administration. The bottleneck of common distributed computing in photogrammetry can be the limited LAN throughput and storage performance, since the processing of huge amounts of large raster images is needed.
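
    The embarrassingly parallel structure described above (independent images or tiles in, independent results out) maps directly onto a process pool. The sketch below distributes a hypothetical per-tile function across worker processes; it illustrates the pattern only and is not the software discussed in the article.

```python
# Sketch of distributing independent photogrammetric tasks (e.g. per-tile
# orthophoto generation) across a pool of worker processes. The process_tile
# function is a hypothetical placeholder for the real per-tile computation.
from concurrent.futures import ProcessPoolExecutor

def process_tile(tile_id):
    # Placeholder for real work: tie-point measurement, DTM cell computation,
    # orthorectification of one image block, etc.
    return tile_id, sum(i * i for i in range(10_000))  # dummy CPU-bound work

def run_distributed(tile_ids, max_workers=8):
    # Each tile is independent, so results can be collected in any order.
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(process_tile, tile_ids))

if __name__ == "__main__":
    results = run_distributed(range(32))
    print(f"processed {len(results)} tiles")
```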

  14. Highly-stabilized power supply for synchrotron accelerators. High speed, low ripple power supply

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Kenji [Osaka Univ., Ibaraki (Japan). Research Center for Nuclear Physics; Kumada, Masayuki; Fukami, Kenji; Koseki, Shoichiro; Kubo, Hiroshi; Kanazawa, Toru

    1997-02-01

    In synchrotron accelerators, in order to utilize high energy beams effectively, acceleration and extraction are repeated in short cycles. In order to keep the beam orbit stable during acceleration, the power supply must track the current reference with an error of less than 10^-3. Further, in order to maintain the intensity and uniformity of the beam during extraction, very low ripple is required in the output current. A power supply having these characteristics has been developed and applied to the HIMAC and SPring-8. As examples of the application of synchrotrons, accelerators for medical treatment and for the generation of synchrotron radiation are described. For the power supplies of the deflection magnets and quadrupole magnets of synchrotron accelerators, the specifications of the main power supply, the method of reducing ripple, the method of improving tracking, and active filter control are reported. As test results, measurements of current ripple and tracking error are shown. Low ripple was achieved by a common mode filter and the symmetrical connection of the electromagnets, and high speed response was realized by compensating for delay with an active filter. (K.I.)

  15. Test of a High Power Target Design

    CERN Multimedia

    2002-01-01

    IS343: A high power tantalum disc-foil target (RIST) has been developed for the proposed radioactive beam facility, SIRIUS, at the Rutherford Appleton Laboratory. The yield and release characteristics of the RIST target design have been measured at ISOLDE. The results indicate that the yields are at least as good as the best ISOLDE roll-foil targets and that the release curves are significantly faster in most cases. Both targets use 20-25 $\mu$m thick foils, but in a different internal geometry. Investigations have continued at ISOLDE with targets having different foil thicknesses and internal geometries in an attempt to understand the release mechanisms and in particular to maximise the yield of short lived isotopes. A theoretical model has been developed which fits the release curves and gives physical values of the diffusion constants. The latest target is constructed from 2 $\mu$m thick tantalum foils (mass only 10 mg) and shows very short release times. The yield of $^{11}$Li (half-life of ...

  16. The SPES High Power ISOL production target

    Science.gov (United States)

    Andrighetto, A.; Corradetti, S.; Ballan, M.; Borgna, F.; Manzolaro, M.; Scarpa, D.; Monetti, A.; Rossignoli, M.; Silingardi, R.; Mozzi, A.; Vivian, G.; Boratto, E.; De Ruvo, L.; Sattin, N.; Meneghetti, G.; Oboe, R.; Guerzoni, M.; Margotti, A.; Ferrari, M.; Zenoni, A.; Prete, G.

    2016-11-01

    SPES (Selective Production of Exotic Species) is a facility under construction at INFN-LNL (Istituto Nazionale di Fisica Nucleare - Laboratori Nazionali di Legnaro), aimed at producing intense neutron-rich radioactive ion beams (RIBs). These will be obtained using the ISOL (Isotope Separation On-Line) method, bombarding a uranium carbide target with a proton beam of 40 MeV energy and currents up to 200 μA. The target configuration was designed to obtain a high number of fissions, up to 10^13 per second, low power deposition and fast release of the produced isotopes. The exotic isotopes generated in the target are ionized, mass separated and re-accelerated by the ALPI superconducting LINAC at energies of 10 MeV per nucleon and higher, for masses in the region of A = 130 amu, with an expected rate on the secondary target up to 10^9 particles per second. In this work, recent results of the R&D activities regarding the SPES RIB production target-ion source system are reported.

  17. High-power pure blue laser diodes

    Energy Technology Data Exchange (ETDEWEB)

    Ohta, M.; Ohizumi, Y.; Hoshina, Y.; Tanaka, T.; Yabuki, Y.; Goto, S.; Ikeda, M. [Development Center, Sony Shiroishi Semiconductor Inc., Miyagi (Japan); Funato, K. [Materials Laboratories, Sony Corporation, Kanagawa (Japan); Tomiya, S. [Materials Analysis Laboratory, Sony Corporation, Kanagawa (Japan)

    2007-06-15

    We successfully developed high-power and long-lived pure blue laser diodes (LDs) having an emission wavelength of 440-450 nm. The pure-blue LDs were grown by metalorganic chemical vapor deposition (MOCVD) on GaN substrates. The dislocation density was successfully reduced to ~10^6 cm^-2 by optimizing the MOCVD growth conditions and the active layer structure. The vertical layer structure was designed to have an absorption loss of 4.9 cm^-1 and an internal quantum efficiency of 91%. We also reduced the operating current density to 6 kA/cm^2 under 750 mW continuous-wave operation at 35 °C by optimizing the stripe width to 12 μm and the cavity length to 2000 μm. The half lifetimes in constant current mode are estimated to be longer than 10000 h. (copyright 2007 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  18. Complete low power controller for high voltage power systems

    International Nuclear Information System (INIS)

    Sumner, R.; Blanar, G.

    1997-01-01

    The MHV100 is a custom CMOS integrated circuit developed for the AMS experiment. It provides complete control for a single channel high voltage (HV) generator and integrates all the required digital communications, D-to-A and A-to-D converters, the analog feedback loop and output drivers. This chip has been designed for use in both distributed high voltage systems and low cost single channel high voltage systems. The output voltage and current range is determined by the external components

  19. Hybrid simulation of electrode plasmas in high-power diodes

    International Nuclear Information System (INIS)

    Welch, Dale R.; Rose, David V.; Bruner, Nichelle; Clark, Robert E.; Oliver, Bryan V.; Hahn, Kelly D.; Johnston, Mark D.

    2009-01-01

    New numerical techniques for simulating the formation and evolution of cathode and anode plasmas have been successfully implemented in a hybrid code. The dynamics of expanding electrode plasmas has long been recognized as a limiting factor in the impedance lifetimes of high-power vacuum diodes and magnetically insulated transmission lines. Realistic modeling of such plasmas is being pursued to aid in understanding the operating characteristics of these devices as well as establishing scaling relations for reliable extrapolation to higher voltages. Here, in addition to kinetic and fluid modeling, a hybrid particle-in-cell technique is described that models high density, thermal plasmas as an inertial fluid which transitions to kinetic electron or ion macroparticles above a prescribed energy. The hybrid technique is computationally efficient and does not require resolution of the Debye length. These techniques are first tested on a simple planar diode then applied to the evolution of both cathode and anode plasmas in a high-power self-magnetic pinch diode. The impact of an intense electron flux on the anode surface leads to rapid heating of contaminant material and diode impedance loss.
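
    The fluid-to-kinetic transition described above can be illustrated with a toy promotion rule: particles above an energy threshold are handed to the kinetic (macroparticle) treatment, while the rest are accumulated into fluid moments. The threshold and particle data below are invented for illustration and do not reflect the actual hybrid code.

```python
# Toy illustration of a hybrid fluid/kinetic split: particles above a
# prescribed energy threshold are promoted to kinetic macroparticles,
# the remainder are accumulated into fluid moments (density, energy).
# All numbers are illustrative; this is not the production hybrid code.
import numpy as np

E_THRESHOLD_EV = 100.0   # assumed promotion threshold

def split_population(energies_ev, weights):
    kinetic = energies_ev > E_THRESHOLD_EV
    fluid = ~kinetic
    fluid_moments = {
        "density_weight": weights[fluid].sum(),
        "mean_energy_ev": (np.average(energies_ev[fluid], weights=weights[fluid])
                           if fluid.any() else 0.0),
    }
    kinetic_macroparticles = list(zip(energies_ev[kinetic], weights[kinetic]))
    return fluid_moments, kinetic_macroparticles

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    energies = rng.exponential(scale=30.0, size=10_000)   # eV, synthetic spectrum
    weights = np.full(energies.shape, 1.0e9)               # particles per macroparticle
    moments, kinetics = split_population(energies, weights)
    print(moments, f"{len(kinetics)} macroparticles promoted")
```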

  20. High-Altitude Wind Power Generation

    NARCIS (Netherlands)

    Fagiano, L.; Milanese, M.; Piga, D.

    2010-01-01

    The paper presents the innovative technology of high-altitude wind power generation, referred to as Kitenergy, which exploits the automatic flight of tethered airfoils (e.g., power kites) to extract energy from wind blowing between 200 and 800 m above the ground. The key points of this ...

  1. Inclusive vision for high performance computing at the CSIR

    CSIR Research Space (South Africa)

    Gazendam, A

    2006-02-01

    ... and computationally intensive applications. A number of different technologies and standards were identified as core to the open and distributed high-performance infrastructure envisaged...

  2. Management of the high-level nuclear power facilities

    International Nuclear Information System (INIS)

    Preda, Marin

    2003-05-01

    ... of energy produced in computer-assisted high power facilities. A final chapter summarizes the concluding remarks and recommendations concerning high performance management of high-power nuclear stations, evidencing the original results of the research presented in this PhD thesis. An annex presents the practical NPP management decision making for ensuring safe operation regimes. The experiments were conducted on the 14 MW TRIGA SSR reactor at INR Pitesti. The concepts developed in this thesis were applied to the Cernavoda NPP with special stress on nuclear installation monitoring. In conclusion, the following items can be pointed out as achieved in this work: 1. Evidencing of nuclear facility operational monitoring policies concerning primarily preventive maintenance and NPP safety assurance; 2. Analysis of nuclear accidents within the frame of risk-catastrophe-chaos theories, highlighting the operative measures for preventing hazardous events and quality assurance monitoring of nuclear reactor components; 3. Development of hybrid neuro-expert systems (with extensions to neuro-fuzzy and fuzzy models) implying process automated programming; this combined system is currently undergoing a patent procedure as giving an innovative structure of intelligent hard-soft systems devoted to the safe operation of power systems with nuclear injection; 4. Establishing the descriptors of high-performance management for the analysis of specific activities relating to nuclear processes; 5. Modelling of nuclear power systems in the frame of an operational approach based on managing operators, for instance, market and system operators, the human resource and quality operator, the economical-financial operator and the decision-communication operator; 6. Achievement of an experimental system for decision making in NPP monitoring based on the 14 MW TRIGA SSR reactor at INR Pitesti. (authors)

  3. 30 GHz High Power Production for CLIC

    CERN Document Server

    Syratchev, I V

    2006-01-01

    The CLIC Power Extraction and Transfer Structure (PETS) is a passive microwave device in which bunches of the drive beam interact with the impedance of the periodically loaded waveguide and excite preferentially the synchronous TM01 mode at 30 GHz. The RF power produced (several hundred MW) is collected at the downstream end of the structure by means of the Power Extractor and conveyed to the main linac structure. The PETS geometry is a result of multiple compromises between beam stability along a single decelerator sector (600 m) and the active length of the structure to match the main linac RF power needs and layout. Surface electric and magnetic fields, power extraction method, HOM damping, ON/OFF capability and fabrication technology were all evaluated to provide a reliable design.

  4. Development of computer-aided design and production system for nuclear power plant

    International Nuclear Information System (INIS)

    Ishii, Masanori

    1983-01-01

    The technical requirements related to the design and production of nuclear power stations have tended to increase from the viewpoint of safety and reliability, and it is indispensable to handle these requirements skillfully in order to rationalize design and production and to construct highly reliable plants. Ishikawajima-Harima Heavy Industries Co., Ltd., has developed a computer-aided design data information and engineering system which performs dialogue-type design and drawing, and as a result, a consistent design-production system has been developed which performs stress analysis, production design, production management and the output of data for numerically controlled machine tools. In this paper, mainly the consistent system in the field of plant design centering on piping, and also the computer system for the design of vessels and other equipment, are outlined. The features of the design work for nuclear power plants, the rationalization of the design and production management of piping and vessels, and the application of the CAD system to other general equipment and improvement works are reported. This system is a powerful means to meet the requirements of higher quality and lower cost. (Kako, I.)

  5. Computational Analysis of Nanoparticles-Molten Salt Thermal Energy Storage for Concentrated Solar Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Vinod [Univ. of Texas, El Paso, TX (United States)

    2017-05-05

    High fidelity computational models of thermocline-based thermal energy storage (TES) were developed. The research goal was to advance the understanding of a single-tank nanofluidized molten salt based thermocline TES system under various concentrations and sizes of the suspended particles. Our objective was to utilize sensible heat storage that operates with the least irreversibility by exploiting nanoscale physics. This was achieved by performing computational analysis of several storage designs, analyzing storage efficiency and estimating cost effectiveness for the TES systems under a concentrating solar power (CSP) scheme using molten salt as the storage medium. Since TES is one of the most costly but important components of a CSP plant, an efficient TES system has the potential to make the electricity generated from solar technologies cost competitive with conventional sources of electricity.
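
    As a minimal illustration of the kind of thermocline model involved (not the high-fidelity model developed in the project), the sketch below advances a one-dimensional advection-conduction equation for the storage-fluid temperature with an explicit upwind scheme; all properties, dimensions and operating conditions are assumed placeholder values.

```python
# Minimal 1-D thermocline sketch: plug-flow advection plus axial conduction,
# explicit upwind scheme. Properties and dimensions are illustrative
# placeholders, not the molten-salt/nanoparticle values from the study.
import numpy as np

L = 10.0          # m, tank height (assumed)
N = 200           # grid cells
U = 1.0e-3        # m/s, charging velocity (assumed)
ALPHA = 3.0e-7    # m^2/s, effective thermal diffusivity (assumed)
T_HOT, T_COLD = 565.0, 290.0   # deg C, typical CSP salt temperatures

def charge(hours=2.0):
    dx = L / N
    dt = 0.4 * min(dx / U, dx**2 / (2 * ALPHA))   # CFL/diffusion-limited step
    T = np.full(N, T_COLD)
    for _ in range(int(hours * 3600 / dt)):
        Tn = T.copy()
        # upwind advection (hot fluid enters at cell 0) + central conduction
        T[1:] = Tn[1:] - U * dt / dx * (Tn[1:] - Tn[:-1]) \
                + ALPHA * dt / dx**2 * (np.roll(Tn, -1)[1:] - 2 * Tn[1:] + Tn[:-1])
        T[0] = T_HOT        # inlet boundary
        T[-1] = T[-2]       # zero-gradient outlet
    return T

if __name__ == "__main__":
    profile = charge()
    front = np.argmin(profile > (T_HOT + T_COLD) / 2) * L / N
    print(f"thermocline position ~ {front:.2f} m after charging")
```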

  6. Contemporary high performance computing from petascale toward exascale

    CERN Document Server

    Vetter, Jeffrey S

    2013-01-01

    Contemporary High Performance Computing: From Petascale toward Exascale focuses on the ecosystems surrounding the world's leading centers for high performance computing (HPC). It covers many of the important factors involved in each ecosystem: computer architectures, software, applications, facilities, and sponsors. The first part of the book examines significant trends in HPC systems, including computer architectures, applications, performance, and software. It discusses the growth from terascale to petascale computing and the influence of the TOP500 and Green500 lists. The second part of the

  7. Assessment of computer codes for VVER-440/213-type nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Szabados, L.; Ezsol, Gy.; Perneczky [Atomic Energy Research Institute, Budapest (Hungary)

    1995-09-01

    Nuclear power plants of the VVER-440/213 type designed by the former USSR have a number of special features. As a consequence of these features, the transient behaviour of such a reactor system can be expected to differ from PWR system behaviour. To study the transient behaviour of the Hungarian Paks Nuclear Power Plant of VVER-440/213 type, both analytical and experimental activities have been performed. The experimental basis of the research is the PMK-2 integral-type test facility, which is a scaled-down model of the plant. Experiments performed on this facility have been used to assess thermal-hydraulic system codes. Four tests were selected for the 'Standard Problem Exercises' of the International Atomic Energy Agency. Results of the 4th Exercise, of high international interest, are presented in the paper, focusing on the essential findings of the assessment of computer codes.

  8. A high performance scientific cloud computing environment for materials simulations

    OpenAIRE

    Jorissen, Kevin; Vila, Fernando D.; Rehr, John J.

    2011-01-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including...

  9. Low cost highly available digital control computer

    International Nuclear Information System (INIS)

    Silvers, M.W.

    1986-01-01

    When designing digital controllers for critical plant control it is important to provide several features. Among these are reliability, availability, maintainability, environmental protection, and low cost. An examination of several applications has led to a design that can be produced for approximately $20,000 (1000 control points). This design is compatible with modern concepts in distributed and hierarchical control. The canonical controller element is a dual-redundant self-checking computer that communicates with a cross-strapped, electrically isolated input/output system. The input/output subsystem comprises multiple intelligent input/output cards. These cards accept commands from the primary processor which are validated, executed, and acknowledged. Each card may be hot replaced to facilitate sparing. The implementation of the dual-redundant computer architecture is discussed. Called the FS-86, this computer can be used for a variety of applications. It has most recently found application in the upgrade of San Francisco's Bay Area Rapid Transit (BART) train control currently in progress and has been proposed for feedwater control in a boiling water reactor.
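
    The command/acknowledge discipline between the primary processor and the intelligent I/O cards can be sketched as follows; the message fields, checksum and card behaviour are hypothetical stand-ins, not the FS-86's actual protocol.

```python
# Hypothetical sketch of a validate/execute/acknowledge exchange between a
# primary processor and an intelligent I/O card (illustrative protocol only;
# field names and checksum are not the FS-86's actual design).
from dataclasses import dataclass

@dataclass(frozen=True)
class Command:
    card_id: int
    point: int        # control point addressed on the card
    value: float
    checksum: int

def make_command(card_id, point, value):
    checksum = (card_id + point + int(value * 1000)) & 0xFFFF
    return Command(card_id, point, value, checksum)

class IOCard:
    def __init__(self, card_id, n_points):
        self.card_id = card_id
        self.outputs = [0.0] * n_points

    def handle(self, cmd: Command):
        # 1) validate: addressing and integrity checks before any action
        expected = (cmd.card_id + cmd.point + int(cmd.value * 1000)) & 0xFFFF
        if cmd.card_id != self.card_id or not (0 <= cmd.point < len(self.outputs)) \
                or cmd.checksum != expected:
            return {"ack": False, "reason": "validation failed"}
        # 2) execute the validated command
        self.outputs[cmd.point] = cmd.value
        # 3) acknowledge with the value actually applied (read-back)
        return {"ack": True, "point": cmd.point, "applied": self.outputs[cmd.point]}

if __name__ == "__main__":
    card = IOCard(card_id=3, n_points=16)
    print(card.handle(make_command(3, 5, 42.5)))
```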

  10. A 380 V High Efficiency and High Power Density Switched-Capacitor Power Converter using Wide Band Gap Semiconductors

    DEFF Research Database (Denmark)

    Fan, Lin; Knott, Arnold; Jørgensen, Ivan Harald Holger

    2018-01-01

    State-of-the-art switched-capacitor DC-DC power converters mainly focus on low voltage and/or high power applications. However, at high voltage and low power levels, new designs are anticipated to emerge, and a power converter that has both high efficiency and high power density is highly desirable. This paper presents such a high voltage, low power switched-capacitor DC-DC converter with an input voltage up to 380 V (compatible with rectified European mains) and an output power experimentally validated up to 21.3 W. The wide-bandgap semiconductor devices of GaN switches and SiC diodes are combined to compose the proposed power stage. Their switching and loss characteristics are analyzed with transient waveforms and thermal images. Different isolated driving circuits are compared and a compact isolated half-bridge driving circuit is proposed. Full-load efficiencies of 98.3% and 97.6% are achieved ...

  11. Measurement of high-power microwave pulse under intense ...

    Indian Academy of Sciences (India)

    KALI-1000 pulse power system has been used to generate single pulse nanosecond duration high-power microwaves (HPM) from a virtual cathode oscillator (VIRCATOR) device. HPM power measurements were carried out using a transmitting–receiving system in the presence of intense high frequency (a few ...

  12. Distributed computer control systems in future nuclear power plants

    International Nuclear Information System (INIS)

    Yan, G.; L'Archeveque, J.V.R.; Watkins, L.M.

    1978-09-01

    Good operating experience with computer control in CANDU reactors over the last decade justifies a broadening of the role of digital electronic and computer related technologies in future plants. Functions of electronic systems in the total plant context are reappraised to help evolve an appropriate match between technology and future applications. The systems research, development and demonstration program at CRNL is described, focusing on the projects pertinent to the real-time data acquisition and process control requirements. (author)

  13. System of operative computer control of power distribution fields in the Beloyarsk nuclear power plant

    International Nuclear Information System (INIS)

    Kulikov, N.Ya.; Snitko, Eh.I.; Rasputnis, A.M.; Solodov, V.P.

    1976-01-01

    The system of intrareactor (in-core) control for the reactors of the Beloyarsk nuclear power plant is described. In the second unit of the plant, use is made of direct-charge emission detectors installed in the central apertures of the superheater channels and operating reliably at temperatures up to 750 deg C. The detectors of the first and second units are connected to the computer, which sends the results of signal processing to the printer, while deviation signals go to the mnemonic panels of the reactors. The good working order of the detectors is checked by comparison with zero as well as with the mean detector current for the reactor concerned. The application of the intrareactor control system has allowed the stable thermal power to be increased from 480-500 to 530 MW and makes it possible to control and maintain the formed neutron field with a relative error of 3-4%. The structural scheme of the intrareactor control system is given

  14. High Temperature Surface Parameters for Solar Power

    National Research Council Canada - National Science Library

    Butler, C. F; Jenkins, R. J; Rudkin, R. L; Laughridge, F. I

    1960-01-01

    ... at a given distance from the sun. Thermal conversion efficiencies with a concentration ratio of 50 have been computed for each surface when exposed to solar radiation at the Earth's mean orbital radius...

  15. Computer-aided safety systems of industrial high energy objects

    International Nuclear Information System (INIS)

    Topolsky, N.G.; Gordeev, S.G.

    1995-01-01

    Modern facilities of the fuel and energy and chemical industries are characterized by high power consumption; by the presence of large quantities of combustible and explosive substances used in technological processes; by extensive systems for the supply of initial liquid and gaseous reagents, lubricants and coolants, and for handling products of processing and production wastes; by extensive ventilation and pneumatic transport; and by complex control systems for energy, material and information flows. Such facilities have extensive infrastructures, including a significant number of engineering buildings intended for the storage, transportation, and processing of combustible liquids, gaseous fuels and solid materials. Examples of such facilities are nuclear and thermal power stations, chemical plants, machine-building factories, iron and steel industry enterprises, etc. Many tasks and functions characterizing the problem of fire safety of these facilities can be accomplished only through the development of special Computer-Aided Fire Safety Systems (CAFSS). The CAFSS for these facilities are intended to reduce the hazard of disastrous accidents, both those causing fires and those caused by them. The tasks of fire prevention and rescue work at large-scale industrial facilities are analyzed within the bounds of the recommended conception. A functional structure of the CAFSS with a list of the main subsystems forming part of its composition has been proposed

  16. High quality, high efficiency welding technology for nuclear power plants

    International Nuclear Information System (INIS)

    Aoki, Shigeyuki; Nagura, Yasumi

    1996-01-01

    For nuclear power plants, it is required to ensure safety with high reliability and to attain a high rate of operation. In the manufacture and installation of machinery and equipment, the welding techniques that form the basis of fabrication exert a large influence on these goals. For the purpose of improving joint performance and excluding human error, welding heat input and the number of passes have been reduced, the automation of welding has been advanced, and at present, narrow gap arc welding and high energy density welding such as electron beam welding and laser welding have been put to practical use. Also in the welding of pipings, automatic gas metal arc welding is employed. As for the welding of main machinery and equipment, there are the welding of the joints that constitute pressure boundaries, the build-up welding on the internal surfaces of pressure vessels for separating primary water from them, and the sealing welding of heating tubes and tube plates in steam generators. These weldings are explained. The welding of pipings and the state of development and application of new welding methods are reported. (K.I.)

  17. EBR-II high-ramp transients under computer control

    International Nuclear Information System (INIS)

    Forrester, R.J.; Larson, H.A.; Christensen, L.J.; Booty, W.F.; Dean, E.M.

    1983-01-01

    During reactor run 122, EBR-II was subjected to 13 computer-controlled overpower transients at ramps of 4 MWt/s to qualify the facility and fuel for transient testing of LMFBR oxide fuels as part of the EBR-II operational-reliability-testing (ORT) program. A computer-controlled automatic control-rod drive system (ACRDS), designed by EBR-II personnel, permitted automatic control of the demanded power during the transients
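
    A computer-controlled overpower transient of this kind amounts to tracking a ramped power demand. The sketch below generates such a demand trajectory and a crude proportional response to it; the gains, limits and first-order plant model are invented for illustration and are unrelated to the actual ACRDS design.

```python
# Illustrative sketch of a ramped power-demand trajectory and a simple
# proportional controller following it (assumed numbers; not the ACRDS).

def demand(t, p0_mw=60.0, ramp_mw_per_s=4.0, p_target_mw=100.0):
    """Demanded power: ramp at a fixed rate from p0 until the target is reached."""
    return min(p0_mw + ramp_mw_per_s * t, p_target_mw)

def simulate(duration_s=15.0, dt=0.1, gain=0.5, tau_s=2.0):
    p = 60.0                                  # MW, assumed initial power
    history = []
    t = 0.0
    while t <= duration_s:
        error = demand(t) - p
        rod_drive = gain * error              # proportional demand to rod drive
        p += dt / tau_s * rod_drive           # assumed first-order plant response
        history.append((round(t, 1), round(demand(t), 1), round(p, 2)))
        t += dt
    return history

if __name__ == "__main__":
    for row in simulate()[::20]:              # print every 2 s
        print("t=%ss demand=%s MW power=%s MW" % row)
```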

  18. High performance magnet power supply optimization

    International Nuclear Information System (INIS)

    Jackson, L.T.

    1975-01-01

    Three types of magnet power supply systems for the joint LBL-SLAC proposed accelerator PEP are discussed. The systems considered include a firing circuit and six-pulse controlled rectifier, transistor systems, and a chopper system. (U.S.)

  19. Controlled Compact High Voltage Power Lines

    Directory of Open Access Journals (Sweden)

    Postolati V.

    2016-04-01

    Nowadays, modern overhead transmission line (OHL) constructions having several significant differences from conventional ones are being used in power grids more and more widely. Implementation of compact overhead lines equipped with FACTS devices, including phase angle regulators (compact controlled OHL), appears to be one of the most effective ways of power grid development. Compact controlled AC HV OHL represent a new generation of power transmission lines embodying recent advanced achievements in design solutions, including towers and insulation, together with interconnection schemes and control systems. Results of comprehensive research and development in relation to 110-500 kV compact controlled power transmission lines, together with the theoretical basis, substantiation, and methodological approaches to their practical application, are presented in the present paper.

  20. A highly linear power amplifier for WLAN

    International Nuclear Information System (INIS)

    Jin Jie; Shi Jia; Ai Baoli; Zhang Xuguang

    2016-01-01

    A three-stage power amplifier (PA) for WLAN application in 2.4-2.5 GHz is presented. The proposed PA employs an adaptive bias circuit to adjust the operating point of the PA and improve its linearity. Two methods of shorting the 2nd harmonic circuit are compared in terms of the efficiency and gain of the PA. The PA is taped out in a 2 μm InGaP/GaAs HBT process and is tested on an evaluation board. The measured results show 31.5 dB power gain and 29.3 dBm P1dB with an associated 40.4% power-added efficiency (PAE) under single-tone stimulus. Up to 26.5 dBm output power can be achieved with an error vector magnitude (EVM) lower than 3% under the 64QAM/OFDM WLAN stimulus. (paper)

  1. Advanced Capacitors for High-Power Applications, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — As consumer and industrial requirements for compact, high-power-density electrical power systems grow substantially over the next decade, there will be a...

  2. High power ring methods and accelerator driven subcritical reactor application

    Energy Technology Data Exchange (ETDEWEB)

    Tahar, Malek Haj [Univ. of Grenoble (France)

    2016-08-07

    High power proton accelerators allow providing, by spallation reaction, the neutron fluxes necessary for the synthesis of fissile material, starting from Uranium 238 or Thorium 232. This is the basis of the concept of sub-critical operation of a reactor, for energy production or nuclear waste transmutation, with the objective of achieving a cleaner, safer and more efficient process than today's technologies allow. Designing, building and operating a proton accelerator in the 500-1000 MeV energy range, CW regime, MW power class still remains a challenge nowadays. There is a limited number of installations at present achieving beam characteristics in that class, e.g., PSI in Villigen, with a 590 MeV CW beam from a cyclotron, and SNS in Oak Ridge, with a 1 GeV pulsed beam from a linear accelerator, in addition to projects such as the ESS in Europe, a 5 MW beam from a linear accelerator. Furthermore, coupling an accelerator to a sub-critical nuclear reactor is a challenging proposition: some of the key issues/requirements are the design of a spallation target to withstand high power densities as well as ensuring the safety of the installation. These two domains are the grounds of the PhD work: the focus is on high power ring methods in the frame of the KURRI FFAG collaboration in Japan: upgrade of the installation towards high intensity is crucial to demonstrate the high beam power capability of FFAGs. Thus, modeling of the beam dynamics and benchmarking of different codes were undertaken to validate the simulation results. Experimental results revealed some major losses that need to be understood and eventually overcome. By developing analytical models that account for the field defects, major sources of imperfection in the design of scaling FFAGs were identified that explain the important tune variations resulting in the crossing of several betatron resonances. A new formula is derived to compute the tunes, and properties are established that characterize the effect of the field imperfections on the ...

  3. Advances in high-power rf amplifiers

    International Nuclear Information System (INIS)

    Tallerico, P.J.

    1979-01-01

    Several powerful accelerators and storage rings are being considered that will require tens or even hundreds of megawatts of continuous rf power. The economics of such large machines can be dictated by the cost and efficiency of the rf amplifiers. The overall design and performance of such narrow-band amplifiers, operating in the 50- to 1500-MHz region, are being theoretically studied as a function of frequency to determine the optimum rf amplifier output power, gain, efficiency, and dc power requirements. The state of the art for three types of amplifiers (gridded tubes, klystrons, and gyrocons) is considered and the development work necessary to improve each is discussed. The gyrocon is a new device, hence its various embodiments are discussed in detail. The Soviet designs are reviewed and the gyrocon's strengths and weaknesses are compared to other types of microwave amplifiers. The primary advantages of the gyrocon are the very large amount of power available from a single device and the excellent efficiency and stable operation. The klystron however, has much greater gain and is simpler mechanically. At very low frequencies, the small size of the gridded tube makes it the optimum choice for all but the most powerful systems

  4. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    Science.gov (United States)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing and policy planners with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the time period of 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
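
    The co-authorship network in such a study is typically built by linking every pair of authors that share a paper. The sketch below does this with networkx on a made-up record list; it only illustrates the construction, not the Scopus dataset or the ranking method used in the paper.

```python
# Build a co-authorship network from bibliographic records: nodes are authors,
# an edge links two authors for every paper they share (weight = # joint papers).
# The records are made up; the real study used Scopus data for 2004-2013.
from itertools import combinations
import networkx as nx

records = [
    {"title": "Paper A", "authors": ["Kim", "Lee", "Park"]},
    {"title": "Paper B", "authors": ["Kim", "Choi"]},
    {"title": "Paper C", "authors": ["Lee", "Park", "Choi"]},
]

G = nx.Graph()
for rec in records:
    for a, b in combinations(sorted(set(rec["authors"])), 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

# Simple activity ranking by paper count, plus a network centrality measure.
paper_counts = {}
for rec in records:
    for a in rec["authors"]:
        paper_counts[a] = paper_counts.get(a, 0) + 1

print(sorted(paper_counts.items(), key=lambda kv: -kv[1]))
print(nx.degree_centrality(G))
```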

  5. Design and development of high voltage high power operational ...

    Indian Academy of Sciences (India)

    Applications of power operational amplifiers (opamps) are increasing day by day in industry, as they are used in audio amplifiers, piezo transducer systems and electron deflection systems. Power operational amplifiers have all the features of a general purpose opamp together with additional power handling capability.

  6. Modeling of mode purity in high power gyrotrons

    International Nuclear Information System (INIS)

    Cai, S.Y.; Antonsen, T.M. Jr.; Saraph, G.P.

    1993-01-01

    Spurious mode generation at the same frequency of the operational mode in a high power gyrotron can significantly reduce the power handling capability and the stability of a gyrotron oscillator because these modes are usually not matched at the output window and thus have high absorption and reflection rates. To study the generation of this kind of mode, the authors developed a numerical model based on an existing multimode self-consistent time-dependent computer code. This model includes both TE and TM modes and accounts for mode transformations due to the waveguide inhomogeneity. With this new tool, they study the mode transformation in the gyrotron and the possibility of excitation of parasitic TE and TM modes in the up taper section due to the gyroklystron mechanism. Their preliminary results show moderate excitation of both TE and TM modes at the same frequency as the main operating mode at locations near their cutoff. Details of the model and further simulation results will be presented

  7. High performance computations using dynamical nucleation theory

    International Nuclear Information System (INIS)

    Windus, T L; Crosby, L D; Kathmann, S M

    2008-01-01

    Chemists continue to explore the use of very large computations to perform simulations that describe the molecular level physics of critical challenges in science. In this paper, we describe the Dynamical Nucleation Theory Monte Carlo (DNTMC) model - a model for determining molecular scale nucleation rate constants - and its parallel capabilities. The potential for bottlenecks and the challenges to running on future petascale or larger resources are delineated. A 'master-slave' solution is proposed to scale to the petascale and will be developed in the NWChem software. In addition, mathematical and data analysis challenges are described
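
    The 'master-slave' parallelization mentioned above is essentially a task farm: a master distributes independent Monte Carlo blocks to workers and aggregates their results. The sketch below shows that pattern with a trivial integrand; it is not the DNTMC algorithm or the NWChem implementation.

```python
# Task-farm ("master-worker") Monte Carlo sketch: the master distributes
# independent sampling blocks to worker processes and aggregates the results.
# The toy integrand stands in for the expensive configurational sampling of
# DNTMC; this is not the NWChem implementation.
import random
from concurrent.futures import ProcessPoolExecutor

def worker_block(args):
    seed, n_samples = args
    rng = random.Random(seed)
    # Toy estimate: fraction of points inside the unit quarter circle (-> pi/4).
    hits = sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return hits, n_samples

def master(n_workers=4, blocks=16, samples_per_block=200_000):
    tasks = [(seed, samples_per_block) for seed in range(blocks)]
    total_hits = total_n = 0
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        for hits, n in pool.map(worker_block, tasks):
            total_hits += hits
            total_n += n
    return 4.0 * total_hits / total_n

if __name__ == "__main__":
    print("pi estimate:", master())
```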

  8. Atmospheric Propagation and Combining of High-Power Lasers

    Science.gov (United States)

    2015-09-08

    “Brightness-scaling potential of actively phase-locked solid state laser arrays,” IEEE J. Sel. Topics Quantum Electron., vol. 13, no. 3, pp. 460–472, May... attempting to phase-lock high-power lasers, which is not encountered when phase-locking low-power lasers, for example mW power levels. Regardless, we... technology does not currently exist. This presents a challenging problem when attempting to phase-lock high-power lasers, which is not encountered when

  9. Welding with high power fiber lasers - A preliminary study

    International Nuclear Information System (INIS)

    Quintino, L.; Costa, A.; Miranda, R.; Yapp, D.; Kumar, V.; Kong, C.J.

    2007-01-01

    The new generation of high power fiber lasers presents several benefits for industrial purposes, namely high power with low beam divergence, flexible beam delivery, low maintenance costs, high efficiency and compact size. This paper presents a brief review of the development of high power lasers, and presents initial data on welding of API 5L: X100 pipeline steel with an 8 kW fiber laser. Weld bead geometry was evaluated and transition between conduction and deep penetration welding modes was investigated

  10. A Heterogeneous High-Performance System for Computational and Computer Science

    Science.gov (United States)

    2016-11-15

    expand the research infrastructure at the institution but also to enhance the high-performance computing training provided to both undergraduate and... cloud computing, supercomputing, and the availability of cheap memory and storage led to enormous amounts of data to be sifted through in forensic... High-Performance Computing (HPC) tools that can be integrated with existing curricula and support our research to modernize and dramatically advance

  11. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present quantum programming and execution models, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  12. Computational tool for simulation of power and refrigeration cycles

    Science.gov (United States)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles allows modeling the optimal changes for best performance. There is also a large research interest in the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration, in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and to select the working fluid is helpful, because heat sources for cogeneration are very different and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed in the multiplatform Qt environment; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles and the selection of the working fluid (thermodynamic properties are calculated through the CoolProp library), calculates the plant efficiency, identifies the flow fractions in each branch, and finally generates a very instructive report in pdf format via LaTeX.
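
    As an illustration of the kind of calculation such a tool automates, the sketch below evaluates an ideal Rankine cycle using CoolProp's Python bindings (the software described in the record itself is a C++/Qt application); the pressures and working fluid are arbitrary example values.

```python
# Ideal Rankine cycle efficiency using CoolProp's PropsSI (Python bindings).
# Pressures and the working fluid are arbitrary example values; the tool
# described in the record is a C++/Qt application, not this script.
from CoolProp.CoolProp import PropsSI

def ideal_rankine_efficiency(fluid="Water", p_boiler=8e6, p_condenser=10e3):
    # State 1: saturated liquid leaving the condenser
    h1 = PropsSI("H", "P", p_condenser, "Q", 0, fluid)
    v1 = 1.0 / PropsSI("D", "P", p_condenser, "Q", 0, fluid)
    # State 2: after isentropic pump compression (incompressible approximation)
    h2 = h1 + v1 * (p_boiler - p_condenser)
    # State 3: saturated vapor leaving the boiler
    h3 = PropsSI("H", "P", p_boiler, "Q", 1, fluid)
    s3 = PropsSI("S", "P", p_boiler, "Q", 1, fluid)
    # State 4: isentropic expansion through the turbine to condenser pressure
    h4 = PropsSI("H", "P", p_condenser, "S", s3, fluid)
    w_net = (h3 - h4) - (h2 - h1)   # turbine work minus pump work
    q_in = h3 - h2                  # heat added in the boiler
    return w_net / q_in

if __name__ == "__main__":
    print(f"ideal Rankine efficiency: {ideal_rankine_efficiency():.3f}")
```

    Swapping "Water" for a refrigerant such as "R245fa" and lowering the boiler pressure turns the same calculation into a rough ORC estimate.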

  13. The computer program system for structural design of nuclear power plants

    International Nuclear Information System (INIS)

    Aihara, S.; Atsumi, K.; Sasagawa, K.; Satoh, S.

    1979-01-01

    In recent years, the design method for nuclear power plants has become more complex than in the past. The Finite Element Method (FEM) applied for the analysis of nuclear power plants, in particular, requires more computer use. Recent computers have made remarkable progress, so that the manpower and time necessary for analysis in design work have been reduced considerably. However, the volume of output to be arranged has instead increased tremendously. Therefore, a computer program system was developed for performing all of the processes, from data preparation to output arrangement and rebar evaluation. This report introduces the computer program system pertaining to the design flow of the reactor building. (orig.)

  14. Computer Security for Commercial Nuclear Power Plants - Literature Review for Korea Hydro Nuclear Power Central Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Duran, Felicia Angelica [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Security Systems Analysis Dept.; Waymire, Russell L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Security Systems Analysis Dept.

    2013-10-01

    Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations [U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency] related to protection of information technology resources, primarily digital controls and computer resources and their data networks. Copies of the key documents have also been provided to KHNP-CRI.

  15. Computer Security for Commercial Nuclear Power Plants - Literature Review for Korea Hydro Nuclear Power Central Research Institute

    International Nuclear Information System (INIS)

    Duran, Felicia Angelica; Waymire, Russell L.

    2013-01-01

    Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations [U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency] related to protection of information technology resources, primarily digital controls and computer resources and their data networks. Copies of the key documents have also been provided to KHNP-CRI.

  16. Solid state isotopic power source for computer chips

    International Nuclear Information System (INIS)

    Brown, P.M.

    1992-01-01

    This paper reports that recent developments in materials technology now make it possible to fabricate nonthermal thin-film isotopic energy converters (REC) with a specific power of 24 W/kg and a 5 to 10 year working life at 5 to 10 watts. This creates applications never before possible, such as placing the power supply directly on integrated circuit chips. The efficiency of the REC is about 25%, which is two to three times greater than the 6 to 8% achieved by current thermoelectric systems.
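
    For scale, the quoted specific power translates directly into converter mass; the short check below (our arithmetic, using only figures from the abstract) shows that a 10 W on-chip supply would need roughly 0.4 kg of converter material.

      # Back-of-envelope check using only the figures quoted in the abstract
      specific_power = 24.0   # W/kg
      p_out = 10.0            # W, upper end of the 5 to 10 W range
      print(f"converter mass for {p_out:.0f} W: ~{p_out / specific_power:.2f} kg")  # ~0.42 kg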

  17. High power diode pumped solid state lasers

    International Nuclear Information System (INIS)

    Solarz, R.; Albrecht, G.; Beach, R.; Comaskey, B.

    1992-01-01

    Although operational for over twenty years, diode pumped solid state lasers have, for most of their existence, been limited to individual diodes pumping a tiny volume of active medium in an end-pumped configuration. More recent years have witnessed the appearance of diode bars, packing around 100 diodes into a 1 cm bar, which have enabled end- and side-pumped small solid state lasers at the few-watt level of output. This paper describes the subsequent development of how proper cooling and stacking of bars enables the fabrication of multi-kW average power diode pump arrays with irradiances of 1 kW/cm² peak and 250 W/cm² average pump power. Since typical conversion efficiencies from the diode light to the pumped laser output light are of order 30% or more, kW average power diode pumped solid state lasers are now possible.
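
    The quoted conversion efficiency and average pump irradiance make the array sizing easy to estimate; the sketch below is our arithmetic, with the 1 kW output target as an assumed example, and it indicates that roughly 13 cm² of diode array would be needed.

      # Rough sizing from the abstract's figures: ~30% diode-to-laser conversion
      # and 250 W/cm^2 average pump irradiance; the 1 kW target is an assumption.
      eta_conv = 0.30
      pump_irradiance = 250.0   # W/cm^2 average
      p_laser = 1000.0          # W average laser output (assumed target)
      p_pump = p_laser / eta_conv
      area = p_pump / pump_irradiance
      print(f"pump power ~{p_pump:.0f} W over ~{area:.1f} cm^2 of diode array")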

  18. RISC Processors and High Performance Computing

    Science.gov (United States)

    Bailey, David H.; Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    This tutorial will discuss the top five RISC microprocessors and the parallel systems in which they are used. It will provide a unique cross-machine comparison not available elsewhere. The effective performance of these processors will be compared by citing standard benchmarks in the context of real applications. The latest NAS Parallel Benchmarks, both absolute performance and performance per dollar, will be listed. The next generation of the NPB will be described. The tutorial will conclude with a discussion of future directions in the field. Technology Transfer Considerations: All of these computer systems are commercially available internationally. Information about these processors is available in the public domain, mostly from the vendors themselves. The NAS Parallel Benchmarks and their results have been previously approved numerous times for public release, beginning back in 1991.

  19. Computer graphics as an information means for power plants

    International Nuclear Information System (INIS)

    Kollmannsberger, J.; Pfadler, H.

    1990-01-01

    Computer-aided graphics have proved increasingly successful as an aid to process control in large plants. The specific requirements for such systems and the methods of planning and implementing graphic systems in power station control rooms are described. Operating experience from completed plants is evaluated. (orig.) [de]

  20. Error Immune Logic for Low-Power Probabilistic Computing

    Directory of Open Access Journals (Sweden)

    Bo Marr

    2010-01-01

    design for the maximum amount of energy savings for a given error rate. SPICE simulation results using a commercially available and well-tested 0.25 μm technology are given, verifying the ultra-low-power probabilistic full-adder designs. Further, close to 6X energy savings are achieved for a probabilistic full-adder over the deterministic case.

  1. Computer program for afterheat temperature distribution for mobile nuclear power plant

    Science.gov (United States)

    Parker, W. G.; Vanbibber, L. E.

    1972-01-01

    ESATA computer program was developed to analyze thermal safety aspects of post-impacted mobile nuclear power plants. Program is written in FORTRAN 4 and designed for IBM 7094/7044 direct coupled system.

  2. Thermoelectric cooling of microelectronic circuits and waste heat electrical power generation in a desktop personal computer

    International Nuclear Information System (INIS)

    Gould, C.A.; Shammas, N.Y.A.; Grainger, S.; Taylor, I.

    2011-01-01

    Thermoelectric cooling and micro-power generation from waste heat within a standard desktop computer has been demonstrated. A thermoelectric test system has been designed and constructed, with typical test results presented for thermoelectric cooling and micro-power generation when the computer is executing a number of different applications. A thermoelectric module, operating as a heat pump, can lower the operating temperature of the computer's microprocessor and graphics processor to temperatures below ambient conditions. A small amount of electrical power, typically in the micro-watt or milli-watt range, can be generated by a thermoelectric module attached to the outside of the computer's standard heat sink assembly, when a secondary heat sink is attached to the other side of the thermoelectric module. Maximum electrical power can be generated by the thermoelectric module when a water cooled heat sink is used as the secondary heat sink, as this produces the greatest temperature difference between both sides of the module.
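
    The milliwatt-level generation reported above is consistent with a simple matched-load estimate; the sketch below is only illustrative, with the module Seebeck coefficient, internal resistance and temperature difference assumed rather than taken from the measurements.

      # Illustrative matched-load estimate for a thermoelectric module (assumed values)
      seebeck_module = 0.05   # V/K, whole-module Seebeck coefficient (assumed)
      r_internal = 3.0        # ohm, module internal resistance (assumed)
      delta_t = 5.0           # K, temperature difference across the module (assumed)
      v_oc = seebeck_module * delta_t             # open-circuit voltage
      p_matched = v_oc ** 2 / (4.0 * r_internal)  # power delivered to a matched load
      print(f"V_oc = {v_oc * 1e3:.0f} mV, matched-load power = {p_matched * 1e3:.2f} mW")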

  3. Simulating elastic light scattering using high performance computing methods

    NARCIS (Netherlands)

    Hoekstra, A.G.; Sloot, P.M.A.; Verbraeck, A.; Kerckhoffs, E.J.H.

    1993-01-01

    The Coupled Dipole method, as originally formulated by Purcell and Pennypacker, is a very powerful method to simulate the Elastic Light Scattering from arbitrary particles. This method, which is a particle simulation model for Computational Electromagnetics, has one major drawback: if the size of the

  4. The Plant-Window System: A framework for an integrated computing environment at advanced nuclear power plants

    International Nuclear Information System (INIS)

    Wood, R.T.; Mullens, J.A.; Naser, J.A.

    1997-01-01

    Power plant data, and the information that can be derived from it, provide the link to the plant through which the operations, maintenance and engineering staff understand and manage plant performance. The extensive use of computer technology in advanced reactor designs provides the opportunity to greatly expand the capability to obtain, analyze, and present data about the plant to station personnel. However, to support highly efficient and increasingly safe operation of nuclear power plants, it is necessary to transform the vast quantity of available data into clear, concise, and coherent information that can be readily accessed and used throughout the plant. This need can be met by an integrated computer workstation environment that provides the necessary information and software applications, in a manner that can be easily understood and used, to the proper users throughout the plant. As part of a Cooperative Research and Development Agreement with the Electric Power Research Institute, the Oak Ridge National Laboratory has developed functional requirements for a Plant-Wide Integrated Environment Distributed On Workstations (Plant-Window) System. The Plant-Window System (PWS) can serve the needs of operations, engineering, and maintenance personnel at nuclear power stations by providing integrated data and software applications within a common computing environment. The PWS requirements identify functional capabilities and provide guidelines for standardized hardware, software, and display interfaces so as to define a flexible computing environment for both current generation nuclear power plants and advanced reactor designs.

  5. CERN-group conceptual design of a fast neutron operated high power energy amplifier

    International Nuclear Information System (INIS)

    Rubbia, C.; Rubio, J.A.; Buono, S.

    1997-01-01

    The practical feasibility of an Energy Amplifier (EA) with power and power density which are comparable to the ones of the present generation of large PWR is discussed in this paper. This is only possible with fast neutrons. Schemes are described which offer a high gain, a large maximum power density and an extended burn-up, well in excess of 100 GW x d/t corresponding to about five years at full power operation with no intervention on the fuel core. The following topics are discussed: physics considerations and parameter definition, the accelerator complex, the energy amplifier unit, computer simulated operation, and fuel cycle closing
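
    The burn-up and operating-time figures above can be cross-checked with a one-line calculation (ours, not from the paper): 100 GW·d per tonne reached over about five years corresponds to an average specific power of roughly 55 MW per tonne of fuel.

      # Consistency check of the quoted burn-up: 100 GW*d/t over ~5 years at full power
      burnup_gwd_per_t = 100.0
      days = 5.0 * 365.0
      print(f"average specific power ~{burnup_gwd_per_t * 1e3 / days:.0f} MW/t")  # ~55 MW/t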

  6. CERN-group conceptual design of a fast neutron operated high power energy amplifier

    Energy Technology Data Exchange (ETDEWEB)

    Rubbia, C; Rubio, J A [European Organization for Nuclear Research, CERN, Geneva (Switzerland); Buono, S [Laboratoire du Cyclotron, Nice (France); and others

    1997-11-01

    The practical feasibility of an Energy Amplifier (EA) with power and power density which are comparable to the ones of the present generation of large PWR is discussed in this paper. This is only possible with fast neutrons. Schemes are described which offer a high gain, a large maximum power density and an extended burn-up, well in excess of 100 GW x d/t corresponding to about five years at full power operation with no intervention on the fuel core. The following topics are discussed: physics considerations and parameter definition, the accelerator complex, the energy amplifier unit, computer simulated operation, and fuel cycle closing. 84 refs, figs, tabs.

  7. New high power CW klystrons at TED

    CERN Document Server

    Beunas, A; Marchesin, R

    2003-01-01

    Thales Electron Devices (TED) has been awarded a contract by CERN to develop and produce 20 units of the klystrons needed to feed the Large Hadron Collider (LHC). Each of these delivers 300 kW of CW RF power at 400 MHz. Three klystrons have been delivered to CERN so far.

  8. A highly linear power amplifier for WLAN

    Science.gov (United States)

    Jie, Jin; Jia, Shi; Baoli, Ai; Xuguang, Zhang

    2016-02-01

    A three-stage power amplifier (PA) for WLAN applications in the 2.4-2.5 GHz band is presented. The proposed PA employs an adaptive bias circuit to adjust the operating point and thereby improve the linearity of the PA. Two methods of shorting the 2nd harmonic are compared in terms of the efficiency and gain of the PA. The PA is taped out in a 2 μm InGaP/GaAs HBT process and tested on an evaluation board. The measured results show 31.5 dB power gain and 29.3 dBm P1dB with an associated 40.4% power-added efficiency (PAE) under single-tone stimulus. Up to 26.5 dBm output power can be achieved with an error vector magnitude (EVM) below 3% under the 64QAM/OFDM WLAN stimulus. Project supported by the National Natural Science Foundation of China (No. 61201244) and the Natural Science Fund of SUES (No. E1-0501-14-0168).
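
    The reported gain, P1dB and PAE also imply the DC power drawn at compression; the rough check below (our arithmetic, ignoring gain compression at P1dB) gives about 2.1 W.

      # Rough check: PAE = (Pout - Pin) / Pdc, using the quoted gain, P1dB and PAE
      gain_db = 31.5
      p1db_dbm = 29.3
      pae = 0.404
      p_out_mw = 10 ** (p1db_dbm / 10)              # ~851 mW output
      p_in_mw = 10 ** ((p1db_dbm - gain_db) / 10)   # ~0.6 mW input (compression ignored)
      print(f"implied DC power: ~{(p_out_mw - p_in_mw) / pae / 1e3:.2f} W")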

  9. High-voltage, high-power architecture considerations

    International Nuclear Information System (INIS)

    Moser, R.L.

    1985-01-01

    Three basic EPS architectures, direct energy transfer, peak-power tracking, and a potential EPS architecture for a nuclear reactor are described and compared. Considerations for the power source and energy storage are discussed. Factors to be considered in selecting the operating voltage are pointed out. Other EPS architecture considerations are autonomy, solar array degrees of freedom, and EPS modularity. It was concluded that selection of the power source and energy storage has major impacts on the spacecraft architecture and mass

  10. Fiber facet gratings for high power fiber lasers

    Science.gov (United States)

    Vanek, Martin; Vanis, Jan; Baravets, Yauhen; Todorov, Filip; Ctyroky, Jiri; Honzatko, Pavel

    2017-12-01

    We numerically investigated the properties of diffraction gratings designed for fabrication on the facet of an optical fiber. The gratings are intended to be used in high-power fiber lasers as mirrors with either low or high reflectivity. The modal reflectance of the low-reflectivity polarizing grating is close to 3% for the TE mode, while it is significantly suppressed for the TM mode. Such a grating can be fabricated on the laser output fiber facet. The polarizing grating with high modal reflectance is designed as a leaky-mode resonant diffraction grating. The grating can be etched in a thin layer of high-index dielectric which is sputtered on the fiber facet. We used the refractive index of Ta2O5 for such a layer. We found that the modal reflectance can be close to 0.95 for TE polarization and the polarization extinction ratio reaches 18 dB. Rigorous coupled wave analysis was used for fast optimization of the grating parameters, while aperiodic rigorous coupled wave analysis, the Fourier modal method and the finite-difference time-domain method were compared and used to compute the modal reflectance of the designed gratings.

  11. Modelling aluminium wire bond reliability in high power OMP devices

    NARCIS (Netherlands)

    Kregting, R.; Yuan, C.A.; Xiao, A.; Bruijn, F. de

    2011-01-01

    In an RF power application such as the OMP, the wires are subjected to high current (because of the high power) and high temperature (because of the heat from the IC and Joule heating in the wire itself). Moreover, the wire shape is essential to the RF performance. Hence, the aluminium wire is

  12. GaN-based High Power High Frequency Wide Range LLC Resonant Converter, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — SET Group will design, build and demonstrate a Gallium Nitride (GaN) based High Power High Frequency Wide Range LLC Resonant Converter capable of handling high power...

  13. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade the Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply a WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  14. Automated high speed volume computed tomography for inline quality control

    International Nuclear Information System (INIS)

    Hanke, R.; Kugel, A.; Troup, P.

    2004-01-01

    Increasing complexity of innovative products as well as growing requirements on quality and reliability call for more detailed knowledge of the internal structures of manufactured components, obtained by 100% inspection rather than just by sampling tests. First-step solutions, such as radioscopic inline inspection machines equipped with automated data evaluation software, have become state of the art on the production floor in recent years. However, these machines provide only ordinary two-dimensional information and deliver no volume data, e.g. to evaluate the exact position or shape of detected defects. One way to solve this problem is the application of X-ray computed tomography (CT). Compared to the performance of first-generation medical scanners (scanning times of many hours), modern volume CT machines for industrial applications today need about 5 minutes for a full object scan, depending on the object size. Of course, this is still too long to introduce this powerful method into inline production quality control. In order to gain acceptance, the scanning time, including subsequent data evaluation, must be decreased significantly and adapted to the manufacturing cycle times. This presentation demonstrates the new technical set-up, reconstruction results and the methods for high-speed volume data evaluation of a new fully automated high-speed CT scanner with cycle times below one minute for an object size of less than 15 cm. This will directly create new opportunities in the design and construction of more complex objects. (author)

  15. Los Alamos high-power proton linac designs

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, G.P. [Los Alamos National Laboratory, NM (United States)

    1995-10-01

    Medium-energy high-power proton linear accelerators have been studied at Los Alamos as drivers for spallation neutron applications requiring large amounts of beam power. Reference designs for such accelerators are presented, important design factors are reviewed, and issues and concerns specific to this unprecedented power regime are discussed.

  16. Performance of a high efficiency high power UHF klystron

    International Nuclear Information System (INIS)

    Konrad, G.T.

    1977-03-01

    A 500 kW c-w klystron was designed for the PEP storage ring at SLAC. The tube operates at 353.2 MHz, 62 kV, a microperveance of 0.75, and a gain of approximately 50 dB. Stable operation is required for a VSWR as high as 2:1 at any phase angle. The design efficiency is 70%. To obtain this value of efficiency, a second-harmonic cavity is used in order to produce a very tightly bunched beam in the output gap. At the present time it is planned to install 12 such klystrons in PEP. A tube with a reduced-size collector was operated at 4% duty at 500 kW; an efficiency of 63% was observed. The same tube was operated up to 200 kW c-w for PEP accelerator cavity tests. A full-scale c-w tube reached 500 kW at 65 kV with an efficiency of 55%. In addition to power and phase measurements into a matched load, some data at various load mismatches are presented.
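
    The quoted operating point is self-consistent, as the short calculation below shows (our arithmetic from the abstract's numbers): a microperveance of 0.75 at 62 kV gives a beam current near 11.6 A, a beam power near 720 kW, and therefore about 500 kW of RF output at the 70% design efficiency.

      # Consistency check of the quoted klystron operating point
      microperveance = 0.75
      voltage = 62e3                                     # beam voltage [V]
      i_beam = microperveance * 1e-6 * voltage ** 1.5    # perveance: I = K * V^(3/2)
      p_beam = voltage * i_beam
      print(f"beam current ~{i_beam:.1f} A, beam power ~{p_beam / 1e3:.0f} kW")
      print(f"RF output at 70% efficiency ~{0.70 * p_beam / 1e3:.0f} kW")  # ~500 kW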

  17. Temperature Stabilized Characterization of High Voltage Power Supplies

    CERN Document Server

    Krarup, Ole

    2017-01-01

    High-precision measurements of the masses of nuclear ions in the ISOLTRAP experiment rely on an MR-ToF. A major source of noise and drift is the instability of the high-voltage power supplies employed. Electrical noise and temperature changes can broaden peaks in time-of-flight spectra and shift the position of peaks between runs. In this report we investigate how the noise and drift of high-voltage power supplies can be characterized. Results indicate that analog power supplies generally have better relative stability than digitally controlled ones, and that the high temperature coefficients of all power supplies merit efforts to stabilize them.
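
    As an illustration of why such drift matters (with assumed numbers, not figures from the report): for an ion accelerated through a potential V the flight time scales as 1/sqrt(V), so a relative voltage drift dV/V shifts the time of flight by about dV/(2V).

      # Illustrative drift estimate; the supply voltage, temperature coefficient and
      # temperature change are assumed values, not measurements from the report.
      v_nominal = 1000.0       # V, nominal electrode potential (assumed)
      temp_coeff_ppm = 50.0    # ppm/K drift of the supply (assumed)
      delta_temp = 2.0         # K change between runs (assumed)
      dv = v_nominal * temp_coeff_ppm * 1e-6 * delta_temp
      rel_tof_shift = dv / (2.0 * v_nominal)    # dt/t = dV/(2V) for t ~ 1/sqrt(V)
      print(f"dV = {dv * 1e3:.0f} mV  ->  relative ToF shift = {rel_tof_shift:.1e}")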

  18. Analysis of control room computers at nuclear power plants

    International Nuclear Information System (INIS)

    Leijonhufvud, S.; Lindholm, L.

    1984-03-01

    The following problems are analyzed: the development of a system (hardware and software), data, the acquisition of the system, and operation and service. The findings are: most reliability problems can be solved by duplicating critical units; reliability in software is a quality that can only be created through development; the reliability of computer systems in extremely unusual situations cannot be quantified or verified, except possibly for very small and functionally simple systems; to attain the highest possible reliability, such simple systems have to contain one or very few functions, be functionally simple, and be application-transparent, i.e. the internal function of the system should be independent of the status of the process; a computer system will compete successfully with other possible systems regarding reliability for the following reasons: if the function is simple enough for other systems, the computer system would be small; if the functions cannot be realized by other systems, the computer system would complement the human effort, and the man-machine system would be a better solution than no system, possibly better than human function alone. (Aa)

  19. A high performance scientific cloud computing environment for materials simulations

    Science.gov (United States)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
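
    The automatic creation of virtual clusters described above is, at its lowest level, a matter of launching a set of identically configured instances; the sketch below is not the authors' SCC toolset but a minimal boto3 illustration, and the AMI ID, instance type and key name are placeholders that would need real values.

      # Minimal sketch of starting a small virtual cluster on EC2 with boto3;
      # the image ID, instance type and key name are placeholders, not real resources.
      import boto3

      def launch_virtual_cluster(n_nodes: int = 4) -> list:
          ec2 = boto3.client("ec2")
          response = ec2.run_instances(
              ImageId="ami-0123456789abcdef0",   # placeholder scientific VM image
              InstanceType="c5.xlarge",          # placeholder instance type
              KeyName="my-keypair",              # placeholder SSH key pair
              MinCount=n_nodes,
              MaxCount=n_nodes,
          )
          return [inst["InstanceId"] for inst in response["Instances"]]

      print(launch_virtual_cluster(4))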

  20. Elucidation of complicated phenomena in nuclear power field by computation science techniques

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1996-01-01

    In this crossover research, the complicated phenomena treated in the nuclear power field are elucidated and, to connect them to engineering application research, high-speed computer utilization technology is developed and large-scale numerical simulation based on it is carried out. As for the scale of calculation, the aim is to realize three-dimensional numerical simulations of about 100 million mesh points, among the largest in the world, and to carry the results over into engineering research. In nuclear power plants of the next generation, further improvement of economic efficiency is demanded together with ensuring safety, and it is important that the design window be large. Confirming the size of the design window quantitatively is not easy, and it is very difficult to separate observed phenomena into elementary events. As a method of forecasting and reproducing complicated phenomena and quantifying the design window, large-scale numerical simulation is promising. The roles of theory, experiment and computational science are discussed. The system for executing this crossover research is described. (K.I.)