WorldWideScience

Sample records for technology sponsors large-scale

  1. Sensitivity technologies for large scale simulation

    International Nuclear Information System (INIS)

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large-scale optimization, uncertainty quantification, reduced-order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing code; equally important, the work focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time-domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady-state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady-state internal flows subject to convection-diffusion. Real-time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time-domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first
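
    For readers unfamiliar with the two approaches the abstract contrasts, the standard steady-state formulation (a textbook summary, not taken from the report itself) is:

```latex
% Steady-state residual R(u,p) = 0 with state u and parameters p; objective J(u,p).
\frac{dJ}{dp} = \frac{\partial J}{\partial p} + \frac{\partial J}{\partial u}\,\frac{du}{dp}
% Direct method: one linear solve per parameter,
\frac{\partial R}{\partial u}\,\frac{du}{dp} = -\frac{\partial R}{\partial p}
% Adjoint method: one linear solve per objective, independent of the number of parameters,
\Big(\frac{\partial R}{\partial u}\Big)^{T}\lambda = \Big(\frac{\partial J}{\partial u}\Big)^{T},
\qquad
\frac{dJ}{dp} = \frac{\partial J}{\partial p} - \lambda^{T}\,\frac{\partial R}{\partial p}
```

    This cost asymmetry is why adjoints suit source-inversion problems with many unknowns, while direct sensitivities suit problems with few parameters and many outputs.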

  2. Large-scale demonstration of D&D technologies

    International Nuclear Information System (INIS)

    Bhattacharyya, S.K.; Black, D.B.; Rose, R.W.

    1997-01-01

    It is becoming increasingly evident that new technologies will need to be utilized for decontamination and decommissioning (D&D) activities in order to assure safe and cost-effective operations. The magnitude of the international D&D problem is sufficiently large in anticipated cost (hundreds of billions of dollars) and in elapsed time (decades) that the utilization of new technologies should lead to substantial improvements in cost and safety performance. Adoption of new technologies in the generally highly contaminated D&D environments requires assurances that the technology will perform as advertised. Such assurances can be obtained from demonstrations of the technology in environments that are similar to the actual environments without being quite as contaminated and hazardous. The Large Scale Demonstration Project (LSDP) concept was designed to provide such a function. The first LSDP funded by the U.S. Department of Energy's Environmental Management Office (EM) was on the Chicago Pile 5 (CP-5) Reactor at Argonne National Laboratory. The project, conducted by a Strategic Alliance for Environmental Restoration, has completed demonstrations of 10 D&D technologies and is in the process of comparing their performance to baseline technologies. At the conclusion of the project, a catalog of performance comparisons of these technologies will be developed that will be suitable for use by future D&D planners

  3. Technologies and challenges in large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Engholm-Keller, Kasper; Larsen, Martin Røssel

    2013-01-01

    become the main technique for discovery and characterization of phosphoproteins in a nonhypothesis driven fashion. In this review, we describe methods for state-of-the-art MS-based analysis of protein phosphorylation as well as the strategies employed in large-scale phosphoproteomic experiments...... with focus on the various challenges and limitations this field currently faces....

  4. Status of large scale wind turbine technology development abroad

    Institute of Scientific and Technical Information of China (English)

    Ye LI; Lei DUAN

    2016-01-01

    To facilitate the large scale (multi-megawatt) wind turbine development in China, the foreign efforts and achievements in the area are reviewed and summarized. Not only the popular horizontal axis wind turbines on-land but also the offshore wind turbines, vertical axis wind turbines, airborne wind turbines, and shroud wind turbines are discussed. The purpose of this review is to provide a comprehensive comment and assessment about the basic work principle, economic aspects, and environmental impacts of turbines.

  5. Optimization of FTA technology for large scale plant DNA isolation ...

    African Journals Online (AJOL)

    Conventional methods for DNA acquisition and storage require expensive reagents and equipment. Experimental fields located in remote areas and large sample sizes present a greater challenge to financially constrained institutions in developing countries. FTA™ technology uses a single format utilizing basic tools found in ...

  6. Vision for single flux quantum very large scale integrated technology

    International Nuclear Information System (INIS)

    Silver, Arnold; Bunyk, Paul; Kleinsasser, Alan; Spargo, John

    2006-01-01

    Single flux quantum (SFQ) electronics is extremely fast and has very low on-chip power dissipation. SFQ VLSI is an excellent candidate for high-performance computing and other applications requiring extremely high-speed signal processing. Despite this, SFQ technology has generally not been accepted for system implementation. We argue that this is due, at least in part, to the use of outdated tools to produce SFQ circuits and chips. Assuming the use of tools equivalent to those employed in the semiconductor industry, we estimate the density of Josephson junctions, circuit speed, and power dissipation that could be achieved with SFQ technology. Today, CMOS lithography is at 90-65 nm with about 20 layers. Assuming equivalent technology, aggressively increasing the current density above 100 kA cm⁻² to achieve junction speeds of approximately 1000 GHz, and reducing device footprints by converting device profiles from planar to vertical, one could expect to integrate about 250 M Josephson junctions cm⁻² into SFQ digital circuits. This should enable circuit operation with clock frequencies above 200 GHz and place approximately 20 K gates within a radius of one clock period. As a result, complete microprocessors, including integrated memory registers, could be fabricated on a single chip
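
    The quoted figures can be sanity-checked with simple arithmetic; the sketch below assumes an on-chip signal propagation speed of roughly c/3, a common rule of thumb for superconducting microstrip that is not stated in the abstract:

```python
import math

c = 3.0e8                       # speed of light, m/s
v = c / 3.0                     # assumed on-chip propagation speed
f_clk = 200e9                   # clock frequency quoted above, Hz
radius_m = v / f_clk            # distance a signal travels in one clock period
area_cm2 = math.pi * (radius_m * 100.0) ** 2
junctions = 250e6 * area_cm2    # at the quoted 250 M junctions/cm^2
print(f"radius = {radius_m * 1e3:.2f} mm, junctions within reach = {junctions:.2e}")
```

    This gives a radius of about 0.5 mm containing roughly 2 x 10^6 junctions; at an assumed order of 100 junctions per complex gate, that is consistent in magnitude with the ~20 K gates quoted.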

  7. Vision for single flux quantum very large scale integrated technology

    Energy Technology Data Exchange (ETDEWEB)

    Silver, Arnold [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Bunyk, Paul [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Kleinsasser, Alan [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Spargo, John [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States)

    2006-05-15

    Single flux quantum (SFQ) electronics is extremely fast and has very low on-chip power dissipation. SFQ VLSI is an excellent candidate for high-performance computing and other applications requiring extremely high-speed signal processing. Despite this, SFQ technology has generally not been accepted for system implementation. We argue that this is due, at least in part, to the use of outdated tools to produce SFQ circuits and chips. Assuming the use of tools equivalent to those employed in the semiconductor industry, we estimate the density of Josephson junctions, circuit speed, and power dissipation that could be achieved with SFQ technology. Today, CMOS lithography is at 90-65 nm with about 20 layers. Assuming equivalent technology, aggressively increasing the current density above 100 kA cm⁻² to achieve junction speeds of approximately 1000 GHz, and reducing device footprints by converting device profiles from planar to vertical, one could expect to integrate about 250 M Josephson junctions cm⁻² into SFQ digital circuits. This should enable circuit operation with clock frequencies above 200 GHz and place approximately 20 K gates within a radius of one clock period. As a result, complete microprocessors, including integrated memory registers, could be fabricated on a single chip.

  8. Public reactions to large-scale energy technologies

    Energy Technology Data Exchange (ETDEWEB)

    Midden, C J; Daamen, D D; Verplanken, B

    1986-02-01

    In the first part of this article we discuss certain factors which influence the perception of risks connected to energy technologies. Several studies show that the catastrophic potential and the degree to which people consider negative consequences to be controllable are the main factors influencing this perception. In the next part, differences between experts and lay people are discussed. Lay people are found to be poor at making numerical estimates of the annual fatality frequencies of different causes of death: high frequencies appear to be underestimated and low frequencies overestimated. We conclude that differences between experts and lay people may be partly explained by the use of different concepts in talking about risks. In the third part, attitudes on the use of nuclear energy and coal for the generation of electricity are discussed. Attitudes are determined by the perceived probability of negative consequences rather than the expected probability of positive effects. It appears that the differences between the two groups are mostly not based on ideology but rather determined by a fairly rational trade-off of expected risks and advantages. The last part is concerned with the siting of nuclear power plants. The fact that people living near nuclear plants give a lower estimate of the risks than people living further away can be explained in a number of ways. Finally, we discuss the problem of compensation for local residents and representatives in the choice of a site for a new plant. Our conclusion is that the usefulness of such strategies depends on whether the perception of risks on a local level is based on feelings of insecurity or on an expert-like risk assessment. 4 figs., 35 refs.

  9. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    Science.gov (United States)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation Systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate the adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of the project. The results show the effectiveness of the battery storage system and of the proposed output control methods in ensuring stable operation of power grids with a large-scale PV system connected. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.
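
    The abstract does not spell out the control method; one common way a battery smooths large-scale PV output is ramp-rate limiting, sketched below with made-up parameter values:

```python
def ramp_limited_output(pv_power, max_ramp):
    """Limit step-to-step changes in grid injection; the battery absorbs the rest."""
    grid_out, battery_flow = [], []
    prev = pv_power[0]
    for p in pv_power:
        step = max(-max_ramp, min(max_ramp, p - prev))  # clamp the change
        grid = prev + step
        grid_out.append(grid)
        battery_flow.append(p - grid)  # positive = charging, negative = discharging
        prev = grid
    return grid_out, battery_flow

pv = [0, 500, 1200, 300, 900, 1000]        # kW, hypothetical 1-minute samples
grid, batt = ramp_limited_output(pv, 200)  # 200 kW/min limit (assumed)
print(grid, batt)
```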

  10. A review of sensing technologies for small and large-scale touch panels

    Science.gov (United States)

    Akhtar, Humza; Kemao, Qian; Kakarala, Ramakrishna

    2017-06-01

    A touch panel is an input device for human-computer interaction. It consists of a network of sensors, a sampling circuit and a microcontroller for detecting and locating a touch input. Touch input can come from either a finger or a stylus, depending upon the type of touch technology. These touch panels provide an intuitive and collaborative workspace so that people can perform various tasks with their fingers instead of traditional input devices like the keyboard and mouse. Touch sensing technology is not new. At the time of this writing, various technologies are available in the market, and this paper reviews the most common ones. We review traditional designs and sensing algorithms for touch technology. We also observe that, due to its various strengths, capacitive touch will dominate the large-scale touch panel industry in years to come. In the end, we discuss the motivation for doing academic research on large-scale panels.
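
    As a concrete illustration of the "detecting and locating" step described above, a mutual-capacitance grid controller can threshold the sensed deltas and take a weighted centroid. This is a minimal sketch; real controllers add filtering, baselining and palm rejection:

```python
def locate_touch(deltas, threshold):
    """deltas[row][col]: capacitance change at each row/column intersection."""
    x_sum = y_sum = total = 0.0
    for r, row in enumerate(deltas):
        for c, v in enumerate(row):
            if v >= threshold:
                total += v
                x_sum += c * v
                y_sum += r * v
    if total == 0:
        return None                        # no touch detected
    return (x_sum / total, y_sum / total)  # sub-cell touch coordinates

frame = [[0, 1, 0], [2, 9, 3], [0, 2, 1]]  # toy sensor frame
print(locate_touch(frame, 2))              # -> (1.0625, 1.125)
```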

  11. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R&S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University of Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules on the basis of multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) program of the Dutch Ministry of Economic Affairs and the Dutch Ministry of Education, Culture and Sciences. The aim of the EET project is to reduce the cost of a solar module by 50%, both by increasing the conversion efficiency and by developing cheap processes for large-scale production

  12. Automatic Measurement in Large-Scale Space with the Laser Theodolite and Vision Guiding Technology

    Directory of Open Access Journals (Sweden)

    Bin Wu

    2013-01-01

    The multi-theodolite intersection measurement is a traditional approach to coordinate measurement in large-scale space. However, the procedure of manual labeling and aiming results in a low automation level and low measuring efficiency, and the measurement accuracy is easily affected by manual aiming error. Based on traditional theodolite measuring methods, this paper introduces the mechanism of the vision measurement principle and presents a novel automatic measurement method for large-scale space and large workpieces (equipment) that combines laser theodolite measuring and vision guiding technologies. The measuring mark is established on the surface of the measured workpiece by the collimating laser, which is coaxial with the sight axis of the theodolite, so cooperation targets or manual marks are no longer needed. With the theoretical model data and multiresolution visual imaging and tracking technology, the method realizes automatic, quick, and accurate measurement of large workpieces in large-scale space. Meanwhile, the impact of human error is reduced and the measuring efficiency is improved. Therefore, this method has significant ramifications for the measurement of large workpieces, such as measuring the geometric appearance characteristics of ships, large aircraft, and spacecraft, and deformation monitoring of large buildings and dams.
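
    The classical two-theodolite intersection that this method automates reduces, in the horizontal plane, to triangulation from a known baseline (standard surveying geometry, summarized here for context):

```latex
% Stations A and B separated by a known baseline b; horizontal angles
% \alpha (at A) and \beta (at B) measured from the baseline to target T.
d_A = \frac{b \sin\beta}{\sin(\alpha + \beta)}
% Target coordinates in a frame with A at the origin, x-axis along AB:
x_T = d_A \cos\alpha, \qquad y_T = d_A \sin\alpha
```

    The vision-guiding contribution is to aim the instruments at the laser-projected mark automatically, rather than to change this underlying geometry.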

  13. Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.

    Science.gov (United States)

    Hardt, Marah J; Flett, Keith; Howell, Colleen J

    2017-08-01

    Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts.

  14. Review of DC System Technologies for Large Scale Integration of Wind Energy Systems with Electricity Grids

    Directory of Open Access Journals (Sweden)

    Sheng Jie Shao

    2010-06-01

    The ever-increasing development and availability of power electronic systems is the underpinning technology that enables large-scale integration of wind generation plants with the electricity grid. As the size and power capacity of wind turbines continue to increase, so does the need to place these significantly large structures at offshore locations. DC grids and associated power transmission technologies provide opportunities for cost reduction and electricity grid impact minimization, as the bulk power is concentrated at a single point of entry. As a result, planning, optimization and impact can be studied and carefully controlled, minimizing investment risk as well as power system stability issues. This paper discusses the key technologies associated with DC grids for offshore wind farm applications.

  15. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    Science.gov (United States)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10-year objective of a program to establish the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5-year program were derived from a set of 5-year objectives deduced from the 10-year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 × 10⁸ peak W/year of $0.50 cells was projected for the year 1985. The development of other photovoltaic conversion systems was assigned to longer-range development roles. The status of the technology developments and the applicability of solar arrays in particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5-year phase of the program is $268.5M.

  16. Windows for innovation: a story of two large-scale technologies

    International Nuclear Information System (INIS)

    Hazelrigg, G.A.; Roth, E.B.

    1982-01-01

    In this report, two technologies (communication satellites and light water reactors) are examined to determine the technological, institutional, and economic forces that were at play during their development and implementation. These two technologies embody a wide variety of issues encountered in technology development. Both are large-scale technologies that comprise many component technologies; both have profound and widespread social impacts. Communication satellites have enabled extensive international communications and have been instrumental in fostering economic development. Light water reactors have attributes which make them potentially highly beneficial to society; however, they are plagued with potential hazards of considerable magnitude, and their future is being debated in several nations. Both technologies were developed primarily under support from the U.S. government. Although both technologies initially appeared to meet civilian market demands well and promised to enjoy successful periods of implementation, the U.S. nuclear industry may not survive. On the other hand, communication satellites are being implemented at a rate that surpasses the most favorable predictions made in the 1960s

  17. Large-scale decontamination and decommissioning technology demonstration project at a former uranium metal production facility

    International Nuclear Information System (INIS)

    Martineit, R.A.; Borgman, T.D.; Peters, M.S.; Stebbins, L.L.

    1997-01-01

    The Department of Energy's (DOE) Office of Science and Technology Decontamination and Decommissioning (D&D) Focus Area, led by the Federal Energy Technology Center, has been charged with improving upon baseline D&D technologies, with the goal of demonstrating and validating more cost-effective and safer technologies to characterize, deactivate, survey, decontaminate, dismantle, and dispose of surplus structures, buildings, and their contents at DOE sites. The D&D Focus Area's approach to verifying the benefits of the improved D&D technologies is to use them in large-scale technology demonstration (LSTD) projects at several DOE sites. The Fernald Environmental Management Project (FEMP) was selected to host one of the first three LSTDs awarded by the D&D Focus Area. The FEMP is a DOE facility near Cincinnati, Ohio, that was formerly engaged in the production of high-quality uranium metal. The FEMP is a Superfund site which has completed its RI/FS process and is currently undergoing environmental restoration. With the FEMP's selection to host an LSTD, the FEMP was immediately faced with some challenges. The primary challenge was that this LSTD was to be integrated into the FEMP's Plant 1 D&D Project, an ongoing D&D project for which a firm fixed-price contract had been issued to the D&D contractor. Thus, interference with the baseline D&D project could have significant financial implications. Other challenges included defining and selecting meaningful technology demonstrations, finding and selecting technology providers, and integrating the technology into the baseline D&D project. To date, twelve technologies have been selected, and six have been demonstrated. The technology demonstrations have yielded a high proportion of "winners." All demonstrated technologies will be evaluated for incorporation into the FEMP's baseline D&D

  18. Graphene/MoS2 hybrid technology for large-scale two-dimensional electronics.

    Science.gov (United States)

    Yu, Lili; Lee, Yi-Hsien; Ling, Xi; Santos, Elton J G; Shin, Yong Cheol; Lin, Yuxuan; Dubey, Madan; Kaxiras, Efthimios; Kong, Jing; Wang, Han; Palacios, Tomás

    2014-06-11

    Two-dimensional (2D) materials have generated great interest in the past few years as a new toolbox for electronics. This family of materials includes, among others, metallic graphene, semiconducting transition metal dichalcogenides (such as MoS2), and insulating boron nitride. These materials and their heterostructures offer excellent mechanical flexibility, optical transparency, and favorable transport properties for realizing electronic, sensing, and optical systems on arbitrary surfaces. In this paper, we demonstrate a novel technology for constructing large-scale electronic systems based on graphene/molybdenum disulfide (MoS2) heterostructures grown by chemical vapor deposition. We have fabricated high-performance devices and circuits based on this heterostructure, where MoS2 is used as the transistor channel and graphene as contact electrodes and circuit interconnects. We provide a systematic comparison of the graphene/MoS2 heterojunction contact to more traditional MoS2-metal junctions, as well as a theoretical investigation, using density functional theory, of the origin of the Schottky barrier height. The tunability of the graphene work function with electrostatic doping significantly improves the ohmic contact to MoS2. These high-performance large-scale devices and circuits based on this 2D heterostructure pave the way for practical flexible transparent electronics.

  19. Large-scale commercial applications of the in situ vitrification remediation technology

    International Nuclear Information System (INIS)

    Campbell, B.E.; Hansen, J.E.; McElroy, J.L.; Thompson, L.E.; Timmerman, C.L.

    1994-01-01

    The first large-scale commercial application of the innovative In Situ Vitrification (ISV) remediation technology was completed at the Parsons Chemical/ETM Enterprises Superfund site in Michigan in mid-1994. This project involved treating 4,800 tons of pesticide- and mercury-contaminated soil, and also included performance of the USEPA SITE Program demonstration test for the ISV technology. The Parsons project involved consolidation and staging of contaminated soil from widespread locations on and near the site. This paper presents a brief description of the ISV technology along with case-study information on these two sites and the performance of the ISV technology on them. The paper also reviews other remediation projects where ISV has been identified as the/a preferred remedy, and where ISV is currently planned for use. These sites include soils contaminated with pesticides, dioxin, PCP, paint wastes, and a variety of heavy metals. This review of additional sites also includes a description of a planned radioactive mixed-waste remediation project in Australia involving large amounts of plutonium, uranium, lead, beryllium, and metallic and other debris buried in limestone and dolomitic soil burial pits. Initial test work has been completed on this application, and preparations are now underway for pilot testing in Australia. This project will demonstrate the applicability of the ISV technology to the challenging application of buried mixed wastes

  20. Microfluidic very large-scale integration for biochips: Technology, testing and fault-tolerant design

    DEFF Research Database (Denmark)

    Araci, Ismail Emre; Pop, Paul; Chakrabarty, Krishnendu

    2015-01-01

    Microfluidic biochips are replacing conventional biochemical analyzers by integrating all the necessary functions for biochemical analysis using microfluidics. Biochips are used in many application areas, such as in vitro diagnostics, drug discovery, biotech and ecology. The focus of this paper is on continuous-flow biochips, where the basic building block is a microvalve. By combining these microvalves, more complex units such as mixers, switches and multiplexers can be built, hence the name of the technology, "microfluidic Very Large-Scale Integration" (mVLSI). A roadblock ... The paper presents the state-of-the-art in mVLSI platforms and emerging research challenges in the area of continuous-flow microfluidics, focusing on testing techniques and fault-tolerant design.

  1. New technologies for large-scale micropatterning of functional nanocomposite polymers

    Science.gov (United States)

    Khosla, A.; Gray, B. L.

    2012-04-01

    We present a review of different micropatterning technologies for flexible elastomeric functional nanocomposites, with a particular emphasis on mold materials and processes for production of large-size substrates. The functional polymers include electrically conducting and magnetic materials developed at the Micro-instrumentation Laboratory at Simon Fraser University, Canada. We present a chart that compares many of these different conductive and magnetic functional nanocomposites and their measured characteristics. Furthermore, we have previously reported hybrid processes for nanocomposite polymers micromolded against SU-8 photoepoxy masters. However, SU-8 is typically limited to substrate sizes that are compatible with microelectronics processing, as a microelectronics UV-patterning step is typically involved, and de-molding problems are observed. Recently, we have developed new processes that address the problems faced with SU-8 molds. These new technologies for micropatterning nanocomposites involve new substrate materials. A low-cost poly(methyl methacrylate) (PMMA) microfabrication technology has been developed, in which micromolds are fabricated via either CO2 laser ablation or deep-UV exposure. We have previously reported this large-scale patterning technique using laser ablation. Finally, we compare the two PMMA processes for producing micromolds for nanocomposites.

  2. Status of the technology development of large scale HTS generators for wind turbine

    Energy Technology Data Exchange (ETDEWEB)

    Le, T. D.; Kim, J. H.; Kim, D. J.; Boo, C. J.; Kim, H. M. [Jeju National University, Jeju (Korea, Republic of)

    2015-06-15

    Large wind turbine generators with high temperature superconductors (HTS) are under continuous development because of their advantages, such as weight and volume reduction and increased efficiency compared with conventional technologies. In addition, the wind turbine market is growing over time, increasing the capacity and energy production of the wind farms installed and the electrical power of the generators installed, thereby raising the contribution of wind power to global electricity demand. In this study, a forecast of wind energy development is first presented, followed by the recent status of technology development for large-scale HTS generators (HTSG) for wind power, an explanation of HTS wire trends, cryogenic cooling system concepts, HTS magnet field coil stability, and other technological aspects of HTS generator design optimization: operating temperature, design topology, field coil shape, and levelized cost of energy. Finally, the most relevant projects and designs of HTS generators specifically for offshore wind power systems are also discussed.

  3. Status of the technology development of large scale HTS generators for wind turbine

    International Nuclear Information System (INIS)

    Le, T. D.; Kim, J. H.; Kim, D. J.; Boo, C. J.; Kim, H. M.

    2015-01-01

    Large wind turbine generators with high temperature superconductors (HTS) are under continuous development because of their advantages, such as weight and volume reduction and increased efficiency compared with conventional technologies. In addition, the wind turbine market is growing over time, increasing the capacity and energy production of the wind farms installed and the electrical power of the generators installed, thereby raising the contribution of wind power to global electricity demand. In this study, a forecast of wind energy development is first presented, followed by the recent status of technology development for large-scale HTS generators (HTSG) for wind power, an explanation of HTS wire trends, cryogenic cooling system concepts, HTS magnet field coil stability, and other technological aspects of HTS generator design optimization: operating temperature, design topology, field coil shape, and levelized cost of energy. Finally, the most relevant projects and designs of HTS generators specifically for offshore wind power systems are also discussed

  4. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, a survey was made of the state of the art of currently available electrolysis modules, and a review of the large-scale electrolysis plants installed around the world was also compiled. The main projects related to large-scale electrolysis were listed, the economics of large-scale electrolysers were discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis was evaluated. (authors)
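
    The abstract's last point, the influence of energy prices on hydrogen cost, can be illustrated with a simple energy balance; the 50 kWh/kg figure below is a typical system-level value for alkaline electrolysis and is an assumption, not a number from the ALPHEA study:

```python
def h2_electricity_cost(price_per_kwh, kwh_per_kg=50.0):
    """Electricity component of the hydrogen production cost, $/kg."""
    return price_per_kwh * kwh_per_kg

for price in (0.03, 0.06, 0.10):  # $/kWh
    print(f"{price:.2f} $/kWh -> {h2_electricity_cost(price):.2f} $/kg H2")
```

    Even this crude model shows why access to cheap 'clean power' matters: the electricity component alone spans roughly $1.50-5.00/kg across plausible tariffs.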

  5. Commercial applications of large-scale Research and Development computer simulation technologies

    International Nuclear Information System (INIS)

    Kuok Mee Ling; Pascal Chen; Wen Ho Lee

    1998-01-01

    The potential commercial applications of two large-scale R and D computer simulation technologies are presented. One such technology is based on the numerical solution of the hydrodynamics equations and is embodied in the two-dimensional Eulerian code EULE2D, which solves the hydrodynamic equations with various models for the equation of state (EOS), constitutive relations and fracture mechanics. EULE2D is an R and D code originally developed to design and analyze conventional munitions for anti-armor penetration, such as shaped charges, explosively formed projectiles, and kinetic energy rods. Simulated results agree very well with actual experiments. A commercial application presented here is the design and simulation of shaped charges for oil and gas well-bore perforation. The other R and D simulation technology is based on the numerical solution of Maxwell's partial differential equations of electromagnetics in space and time, and is implemented in the three-dimensional code FDTD-SPICE, which solves Maxwell's equations in the time domain with finite differences in the three spatial dimensions and calls SPICE for information when nonlinear active devices are involved. The FDTD method has been used in the radar cross-section modeling of military aircraft and many other electromagnetic phenomena. The coupling of the FDTD method with SPICE, a popular circuit and device simulation program, provides a powerful tool for the simulation and design of microwave and millimeter-wave circuits containing nonlinear active semiconductor devices. A commercial application of FDTD-SPICE presented here is the simulation of a two-element active antenna system. The simulation results and the experimental measurements are in excellent agreement. (Author)

  6. Integrating large-scale data and RNA technology to protect crops from fungal pathogens

    Directory of Open Access Journals (Sweden)

    Ian Joseph Girard

    2016-05-01

    With a rapidly growing human population, it is expected that plant science researchers and the agricultural community will need to increase food productivity using less arable land. This challenge is complicated by fungal pathogens and diseases, many of which can severely impact crop yield. Current measures to control fungal pathogens are either ineffective or have adverse effects on the agricultural enterprise. Thus, developing new strategies through research innovation to protect plants from pathogenic fungi is necessary to overcome these hurdles. RNA sequencing technologies are increasing our understanding of the underlying genes and gene regulatory networks mediating disease outcomes. The application of next-generation sequencing strategies to study plant-pathogen interactions has provided, and will continue to provide, unprecedented insight into the complex patterns of gene activity responsible for crop protection. However, questions remain about how biological processes in both the pathogen and the host are specified in space, directly at the site of infection, and over the infection period. The integration of cutting-edge molecular and computational tools will provide plant scientists with the arsenal required to identify genes and molecules that play a role in plant protection. Large-scale RNA sequence data can then be used to protect plants by targeting genes essential for pathogen viability, either in the production of stably transformed lines expressing RNA interference molecules or through foliar applications of double-stranded RNA.

  7. Large scale renewable power generation: advances in technologies for generation, transmission and storage

    CERN Document Server

    Hossain, Jahangir

    2014-01-01

    This book focuses on the issues of integrating large-scale renewable power generation into existing grids. It includes a new protection technique for renewable generators, along with the current status of smart grid development.

  8. Implementing Large-Scale Instructional Technology in Kenya: Changing Instructional Practice and Developing Accountability in a National Education System

    Science.gov (United States)

    Piper, Benjamin; Oyanga, Arbogast; Mejia, Jessica; Pouezevara, Sarah

    2017-01-01

    Previous large-scale education technology interventions have shown only modest impacts on student achievement. Building on results from an earlier randomized controlled trial of three different applications of information and communication technologies (ICTs) on primary education in Kenya, the Tusome Early Grade Reading Activity developed the…

  9. Unlocking biomarker discovery: large scale application of aptamer proteomic technology for early detection of lung cancer.

    Directory of Open Access Journals (Sweden)

    Rachel M Ostroff

    BACKGROUND: Lung cancer is the leading cause of cancer deaths worldwide. New diagnostics are needed to detect early-stage lung cancer because it may be cured with surgery; however, most cases are diagnosed too late for curative surgery. Here we present a comprehensive clinical biomarker study of lung cancer and the first large-scale clinical application of a new aptamer-based proteomic technology to discover blood protein biomarkers in disease. METHODOLOGY/PRINCIPAL FINDINGS: We conducted a multi-center case-control study in archived serum samples from 1,326 subjects from four independent studies of non-small cell lung cancer (NSCLC) in long-term tobacco-exposed populations. Sera were collected and processed under uniform protocols. Case sera were collected from 291 patients within 8 weeks of the first biopsy-proven lung cancer and prior to tumor removal by surgery. Control sera were collected from 1,035 asymptomatic study participants with ≥ 10 pack-years of cigarette smoking. We measured 813 proteins in each sample with a new aptamer-based proteomic technology, identified 44 candidate biomarkers, and developed a 12-protein panel (cadherin-1, CD30 ligand, endostatin, HSP90α, LRIG3, MIP-4, pleiotrophin, PRKCI, RGM-C, SCF-sR, sL-selectin, and YES) that discriminates NSCLC from controls with 91% sensitivity and 84% specificity in cross-validated training, and 89% sensitivity and 83% specificity in a separate verification set, with similar performance for early- and late-stage NSCLC. CONCLUSIONS/SIGNIFICANCE: This study is a significant advance in clinical proteomics in an area of high unmet clinical need. Our analysis exceeds, in the breadth and dynamic range of the proteome interrogated, previously published clinical studies of broad serum proteome profiling platforms, including mass spectrometry, antibody arrays, and autoantibody arrays. The sensitivity and specificity of our 12-biomarker panel improves upon published protein and gene expression panels
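
    For readers unfamiliar with the panel metrics quoted above, sensitivity and specificity follow directly from the confusion matrix; the counts below are hypothetical, chosen only to reproduce the verification-set figures:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Fraction of true cases detected, and of controls correctly ruled out."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=89, fn=11, tn=83, fp=17)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 89%, 83%
```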

  10. Data warehousing technologies for large-scale and right-time data

    DEFF Research Database (Denmark)

    Xiufeng, Liu

    heterogeneous sources into a central data warehouse (DW) by Extract-Transform-Load (ETL) at regular time intervals, e.g., monthly, weekly, or daily. This, however, becomes challenging for large-scale data and makes it hard to support near-real-time/right-time business decisions. This thesis considers some...
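
    The ETL cycle the thesis abstract refers to looks, in skeleton form, like the following (a generic sketch with hypothetical table and field names, not code from the thesis):

```python
import sqlite3

def etl(source_rows, dw_path="dw.sqlite"):
    """Extract rows from a source, transform them, load them into the DW."""
    conn = sqlite3.connect(dw_path)
    conn.execute("CREATE TABLE IF NOT EXISTS sales(day TEXT, amount REAL)")
    # Transform: drop incomplete rows, coerce types.
    cleaned = ((r["day"], float(r["amount"]))
               for r in source_rows if r.get("amount") is not None)
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)  # load
    conn.commit()
    conn.close()

etl([{"day": "2013-01-01", "amount": "9.5"},
     {"day": "2013-01-02", "amount": None}])
```

    The scaling problem the thesis addresses is precisely that this batch pattern, run monthly or nightly, breaks down when data volumes grow or when decisions must reflect data within minutes.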

  11. Research on precision grinding technology of large scale and ultra thin optics

    Science.gov (United States)

    Zhou, Lian; Wei, Qiancai; Li, Jie; Chen, Xianhua; Zhang, Qinghua

    2018-03-01

    The flatness and parallelism errors of large-scale, ultra-thin optics have an important influence on subsequent polishing efficiency and accuracy. In order to realize high-precision grinding of these ductile elements, a low-deformation vacuum chuck was designed first, which was used to clamp the optic with high supporting rigidity over the full aperture. The optic was then ground flat under vacuum adsorption. After machining, the vacuum system was turned off, and the form error of the optic was measured on-machine using a displacement sensor after elastic restitution. The flatness was converged to high accuracy by compensation machining, whose trajectories were integrated with the measurement result. To obtain high parallelism, the optic was turned over and compensation-ground using the form error of the vacuum chuck. Finally, a grinding experiment on a large-scale, ultra-thin fused silica optic with an aperture of 430 mm × 430 mm × 10 mm was performed. The best P-V flatness of the optic was below 3 μm, and the parallelism was below 3″. This machining technique has been applied in batch grinding of large-scale, ultra-thin optics.

  12. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    Science.gov (United States)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  13. Reflections on the political economy of large-scale technology using the example of German fast-breeder development

    International Nuclear Information System (INIS)

    Keck, O.

    1981-01-01

    Proceeding from Anglo-Saxon opinions which, from a liberal point of view, criticize the German practice of research policy - state centres of large-scale research and state subventions for research and development in industry - as inefficient, the author empirically verified these statements taking the German fast breeder project as an example. If the case of the German fast breeder can be generalized, this has consequences for research policy practice and for other technologies. Supporters as well as opponents of large-scale technology today proceed from the assumption that almost every technology can be made commercially viable given sufficient money and personnel. This is a myth which owes its existence to the technical success of great projects in non-commercial fields. The German fast breeder project confirms the opinion that the recipes for success of these non-commercial projects cannot be applied to the field of commercial technology. The results of this study suggest that the practice and theory of technology policy can be misdirected if they are uncritically oriented to the forms of state intervention so far used in large-scale technology. (orig./HSCH) [de

  14. Development of technology for the large-scale preparation of 60Co polymer film source

    International Nuclear Information System (INIS)

    Udhayakumar, J.; Pardeshi, G.S.; Gandhi, Shymala S.; Chakravarty, Rubel; Kumar, Manoj; Dash, Ashutosh; Venkatesh, Meera

    2008-01-01

    60Co sources (∼37 kBq) in the form of a thin film are widely used in position identification of perforation in offshore oil-well explorations. This paper describes the large-scale preparation of such sources using a radioactive polymer containing 60Co. 60Co was extracted into chloroform containing 8-hydroxyquinoline. The chloroform layer was mixed with polymethyl methacrylate (PMMA) polymer. A large film was prepared using the polymer solution containing the complex. The polymer film was then cut into circular sources, mounted on a source holder and supplied to various users

  15. Assessment of Vehicle Sizing, Energy Consumption and Cost Through Large Scale Simulation of Advanced Vehicle Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Moawad, Ayman [Argonne National Lab. (ANL), Argonne, IL (United States); Kim, Namdoo [Argonne National Lab. (ANL), Argonne, IL (United States); Shidore, Neeraj [Argonne National Lab. (ANL), Argonne, IL (United States); Rousseau, Aymeric [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    The U.S. Department of Energy (DOE) Vehicle Technologies Office (VTO) has been developing more energy-efficient and environmentally friendly highway transportation technologies that will enable America to use less petroleum. The long-term aim is to develop "leapfrog" technologies that will provide Americans with greater freedom of mobility and energy security, while lowering costs and reducing impacts on the environment. This report reviews the results of the DOE VTO. It gives an assessment of the fuel and light-duty vehicle technologies that are most likely to be established, developed, and eventually commercialized during the next 30 years (up to 2045). Because of the rapid evolution of component technologies, this study is performed every two years to continuously update the results based on the latest state-of-the-art technologies.

  16. Economic Impact of Large-Scale Deployment of Offshore Marine and Hydrokinetic Technology in Oregon Coastal Counties

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, T. [Bureau of Ocean Energy Management (BOEM), Washington, DC (United States); Tegen, S. [Bureau of Ocean Energy Management (BOEM), Washington, DC (United States); Beiter, P. [Bureau of Ocean Energy Management (BOEM), Washington, DC (United States)

    2015-03-01

    To begin understanding the potential economic impacts of large-scale WEC technology, the Bureau of Ocean Energy Management (BOEM) commissioned the National Renewable Energy Laboratory (NREL) to conduct an economic impact analysis of large-scale WEC deployment for Oregon coastal counties. This report follows a previously published report by BOEM and NREL on the jobs and economic impacts of WEC technology for the entire state (Jimenez and Tegen 2015). As in Jimenez and Tegen (2015), this analysis examined two deployment scenarios in the 2026-2045 timeframe: the first scenario assumed 13,000 megawatts (MW) of WEC technology deployed during the analysis period, and the second assumed 18,000 MW of WEC technology deployed by 2045. Both scenarios require major technology and cost improvements in the WEC devices. The study addresses very large-scale deployment so readers can examine and discuss the potential of a successful and very large WEC industry. The 13,000-MW scenario is used as the basis for the county analysis, as it is the smaller of the two. Sensitivity studies examined the effects of a robust in-state WEC supply chain. The region of analysis comprises the seven coastal counties in Oregon—Clatsop, Coos, Curry, Douglas, Lane, Lincoln, and Tillamook—so estimates of jobs and other economic impacts are specific to this coastal county area.

  17. Large-Scale Campus Computer Technology Implementation: Lessons from the First Year.

    Science.gov (United States)

    Nichols, Todd; Frazer, Linda H.

    The purpose of the Elementary Technology Demonstration Schools (ETDS) Project, funded by IBM and Apple, Inc., was to demonstrate the effectiveness of technology in accelerating the learning of low achieving at-risk students and enhancing the education of high achieving students. The paper begins by giving background information on the district,…

  18. Factors Affecting the Rate of Penetration of Large-Scale Electricity Technologies: The Case of Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    James R. McFarland; Howard J. Herzog

    2007-05-14

    This project falls under the Technology Innovation and Diffusion topic of the Integrated Assessment of Climate Change Research Program. The objective was to better understand the critical variables that affect the rate of penetration of large-scale electricity technologies in order to improve their representation in integrated assessment models. We conducted this research in six integrated tasks. In our first two tasks, we identified potential factors that affect penetration rates through discussions with modeling groups and through case studies of historical precedent. In the next three tasks, we investigated in detail three potential sets of critical factors: industrial conditions, resource conditions, and regulatory/environmental considerations. Research to assess the significance and relative importance of these factors involved the development of a microeconomic, system dynamics model of the US electric power sector. Finally, we implemented the penetration rate models in an integrated assessment model. While the focus of this effort is on carbon capture and sequestration technologies, much of the work will be applicable to other large-scale energy conversion technologies.
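
    Integrated assessment models commonly represent such penetration with a logistic (S-curve) share model; the sketch below is a generic illustration of the shape, not the project's system dynamics model, and all parameter values are hypothetical:

```python
import math

def logistic_share(t, t_mid, growth_rate, saturation=1.0):
    """Market share of a technology t years after introduction."""
    return saturation / (1.0 + math.exp(-growth_rate * (t - t_mid)))

# Hypothetical CCS build-out reaching half of its attainable share ~25 years in:
for t in range(0, 51, 10):
    print(t, round(logistic_share(t, t_mid=25, growth_rate=0.25), 3))
```

    The factors the project studies (industrial conditions, resource conditions, regulation) enter such models through the growth rate and the saturation level.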

  19. Large scale deployment of non power applications (NPAs) and spin-off technologies in rural sector

    International Nuclear Information System (INIS)

    Patankar, A.M.; Mule, S.S.

    2009-01-01

    Over the past 50 years, a large body of indigenous Science and Technology (S&T) know-how has been generated in various national laboratories and, in parallel, several technologies have been imported. The urban sector has received the most attention by way of deployment of a large number of these technologies in urban areas, resulting in rapid urban development and leading to an urban-rural divide in terms of prosperity and opportunities. Further, India's young population is expected to be the largest in the world in the decades ahead, at over 500 million; creating gainful and productive work for them is the greatest challenge. Technical know-how generated in national laboratories related to basic needs such as water, food, energy and environment has been underutilized. Deployment and adaptation of this know-how to rural needs could provide a creative opportunity for the expected 500 million youths in rural and urban India to contribute to the national wealth, with prosperity for everybody including villages

  20. Survey and research for the enhancement of large-scale technology development 1. Japan's large-scale technology development and the effects; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 1. Nippon no daikibo gijutsu kaihatsu to sono koka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    A survey was conducted into the effects of projects implemented under the large-scale industrial technology research and development system. In the development of 'ultraperformance computers,' each of the technologies is being widely utilized, and the data service system of Nippon Telegraph and Telephone Public Corporation and the large computer (HITAC8800) owe much of their success to the fruits of the development endeavor. In the development of 'desulfurization technology,' the fruits are in use by Tokyo Electric Power Co., Inc., and Chubu Electric Power Co., Inc., incorporated into their desulfurization systems. Although there is no practical plant based on the 'great-depth remotely controlled submarine oil drilling rig,' the oceanic technologies and control methods developed are being utilized in various fields. The 'seawater desalination and by-product utilization' project established technologies of the top level in the world, including the manufacture of concrete evaporators and related technologies; eleven plants have been completed utilizing the fruits of the development. In the field of 'electric vehicles,' there is no commercialization in progress due to problems in cost effectiveness, though remarkable improvement has been achieved in terms of performance; technologies for weight reduction, semiconductor devices, battery parts and components, etc., are being utilized in many fields. (NEDO)

  1. Improved technique that allows the performance of large-scale SNP genotyping on DNA immobilized by FTA technology.

    Science.gov (United States)

    He, Hongbin; Argiro, Laurent; Dessein, Helia; Chevillard, Christophe

    2007-01-01

    FTA technology is a novel method designed to simplify the collection, shipment, archiving and purification of nucleic acids from a wide variety of biological sources. The number of punches that can normally be obtained from a single specimen card is often, however, insufficient for testing the large numbers of loci required to identify genetic factors that control human susceptibility or resistance to multifactorial diseases. In this study, we propose an improved technique for performing large-scale SNP genotyping. We applied a whole genome amplification method to amplify DNA from buccal cell samples stabilized using FTA technology. The results show that, using the improved technique, it is possible to perform up to 15,000 genotypes from one buccal cell sample. Furthermore, the procedure is simple. We consider this improved technique to be a promising method for performing large-scale SNP genotyping, because FTA technology simplifies the collection, shipment, archiving and purification of DNA, while whole genome amplification of FTA card-bound DNA produces sufficient material for the determination of thousands of SNP genotypes.

  2. Large scale meta-analysis of fragment-based screening campaigns: privileged fragments and complementary technologies.

    Science.gov (United States)

    Kutchukian, Peter S; Wassermann, Anne Mai; Lindvall, Mika K; Wright, S Kirk; Ottl, Johannes; Jacob, Jaison; Scheufler, Clemens; Marzinzik, Andreas; Brooijmans, Natasja; Glick, Meir

    2015-06-01

    A first step in fragment-based drug discovery (FBDD) often entails a fragment-based screen (FBS) to identify fragment "hits." However, the integration of conflicting results from orthogonal screens remains a challenge. Here we present a meta-analysis of 35 fragment-based campaigns at Novartis, which employed a generic 1400-fragment library against diverse target families using various biophysical and biochemical techniques. By statistically interrogating the multidimensional FBS data, we sought to investigate three questions: (1) What makes a fragment amenable for FBS? (2) How do hits from different fragment screening technologies and target classes compare with each other? (3) What is the best way to pair FBS assay technologies? In doing so, we identified substructures that were privileged for specific target classes, as well as fragments that were privileged for authentic activity against many targets. We also revealed some of the discrepancies between technologies. Finally, we uncovered a simple rule of thumb in screening strategy: when choosing two technologies for a campaign, pairing a biochemical and biophysical screen tends to yield the greatest coverage of authentic hits.
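
    The pairing rule of thumb can be made concrete: for each pair of technologies, count how many authentic hits the union of their hit lists recovers. The data below are a toy illustration, not the Novartis screening data:

```python
from itertools import combinations

hits = {                              # hypothetical fragment IDs per technology
    "NMR": {1, 2, 3, 5, 8},           # biophysical
    "SPR": {2, 3, 5, 13},             # biophysical
    "enzymatic": {1, 4, 8, 13},       # biochemical
}
authentic = {1, 2, 3, 4, 5, 8, 13}    # fragments with confirmed activity

best = max(combinations(hits, 2),
           key=lambda pair: len((hits[pair[0]] | hits[pair[1]]) & authentic))
print(best)  # here a biophysical/biochemical pair gives the widest coverage
```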

  3. Developing Server-Side Infrastructure for Large-Scale E-Learning of Web Technology

    Science.gov (United States)

    Simpkins, Neil

    2010-01-01

    The growth of E-business has made experience in server-side technology an increasingly important area for educators. Server-side skills are in increasing demand and recognised to be of relatively greater value than comparable client-side aspects (Ehie, 2002). In response to this, many educational organisations have developed E-business courses,…

  4. Strategic Planning Tools for Large-Scale Technology-Based Assessments

    Science.gov (United States)

    Koomen, Marten; Zoanetti, Nathan

    2018-01-01

    Education systems are increasingly being called upon to implement new technology-based assessment systems that generate efficiencies, better meet changing stakeholder expectations, or fulfil new assessment purposes. These assessment systems require coordinated organisational effort to implement and can be expensive in time, skill and other…

  5. CO{sub 2} mitigation costs of large-scale bioenergy technologies in competitive electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Gustavsson, L [Mid-Sweden University, Ostersund (Sweden). Dept. of Natural and Environmental Sciences, Ecotechnology; Madlener, R [Swiss Federal Institute of Technology, Zurich (Switzerland). CEPE

    2003-11-01

    In this study, we compare and contrast the impact of recent technological developments in large biomass-fired and natural-gas-fired cogeneration and condensing plants in terms of CO{sub 2} mitigation costs and under the conditions of a competitive electricity market. The CO{sub 2} mitigation cost indicates the minimum economic incentive required (e.g. in the form of a carbon tax) to equal the cost of a less carbon-intensive system with the cost of a reference system. The results show that CO{sub 2} mitigation costs are lower for biomass systems than for natural gas systems with decarbonization. However, in liberalized energy markets, and given the sociopolitical will to implement low-carbon energy systems, market-based policy measures are still required to make biomass and decarbonization options competitive and thus help them penetrate the market. The mitigation cost of cogeneration plants, however, depends on the evaluation method used. If we account for the limitation of heat sinks by expanding the reference entity to include both heat and power, as is typically recommended in life-cycle analysis, then the biomass-based gasification combined cycle (BIG/CC) technology turns out to be less expensive and to exhibit lower CO{sub 2} mitigation costs than biomass-fired steam turbine plants. However, a heat credit granted to cogeneration systems that is based on the avoided cost of separate heat production puts the steam turbine technology, despite its lower system efficiency, at an advantage. In contrast, when a crediting method based on avoided electricity production in natural-gas-fired condensing plants is employed, the BIG/CC technology turns out to be more cost-competitive than the steam turbine technology for carbon tax levels beyond about $150/tC. Furthermore, steam turbine plants are able to compete with natural-gas-fired cogeneration plants at carbon tax levels higher than about $90/tC. (author)
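
    The mitigation-cost definition in the abstract reduces to simple arithmetic: the break-even carbon tax is the extra cost of the lower-carbon system divided by the carbon it avoids. The sketch below, with invented plant numbers, illustrates the calculation; it is not taken from the paper.

```python
# Minimal sketch of the mitigation-cost definition given above: the carbon
# tax at which a less carbon-intensive system breaks even with a reference
# system. All plant costs and emission factors below are invented.

def mitigation_cost(cost_alt, cost_ref, emis_ref, emis_alt):
    """CO2 mitigation cost in $/tC: extra cost per tonne of carbon avoided."""
    return (cost_alt - cost_ref) / (emis_ref - emis_alt)

cost_biomass, cost_gas = 55.0, 40.0   # electricity cost, $/MWh (invented)
emis_gas, emis_biomass = 0.10, 0.0    # carbon emissions, tC/MWh (invented)

tax = mitigation_cost(cost_biomass, cost_gas, emis_gas, emis_biomass)
print(f"break-even carbon tax: {tax:.0f} $/tC")  # 150 $/tC
```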

  6. Exploring Large Scale Data Analysis and Visualization for ARM Data Discovery Using NoSQL Technologies

    Science.gov (United States)

    Krishna, B.; Gustafson, W. I., Jr.; Vogelmann, A. M.; Toto, T.; Devarakonda, R.; Palanisamy, G.

    2016-12-01

    This paper presents a new way of providing ARM data discovery through data analysis and visualization services. ARM stands for Atmospheric Radiation Measurement; the program was created to study cloud formation processes and their influence on radiative transfer, and it also includes additional measurements of aerosol and precipitation at various highly instrumented ground and mobile stations. The total volume of ARM data is roughly 900 TB. The current search for ARM data is performed using its metadata, such as the site name, instrument name, date, etc. NoSQL technologies were explored to improve the capabilities of data searching, not only by metadata, but also by using the measurement values. Two technologies currently being implemented for testing are Apache Cassandra (a NoSQL database) and Apache Spark (a NoSQL-based analytics framework). Both of these technologies were developed to work in a distributed environment and hence can handle large data volumes for storage and analytics. D3.js is a JavaScript library that can generate interactive data visualizations in web browsers by making use of commonly used SVG, HTML5, and CSS standards. To test the performance of NoSQL for ARM data, we will be using ARM's popular measurements to locate the data based on their values. Recently, NoSQL technology has been applied to a pilot project called LASSO, which stands for LES ARM Symbiotic Simulation and Observation Workflow. LASSO will be packaging LES output and observations in "data bundles", and analyses will require the ability for users to analyze both observations and LES model output, either individually or together, across multiple time periods. The LASSO implementation strategy suggests that enormous data storage is required to store the above-mentioned quantities. Thus, NoSQL was used to provide a powerful means to store portions of the data, giving users search capabilities on each simulation's traits through a web application. Based on the user selection
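
    As a rough illustration of the value-based search idea, the following sketch reads a Cassandra table into Spark via the spark-cassandra-connector and filters rows by measurement value. The host, keyspace, table, and column names are invented and do not reflect ARM's actual schema.

```python
# Hypothetical sketch of value-based search over measurement records using
# Apache Spark with the spark-cassandra-connector. The host, keyspace,
# table, and column names are invented and are not ARM's actual schema.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("arm-value-search")
    .config("spark.cassandra.connection.host", "cassandra.example.org")
    .getOrCreate()
)

obs = (
    spark.read.format("org.apache.spark.sql.cassandra")
    .options(keyspace="arm", table="measurements")  # invented names
    .load()
)

# Search by measurement value rather than by metadata alone, e.g.
# irradiance readings above a threshold at a given site.
hits = (
    obs.filter((obs.site == "SGP") & (obs.value > 800.0))
       .select("instrument", "time", "value")
)
hits.show(10)
```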

  7. Characterizing Agricultural Impacts of Recent Large-Scale US Droughts and Changing Technology and Management

    Science.gov (United States)

    Elliott, Joshua; Glotter, Michael; Ruane, Alex C.; Boote, Kenneth J.; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia; Smith, Leonard A.; Foster, Ian

    2017-01-01

    Process-based agricultural models, applied in novel ways, can reproduce historical crop yield anomalies in the US, with median absolute deviation from observations of 6.7% at national-level and 11% at state-level. In seasons for which drought is the overriding factor, performance is further improved. Historical counterfactual scenarios for the 1988 and 2012 droughts show that changes in agricultural technologies and management have reduced system-level drought sensitivity in US maize production by about 25% in the intervening years. Finally, we estimate the economic costs of the two droughts in terms of insured and uninsured crop losses in each US county (for a total, adjusted for inflation, of $9 billion in 1988 and $21.6 billion in 2012). We compare these with cost estimates from the counterfactual scenarios and with crop indemnity data where available. Model-based measures are capable of accurately reproducing the direct agro-economic losses associated with extreme drought and can be used to characterize and compare events that occurred under very different conditions. This work suggests new approaches to modeling, monitoring, forecasting, and evaluating drought impacts on agriculture, as well as evaluating technological changes to inform adaptation strategies for future climate change and extreme events.

  8. Characterizing agricultural impacts of recent large-scale US droughts and changing technology and management

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, Joshua [Univ. of Chicago, IL (United States). Computation Inst.; Argonne National Lab. (ANL), Lemont, IL (United States); Glotter, Michael [Univ. of Chicago, IL (United States). Dept. of the Geophysical Sciences; Ruane, Alex C. [NASA Goddard Inst. for Space Studies (GISS), New York, NY (United States); Boote, Kenneth J. [Univ. of Florida, Gainesville, FL (United States). Agricultural and Biological Engineering Dept.; Hatfield, Jerry L. [US Dept. of Agriculture (USDA)., Ames, IA (United States). National Lab. for Agriculture and the Environment; Jones, James W. [Univ. of Florida, Gainesville, FL (United States). Agricultural and Biological Engineering Dept.; Rosenzweig, Cynthia [NASA Goddard Inst. for Space Studies (GISS), New York, NY (United States); Smith, Leonard A. [London School of Economics, London (United Kingdom). Center for Analysis of Time Series; Foster, Ian [Univ. of Chicago, IL (United States). Computation Inst.; Computation Inst.; Argonne National Lab. (ANL), Lemont, IL (United States)

    2018-01-01

    Process-based agricultural models, applied in novel ways, can reproduce historical crop yield anomalies in the US, with median absolute deviation from observations of 6.7% at national-level and 11% at state-level. In seasons for which drought is the overriding factor, performance is further improved. Historical counterfactual scenarios for the 1988 and 2012 droughts show that changes in agricultural technologies and management have reduced system-level drought sensitivity in US maize production by about 25% in the intervening years. Finally, we estimate the economic costs of the two droughts in terms of insured and uninsured crop losses in each US county (for a total, adjusted for inflation, of $9 billion in 1988 and $21.6 billion in 2012). We compare these with cost estimates from the counterfactual scenarios and with crop indemnity data where available. Model-based measures are capable of accurately reproducing the direct agro-economic losses associated with extreme drought and can be used to characterize and compare events that occurred under very different conditions. This work suggests new approaches to modeling, monitoring, forecasting, and evaluating drought impacts on agriculture, as well as evaluating technological changes to inform adaptation strategies for future climate change and extreme events.

  9. Large-scale nanofabrication of periodic nanostructures using nanosphere-related techniques for green technology applications (Conference Presentation)

    Science.gov (United States)

    Yen, Chen-Chung; Wu, Jyun-De; Chien, Yi-Hsin; Wang, Chang-Han; Liu, Chi-Ching; Ku, Chen-Ta; Chen, Yen-Jon; Chou, Meng-Cheng; Chang, Yun-Chorng

    2016-09-01

    Nanotechnology has been developed for decades and many interesting optical properties have been demonstrated. However, the major hurdle for the further development of nanotechnology is finding economical ways to fabricate such nanostructures on a large scale. Here, we demonstrate how to achieve low-cost fabrication using nanosphere-related techniques, such as Nanosphere Lithography (NSL) and Nanospherical-Lens Lithography (NLL). NSL is a low-cost nanofabrication technique that can fabricate nano-triangle arrays covering a very large area. NLL is a very similar technique that uses polystyrene nanospheres to focus the incoming ultraviolet light and expose the underlying photoresist (PR) layer. PR hole arrays form after developing. Metal nanodisk arrays can be fabricated by subsequent metal evaporation and lift-off processes. Nanodisk or nano-ellipse arrays with various sizes and aspect ratios are routinely fabricated in our research group. We also demonstrate that we can fabricate more complicated nanostructures, such as nanodisk oligomers, by combining several other key technologies, such as angled exposure and deposition, to obtain various metallic nanostructures. The metallic structures are of high fidelity and large scale. The metallic nanostructures can be transformed into semiconductor nanostructures and used in several green technology applications.

  10. Development of innovative technological base for large-scale nuclear power

    International Nuclear Information System (INIS)

    Adamov, E.O.; Dedul, A.V.; Orlov, V.V.; Rachkov, V.I.; Slesarev, I.S.

    2017-01-01

    The problems of further Nuclear Power (NP) development, as well as the ways of resolving them on the basis of innovative fast reactor concepts and the Closed Equilibrium Fuel Cycle (CEFC), are analyzed. A new paradigm of NP and the corresponding NP super-task are declared. The super-task can be considered a transition to nuclear power free of vital risks, through the guaranteed elimination or suppression of all such risks and threats (or their transformation into the category of ordinary risks and threats) on the basis of the "natural safety principle". The project of the Rosatom State Corporation named "PRORYV" has been launched within the Federal Target Program "Nuclear power technologies of new generation for 2010 to 2015 and in perspective till 2020", and was planned precisely for the achievement of these goals. The super-task is well within the capabilities of the PRORYV project, which has been focused from the outset on realizing "natural safety". The project is aimed, in particular, at construction of the demonstration lead-cooled reactor BREST-300-OD and an enterprise for closing the equilibrium fuel cycle.

  11. Development of innovative technological base for large-scale nuclear power

    Energy Technology Data Exchange (ETDEWEB)

    Adamov, E.O.; Dedul, A.V.; Orlov, V.V.; Rachkov, V.I.; Slesarev, I.S. [ITC ' ' PRORYV' ' Project, Moscow (Russian Federation)

    2017-04-15

    The problems of further Nuclear Power (NP) development, as well as the ways of resolving them on the basis of innovative fast reactor concepts and the Closed Equilibrium Fuel Cycle (CEFC), are analyzed. A new paradigm of NP and the corresponding NP super-task are declared. The super-task can be considered a transition to nuclear power free of vital risks, through the guaranteed elimination or suppression of all such risks and threats (or their transformation into the category of ordinary risks and threats) on the basis of the "natural safety principle". The project of the Rosatom State Corporation named "PRORYV" has been launched within the Federal Target Program "Nuclear power technologies of new generation for 2010 to 2015 and in perspective till 2020", and was planned precisely for the achievement of these goals. The super-task is well within the capabilities of the PRORYV project, which has been focused from the outset on realizing "natural safety". The project is aimed, in particular, at construction of the demonstration lead-cooled reactor BREST-300-OD and an enterprise for closing the equilibrium fuel cycle.

  12. A large-scale view of Space Technology 5 magnetometer response to solar wind drivers.

    Science.gov (United States)

    Knipp, D J; Kilcommons, L M; Gjerloev, J; Redmon, R J; Slavin, J; Le, G

    2015-04-01

    In this data report we discuss reprocessing of the Space Technology 5 (ST5) magnetometer database for inclusion in NASA's Coordinated Data Analysis Web (CDAWeb) virtual observatory. The mission consisted of three spacecraft flying in elliptical orbits, from 27 March to 27 June 2006. Reprocessing includes (1) transforming the data into the Modified Apex Coordinate System for projection to a common reference altitude of 110 km, (2) correcting gain jumps, and (3) validating the results. We display the averaged magnetic perturbations as a keogram, which allows direct comparison of the full-mission data with the solar wind values and geomagnetic indices. With the data referenced to a common altitude, we find the following: (1) Magnetic perturbations that track the passage of corotating interaction regions and high-speed solar wind; (2) unexpectedly strong dayside perturbations during a solstice magnetospheric sawtooth oscillation interval characterized by a radial interplanetary magnetic field (IMF) component that may have enhanced the accompanying modest southward IMF; and (3) intervals of reduced magnetic perturbations or "calms," associated with periods of slow solar wind, interspersed among variable-length episodic enhancements. These calms are most evident when the IMF is northward or projects with a northward component onto the geomagnetic dipole. The reprocessed ST5 data are in very good agreement with magnetic perturbations from the Defense Meteorological Satellite Program (DMSP) spacecraft, which we also map to 110 km. We briefly discuss the methods used to remap the ST5 data and the means of validating the results against DMSP. Our methods form the basis for future intermission comparisons of space-based magnetometer data.
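
    Projection to a common reference altitude in Modified Apex coordinates can be reproduced, in outline, with the openly available apexpy package. The sketch below is illustrative only: the position values are invented, and this is not the authors' processing pipeline.

```python
# Minimal sketch, using the openly available apexpy package, of projecting
# a measurement location into Modified Apex coordinates at a 110 km
# reference altitude. The position below is invented; this is illustrative
# only and is not the authors' processing pipeline.
from apexpy import Apex

apex = Apex(date=2006.4, refh=110)  # Modified Apex, 110 km reference height

# Invented spacecraft footprint: geodetic latitude, longitude, altitude (km)
glat, glon, alt = 65.0, -147.0, 300.0
mlat, mlon = apex.convert(glat, glon, 'geo', 'apex', height=alt)
print(f"Modified Apex coordinates: mlat={mlat:.2f}, mlon={mlon:.2f}")
```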

  13. Developing technology for large-scale production of forest chips. Wood Energy Technology Programme 1999-2003. Interim report

    International Nuclear Information System (INIS)

    Hakkila, P.

    2003-01-01

    Finland is enhancing its use of renewable sources in energy production. From the 1995 level, the use of renewable energy is to be increased by 50 % by 2010, and 100 % by 2025. Wood-based fuels will play a leading role in this development. The main source of wood-based fuels is processing residues from the forest industries. However, as all processing residues are already in use, an increase is possible only as far as the capacity and wood consumption of the forest industries grow. Energy policy affects the production and availability of processing residues only indirectly. Another large source of wood-based energy is forest fuels, consisting of traditional firewood and chips comminuted from low-quality biomass. It is estimated that the reserve of technically harvestable forest biomass is 10-16 Mm³ annually, when no specific cost limit is applied. This corresponds to 2-3 Mtoe or 6-9 % of the present consumption of primary energy in Finland. How much of this reserve it will actually be possible to harvest and utilize depends on the cost competitiveness of forest chips against alternative sources of energy. A goal of Finnish energy and climate strategies is to use 5 Mm³ of forest chips annually by 2010. The use of wood fuels is being promoted by means of taxation, investment aid and support for chip production from young forests. Furthermore, research and development is being supported in order to create techno-economic conditions for the competitive production of forest chips. In 1999, the National Technology Agency Tekes established the five-year Wood Energy Technology Programme to stimulate the development of efficient systems for the large-scale production of forest chips. Key targets are competitive costs, reliable supply and good quality chips. The two guiding principles of the programme are: (1) close cooperation between researchers and practitioners and (2) applying research and development to practical applications and commercialization. As of November

  14. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, aimed at ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, entrusted by the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents through joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  15. Survey of high-voltage pulse technology suitable for large-scale plasma source ion implantation processes

    International Nuclear Information System (INIS)

    Reass, W.A.

    1994-01-01

    Many new plasma process ideas are finding their way from the research lab to the manufacturing plant floor. These require high-voltage (HV) pulse power equipment, which must be optimized for the application, system efficiency, and reliability. Although no single HV pulse technology is suitable for all plasma processes, various classes of high-voltage pulsers may offer greater versatility and economy to the manufacturer. Technology developed for existing radar and particle accelerator modulator power systems can be utilized to develop a modern large-scale plasma source ion implantation (PSII) system. The HV pulse networks can be broadly divided into two classes of systems: those that generate the voltage directly, and those that use some type of pulse-forming network and step-up transformer. This article examines these HV pulse technologies and discusses their applicability to the specific PSII process. Typical systems reviewed include high-power solid state, hard-tube systems such as crossed-field "hollow beam" switch tubes and planar tetrodes, and "soft" tube systems with crossatrons and thyratrons. Results are tabulated and suggestions provided for a particular PSII process.

  16. Organizational and technological genesis as a tool for strategic planning of large-scale real estate development projects

    Directory of Open Access Journals (Sweden)

    Gusakova Elena

    2018-01-01

    Full Text Available Conceptual planning and implementation of large-scale real estate development projects is one of the most difficult tasks in the organization of construction. In Russian practice, extensive experience has been accumulated in the development, complex reorganization and redevelopment of large development areas. The methodological basis for solving such problems is organizational and technological genesis, which considers the development of a project over its full life cycle. An analysis of this experience points to the formation of new and effective approaches and methods within organizational and technological genesis. Among them, the most significant and universal approaches should be highlighted: the concept of real estate development, which explains the reasons and objective needs for project transformations during the life cycle, and which aims to increase the adaptive capabilities of design decisions and the project's suitability for the most likely future changes; the development project as joint action, which is based on a balance of the interests of the project participants; and master planning of the life cycle stages of the project and its subprojects, based on a rethinking of the theory and methods of construction organization, which allows construction sites and related subprojects to be rationally localized while shielding the rest of the development area from the negative effects of construction on comfortable living and work.

  17. Large-scale laboratory testing of bedload-monitoring technologies: overview of the StreamLab06 Experiments

    Science.gov (United States)

    Marr, Jeffrey D.G.; Gray, John R.; Davis, Broderick E.; Ellis, Chris; Johnson, Sara; Gray, John R.; Laronne, Jonathan B.; Marr, Jeffrey D.G.

    2010-01-01

    A 3-month-long, large-scale flume experiment involving research and testing of selected conventional and surrogate bedload-monitoring technologies was conducted in the Main Channel at the St. Anthony Falls Laboratory under the auspices of the National Center for Earth-surface Dynamics. These experiments, dubbed StreamLab06, involved 25 researchers and volunteers from academia, government, and the private sector. The research channel was equipped with a sediment-recirculation system and a sediment-flux monitoring system that allowed continuous measurement of sediment flux in the flume and provided a data set by which samplers were evaluated. Selected bedload-measurement technologies were tested under a range of flow and sediment-transport conditions. The experiment was conducted in two phases. The bed material in phase I was well-sorted siliceous sand (0.6-1.8 mm median diameter); a gravel mixture (1-32 mm median diameter) composed the bed material in phase II. Four conventional bedload samplers – a standard Helley-Smith, Elwha, BLH-84, and Toutle River II (TR-2) sampler – were manually deployed in both experiment phases. Bedload traps were deployed in phase II. Two surrogate bedload samplers – stationary-mounted down-looking 600 kHz and 1200 kHz acoustic Doppler current profilers – were also deployed in phase II. This paper presents an overview of the experiment, including the specific data-collection technologies used and the ambient hydraulic, sediment-transport and environmental conditions measured as part of the experiment. All data collected as part of the StreamLab06 experiments are, or will be, available to the research community.

  18. LARGE-SCALE MERCURY CONTROL TECHNOLOGY TESTING FOR LIGNITE-FIRED UTILITIES - OXIDATION SYSTEMS FOR WET FGD

    Energy Technology Data Exchange (ETDEWEB)

    Michael J. Holmes; Steven A. Benson; Jeffrey S. Thompson

    2004-03-01

    The Energy & Environmental Research Center (EERC) is conducting a consortium-based effort directed toward resolving the mercury (Hg) control issues facing the lignite industry. Specifically, the EERC team--the EERC, EPRI, URS, ADA-ES, Babcock & Wilcox, the North Dakota Industrial Commission, SaskPower, and the Mercury Task Force, which includes Basin Electric Power Cooperative, Otter Tail Power Company, Great River Energy, Texas Utilities (TXU), Montana-Dakota Utilities Co., Minnkota Power Cooperative, BNI Coal Ltd., Dakota Westmoreland Corporation, and the North American Coal Company--has undertaken a project to significantly and cost-effectively oxidize elemental mercury in lignite combustion gases, followed by capture in a wet scrubber. This approach will be applicable to virtually every lignite utility in the United States and Canada and potentially impact subbituminous utilities. The oxidation process is proven at the pilot-scale and in short-term full-scale tests. Additional optimization is continuing on oxidation technologies, and this project focuses on longer-term full-scale testing. The lignite industry has been proactive in advancing the understanding of and identifying control options for Hg in lignite combustion flue gases. Approximately 1 year ago, the EERC and EPRI began a series of Hg-related discussions with the Mercury Task Force as well as utilities firing Texas and Saskatchewan lignites. This project is one of three being undertaken by the consortium to perform large-scale Hg control technology testing to address the specific needs and challenges to be met in controlling Hg from lignite-fired power plants. This project involves Hg oxidation upstream of a system equipped with an electrostatic precipitator (ESP) followed by wet flue gas desulfurization (FGD). The team involved in conducting the technical aspects of the project includes the EERC, Babcock & Wilcox, URS, and ADA-ES. The host sites include Minnkota Power Cooperative Milton R. Young

  19. Social Network Analysis and Mining to Monitor and Identify Problems with Large-Scale Information and Communication Technology Interventions.

    Science.gov (United States)

    da Silva, Aleksandra do Socorro; de Brito, Silvana Rossy; Vijaykumar, Nandamudi Lankalapalli; da Rocha, Cláudio Alex Jorge; Monteiro, Maurílio de Abreu; Costa, João Crisóstomo Weyl Albuquerque; Francês, Carlos Renato Lisboa

    2016-01-01

    The published literature reveals several arguments concerning the strategic importance of information and communication technology (ICT) interventions for developing countries where the digital divide is a challenge. Large-scale ICT interventions can be an option for countries whose regions, both urban and rural, present a high number of digitally excluded people. Our goal was to monitor and identify problems in interventions aimed at certification for a large number of participants in different geographical regions. Our case study is the training at the Telecentros.BR, a program created in Brazil to install telecenters and certify individuals to use ICT resources. We propose an approach that applies social network analysis and mining techniques to data collected from Telecentros.BR dataset and from the socioeconomics and telecommunications infrastructure indicators of the participants' municipalities. We found that (i) the analysis of interactions in different time periods reflects the objectives of each phase of training, highlighting the increased density in the phase in which participants develop and disseminate their projects; (ii) analysis according to the roles of participants (i.e., tutors or community members) reveals that the interactions were influenced by the center (or region) to which the participant belongs (that is, a community contained mainly members of the same region and always with the presence of tutors, contradicting expectations of the training project, which aimed for intense collaboration of the participants, regardless of the geographic region); (iii) the social network of participants influences the success of the training: that is, given evidence that the degree of the community member is in the highest range, the probability of this individual concluding the training is 0.689; (iv) the North region presented the lowest probability of participant certification, whereas the Northeast, which served municipalities with similar
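
    The degree-based finding in point (iii) can be illustrated with a few lines of networkx. The interaction edges below are invented; the sketch merely shows the kind of degree computation on which such a result rests, not the study's actual analysis.

```python
# Illustrative sketch (not the study's code) of the degree computation that
# finding (iii) rests on: participant degree in an interaction network.
# The edges below are invented.
import networkx as nx

edges = [("tutor1", "m1"), ("tutor1", "m2"), ("m1", "m2"),
         ("tutor2", "m3"), ("m2", "m3"), ("tutor1", "m3")]
g = nx.Graph(edges)

degrees = dict(g.degree())
top = max(degrees.values())
# Participants whose degree is in the highest range -- the group the study
# found most likely to conclude the training.
print([n for n, d in degrees.items() if d == top])
```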

  20. A Web-based Multi-user Interactive Visualization System For Large-Scale Computing Using Google Web Toolkit Technology

    Science.gov (United States)

    Weiss, R. M.; McLane, J. C.; Yuen, D. A.; Wang, S.

    2009-12-01

    We have created a web-based, interactive system for multi-user collaborative visualization of large data sets (on the order of terabytes) that allows users in geographically disparate locations to simultaneously and collectively visualize large data sets over the Internet. By leveraging asynchronous JavaScript and XML (AJAX) web development paradigms via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide remote, web-based users a web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota, which provides high-resolution visualizations on the order of 15 million pixels. In the current version of our software, we have implemented a new, highly extensible back-end framework built around HTTP "server push" technology to provide a rich collaborative environment and a smooth end-user experience. Furthermore, the web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. New features in the current version include the ability for (1) users to launch multiple visualizations, (2) a user to invite one or more other users to view their visualization in real time (multiple observers), (3) users to delegate control aspects of the visualization to others (multiple controllers), and (4) users to engage in collaborative chat and instant messaging within the user interface of the web application. We will explain choices made regarding implementation, overall system architecture and method of operation, and the benefits of an extensible, modular design. We will also discuss future goals, features, and our plans for increasing the scalability of the system, including a discussion of the benefits potentially afforded by a migration of server-side components to the Google App Engine (http://code.google.com/appengine/).
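
    The "server push" mechanism referred to above can be sketched, in outline, as HTTP long polling: the server holds each poll request open until an update is available. The minimal Python example below illustrates the pattern only; the actual system is built on the Google Web Toolkit, and the endpoint name is invented.

```python
# Minimal sketch of the HTTP "server push" (long-polling) pattern the
# framework above is built around, written in Python for brevity; the
# actual system uses the Google Web Toolkit, and the endpoint is invented.
import json
import queue
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

updates = queue.Queue()  # visualization state changes enqueued by controllers

class PollHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/poll":            # invented endpoint name
            self.send_response(404)
            self.end_headers()
            return
        try:
            # Hold the request open until an update arrives (long poll);
            # the client re-issues the poll as soon as it receives a reply.
            update = updates.get(timeout=30)
            body = json.dumps(update).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        except queue.Empty:
            self.send_response(204)         # nothing yet; client re-polls
            self.end_headers()

if __name__ == "__main__":
    updates.put({"view": "rotate", "angle": 15})  # demo update
    ThreadingHTTPServer(("localhost", 8080), PollHandler).serve_forever()
```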

  1. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as a control supporting tool for daily operation and performance prediction of central solar heating plants. Finally the CSHP technology is put into perspective with respect to alternatives and a short discussion on the barriers and breakthrough of the technology is given....

  2. Survey on the technological development issues for large-scale methanol engine power generation plant; Ogata methanol engine hatsuden plant ni kansuru gijutsu kaihatsu kadai chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-03-01

    Based on the results of the 'Survey on the feasibility of a large-scale methanol engine power generation plant' in fiscal 1992, concrete technological development issues were studied for its practical use, and a technological R & D scheme was prepared for a large-scale methanol engine power plant featuring low NOx and high efficiency. The technological development issues for this plant were as follows: improvement of thermal efficiency, reduction of NOx emissions, improvement of the reliability and durability of the ignition and fuel injection systems, and reduction of vibration. As for the economic effect of the technological development, the profitability of NOx control measures was compared between this methanol engine and conventional heavy oil diesel engines or gas engines; as a result, this engine was more economical than the conventional engines. It was suggested that development of the equipment will be completed in roughly 4 years through component studies, single-cylinder model experiments and real engine tests. 21 refs., 43 figs., 19 tabs.

  3. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
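
    As a contrast to the multi-hypothesis approach, the following sketch shows a simple global-nearest-neighbor data association step using scipy's optimal assignment routine, with a gate to reject implausible pairings. The positions and gate value are invented; this is not code from the report.

```python
# Sketch of a simple global-nearest-neighbor association step -- one cheap
# alternative to multi-hypothesis tracking that sidesteps its combinatorial
# explosion. Positions and the gate value are invented, not report data.
import numpy as np
from scipy.optimize import linear_sum_assignment

tracks = np.array([[10.0, 10.0], [20.0, 5.0]])      # predicted positions
detections = np.array([[10.5, 9.5], [19.0, 6.0], [40.0, 40.0]])

# Cost matrix: Euclidean distance from every track to every detection
cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)

row, col = linear_sum_assignment(cost)              # optimal 1-to-1 pairing
GATE = 5.0                                          # reject distant pairs
for t, d in zip(row, col):
    if cost[t, d] < GATE:
        print(f"track {t} <- detection {d} (distance {cost[t, d]:.2f})")
```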

  4. Passive technologies for future large-scale photonic integrated circuits on silicon: polarization handling, light non-reciprocity and loss reduction

    Directory of Open Access Journals (Sweden)

    Daoxin Dai

    2012-03-01

    Full Text Available Silicon-based large-scale photonic integrated circuits are becoming important, due to the need for higher complexity and lower cost for optical transmitters, receivers and optical buffers. In this paper, passive technologies for large-scale photonic integrated circuits are described, including polarization handling, light non-reciprocity and loss reduction. The design rule for polarization beam splitters based on asymmetrical directional couplers is summarized and several novel designs for ultra-short polarization beam splitters are reviewed. A novel concept for realizing a polarization splitter–rotator is presented with a very simple fabrication process. Realization of silicon-based light non-reciprocity devices (e.g., optical isolators), which are very important for transmitters to avoid sensitivity to reflections, is also demonstrated with the help of magneto-optical material incorporated by bonding technology. Low-loss waveguides are another important technology for large-scale photonic integrated circuits. Ultra-low loss optical waveguides are achieved by designing a Si3N4 core with a very high aspect ratio. The loss is reduced further to <0.1 dB m⁻¹ with an improved fabrication process incorporating a high-quality thermal oxide upper cladding by means of wafer bonding. With the developed ultra-low loss Si3N4 optical waveguides, some devices are also demonstrated, including ultra-high-Q ring resonators, low-loss arrayed-waveguide grating (demultiplexers), and high-extinction-ratio polarizers.

  5. Development of Synthesis and Large Scale Production Technology for Ultrahigh Energy and Density Fluoro-Organic Compounds

    National Research Council Canada - National Science Library

    Yang, Jing; Knight, Travis W; Dolfier, Jr., William R; Segal, Corin

    2005-01-01

    .... The project combined the scientific research base of the University of Florida Department of Chemistry and Department of Mechanical and Aerospace Engineering with the analytical skills and technology...

  6. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abroad...

  7. Traditional methods v. new technologies – dilemmas for dietary assessment in large-scale nutrition surveys and studies

    DEFF Research Database (Denmark)

    Amoutzopoulos, B.; Steer, T.; Roberts, C.

    2018-01-01

    The aim of the present paper is to summarise current and future applications of dietary assessment technologies in nutrition surveys in developed countries. It includes the discussion of key points and highlights of subsequent developments from a panel discussion to address strengths and weaknesses of traditional dietary assessment methods (food records, FFQ, 24 h recalls, diet history with interviewer-assisted data collection) v. new technology-based dietary assessment methods (web-based and mobile device applications). The panel discussion ‘Traditional methods v. new technologies: dilemmas for dietary assessment in population surveys’ was held at the 9th International Conference on Diet and Activity Methods (ICDAM9), Brisbane, September 2015. Despite respondent and researcher burden, traditional methods have been most commonly used in nutrition surveys. However, dietary assessment technologies offer......

  8. Integration of Technology, Curriculum, and Professional Development for Advancing Middle School Mathematics: Three Large-Scale Studies

    Science.gov (United States)

    Roschelle, Jeremy; Shechtman, Nicole; Tatar, Deborah; Hegedus, Stephen; Hopkins, Bill; Empson, Susan; Knudsen, Jennifer; Gallagher, Lawrence P.

    2010-01-01

    The authors present three studies (two randomized controlled experiments and one embedded quasi-experiment) designed to evaluate the impact of replacement units targeting student learning of advanced middle school mathematics. The studies evaluated the SimCalc approach, which integrates an interactive representational technology, paper curriculum,…

  9. Large-Scale Mercury Control Technology Testing for Lignite-Fired Utilities - Oxidation Systems for Wet FGD

    Energy Technology Data Exchange (ETDEWEB)

    Steven A. Benson; Michael J. Holmes; Donald P. McCollor; Jill M. Mackenzie; Charlene R. Crocker; Lingbu Kong; Kevin C. Galbreath

    2007-03-31

    Mercury (Hg) control technologies were evaluated at Minnkota Power Cooperative's Milton R. Young (MRY) Station Unit 2, a 450-MW lignite-fired cyclone unit near Center, North Dakota, and TXU Energy's Monticello Steam Electric Station (MoSES) Unit 3, a 793-MW lignite--Powder River Basin (PRB) subbituminous coal-fired unit near Mt. Pleasant, Texas. A cold-side electrostatic precipitator (ESP) and wet flue gas desulfurization (FGD) scrubber are used at MRY and MoSES for controlling particulate and sulfur dioxide (SO{sub 2}) emissions, respectively. Several approaches for significantly and cost-effectively oxidizing elemental mercury (Hg{sup 0}) in lignite combustion flue gases, followed by capture in an ESP and/or FGD scrubber were evaluated. The project team involved in performing the technical aspects of the project included Babcock & Wilcox, the Energy & Environmental Research Center (EERC), the Electric Power Research Institute, and URS Corporation. Calcium bromide (CaBr{sub 2}), calcium chloride (CaCl{sub 2}), magnesium chloride (MgCl{sub 2}), and a proprietary sorbent enhancement additive (SEA), hereafter referred to as SEA2, were added to the lignite feeds to enhance Hg capture in the ESP and/or wet FGD. In addition, powdered activated carbon (PAC) was injected upstream of the ESP at MRY Unit 2. The work involved establishing Hg concentrations and removal rates across existing ESP and FGD units, determining costs associated with a given Hg removal efficiency, quantifying the balance-of-plant impacts of the control technologies, and facilitating technology commercialization. The primary project goal was to achieve ESP-FGD Hg removal efficiencies of {ge}55% at MRY and MoSES for about a month.

  10. Small-scale and large-scale testing of photo-electrochemically activated leaching technology in Aprelkovo and Delmachik Mines

    Science.gov (United States)

    Sekisov, AG; Lavrov, AYu; Rubtsov, YuI

    2017-02-01

    The paper gives a description of tests and trials of the technology of heap gold leaching from rebellious ore in Aprelkovo and Delmachik Mines. Efficiency of leaching flowsheets with the stage-wise use of activated solutions of different reagents, including active forms of oxygen, is evaluated. Carbonate-peroxide solutions are used at the first stage of leaching to oxidize sulfide and sulfide-arsenide ore minerals to recover iron and copper from them. The second stage leaching uses active cyanide solutions to leach encapsulated and disperse gold and silver.

  11. Preinoculation of Soybean Seeds Treated with Agrichemicals up to 30 Days before Sowing: Technological Innovation for Large-Scale Agriculture.

    Science.gov (United States)

    Araujo, Ricardo Silva; da Cruz, Sonia Purin; Souchie, Edson Luiz; Martin, Thomas Newton; Nakatani, André Shigueyoshi; Nogueira, Marco Antonio; Hungria, Mariangela

    2017-01-01

    The cultivation of soybean in Brazil has grown expressively in recent decades. Soybean has a high demand for nitrogen (N), which must come from fertilizers or from biological fixation. The N supply to the soybean crop in Brazil relies on inoculation with elite strains of Bradyrhizobium japonicum, B. elkanii, and B. diazoefficiens, which are able to fulfill the crop's N requirements and enrich the soil for the following crop. The effectiveness of the association between N2-fixing bacteria and soybean plants depends on the efficacy of the inoculation process. Seed treatment with pesticides, especially fungicides or micronutrients, may rapidly kill the inoculated bacteria, affecting the establishment and outcome of the symbiosis. The development of technologies that allow inoculation to become a successful component of industrial seed treatment represents a valuable tool for the seed industry, as well as for the soybean crop worldwide. In this article, we report the results of new technologies, developed by the company Total Biotecnologia Indústria e Comércio S/A of Brazil, for preinoculation of soybean seeds with bradyrhizobia in the presence of agrichemicals. Our results demonstrate improved bacterial survival for up to 30 days after inoculation, without compromising nodulation, N2 fixation, and yield in the field.

  12. Possibility of applying large-scale point cloud/mixed reality technology in decommissioning of nuclear facilities

    International Nuclear Information System (INIS)

    Shoji, Kimiaki

    2017-01-01

    After the accident at Tokyo Electric Power Company's Fukushima No. 1 nuclear power plant, decommissioning projects for nuclear power plants that have exceeded 40 years of operation began to move into full swing, and four nuclear power plants are already under decommissioning. Several decommissioning engineering systems (ES) have been developed for these decommissioning projects; various problems were clarified and many findings were obtained through these efforts. Meanwhile, advanced information technologies and products such as three-dimensional CAD, CG, 3D laser measurement, computer-aided engineering (CAE) and mixed reality (MR) are progressing rapidly. By combining these technologies and products, it has become possible not only to enhance the usefulness of existing 3D CAD data but also to enable high-level digital studies that combine reality and virtual models. Furthermore, they can be applied to a wide range of fields, such as demolition simulation for the dismantling of nuclear facilities, which is expected to increase in the future, as well as human resource development and skill transfer. In this paper, focusing on a video see-through method capable of displaying a virtual object at the correct position in a real image, accurately reflecting the positional relationship between the real image and the virtual object, we introduce items that should contribute to the feasibility and usefulness of applying this technology to the decommissioning of nuclear facilities. (author)

  13. Interfacing Detectors and Collecting Data for Large-Scale Experiments in High Energy Physics Using COTS Technology

    CERN Document Server

    Schumacher, Jorn; Wandelli, Wainer

    Data-acquisition systems for high-energy physics experiments like the ATLAS experiment at the European particle-physics research institute CERN are used to record experimental physics data and are essential for the effective operation of an experiment. Located in underground facilities with limited space, power, cooling, and exposed to ionizing radiation and strong magnetic fields, data-acquisition systems have unique requirements and are challenging to design and build. Traditionally, these systems have been composed of custom-designed electronic components to be able to cope with the large data volumes that high-energy physics experiments generate and at the same time meet technological and environmental requirements. Custom-designed electronics is costly to develop, effortful to maintain and typically not very flexible. This thesis explores an alternative architecture for data-acquisition systems based on commercial off-the-shelf (COTS) components. A COTS-based data distribution device called FELIX that w...

  14. Interactive Visualization of Large-Scale Hydrological Data using Emerging Technologies in Web Systems and Parallel Programming

    Science.gov (United States)

    Demir, I.; Krajewski, W. F.

    2013-12-01

    As geoscientists are confronted with increasingly massive datasets, from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate the understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize and share large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data and modify parameters to create custom views of the data, to gain insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component in building comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization, and demonstrates information visualization and communication tools developed in the light of these challenges.

  15. Interfacing detectors and collecting data for large-scale experiments in high energy physics using COTS technology

    International Nuclear Information System (INIS)

    Schumacher, Joern

    2017-01-01

    Data-acquisition systems for high-energy physics experiments like the ATLAS experiment at the European particle-physics research institute CERN are used to record experimental physics data and are essential for the effective operation of an experiment. Located in underground facilities with limited space, power, and cooling, and exposed to ionizing radiation and strong magnetic fields, data-acquisition systems have unique requirements and are challenging to design and build. Traditionally, these systems have been composed of custom-designed electronic components to be able to cope with the large data volumes that high-energy physics experiments generate and at the same time meet technological and environmental requirements. Custom-designed electronics is costly to develop, effortful to maintain and typically not very flexible. This thesis explores an alternative architecture for data-acquisition systems based on commercial off-the-shelf (COTS) components. A COTS-based data distribution device called FELIX that will be integrated in ATLAS is presented. The hardware and software implementation of this device is discussed, with a specific focus on performance, heterogeneity of systems and traffic patterns. The COTS-based readout approach is evaluated in the context of the future requirements of the ATLAS experiment. The main contributions of the thesis are an analysis of the ATLAS data-acquisition system with a focus on the readout system, a software architecture for the main application on FELIX hosts, a performance analysis and tuning based on computer science methods for central FELIX software components with respect to the requirements of the ATLAS experiment, a network communication library with a high-level software interface to utilize high-performance computing network technology for the purpose of data-acquisition systems, and an evaluation and discussion of ATLAS data-acquisition using FELIX systems as a case study for COTS-based data-acquisition in high

  16. Transcriptome sequencing of lentil based on second-generation technology permits large-scale unigene assembly and SSR marker discovery

    Directory of Open Access Journals (Sweden)

    Materne Michael

    2011-05-01

    Full Text Available Abstract Background: Lentil (Lens culinaris Medik.) is a cool-season grain legume which provides a rich source of protein for human consumption. In terms of genomic resources, lentil is relatively underdeveloped in comparison to other Fabaceae species, with limited available data. There is hence a significant need to enhance such resources in order to identify novel genes and alleles for molecular breeding to increase crop productivity and quality. Results: Tissue-specific cDNA samples from six distinct lentil genotypes were sequenced using Roche 454 GS-FLX Titanium technology, generating c. 1.38 × 10⁶ expressed sequence tags (ESTs). De novo assembly generated a total of 15,354 contigs and 68,715 singletons. The complete unigene set was sequence-analysed against genome drafts of the model legume species Medicago truncatula and Arabidopsis thaliana to identify 12,639 and 7,476 unique matches, respectively. When compared to the genome of Glycine max, a total of 20,419 unique hits were observed, corresponding to c. 31% of the known gene space. A total of 25,592 lentil unigenes were subsequently annotated from GenBank. Simple sequence repeat (SSR)-containing ESTs were identified from consensus sequences and a total of 2,393 primer pairs were designed. A subset of 192 EST-SSR markers was screened for validation across a panel of 12 cultivated lentil genotypes and one wild relative species. A total of 166 primer pairs obtained successful amplification, of which 47.5% detected genetic polymorphism. Conclusions: A substantial collection of ESTs has been developed from sequence analysis of lentil genotypes using second-generation technology, permitting unigene definition across a broad range of functional categories. As well as providing resources for functional genomics studies, the unigene set has permitted significant enhancement of the number of publicly-available molecular genetic markers as tools for improvement of this species.
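
    SSR discovery of the kind described can be outlined as a short regular-expression scan over assembled sequences. The sketch below is a generic illustration, not the pipeline used in the study, and the example sequence is invented.

```python
# Generic illustration of SSR discovery by regular-expression scan over an
# assembled sequence; this is not the tool used in the study, and the
# example sequence is invented.
import re

# A 2-6 base motif followed by at least four further copies (>= 5 repeats)
SSR_RE = re.compile(r"(([ACGT]{2,6}?)\2{4,})")

def find_ssrs(seq):
    """Yield (motif, copy_count, start) for each SSR found in seq."""
    for m in SSR_RE.finditer(seq.upper()):
        yield m.group(2), len(m.group(1)) // len(m.group(2)), m.start()

unigene = "GCTTAGAGAGAGAGAGCCGTA"  # invented fragment containing (AG)6
for motif, copies, pos in find_ssrs(unigene):
    print(f"motif={motif} x{copies} at position {pos}")  # motif=AG x6 at 4
```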

  17. Interfacing detectors and collecting data for large-scale experiments in high energy physics using COTS technology

    Energy Technology Data Exchange (ETDEWEB)

    Schumacher, Joern

    2017-07-01

    Data-acquisition systems for high-energy physics experiments like the ATLAS experiment at the European particle-physics research institute CERN are used to record experimental physics data and are essential for the effective operation of an experiment. Located in underground facilities with limited space, power, cooling, and exposed to ionizing radiation and strong magnetic fields, data-acquisition systems have unique requirements and are challenging to design and build. Traditionally, these systems have been composed of custom-designed electronic components to be able to cope with the large data volumes that high-energy physics experiments generate and at the same time meet technological and environmental requirements. Custom-designed electronics is costly to develop,effortful to maintain and typically not very flexible. This thesis explores an alternative architecture for data-acquisition systems based on commercial off-the-shelf (COTS) components. A COTS-based data distribution device called FELIX that will be integrated in ATLAS is presented. The hardware and software implementation of this device is discussed, with a specific focus on performance, heterogenity of systems and traffic patterns. The COTS-based readout approach is evaluated in the context of the future requirements of the ATLAS experiment. The main contributions of the thesis are an analysis of the ATLAS data-acquisition system with a focus on the readout system, a software architecture for the main application on FELIX hosts, a performance analysis and tuning based on computer science methods for central FELIX software components with respect to the requirements of the ATLAS experiment, a network communication library with a high-level software interface to utilize high-performance computing network technology for the purpose of data-acquisition systems, and an evaluation and discussion of ATLAS data-acquisition using FELIX systems as a case study for COTS-based data-acquisition in high

  18. Large Scale System Defense

    Science.gov (United States)

    2008-10-01

    Performing organization: Columbia University, 1700 Broadway, New York NY 10019-5905. Sponsoring/monitoring agency: AFRL/RIGA, 525 Brooks Rd., Rome NY 13441-4505. ...pealing because of the need to modify source code. Since source-level annotations serve as a vestigial policy, we articulated a way to augment self...

  19. MacroBac: New Technologies for Robust and Efficient Large-Scale Production of Recombinant Multiprotein Complexes.

    Science.gov (United States)

    Gradia, Scott D; Ishida, Justin P; Tsai, Miaw-Sheue; Jeans, Chris; Tainer, John A; Fuss, Jill O

    2017-01-01

    Recombinant expression of large, multiprotein complexes is essential and often rate-limiting for determining the structural, biophysical, and biochemical properties of DNA repair, replication, transcription, and other key cellular processes. Baculovirus-infected insect cell expression systems are especially well suited for producing large human proteins recombinantly, and multigene baculovirus systems have facilitated studies of multiprotein complexes. In this chapter, we describe a multigene baculovirus system called MacroBac that uses a BioBricks-type assembly method based on restriction and ligation (Series 11) or ligation-independent cloning (Series 438). MacroBac cloning and assembly is efficient and equally well suited for either single subcloning reactions or high-throughput cloning using 96-well plates and liquid-handling robotics. MacroBac vectors are polypromoter, with each gene flanked by a strong polyhedrin promoter and an SV40 poly(A) termination signal that minimize the gene-order expression effects seen in many polycistronic assemblies. Large assemblies are robustly achievable, and we have successfully assembled as many as 10 genes into a single MacroBac vector. Importantly, we have observed significant increases in expression levels and quality of large, multiprotein complexes using a single multigene, polypromoter virus rather than coinfection with multiple single-gene viruses. Given the importance of characterizing functional complexes, we believe that MacroBac provides a critical enabling technology that may change the way that structural, biophysical, and biochemical research is done. © 2017 Elsevier Inc. All rights reserved.

  20. Survey and research for the enhancement of large-scale technology development 2. How large-scale technology development should be in the future; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 2. Kongo no ogata gijutsu kaihatsu no arikata

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    A survey is conducted on this subject through interviews with people engaged in industrial technology development at the entrusted businesses participating in the large-scale industrial technology development system, and with experienced practitioners and academics involved in the project enhancement effort. Several needs for improvement are identified: introducing the competition principle, for example through parallel development; practicing research-on-research for effective task definition; strengthening midway evaluation, since prior evaluation is difficult; organizing new industries that utilize the fruits of large-scale industrial technology to create markets without inducing economic conflicts; and enhancing the transfer of technologies from the private sector to the public sector. Studies are also made of the review of research-conducting systems, the utilization of private-sector research and development capabilities, education about industrial property rights, and the diffusion of large-scale project systems. In this connection, problems are pointed out, requests are submitted, and remedial measures and suggestions are presented. (NEDO)

  1. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to non-ductile fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  2. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  3. Survey and research on how large-scale technological development should be in the future; Kongo no ogata gijutsu kaihatsu no hoko ni tsuite no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1982-03-01

    Tasks to be subjected to research and development under the large-scale industrial technology research and development system are discussed. Mentioned in the fields of resources and foods are a submarine metal sulfide mining system, a submarine oil development system for ice-covered sea areas, an all-weather automatic production system for useful vegetables, etc. Mentioned in the fields of social development, security, and disaster prevention are a construction work robot, shelter system technologies, disaster control technologies for mega-scale disasters, etc. Mentioned in the fields of health, welfare, and education are biomimetics, biosystems, cancer diagnosis and treatment systems, etc. Mentioned in the field of commodity distribution, service, and software are a computer security system, an unmanned collection and distribution system, etc. Mentioned in the field of process conversion are aluminum refining, synzyme technologies for precise synthesis, etc. Mentioned in the field of data processing are optical computers, bioelectronics, etc. Various tasks are pointed out also in the fields of aviation, space, ocean, and machining. (NEDO)

  4. How do staff members at science and technology centres consider the impact of sponsors on the scientific content of exhibitions?

    DEFF Research Database (Denmark)

    Davidsson, Eva; Sørensen, Helene

    2009-01-01

    In what ways may sponsors impact exhibition content and design at science and technology centres? This study seeks to explore how staff members consider the impact of sponsors and donors on exhibit content and design. The data collection involves a survey, interviews and a focus group interview with staff members who work with planning and constructing new exhibitions at their science and technology centre. The results suggest that sponsors may interfere in exhibition construction both directly and indirectly. This means that sponsors could put explicit demands when it comes to the choice of scientific content and design and thereby interfere directly. Indirect impact, on the other hand, refers to implicit demands of sponsors, where staff members take into account what they believe are the views of the sponsors through self-censorship.

  5. Large-scale field application of RNAi technology reducing Israeli acute paralysis virus disease in honey bees (Apis mellifera, Hymenoptera: Apidae).

    Directory of Open Access Journals (Sweden)

    Wayne Hunter

    Full Text Available The importance of honey bees to the world economy far surpasses their contribution in terms of honey production; they are responsible for up to 30% of the world's food production through pollination of crops. Since fall 2006, honey bees in the U.S. have faced a serious population decline, due in part to a phenomenon called Colony Collapse Disorder (CCD), a disease syndrome that is likely caused by several factors. Data from an initial study in which investigators compared pathogens in honey bees affected by CCD suggested a putative role for Israeli Acute Paralysis Virus (IAPV), a single-stranded RNA virus with no DNA stage, placed taxonomically within the family Dicistroviridae. Although subsequent studies have failed to find IAPV in all CCD-diagnosed colonies, IAPV has been shown to cause honey bee mortality. RNA interference technology (RNAi) has been used successfully to silence endogenous insect (including honey bee) genes both by injection and feeding. Moreover, RNAi was shown to prevent bees from succumbing to infection from IAPV under laboratory conditions. In the current study, IAPV-specific homologous dsRNA was used in the field, under natural beekeeping conditions, in order to prevent mortality and improve the overall health of bees infected with IAPV. This controlled study included a total of 160 honey bee hives in two discrete climates, seasons and geographical locations (Florida and Pennsylvania). To our knowledge, this is the first successful large-scale real-world use of RNAi for disease control.

  6. Based on Real Time Remote Health Monitoring Systems: A New Approach for Prioritization "Large Scales Data" Patients with Chronic Heart Diseases Using Body Sensors and Communication Technology.

    Science.gov (United States)

    Kalid, Naser; Zaidan, A A; Zaidan, B B; Salman, Omar H; Hashim, M; Albahri, O S; Albahri, A S

    2018-03-02

    This paper presents a new approach to prioritize "Large-scale Data" of patients with chronic heart diseases by using body sensors and communication technology during disasters and peak seasons. An evaluation matrix is used for emergency evaluation and large-scale data scoring of patients with chronic heart diseases in a telemedicine environment. However, one major problem in the emergency evaluation of these patients is establishing a reasonable threshold for patients with the most and least critical conditions. This threshold can be used to detect the highest and lowest priority levels when all the scores of patients are identical during disasters and peak seasons. A practical study was performed on 500 patients with chronic heart diseases and different symptoms, and their emergency levels were evaluated based on four main measurements: electrocardiogram, oxygen saturation sensor, blood pressure monitoring, and a non-sensory measurement tool, namely a text frame. Data alignment was conducted for the raw data and the decision-making matrix by converting each extracted feature into an integer. This integer represents the patient's state in the triage level, based on medical guidelines, so that features from different sources can be combined in one platform. The patients were then scored based on a decision matrix by using multi-criteria decision-making techniques, namely integrated multi-layer analytic hierarchy process (MLAHP) and the technique for order performance by similarity to ideal solution (TOPSIS). For subjective validation, cardiologists were consulted to confirm the ranking results. For objective validation, mean ± standard deviation was computed to check the accuracy of the systematic ranking. This study provides scenarios and checklist benchmarking to evaluate the proposed and existing prioritization methods. Experimental results revealed the following. (1) The integration of TOPSIS and MLAHP effectively and systematically solved the patient settings on triage and
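
    Of the two ranking techniques named above, TOPSIS is straightforward to sketch. The snippet below ranks a few hypothetical patients from an already-scored decision matrix; the criteria weights (derived with MLAHP in the actual system) and the triage scores are invented for illustration.

        import numpy as np

        # TOPSIS over a decision matrix: rows = patients, columns = criteria
        # (e.g. ECG, SpO2, blood pressure, text-frame score). Values invented.
        X = np.array([[3.0, 2.0, 3.0, 1.0],
                      [1.0, 1.0, 2.0, 1.0],
                      [3.0, 3.0, 3.0, 2.0]])
        w = np.array([0.4, 0.3, 0.2, 0.1])      # stand-in for MLAHP weights

        V = X / np.linalg.norm(X, axis=0) * w   # vector-normalize, then weight
        ideal = V.max(axis=0)                   # all criteria treated as urgency ("benefit")
        anti = V.min(axis=0)
        d_best = np.linalg.norm(V - ideal, axis=1)
        d_worst = np.linalg.norm(V - anti, axis=1)
        closeness = d_worst / (d_best + d_worst)   # 1.0 = highest priority here

        for rank, i in enumerate(np.argsort(-closeness), start=1):
            print(f"rank {rank}: patient {i} (closeness {closeness[i]:.3f})")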

  7. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large-scale networks is approached from the perspective of the network planner. An analysis of the long-term planning problems is presented, with the main focus on the changing requirements for large-scale networks and the potential problems in meeting these requirements. Next, the fundamental technological resources in network technologies are analysed for scalability; here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large-scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown to be problematic both for the feasibility of automatic management and for manpower resource management. In the fourth step the scope is extended to include the present society, with the DDN project as its…

  8. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the Task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large-scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large-scale solar purchasing amongst potential large-scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large-scale active solar heating purchasing activity within the UK. (author)

  9. An analysis of the energy efficiency of winter rapeseed biomass under different farming technologies. A case study of a large-scale farm in Poland

    International Nuclear Information System (INIS)

    Budzyński, Wojciech Stefan; Jankowski, Krzysztof Józef; Jarocki, Marcin

    2015-01-01

    The article presents the results of a three-year study investigating the impact of production technology on the energy efficiency of winter rapeseed produced on large-scale farms. Rapeseed biomass produced in a high-input system was characterized by the highest energy demand (30.00 GJ ha⁻¹). The energy demand associated with medium-input and low-input systems was 20% and 34% lower, respectively. The highest energy value of oil, oil cake and straw was noted in winter rapeseed produced in the high-input system. In the total energy output (268.5 GJ ha⁻¹), approximately 17% of energy was accumulated in oil, 20% in oil cake, and 63% in straw. In lower-input systems, the energy output of oil decreased by 13–23%, the energy output of oil cake by 6–16%, and the energy output of straw by 29–37%, without visible changes in the structure of energy accumulated in different components of rapeseed biomass. The highest energy gain was observed in the high-input system. The low-input system was characterized by the highest energy efficiency ratio, at 4.22 for seeds and 9.43 for seeds and straw. The increase in production intensity reduced the energy efficiency of rapeseed biomass production by 8–18% (seeds) and 5–9% (seeds and straw). - Highlights: • Energy inputs in the high-input production system reached 30 GJ ha⁻¹. • Energy inputs in the medium- and low-input systems were reduced by 20% and 34%. • Energy gain in the high-input system was 15% and 42% higher than in other systems. • Energy ratio in the high-input system was 5–18% lower than in the low-input system.
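
    The efficiency figures above are simple ratios of energy output to energy input; the check below reproduces the high-input numbers quoted in the abstract (30.00 GJ ha⁻¹ in, 268.5 GJ ha⁻¹ out for seeds plus straw). The resulting ratio of about 8.95 sits roughly 5% below the low-input value of 9.43, consistent with the reported 5–9% reduction.

        # Energy balance of the high-input system, using figures quoted above.
        energy_input = 30.00               # GJ/ha
        energy_output = 268.5              # GJ/ha (oil + oil cake + straw)

        energy_gain = energy_output - energy_input         # 238.5 GJ/ha
        efficiency_ratio = energy_output / energy_input    # ~8.95

        print(f"gain: {energy_gain:.1f} GJ/ha, ratio: {efficiency_ratio:.2f}")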

  10. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas; surveys varied subject areas and reports on individual results of research in the field; shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  11. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors, to be used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  12. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (broadband, multi-utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  13. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987; approved for public release. Only report-form residue and abstract fragments survive in this record, e.g. "…arises, to reduce the spread in the LSGT 50% gap value." and "The worst charges, such as those with the highest or lowest densities, the largest re-pressed…" (Sponsoring office: Arlington, VA 22217; PE 62314N.)

  14. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy-intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed at the first workshop on energy management for large-scale scientific infrastructures, held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large-scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  15. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  16. Large-scale CO2 injection demos for the development of monitoring and verification technology and guidelines (CO2ReMoVe)

    Energy Technology Data Exchange (ETDEWEB)

    Wildenborg, T.; David, P. [TNO Built Environment and Geosciences, Princetonlaan 6, 3584 CB Utrecht (Netherlands); Bentham, M.; Chadwick, A.; Kirk, K. [British Geological Survey, Kingsley Dunham Centre, Keyworth, Nottingham NG12 5GG (United Kingdom); Dillen, M. [SINTEF Petroleum Research, Trondheim (Norway); Groenenberg, H. [Unit Policy Studies, Energy Research Centre of the Netherlands ECN, Amsterdam (Netherlands); Deflandre, J.P.; Le Gallo, J. [Institut Francais du Petrole, Rueil-Malmaison (France)

    2009-04-15

    The objectives of the EU project CO2ReMoVe are to undertake the research and development necessary to establish scientifically based standards for monitoring future CCS operations and to develop the performance assessment methodologies necessary to demonstrate the long-term reliability of geological storage of CO2. This could in turn lead to guidelines for the certification of sites suitable for CCS on a wide scale. Crucial to the project portfolio are the continuing large-scale CO2 injection operation at Sleipner, the injection operation at In Salah (Algeria) and the recently started injection project at Snoehvit (Norway). Two pilot sites are also currently in the project portfolio, Ketzin in Germany and K12-B in the offshore continental shelf of the Netherlands.

  17. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  18. Production of recombinant antigens and antibodies in Nicotiana benthamiana using 'magnifection' technology: GMP-compliant facilities for small- and large-scale manufacturing.

    Science.gov (United States)

    Klimyuk, Victor; Pogue, Gregory; Herz, Stefan; Butler, John; Haydon, Hugh

    2014-01-01

    This review describes the adaptation of the plant-virus-based transient expression system magnICON(®) for the at-scale manufacturing of pharmaceutical proteins. The system utilizes so-called "deconstructed" viral vectors that rely on Agrobacterium-mediated systemic delivery into the plant cells for recombinant protein production. The system is also suitable for production of hetero-oligomeric proteins like immunoglobulins. By taking advantage of well-established R&D tools for optimizing the expression of the protein of interest using this system, product concepts can reach the manufacturing stage in highly competitive time periods. At the manufacturing stage, the system offers many remarkable features, including rapid production cycles, high product yield, virtually unlimited scale-up potential, and flexibility for different manufacturing schemes. The magnICON system has been successfully adapted to very different logistical manufacturing formats: (1) speedy production of multiple small batches of individualized pharmaceutical proteins (e.g. antigens comprising individualized vaccines to treat Non-Hodgkin's Lymphoma patients) and (2) large-scale production of other pharmaceutical proteins such as therapeutic antibodies. General descriptions of the prototype GMP-compliant manufacturing processes and facilities for the product formats that are in preclinical and clinical testing are provided.

  19. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power system with additional wind power capacity there will be an increased need for large-scale power management as well as reliable balancing and reserve capabilities. Different technologies for large-scale electricity storage provide solutions to the different challenges arising w...

  20. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  1. Application of GRA method, dynamic analysis and fuzzy set theory in evaluation and selection of emergency treatment technology for large scale phenol spill incidents

    Science.gov (United States)

    Zhao, Jingjing; Yu, Lean; Li, Lian

    2017-05-01

    Selecting an appropriate technology in an emergency response is a very important issue, with various kinds of chemical contingency spills frequently taking place. Given the complexity, fuzziness and uncertainties of chemical contingency spills, the GRA (grey relational analysis) method and dynamic analysis, combined with fuzzy set theory, are applied to the selection and evaluation of emergency treatment technology. Finally, an emergency phenol spill accident that occurred on a highway is presented to illustrate the applicability and feasibility of the proposed methods.
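
    As a sketch of the GRA step named above, the snippet below computes grey relational grades for a few hypothetical treatment technologies against an ideal reference series; the criteria values and the conventional distinguishing coefficient rho = 0.5 are illustrative, not taken from the paper.

        import numpy as np

        # Grey relational analysis: rank alternatives by closeness to an ideal
        # reference series. Rows = candidate technologies, columns = criteria.
        X = np.array([[0.8, 0.6, 0.9],
                      [0.5, 0.9, 0.7],
                      [0.9, 0.4, 0.6]])     # invented, scaled to [0, 1]
        ref = X.max(axis=0)                 # ideal series (benefit criteria)
        rho = 0.5                           # distinguishing coefficient

        delta = np.abs(X - ref)
        xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
        grades = xi.mean(axis=1)            # grey relational grade per technology

        for rank, i in enumerate(np.argsort(-grades), start=1):
            print(f"rank {rank}: technology {i} (grade {grades[i]:.3f})")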

  2. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

    To establish planar biomimetic membranes across large-scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2-laser micro-structured 8 x 8 aperture partition arrays with average aperture diameters of 301 ± 5 μm. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnologically and physiologically relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further development of sensitive biosensor assays...

  3. Large scale nuclear structure studies

    International Nuclear Information System (INIS)

    Faessler, A.

    1985-01-01

    Results of large-scale nuclear structure studies are reported. The starting point is the Hartree-Fock-Bogoliubov solution with angular momentum and proton and neutron number projection after variation. This model for number- and spin-projected two-quasiparticle excitations with realistic forces yields results in sd-shell nuclei as good as those of the 'exact' shell-model calculations. Here the authors present results for the pf-shell nucleus ⁴⁶Ti and for the A=130 mass region, where they studied 58 different nuclei with the same single-particle energies and the same effective force derived from a meson exchange potential. They carried out a Hartree-Fock-Bogoliubov variation after mean-field projection in realistic model spaces. In this way, they determine for each yrast state the optimal mean Hartree-Fock-Bogoliubov field. They apply this method to ¹³⁰Ce and ¹²⁸Ba using the same effective nucleon-nucleon interaction. (Auth.)

  4. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  5. Foster Wheeler's Solutions for Large Scale CFB Boiler Technology: Features and Operational Performance of Łagisza 460 MWe CFB Boiler

    Science.gov (United States)

    Hotta, Arto

    During recent years, once-through supercritical (OTSC) CFB technology has been developed, enabling CFB technology to proceed to medium-scale (500 MWe) utility projects such as the Łagisza Power Plant in Poland, owned by Poludniowy Koncern Energetyczny SA (PKE), with net efficiency of nearly 44%. The Łagisza power plant is currently being commissioned and reached full-load operation in March 2009. The initial operation shows very good performance and confirms that the CFB process has no problems scaling up to this size. The once-through steam cycle, utilizing Siemens' vertical-tube Benson technology, has also performed as predicted in the CFB process. Foster Wheeler has developed the CFB design further, up to 800 MWe with net efficiency of ≥45%.

  6. Boosting the adoption and the reliability of renewable energy sources: Mitigating the large-scale wind power intermittency through vehicle to grid technology

    International Nuclear Information System (INIS)

    Zhao, Yang; Noori, Mehdi; Tatari, Omer

    2017-01-01

    The integration of wind energy in the electricity sector and the adoption of electric vehicles in the transportation sector both have the potential to significantly reduce greenhouse gas emissions individually as well as in tandem with Vehicle-to-Grid technology. This study aims to evaluate the greenhouse gas emission savings of mitigating intermittency resulting from the introduction of wind power through Vehicle-to-Grid technologies, as well as the extent to which the marginal electricity consumption from charging an electric vehicle fleet may weaken this overall environmental benefit. To this end, the comparisons are conducted in seven independent system operator regions. The results indicate that, in most cases, the emission savings of a combination of wind power and Vehicle-to-Grid technology outweighs the additional emissions from marginal electricity generation for electric vehicles. In addition, the fluctuations in newly-integrated wind power could be balanced in the future using EVs and V2G technology, provided that a moderate portion of EV owners is willing to provide V2G services. On the other hand, such a combination is not favorable if the Vehicle-to-Grid service participation rate is less than 5% of all electric vehicle owners within a particular region. - Highlights: • The environmental benefit of vehicle to grid systems as grid stabilizer is analyzed. • Emission savings of vehicle to grid and impacts of electric vehicles are compared. • Seven independent system operator regions are studied. • Uncertainty and sensitivity analysis are performed through a Monte Carlo Simulation.
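
    The highlights mention Monte Carlo-based uncertainty and sensitivity analysis; purely as an illustration of that style of analysis (not the paper's model, distributions or numbers), the sketch below samples a net emission-savings estimate and applies the 5% participation threshold reported above.

        import numpy as np

        # Illustrative Monte Carlo: net GHG effect = avoided grid emissions from
        # V2G-firmed wind minus marginal emissions from EV charging. All
        # distributions and magnitudes are invented for the sketch.
        rng = np.random.default_rng(42)
        n = 100_000
        wind_savings = rng.normal(1.0e6, 2.0e5, n)    # t CO2e/yr avoided
        ev_charging = rng.normal(6.0e5, 1.5e5, n)     # t CO2e/yr added
        participation = rng.uniform(0.0, 0.2, n)      # share of EV owners in V2G

        # Crude assumption: wind savings only materialize above the 5% threshold.
        net = np.where(participation >= 0.05,
                       wind_savings - ev_charging,
                       -ev_charging)

        print(f"mean net savings: {net.mean():,.0f} t CO2e/yr")
        print(f"P(net benefit) = {(net > 0).mean():.2f}")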

  7. Reports on research programs in the field of reactor safety sponsored by the Federal Ministry for Research and Technology

    International Nuclear Information System (INIS)

    1986-11-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the research program on reactor safety (RS-projects) are sponsored by the Federal Ministry for Research and Technology (BMFT). The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, projects on the safety of advanced reactors are also sponsored by the BMFT. The individual reports are classified according to the BMFT's research program on the safety of LWRs 1977-1980. Another table of contents uses the same classification system as applied in the nuclear safety index of the CEC (Commission of the European Communities) and the OECD (Organization for Economic Cooperation and Development). The reports are arranged in the sequence of their project numbers. (orig./HP) [de

  8. Reports of research programs in the field of reactor safety sponsored by the Federal Ministry for Research and Technology

    International Nuclear Information System (INIS)

    1986-06-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the research program on reactor safety (RS-projects) are sponsored by the Federal Ministry for Research and Technology (BMFT). The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, projects on the safety of advanced reactors are also sponsored by the BMFT. The individual reports are classified according to the BMFT's research program on the safety of LWRs 1977-1980. Another table of contents uses the same classification system as applied in the nuclear safety index of the CEC (Commission of the European Communities) and the OECD (Organization for Economic Cooperation and Development). The reports are arranged in the sequence of their project numbers. (orig./HP) [de

  9. 15 year's summary report on blanket technology and materials of mixed fuel reactor research sponsored by national '863' projects

    International Nuclear Information System (INIS)

    Xu Zengyu; Chen Jiming; Liu Xiang

    2001-01-01

    Fifteen years' achievements of the Southwestern Institute of Physics, China, in fusion technology and materials research sponsored by the National '863' Engineering Projects are summarized. Many scientific and technical achievements have been obtained in research on tritium production and recovery, doped carbon-based materials, V-alloys, 316L SS irradiation performance, and B₄C and TiC coatings. Some facilities were built and others improved for materials research. 108 references are annexed

  10. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province, who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890-foot-long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a powerhouse with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion, of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy and one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  11. Transition to large scale use of hydrogen and sustainable energy services. Choices of technology and infrastructure under path dependence, feedback and nonlinearity

    Energy Technology Data Exchange (ETDEWEB)

    Gether, Kaare

    2004-07-01

    We live in a world of becoming. The future is not given, but forms continuously in dynamic processes where path dependence plays a major role. There are many different possible futures. What we actually end up with is determined in part by chance and in part by the decisions we make. To make sound decisions we require models that are flexible enough to identify opportunities and to help us choose options that lead to advantageous alternatives. This way of thinking differs from traditional cost-benefit analysis that employs net present value calculations to choose on purely economic grounds, without regard to future consequences. Time and dynamic behaviour introduce a separate perspective. There is a focus on change, and decisions acquire windows of opportunity: the right decision at the right time may lead to substantial change, while it will have little effect if too early or too late. Modelling needs to reflect this dynamic behaviour. It is the perspective of time and dynamics that leads to a focus on sustainability, and thereby the role hydrogen might play in a future energy system. The present work develops a particular understanding relevant to energy infrastructures. Central elements of this understanding are: competition, market preference and choice beyond costs, bounded rationality, uncertainty and risk, irreversibility, increasing returns, path dependence, feedback, delay, nonlinear behaviour. Change towards a "hydrogen economy" will involve far-reaching change away from our existing energy infrastructure. This infrastructure is viewed as a dynamic set of interacting technologies (value sequences) that provide services to end-users and uphold the required supply of energy for this, all the way from primary energy sources. The individual technologies also develop with time. Building on this understanding and analysis, an analytical tool has emerged: the Energy Infrastructure Competition (EICOMP) model. In the model each technology is

  12. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    Today, there is a lot of focus on the aesthetic potential of concrete surfaces, both globally and locally. World-famous architects such as Herzog & de Meuron, Zaha Hadid, Richard Meier and David Chipperfield challenge the exposure of concrete in their architecture. At home, this trend can be seen in the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel and in Zaha Hadid's black curved smooth concrete surfaces at Ordrupgård. Furthermore, one can point to initiatives such as “Synlig beton” (visible concrete), which can be seen on the website www.synligbeton.dk, and Spæncom's aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research-development project “Lasting large scale glazed concrete formwork,” which I am working on at DTU's Department of Architectural Engineering, will be able to complement these. It is a project where I...

  13. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large-scale cross-hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones and to obtain hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous. This heterogeneity is not smoothed out even over scales of hundreds of metres. Results of the interpretation validate the hypothesis of the major fracture zones A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie within a narrow range of two to three orders of magnitude. The test design did not allow fracture zones to be tested individually; this could be improved by specifically testing the regions of high hydraulic conductivity. The Piezomac and single-hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response. This could be either because there is no hydraulic connection, or because there is a connection but a response is not seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  14. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here, modeling tools, including RANS- and LES-based computational fluid dynamics codes, can be effectively exploited to investigate the fluid flow phenomena. The latter are well suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively parallel computational resources are likely to be necessary in order to be able to adequately address the complex coupled phenomena to the level of detail that is necessary.
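
    For reference, the source Froude number invoked above is conventionally the ratio of the fuel vapor release velocity at the pool surface, u_0, to the buoyancy velocity scale set by the pool diameter D:

        \mathrm{Fr} = \frac{u_0}{\sqrt{g\,D}}

    Pool fires sit at very low values of this ratio, so their plumes are buoyancy-dominated rather than momentum-driven, which is what separates their dynamics from jet flames.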

  15. Predictors of Information Technology Integration in Secondary Schools: Evidence from a Large Scale Study of More than 30,000 Students.

    Directory of Open Access Journals (Sweden)

    Khe Foon Hew

    Full Text Available The present study examined the predictors of information technology (IT) integration in secondary school mathematics lessons. The predictors pertained to IT resource availability in schools, school contextual/institutional variables, accountability pressure faced by schools, subject culture in mathematics, and mathematics teachers' pedagogical beliefs and practices. Data from 32,256 secondary school students from 2,519 schools in 16 developed economies who participated in the Program for International Student Assessment (PISA) 2012 were analyzed using hierarchical linear modeling (HLM). Results showed that after controlling for student-level (gender, prior academic achievement and socioeconomic status) and school-level (class size, number of mathematics teachers) variables, students in schools with more computers per student, with more IT resources, with higher levels of IT curricular expectations, with an explicit policy on the use of IT in mathematics, whose teachers believed in student-centered teaching-learning, and whose teachers provided more problem-solving activities in class reported higher levels of IT integration. On the other hand, students who studied in schools with more positive teacher-related school learning climate, and with more academically demanding parents reported lower levels of IT integration. Student-related school learning climate, principal leadership behaviors, schools' public posting of achievement data, tracking of school's achievement data by administrative authorities, and pedagogical and curricular differentiation in mathematics lessons were not related to levels of IT integration. Put together, the predictors explained a total of 15.90% of the school-level variance in levels of IT integration. In particular, school IT resource availability, and mathematics teachers' pedagogical beliefs and practices stood out as the most important determinants of IT integration in mathematics lessons.

  16. Predictors of Information Technology Integration in Secondary Schools: Evidence from a Large Scale Study of More than 30,000 Students.

    Science.gov (United States)

    Hew, Khe Foon; Tan, Cheng Yong

    2016-01-01

    The present study examined the predictors of information technology (IT) integration in secondary school mathematics lessons. The predictors pertained to IT resource availability in schools, school contextual/institutional variables, accountability pressure faced by schools, subject culture in mathematics, and mathematics teachers' pedagogical beliefs and practices. Data from 32,256 secondary school students from 2,519 schools in 16 developed economies who participated in the Program for International Student Assessment (PISA) 2012 were analyzed using hierarchical linear modeling (HLM). Results showed that after controlling for student-level (gender, prior academic achievement and socioeconomic status) and school-level (class size, number of mathematics teachers) variables, students in schools with more computers per student, with more IT resources, with higher levels of IT curricular expectations, with an explicit policy on the use of IT in mathematics, whose teachers believed in student-centered teaching-learning, and whose teachers provided more problem-solving activities in class reported higher levels of IT integration. On the other hand, students who studied in schools with more positive teacher-related school learning climate, and with more academically demanding parents reported lower levels of IT integration. Student-related school learning climate, principal leadership behaviors, schools' public posting of achievement data, tracking of school's achievement data by administrative authorities, and pedagogical and curricular differentiation in mathematics lessons were not related to levels of IT integration. Put together, the predictors explained a total of 15.90% of the school-level variance in levels of IT integration. In particular, school IT resource availability, and mathematics teachers' pedagogical beliefs and practices stood out as the most important determinants of IT integration in mathematics lessons.
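
    As a sketch of the two-level modeling approach used in this study, the snippet below fits a random-intercept model with students nested in schools using statsmodels; the synthetic data, variable names and effect sizes are all invented, and the actual PISA analysis involves many more predictors plus survey design considerations.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic two-level data: students (level 1) nested in schools (level 2).
        rng = np.random.default_rng(0)
        n_schools, n_students = 50, 40
        school = np.repeat(np.arange(n_schools), n_students)
        computers = np.repeat(rng.normal(0, 1, n_schools), n_students)  # school-level
        ses = rng.normal(0, 1, n_schools * n_students)                  # student-level
        u = np.repeat(rng.normal(0, 0.5, n_schools), n_students)        # random intercept
        it_use = 0.3 * computers + 0.1 * ses + u + rng.normal(0, 1, school.size)

        df = pd.DataFrame({"school": school, "computers": computers,
                           "ses": ses, "it_use": it_use})

        # Random-intercept HLM: fixed effects for predictors, school intercepts random.
        model = smf.mixedlm("it_use ~ computers + ses", df, groups=df["school"])
        print(model.fit().summary())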

  17. Large-scale galaxy bias

    Science.gov (United States)

    Desjacques, Vincent; Jeong, Donghui; Schmidt, Fabian

    2018-02-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy statistics. We then review the excursion-set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.
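
    In the notation of the review, the leading terms of this perturbative bias expansion take the schematic form (with delta the matter overdensity and K_ij the tidal field):

        \delta_g(\boldsymbol{x},\tau) = b_1\,\delta + \frac{b_2}{2}\left(\delta^2 - \langle\delta^2\rangle\right) + b_{K^2}\left(K^2 - \langle K^2\rangle\right) + \dots\,, \qquad K^2 \equiv K_{ij}K^{ij}

    Higher-order and higher-derivative operators enter at successive orders; the b coefficients are the bias parameters referred to above.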

  18. Large-scale galaxy bias

    Science.gov (United States)

    Jeong, Donghui; Desjacques, Vincent; Schmidt, Fabian

    2018-01-01

    Here, we briefly introduce the key results of the recent review (arXiv:1611.09787), whose abstract is as following. This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy (or halo) statistics. We then review the excursion set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.

  19. The Perceived Influence of Industry-Sponsored Credentials on the Recruitment Process in the Information Technology Industry: Employer and Employee Perspectives

    Science.gov (United States)

    Bartlett, Kenneth R.; Horwitz, Sujin K.; Ipe, Minu; Liu, Yuwen

    2005-01-01

    The increase in the number of industry-sponsored credential programs raises many questions for career and technical education. This study investigated the perceived influence of industry-sponsored credentials on the recruitment process in the information technology (IT) field. Influence is examined from the perspective of Human Resource (HR)…

  20. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's, an interest in testing and operating RF cavities at 1.8 K motivated the development and construction of four large (300 W) 1.8 K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved niobium-titanium superconductors have once again created interest in large-scale 1.8 K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 W 1.8 K system which incorporates a new technology, cold compressors, to obtain the low vapor pressure required for low-temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0 K. Magnetic refrigerators of 10 W capacity or higher at 1.8 K are now being developed. The state of the art of large-scale refrigeration in the range under 4 K is reviewed. 28 refs., 4 figs., 7 tabs

  1. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have given a detailed description of the large-scale communications architecture in the IOT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communications architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communications in the IOT.

  2. 8th International Symposium on Intelligent Distributed Computing & Workshop on Cyber Security and Resilience of Large-Scale Systems & 6th International Workshop on Multi-Agent Systems Technology and Semantics

    CERN Document Server

    Braubach, Lars; Venticinque, Salvatore; Badica, Costin

    2015-01-01

    This book represents the combined peer-reviewed proceedings of the Eighth International Symposium on Intelligent Distributed Computing - IDC'2014, of the Workshop on Cyber Security and Resilience of Large-Scale Systems - WSRL-2014, and of the Sixth International Workshop on Multi-Agent Systems Technology and Semantics - MASTS-2014. All the events were held in Madrid, Spain, during September 3-5, 2014. The 47 contributions published in this book address several topics related to the theory and applications of intelligent distributed computing and multi-agent systems, including: agent-based data processing, ambient intelligence, collaborative systems, cryptography and security, distributed algorithms, grid and cloud computing, information extraction, knowledge management, big data and ontologies, social networks, swarm intelligence and videogames, amongst others.

  3. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high-technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity through a systematic and organized approach; efficiency and cost effectiveness are thus the driving forces behind the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  4. How large-scale technological development should be in the future. Survey and research on highly automated machines; Kongo no daikibo gijutsu kaihatsu no hoko ni tsuite. Kodo jidoka kikai ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1982-03-01

    A survey is conducted about highly automated machines such as industrial robots. The task identified for development, derived from a survey of needs, is the construction of a dangerous-work robot. It is pointed out that work in coal mines, tall buildings, industrial complexes, or nuclear power plants may encounter large-scale accidents, and the task is how to perform such work in an automated way. The tasks selected for development after a seed survey analysis fall into three groups of element technologies, namely sensors and recognition functions, mechanisms and materials, and control and data processing. These element technologies are ultimately to be integrated into a robot for critical work, combining a highly intelligent robot main body with an integrated management system. Since humans will at times have to operate such a robot directly under delicate conditions and share the burden of judgement and thinking, it is also necessary to develop technologies to solve problems of man-robot engineering. It is proposed that a dangerous-work robot research and development program be established before development is started. (NEDO)

  5. Costs and Benefits of Vendor Sponsored Learning Materials in Information Technology Education

    Science.gov (United States)

    Hua, David M.

    2013-01-01

    The demand for qualified information technology professionals remains high despite downturns in the economy. It is imperative to provide students with a curriculum that provides a broad foundation in information technology knowledge, skills, and abilities. Students also need access to specialized technologies and learning materials to develop the…

  6. Reports on the research projects in the field of nuclear safety sponsored by the Federal Minister for Research and Technology

    International Nuclear Information System (INIS)

    1979-12-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS - Projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, first projects on the safety of FBR type reactors are sponsored by the BMFT. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of such investigations by means of quarterly and annual publication of progress reports within the series GRS - F - Fortschrittsberichte (GRS - F - Progress Reports). Each progress report represents a compilation of individual reports about objectives, the work performed, the results, the next steps of the work etc. The individual reports are prepared in a standard form by the contractors themselves as a documentation of their progress in work. The individual reports are arranged according to the amended LWR Safety Research Program of the BMFT, which will appear in the near future. Another table of contents uses the same classification system as applied in the Nuclear Safety Index of the CEC (Commission of the European Communities) and the OECD. (orig./HP) [de

  7. Reports on research projects in the field of reactor safety sponsored by the Federal Ministry for research and technology

    International Nuclear Information System (INIS)

    1979-09-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS - Projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, first projects on the safety of FBR type reactors are sponsored by the BMFT. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of such investigations by means of quarterly and annual publication of progress reports within the series GRS - F - Fortschrittsberichte (GRS - F - Progress Reports). Each progress report represents a compilation of individual reports about objectives, the work performed, the results, the next steps of the work etc. The individual reports are prepared in a standard form by the contractors themselves as a documentation of their progress in work. The individual reports are arranged according to the amended LWR Safety Research Program of the BMFT, which will appear in the near future. Another table of contents uses the same classification system as applied in the Nuclear Safety Index of the CEC (Commission of the European Communities) and the OECD. (orig.) [de

  8. Reports on the research projects in the field of nuclear safety sponsored by the Federal Ministry for Research and Technology

    International Nuclear Information System (INIS)

    1980-06-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS-Projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, first projects on the safety of FBR type reactors are sponsored by the BMFT. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of such investigations by means of quarterly and annual publication of progress reports within the series GRS-F-Fortschrittsberichte (GRS-F-Progress Reports). Each progress report represents a compilation of individual reports about objectives, the work performed, the results, the next steps of the work etc. The individual reports are prepared in a standard form by the contractors themselves as a documentation of their progress in work. The individual reports are arranged according to the amended LWR Safety Research Program of the BMFT, which will appear in the near future. Another table of contents uses the same classification system as applied in the Nuclear Safety Index of the CEC and the OECD. (orig./HP) [de

  9. Reports on research projects in the field of reactor safety sponsored by the Federal Minister for Research and Technology

    International Nuclear Information System (INIS)

    1978-09-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS - Projects) are sponsored by the BMFT (Federal Minister for Research and Technology), der Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, first projects on the safety of FBR type reactors are sponsored by the BMFT. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of such investigations by means of quarterly and annual publication of progress reports within the series GRS - F - Fortschrittsberichte (GRS - F - Progress Reports). Each progress report represents a compilation of individual reports about objectives, the work performed, the results, the next steps of the work etc. The individual reports are prepared in a standard form by the contractors themselves as a documentation of their progress in work. The individual reports are arranged according to the amended LWR Safety Research Program of the BMFT. Another table of contents uses the same classification system as applied in the Nuclear Safety Index of the CEC (Commission of the European Communities) and the OECD. (orig./HP) [de

  10. Reports on research projects sponsored by the Federal Minister for Research and Technology in the field of reactor safety

    International Nuclear Information System (INIS)

    1979-03-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS - Projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, first projects on the safety of FBR type reactors are sponsored by the BMFT. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of such investigations by means of quarterly and annual publication of progress reports within the series GRS - F - Fortschrittsberichte (GRS - F - Progress Reports). Each progress report represents a compilation of individual reports about objectives, the work performed, the results, the next steps of the work etc. The individual reports are prepared in a standard form by the contractors themselves as a documentation of their progress in work. The individual reports are arranged according to the amended LWR Safety Research Program of the BMFT. Another table of contents uses the same classification system as applied in the Nuclear Safety Index of the CEC (Commission of the European Communities) and the OECD. (orig./HP) [de

  11. Report on the projects in the field of reactor safety sponsored by the Federal Ministry for Research and Technology

    International Nuclear Information System (INIS)

    1978-12-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS - Projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, first projects on the safety of advanced reactors are sponsored by the BMFT. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of such investigations by means of quarterly and annual publication of progress reports within the series GRS - F - Fortschrittsberichte (GRS - F - Progress Reports). Each progress report represents a compilation of individual reports about objectives, the work performed, the results, the next steps of the work etc. The individual reports are prepared in a standard form by the contractors themselves as a documentation of their progress in work and published by the FB (Research Coordination Department), Forschungsbetreuung at the GRS, within the framework of general information on the progress in reactor safety research. The individual reports are arranged according to the amended LWR Safety Research Program of the BMFT, which will appear in the near future. Another table of contents uses the same classification system as applied in the Nuclear Safety Index of the CEC and the OECD. (orig./HP) [de

  12. Report on the research projects in the field of reactor safety sponsored by the Federal Minister for Research and Technology

    International Nuclear Information System (INIS)

    1978-09-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS - Projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems and the further development of safety technology. Besides the investigations of LWR tasks, first projects on the safety of FBR type reactors are sponsored by the BMFT. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of such investigations by means of quarterly and annual publication of progress reports within the series GRS - F - Fortschrittsberichte (GRS - F - Progress Reports). Each progress report represents a compilation of individual reports about objectives, the work performed, the results, the next steps of the work etc. The individual reports are prepared in a standard form by the contractors themselves as a documentation of their progress in work. The individual reports are arranged according to the amended LWR Safety Research Program of the BMFT. Another table of contents uses the same classification system as applied in the Nuclear Safety Index of the CEC (Commission of the European Communities) and the OECD. (orig./HP) [de

  13. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach.

  14. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  15. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  16. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  17. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    The power balancing problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary number of units.
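
    To make the decomposition idea concrete, below is a minimal price-based (dual) decomposition sketch in Python. It illustrates the general technique only, not the paper's algorithm; the unit count, costs, capacities and step size are all made-up assumptions.

        # Illustrative dual decomposition: an aggregator broadcasts a price
        # signal; each unit solves its own small problem; the price is updated
        # until aggregate production matches the target. (Hypothetical data.)
        import numpy as np

        N, T = 5, 24                          # units, time steps (assumed)
        target = np.full(T, 10.0)             # power profile to balance
        p_max = np.linspace(1.0, 5.0, N)      # per-unit capacity (assumed)
        cost = np.linspace(1.0, 2.0, N)       # per-unit marginal cost (assumed)

        lam = np.zeros(T)                     # price signal (dual variable)
        for _ in range(200):
            # Local subproblem per unit: min_p cost_i*p - lam*p, 0 <= p <= p_max_i.
            # With linear cost the optimum is bang-bang: full power where lam > cost.
            p = np.array([np.where(lam > c, pm, 0.0) for c, pm in zip(cost, p_max)])
            imbalance = target - p.sum(axis=0)
            lam += 0.05 * imbalance           # subgradient step on the price
        print("max remaining imbalance:", np.abs(imbalance).max())

    In this toy instance the price settles where the cheapest units exactly cover the target; the point is only that each unit's problem stays small while the aggregator coordinates through a single price vector.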

  18. Implementation factors affecting the large-scale deployment of digital health and well-being technologies: A qualitative study of the initial phases of the 'Living-It-Up' programme.

    Science.gov (United States)

    Agbakoba, Ruth; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; Watson, Nicholas; Mair, Frances S

    2016-12-01

    Little is known about the factors which facilitate or impede the large-scale deployment of health and well-being consumer technologies. The Living-It-Up project is a large-scale digital intervention led by NHS 24, aiming to transform health and well-being services delivery throughout Scotland. We conducted a qualitative study of the factors affecting the implementation and deployment of the Living-It-Up services. We collected a range of data during the initial phase of deployment, including semi-structured interviews (N = 6); participant observation sessions (N = 5) and meetings with key stakeholders (N = 3). We used the Normalisation Process Theory as an explanatory framework to interpret the social processes at play during the initial phases of deployment. Initial findings illustrate that it is clear - and perhaps not surprising - that the size and diversity of the Living-It-Up consortium made implementation processes more complex within a 'multi-stakeholder' environment. To overcome these barriers, there is a need to clearly define roles, tasks and responsibilities among the consortium partners. Furthermore, varying levels of expectations and requirements, as well as diverse cultures and ways of working, must be effectively managed. Factors which facilitated implementation included extensive stakeholder engagement, such as co-design activities, which can contribute to increased 'buy-in' from users in the long term. An important lesson from the Living-It-Up initiative is that attempting to co-design innovative digital services while at the same time recruiting large numbers of users is likely to generate conflicting implementation priorities which hinder - or at least substantially slow down - the effective rollout of services at scale. The deployment of Living-It-Up services is ongoing, but our results to date suggest that - in order to be successful - the roll-out of digital health and well-being technologies at scale requires a delicate and pragmatic trade-off.

  19. Food appropriation through large scale land acquisitions

    International Nuclear Information System (INIS)

    Cristina Rulli, Maria; D’Odorico, Paolo

    2014-01-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of a lack of modern technology. It is expected that in the long run large-scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300–550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190–370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations. (letter)

  20. Thermal power generation projects "Large Scale Solar Heating"; EU-Thermie-Projekte "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large-Scale Solar Heating" programme for a Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the experts but was not immediately (1996) accepted for funding. In November 1997 the EU Commission provided 1.5 million ECU at short notice, which allowed an updated project proposal to be realised. By mid-1997 a smaller project had already been approved, which had been applied for under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves technology transfer. (orig.)

  1. Research report on the effect of the large-scale industrial technology development system and on how it should be in the future; Ogata kogyo gijutsu kaihatsu seido no seika oyobi kongo no arikata ni kansuru chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1986-03-01

    A survey was done about projects implemented under the above-named development system, inaugurated in fiscal 1966, and studies were made as to how large projects should be run in the future. The survey covered the subjects which had been completed by fiscal 1985, that is, the remotely controlled submarine drilling device for oil, seawater desalination and by-product utilization, the electric vehicle, technology for comprehensive control of automobiles, the pattern information processing system, direct iron making using high-temperature reduced gas, manufacture of olefines from heavy oil, the aviation jet engine, the resources recycling/reuse system, the superhigh-performance laser-aided combined manufacturing system, the submarine oil production system, and the optics-aided measurement/control system. Responses were collected from the corporations concerned. The responses contained some complaints, concerning the shortage of experience on the part of participating corporations, degradation in planning functions, increase in the burden of leading companies, shortage of study or conference about an optimum promotion system, problems in accounting and auditing systems, etc., and suggestions were presented for improvement of large-scale projects. (NEDO)

  2. Successful application of FTA Classic Card technology and use of bacteriophage phi29 DNA polymerase for large-scale field sampling and cloning of complete maize streak virus genomes.

    Science.gov (United States)

    Owor, Betty E; Shepherd, Dionne N; Taylor, Nigel J; Edema, Richard; Monjane, Adérito L; Thomson, Jennifer A; Martin, Darren P; Varsani, Arvind

    2007-03-01

    Leaf samples from 155 maize streak virus (MSV)-infected maize plants were collected from 155 farmers' fields in 23 districts in Uganda in May/June 2005 by leaf-pressing infected samples onto FTA Classic Cards. Viral DNA was successfully extracted from cards stored at room temperature for 9 months. The diversity of 127 MSV isolates was analysed by PCR-generated RFLPs. Six representative isolates having different RFLP patterns and causing either severe, moderate or mild disease symptoms, were chosen for amplification from FTA cards by bacteriophage phi29 DNA polymerase using the TempliPhi system. Full-length genomes were inserted into a cloning vector using a unique restriction enzyme site, and sequenced. The 1.3-kb PCR product amplified directly from FTA-eluted DNA and used for RFLP analysis was also cloned and sequenced. Comparison of cloned whole genome sequences with those of the original PCR products indicated that the correct virus genome had been cloned and that no errors were introduced by the phi29 polymerase. This is the first successful large-scale application of FTA card technology to the field, and illustrates the ease with which large numbers of infected samples can be collected and stored for downstream molecular applications such as diversity analysis and cloning of potentially new virus genomes.
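
    As a side note on the RFLP step, the fragment-length fingerprinting it relies on is easy to sketch in code. The helper below is hypothetical (the recognition site and sequence are invented) and only illustrates cutting a circular genome at every occurrence of one recognition site:

        # Hypothetical RFLP helper: cut a circular sequence at each occurrence
        # of a recognition site and return the resulting fragment lengths.
        def rflp_fragments(seq, site="GAATTC"):
            seq = seq.upper()
            doubled = seq + seq          # lets us see sites that span the origin
            cuts = sorted(i for i in range(len(seq))
                          if doubled.startswith(site, i))
            if not cuts:
                return [len(seq)]        # uncut circle: one full-length fragment
            return [(b - a) % len(seq) or len(seq)
                    for a, b in zip(cuts, cuts[1:] + cuts[:1])]

        print(rflp_fragments("GAATTC" + "A" * 10 + "GAATTC" + "C" * 20))  # [16, 26]

    Two isolates whose site positions differ yield different fragment-length lists, which is the basis on which isolates such as the 127 above can be grouped into RFLP patterns.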

  3. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large-scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large-scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  4. Research on the Construction Management and Sustainable Development of Large-Scale Scientific Facilities in China

    Science.gov (United States)

    Guiquan, Xi; Lin, Cong; Xuehui, Jin

    2018-05-01

    As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee for economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology and key national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.

  5. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend toward large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  6. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  7. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  8. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  9. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  10. Introduction to vacuum technology: supplementary study material developed for IVS sponsored vacuum courses

    International Nuclear Information System (INIS)

    Bhusan, K.G.

    2008-01-01

    Vacuum technology has advanced to a large extent, mainly from the demands of experimental research scientists who have more than ever understood the need for clean, very low pressure environments. This need only seems to increase as the lowest pressures achievable in a laboratory setup drop decade by decade. What is not usually said is that conventional techniques of producing ultrahigh vacuum have also undergone a metamorphosis in order to cater to the multitude of restrictions in modern-day scientific research. This book aims to give that practical approach to vacuum technology. The basics are given in the first chapter with a definition-oriented approach, which is practically useful. The second chapter deals with the production of vacuum and ultrahigh vacuum, with an emphasis on the working principles of several pumps and their working pressure ranges. Measurement of low pressures, both total and partial, is presented in the third chapter, with a note on leak detection and mass spectrometric techniques. Chapter 4 gives an overview of materials that are vacuum compatible and their material properties. Chapter 5 gives the methods to be followed for cleaning vacuum components, which is especially critical if an ultrahigh vacuum environment is required. The practical use of an ultrahigh vacuum environment is demonstrated in Chapter 6 for the production of high-quality thin films through vapour deposition

  11. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done so far by SINTEF Energy Research shows that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  12. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background assurances which are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods with regard to profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it judge successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: 1) external control, 2) the organizational form, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  13. Chirping for large-scale maritime archaeological survey

    DEFF Research Database (Denmark)

    Grøn, Ole; Boldreel, Lars Ole

    2014-01-01

    Archaeological wrecks exposed on the sea floor are mapped using side-scan and multibeam techniques, whereas the detection of submerged archaeological sites, such as Stone Age settlements, and wrecks, partially or wholly embedded in sea-floor sediments, requires the application of high-resolution … the present state of this technology, it appears well suited to large-scale maritime archaeological mapping.

  14. New Visions for Large Scale Networks: Research and Applications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This paper documents the findings of the March 12-14, 2001 Workshop on New Visions for Large-Scale Networks: Research and Applications. The workshop's objectives were...

  15. Reports on the projects in the field of reactor safety sponsored by the Federal Ministry for Research and Technology

    International Nuclear Information System (INIS)

    1977-11-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS-projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear energy plants and their systems and the further development of safety technology. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of these investigations within the series 'GRS-F-Fortschrittsberichte' (GRS-F-Progress Reports). Each progress report represents a compilation of individual reports about the different projects of the research program. The individual reports are prepared by the contractors themselves as a documentation of their progress in work and published by the GRS-FB (Research Coordination Department), Forschungsbetreuung at the GRS, within the framework of general information on the progress in reactor safety research. Each report describes the work performed, the results and the next steps of the work. The individual reports follow the classification system established by the CEC (Commission of the European Communities). The GRS-F-Progress Reports also include a list of the current investigations arranged according to the projects of the BMFT Research Program Reactor Safety. This compilation, in addition to the LWR investigations, also contains first contributions on the safety of advanced reactors. (orig.) [de

  16. Reports on the projects in the field of reactor safety sponsored by the Federal Ministry for Research and Technology

    International Nuclear Information System (INIS)

    1977-12-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS-projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear energy plants and their systems and the further development of safety technology. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of these investigations within the series 'GRS-F-Fortschrittsberichte' (GRS-F-Progress Reports). Each progress report represents a compilation of individual reports about the different projects of the research program. The individual reports are prepared by the contractors themselves as a documentation of their progress in work and published by the GRS-FB (Research Coordination Department), Forschungsbetreuung at the GRS, within the framework of general information on the progress in reactor safety research. Each report describes the work performed, the results and the next steps of the work. The individual reports follow the classification system established by the CEC (Commission of the European Communities). The GRS-F-Progress Reports also include a list of the current investigations arranged according to the projects of the BMFT Research Program Reactor Safety. This compilation, in addition to the LWR investigations, also contains first contributions on the safety of advanced reactors. (orig.) [de

  17. Reports on the projects in the field of reactor safety sponsored by the Federal Ministry for Research and Technology

    International Nuclear Information System (INIS)

    1977-06-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the Research Program Reactor Safety (RS-projects) are sponsored by the BMFT (Federal Minister for Research and Technology), Bundesminister fuer Forschung und Technologie. The objective of this program is to investigate in greater detail the safety margins of nuclear energy plants and their systems and the further development of safety technology. The GRS (Reactor Safety Association), Gesellschaft fuer Reaktorsicherheit mbH, by order of the BMFT, informs continuously of the status of these investigations within the series 'GRS-F-Fortschrittsberichte' (GRS-F-Progress Reports). Each progress report represents a compilation of individual reports about the different projects of the research program. The individual reports are prepared by the contractors themselves as a documentation of their progress in work and published by the GRS-FB (Research Coordination Department), Forschungsbetreuung at the GRS, within the framework of general information on the progress in reactor safety research. Each report describes the work performed, the results and the next steps of the work. The individual reports follow the classification system established by the CEC (Commission of the European Communities). The GRS-F-Progress Reports also include a list of the current investigations arranged according to the projects of the BMFT Research Program Reactor Safety. This compilation, in addition to the LWR investigations, also contains first contributions on the safety of advanced reactors. (orig.) [de

  18. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other (“two-halo conformity” or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  19. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    Science.gov (United States)

    1985-10-07

    Report documentation page (OCR-damaged; only fragments recoverable): Massachusetts Institute of Technology, Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA; "Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism", G. Agha et al.

  20. The Impact of High School Principal's Technology Leadership on the Sustainability of Corporate Sponsored Information Communication Technology Curriculum

    Science.gov (United States)

    Gottwig, Bruce Ryan

    2013-01-01

    The proliferation of information communication technology (ICT) has placed educational institutions in the forefront in educating and training students as skilled consumers, engineers, and technicians of this widely used technology. Corporations that develop and use ICT are continually building a skilled workforce; however, because of the growth…

  1. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Experience in operating and developing a large-scale computerized system indicates that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified; then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases

  2. Activities of the NASA sponsored SRI technology applications team in transferring aerospace technology to the public sector

    Science.gov (United States)

    Berke, J. G.

    1971-01-01

    The organization and functions of an interdisciplinary team for the application of aerospace-generated technology to the solution of discrete technological problems within the public sector are presented. The interdisciplinary group formed at Stanford Research Institute, California, is discussed. The functions of the group are to develop and conduct a program not only optimizing the match between public sector technological problems in criminalistics, transportation, and the postal services and potential solutions found in the aerospace data base, but also ensuring that appropriate solutions are actually utilized. The work accomplished during the period from July 1, 1970 to June 30, 1971 is reported.

  3. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo…
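
    The measurement itself is simple to reproduce; below is a toy sketch of the kind of aspect-ratio tally the study describes. The (width, height) pairs are invented stand-ins for real database records:

        # Toy aspect-ratio survey over hypothetical (width, height) records.
        sizes = [(120, 80), (90, 60), (100, 100), (75, 50), (200, 125)]

        ratios = sorted(w / h for w, h in sizes)
        mean = sum(ratios) / len(ratios)
        median = ratios[len(ratios) // 2]
        near_square = sum(1 for r in ratios if abs(r - 1.0) < 0.01) / len(ratios)
        print(f"mean={mean:.3f} median={median:.3f} near-square share={near_square:.0%}")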

  4. Reports on the research projects in the field of reactor safety sponsored by the Federal Ministry of Science and Technology

    International Nuclear Information System (INIS)

    1975-12-01

    Investigations on the safety of Light Water Reactors (LWR) being performed in the framework of the safety program 'Reactor Safety' are sponsored by the Bundesminister fuer Forschung und Technologie (BMFT - Secretary of State for Research and Technology). The objective of this program is to continue improving the safety of LWR in order to minimize the risk for the environment. With grant assistance from the Bundesminister des Innern (BMI - Secretary of State for Home Affairs), research contracts in the field of reactor safety are being performed. Results of these projects should contribute to resolving questions arising in nuclear licensing procedures. The Forschungsbetreuung (FB - research supervision department) at the Institute for Reactor Safety (IRS), as consultants to BMFT and BMI, provides information about the progress of investigations. Individual reports will be prepared and put into standard forms by the research contractors. Each report gives information on: 1) the work accomplished, 2) the results obtained, 3) the work planned to be continued. Initial reports of research projects describe in addition the purpose of the work. A BMFT research program on the safety of Fast Breeders (Schneller Brutreaktor - SBR) is presently under discussion. In order to define several problems, the investigations included in the present compilation (RS 139, 140, 143, 162) will be performed beforehand. (orig.) [de

  5. Reports on the research projects in the field of reactor safety sponsored by the Federal Ministry of Science and Technology

    International Nuclear Information System (INIS)

    1976-12-01

    Investigations on the safety of light water reactors (LWR) being performed in the framework of the safety program 'Reactor Safety' are sponsored by the Bundesminister fuer Forschung und Technologie (BMFT - Secretary of State for Research and Technology). The objective of this program is to continue improving the safety of LWR in order to minimize the risk for the environment. With grant assistance from the Bundesminister des Innern (BMI - Secretary of State for Home Affairs), research contracts in the field of reactor safety are being performed. Results of these projects should contribute to resolving questions arising in nuclear licensing procedures. The Forschungsbetreuung (FB - research supervision department) at the Institute for Reactor Safety (IRS), as consultants to BMFT and BMI, provides information about the progress of investigations. Individual reports will be prepared and put into standard forms by the research contractors. Each report gives information on: 1) the work accomplished, 2) the results obtained, 3) the work planned to be continued. Initial reports of research projects describe in addition the purpose of the work. A BMFT research program on the safety of Fast Breeders (Schneller Brutreaktor - SBR) is presently under discussion. In order to define several problems, the investigations included in the present compilation (RS 139, 140, 143, 162) will be performed beforehand. (orig.) [de

  6. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects such as the construction of roads, railways and other civil engineering (water)works are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  7. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165 Ondřejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöhe Solar Observatory of the University of Graz, A-9521 Treffen, Austria. e-mail: schroll@solobskh.ac.at.

  8. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
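
    As context for the abstract above, the core of a direct sensitivity calculation for a structural system K(k) u = f is differentiating the equilibrium equation, which gives K du/dk = -(dK/dk) u. A minimal sketch follows, using a made-up two-spring example; the matrices and values are assumptions for illustration, not taken from the paper:

        # Direct sensitivity for K(k) u = f: solve K du/dk = -(dK/dk) u.
        import numpy as np

        def K(k):
            k2 = 3.0                          # fixed second spring stiffness
            return np.array([[k + k2, -k2],   # two springs in series; the design
                             [-k2, k2]])      # variable k is the first stiffness

        dK_dk = np.array([[1.0, 0.0], [0.0, 0.0]])
        f = np.array([0.0, 1.0])              # unit load at the free node
        k = 2.0
        u = np.linalg.solve(K(k), f)
        du_direct = np.linalg.solve(K(k), -dK_dk @ u)
        du_fd = (np.linalg.solve(K(k + 1e-6), f) - u) / 1e-6   # finite-difference check
        print(du_direct, du_fd)               # the two should agree closely

    The attraction for reanalysis is that the factorization of K can be reused for both the displacement solve and the sensitivity solve.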

  9. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    The question is posed which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  10. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biassed galaxy formation, are all discussed. (UK)
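
    For orientation, the linear stage of the perturbation evolution mentioned above is governed by a standard textbook equation, quoted here as general background rather than as the paper's own result; in LaTeX:

        % Linear growth of the density contrast \delta = \delta\rho / \bar{\rho}
        % in an expanding background with Hubble rate H and mean density \bar{\rho}:
        \ddot{\delta} + 2H\,\dot{\delta} - 4\pi G\,\bar{\rho}\,\delta = 0
        % In an Einstein-de Sitter universe this gives a growing mode
        % \delta_+ \propto t^{2/3} and a decaying mode \delta_- \propto t^{-1}.

    The nonlinear regime discussed in the paper (e.g. the Zel'dovich solution) takes over once the density contrast approaches unity.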

  11. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  12. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  13. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  14. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  15. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  16. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs
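
    For background, the scalar Lyapunov condition that such results generalise is standard; the LaTeX sketch below is textbook material, not the paper's own cone-valued statement:

        % Sufficient condition for uniform asymptotic stability of \dot{x} = f(t, x):
        % there exist class-K functions \alpha_1, \alpha_2, \alpha_3 and V(t, x) with
        \alpha_1(\|x\|) \le V(t,x) \le \alpha_2(\|x\|),
        \qquad
        \dot{V}(t,x) \le -\alpha_3(\|x\|).
        % The cone-valued approach replaces the scalar V by a vector-valued
        % function ordered by a cone, relaxing the componentwise requirements
        % for interconnected subsystems.

    Connective stability additionally requires such estimates to hold under all admissible interconnection patterns among the subsystems.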

  17. Large-scale fuel cycle centres

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The US Nuclear Regulatory Commission (NRC) has considered the nuclear energy centre concept for fuel cycle plants in the Nuclear Energy Centre Site Survey 1975 (NECSS-75) Rep. No. NUREG-0001, an important study mandated by the US Congress in the Energy Reorganization Act of 1974 which created the NRC. For this study, the NRC defined fuel cycle centres as consisting of fuel reprocessing and mixed-oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle centre sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000MW(e). The types of fuel cycle facilities located at the fuel cycle centre permit the assessment of the role of fuel cycle centres in enhancing the safeguarding of strategic special nuclear materials - plutonium and mixed oxides. Siting fuel cycle centres presents a smaller problem than siting reactors. A single reprocessing plant of the scale projected for use in the USA (1500-2000t/a) can reprocess fuel from reactors producing 50,000-65,000MW(e). Only two or three fuel cycle centres of the upper limit size considered in the NECSS-75 would be required in the USA by the year 2000. The NECSS-75 fuel cycle centre evaluation showed that large-scale fuel cycle centres present no real technical siting difficulties from a radiological effluent and safety standpoint. Some construction economies may be achievable with fuel cycle centres, which offer opportunities to improve waste-management systems. Combined centres consisting of reactors and fuel reprocessing and mixed-oxide fuel fabrication plants were also studied in the NECSS. Such centres can eliminate shipment not only of Pu but also of mixed-oxide fuel. Increased fuel cycle costs result from implementation of combined centres unless the fuel reprocessing plants are commercial-sized. Development of Pu-burning reactors could reduce any economic penalties of combined centres. The need for effective fissile

  18. FY 1991 Research and development project for large-scale industrial technologies. Report on results of R and D of superhigh technological machining systems; 1991 nendo chosentan kako system no kenkyu kaihatsu seika hokokusho. Chosentan kako system no kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

    Described herein are the FY 1991 results of the R and D project aimed at establishing superprecision machining technologies and nano-technologies aided by excited beams. The research on superprecision machining involves the trial design and development of a totally-static-pressure-type positioning device, which adopts automatically controlled drawing to improve its rigidity. The research on surface modification technologies aided by ion beams involves scanning the ion beams over a metallic plate placed around the glass substrate; the results indicate that the secondary electrons generated can be used to control charge-up. In addition, part of a 30 cm square glass substrate is modified by spot-type ion implantation at high current density, and the modified portion is used to produce a thin-film silicon transistor. The research on superhigh-technology machining standard measurement involves improving the precision of a dye-laser-aided system, which attains a precision of 0 to 30 nm over a 0.1 m measurement range. (NEDO)

  19. Large-scale innovation and change in UK higher education

    Directory of Open Access Journals (Sweden)

    Stephen Brown

    2013-09-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ technology to deliver such changes. Key lessons that emerged from these experiences are reviewed covering themes of pervasiveness, unofficial systems, project creep, opposition, pressure to deliver, personnel changes and technology issues. The paper argues that collaborative approaches to project management offer greater prospects of effective large-scale change in universities than either management-driven top-down or more champion-led bottom-up methods. It also argues that while some diminution of control over project outcomes is inherent in this approach, this is outweighed by potential benefits of lasting and widespread adoption of agreed changes.

  20. Best Practices in the Evaluation of Large-scale STEM-focused Events: A Review of Recent Literature

    Science.gov (United States)

    Shebby, S.; Cobb, W. H.; Buxner, S.; Shipp, S. S.

    2015-12-01

    Each year, the National Aeronautics and Space Administration (NASA) sponsors a variety of educational events to share information with educators, students, and the general public. Intended outcomes of these events include increased interest in and awareness of the mission and goals of NASA. Events range in size from relatively small family science nights at a local school to large-scale mission and celestial event celebrations involving thousands of members of the general public. To support community members in designing event evaluations, the Science Mission Directorate (SMD) Planetary Science Forum sponsored the creation of a Best Practices Guide. The guide was generated by reviewing published large-scale event evaluation reports; however, the best practices described within are pertinent for all event organizers and evaluators regardless of event size. Each source included in the guide identified numerous challenges to conducting their event evaluation. These included difficulty in identifying extant instruments or items, collecting representative data, and disaggregating data to inform different evaluation questions. Overall, the guide demonstrates that evaluations of large-scale events are generally done at a very basic level, with the types of data collected limited to observable demographic information and participant reactions collected via online survey. In addition to these findings, this presentation will describe evaluation best practices that will help practitioners move beyond these basic indicators and examine how to make the evaluation process an integral—and valuable—element of event planning, ultimately informing event outcomes and impacts. It will provide detailed information on five recommendations presented in the guide: 1) consider evaluation methodology, including data analysis, in advance; 2) design data collection instruments well in advance of the event; 3) collect data at different times and from multiple sources; 4) use

  1. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the 'Large-scale Structure of the Universe' symposium are considered on a popular level. Described are the cell structure of galaxy distribution in the Universe and the principles of mathematical modelling of galaxy distribution. Images of cell structures obtained after computer processing are given. Three hypotheses, vortical, entropic and adiabatic, suggesting different processes of galaxy and galaxy-cluster origin, are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. The relict radiation, as a method of directly studying the processes taking place in the Universe, is considered. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to the study of the hot gas contained in galaxy clusters, and of the interactions within galaxy clusters and with the inter-galaxy medium, is recognized to be a notable contribution to the development of theoretical and observational cosmology

  2. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  3. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  4. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  5. Methods for Large-Scale Nonlinear Optimization.

    Science.gov (United States)

    1980-05-01

    STANFORD, CALIFORNIA 94305. METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright...typical iteration can be partitioned so that where B is an m x m basis matrix. This partition effectively divides the variables into three classes... attention is given to the standard of the coding or the documentation. A much better way of obtaining mathematical software is from a software library

  6. Large scale inhomogeneities and the cosmological principle

    International Nuclear Information System (INIS)

    Lukacs, B.; Meszaros, A.

    1984-12-01

    The compatibility of cosmological principles and possible large-scale inhomogeneities of the Universe is discussed. It seems that the strongest symmetry principle which is still compatible with reasonable inhomogeneities is a full conformal symmetry in the 3-space defined by the cosmological velocity field; but even in such a case, the standard model is isolated from the inhomogeneous ones when the whole evolution is considered. (author)

  7. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamic and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamic and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  8. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  9. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  10. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt*, Patrick Hulin†, Tim Leek†, Fredrich Ulrich†, Ryan Whelan† (Authors listed...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag...low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  11. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  12. Large-scale assembly of colloidal particles

    Science.gov (United States)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes an invention in large-area and low-cost color reflective displays. This invention is inspired by the heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  13. FY 1992 research and development project for large-scale industrial technologies. Report on results of R and D of superhigh technological machining systems; 1992 nendo chosentan kako system no kenkyu kaihatsu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-03-01

    Described herein are the FY 1992 results of the R and D project aimed at establishing the technologies for developing, e.g., machine and electronic device members of superhigh precision and high functionality by superhigh-precision machining aided by excited beams. The elementary research on superhigh-precision machining achieves the given targets for precision stability of the feed positioning device. The trial development of high-precision rotating devices is directed at improving the rotational precision of pneumatic static pressure bearings and magnetism correction/control circuits, increasing the speed and precision of 3-point-type rotational precision measurement methods, and developing rotation-driving motors, achieving a rotational precision of 0.015 μm at 2000 rpm. The research on surface modification technologies aided by ion beams involves experiments for producing crystalline Si films, and thin-film transistors from those Si films, using the surface-modified portion of a large-size glass substrate. The research on superhigh-technology machining standard measurement involves development of length-measuring systems aided by a dye laser, achieving a precision of ±10 nm or less in a 100 mm measurement range. (NEDO)

  14. Survey and research for the enhancement of large-scale technology development 3. Patent researches on new tasks for development under large-scale project; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 3. Ogata project shinki kaihatsu tema ni kansuru tokkyo chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    Regarding 'high-speed computation systems for technological use' and 'manganese nodule mining systems,' technological trends are surveyed from the viewpoint of patents. As for patent applications involving the Josephson effect device, there are 79 patents disclosed in Japan, with applications from Japan peaking in 1977-1978 and those from overseas in 1974-1975. Among important applicants, IBM distinguishes itself overseas while, in Japan, the Nippon Telegraph and Telephone Public Corporation accounts for 47% and Fujitsu, Ltd. for 34%. In the case of GaAs-based transistors, businesses in Japan account for as much as 90% of the applications, overwhelming overseas businesses, which account for less than 10%. As for the patents on manganese nodule mining systems, 183 Japanese patents are pending, with 88 already granted in America. While the main interest in Japan has shifted from the continuous elevator bucket system of 1971-1974 to the fluid dredge system, the fluid dredge system has consistently held the overwhelming majority in America. (NEDO)

  15. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10^6 m^3/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet) including liquid fuels with no net greenhouse gas emissions and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen, and (3) create the industrial partnerships to commercialize such technologies. (author)
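
    A back-of-envelope sketch of the delivered-cost framing used here (all figures are illustrative assumptions, not values from the paper): delivered cost is the sum of production, storage, and distribution, evaluated for a large user at 10^6 m^3/day.

      # Hydrogen at standard conditions is roughly 0.09 kg/m^3,
      # so 1e6 m^3/day is about 90 t/day.
      demand_kg_day = 1e6 * 0.09

      # Cost components in $/kg -- placeholder assumptions.
      centralized = {"production": 1.50, "storage": 0.30, "distribution": 0.40}
      distributed = {"production": 1.50, "storage": 0.20, "collection": 0.80}

      for name, parts in (("centralized", centralized), ("distributed", distributed)):
          total = sum(parts.values())
          print(f"{name}: {total:.2f} $/kg -> {total * demand_kg_day / 1e3:.0f} k$/day")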

  16. Stability and Control of Large-Scale Dynamical Systems A Vector Dissipative Systems Approach

    CERN Document Server

    Haddad, Wassim M

    2011-01-01

    Modern complex large-scale dynamical systems exist in virtually every aspect of science and engineering, and are associated with a wide variety of physical, technological, environmental, and social phenomena, including aerospace, power, communications, and network systems, to name just a few. This book develops a general stability analysis and control design framework for nonlinear large-scale interconnected dynamical systems, and presents the most complete treatment on vector Lyapunov function methods, vector dissipativity theory, and decentralized control architectures. Large-scale dynami

  17. Large-scale fuel cycle centers

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The United States Nuclear Regulatory Commission (NRC) has considered the nuclear energy center concept for fuel cycle plants in the Nuclear Energy Center Site Survey - 1975 (NECSS-75) -- an important study mandated by the U.S. Congress in the Energy Reorganization Act of 1974 which created the NRC. For the study, NRC defined fuel cycle centers to consist of fuel reprocessing and mixed oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle center sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000 - 300,000 MWe. The types of fuel cycle facilities located at the fuel cycle center permit the assessment of the role of fuel cycle centers in enhancing safeguarding of strategic special nuclear materials -- plutonium and mixed oxides. Siting of fuel cycle centers presents a considerably smaller problem than the siting of reactors. A single reprocessing plant of the scale projected for use in the United States (1500-2000 MT/yr) can reprocess the fuel from reactors producing 50,000-65,000 MWe. Only two or three fuel cycle centers of the upper limit size considered in the NECSS-75 would be required in the United States by the year 2000. The NECSS-75 fuel cycle center evaluations showed that large scale fuel cycle centers present no real technical difficulties in siting from a radiological effluent and safety standpoint. Some construction economies may be attainable with fuel cycle centers; such centers offer opportunities for improved waste management systems. Combined centers consisting of reactors and fuel reprocessing and mixed oxide fuel fabrication plants were also studied in the NECSS. Such centers can eliminate not only shipment of plutonium, but also mixed oxide fuel. Increased fuel cycle costs result from implementation of combined centers unless the fuel reprocessing plants are commercial-sized. Development of plutonium-burning reactors could reduce any

  18. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes an evaluation method for the faultless function of large scale integration circuits (LSI) and very large scale integration circuits (VLSI). The article presents a comparative analysis of the factors which determine the faultlessness of integrated circuits, an analysis of already existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for analysis of the fault rate in LSI and VLSI circuits.
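
    The article's own algorithm is not reproduced here, but the simplest fault-rate model such evaluations build on can be sketched as follows (the per-gate hazard rate and gate counts are assumed): with N gates each failing at constant rate lam, a series fault model gives chip reliability R(t) = exp(-N*lam*t).

      import math

      def chip_reliability(n_gates: int, lam_per_gate: float, hours: float) -> float:
          """Probability of faultless function over `hours` (series fault model)."""
          return math.exp(-n_gates * lam_per_gate * hours)

      # Assumed hazard rate of 1e-12 failures per gate-hour (illustrative).
      for n in (10**4, 10**6, 10**8):  # LSI through VLSI gate counts
          print(f"{n:>9} gates: R(10 years) = {chip_reliability(n, 1e-12, 87600):.4f}")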

  19. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    One of the best ways for agriculture to become independent from shortages of precipitation is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province. The average size of a sprinkler system reached 95 ha. In 1989 there were 98 sprinkler systems, and the area equipped with them was more than 10 130 ha. The study was conducted on 7 large sprinkler systems with areas ranging from 230 to 520 hectares in 1986-1998. After the introduction of the market economy in the early 90s and ownership changes in agriculture, large-scale sprinkler systems underwent significant or total devastation. Land on the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs. There has also been a threefold increase in electricity prices. Operation of large-scale irrigation encountered all kinds of practical barriers and limitations: system-design constraints, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. An inspection of the local area showed the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of assets previously invested in the sprinkler systems.

  20. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  1. Fiscal 1996 large-scale industrial technology R and D project report. R and D on processing technology for creating advanced functions; 1996 nendo senshin kino soshutsu kako gijutsu no kenkyu kaihatsu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This report summarizes the fiscal 1996 development results on technology for creating advanced bio-affinity materials. A functionally gradient artificial tooth root with fine structure was fabricated by the following processes. The surface of a cylindrical Ti rod was first dry-jet-sprayed with ultrafine particles prepared by RF plasma, where the composition was changed continuously from Ti to alumina in the radial direction to form an FGM layer. The so-formed green composite was then sintered while heating the Ti side and the alumina side at 1400 K and 1800 K, respectively. Compressive strength over 200 MPa, durability of 10^7 stress cycles at 1000 N, and adhesion strength over 65 MPa to substrates were obtained. As the outermost surface of the composite was coated with hydroxyapatite by plasma spraying, cell growth on the surface was confirmed without any contamination by heavy metals. This material is suitable for dental use in its mechanical and chemical properties. A study was made on non-destructive analysis of FGM. A slant-angle injection method was studied to increase the instrumental resolving power of Rutherford backscattering spectrometers. Composition modification was analyzed quantitatively at the nm level. Positron annihilation Doppler broadening was also measured. (NEDO)

  2. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.
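
    To make the scaling comparison concrete, here is a small sketch using the textbook diameter formulas for topologies such a review typically covers (node counts are assumed; the torus figure is the usual ~sqrt(N) approximation):

      import math

      def diameters(n: int) -> dict:
          """Approximate network diameter (worst-case hops) for n nodes."""
          side = round(n ** 0.5)
          return {
              "ring": n // 2,                        # grows linearly with n
              "2-D torus": side,                     # ~sqrt(n)
              "hypercube": math.ceil(math.log2(n)),  # ~log2(n)
          }

      for n in (1_000, 1_000_000):
          print(n, diameters(n))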

  3. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives are overlapped within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
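
    A minimal sketch of the selection rule the technique implies: pick a rendering style per screen region from its node density. The thresholds and style names below are assumptions for illustration, not values from the paper.

      def pick_style(nodes_in_region: int, region_px: int) -> str:
          """Choose a graph style from node density (nodes per pixel)."""
          density = nodes_in_region / region_px
          if density < 0.001:
              return "full"       # draw every node and link
          if density < 0.01:
              return "clustered"  # collapse subtrees into glyphs
          return "heatmap"        # density shading only

      print(pick_style(50, 200_000))      # sparse region -> full
      print(pick_style(1_500, 200_000))   # medium region -> clustered
      print(pick_style(80_000, 200_000))  # dense region  -> heatmap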

  4. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos

  5. Puzzles of large scale structure and gravitation

    International Nuclear Information System (INIS)

    Sidharth, B.G.

    2006-01-01

    We consider the puzzle of cosmic voids bounded by two-dimensional structures of galactic clusters, as well as a puzzle pointed out by Weinberg: How can the mass of a typical elementary particle depend on a cosmic parameter like the Hubble constant? An answer to the first puzzle is proposed in terms of 'scaled' quantum-mechanics-like behaviour which appears at large scales. The second puzzle can be answered by showing that the gravitational mass of an elementary particle has a Machian character (see Ahmed N., Cantorian small world, Mach's principle and the universal mass network. Chaos, Solitons and Fractals 2004;21(4))

  6. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  7. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a breakeven price level. Norway possesses vast energy resources and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  8. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...
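
    A self-contained sketch of the regularization idea at stake (truncated SVD on a synthetic discrete ill-posed problem; the thesis's Krylov methods and the MOORe Tools toolbox are not reproduced):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 50
      # Synthetic ill-posed problem: singular values decay over ten decades.
      U, _ = np.linalg.qr(rng.standard_normal((n, n)))
      V, _ = np.linalg.qr(rng.standard_normal((n, n)))
      s = 10.0 ** -np.linspace(0.0, 10.0, n)
      A = U @ np.diag(s) @ V.T
      x_true = V[:, :10] @ np.ones(10)  # smooth solution in the leading modes
      b = A @ x_true + 1e-6 * rng.standard_normal(n)

      def tsvd_solve(A, b, k):
          """Regularize by keeping only the k largest singular triplets."""
          U, s, Vt = np.linalg.svd(A)
          return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

      for k in (10, 25, 50):
          err = np.linalg.norm(tsvd_solve(A, b, k) - x_true) / np.linalg.norm(x_true)
          print(f"k={k:2d}: relative error {err:.1e}")  # too large a k amplifies noise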

  9. Large scale phononic metamaterials for seismic isolation

    International Nuclear Information System (INIS)

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-01-01

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials
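
    The paper's 3-D FDTD computations are not reproduced here, but a 1-D analogue shows how such band gaps arise in a periodic concrete/steel stack: the classical transfer-matrix (Rytov) dispersion relation cos(q d) = cos(k1 d1) cos(k2 d2) - (1/2)(Z1/Z2 + Z2/Z1) sin(k1 d1) sin(k2 d2) admits no real Bloch wavenumber q where the right-hand side exceeds 1 in magnitude. The layer thicknesses below are illustrative assumptions chosen to push the first gap into the seismic band.

      import numpy as np

      c1, rho1, d1 = 3500.0, 2400.0, 50.0  # concrete: m/s, kg/m^3, m (assumed)
      c2, rho2, d2 = 5900.0, 7800.0, 10.0  # steel (assumed)
      Z1, Z2 = rho1 * c1, rho2 * c2        # acoustic impedances

      f = np.linspace(1.0, 100.0, 5000)    # Hz, seismic range
      w = 2 * np.pi * f
      rhs = (np.cos(w * d1 / c1) * np.cos(w * d2 / c2)
             - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(w * d1 / c1) * np.sin(w * d2 / c2))
      in_gap = np.abs(rhs) > 1.0           # no propagating Bloch wave here
      edges = f[np.flatnonzero(np.diff(in_gap.astype(int)))]
      print("band-gap edges (Hz):", np.round(edges, 1))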

  10. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). These observational evidences have led us to a great deal of consensus on the cosmological model, the so-called LambdaCDM, and tight constraints on the cosmological parameters constituting the model. On the other hand, the advancement in cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, which is quantified by its direction and amplitude. We measured a huge dipolar modulation in the CMB, which mainly originated from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a number of well-identified objects. In this thesis, we explore measurement of the dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE sources matched with 2MASS. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for detection of the kinematic dipole in future surveys.
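
    A hedged sketch of the measurement itself (mock data, not the thesis pipeline with WISE/2MASS catalogs): fit N(nhat) = N0 (1 + D.nhat) to counts in sky pixels by least squares and read off the dipole D.

      import numpy as np

      rng = np.random.default_rng(1)
      npix = 3000
      # Roughly uniform random directions on the sphere.
      v = rng.standard_normal((npix, 3))
      nhat = v / np.linalg.norm(v, axis=1, keepdims=True)

      D_true = np.array([0.00, 0.01, 0.02])  # injected dipole (assumed)
      N0 = 500.0
      counts = rng.poisson(N0 * (1.0 + nhat @ D_true))

      # Design matrix [1, nx, ny, nz]; coefficients are (N0, N0*D).
      X = np.hstack([np.ones((npix, 1)), nhat])
      coef, *_ = np.linalg.lstsq(X, counts, rcond=None)
      D_hat = coef[1:] / coef[0]
      print("recovered dipole:", np.round(D_hat, 4),
            "| amplitude:", round(float(np.linalg.norm(D_hat)), 4))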

  11. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities have little experience of how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There are a variety of measures suited to supporting universities in international recruitment. These include e.g. institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups, and identify interfaces through which project management, university administration, researchers and international partners can work together, exchange information and improve processes in order to recruit, support and keep the brightest heads in a project.

  12. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

    Highlights: ► We implement photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building user’s electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the potential of photovoltaic energy and to evaluate the local own consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter’s total electricity consumption and half of this generated electricity is directly used within the buildings.

  13. Large-scale Intelligent Transporation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
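
    A toy sketch of the "vehicles as autonomous processes" design (not the ANL code; the road network, route choice, and reporting protocol are assumed): each vehicle runs as its own process, picks links independently, and reports probe data to a TMC queue.

      import multiprocessing as mp
      import random

      def vehicle(vid: int, links: dict, tmc: mp.Queue) -> None:
          """One vehicle process: walk from A to D, reporting link times."""
          pos, t = "A", 0.0
          while pos != "D":
              nxt = random.choice(list(links[pos]))  # naive independent routing
              t += links[pos][nxt]
              tmc.put((vid, pos, nxt, round(t, 1)))  # probe-vehicle report
              pos = nxt

      if __name__ == "__main__":
          links = {"A": {"B": 2.0, "C": 3.0}, "B": {"D": 4.0}, "C": {"D": 1.5}}
          tmc = mp.Queue()
          procs = [mp.Process(target=vehicle, args=(i, links, tmc)) for i in range(3)]
          for p in procs:
              p.start()
          for p in procs:
              p.join()
          while not tmc.empty():
              print("TMC received:", tmc.get())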

  14. Pseudoscalar-photon mixing and the large scale alignment of QsO ...

    Indian Academy of Sciences (India)

    Pseudoscalar-photon mixing and the large scale alignment of QSO optical polarizations. PANKAJ JAIN, SUKANTA PANDA and S SARALA, Physics Department, Indian Institute of Technology, Kanpur 208 016, India. Abstract: We review the observation of large scale alignment of QSO optical polarizations.

  15. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  16. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
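
    A minimal sketch of the "reduce then sample" idea (toy scalar model, not the SAGUARO codes): Metropolis sampling evaluates a cheap reduced-order surrogate in place of the expensive forward model.

      import math, random

      def forward_full(x):     # stand-in for an expensive forward simulation
          return x ** 2

      def forward_reduced(x):  # cheap surrogate with a small model error (assumed)
          return x ** 2 + 0.01 * x ** 3

      y_obs, sigma = 4.0, 0.5

      def log_post(x, fwd):    # Gaussian likelihood, flat prior
          return -0.5 * ((fwd(x) - y_obs) / sigma) ** 2

      random.seed(0)
      x, chain = 1.0, []
      for _ in range(20_000):
          xp = x + 0.3 * random.gauss(0.0, 1.0)
          if math.log(random.random()) < (log_post(xp, forward_reduced)
                                          - log_post(x, forward_reduced)):
              x = xp
          chain.append(x)
      burned = chain[5_000:]
      print("posterior mean with surrogate:", round(sum(burned) / len(burned), 3))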

  17. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore

  18. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    Linton, M.; Khalatbari, A.

    2011-01-01

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large scale epidemiological study has been launched by the Fukushima medical university. It concerns the two million inhabitants of the Fukushima Prefecture. On the national level and with the support of public funds, medical care and follow-up, as well as systematic controls, are foreseen, notably to check the thyroids of 360,000 young people under 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low readings, and because they know that some parts of the area are at least as contaminated as the surroundings of Chernobyl, some people are reluctant to go back home

  19. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  20. Grid sensitivity capability for large scale structures

    Science.gov (United States)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.
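
    The validation step named here, comparing analytic sensitivities against the overall finite difference method, can be sketched in a few lines (a toy response in place of an MSC/NASTRAN model; the coefficient is assumed):

      def tip_deflection(length: float) -> float:
          """Toy response: cantilever tip deflection scales as L^3."""
          return 2.0e-3 * length ** 3

      L0, h = 1.0, 1.0e-6
      analytic = 3 * 2.0e-3 * L0 ** 2  # d(deflection)/dL of the toy response
      fd = (tip_deflection(L0 + h) - tip_deflection(L0 - h)) / (2 * h)
      print(f"analytic {analytic:.6e}  vs  central difference {fd:.6e}")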

  1. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car and an automatic data-acquisition system. The mechanical design of the device covers the optical axis, the drive unit, the fixture and the wheels. The control system comprises hardware and software: the hardware is based on a single-chip microcontroller, and the software implements the photoelectric autocollimation measurement and automatic data-acquisition processes. The device can acquire verticality measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  2. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority), considering the minimum area condition in cartographic generalization in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization using LIDAR DEM.
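
    A small sketch of the generalization step, a majority (modal) filter over a categorical landform raster akin to the Focal Statistics (Majority) tool used in the study (toy raster; the window size is an assumption):

      import numpy as np
      from scipy.ndimage import generic_filter

      landform = np.random.default_rng(2).integers(0, 4, size=(60, 60))  # 4 classes

      def majority(window: np.ndarray) -> int:
          """Most frequent class value in the moving window."""
          return int(np.bincount(window.astype(int)).argmax())

      generalized = generic_filter(landform, majority, size=5)
      changed = np.mean(landform != generalized)
      print(f"cells changed by generalization: {changed:.1%}")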

  3. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach is the IBA Emscher Park in the Ruhr area in Germany. Over a 10-year period (1988-1998), more than 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines... for setting the design brief in a large scale urban landscape in Norway, the Jaeren region around the city of Stavanger. In this paper, we first outline the methodological challenges and then present and discuss the proposed method based on our teaching experiences. On this basis, we discuss aspects...

  4. Large scale study of tooth enamel

    International Nuclear Information System (INIS)

    Bodart, F.; Deconninck, G.; Martin, M.T.

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces on a large scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analyzed using PIXE, backscattering and nuclear reaction techniques. The results were analyzed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population. (author)

  5. Testing Einstein's Gravity on Large Scales

    Science.gov (United States)

    Prescod-Weinstein, Chandra

    2011-01-01

    A little over a decade has passed since two teams studying high redshift Type Ia supernovae announced the discovery that the expansion of the universe was accelerating. After all this time, we're still not sure how cosmic acceleration fits into the theory that tells us about the large-scale universe: General Relativity (GR). As part of our search for answers, we have been forced to question GR itself. But how will we test our ideas? We are fortunate enough to be entering the era of precision cosmology, where the standard model of gravity can be subjected to more rigorous testing. Various techniques will be employed over the next decade or two in the effort to better understand cosmic acceleration and the theory behind it. In this talk, I will describe cosmic acceleration, current proposals to explain it, and weak gravitational lensing, an observational effect that allows us to do the necessary precision cosmology.

  6. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  7. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
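
    To show the problem class concretely, here is a small constrained problem solved with SciPy's SLSQP, an off-the-shelf SQP implementation for moderate sizes rather than the thesis's large-scale MINOS-based code (the objective and constraints are assumed examples):

      import numpy as np
      from scipy.optimize import minimize

      fun = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
      cons = (
          {"type": "eq",   "fun": lambda x: x[0] ** 2 + x[1] ** 2 - 5.0},  # nonlinear equality
          {"type": "ineq", "fun": lambda x: x[0] - 0.5},                   # x0 >= 0.5
      )
      res = minimize(fun, x0=np.array([2.0, 0.0]), method="SLSQP", constraints=cons)
      print(res.success, np.round(res.x, 4), round(res.fun, 4))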

  8. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and the complexity of biomedical data collected from various sources. Such planet-scale data poses serious challenges to storage and computing technologies. Cloud computing is a promising alternative because it addresses storage and high-performance computing on large-scale data together. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical research make this vast amount of diverse data meaningful and usable.

  9. Multidimensional quantum entanglement with large-scale integrated optics

    DEFF Research Database (Denmark)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong

    2018-01-01

    The ability to control multidimensional quantum systems is key for the investigation of fundamental science and for the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimension up to 15 × 15 on a large-scale silicon-photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality......

  10. Multidimensional quantum entanglement with large-scale integrated optics.

    Science.gov (United States)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G

    2018-04-20

    The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
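
    For intuition about what a 15 × 15 bipartite entangled system means, the numerical sketch below (our illustration, not the paper's analysis) builds the ideal d-dimensional maximally entangled state and checks that its entanglement entropy is log2(d) ≈ 3.9 bits for d = 15.

```python
import numpy as np

d = 15
C = np.eye(d) / np.sqrt(d)          # coefficients c_jk of |psi> = sum_jk c_jk |j>|k>;
                                    # here |psi> = (1/sqrt(d)) sum_k |k>|k>
rho_A = C @ C.conj().T              # reduced state of subsystem A: Tr_B |psi><psi|
ev = np.linalg.eigvalsh(rho_A)      # all eigenvalues equal 1/d for this state
entropy = -np.sum(ev * np.log2(ev))
print(entropy, np.log2(d))          # both ~3.907 bits: maximal for dimension 15
```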

  11. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) = v²/2 − M cos x − P cos k(x−t), which describes the motion of one particle in two electrostatic waves. A renormalization transformation T_r is described which acts as a microscope that focusses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, T_r yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of Cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation for describing the stability of a discrete family of cycles is derived. When combined with T_r, it allows one to prove the link between KAM tori and nearby cycles, conjectured by J. Greene, and, in particular, to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)
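
    Since the abstract derives Chirikov's standard map for the stochastic layers, a quick numerical illustration is possible. The sketch below iterates the standard map at a few stochasticity parameters around the well-known large-scale stochasticity threshold K ≈ 0.97; the initial conditions and diagnostic are our own choices, not taken from the paper.

```python
import numpy as np

def standard_map_orbit(theta, p, K, n=5000):
    """Iterate p' = p + K sin(theta), theta' = theta + p' (mod 2*pi)."""
    ps = np.empty(n)
    for i in range(n):
        p = p + K * np.sin(theta)
        theta = (theta + p) % (2.0 * np.pi)
        ps[i] = p
    return ps

# Below the threshold, orbits stay trapped between KAM tori; above it, the
# momentum can diffuse across phase space (large-scale stochasticity).
for K in (0.5, 0.97, 2.0):
    ps = standard_map_orbit(1.0, 0.5, K)
    print(f"K = {K}: momentum spread = {ps.std():.2f}")
```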

  12. Large scale molecular simulations of nanotoxicity.

    Science.gov (United States)

    Jimenez-Cruz, Camilo A; Kang, Seung-gu; Zhou, Ruhong

    2014-01-01

    The widespread use of nanomaterials in biomedical applications has been accompanied by an increasing interest in understanding their interactions with tissues, cells, and biomolecules, and in particular, how they might affect the integrity of cell membranes and proteins. In this mini-review, we present a summary of some of the recent studies on this important subject, especially from the point of view of large scale molecular simulations. The carbon-based nanomaterials and noble metal nanoparticles are the main focus, with additional discussions on quantum dots and other nanoparticles as well. The driving forces for adsorption of fullerenes, carbon nanotubes, and graphene nanosheets onto proteins or cell membranes are found to be mainly hydrophobic interactions and the so-called π-π stacking (between aromatic rings), while for the noble metal nanoparticles the long-range electrostatic interactions play a bigger role. More interestingly, there is also growing evidence showing that nanotoxicity can have implications for de novo design of nanomedicine. For example, the endohedral metallofullerenol Gd@C₈₂(OH)₂₂ is shown to inhibit tumor growth and metastasis by inhibiting the enzyme MMP-9, and graphene is illustrated to disrupt bacteria cell membranes by insertion/cutting as well as destructive extraction of lipid molecules. These recent findings have provided a better understanding of nanotoxicity at the molecular level and also suggested therapeutic potential by using the cytotoxicity of nanoparticles against cancer or bacteria cells. © 2014 Wiley Periodicals, Inc.

  13. Large-scale tides in general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Ip, Hiu Yan; Schmidt, Fabian, E-mail: iphys@mpa-garching.mpg.de, E-mail: fabians@mpa-garching.mpg.de [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the 'separate universe' paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  14. Large Scale EOF Analysis of Climate Data

    Science.gov (United States)

    Prabhat, M.; Gittens, A.; Kashinath, K.; Cavanaugh, N. R.; Mahoney, M.

    2016-12-01

    We present a distributed approach towards extracting EOFs from 3D climate data. We implement the method in Apache Spark, and process multi-TB sized datasets on O(1000-10,000) cores. We apply this method to latitude-weighted ocean temperature data from CFSR, a 2.2-terabyte data set comprising ocean and subsurface reanalysis measurements collected at 41 levels in the ocean, at 6-hour intervals over 31 years. We extract the first 100 EOFs of this full data set and compare to the EOFs computed simply on the surface temperature field. Our analyses provide evidence of Kelvin and Rossby waves and components of large-scale modes of oscillation including the ENSO and PDO that are not visible in the usual SST EOFs. Further, they provide information on the most influential parts of the ocean, such as the thermocline, that exist below the surface. Work is ongoing to understand the factors determining the depth-varying spatial patterns observed in the EOFs. We will experiment with weighting schemes to appropriately account for the differing depths of the observations. We also plan to apply the same distributed approach to the analysis of 3D atmospheric climate data sets, including multiple variables. Because the atmosphere changes on a quicker time-scale than the ocean, we expect that the results will demonstrate an even greater advantage to computing 3D EOFs in lieu of 2D EOFs.
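
    At its core, extracting EOFs reduces to a singular value decomposition of the centered, latitude-weighted data matrix; the distributed Spark implementation in the abstract scales that computation to multi-TB inputs. Below is a single-node sketch on synthetic data (the dimensions and the sqrt-of-cosine weighting scheme are illustrative assumptions, not the paper's configuration).

```python
import numpy as np

rng = np.random.default_rng(0)
ntime, nspace = 500, 2000
X = rng.standard_normal((ntime, nspace))        # anomaly data, time x space
lat = np.linspace(-80.0, 80.0, nspace)          # hypothetical latitude per column

X = X - X.mean(axis=0)                          # remove the time mean
Xw = X * np.sqrt(np.cos(np.deg2rad(lat)))       # area weighting by sqrt(cos(lat))

U, s, Vt = np.linalg.svd(Xw, full_matrices=False)
eofs = Vt[:100]                                 # first 100 EOFs (spatial patterns)
pcs = U[:, :100] * s[:100]                      # corresponding time series
explained = s**2 / np.sum(s**2)                 # variance fraction per mode
print(eofs.shape, explained[:3])
```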

  15. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, and collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally to the nonlinear regime, because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter

  16. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous
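
    The display-aware principle in this course description can be reduced to one step: for each visible tile, choose the resolution level at which one texel maps to roughly one screen pixel, so the working set scales with what is on screen rather than with the full data size. A minimal sketch of that level selection follows (the function and its parameters are our own illustration, not part of the course material).

```python
import math

def select_level(tile_extent_texels, screen_footprint_pixels, num_levels):
    """Coarsest level at which one texel still covers at most ~one pixel."""
    if screen_footprint_pixels <= 0:
        return num_levels - 1                       # off-screen: coarsest data
    ratio = tile_extent_texels / screen_footprint_pixels
    level = int(math.log2(ratio)) if ratio > 1.0 else 0
    return min(level, num_levels - 1)

print(select_level(4096, 256, 8))   # -> 4, since 4096 / 2**4 = 256 texels
```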

  17. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  18. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  19. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  20. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R and Ds of distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In these R and Ds, we have developed visualization technology suitable for distributed computing environments. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. Now, we have improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, which we developed in these R and Ds, is an MPI library executable in a heterogeneous computing environment. The improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  1. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice

  2. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling has advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  3. Large-scale Ising-machines composed of magnetic neurons

    Science.gov (United States)

    Mizushima, Koichi; Goto, Hayato; Sato, Rie

    2017-10-01

    We propose Ising-machines composed of magnetic neurons, that is, magnetic bits in a recording track. In large-scale machines, the sizes of both neurons and synapses need to be reduced, and neat and smart connections among neurons are also required to achieve all-to-all connectivity among them. These requirements can be fulfilled by adopting magnetic recording technologies such as race-track memories and skyrmion tracks because the area of a magnetic bit is almost two orders of magnitude smaller than that of static random access memory, which has normally been used as a semiconductor neuron, and the smart connections among neurons are realized by using the read and write methods of these technologies.
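
    The abstract concerns the hardware substrate (magnetic bits as neurons, recording-track read/write as all-to-all connections); the computational abstraction such a machine implements is minimization of an Ising energy. A software stand-in using simulated annealing is sketched below; the couplings, cooling schedule, and problem size are arbitrary illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
J = rng.standard_normal((n, n))
J = (J + J.T) / 2.0                       # symmetric all-to-all couplings
np.fill_diagonal(J, 0.0)
h = rng.standard_normal(n)
s = rng.choice([-1, 1], size=n)           # the "neurons": spins in {-1, +1}

def energy(s):
    return -0.5 * s @ J @ s - h @ s

T = 2.0
for step in range(20000):
    i = rng.integers(n)
    dE = 2.0 * s[i] * (J[i] @ s + h[i])   # energy change if spin i is flipped
    if dE < 0.0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]
    T = max(0.01, T * 0.9997)             # geometric cooling schedule

print(energy(s))                          # a (near-)minimal Ising energy
```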

  4. FY 1998 Report on development of large-scale wind power generation systems. Feasibility study on development of new technologies for wind power generation (Study on the development of wind power generation); 1998 nendo ogata furyoku hatsuden system kaihatsu. Furyoku hatsuden shingijutsu kaihatsu kanosei chosa (furyoku hatsuden gijutsu ni kansuru kaihatsu doko chosa)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This survey analyzes the current status of large-scale wind power generation devices/system technologies and development trends worldwide, and makes predictions about future developments, in an effort to contribute to advancements in new technology for wind power generation systems in Japan. The international R and D cooperation programs promoted by the IEA and the EU have helped participants produce good results at lower costs. The European countries, supported by governmental subsidy policies, have developed wind power generation industries in each region and are leading the world. Systems are becoming larger, with average unit capacity growing from around 250 kW at the beginning of the 1990s to 600 kW now, reducing cost through economies of scale. Improved computer capacity has made it possible to analyze more easily the complicated rotor aerodynamics, structural dynamics, wind characteristics and other factors related to wind power generation systems. Future R and D directions will include world standards for large-scale wind turbines, advancements in wind farm technologies, offshore wind power generation systems, advancements in design technologies, and new concepts for wind turbine designs, e.g., the floating wind turbine. (NEDO)

  5. Annual report on reactor safety research projects sponsored by the Minister for Research and Technology of the Federal Republic of Germany 1989

    International Nuclear Information System (INIS)

    1990-08-01

    Investigations on the safety of light water reactors (LWR), performed in the framework of the reactor safety research program of the Bundesminister fuer Forschung und Technologie (BMFT, Federal Minister for Research and Technology), are sponsored by the BMFT. The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems, and to further develop safety technology. Besides the LWR investigations, the BMFT also sponsors projects on the safety of advanced reactors. The Gesellschaft fuer Reaktorsicherheit (GRS, Society for Reactor Safety), by order of the BMFT, reports continuously on the status of such investigations through semi-annual and annual progress reports within the series GRS-F-Fortschrittsberichte (GRS-F-Progress Reports). Each progress report is a compilation of individual reports about objectives, the work performed, the results, the next steps of the work, etc. The individual reports are prepared in a standard form by the contractors themselves as documentation of their progress in the work and are published by the Forschungsbetreuung (FB, Research Coordination Department) at the GRS within the framework of general information on progress in reactor safety research. The individual reports are classified according to the same classification system as applied in the nuclear index of the CEC (Commission of the European Communities) and the OECD (Organization for Economic Cooperation and Development). The reports are arranged in sequence of their project numbers. (orig./HP)

  6. Survey and research report on how large-scale technological development should be in the future; Kongo no daikibo gijutsu kaihatsu no hoko ni tsuite no chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1980-03-01

    Mentioned as data processing technologies to be developed in the 1980s are a computer-assisted system of document translation, supercomputer, knowledge-based information system, and an automated Japanese language office work processing system. Mentioned in relation with the utilization of thermonuclear energy are atomic steel making, liquefied coal gasification, heavy oil gasification, and cracking of heavy oil into lighter oils. In relation with process conversion and new materials technologies, the development and application of functional membrane materials and the conversion of biomass into chemical materials are taken up. In relation with aviation and space technologies, a high-speed turboprop airplane, highly efficient aeroengine, ultralight aircraft, and a transport plane powered by methane or hydrogen are taken up. In relation with electronics or mechanics, an ultra-precision high order production system, automotive ceramic engine, industrial robot system, intelligent database machine system, and a transportable synchrotron are taken up. In addition, a number of subjects are mentioned in relation with social system technology, optoelectronic technology, resources and ocean development technologies, etc. (NEDO)

  7. Large scale dynamics of protoplanetary discs

    Science.gov (United States)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other hand, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the magnetic field and the disk gas is mediated by electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences on planetary formation. As a second step, I study the launching of disk winds via a global model of stratified disk embedded in a warm atmosphere. This model is the first to compute non-ideal effects from

  8. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; hide

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  9. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P [PA Energy, Malling (Denmark)]; Vedde, J [SiCon. Silicon and PV consulting, Birkeroed (Denmark)]

    2011-04-15

    Large scale PV (LPV) plants, plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of the global PV installations. In 2009 large scale PV plants with cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers for LPV projects, but as so far no real LPV projects have been processed, these findings have to be regarded as preliminary. The fast growing number of very large scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar elevation, PV installations on flat surfaces will have to balance the requirements of physical space and cost against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark is found in three main categories: PV installations on flat roofs of large commercial buildings, PV installations on other large scale infrastructure such as noise barriers, and ground mounted PV installations. The technical potential for all three categories is found to be significant, in the range of 50 - 250 km2. In terms of energy harvest, PV plants under Danish conditions exhibit an overall efficiency of about 10% in converting the energy content of the light, compared to about 0.3% for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark of 33-35 TWh is about 300 km2. The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines. A number of LPV plant scenarios have been investigated in detail based on real commercial offers and

  10. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    Science.gov (United States)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

    When it comes to large-scale mapping of limited areas, especially cultural heritage sites, requirements become demanding. Optical and non-optical sensors have been developed at sizes and weights that such platforms can lift, e.g. LiDAR units. At the same time there is increasing emphasis on solutions that enable users to access 3D information faster and more cheaply. Considering the multitude of platforms and cameras, and the advancement of algorithms in conjunction with the increase in available computing power, this challenge should be, and indeed is, further investigated. In this paper a short review of UAS technologies today is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The available on-board cameras are also compared and evaluated for large-scale mapping. Furthermore, a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequences of digital images, is conducted and presented. As test data set, we use a rich optical and thermal data set from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations, acquired using different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  11. State-of-the-art of large scale biogas plants

    International Nuclear Information System (INIS)

    Prisum, J.M.; Noergaard, P.

    1992-01-01

    A survey of the technological state of large scale biogas plants in Europe treating manure is given. 83 plants are in operation at present. Of these, 16 are centralised digestion plants. Transport costs at centralised digestion plants amount to between 25 and 40 percent of the total operational costs. Various transport equipment is used. Most large scale digesters are CSTRs, but serial, contact, 2-step, and plug-flow digesters are also found. Construction materials are mostly steel and concrete. Mesophilic digestion is most common (56%), thermophilic digestion is used in 17% of the plants, and combined mesophilic and thermophilic digestion is used in 28% of the centralised plants. Mixing of digester content is performed with gas injection, propellers, and gas-liquid displacement. Heating is carried out using external or internal heat exchangers. Heat recovery is only used in Denmark. Gas purification equipment is commonplace, but not often needed. Several plants use separation of the digested manure, often as part of a post-treatment/-purification process or for the production of 'compost'. Screens, sieve belt separators, centrifuges and filter presses are employed. The use of biogas varies considerably. In some cases, combined heat and power stations are supplying the grid and district heating systems. Other plants use only either the electricity or the heat. (au)

  12. Technology demonstrations in the Decontamination and Decommissioning Focus Area

    International Nuclear Information System (INIS)

    Bossart, S.J.

    1996-01-01

    This paper describes three large-scale demonstration projects sponsored jointly by the Decontamination and Decommissioning Focus Area (DDFA) and the three US Department of Energy (DOE) Operations Offices, each of which successfully offered to deactivate or decommission (D ampersand D) one of its facilities using a combination of innovative and commercial D ampersand D technologies. The paper also includes discussions on recent technology demonstrations for an Advanced Worker Protection System, an Electrohydraulic Scabbling System, and a Pipe Explorer trademark. The references at the conclusion of this paper should be consulted for more detailed information about the large-scale demonstration projects and recent technology demonstrations sponsored by the DDFA

  13. Large scale injection test (LASGIT) modelling

    International Nuclear Information System (INIS)

    Arnedo, D.; Olivella, S.; Alonso, E.E.

    2010-01-01

    Document available in extended abstract form only. With the objective of understanding gas flow processes through clay barriers in schemes of radioactive waste disposal, the Lasgit in situ experiment was planned and is currently in progress. Modelling the experiment will permit a better understanding of the responses, confirm hypotheses about mechanisms and processes, and provide lessons for the design of future experiments. The experiment and modelling activities are included in the project FORGE (FP7). The in situ large scale injection test Lasgit is currently being performed at the Aespoe Hard Rock Laboratory by SKB and BGS. A schematic layout of the test is shown. The deposition hole follows the KBS3 scheme. A copper canister is installed on the axis of the deposition hole, surrounded by blocks of highly compacted MX-80 bentonite. A concrete plug is placed at the top of the buffer. A metallic lid anchored to the surrounding host rock is included in order to prevent vertical movements of the whole system during gas injection stages (high gas injection pressures are expected to be reached). Hydration of the buffer material is achieved by injecting water through filter mats, two placed at the rock walls and two at the interfaces between bentonite blocks. Water is also injected through the 12 canister filters. Gas injection stages are performed by injecting gas into some of the canister injection filters. Since the water pressure and the stresses (swelling pressure development) will be high during gas injection, it is necessary to inject at high gas pressures. This implies mechanical couplings, as gas penetrates after the gas entry pressure is reached and may produce deformations which in turn lead to permeability increments. A 3D hydro-mechanical numerical model of the test using CODE-BRIGHT is presented. The domain considered for the modelling is shown. The materials considered in the simulation are the MX-80 bentonite blocks (cylinders and rings), the concrete plug

  14. Superconductivity for Large Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    R. Fair; W. Stautner; M. Douglass; R. Rajput-Ghoshal; M. Moscinski; P. Riley; D. Wagner; J. Kim; S. Hou; F. Lopez; K. Haran; J. Bray; T. Laskaris; J. Rochford; R. Duckworth

    2012-10-12

    A conceptual design has been completed for a 10MW superconducting direct drive wind turbine generator employing low temperature superconductors for the field winding. Key technology building blocks from the GE Wind and GE Healthcare businesses have been transferred across to the design of this concept machine. Wherever possible, conventional technology and production techniques have been used in order to support the case for commercialization of such a machine. Appendices A and B provide further details of the layout of the machine and the complete specification table for the concept design. Phase 1 of the program has allowed us to understand the trade-offs between the various sub-systems of such a generator and its integration with a wind turbine. A Failure Modes and Effects Analysis (FMEA) and a Technology Readiness Level (TRL) analysis have been completed resulting in the identification of high risk components within the design. The design has been analyzed from a commercial and economic point of view and Cost of Energy (COE) calculations have been carried out with the potential to reduce COE by up to 18% when compared with a permanent magnet direct drive 5MW baseline machine, resulting in a potential COE of 0.075 $/kWh. Finally, a top-level commercialization plan has been proposed to enable this technology to be transitioned to full volume production. The main body of this report will present the design processes employed and the main findings and conclusions.
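
    As a rough consistency check on those figures (our arithmetic, not a number taken from the report): a cost of energy of 0.075 $/kWh reached via an 18% reduction implies a baseline of roughly 0.075 / (1 - 0.18) ≈ 0.091 $/kWh for the 5MW permanent magnet direct drive machine.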

  15. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment......

  16. Large-scale experience with biological treatment of contaminated soil

    International Nuclear Information System (INIS)

    Schulz-Berendt, V.; Poetzsch, E.

    1995-01-01

    The efficiency of biological methods for the cleanup of soil contaminated with total petroleum hydrocarbons (TPH) and polycyclic aromatic hydrocarbons (PAH) was demonstrated by a large-scale example in which 38,000 tons of TPH- and PAH-polluted soil was treated onsite with the TERRAFERM reg-sign degradation system to reach the target values of 300 mg/kg TPH and 5 mg/kg PAH. Detection of the ecotoxicological potential (Microtox reg-sign assay) showed a significant decrease during the remediation. Low concentrations of PAH in the ground were treated by an in situ technology. The in situ treatment was combined with mechanical measures (slurry wall) to prevent the contamination from dispersing from the site

  17. Towards large-scale plasma-assisted synthesis of nanowires

    Science.gov (United States)

    Cvelbar, U.

    2011-05-01

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents a three-dimensional problem of large-scale synthesis connected with the time, quantity and quality of nanostructures. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  18. Planning under uncertainty solving large-scale stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G. [Stanford Univ., CA (United States). Dept. of Operations Research]|[Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft]

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but up to recently seemed to be intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results of large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
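
    To make the problem class concrete, the sketch below solves a toy two-stage stochastic LP with recourse (a newsvendor: choose an order quantity now; sales are the recourse decision once random demand is revealed) by plain sample averaging over scenarios. This shows only the problem structure; the Benders decomposition and importance-sampling machinery that make the large-scale case tractable, as described in the abstract, are not shown, and all numbers are arbitrary.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
cost, price, salvage = 1.0, 2.5, 0.5
demand = rng.uniform(50.0, 150.0, size=200)      # sampled scenarios, equal weight
K = len(demand)

# Variables: x (first-stage order), y_k (second-stage sales in scenario k).
# minimize (cost - salvage) * x - (price - salvage) * mean_k(y_k)
c = np.concatenate([[cost - salvage], np.full(K, -(price - salvage) / K)])
A_ub = np.zeros((2 * K, 1 + K))
b_ub = np.zeros(2 * K)
for k in range(K):
    A_ub[k, 0], A_ub[k, 1 + k] = -1.0, 1.0       # y_k <= x
    A_ub[K + k, 1 + k] = 1.0                     # y_k <= d_k
    b_ub[K + k] = demand[k]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + K))
print("optimal order:", res.x[0])                # ~ the 75% demand fractile
```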

  19. Challenges and options for large scale integration of wind power

    International Nuclear Information System (INIS)

    Tande, John Olav Giaever

    2006-01-01

    Challenges and options for large scale integration of wind power are examined. Immediate challenges are related to weak grids. Assessment of system stability requires numerical simulation. Models are being developed - validation is essential. Coordination of wind and hydro generation is a key to allowing more wind power capacity in areas with limited transmission corridors. For the case study grid, the allowed wind farm size increases from 50 to 200 MW depending on technology and control. The real-life example from 8 January 2005 demonstrates that existing market-based mechanisms can handle large amounts of wind power. In wind integration studies it is essential to take account of the controllability of modern wind farms, the power system flexibility and the smoothing effect of geographically dispersed wind farms. Modern wind farms contribute to system adequacy - combining wind and hydro constitutes a win-win system (ml)

  20. SIMON: Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Sugawara, Akihiro; Kishimoto, Yasuaki

    2003-01-01

    Development of the SIMON (SImulation MONitoring) system is described. SIMON aims to investigate many physical phenomena of tokamak-type nuclear fusion plasma by simulation, and to exchange information and carry out joint research with scientists around the world over the internet. The characteristics of SIMON are as follows: 1) reduced simulation load through a trigger sending method, 2) visualization of simulation results and a hierarchical structure of analysis, 3) fewer licenses needed by using the command line when software is used, 4) improved support for networked use of simulation data output through HTML (Hyper Text Markup Language), 5) avoidance of complex built-in work in the client part, and 6) small-sized and portable software. The visualization method for large scale simulation, the remote collaboration system based on HTML, the trigger sending method, the hierarchical analytical method, introduction into a three-dimensional electromagnetic transportation code, and the technologies of the SIMON system are explained. (S.Y.)

  1. Optimization of large-scale fabrication of dielectric elastomer transducers

    DEFF Research Database (Denmark)

    Hassouneh, Suzan Sager

    Dielectric elastomers (DEs) have gained substantial ground in many different applications, such as wave energy harvesting, valves and loudspeakers. For DE technology to be commercially viable, it is necessary that any large-scale production operation is nondestructive, efficient and cheap. Danfoss......-strength laminates to perform as monolithic elements. For the front-to-back and front-to-front configurations, conductive elastomers were utilised. One approach involved adding the cheap and conductive filler, exfoliated graphite (EG) to a PDMS matrix to increase dielectric permittivity. The results showed that even...... as conductive adhesives were rejected. Dielectric properties below the percolation threshold were subsequently investigated, in order to conclude the study. In order to avoid destroying the network structure, carbon nanotubes (CNTs) were used as fillers during the preparation of the conductive elastomers...

  2. USE OF RFID AT LARGE-SCALE EVENTS

    Directory of Open Access Journals (Sweden)

    Yuusuke KAWAKITA

    2005-01-01

    Radio Frequency Identification (RFID) devices and related technologies have received a great deal of attention for their ability to perform non-contact object identification. Systems incorporating RFID have been evaluated from a variety of perspectives. The authors constructed a networked RFID system to support event management at NetWorld+Interop 2004 Tokyo, an event that received 150,000 visitors. The system used multiple RFID readers installed at the venue and RFID tags carried by each visitor to provide a platform for running various management and visitor support applications. This paper presents the results of this field trial of RFID readability rates. It further addresses the applicability of RFID systems to visitor management, a problematic aspect of large-scale events.

  3. Large scale gas chromatographic demonstration system for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Cheh, C.H.

    1988-01-01

    A large scale demonstration system was designed for a throughput of 3 mol/day of an equimolar mixture of H, D, and T. The demonstration system was assembled and an experimental program carried out. The project was funded by Kernforschungszentrum Karlsruhe, Canadian Fusion Fuel Technology Projects and Ontario Hydro Research Division. Several major design innovations were successfully implemented in the demonstration system and are discussed in detail. Many experiments were carried out in the demonstration system to study its ability to separate hydrogen isotopes at high throughput. Various temperature programming schemes were tested, heart-cutting operation was evaluated, and very large (up to 138 NL/injection) samples were separated in the system. The results of the experiments showed that the specially designed column performed well as a chromatographic column and that good separation could be achieved even when a 138 NL sample was injected

  4. Large scale solar district heating. Evaluation, modelling and designing

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.

    2000-07-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build a simulation tool and to demonstrate the application of the tool for design studies and on a local energy planning case. The evaluation of the central solar heating technology is based on measurements from the case plant in Marstal, Denmark, and on published and unpublished data for other, mainly Danish, CSDHP plants. Evaluations of the thermal, economic and environmental performances are reported, based on the experiences from the last decade. The measurements from the Marstal case are analysed, experiences extracted and minor improvements to the plant design proposed. For the detailed design and energy planning of CSDHPs, a computer simulation model is developed and validated on the measurements from the Marstal case. The final model is then generalised to a 'generic' model for CSDHPs in general. The meteorological reference data, Danish Reference Year, is applied to find the mean performance for the plant designs. To find the expected variability of the thermal performance of such plants, a method is proposed in which data from a year with poor solar irradiation and a year with strong solar irradiation are applied. Equipped with the simulation tool, design studies are carried out, ranging from parameter analysis through energy planning for a new settlement to a proposal for the combination of plane solar collectors with high-performance solar collectors, exemplified by a trough solar collector. The methodology of utilising computer simulation proved to be a cheap and relevant tool in the design of future solar heating plants. The thesis also exposed the demand for developing computer models for the more advanced solar collector designs and especially for the control operation of CSDHPs. In the final chapter the CSDHP technology is put into perspective with respect to other possible technologies to find the relevance of the application

  5. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    survival and recruitment estimates from the French CES scheme to assess the relative contributions of survival and recruitment to overall population changes. He develops a novel approach to modelling survival rates from such multi-site data by using within-year recaptures to provide a covariate of between-year recapture rates. This provided parsimonious models of variation in recapture probabilities between sites and years. The approach provides promising results for the four species investigated and can potentially be extended to similar data from other CES/MAPS schemes. The final paper by Blandine Doligez, David Thomson and Arie van Noordwijk (Doligez et al., 2004) illustrates how large-scale studies of population dynamics can be important for evaluating the effects of conservation measures. Their study is concerned with the reintroduction of White Stork populations to the Netherlands where a re-introduction programme started in 1969 had resulted in a breeding population of 396 pairs by 2000. They demonstrate the need to consider a wide range of models in order to account for potential age, time, cohort and "trap-happiness" effects. As the data are based on resightings, such trap-happiness must reflect some form of heterogeneity in resighting probabilities. Perhaps surprisingly, the provision of supplementary food did not influence survival, but it may have had an indirect effect via the alteration of migratory behaviour. Spatially explicit modelling of data gathered at many sites inevitably results in starting models with very large numbers of parameters. The problem is often complicated further by having relatively sparse data at each site, even where the total amount of data gathered is very large. Both Julliard (2004) and Doligez et al. (2004) give explicit examples of problems caused by needing to handle very large numbers of parameters and show how they overcame them for their particular data sets. Such problems involve both the choice of appropriate

  6. Large-Scale Science Observatories: Building on What We Have Learned from USArray

    Science.gov (United States)

    Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.

    2015-12-01

    With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data management capability that received, archived and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey, rather than hypothesis-driven, mode of observation to generate comprehensive, high quality data on a large-scale for exploration and discovery; making data freely and openly available to any investigator from the very onset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations. Finally, community

  7. Food security through large scale investments in agriculture

    Science.gov (United States)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We

  8. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the administrator's workload is heavy, and much time must be spent on the management and maintenance of the system. The nodes in a large-scale cluster system easily fall into disorder, as thousands of nodes housed in big machine rooms are easily confused with one another by their managers. How can accurate management be carried out effectively on a large-scale cluster system? The article introduces ELFms for large-scale cluster systems and, furthermore, proposes a way to realize automatic management of such systems. (authors)
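    The reconciliation idea behind fabric management suites such as ELFms can be illustrated with a short sketch. Everything below (registry contents, node and service names) is hypothetical and merely shows the desired-state loop that such tools automate; it is not ELFms code.

```python
# Hedged sketch of the desired-state reconciliation loop that fabric
# management suites such as ELFms automate; the registry, node names,
# and service names are hypothetical placeholders, not ELFms code.

DESIRED = {                                  # central configuration registry
    "node001": {"batch", "monitoring"},      # services each node should run
    "node002": {"gridftp", "monitoring"},
}

OBSERVED = {                                 # stand-in for probing the nodes
    "node001": {"monitoring"},
    "node002": {"gridftp", "monitoring"},
}

def reconcile(node: str) -> set:
    """Report (and, in a real agent, repair) deviations from the registry."""
    missing = DESIRED[node] - OBSERVED.get(node, set())
    for svc in sorted(missing):
        print(f"{node}: would start {svc}")  # a real agent restarts the service
    return missing

for n in DESIRED:
    reconcile(n)                             # prints: node001: would start batch
```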

  9. Autonomous Sensors for Large Scale Data Collection

    Science.gov (United States)

    Noto, J.; Kerr, R.; Riccobono, J.; Kapali, S.; Migliozzi, M. A.; Goenka, C.

    2017-12-01

    Presented here is a novel implementation of a "Doppler imager" which remotely measures winds and temperatures of the neutral background atmosphere at ionospheric altitudes of 87-300 km and possibly above. It incorporates recent optical manufacturing developments, modern network awareness, and machine learning techniques for intelligent self-monitoring and data classification. This system achieves cost savings in manufacturing, deployment, and lifetime operating costs. Deployed in both ground- and space-based modalities, this cost-disruptive technology will allow computer models of ionospheric variability and other space weather models to operate with higher precision. Other sensors can be folded into the data collection and analysis architecture, easily creating autonomous virtual observatories. A prototype version of this sensor has recently been deployed in Trivandrum, India for the Indian Government. This Doppler imager is capable of operation even within the restricted CubeSat environment. The CubeSat bus offers a very challenging environment, even for small instruments. The limited size, weight, and power (SWaP) and the challenging thermal environment demand the development of a new generation of instruments; the Doppler imager presented is well suited to this environment. Concurrent with this CubeSat development is the development and construction of ground-based arrays of inexpensive sensors using the proposed technology. This instrument could be flown inexpensively on one or more CubeSats to provide valuable data to space weather forecasters and ionospheric scientists. Arrays of magnetometers have been deployed for the last 20 years [Alabi, 2005]. Other examples of ground-based arrays include an array of white-light all-sky imagers (THEMIS) deployed across Canada [Donovan et al., 2006], ocean sensors on buoys [McPhaden et al., 2010], and arrays of seismic sensors [Schweitzer et al., 2002]. A comparable array of Doppler imagers can be constructed and deployed on the

  10. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  11. Large-scale modelling of neuronal systems

    International Nuclear Information System (INIS)

    Castellani, G.; Verondini, E.; Giampieri, E.; Bersani, F.; Remondini, D.; Milanesi, L.; Zironi, I.

    2009-01-01

    The brain is, without any doubt, the most complex system of the human body. Its complexity is also due to the extremely high number of neurons, as well as the huge number of synapses connecting them. Each neuron is capable of performing complex tasks, like learning and memorizing a large class of patterns. The simulation of large neuronal systems is challenging for both technological and computational reasons, and can open new perspectives for the comprehension of brain functioning. A well-known and widely accepted model of bidirectional synaptic plasticity, the BCM model, is formulated as a differential equation approach based on bistability and selectivity properties. We have modified the BCM model, extending it from a single-neuron to a whole-network model. This new model is capable of generating interesting network topologies starting from a small number of local parameters describing the interaction between incoming and outgoing links from each neuron. We have characterized this model in terms of complex network theory, showing how this learning rule can be a support for network generation.
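    Since the abstract invokes the BCM rule without stating it, a minimal single-neuron sketch may help. The rate-based form below, with a sliding threshold that tracks the mean squared activity, is the textbook formulation; the parameter values and random input patterns are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal single-neuron sketch of the BCM plasticity rule. Activity is
# linear, y = w . x; the weight update is dw/dt = eta * y * (y - theta) * x,
# and the modification threshold theta slides toward the running average
# of y^2, which yields the bistability/selectivity properties mentioned
# above. All parameters and inputs are illustrative.

rng = np.random.default_rng(0)
n_inputs, eta, tau_theta, dt = 10, 1e-3, 50.0, 1.0

w = rng.normal(0.0, 0.1, n_inputs)        # synaptic weights
theta = 1.0                               # sliding modification threshold
patterns = rng.random((100, n_inputs))    # candidate input patterns

for step in range(10_000):
    x = patterns[rng.integers(len(patterns))]
    y = w @ x                                  # postsynaptic activity
    w += dt * eta * y * (y - theta) * x        # BCM weight update
    theta += dt * (y**2 - theta) / tau_theta   # threshold tracks <y^2>

print(w.round(3), round(theta, 3))
```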

  12. Large-Scale Pattern Discovery in Music

    Science.gov (United States)

    Bertin-Mahieux, Thierry

    This work focuses on extracting patterns in musical data from very large collections. The problem is split into two parts. First, we build such a large collection, the Million Song Dataset, to provide researchers access to commercial-size datasets. Second, we use this collection to study cover song recognition which involves finding harmonic patterns from audio features. Regarding the Million Song Dataset, we detail how we built the original collection from an online API, and how we encouraged other organizations to participate in the project. The result is the largest research dataset with heterogeneous sources of data available to music technology researchers. We demonstrate some of its potential and discuss the impact it already has on the field. On cover song recognition, we must revisit the existing literature since there are no publicly available results on a dataset of more than a few thousand entries. We present two solutions to tackle the problem, one using a hashing method, and one using a higher-level feature computed from the chromagram (dubbed the 2DFTM). We further investigate the 2DFTM since it has the potential to be a relevant representation for any task involving audio harmonic content. Finally, we discuss the future of the dataset and the hope of seeing more work making use of the different sources of data that are linked in the Million Song Dataset. Regarding cover songs, we explain how this might be a first step towards defining a harmonic manifold of music, a space where harmonic similarities between songs would be more apparent.
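    The 2DFTM feature mentioned above can be sketched compactly: take fixed-length patches of a chromagram, compute the 2D Fourier transform magnitude of each, and average. Discarding phase makes the feature invariant to pitch rotation (key transposition) and time offset. The patch length and the random chromagram below are illustrative stand-ins for real beat-aligned audio features.

```python
import numpy as np

# Hedged sketch of a 2DFTM-style feature: average 2D-FFT magnitudes over
# consecutive patches of a 12 x T chromagram to get one fixed-size,
# transposition- and offset-invariant vector per song.

def two_dftm(chroma: np.ndarray, patch_len: int = 75) -> np.ndarray:
    """Average 2D-FFT magnitude over consecutive chroma patches."""
    starts = range(0, chroma.shape[1] - patch_len + 1, patch_len)
    mags = [np.abs(np.fft.fft2(chroma[:, i:i + patch_len])) for i in starts]
    return np.mean(mags, axis=0).ravel()   # fixed-size song-level feature

chroma = np.random.rand(12, 600)           # placeholder 12-bin chromagram
print(two_dftm(chroma).shape)              # (12 * 75,) = (900,)
```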

  13. Reports on research projects in the field of reactor safety sponsored by the Federal Ministry for Education, Science, Research and Technology. Period covered: January 1 - June 30, 1997

    International Nuclear Information System (INIS)

    1997-01-01

    Within the framework of its research programme on reactor safety, the Bundesministerium fuer Bildung, Wissenschaft, Forschung und Technologie (BMBF) (Federal Ministry for Education, Science, Research and Technology) sponsors investigations into the safety of nuclear reactors. The investigations carried out within the framework of the programme are to provide fundamental knowledge, procedures and methods contributing to realistic safety assessments of nuclear facilities, the further development of safety technology, and the use of the potential of innovative safety-related approaches. Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, by order of the BMBF, continuously issues information on the status of such investigations by publishing semiannual and annual progress reports within the series of GRS-F-Fortschrittsberichte (GRS-F-Progress Reports). Each progress report represents a compilation of individual reports about the objectives, work performed, results, next steps of the work etc. The individual reports are prepared in a standard form by the research organisations themselves as documentation of their progress in work and are published by the Research Management Division of GRS within the framework of general information on the progress in reactor safety research. (orig./SR) [de

  14. Large Scale Experiments on Spacecraft Fire Safety

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Minster, Olivier; Fernandez-Pello, A. Carlos; Tien, James S.; Torero, Jose L.; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu

    2012-01-01

    developed by an international topical team that is collaboratively defining the experiment requirements and performing supporting analysis, experimentation and technology development. This paper presents the objectives, status and concept of this project.

  15. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package "ATLAS" has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)

  16. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  17. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords: sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction. Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  18. Parallel Index and Query for Large Scale Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Jerry; Wu, Kesheng; Ruebel, Oliver; Howison, Mark; Qiang, Ji; Prabhat,; Austin, Brian; Bethel, E. Wes; Ryne, Rob D.; Shoshani, Arie

    2011-07-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to processing of a massive 50TB dataset generated by a large-scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
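    The speedup reported above rests on bitmap indexing (FastBit). Below is a hedged sketch of the underlying idea, with illustrative bins and data; it shows none of FastBit's compression or parallel I/O.

```python
import numpy as np

# One bitmap per value bin; a range query then reduces to bitwise ORs over
# whole bins plus an explicit check of the single bin containing the
# threshold. Real FastBit adds WAH compression and parallel I/O.

energies = np.random.default_rng(3).exponential(1.0, size=1_000_000)
edges = np.quantile(energies, np.linspace(0, 1, 33))         # 32 bins
bin_ids = np.clip(np.searchsorted(edges, energies) - 1, 0, 31)
bitmaps = [bin_ids == b for b in range(32)]                  # bin membership

def query_greater(threshold: float) -> np.ndarray:
    """Boolean mask of rows with energy > threshold, using the bitmaps."""
    b0 = int(np.clip(np.searchsorted(edges, threshold) - 1, 0, 31))
    hits = np.zeros(len(energies), dtype=bool)
    for b in range(b0 + 1, 32):            # bins entirely above the threshold
        hits |= bitmaps[b]
    return hits | (bitmaps[b0] & (energies > threshold))     # edge-bin check

assert np.array_equal(query_greater(5.0), energies > 5.0)
```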

  19. HTS cables open the window for large-scale renewables

    International Nuclear Information System (INIS)

    Geschiere, A; Willen, D; Piga, E; Barendregt, P

    2008-01-01

    In a realistic approach to future energy consumption, the effects of sustainable power sources and the effects of growing welfare with increased use of electricity need to be considered. These factors lead to an increased transfer of electric energy over the networks. A dominant part of the energy need will come from expanded large-scale renewable sources. To use them efficiently over Europe, large energy transits between different countries are required. Bottlenecks in the existing infrastructure will be avoided by strengthening the network. For environmental reasons more infrastructure will be built underground. Nuon is studying the HTS technology as a component to solve these challenges. This technology offers a tremendously large power transport capacity as well as the possibility to reduce short circuit currents, making integration of renewables easier. Furthermore, power transport will be possible at lower voltage levels, giving the opportunity to upgrade the existing network while re-using it. This will result in large cost savings while reaching the future energy challenges. In a 6 km backbone structure in Amsterdam Nuon wants to install a 50 kV HTS Triax cable for a significant increase of the transport capacity, while developing its capabilities. Nevertheless several barriers have to be overcome

  20. Biotechnological lignite conversion - a large-scale concept

    Energy Technology Data Exchange (ETDEWEB)

    Reich-Walber, M.; Meyrahn, H.; Felgener, G.W. [Rheinbraun AG, Koeln (Germany). Fuel Technology and Lab. Dept.

    1997-12-31

    Concerning the research on biotechnological lignite upgrading, Rheinbraun's overall objective is the large-scale production of liquid and gaseous products for the energy and chemical/refinery sectors. The presentation outlines Rheinbraun's technical concept for electricity production on the basis of biotechnologically solubilized lignite. A first rough cost estimate, based on the assumptions described in detail in the paper and compared with the latest power plant generation, shows the general cost efficiency of this technology despite the additional costs of coal solubilization. The main reasons are low-cost process techniques for coal conversion on the one hand and cost reductions mainly in power plant technology (more efficient combustion processes and simplified gas clean-up) but also in coal transport (easy fuel handling) on the other hand. Moreover, it is hoped that an extended range of products will make it possible to widen the fields of lignite application. The presentation also points out that there is still a huge gap between this scenario and reality, owing to limited microbiological knowledge. To close this gap, Rheinbraun started a research project supported by the North-Rhine Westphalian government in 1995. Several leading biotechnological companies and institutes in Germany and the United States are involved in the project. The latest results of the current project will be presented in the paper. This includes fundamental research activities in the field of microbial coal conversion as well as investigations into bioreactor design and product treatment (dewatering, deashing and desulphurization). (orig.)

  1. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009
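    Among the keywords, limited-memory quasi-Newton methods are the most algorithmically self-contained; the sketch below is the textbook L-BFGS two-loop recursion, not code from the LSA package.

```python
import numpy as np

# Textbook L-BFGS two-loop recursion: compute -H_k @ grad implicitly from
# the m most recent steps s_k = x_{k+1} - x_k and gradient differences
# y_k = g_{k+1} - g_k, without ever forming the inverse Hessian H_k.

def lbfgs_direction(grad, s_list, y_list):
    q = grad.astype(float).copy()
    history = []
    for s, y in zip(reversed(s_list), reversed(y_list)):     # newest first
        rho = 1.0 / (y @ s)
        alpha = rho * (s @ q)
        q -= alpha * y
        history.append((rho, alpha, s, y))
    if s_list:                              # scale by gamma = s'y / y'y
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for rho, alpha, s, y in reversed(history):               # oldest first
        beta = rho * (y @ q)
        q += (alpha - beta) * s
    return -q

# With no curvature pairs stored, the direction is plain steepest descent.
print(lbfgs_direction(np.array([1.0, 2.0]), [], []))         # [-1. -2.]
```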

  2. The Sponsored Film.

    Science.gov (United States)

    Klein, Walter J.

    For public relations professionals and would-be sponsors of films, this book provides guidelines for understanding the film medium and its potential as a persuasive force in industry, government, organizations, and religious orders. For filmmakers, it brings together practical information needed to survive in the sponsored-film industry and to…

  3. Optimization of FTA technology for large scale plant DNA isolation ...

    African Journals Online (AJOL)

    2006-05-02

    May 2, 2006 ... product yields and quality are sufficient for reliable scoring, distinguishing heterozygous from homozygous plants ... food and agriculture, testing drug discovery, transgenic, ... container. For QPM ... mM EDTA, pH 8). The FTA ...

  4. The technology of large-scale pharmaceutical plasmid purification ...

    African Journals Online (AJOL)

    2010-01-04

    Jan 4, 2010 ... DNA vaccine, the cost of purification must be decreased. Although commonly .... Three mice were killed every 4 days interval. Tissues of heart, liver, .... Now, methods such as chromatography had good prospects in plasmid ...

  5. The technology of large-scale pharmaceutical plasmid purification ...

    African Journals Online (AJOL)

    Further tests demonstrated that the pcDNAlacZ purified with CTAB and an authoritative endotoxin-free plasmid kit had similar transfection efficiency in vivo and in vitro. CTAB can be used for plasmid purification; the main advantages of the DNAs purified with CTAB include the avoidance of animal-derived enzymes, toxic ...

  6. Educational Technology--The White Elephant.

    Science.gov (United States)

    Molnar, Andrew R.

    A ten year experiment in educational technology sponsored under Title VII of the National Defense Education Act (NDEA) demonstrated the feasibility of large-scale educational systems which can extend education to all while permitting the individualization of instruction without significant increase in cost (through television, computer systems,…

  7. Annual report on Reactor Safety Research Projects sponsored by the Ministry of Economics and Technology of the Federal Republic of Germany. Reporting period 1999. Progress report

    International Nuclear Information System (INIS)

    2000-01-01

    Within its competence for energy research, the Bundesministerium fuer Wirtschaft und Technologie (BMWi) (Federal Ministry of Economics and Technology) sponsors investigations into the safety of nuclear power plants. The objective of these investigations is to provide fundamental knowledge, procedures and methods to contribute to realistic safety assessments of nuclear installations, to the further development of safety technology and to make use of the potential of innovative safety-related approaches. The Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, by order of the BMWi, continuously issues information on the status of such investigations by publishing semi-annual and annual progress reports within the series of GRS-F-Fortschrittsberichte (GRS-F-Progress Reports). Each progress report represents a compilation of individual reports about the objectives, work performed, results achieved, next steps of the work etc. The individual reports are prepared in a standard form by the research organisations themselves as documentation of their progress in work and are published by the Research Management Division of GRS within the framework of general information on the progress in reactor safety research. The compilation of the reports is classified according to the classification system "Joint Safety Research Index" of the CEC (Commission of the European Communities). The reports are arranged in sequence of their project numbers. (orig.)

  8. Advances in Large-Scale Solar Heating and Long Term Storage in Denmark

    DEFF Research Database (Denmark)

    Heller, Alfred

    2000-01-01

    According to information from the European Large-Scale Solar Heating Network (see http://www.hvac.chalmers.se/cshp/), the area of installed solar collectors for large-scale application in Europe is approximately 8 million m2, corresponding to about 4000 MW thermal power. The 11 plants...... the last 10 years, and the corresponding cost per collector area for the final installed plant has been kept constant even though the solar production is increased. Unfortunately, large-scale seasonal storage was not able to keep up with the advances in solar technology, at least for pit water and gravel storage...... of the total 51 plants are equipped with long-term storage. In Denmark, 7 plants are installed, comprising approx. 18,000 m2 of collector area, with new plants planned. The development of these plants and the involved technologies will be presented in this paper, with a focus on the improvements for Danish...

  9. Can administrative referenda be an instrument of control over large-scale technical installations?

    International Nuclear Information System (INIS)

    Rossnagel, A.

    1986-01-01

    An administrative referendum offers the possibility of direct participation of the citizens in decisions concerning large-scale technical installations. The article investigates the legal status of such a referendum on the basis of constitutional and democratic principles. The conclusion drawn is that any attempt to realize more direct democracy in a concrete field of jurisdiction of the state will meet with very large difficulties. On the other hand, the author clearly states that more direct democracy for control over the establishment of large-scale technology is sensible in terms of politics and principles of democracy, and possible within the constitutional system. Developments towards more direct democracy would mean an enhancement of representative democracy and would be adequate vis-à-vis the problems posed by large-scale technology. (HSCH) [de

  10. How the Internet Will Help Large-Scale Assessment Reinvent Itself

    Directory of Open Access Journals (Sweden)

    Randy Elliot Bennett

    2001-02-01

    Large-scale assessment in the United States is undergoing enormous pressure to change. That pressure stems from many causes. Depending upon the type of test, the issues precipitating change include an outmoded cognitive-scientific basis for test design; a mismatch with curriculum; the differential performance of population groups; a lack of information to help individuals improve; and inefficiency. These issues provide a strong motivation to reconceptualize both the substance and the business of large-scale assessment. At the same time, advances in technology, measurement, and cognitive science are providing the means to make that reconceptualization a reality. The thesis of this paper is that the largest facilitating factor will be technological, in particular the Internet. In the same way that it is already helping to revolutionize commerce, education, and even social interaction, the Internet will help revolutionize the business and substance of large-scale assessment.

  11. Jointly Sponsored Research Program

    Energy Technology Data Exchange (ETDEWEB)

    Everett A. Sondreal; John G. Hendrikson; Thomas A. Erickson

    2009-03-31

    U.S. Department of Energy (DOE) Cooperative Agreement DE-FC26-98FT40321 funded through the Office of Fossil Energy and administered at the National Energy Technology Laboratory (NETL) supported the performance of a Jointly Sponsored Research Program (JSRP) at the Energy & Environmental Research Center (EERC) with a minimum 50% nonfederal cost share to assist industry in commercializing and effectively applying highly efficient, nonpolluting energy systems that meet the nation's requirements for clean fuels, chemicals, and electricity in the 21st century. The EERC in partnership with its nonfederal partners jointly performed 131 JSRP projects for which the total DOE cost share was $22,716,634 (38%) and the nonfederal share was $36,776,573 (62%). Summaries of these projects are presented in this report for six program areas: (1) resource characterization and waste management, (2) air quality assessment and control, (3) advanced power systems, (4) advanced fuel forms, (5) value-added coproducts, and (6) advanced materials. The work performed under this agreement addressed DOE goals for reductions in CO2 emissions through efficiency, capture, and sequestration; near-zero emissions from highly efficient coal-fired power plants; environmental control capabilities for SO2, NOx, fine respirable particulate (PM2.5), and mercury; alternative transportation fuels including liquid synfuels and hydrogen; and synergistic integration of fossil and renewable resources.

  12. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  13. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark, and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from...... small- and large-scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model, no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping scale effects...

  14. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment, with negative effects on the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but are also the material that people are most concerned to process and add value to. The study tried to define some basic data formats/standards from the various types of data collected about these reservoirs and then provide a management platform based on these formats/standards. Meanwhile, in order to satisfy practicality and convenience, the large-scale landslide disaster database system is built with both information provision and reception abilities, so that users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system may become outdated at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats/standards and user-defined system structures. The system established by this study was based on the HTML5 standard language and uses responsive web design technology. This makes it easy for users to handle and develop this large-scale landslide disaster database system.

  15. Annual report on reactor safety research projects sponsored by the Ministry of Economics and Technology of the Federal Republic of Germany. Reporting period 2005. Progress report

    International Nuclear Information System (INIS)

    2005-01-01

    Within its competence for energy research, the Bundesministerium fuer Wirtschaft und Technologie (BMWi) (Federal Ministry of Economics and Technology) sponsors investigations into the safety of nuclear power plants. The objective of these investigations is to provide fundamental knowledge, procedures and methods to contribute to realistic safety assessments of nuclear installations, to the further development of safety technology and to make use of the potential of innovative safety-related approaches. The Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, by order of the BMWi, continuously issues information on the status of such investigations by publishing semi-annual and annual progress reports within the series of GRS-F-Fortschrittsberichte (GRS-F-Progress Reports). Each progress report represents a compilation of individual reports about the objectives, work performed, results achieved, next steps of the work etc. The individual reports are prepared in a standard form by the research organisations themselves as documentation of their progress in work and are published by the Research Management Division of GRS within the framework of general information on the progress in reactor safety research. The compilation of the reports is classified according to general topics related to reactor safety research. Further, use is made of the classification system 'Joint Safety Research Index' of the CEC (Commission of the European Communities). The reports are arranged in sequence of their project numbers. It has to be pointed out that the authors of the reports are responsible for the contents of this compilation. The BMWi does not take any responsibility for the correctness, exactness and completeness of the information nor for the observance of private claims of third parties. (orig.)

  16. Large-Scale Multiantenna Multisine Wireless Power Transfer

    Science.gov (United States)

    Huang, Yang; Clerckx, Bruno

    2017-11-01

    Wireless Power Transfer (WPT) is expected to be a technology reshaping the landscape of low-power applications such as the Internet of Things, Radio Frequency identification (RFID) networks, etc. Although there has been some progress towards multi-antenna multi-sine WPT design, the large-scale design of WPT, reminiscent of massive MIMO in communications, remains an open challenge. In this paper, we derive efficient multiuser algorithms based on a generalizable optimization framework, in order to design transmit sinewaves that maximize the weighted-sum/minimum rectenna output DC voltage. The study highlights the significant effect of the nonlinearity introduced by the rectification process on the design of waveforms in multiuser systems. Interestingly, in the single-user case, the optimal spatial domain beamforming, obtained prior to the frequency domain power allocation optimization, turns out to be Maximum Ratio Transmission (MRT). In contrast, in the general weighted sum criterion maximization problem, the spatial domain beamforming optimization and the frequency domain power allocation optimization are coupled. Assuming channel hardening, low-complexity algorithms are proposed based on asymptotic analysis, to maximize the two criteria. The structure of the asymptotically optimal spatial domain precoder can be found prior to the optimization. The performance of the proposed algorithms is evaluated. Numerical results confirm the inefficiency of the linear model-based design for the single and multi-user scenarios. It is also shown that as nonlinear model-based designs, the proposed algorithms can benefit from an increasing number of sinewaves.
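    The single-user observation quoted above (spatial MRT per sinewave, followed by frequency-domain power allocation) can be sketched as follows. The uniform power split is a placeholder; the paper optimizes the allocation against the nonlinear rectenna model, which is not reproduced here.

```python
import numpy as np

# N sinewaves, M antennas: per-tone MRT beamforming followed by a
# frequency-domain power allocation. The uniform split of the power
# budget P is a placeholder for the paper's nonlinearity-aware allocation.

rng = np.random.default_rng(1)
M, N, P = 4, 8, 1.0
H = (rng.normal(size=(N, M)) + 1j * rng.normal(size=(N, M))) / np.sqrt(2)

w_mrt = H.conj() / np.linalg.norm(H, axis=1, keepdims=True)  # MRT per tone
p = np.full(N, P / N)                        # placeholder power allocation
W = np.sqrt(p)[:, None] * w_mrt              # row n: weights of sinewave n

print(np.sum(np.abs(W) ** 2))                # total transmit power == P
```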

  17. Factors influencing the decommissioning of large-scale nuclear plants

    International Nuclear Information System (INIS)

    Large, J.H.

    1988-01-01

    The decision-making process involving the decommissioning of the UK graphite moderated, gas-cooled nuclear power stations is complex. There are timing, engineering, waste disposal, cost and lost generation capacity factors to consider and the overall decision of when and how to proceed with decommissioning may include political and public tolerance dimensions. For the final stage of decommissioning the nuclear industry could either completely dismantle the reactor island leaving a green-field site or, alternatively, the reactor island could be maintained indefinitely with additional super- and substructure containment. At this time the first of these options, or deferred decommissioning, prevails and with this the nuclear industry has expressed considerable confidence that the technology required will become available with passing time, that acceptable radioactive waste disposal methods and facilities will be available and that the eventual costs of decommissioning will not escalate without restraint. If the deferred decommissioning strategy is wrong and it is not possible to completely dismantle the reactor islands a century into the future, then it may be too late to effect sufficient longer term containment to maintain the reactor hulks in a reliable condition. With respect to the final decommissioning of large-scale nuclear plant, it is concluded that the nuclear industry does not know quite how to do it, when it will be attempted and when it will be completed, and they do not know how much it will eventually cost. (author)

  18. Large scale photovoltaic field trials. Second technical report: monitoring phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This report provides an update on the Large-Scale Building Integrated Photovoltaic Field Trials (LS-BIPV FT) programme commissioned by the Department of Trade and Industry (later the Department for Business, Enterprise and Regulatory Reform; BERR). It provides detailed profiles of the 12 projects making up this programme, which is part of the UK programme on photovoltaics and has run in parallel with the Domestic Field Trial. These field trials aim to record the experience and use the lessons learnt to raise awareness of, and confidence in, the technology and increase UK capabilities. The projects involved: the visitor centre at the Gaia Energy Centre in Cornwall; a community church hall in London; council offices in West Oxfordshire; a sports science centre at Gloucester University; the visitor centre at Cotswold Water Park; the headquarters of the Insolvency Service; a Welsh Development Agency building; an athletics centre in Birmingham; a research facility at the University of East Anglia; a primary school in Belfast; and Barnstaple civic centre in Devon. The report describes the aims of the field trials, monitoring issues, performance, observations and trends, lessons learnt and the results of occupancy surveys.

  19. Numerically modelling the large scale coronal magnetic field

    Science.gov (United States)

    Panja, Mayukh; Nandi, Dibyendu

    2016-07-01

    The solar corona spews out vast amounts of magnetized plasma into the heliosphere, which has a direct impact on the Earth's magnetosphere. Thus it is important that we develop an understanding of the dynamics of the solar corona. With our present technology it has not been possible to generate 3D magnetic maps of the solar corona; this warrants the use of numerical simulations to study the coronal magnetic field. A very popular method of doing this is to extrapolate the photospheric magnetic field using NLFF or PFSS codes. However, the extrapolations at different time intervals are completely independent of each other and do not capture the temporal evolution of magnetic fields. On the other hand, full MHD simulations of the global coronal field, apart from being computationally very expensive, would be physically less transparent, owing to the large number of free parameters that are typically used in such codes. This brings us to the magneto-frictional model, which is relatively simpler and computationally more economical. We have developed a magneto-frictional model in 3D spherical polar coordinates to study the large-scale global coronal field. Here we present studies of changing connectivities between active regions in response to photospheric motions.
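    For readers unfamiliar with the magneto-frictional approach, a minimal Cartesian sketch of one update step is given below: the velocity is taken proportional to the Lorentz force, so the field relaxes toward a force-free state. The paper works in 3D spherical polar coordinates, whose metric terms are omitted here, and np.gradient's one-sided boundary handling stands in for proper boundary conditions.

```python
import numpy as np

# One magnetofrictional step on a Cartesian grid with unit spacing:
# v = (J x B) / (nu * B^2), then dB/dt = curl(v x B). Spherical metric
# terms and proper boundary conditions are omitted; np.gradient uses
# one-sided differences at the domain edges.

def curl(F):
    """Curl of a vector field F of shape (3, nx, ny, nz)."""
    dFx, dFy, dFz = np.gradient(F[0]), np.gradient(F[1]), np.gradient(F[2])
    return np.array([dFz[1] - dFy[2],    # dFz/dy - dFy/dz
                     dFx[2] - dFz[0],    # dFx/dz - dFz/dx
                     dFy[0] - dFx[1]])   # dFy/dx - dFx/dy

def mf_step(B, nu=1.0, dt=0.1, eps=1e-8):
    J = curl(B)                                    # current density (mu0 = 1)
    force = np.cross(J, B, axis=0)                 # Lorentz force J x B
    v = force / (nu * (np.sum(B**2, axis=0) + eps))
    return B + dt * curl(np.cross(v, B, axis=0))   # induction equation

B = np.random.default_rng(4).random((3, 16, 16, 16))  # toy initial field
B = mf_step(B)
print(B.shape)
```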

  20. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it is widely recognized that how to update these established geospatial databases and keep them up to date is critical for the value of a geospatial database. So, more and more efforts have been devoted to the continuous updating of these geospatial databases. Currently, there exist two main types of methods for geospatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated results such as newly updated larger-scale data. The former method is the basis, because the update data sources in both methods ultimately root in field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology in the process of geospatial database updating. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where different-scale databases are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from large scale in a collaborative updating environment for SDI. The requirements of the application of map generalization to spatial database updating are analyzed first. A brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing the updating of geospatial data from large scale, including technical

  1. Large-scale land transformations in Indonesia: The role of ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... enable timely responses to the impacts of large-scale land transformations in Central Kalimantan ... In partnership with UNESCO's Organization for Women in Science for the ... New funding opportunity for gender equality and climate change.

  2. Resolute large scale mining company contribution to health services of

    African Journals Online (AJOL)

    Resolute large-scale mining company contribution to health services of Lusu ... in terms of socio-economic, health, education, employment, safe drinking water, ... The data were analyzed using the Statistical Package for the Social Sciences (SPSS).

  3. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Douglas Thain is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  4. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    fuel/energy, climate, and finance has occurred and one of the most ... this wave of large-scale land acquisitions. In fact, esti- ... Environmental Rights Action/Friends of the Earth,. Nigeria ... map the differentiated impacts (gender, ethnicity,.

  5. Large-scale linear programs in planning and prediction.

    Science.gov (United States)

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...

  6. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... Security can help increase accountability for large-scale land acquisitions in ... to build decent economic livelihoods and participate meaningfully in decisions ... its 2017 call for proposals to establish Cyber Policy Centres in the Global South.

  7. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  8. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  9. No Large Scale Curvature Perturbations during Waterfall of Hybrid Inflation

    OpenAIRE

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2010-01-01

    In this paper the possibility of generating large-scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large-scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical back-reactions to terminate inflation. If one considers only the clas...

  10. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
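    The flavor of the approach can be conveyed with a simple linear-shrinkage sketch: pull the sample covariance toward a structured target, trading a little bias for a large variance reduction when features far outnumber samples. This is not the authors' Bayesian hierarchical model; delta below merely plays the role of the prior strength.

```python
import numpy as np

# Linear shrinkage toward a diagonal target: the off-diagonal entries,
# which are the noisiest when p >> n, are damped by delta. This stands in
# for the role the hierarchical prior plays in the paper's model.

def shrink_cov(X: np.ndarray, delta: float = 0.2) -> np.ndarray:
    """Shrunk covariance of X with shape (n samples, p features)."""
    S = np.cov(X, rowvar=False)              # p x p sample covariance
    return (1.0 - delta) * S + delta * np.diag(np.diag(S))

X = np.random.default_rng(2).normal(size=(30, 200))   # n=30 << p=200
S, Sigma = np.cov(X, rowvar=False), shrink_cov(X)
print(np.linalg.cond(Sigma) < np.linalg.cond(S))      # better conditioned
```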

  11. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  12. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    pump flow meters, sediment trap weigh tanks, and beach profiling lidar. A detailed discussion of the original LSTF features and capabilities can be... This technical note describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems. The purpose of these upgrades was to increase

  13. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity, and size of large-scale networks have increased. The best example of a large-scale network is the Internet; more recent ones are data centers in cloud environments. Here, several management tasks such as traffic monitoring, security, and performance optimization are a big burden for the network administrator. This research studies different protocols, i.e., conventional protocols like the Simple Network Management Protocol and the newer Gossip-bas...

  14. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size, such as in the number of degrees of freedom, the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  15. Large-scale agent-based social simulation : A study on epidemic prediction and control

    NARCIS (Netherlands)

    Zhang, M.

    2016-01-01

    Large-scale agent-based social simulation is gradually proving to be a versatile methodological approach for studying human societies, which could make contributions from policy making in social science, to distributed artificial intelligence and agent technology in computer science, and to theory

  16. The Ecological Impacts of Large-Scale Agrofuel Monoculture Production Systems in the Americas

    Science.gov (United States)

    Altieri, Miguel A.

    2009-01-01

    This article examines the expansion of agrofuels in the Americas and the ecological impacts associated with the technologies used in the production of large-scale monocultures of corn and soybeans. In addition to deforestation and displacement of lands devoted to food crops due to expansion of agrofuels, the massive use of transgenic crops and…

  17. 77 FR 58416 - Large Scale Networking (LSN); Middleware and Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2012-09-20

    ..., Grid, and cloud projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating Group... Middleware and Grid Interagency Coordination (MAGIC) Team. AGENCY: The Networking and Information Technology Research and Development (NITRD)... Dates/Location: The MAGIC Team meetings are held on the first Wednesday of each month, 2:00-4:00 p.m., at...

  18. 78 FR 70076 - Large Scale Networking (LSN)-Middleware and Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2013-11-22

    ... projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating Group (CG). Public... Middleware and Grid Interagency Coordination (MAGIC) Team. AGENCY: The Networking and Information Technology Research and Development (NITRD)... The MAGIC Team meetings are held on the first Wednesday of each month, 2:00-4:00 p.m., at the National...

  19. Annual report on reactor safety research projects sponsored by the Ministry of Economics and Technology of the Federal Republic of Germany. Reporting period 2007. Progress report

    International Nuclear Information System (INIS)

    2007-01-01

    Within its competence for energy research the Federal Ministry of Economics and Technology (BMWi) sponsors research projects on the safety of nuclear power plants currently in operation. The objective of these projects is to provide fundamental knowledge, procedures and methods to contribute to realistic safety assessments of nuclear installations, to the further development of safety technology and to make use of the potential of innovative safety-related approaches. The Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, by order of the BMWi, continuously issues information on the status of such research projects by publishing semi-annual and annual progress reports within the series of GRS-F-Fortschrittsberichte (GRS-F-Progress Reports). Each progress report represents a compilation of individual reports about the objectives, work performed, results achieved, next steps of the work etc. The individual reports are prepared in a standard form by the research organisations themselves as documentation of their progress in work. The progress reports are published by the Research Management Division of GRS. The reports as of the year 2000 are available in the Internet-based information system on results and data of reactor safety research (http://www.grs-fbw.de). The compilation of the reports is classified according to the classification system 'Joint Safety Research Index (JSRI)'. The reports are arranged in sequence of their project numbers. It has to be pointed out that the authors of the reports are responsible for the contents of this compilation. The BMWi does not take any responsibility for the correctness, exactness and completeness of the information nor for the observance of private claims of third parties. (orig.)

  20. Testing, development and demonstration of large scale solar district heating systems

    DEFF Research Database (Denmark)

    Furbo, Simon; Fan, Jianhua; Perers, Bengt

    2015-01-01

    In 2013-2014 the project "Testing, development and demonstration of large scale solar district heating systems" was carried out within the Sino-Danish Renewable Energy Development Programme, the so-called RED programme, jointly developed by the Chinese and Danish governments. In the project, Danish...... know-how on solar heating plants and solar heating test technology has been transferred from Denmark to China, large solar heating systems have been promoted in China, test capabilities on solar collectors and large-scale solar heating systems have been improved in China, and Danish-Chinese cooperation...

  1. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such a system can improve the network system's emergency response capabilities, alleviate the damage caused by cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. The large-scale network system's plane visualization is realized based on the divide-and-conquer idea. First, the topology of the large-scale network is divided into some small-scale networks by the MLkP/CR algorithm. Second, the subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into one topology based on the automatic distribution algorithm of force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
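    The divide-and-conquer pipeline can be sketched with stand-ins for the paper's specific algorithms: greedy modularity communities replace the MLkP/CR partitioner, and a force-directed spring layout replaces both the subgraph visualization and the force-analysis distribution step.

```python
import networkx as nx

# Divide and conquer: partition the graph, lay out each sub-network
# independently, lay out the quotient graph of sub-networks, then offset
# each sub-layout by its community's position in the quotient layout.

def hierarchical_layout(G, scale=0.15):
    blocks = [frozenset(c) for c in
              nx.community.greedy_modularity_communities(G)]
    Q = nx.quotient_graph(G, blocks)             # one node per sub-network
    centers = nx.spring_layout(Q, seed=0)        # force-directed placement
    pos = {}
    for block in blocks:
        sub = nx.spring_layout(G.subgraph(block), seed=0, scale=scale)
        for v, p in sub.items():
            pos[v] = centers[block] + p          # offset by community center
    return pos

G = nx.connected_caveman_graph(6, 8)             # toy clustered topology
print(len(hierarchical_layout(G)))               # a position for every node
```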

  2. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made

  3. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
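    A compact sketch of the DFA scaling exponent used in the paper: integrate the mean-centered sequence, detrend it in windows of several sizes, and fit the log-log slope of the fluctuation function. The ±1 DNA-walk mapping and the window sizes are illustrative.

```python
import numpy as np

# Detrended Fluctuation Analysis (DFA): F(n) is the RMS residual after a
# linear detrend in windows of size n; the exponent alpha is the slope of
# log F(n) versus log n. An uncorrelated sequence gives alpha ~ 0.5.

def dfa_alpha(x, windows):
    """Scaling exponent of the Detrended Fluctuation Analysis of x."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    F = []
    for n in windows:
        segs = y[:(len(y) // n) * n].reshape(-1, n)
        t = np.arange(n)
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        F.append(np.sqrt(np.mean(np.concatenate(resid) ** 2)))
    return np.polyfit(np.log(windows), np.log(F), 1)[0]      # log-log slope

walk = np.random.default_rng(5).choice([+1, -1], size=20_000)  # toy DNA walk
print(dfa_alpha(walk, [16, 32, 64, 128, 256]))   # ~0.5 for uncorrelated data
```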

  4. Report of the Workshop on Petascale Systems Integration for Large-Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums, such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate and stabilize a large-scale system may consume up to 20 percent of its useful life. Improving the state of the art of large-scale systems integration therefore has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  5. Unified Access Architecture for Large-Scale Scientific Datasets

    Science.gov (United States)

    Karna, Risav

    2014-05-01

    Data-intensive sciences have to deploy diverse large-scale database technologies for data analytics, as scientists now deal with much larger data volumes than ever before. While array databases have bridged many gaps between the needs of data-intensive research fields and DBMS technologies (Zhang 2011), invocation of the other big data tools accompanying these databases is still manual and separate from the database management interface. We identify this as an architectural challenge that will increasingly complicate the user's workflow owing to the growing number of useful but isolated and niche database tools. Such use of data analysis tools in effect leaves the burden on the user's end to synchronize the results from other data manipulation and analysis tools with the database management system. To this end, we propose a unified access interface for using big data tools within a large-scale scientific array database, using the database queries themselves to embed foreign routines belonging to the big data tools. Such an invocation of foreign data manipulation routines inside a query can be made possible through a user-defined function (UDF). UDFs that allow such levels of freedom as to call modules from another language, and to interface back and forth between the query body and the side-loaded functions, are needed for this purpose. For the purpose of this research we attempt coupling of four widely used tools, Hadoop (hadoop1), Matlab (matlab1), R (r1) and ScaLAPACK (scalapack1), with the UDF feature of rasdaman (Baumann 98), an array-based data manager, to investigate this concept. The native array data model used by an array-based data manager provides compact data storage and high-performance operations on ordered data such as spatial data, temporal data, and matrix-based data for linear algebra operations (scidbusr1). Performance issues arising due to coupling of tools with different paradigms, niche functionalities, separate processes and output
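
    Since the record does not reproduce rasdaman's actual UDF syntax, the following is a purely hypothetical Python sketch of the core idea only: a registry through which a query string can invoke side-loaded foreign routines (numpy stands in for R, Matlab, ScaLAPACK or Hadoop jobs):

```python
import numpy as np

UDF_REGISTRY = {}

def udf(name):
    """Register a foreign routine under a name callable from queries."""
    def register(fn):
        UDF_REGISTRY[name] = fn
        return fn
    return register

@udf("linreg_slope")
def linreg_slope(array):
    # "Foreign" linear-algebra routine applied to an array from the DB.
    t = np.arange(len(array))
    return np.polyfit(t, array, 1)[0]

def run_query(query, arrays):
    # Toy query grammar: "SELECT <udf>(<array_name>)" -- illustrative only.
    fn_name, arr_name = query.removeprefix("SELECT ").rstrip(")").split("(")
    return UDF_REGISTRY[fn_name](arrays[arr_name])

result = run_query("SELECT linreg_slope(sst)", {"sst": np.random.rand(100)})
```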

  6. Thermal System Analysis and Optimization of Large-Scale Compressed Air Energy Storage (CAES)

    Directory of Open Access Journals (Sweden)

    Zhongguang Fu

    2015-08-01

    Full Text Available As an important solution to issues regarding peak load and renewable energy resources on grids, large-scale compressed air energy storage (CAES) power generation technology has recently become a popular research topic in the area of large-scale industrial energy storage. At present, the combination of high-expansion-ratio turbines with advanced gas turbine technology is an important breakthrough in energy storage technology. In this study, a new gas turbine power generation system is coupled with current CAES technology. Moreover, the thermodynamic cycle is optimized by calculating the parameters of the thermodynamic system. Results show that the thermal efficiency of the new system increases by at least 5% over that of the existing system.
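
    For context on where such efficiency gains come from, a back-of-the-envelope sketch of specific compression and expansion work using ideal-gas isentropic relations is shown below; all parameter values are assumed placeholders, not the paper's operating point:

```python
# Ideal-gas estimate of specific work in a CAES compression/expansion pair.
cp, gamma = 1005.0, 1.4           # air: J/(kg K), heat-capacity ratio
T_in, pr = 293.0, 40.0            # compressor inlet (K), pressure ratio
eta_c, eta_t = 0.85, 0.88         # assumed compressor/turbine efficiencies
T_hot = 1200.0                    # assumed turbine inlet after heat addition (K)

k = (gamma - 1.0) / gamma
w_comp = cp * T_in * (pr**k - 1.0) / eta_c           # work input, J/kg
w_turb = eta_t * cp * T_hot * (1.0 - pr**(-k))       # work output, J/kg
print(f"compression {w_comp/1e3:.0f} kJ/kg, expansion {w_turb/1e3:.0f} kJ/kg")
```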

  7. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such, sometimes inherently unstable, molecules it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential in particular for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  8. KBTAC [Knowledge-Based Technology Application Center] - The EPRI [Electric Power Research Institute]-sponsored knowledge-based technology application center

    International Nuclear Information System (INIS)

    Meyer, W.; Wood, R.M.; Scherer, J.

    1990-01-01

    The Electric Power Research Institute (EPRI) has announced the establishment of the Knowledge-Based Technology Application Center (KBTAC), whose goal is to assist member utilities with expert system technology and applications. The center, established November 7, 1989, is located on the campus of Syracuse University, Syracuse, New York, and will be operated jointly by Kaman Sciences Corporation and the university. The mission of the KBTAC is to assist EPRI member utilities to develop, test, and transfer expert systems into nuclear power plant operations, maintenance, and administration

  9. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years many public key infrastructures (PKIs) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI for ensuring security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI facilitates handling of the trust issues that arise in a large-scale healthcare network that includes multi-domain PKIs.

  10. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  11. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. The registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.

  12. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large-scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of power system balance and thus deviations in the power system...... frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large-scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has...... to be analysed with improved analytical tools and techniques. This paper proposes techniques for active power balance control in future power systems with large-scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time

  13. The role of large scale motions on passive scalar transport

    Science.gov (United States)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large-scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large-scale features of the turbulent channel flow is obtained using two-point correlations of velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of large-scale features of turbulence and the temperature field.
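
    For readers unfamiliar with the technique, a minimal numpy sketch of snapshot POD follows; this is the standard SVD route with assumed array shapes, not necessarily the authors' exact implementation:

```python
# Snapshot POD: columns of U are the most energetic spatial modes of a
# fluctuating field (a velocity component or the temperature).
import numpy as np

def pod_modes(snapshots, n_modes=4):
    """snapshots: (n_points, n_snapshots) array of instantaneous fields."""
    X = snapshots - snapshots.mean(axis=1, keepdims=True)  # fluctuations
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    energy = s**2 / np.sum(s**2)       # fraction of variance per mode
    return U[:, :n_modes], energy[:n_modes]
```

    Coupling between the two fields can then be probed, for instance, by projecting temperature snapshots onto the velocity modes.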

  14. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.
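
    As a toy illustration of the diagnostic behind such statements (the growth of number fluctuations with observation-window size), the sketch below estimates the number variance of a point pattern; the Poisson pattern is a stand-in, not the simulation data:

```python
# Number variance sigma^2(R) of a 2D point pattern: for a Poisson pattern
# it grows like R^2; hyperuniform patterns grow more slowly (like R).
import numpy as np

rng = np.random.default_rng(0)
L, N = 100.0, 10_000
pts = rng.uniform(0, L, size=(N, 2))          # Poisson stand-in pattern

def number_variance(pts, R, trials=1000):
    centers = rng.uniform(R, L - R, size=(trials, 2))
    counts = [(np.linalg.norm(pts - c, axis=1) < R).sum() for c in centers]
    return np.var(counts)

for R in (2.0, 4.0, 8.0):
    print(R, number_variance(pts, R))         # ~quadruples as R doubles
```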

  15. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  16. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
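
    A hedged sketch of the reduction-plus-clustering step is given below, using scikit-learn's (unsupervised) KernelPCA as a simplified stand-in for the supervised variant named above; the moisture-flux array is a placeholder:

```python
# Embed pre-flood moisture-flux fields in a low-dimensional space and
# cluster them; random placeholder data stands in for reanalysis fields.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans

flux_fields = np.random.rand(300, 5000)   # (events, grid points), assumed

embedding = KernelPCA(n_components=3, kernel="rbf").fit_transform(flux_fields)
labels = KMeans(n_clusters=4, n_init=10).fit_predict(embedding)
# Each cluster groups flood events with similar large-scale moisture
# transport patterns (e.g. regional recycling vs. teleconnected moisture).
```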

  17. Host Immunity via Mutable Virtualized Large-Scale Network Containers

    Science.gov (United States)

    2016-07-25

    Second, we need to mitigate the threat from insiders. Though the system only allows authenticated clients to locate and access its service, insiders ... and constrain the distributed persistent inside crawlers that have valid credentials to access the web services. The main idea is to add a marker

  18. Report of the LASCAR forum: Large scale reprocessing plant safeguards

    International Nuclear Information System (INIS)

    1992-01-01

    This report has been prepared to provide information on the studies which were carried out from 1988 to 1992 under the auspices of the multinational forum known as Large Scale Reprocessing Plant Safeguards (LASCAR) on safeguards for four large scale reprocessing plants operated or planned to be operated in the 1990s. The report summarizes all of the essential results of these studies. The participants in LASCAR were from France, Germany, Japan, the United Kingdom, the United States of America, the Commission of the European Communities - Euratom, and the International Atomic Energy Agency

  19. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)
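
    For reference, one common convention for writing such a linearly perturbed flat Friedmann-Robertson-Walker metric (conventions differ between papers, so this is illustrative rather than the authors' exact notation) is:

```latex
% Linearly perturbed flat FRW metric in conformal time \eta:
\begin{equation}
  ds^2 = a^2(\eta)\left[ -(1+2A)\, d\eta^2 - 2 B_i\, d\eta\, dx^i
         + \left( \delta_{ij} + h_{ij} \right) dx^i dx^j \right],
\end{equation}
% where B_i and h_{ij} collect the scalar, vector, and tensor pieces,
% e.g. B_i = \partial_i B + B_i^V with \partial^i B_i^V = 0.
```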

  20. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1.125.000 elements in 2D and 128.000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  1. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available The paper studies fatigue damage of the top flange of a large-scale wind turbine generator. It establishes a finite element model of the top flange connection system with the finite element analysis software MSC Marc/Mentat and analyzes its fatigue strain, simulates flange fatigue loading conditions with the Bladed software, acquires the flange fatigue load spectrum with the rain-flow counting method, and finally performs the fatigue analysis of the top flange with the fatigue analysis software MSC Fatigue and Palmgren-Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis for large-scale wind turbine generators and have practical engineering value.
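
    As a minimal sketch of the Palmgren-Miner step, assuming an S-N curve of the form N = C / S^m with placeholder constants, and a placeholder load spectrum of (stress range, cycle count) pairs such as rain-flow counting would produce:

```python
# Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i), with the
# allowable cycles N_i taken from an assumed S-N curve N = C / S^m.
C, m = 1.0e12, 3.0                      # assumed S-N curve constants

# (stress range in MPa, counted cycles) from rain-flow counting:
spectrum = [(80.0, 2.0e5), (120.0, 5.0e4), (160.0, 1.0e4)]

damage = sum(n / (C / S**m) for S, n in spectrum)
print(f"cumulative damage D = {damage:.3f}")  # failure predicted at D >= 1
```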

  2. Generation of large-scale vortices in compressible helical turbulence

    International Nuclear Information System (INIS)

    Chkhetiani, O.G.; Gvaramadze, V.V.

    1989-01-01

    We consider the generation of large-scale vortices in a compressible, self-gravitating turbulent medium. A closed equation describing the evolution of large-scale vortices in helical turbulence with finite correlation time is obtained. This equation has a form similar to the hydromagnetic dynamo equation, which allows us to call the vortex generation effect the vortex dynamo. It is possible that principally the same mechanism is responsible both for the amplification and maintenance of density waves and magnetic fields in the gaseous disks of spiral galaxies. (author). 29 refs.

  3. FEASIBILITY OF LARGE-SCALE OCEAN CO2 SEQUESTRATION

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Peter Brewer; Dr. James Barry

    2002-09-30

    We have continued to carry out creative small-scale experiments in the deep ocean to investigate the science underlying questions of possible future large-scale deep-ocean CO2 sequestration as a means of ameliorating greenhouse gas growth rates in the atmosphere. This project is closely linked to additional research funded by the DoE Office of Science, and to support from the Monterey Bay Aquarium Research Institute. The listing of project achievements here over the past year reflects these combined resources. Within the last project year we have: (1) Published a significant workshop report (58 pages) entitled "Direct Ocean Sequestration Expert's Workshop", based upon a meeting held at MBARI in 2001. The report is available both in hard copy, and on the NETL web site. (2) Carried out three major, deep ocean (3600 m) cruises to examine the physical chemistry, and biological consequences, of several liter quantities released on the ocean floor. (3) Carried out two successful short cruises in collaboration with Dr. Izuo Aya and colleagues (NMRI, Osaka, Japan) to examine the fate of cold (-55 C) CO2 released at relatively shallow ocean depth. (4) Carried out two short cruises in collaboration with Dr. Costas Tsouris, ORNL, to field test an injection nozzle designed to transform liquid CO2 into a hydrate slurry at ~1000 m depth. (5) In collaboration with Prof. Jill Pasteris (Washington University) we have successfully accomplished the first field test of a deep ocean laser Raman spectrometer for probing in situ the physical chemistry of the CO2 system. (6) Submitted the first major paper on biological impacts as determined from our field studies. (7) Submitted a paper on our measurements of the fate of a rising stream of liquid CO2 droplets to Environmental Science & Technology. (8) Have had accepted for publication in Eos the first brief account of the laser Raman spectrometer success. (9) Have had two

  4. Large-scale Lurgi plant would be uneconomic: study group

    Energy Technology Data Exchange (ETDEWEB)

    1964-03-21

    The Gas Council and the National Coal Board agreed that the building of a large-scale Lurgi plant on the basis of the study is not at present acceptable on economic grounds. The committee considered that new processes based on naphtha offered more economic sources of base and peak load production. Tables listing data provided in contractors' design studies and a summary of contractors' process designs are included.

  5. Origin of large-scale cell structure in the universe

    International Nuclear Information System (INIS)

    Zel'dovich, Y.B.

    1982-01-01

    A qualitative explanation is offered for the characteristic global structure of the universe, wherein 'black' regions devoid of galaxies are surrounded on all sides by closed, comparatively thin, 'bright' layers populated by galaxies. The interpretation rests on some very general arguments regarding the growth of large-scale perturbations in a cold gas

  6. Large-Scale Systems Control Design via LMI Optimization

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2015-01-01

    Roč. 44, č. 3 (2015), s. 247-253 ISSN 1392-124X Institutional support: RVO:67985556 Keywords : Combinatorial linear matrix inequalities * large-scale system * decentralized control Subject RIV: BC - Control Systems Theory Impact factor: 0.633, year: 2015

  7. Identification of low order models for large scale processes

    NARCIS (Netherlands)

    Wattamwar, S.K.

    2010-01-01

    Many industrial chemical processes are complex, multi-phase and large scale in nature. These processes are characterized by various nonlinear physiochemical effects and fluid flows. Such processes often show coexistence of fast and slow dynamics during their time evolutions. The increasing demand

  8. Worldwide large-scale fluctuations of sardine and anchovy ...

    African Journals Online (AJOL)

    Worldwide large-scale fluctuations of sardine and anchovy populations. African Journal of Marine Science. http://dx.doi.org/10.2989/AJMS.2008.30.1.13.463

  10. Large-scale coastal impact induced by a catastrophic storm

    DEFF Research Database (Denmark)

    Fruergaard, Mikkel; Andersen, Thorbjørn Joest; Johannessen, Peter N

    breaching. Our results demonstrate that violent, millennial-scale storms can trigger significant large-scale and long-term changes on barrier coasts, and that coastal changes assumed to take place over centuries or even millennia may occur in association with a single extreme storm event....

  11. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai, Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of large-scale ERP implementation success. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  12. Planck intermediate results XLII. Large-scale Galactic magnetic fields

    DEFF Research Database (Denmark)

    Adam, R.; Ade, P. A. R.; Alves, M. I. R.

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured ...

  13. Large Scale Simulations of the Euler Equations on GPU Clusters

    KAUST Repository

    Liebmann, Manfred; Douglas, Craig C.; Haase, Gundolf; Horváth, Zoltán

    2010-01-01

    The paper investigates the scalability of a parallel Euler solver, using the Vijayasundaram method, on a GPU cluster with 32 Nvidia Geforce GTX 295 boards. The aim of this research is to enable large scale fluid dynamics simulations with up to one

  14. Breakdown of large-scale circulation in turbulent rotating convection

    NARCIS (Netherlands)

    Kunnen, R.P.J.; Clercx, H.J.H.; Geurts, Bernardus J.

    2008-01-01

    Turbulent rotating convection in a cylinder is investigated both numerically and experimentally at Rayleigh number Ra = 10^9 and Prandtl number σ = 6.4. In this Letter we discuss two topics: the breakdown under rotation of the domain-filling large-scale circulation (LSC) typical for

  15. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...
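
    The design-matrix-free computations referred to here typically rest on the Kronecker identity (A kron B) vec(C) = vec(B C A^T), which lets the tensor-product design matrix act on coefficients without ever being formed; a quick numpy check, with column-major flattening playing the role of vec:

```python
import numpy as np

rng = np.random.default_rng(1)
A, B = rng.normal(size=(6, 4)), rng.normal(size=(5, 3))
C = rng.normal(size=(3, 4))                  # coefficient array

lhs = np.kron(A, B) @ C.flatten(order="F")   # explicit Kronecker product
rhs = (B @ C @ A.T).flatten(order="F")       # never forms np.kron(A, B)
assert np.allclose(lhs, rhs)
```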

  16. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena

  17. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-01-01

    structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination

  18. Temporal Variation of Large Scale Flows in the Solar Interior ...

    Indian Academy of Sciences (India)

    Figure 2 (caption): Zonal and meridional components of the time-dependent residual velocity at a few selected depths, as marked above each panel, are plotted as contours of constant velocity in the longitude-latitude plane. The left panels show the zonal component, ...

  19. Facile Large-Scale Synthesis of 5- and 6-Carboxyfluoresceins

    DEFF Research Database (Denmark)

    Hammershøj, Peter; Ek, Pramod Kumar; Harris, Pernille

    2015-01-01

    A series of fluorescein dyes have been prepared from a common precursor through a very simple synthetic procedure, giving access to important precursors for fluorescent probes. The method has proven an efficient access to regioisomerically pure 5- and 6-carboxyfluoresceins on a large scale, in good...

  20. The Large-Scale Structure of Scientific Method

    Science.gov (United States)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  1. Newton Methods for Large Scale Problems in Machine Learning

    Science.gov (United States)

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  2. Large-Scale Machine Learning for Classification and Search

    Science.gov (United States)

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  3. Large scale solar district heating. Evaluation, modelling and designing - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.

    2000-07-01

    The appendices present the following: A) Cad-drawing of the Marstal CSHP design. B) Key values - large-scale solar heating in Denmark. C) Monitoring - a system description. D) WMO-classification of pyranometers (solarimeters). E) The computer simulation model in TRNSYS. F) Selected papers from the author. (EHS)

  4. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the collaboration activities for FY2003 on the Large Scale Computer Simulation Research was held January 15-16, 2004 at Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  5. Large-scale Homogenization of Bulk Materials in Mammoth Silos

    NARCIS (Netherlands)

    Schott, D.L.

    2004-01-01

    This doctoral thesis concerns the large-scale homogenization of bulk materials in mammoth silos. The objective of this research was to determine the best stacking and reclaiming method for homogenization in mammoth silos. For this purpose a simulation program was developed to estimate the

  6. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  7. Large Scale Anomalies of the Cosmic Microwave Background with Planck

    DEFF Research Database (Denmark)

    Frejsel, Anne Mette

    This thesis focuses on the large scale anomalies of the Cosmic Microwave Background (CMB) and their possible origins. The investigations consist of two main parts. The first part is on statistical tests of the CMB, and the consistency of both maps and power spectrum. We find that the Planck data...

  8. Fractals and the Large-Scale Structure in the Universe

    Indian Academy of Sciences (India)

    Fractals and the Large-Scale Structure in the Universe - Is the Cosmological Principle Valid? A K Mittal and T R Seshadri. Resonance – Journal of Science Education, General Article, Volume 7, Issue 4, April 2002, pp. 39-47.

  9. LARGE-SCALE COMMERCIAL INVESTMENTS IN LAND: SEEKING ...

    African Journals Online (AJOL)

    extent of large-scale investment in land or to assess its impact on the people in recipient countries. .... favorable lease terms, apparently based on a belief that this is necessary to .... Harm to the rights of local occupiers of land can result from a dearth ..... applies to a self-identified group based on the group's traditions.

  10. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    Science.gov (United States)

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  11. Reconsidering Replication: New Perspectives on Large-Scale School Improvement

    Science.gov (United States)

    Peurach, Donald J.; Glazer, Joshua L.

    2012-01-01

    The purpose of this analysis is to reconsider organizational replication as a strategy for large-scale school improvement: a strategy that features a "hub" organization collaborating with "outlet" schools to enact school-wide designs for improvement. To do so, we synthesize a leading line of research on commercial replication to construct a…

  12. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  13. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families, to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  14. Solving Large Scale Crew Scheduling Problems in Practice

    NARCIS (Netherlands)

    E.J.W. Abbink (Erwin); L. Albino; T.A.B. Dollevoet (Twan); D. Huisman (Dennis); J. Roussado; R.L. Saldanha

    2010-01-01

    textabstractThis paper deals with large-scale crew scheduling problems arising at the Dutch railway operator, Netherlands Railways (NS). NS operates about 30,000 trains a week. All these trains need a driver and a certain number of guards. Some labor rules restrict the duties of a certain crew base

  15. The large scale microwave background anisotropy in decaying particle cosmology

    International Nuclear Information System (INIS)

    Panek, M.

    1987-06-01

    We investigate the large-scale anisotropy of the microwave background radiation in cosmological models with decaying particles. The observed value of the quadrupole moment combined with other constraints gives an upper limit on the redshift of the decay z_d < 3-5. 12 refs., 2 figs

  16. Dual Decomposition for Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Vandenberghe, Lieven

    2013-01-01

    Dual decomposition is applied to power balancing of flexible thermal storage units. The centralized large-scale problem is decomposed into smaller subproblems and solved locally by each unit in the Smart Grid. Convergence is achieved by coordinating the units' consumption through a negotiation
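
    A minimal numpy sketch of the dual decomposition loop for this kind of balancing problem, with illustrative quadratic local costs rather than the paper's unit models:

```python
# Dual decomposition: units solve local problems given a price, and a
# coordinator adjusts the price until total consumption hits the target.
import numpy as np

a = np.array([2.0, 1.0, 4.0])     # assumed local costs: a_i * x_i^2
P_target, lam, step = 10.0, 0.0, 0.5

for _ in range(200):
    x = lam / (2.0 * a)           # local optimum of a_i x^2 - lam * x
    lam += step * (P_target - x.sum())   # raise price if consumption is low

print(x, x.sum())                 # split of consumption, sums to ~P_target
```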

  17. Evaluation of Large-scale Public Sector Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Hanne Foss

    2017-01-01

    and more delimited policy areas take place. In our analysis we apply four governance perspectives (rational-instrumental, rational-interest based, institutional-cultural and a chaos perspective) in a comparative analysis of the evaluations of two large-scale public sector reforms in Denmark and Norway. We...

  18. Assessment of climate change impacts on rainfall using large scale ...

    Indian Academy of Sciences (India)

    Many of the applied techniques in water resources management can be directly or indirectly influenced by ... is based on large scale climate signals data around the world. In order ... predictand relationships are often very complex. .... constraints to solve the optimization problem. ..... social, and environmental sustainability.

  19. Factors Influencing Uptake of a Large Scale Curriculum Innovation.

    Science.gov (United States)

    Adey, Philip S.

    Educational research has all too often failed to be implemented on a large-scale basis. This paper describes the multiplier effect of a professional development program for teachers and for trainers in the United Kingdom, and how that program was developed, monitored, and evaluated. Cognitive Acceleration through Science Education (CASE) is a…

  20. ability in Large Scale Land Acquisitions in Kenya

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    Kenya's national planning strategy, Vision 2030. Agriculture, natural resource exploitation, and infrastructure ... acquisitions due to high levels of poverty and unclear or insecure land tenure rights in Kenya. Inadequate social ... lease to a private company over the expansive Yala Swamp to undertake large-scale irrigation farming.

  1. Large-scale silviculture experiments of western Oregon and Washington.

    Science.gov (United States)

    Nathan J. Poage; Paul D. Anderson

    2007-01-01

    We review 12 large-scale silviculture experiments (LSSEs) in western Washington and Oregon with which the Pacific Northwest Research Station of the USDA Forest Service is substantially involved. We compiled and arrayed information about the LSSEs as a series of matrices in a relational database, which is included on the compact disc published with this report and...

  2. Participatory Design and the Challenges of Large-Scale Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    With its 10th biannual anniversary conference, Participatory Design (PD) is leaving its teens and must now be considered ready to join the adult world. In this article we encourage the PD community to think big: PD should engage in large-scale information-systems development and opt for a PD...

  3. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    International Nuclear Information System (INIS)

    Youmei, Han; Bogang, Yang

    2014-01-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are mostly based on manual mapping and photogrammetry. Manual mapping is inefficient and limited by the environment, while photogrammetric methods (such as low-altitude aerial mapping) are an economical and effective way to map wide, regular areas at large scale but do not work well in small areas due to the high cost of manpower and resources. In recent years, vehicle-borne LIDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of vehicle-borne LIDAR technology for fast mapping of large-scale topographic maps based on a new Chinese vehicle-borne LIDAR system. It studied how to use this system's measurement technology to map large-scale topographic maps. After field data capture, maps can be produced in the office from the LIDAR data (point cloud) by software we programmed ourselves. In addition, the detailed process and an accuracy analysis are presented for an actual case. The results show that this new technology provides a fast method to generate large-scale topographic maps, with high efficiency and accuracy compared to traditional methods
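
    As one example of the office-processing step, the sketch below rasterizes a ground-classified point cloud into a DEM grid from which contour lines can be traced; the cell size and workflow are assumptions, not the authors' software:

```python
# Rasterize LIDAR ground points (x, y, z) into a mean-elevation DEM grid.
import numpy as np

def rasterize_dem(points, cell=0.5):
    """points: (N, 3) array of ground returns; returns a 2D elevation grid."""
    x, y, z = points.T
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    shape = (ix.max() + 1, iy.max() + 1)
    sums = np.zeros(shape)
    counts = np.zeros(shape)
    np.add.at(sums, (ix, iy), z)            # accumulate elevations per cell
    np.add.at(counts, (ix, iy), 1)
    dem = np.full(shape, np.nan)
    mask = counts > 0
    dem[mask] = sums[mask] / counts[mask]
    return dem

# Contour lines for the map sheet can then be traced, e.g. with
# matplotlib.pyplot.contour(dem.T, levels=...).
```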

  4. Reports on research programs in the field of reactor safety sponsored by the Federal Ministry for Research and Technology. Reported period: July 1 to December 31, 1986. Progress report

    International Nuclear Information System (INIS)

    1987-05-01

    Investigations on the safety of light water reactors (LWR) being performed in the framework of the research program on reactor safety (RS-projects) are sponsored by the Federal Ministry for Research and Technology (BMFT). The objective of this program is to investigate in greater detail the safety margins of nuclear power plants and their systems, and to further develop safety technology. Besides the investigations of LWR tasks, projects on the safety of advanced reactors are also sponsored by the BMFT. The individual reports are classified according to the BMFT's research program on the safety of LWRs 1977-1980. Another table of contents uses the same classification system as applied in the nuclear safety index of the CEC (Commission of the European Communities) and the OECD (Organization for Economic Cooperation and Development). The reports are arranged in the sequence of their project numbers. (orig.) [de

  5. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases.

    Science.gov (United States)

    Sawata, Hiroshi; Ueshima, Kenji; Tsutani, Kiichiro

    2011-04-14

    Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using Pub-Med, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of 152 trials, 9.2% of the trials examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed. To improve the quality of clinical

  6. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases

    Directory of Open Access Journals (Sweden)

    Tsutani Kiichiro

    2011-04-01

    Full Text Available Abstract Background Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. Methods We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using Pub-Med, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. Results We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of 152 trials, 9.2% of the trials examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not

  7. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Directory of Open Access Journals (Sweden)

    Xiangyun Xiao

    Full Text Available The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from the Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), the experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.

  8. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Science.gov (United States)

    Xiao, Xiangyun; Zhang, Wei; Zou, Xiufen

    2015-01-01

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.
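
    A much simplified sketch of the splitting idea (synchronous rather than asynchronous, and with a linear ODE model, for brevity) follows; the module partition is assumed to be given by a prior graph-partitioning step:

```python
# Fit each module's local interaction matrix in a separate process.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def fit_module(args):
    # Least-squares fit of a linear ODE model dx/dt ~ x @ W for one module.
    X, dXdt = args                       # (timepoints, module_genes) arrays
    W, *_ = np.linalg.lstsq(X, dXdt, rcond=None)
    return W

def infer_grn(X, dXdt, modules, workers=4):
    """modules: list of gene-index lists, e.g. [[0, 1, 2], [3, 4], ...]."""
    tasks = [(X[:, m], dXdt[:, m]) for m in modules]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fit_module, tasks))

# Run under `if __name__ == "__main__":` on platforms that spawn workers.
```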

  9. An interactive display system for large-scale 3D models

    Science.gov (United States)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

    With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, it is difficult to achieve real-time display of and interaction with large-scale 3D models in some common 3D display software, such as MeshLab. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance, and then use an out-of-core, view-dependent, multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming in the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal and external memory exchange mechanism, so that it is possible to display a large-scale reconstructed scene with over millions of 3D points or triangular meshes on a regular PC with only 4 GB RAM.
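
    A minimal sketch of the kind of view-dependent LOD selection such a renderer performs; the error model and threshold are illustrative assumptions, not the paper's scheme:

```python
# Pick the coarsest LOD level whose projected (screen-space) error stays
# below a pixel threshold at the current viewing distance.
def select_lod(distance, focal_px, geometric_errors, max_px_error=1.0):
    """geometric_errors: per-level object-space error, coarse -> fine."""
    for level, err in enumerate(geometric_errors):
        if err * focal_px / distance <= max_px_error:
            return level                     # coarsest acceptable level
    return len(geometric_errors) - 1         # fall back to the finest level

# Example: select_lod(200.0, 1000.0, [2.0, 0.5, 0.1, 0.02]) -> 2
```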

  10. Recent developments in large-scale ozone generation with dielectric barrier discharges

    Science.gov (United States)

    Lopez, Jose L.

    2014-10-01

    Large-scale ozone generation for industrial applications has been entirely based on the creation of microplasmas or microdischarges using dielectric barrier discharge (DBD) reactors. Although versions of DBD-generated ozone have been in continuous use for over a hundred years, especially in water treatment, recent changes in environmental awareness and sustainability have led to a surge of ozone generating facilities throughout the world. As a result of this enhanced global usage, various new discoveries have emerged in the science and technology of ozone generation. This presentation will describe some of the most recent breakthrough developments in large-scale ozone generation while further addressing some of the current scientific and engineering challenges of this technology.

  11. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  12. In-situ vitrification: a large-scale prototype for immobilizing radioactively contaminated waste

    International Nuclear Information System (INIS)

    Carter, J.G.; Buelt, J.L.

    1986-03-01

    Pacific Northwest Laboratory is developing the technology of in situ vitrification, a thermal treatment process for immobilizing radioactively contaminated soil. A permanent remedial action, the process incorporates radionuclides into a glass and crystalline form. The transportable process consists of an electrical power system to vitrify the soil, a hood to contain gaseous effluents, an off-gas treatment and cooling system, and a process control station. Large-scale testing of the in situ vitrification process is currently underway

  13. An economical device for carbon supplement in large-scale micro-algae production.

    Science.gov (United States)

    Su, Zhenfeng; Kang, Ruijuan; Shi, Shaoyuan; Cong, Wei; Cai, Zhaoling

    2008-10-01

    A simple but efficient carbon-supplying device was designed and developed, and the corresponding carbon-supplying technology is described. The absorption characteristics of the device were studied. The carbon-supplying system proved economical for large-scale cultivation of Spirulina sp. in an outdoor raceway pond: the absorptivity of gaseous carbon dioxide was raised above 78%, which could greatly reduce the production cost.

  14. Large-scale building integrated photovoltaics field trial. First technical report - installation phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of the first eighteen months of the Large-Scale Building Integrated Photovoltaic Field Trial, focussing on technical aspects. The project aims included increasing awareness and application of the technology, raising UK capabilities in applying the technology, and assessing the potential for building integrated photovoltaics (BIPV). Details are given of technology choices; project organisation, cost, and status; and the evaluation criteria. The BIPV installations described include university buildings, commercial centres, a sports stadium, a wildlife park, a church hall, and a district council building. Lessons learnt are discussed, and a further report covering monitoring aspects is planned.

  15. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Nusser, Adi [Physics Department and the Asher Space Science Institute-Technion, Haifa 32000 (Israel); Branchini, Enzo [Department of Physics, Universita Roma Tre, Via della Vasca Navale 84, 00146 Rome (Italy); Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu [Departments of Astronomy and Physics, University of California, Berkeley, CA 94720 (United States)

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender alongside traditional probes based on current and future distance indicator measurements.
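
    The standard conversion behind such measurements is v_t [km/s] = 4.74 x mu [arcsec/yr] x d [pc]. A small sketch using redshift as the distance proxy; the Hubble constant and the example values are illustrative assumptions.

        C_KMS = 299792.458   # speed of light, km/s
        H0 = 70.0            # assumed Hubble constant, km/s/Mpc

        def transverse_velocity(mu_uas_per_yr, z):
            """2D transverse peculiar velocity (km/s) from a proper motion in
            micro-arcsec/yr, using redshift as the distance proxy (low z)."""
            d_mpc = C_KMS * z / H0                 # Hubble-law distance
            return 4.74 * mu_uas_per_yr * d_mpc    # units cancel: uas/yr x Mpc

        print(transverse_velocity(2.0, 0.01))      # ~2 uas/yr at ~43 Mpc -> ~406 km/s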

  16. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    Science.gov (United States)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

    Destiny is a simple, direct, low cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize 23000 SN Ia events over the redshift interval 0.4 < z < 1.7. Destiny will be used in its third year as a high resolution, wide-field imager to conduct a weak lensing survey covering >1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  17. Volume measurement study for large scale input accountancy tank

    International Nuclear Information System (INIS)

    Uchikoshi, Seiji; Watanabe, Yuichi; Tsujino, Takeshi

    1999-01-01

    The Large Scale Tank Calibration (LASTAC) facility, including an experimental tank with the same volume and structure as the input accountancy tank of the Rokkasho Reprocessing Plant (RRP), was constructed at the Nuclear Material Control Center of Japan. Demonstration experiments have been carried out to evaluate the precision of solution volume measurement and to establish a procedure for highly accurate pressure measurement in a large scale tank with a dip-tube bubbler probe system, to be applied to the input accountancy tank of RRP. The solution volume in a tank is determined by substituting the solution level into a calibration function obtained in advance, which expresses the relation between the solution level and its volume in the tank. Precise solution volume measurement therefore requires a precise, carefully determined calibration function. The LASTAC calibration experiments using pure water showed good reproducibility. (J.P.N.)
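
    A hedged sketch of the calibration-function idea: fit a low-order polynomial of volume against level from calibration points, then substitute measured levels into it. The data values and polynomial degree below are invented for illustration.

        import numpy as np

        # Hypothetical calibration run: weighed water increments give (level, volume)
        levels_mm = np.array([100.0, 400.0, 800.0, 1200.0, 1600.0, 2000.0])
        volumes_l = np.array([210.0, 845.0, 1690.0, 2540.0, 3385.0, 4230.0])

        # Calibration function: low-order polynomial fit of volume against level
        coeffs = np.polyfit(levels_mm, volumes_l, deg=2)

        def solution_volume(level_mm):
            # Substitute a measured dip-tube level into the calibration function
            return np.polyval(coeffs, level_mm)

        print(round(solution_volume(1000.0), 1))   # interpolated volume in litres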

  18. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
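
    The prototype-based low-rank approximation of the kernel matrix resembles a Nystrom construction; the sketch below shows that generic idea (not the PVM itself), with an assumed RBF kernel and hypothetical sizes.

        import numpy as np

        def rbf(A, B, gamma=0.5):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        def nystrom(X, proto_idx, gamma=0.5):
            """K ~ C W^+ C^T using m prototype rows: O(nm) storage, not O(n^2)."""
            C = rbf(X, X[proto_idx], gamma)              # n x m cross-kernel
            W = rbf(X[proto_idx], X[proto_idx], gamma)   # m x m prototype kernel
            return C, np.linalg.pinv(W)

        rng = np.random.default_rng(0)
        X = rng.random((500, 8))
        C, W_pinv = nystrom(X, rng.choice(500, 40, replace=False))
        K_hat = C @ W_pinv @ C.T                         # low-rank approximation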

  19. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
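
    A rough sketch of how such an amplitude-modulation diagnostic can be computed: low-pass filter the signal to obtain the large-scale component, take the Hilbert envelope of the small-scale residue, and correlate the two. This follows the general approach in the literature, not necessarily the authors' exact procedure; the cutoff frequency and filter order are assumptions.

        import numpy as np
        from scipy.signal import hilbert, butter, filtfilt

        def modulation_coefficient(u, fs, f_cut):
            """Correlation between the large-scale velocity and the filtered
            envelope of the small-scale residue (an amplitude-modulation proxy).
            u: fluctuating velocity signal, fs: sample rate, f_cut: cutoff (Hz)."""
            b, a = butter(4, f_cut / (fs / 2.0), btype="low")
            u_large = filtfilt(b, a, u)            # large-scale component
            u_small = u - u_large                  # small-scale component
            env = np.abs(hilbert(u_small))         # small-scale envelope
            env_l = filtfilt(b, a, env)            # envelope at large scales
            x = u_large - u_large.mean()
            y = env_l - env_l.mean()
            return float((x * y).mean() / (x.std() * y.std()))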

  20. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
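
    A toy Monte Carlo of the preamble-collision bottleneck: each device picks one of a fixed set of preambles, and an attempt succeeds only if its preamble is chosen by no one else. The preamble count (54, an LTE-like value) and the one-shot, no-retry model are simplifying assumptions.

        import numpy as np

        def access_success_rate(n_devices, n_preambles=54, n_slots=10000, seed=0):
            """Fraction of devices whose random-access attempt does not collide."""
            rng = np.random.default_rng(seed)
            ok = 0
            for _ in range(n_slots):
                picks = rng.integers(0, n_preambles, n_devices)
                counts = np.bincount(picks, minlength=n_preambles)
                ok += int((counts == 1).sum())   # preambles chosen exactly once
            return ok / (n_slots * n_devices)

        for n in (10, 50, 200):
            print(n, round(access_success_rate(n), 3))  # success drops with load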

  1. Cosmic ray acceleration by large scale galactic shocks

    International Nuclear Information System (INIS)

    Cesarsky, C.J.; Lagage, P.O.

    1987-01-01

    The mechanism of diffusive shock acceleration may account for the existence of galactic cosmic rays; detailed applications to stellar wind shocks and especially to supernova shocks have been developed. Existing models can usually deal with the energetics or the spectral slope, but the observed energy range of cosmic rays is not explained. It therefore seems worthwhile to examine the effect that large scale, long-lived galactic shocks may have on galactic cosmic rays, in the framework of the diffusive shock acceleration mechanism. Large scale fast shocks can only be expected to exist in the galactic halo. We consider three situations where they may arise: expansion of a supernova shock into the halo, galactic wind, and galactic infall; and discuss the possible existence of these shocks and their role in accelerating cosmic rays

  2. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses on avoiding redundancy for users working on the same task. While this improves the effectiveness of the user work process, the underlying query processing engine is typically considered a "black box" and left unchanged. Research in multiple query processing, on the other hand, ignores the application level and the opportunity to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems.

  3. Lagrangian space consistency relation for large scale structure

    International Nuclear Information System (INIS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-01-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space
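
    For orientation, a schematic form of the (Eulerian, unequal-time) large-scale structure consistency relation that the abstract builds on; the notation and prefactors here are a sketch and should be checked against the paper itself. D is the linear growth factor and primes denote correlators with the momentum-conserving delta function stripped.

        \lim_{q \to 0}
        \frac{\big\langle \delta_{\mathbf q}(t)\,\delta_{\mathbf k_1}(t_1)\cdots\delta_{\mathbf k_N}(t_N)\big\rangle'}
             {P_\delta(q, t)}
        = -\sum_{i=1}^{N} \frac{D(t_i)}{D(t)}\,
          \frac{\mathbf k_i \cdot \mathbf q}{q^2}\,
          \big\langle \delta_{\mathbf k_1}(t_1)\cdots\delta_{\mathbf k_N}(t_N)\big\rangle'

    At equal times (all t_i = t), momentum conservation (the k_i sum to zero) makes the right-hand side vanish, which is the statement the authors recast as the vanishing of the suitably normalized squeezed correlation function in Lagrangian space.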

  4. The Large-scale Effect of Environment on Galactic Conformity

    Science.gov (United States)

    Sun, Shuangpeng; Guo, Qi; Wang, Lan; Wang, Jie; Gao, Liang; Lacey, Cedric G.; Pan, Jun

    2018-04-01

    We use a volume-limited galaxy sample from the SDSS Data Release 7 to explore the dependence of galactic conformity on the large-scale environment, measured on ~4 Mpc scales. We find that the star formation activity of neighbour galaxies depends more strongly on the environment than on the activity of their primary galaxies. In under-dense regions most neighbour galaxies tend to be active, while in over-dense regions neighbour galaxies are mostly passive, regardless of the activity of their primary galaxies. At a given stellar mass, passive primary galaxies reside in higher density regions than active primary galaxies, leading to the apparently strong conformity signal. The dependence of the activity of neighbour galaxies on environment can be explained by the corresponding dependence of the fraction of satellite galaxies. Similar results are found for galaxies in a semi-analytical model, suggesting that no new physics is required to explain the observed large-scale conformity.

  5. Electron drift in a large scale solid xenon

    International Nuclear Information System (INIS)

    Yoo, J.; Jaskierny, W.F.

    2015-01-01

    A study of charge drift in large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while in the solid phase (157 K) it is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. It is thus demonstrated that the electron drift speed in solid xenon is a factor of two faster than that in the liquid in a large scale solid xenon system

  6. Active power reserves evaluation in large scale PVPPs

    DEFF Research Database (Denmark)

    Crăciun, Bogdan-Ionut; Kerekes, Tamas; Sera, Dezso

    2013-01-01

    The present trend of investing in renewable ways of producing electricity to the detriment of conventional fossil fuel-based plants will reach a point where these plants have to provide ancillary services and contribute to overall grid stability. Photovoltaic (PV) power has the fastest growth among all renewable energies and has managed to reach high penetration levels, creating instabilities which at the moment are corrected by conventional generation. This paradigm will change in future scenarios where most of the power is supplied by large scale renewable plants and parts of the ancillary services have to be shared by the renewable plants. The main focus of the proposed paper is to technically and economically analyze the possibility of having active power reserves in large scale PV power plants (PVPPs) without any auxiliary storage equipment. The provided reserves should...

  7. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given complex real-time water conditions, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance the numerical stability. An adaptive method is proposed to improve the running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
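
    A minimal 1D sketch of the ingredients named above (a Godunov-type finite volume update with a simple wet/dry guard), far simpler than the paper's 2D unstructured model; the Rusanov flux, CFL number and dam-break setup are illustrative choices.

        import numpy as np

        def shallow_water_1d(h, hu, dx, t_end, g=9.81, h_dry=1e-6, cfl=0.4):
            """First-order Godunov-type update with a Rusanov flux; boundary
            treatment is omitted for brevity (end cells are held fixed)."""
            t = 0.0
            while t < t_end:
                u = np.where(h > h_dry, hu / np.maximum(h, h_dry), 0.0)  # wet/dry
                c = np.sqrt(g * np.maximum(h, 0.0))
                dt = min(cfl * dx / (np.abs(u) + c + 1e-12).max(), t_end - t)
                U = np.vstack([h, hu])
                F = np.vstack([hu, hu * u + 0.5 * g * h**2])
                # Local wave-speed bound at each interface i+1/2
                a = np.maximum(np.abs(u[:-1]) + c[:-1], np.abs(u[1:]) + c[1:])
                Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
                U[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])
                h, hu = np.maximum(U[0], 0.0), U[1]
                t += dt
            return h, hu

        # Dam break over a dry bed on a 1D reach
        h0 = np.where(np.arange(200) < 100, 2.0, 0.0)
        h, hu = shallow_water_1d(h0.copy(), np.zeros(200), dx=1.0, t_end=5.0)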

  8. A large-scale soil-structure interaction experiment: Part I design and construction

    International Nuclear Information System (INIS)

    Tang, H.T.; Tang, Y.K.; Wall, I.B.; Lin, E.

    1987-01-01

    In the simulated earthquake experiments (SIMQUAKE) sponsored by EPRI, the detonation of vertical arrays of explosives propagated wave motions through the ground to the model structures. Although such a simulation can provide information about dynamic soil-structure interaction (SSI) characteristics in a strong motion environment, it lacks the seismic wave scattering characteristics needed for studying seismic input to the soil-structure system and the effect of different kinds of wave composition on the soil-structure response. To address the inadequacy of the simulated earthquake SSI experiment, the Electric Power Research Institute (EPRI) and the Taiwan Power Company (Taipower) jointly sponsored a large scale SSI experiment in the field. The objectives of the experiment are: (1) to obtain a database induced by actual strong-motion earthquakes in a soft-soil environment, which will substantiate predictive and design SSI models; and (2) to assess the dynamic response and margins of nuclear power plant reactor containment internal components under actual earthquake-induced excitation. These objectives are accomplished by recording and analyzing data from two instrumented, scaled-down (1/4- and 1/12-scale) reinforced concrete containments sited in a highly seismic region of Taiwan where a strong-motion seismic array network is located

  9. Large-scale building energy efficiency retrofit: Concept, model and control

    International Nuclear Information System (INIS)

    Wu, Zhou; Wang, Bo; Xia, Xiaohua

    2016-01-01

    BEER (Building energy efficiency retrofit) projects are initiated in many nations and regions over the world. Existing studies of BEER focus on modeling and planning based on one building and one year period of retrofitting, which cannot be applied to certain large BEER projects with multiple buildings and multi-year retrofit. In this paper, the large-scale BEER problem is defined in a general TBT (time-building-technology) framework, which fits essential requirements of real-world projects. The large-scale BEER is newly studied in the control approach rather than the optimization approach commonly used before. Optimal control is proposed to design optimal retrofitting strategy in terms of maximal energy savings and maximal NPV (net present value). The designed strategy is dynamically changing on dimensions of time, building and technology. The TBT framework and the optimal control approach are verified in a large BEER project, and results indicate that promising performance of energy and cost savings can be achieved in the general TBT framework. - Highlights: • Energy efficiency retrofit of many buildings is studied. • A TBT (time-building-technology) framework is proposed. • The control system of the large-scale BEER is modeled. • The optimal retrofitting strategy is obtained.
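
    To make the TBT idea concrete, here is a deliberately naive greedy sketch that ranks retrofit options by energy saving per unit cost under a per-year budget; the paper's optimal-control formulation is more sophisticated, and all option data below are invented.

        def plan_retrofits(options, yearly_budget):
            """Greedy time-building-technology (TBT) plan: fill each year's
            budget with the options offering the best saving/cost ratio."""
            plan, spent = [], {}
            ranked = sorted(options, key=lambda o: o["saving"] / o["cost"],
                            reverse=True)
            for opt in ranked:
                year = opt["year"]
                if spent.get(year, 0.0) + opt["cost"] <= yearly_budget:
                    plan.append(opt)
                    spent[year] = spent.get(year, 0.0) + opt["cost"]
            return plan

        options = [
            {"year": 1, "building": "A", "tech": "LED lighting", "cost": 40.0, "saving": 18.0},
            {"year": 1, "building": "B", "tech": "heat pump",    "cost": 90.0, "saving": 30.0},
            {"year": 2, "building": "A", "tech": "insulation",   "cost": 60.0, "saving": 21.0},
        ]
        print([o["tech"] for o in plan_retrofits(options, yearly_budget=100.0)])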

  10. Some Statistics for Measuring Large-Scale Structure

    OpenAIRE

    Brandenberger, Robert H.; Kaplan, David M.; Ramsey, Stephen A.

    1993-01-01

    Good statistics for measuring large-scale structure in the Universe must be able to distinguish between different models of structure formation. In this paper, two and three dimensional ``counts in cell" statistics and a new ``discrete genus statistic" are applied to toy versions of several popular theories of structure formation: random phase cold dark matter model, cosmic string models, and global texture scenario. All three statistics appear quite promising in terms of differentiating betw...
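
    A minimal counts-in-cells sketch: bin points into a cubic grid and compute low-order moments of the cell counts, which differ between structure-formation models. For a Poisson (random-phase-like) point set the variance tracks the mean; the box size and cell count below are arbitrary.

        import numpy as np

        def counts_in_cells(points, box, n_cells):
            """Mean, variance and third central moment of 3D cell counts."""
            H, _ = np.histogramdd(points, bins=(n_cells,) * 3,
                                  range=[(0.0, box)] * 3)
            N = H.ravel()
            return N.mean(), N.var(), ((N - N.mean()) ** 3).mean()

        pts = np.random.default_rng(1).uniform(0.0, 100.0, size=(20000, 3))
        print(counts_in_cells(pts, box=100.0, n_cells=10))  # mean ~ 20, var ~ 20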

  11. Foundations of Large-Scale Multimedia Information Management and Retrieval

    CERN Document Server

    Chang, Edward Y

    2011-01-01

    "Foundations of Large-Scale Multimedia Information Management and Retrieval - Mathematics of Perception" covers knowledge representation and semantic analysis of multimedia data and scalability in signal extraction, data mining, and indexing. The book is divided into two parts: Part I - Knowledge Representation and Semantic Analysis focuses on the key components of mathematics of perception as it applies to data management and retrieval. These include feature selection/reduction, knowledge representation, semantic analysis, distance function formulation for measuring similarity, and

  12. PKI security in large-scale healthcare networks

    OpenAIRE

    Mantas, G.; Lymberopoulos, D.; Komninos, N.

    2012-01-01

    During the past few years a lot of PKI (Public Key Infrastructures) infrastructures have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, there is a plethora of challenges in these healthcare PKI infrastructures. Especially, there are a lot of challenges for PKI infrastructures deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a ...

  13. Experimental simulation of microinteractions in large scale explosions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, X.; Luo, R.; Yuen, W.W.; Theofanous, T.G. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    This paper presents data and analysis of recent experiments conducted in the SIGMA-2000 facility to simulate microinteractions in large scale explosions. Specifically, the fragmentation behavior of a high temperature molten steel drop under high pressure (beyond critical) conditions is investigated. The current data demonstrate, for the first time, the effect of high pressure in suppressing the thermal effect of fragmentation under supercritical conditions. The results support the microinteractions idea, and the ESPROSE.m prediction of fragmentation rate. (author)

  14. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ~5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  15. A Classification Framework for Large-Scale Face Recognition Systems

    OpenAIRE

    Zhou, Ziheng; Deravi, Farzin

    2009-01-01

    This paper presents a generic classification framework for large-scale face recognition systems. Within the framework, a data sampling strategy is proposed to tackle the data imbalance when image pairs are sampled from thousands of face images for preparing a training dataset. A modified kernel Fisher discriminant classifier is proposed to make it computationally feasible to train the kernel-based classification method using tens of thousands of training samples. The framework is tested in an...

  16. Large Scale Visual Recommendations From Street Fashion Images

    OpenAIRE

    Jagadeesh, Vignesh; Piramuthu, Robinson; Bhardwaj, Anurag; Di, Wei; Sundaresan, Neel

    2014-01-01

    We describe a completely automated large scale visual recommendation system for fashion. Our focus is to efficiently harness the availability of large quantities of online fashion images and their rich meta-data. Specifically, we propose four data driven models in the form of Complementary Nearest Neighbor Consensus, Gaussian Mixture Models, Texture Agnostic Retrieval and Markov Chain LDA for solving this problem. We analyze relative merits and pitfalls of these algorithms through extensive e...

  17. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

    In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2, design improvements for further cost reduction and establishment of the plant concept have been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics identified in the previous fiscal year were examined and the plant concept was modified. Furthermore, fundamental specifications of the main systems and components were set and the economics were evaluated. In addition, for the interim evaluation of the candidate concepts of the FBR fuel cycle, the cost effectiveness and achievability of the development goal were evaluated and the data for the three large-scale reactor candidate concepts were prepared. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has a prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection and narrowing down of candidate concepts at the end of Phase 2. (author)

  18. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    Shimakawa, Yoshio; Nibe, Nobuaki; Hori, Toru

    2002-05-01

    In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop type reactor. Through the design study, it is intended to construct a plant concept that can show its attraction and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, the first year of Phase 2. In the JFY2001 design study, a plant concept was constructed based on the design of the advanced loop type reactor, and fundamental specifications of the main systems and components were set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability and economy were examined and evaluated. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has a prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection and narrowing down of candidate concepts at the end of Phase 2. (author)

  19. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    This report assesses selected large-scale NASA projects, ranging from probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the universe.

  20. Perturbation theory instead of large scale shell model calculations

    International Nuclear Information System (INIS)

    Feldmeier, H.; Mankos, P.

    1977-01-01

    Results of large scale shell model calculations for (sd)-shell nuclei are compared with perturbation theory. It is found that perturbation theory provides an excellent approximation when the SU(3) basis is used as a starting point. The results indicate that a perturbation theory treatment in an SU(3) basis including 2ħω excitations should be preferable to a full diagonalization within the (sd)-shell. (orig.) [de]

  1. Primordial large-scale electromagnetic fields from gravitoelectromagnetic inflation

    Energy Technology Data Exchange (ETDEWEB)

    Membiela, Federico Agustin [Departamento de Fisica, Facultad de Ciencias Exactas y Naturales, Universidad Nacional de Mar del Plata, Funes 3350, (7600) Mar del Plata (Argentina); Consejo Nacional de Investigaciones Cientificas y Tecnicas (CONICET) (Argentina)], E-mail: membiela@mdp.edu.ar; Bellini, Mauricio [Departamento de Fisica, Facultad de Ciencias Exactas y Naturales, Universidad Nacional de Mar del Plata, Funes 3350, (7600) Mar del Plata (Argentina); Consejo Nacional de Investigaciones Cientificas y Tecnicas (CONICET) (Argentina)], E-mail: mbellini@mdp.edu.ar

    2009-04-20

    We investigate the origin and evolution of primordial electric and magnetic fields in the early universe, when the expansion is governed by a cosmological constant Λ₀. Using the gravitoelectromagnetic inflationary formalism with A₀ = 0, we obtain the power spectra of large-scale magnetic fields and of the inflaton field fluctuations during inflation. A very important fact is that our formalism is naturally non-conformally invariant.

  2. Primordial large-scale electromagnetic fields from gravitoelectromagnetic inflation

    Science.gov (United States)

    Membiela, Federico Agustín; Bellini, Mauricio

    2009-04-01

    We investigate the origin and evolution of primordial electric and magnetic fields in the early universe, when the expansion is governed by a cosmological constant Λ0. Using the gravitoelectromagnetic inflationary formalism with A0 = 0, we obtain the power spectra of large-scale magnetic fields and of the inflaton field fluctuations during inflation. A very important fact is that our formalism is naturally non-conformally invariant.

  3. Primordial large-scale electromagnetic fields from gravitoelectromagnetic inflation

    International Nuclear Information System (INIS)

    Membiela, Federico Agustin; Bellini, Mauricio

    2009-01-01

    We investigate the origin and evolution of primordial electric and magnetic fields in the early universe, when the expansion is governed by a cosmological constant Λ₀. Using the gravitoelectromagnetic inflationary formalism with A₀ = 0, we obtain the power spectra of large-scale magnetic fields and of the inflaton field fluctuations during inflation. A very important fact is that our formalism is naturally non-conformally invariant.

  4. Rotation invariant fast features for large-scale recognition

    Science.gov (United States)

    Takacs, Gabriel; Chandrasekhar, Vijay; Tsai, Sam; Chen, David; Grzeszczuk, Radek; Girod, Bernd

    2012-10-01

    We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF while producing large-scale retrieval results that are comparable to SIFT. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.

  5. On a Game of Large-Scale Projects Competition

    Science.gov (United States)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretical control problems motivated by economic decision making situations arising in realization of large-scale projects, such as designing and putting into operations the new gas or oil pipelines. A non-cooperative two player game is considered with payoff functions of special type for which standard existence theorems and algorithms for searching Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].

  6. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.

  7. Large scale 2D spectral compressed sensing in continuous domain

    KAUST Repository

    Cai, Jian-Feng

    2017-06-20

    We consider the problem of spectral compressed sensing in continuous domain, which aims to recover a 2-dimensional spectrally sparse signal from partially observed time samples. The signal is assumed to be a superposition of s complex sinusoids. We propose a semidefinite program for the 2D signal recovery problem. Our model is able to handle large scale 2D signals of size 500 × 500, whereas traditional approaches only handle signals of size around 20 × 20.

  8. Large scale 2D spectral compressed sensing in continuous domain

    KAUST Repository

    Cai, Jian-Feng; Xu, Weiyu; Yang, Yang

    2017-01-01

    We consider the problem of spectral compressed sensing in continuous domain, which aims to recover a 2-dimensional spectrally sparse signal from partially observed time samples. The signal is assumed to be a superposition of s complex sinusoids. We propose a semidefinite program for the 2D signal recovery problem. Our model is able to handle large scale 2D signals of size 500 × 500, whereas traditional approaches only handle signals of size around 20 × 20.

  9. Large-scale nuclear energy from the thorium cycle

    International Nuclear Information System (INIS)

    Lewis, W.B.; Duret, M.F.; Craig, D.S.; Veeder, J.I.; Bain, A.S.

    1973-02-01

    The thorium fuel cycle in CANDU (Canada Deuterium Uranium) reactors challenges breeders and fusion as the simplest means of meeting the world's large-scale demands for energy for centuries. Thorium oxide fuel allows high power density with excellent neutron economy. The combination of thorium fuel with an organic coolant promises easy maintenance and high availability of the whole plant. The total fuelling cost, including charges on the inventory, is estimated to be attractively low. (author) [fr]

  10. Large scale particle image velocimetry with helium filled soap bubbles

    Energy Technology Data Exchange (ETDEWEB)

    Bosbach, Johannes; Kuehn, Matthias; Wagner, Claus [German Aerospace Center (DLR), Institute of Aerodynamics and Flow Technology, Goettingen (Germany)

    2009-03-15

    The application of particle image velocimetry (PIV) to measurement of flows on large scales is a challenging necessity especially for the investigation of convective air flows. Combining helium filled soap bubbles as tracer particles with high power quality switched solid state lasers as light sources allows conducting PIV on scales of the order of several square meters. The technique was applied to mixed convection in a full scale double aisle aircraft cabin mock-up for validation of computational fluid dynamics simulations. (orig.)

  11. Fast, large-scale hologram calculation in wavelet domain

    Science.gov (United States)

    Shimobaba, Tomoyoshi; Matsushima, Kyoji; Takahashi, Takayuki; Nagahama, Yuki; Hasegawa, Satoki; Sano, Marie; Hirayama, Ryuji; Kakue, Takashi; Ito, Tomoyoshi

    2018-04-01

    We propose a large-scale hologram calculation using WAvelet ShrinkAge-Based superpositIon (WASABI), a wavelet transform-based algorithm. An image-type hologram calculated using the WASABI method is printed on a glass substrate at a resolution of 65,536 × 65,536 pixels and a pixel pitch of 1 μm. The hologram calculation time amounts to approximately 354 s on a commercial CPU, which is approximately 30 times faster than conventional methods.

  12. Evolutionary leap in large-scale flood risk assessment needed

    OpenAIRE

    Vorogushyn, Sergiy; Bates, Paul D.; de Bruijn, Karin; Castellarin, Attilio; Kreibich, Heidi; Priest, Sally J.; Schröter, Kai; Bagli, Stefano; Blöschl, Günter; Domeneghetti, Alessio; Gouldby, Ben; Klijn, Frans; Lammersen, Rita; Neal, Jeffrey C.; Ridder, Nina

    2018-01-01

    Current approaches for assessing large-scale flood risks contravene the fundamental principles of the flood risk system functioning because they largely ignore basic interactions and feedbacks between atmosphere, catchments, river-floodplain systems and socio-economic processes. As a consequence, risk analyses are uncertain and might be biased. However, reliable risk estimates are required for prioritizing national investments in flood risk mitigation or for appraisal and management of insura...

  13. Large scale particle image velocimetry with helium filled soap bubbles

    Science.gov (United States)

    Bosbach, Johannes; Kühn, Matthias; Wagner, Claus

    2009-03-01

    The application of Particle Image Velocimetry (PIV) to measurement of flows on large scales is a challenging necessity especially for the investigation of convective air flows. Combining helium filled soap bubbles as tracer particles with high power quality switched solid state lasers as light sources allows conducting PIV on scales of the order of several square meters. The technique was applied to mixed convection in a full scale double aisle aircraft cabin mock-up for validation of Computational Fluid Dynamics simulations.

  14. Large-scale Health Information Database and Privacy Protection*1

    OpenAIRE

    YAMAMOTO, Ryuichi

    2016-01-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law...

  15. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of 10 nm or so and a wall thickness of a few graphenes. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  16. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
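
    As a hedged reading of the result summarized above (a paraphrase to be checked against the paper): ecological diffusion with a rapidly varying motility homogenizes to a constant-coefficient equation governed by the harmonic average of the motility.

        % Ecological diffusion with spatially varying motility \mu(x):
        \frac{\partial u}{\partial t} = \nabla^2\!\left[\mu(\mathbf x)\, u\right],
        \qquad u \approx \frac{c}{\mu},
        \qquad
        \frac{\partial c}{\partial t} = \bar{\mu}\, \nabla^2 c,
        \qquad
        \bar{\mu} = \big\langle \mu^{-1} \big\rangle^{-1}

    The harmonic average weights low-motility patches heavily, which is how small-scale habitat variability leaves its imprint on the large-scale movement equation.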

  17. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models which are related to weather forecast and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products along with ground observations provide another avenue for investigating how the precipitation uncertainty would affect the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North America gridded products including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, results provide an assessment of possible applications for various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing performance of spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.

  18. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
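
    The bilateral update described in the abstract can be written compactly as follows (a schematic; line-search and stopping details are omitted):

        v_k = \text{leading eigenvector of } -\nabla f(X_k), \qquad
        X_{k+1} = \alpha_k X_k + \beta_k\, v_k v_k^{\mathsf T},
        \qquad
        (\alpha_k, \beta_k) = \operatorname*{arg\,min}_{\alpha,\,\beta \ge 0}
          f\!\left(\alpha X_k + \beta\, v_k v_k^{\mathsf T}\right)

    Nonnegative coefficients keep the iterate positive semidefinite, the objective decreases at every iteration, and, per the abstract, the iterates converge to the global optimum at rate O(1/k).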

  19. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show that, we adopted an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523 to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli being of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in human visual sensory system, revealed that visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity being present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.

  20. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of 10 nm or so and a wall thickness of a few graphenes. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  1. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutron sensor in three different running modes: pulse, fluctuation and current. The study described in this note comprises three parts: - A theoretical study of the large scale channel and a brief description of it are given, and the results obtained so far in this domain are presented. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale switching is described and the test results are given. In this large scale channel, the data processing method is analogue. - To become independent of the problems created by analogue processing of the fluctuation signal, a digital data processing method is tested and its validity is demonstrated. The results obtained on a test system built according to this method are given and a preliminary plan for further research is outlined [fr]

  2. Critical thinking, politics on a large scale and media democracy

    Directory of Open Access Journals (Sweden)

    José Antonio IBÁÑEZ-MARTÍN

    2015-06-01

    Full Text Available A first look at current social reality offers numerous reasons for concern. The spectacle of violence and immorality can easily frighten us. More worrying still is the observation that the horizon of conviviality, peace and well-being that Europe had been developing since the Treaty of Rome of 1957 has been seriously compromised by the economic crisis. Today we face an assault on democratic politics, which is described, by the media democracy, as an exhausted system that must be changed into a new and great politics, a politics on a large scale. The article analyses the concept of a politics on a large scale, primarily attending to Nietzsche, and noting its union with the great philosophy and the great education. The study of the texts of Nietzsche leads us to the conclusion that in them we often find an interesting analysis of the problems together with misguided proposals for solutions. We cannot pretend to suggest solutions to all the problems, but we outline various proposals for changes to political activity that can reasonably be defended against the media democracy. In conclusion, we point out that a politics on a large scale requires statesmen, able to suggest modes of life in common that can structure a long-term coexistence.

  3. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  4. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data poses a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a serious bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. A shared-memory architecture is used to construct the similarity matrix, and a distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene (microarray) data and detecting families in large protein superfamilies.
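
    For reference, a plain single-machine sketch of the affinity propagation message passing that the paper parallelizes is given below (Frey and Dueck's responsibility/availability updates); the damping, iteration count and toy data are assumptions, and the paper's data-partitioning and communication scheme is omitted.

```python
# A minimal NumPy sketch of affinity propagation message passing;
# the paper's parallel/distributed machinery is deliberately left out.
import numpy as np

def affinity_propagation(S, damping=0.9, iters=200):
    n = S.shape[0]
    R = np.zeros((n, n))   # responsibilities
    A = np.zeros((n, n))   # availabilities
    idx = np.arange(n)
    for _ in range(iters):
        # r(i,k) = s(i,k) - max_{k' != k} (a(i,k') + s(i,k'))
        M = A + S
        row_max_idx = M.argmax(axis=1)
        row_max = M[idx, row_max_idx]
        M[idx, row_max_idx] = -np.inf
        row_second = M.max(axis=1)
        Rnew = S - row_max[:, None]
        Rnew[idx, row_max_idx] = S[idx, row_max_idx] - row_second
        R = damping * R + (1 - damping) * Rnew
        # a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        Rp[idx, idx] = R[idx, idx]          # keep raw r(k,k) in column sums
        col = Rp.sum(axis=0)[None, :] - Rp  # sum over i' != i
        Anew = np.minimum(0, col)
        Anew[idx, idx] = col[idx, idx]      # a(k,k) is not clipped
        A = damping * A + (1 - damping) * Anew
    return np.unique((A + R).argmax(axis=1))  # exemplar indices

# toy usage: similarities are negative squared distances, diagonal = preference
X = np.random.default_rng(2).normal(size=(60, 2))
X[:30] += 4.0                               # two well-separated blobs
S = -((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(S, np.median(S))
print("exemplars:", affinity_propagation(S))
```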

  5. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using a dedicated algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were each solved with the GPU implementation to test the performance of the package. Comparing the results of the solver executed on a single CPU with those on the GPU shows that the GPU version runs up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
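
    A minimal CPU-only sketch of the semi-implicit Fourier step underlying such packages, applied to the Allen-Cahn equation, is shown below; the grid size, mobility and gradient coefficient are illustrative, and the CUDA layer is omitted.

```python
# A minimal sketch of the semi-implicit Fourier method for the Allen-Cahn
# equation; M, kappa, dt and the grid are illustrative assumptions.
import numpy as np

N, dx, dt = 128, 1.0, 0.1
M, kappa = 1.0, 1.0

k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
k2 = k[:, None] ** 2 + k[None, :] ** 2      # |k|^2 on the 2-D grid

rng = np.random.default_rng(3)
phi = 0.01 * rng.standard_normal((N, N))    # small random initial field

for step in range(1000):
    # dphi/dt = -M * (f'(phi) - kappa * lap(phi)),  f'(phi) = phi^3 - phi.
    # Nonlinear term explicit, stiff linear term implicit in Fourier space:
    nonlinear = np.fft.fft2(phi ** 3 - phi)
    phi_hat = (np.fft.fft2(phi) - dt * M * nonlinear) / (1 + dt * M * kappa * k2)
    phi = np.real(np.fft.ifft2(phi_hat))

print("phase fractions:", (phi > 0).mean(), (phi < 0).mean())
```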

  6. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, such as reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and to the requirements for a reliable transfer of the results to an actual vessel. At the same time, the possibilities of small-scale model experiments are analysed, mostly in connection with the transfer of results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  7. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng; Yuan, Ganzhao; Ghanem, Bernard

    2013-01-01

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks at large scale, scalability and computational efficiency are desirable properties of a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy for solving general semidefinite programs on large-scale datasets. Compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can easily be extended to handle low-rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
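
    The sketch below illustrates the flavor of a bilateral rank-1 update on a toy problem (projecting a symmetric matrix onto the PSD cone); the quadratic objective and the clipped 2x2 line search are simplifications for illustration, not the paper's general algorithm.

```python
# A BILGO-style bilateral rank-1 update sketch for the toy PSD problem
# min_X 0.5*||X - C||_F^2; everything here is an illustrative simplification.
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((30, 30))
C = (B + B.T) / 2                         # symmetric target (indefinite)

X = np.zeros_like(C)                      # PSD iterate
for it in range(100):
    G = X - C                             # gradient of the objective
    # leading eigenvector of the descent direction -G gives the rank-1 atom
    w, V = np.linalg.eigh(-G)
    if w[-1] <= 1e-12:                    # no ascent direction left: done
        break
    v = V[:, -1]
    vvT = np.outer(v, v)
    # bilateral step X <- alpha*X + beta*v v^T, with (alpha, beta) from an
    # exact 2x2 least squares for the quadratic objective (<vvT,vvT> = 1)
    Gm = np.array([[np.sum(X * X), np.sum(X * vvT)],
                   [np.sum(X * vvT), 1.0]])
    b = np.array([np.sum(X * C), np.sum(vvT * C)])
    coef = np.linalg.lstsq(Gm, b, rcond=None)[0]
    alpha, beta = np.maximum(coef, 0.0)   # clipping keeps the iterate PSD
    X = alpha * X + beta * vvT

print("objective:", 0.5 * np.linalg.norm(X - C) ** 2)
```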

  8. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large-scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large-scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on classic Boolean visibility, which is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. A case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). A multiple Boolean viewshed analysis and a global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large-scale models are an appropriate data source for visibility analyses on the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
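
    For concreteness, a minimal Boolean viewshed computation of the kind discussed is sketched below: a cell is visible when it subtends a larger vertical angle at the observer than every cell along the sight line; the toy DEM, observer height and sampling density are assumptions.

```python
# A minimal Boolean viewshed sketch on a gridded surface model.
import numpy as np

def visible(dem, obs, obs_height=1.7, samples_per_cell=2):
    """Return a Boolean raster of cells visible from `obs` (row, col)."""
    nrows, ncols = dem.shape
    oz = dem[obs] + obs_height
    out = np.zeros_like(dem, dtype=bool)
    out[obs] = True
    for r in range(nrows):
        for c in range(ncols):
            if (r, c) == obs:
                continue
            dist = np.hypot(r - obs[0], c - obs[1])
            n = max(2, int(dist * samples_per_cell))
            rr = np.linspace(obs[0], r, n)
            cc = np.linspace(obs[1], c, n)
            zz = dem[rr.round().astype(int), cc.round().astype(int)]
            dd = np.hypot(rr - obs[0], cc - obs[1])
            tan = (zz - oz) / np.maximum(dd, 1e-9)  # elevation angle tangent
            out[r, c] = tan[-1] >= tan[:-1].max()   # target tops every barrier
    return out

dem = np.add.outer(np.linspace(0, 5, 50), np.zeros(50))  # gentle planar slope
vs = visible(dem, obs=(25, 25))
print("visible fraction:", vs.mean())
```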

  9. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains time-consuming, manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which, unlike existing techniques, can recover missing or occluded texture information by integrating information captured from multiple optical sensors (ground, aerial, and satellite).

  10. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used for visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI to visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
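
    A minimal CPU sketch of the per-pixel linked list idea is given below; the field names and segment payload are illustrative, not the thesis' actual GPU memory layout.

```python
# A minimal sketch of a per-pixel linked list: each screen pixel stores the
# head of a list of projected segment records; the payload is illustrative.
import numpy as np

class PerPixelLists:
    def __init__(self, width, height):
        self.head = np.full((height, width), -1, dtype=np.int64)  # -1 = empty
        self.next = []          # next-node index per record
        self.depth = []         # per-fragment depth (for sorting/compositing)
        self.payload = []       # e.g. pathline id, scalar for color-coding

    def insert(self, x, y, depth, payload):
        idx = len(self.next)
        self.next.append(self.head[y, x])   # new node points at old head
        self.depth.append(depth)
        self.payload.append(payload)
        self.head[y, x] = idx               # pixel now points at new node

    def fragments(self, x, y):
        """Walk the list at pixel (x, y), e.g. to filter or color-code."""
        idx = self.head[y, x]
        while idx != -1:
            yield self.depth[idx], self.payload[idx]
            idx = self.next[idx]

ppl = PerPixelLists(4, 4)
ppl.insert(1, 2, depth=0.7, payload=("pathline 42", 0.3))
ppl.insert(1, 2, depth=0.4, payload=("pathline 7", 0.9))
print(sorted(ppl.fragments(1, 2)))   # front-to-back order for compositing
```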

  11. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication with the control room, which is required to dispatch the necessary additional support. Only with a clear concept, to which everyone adheres, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, treatment and procedure algorithms have proven successful. For the evacuation of casualties, a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to the optimization of rescue measures, regular training and exercises are advisable, as is the analysis of previous large-scale Alpine accidents.

  12. Research and development project for a large-scale industrial technology in fiscal 1992. Research and development of an advanced function creating and processing technology /Development of an advanced function creating and processing technology (Report on work achievements); 1992 nend senshin kino soshutsu kako gijutsu no kenkyu kaihatsu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-01-01

    A joint research project has been performed between the Material Engineering Technology Research Institute of the National Institute of Materials and Chemical Research of the Ministry of International Trade and Industry and the Advanced Function Creating and Processing Technology Research Association. The research themes are the 'identification of the basic conditions for production of ultrafine ceramic particles using the hybrid high-frequency plasma process' and the 'establishment of non-destructive analysis technology for functionally gradient materials'. This paper reports the achievements in fiscal 1992. The work covered hybridization of high-frequency plasmas and fabrication of ultrafine alumina particles; trial fabrication and evaluation of alumina-titanium mixed ultrafine particles using the high-frequency plasma process; spraying of hydroxyapatite using high-frequency plasma; evaluation of thick-film bulk made from ultrafine particles; and trial fabrication of an ultrafine particle injection device. The aim was to evaluate quantitatively crystalline defects and composition distributions in the functionally gradient materials, for which continuous composition control is important. For this purpose, a Rutherford backscattering device was newly introduced and installed at the Material Engineering Technology Research Institute, which has improved the evaluation and experiment capability at the institute. (NEDO)

  13. Large-Scale Sequencing: The Future of Genomic Sciences Colloquium

    Energy Technology Data Exchange (ETDEWEB)

    Margaret Riley; Merry Buckley

    2009-01-01

    Genetic sequencing and the various molecular techniques it has enabled have revolutionized the field of microbiology. Examining and comparing the genetic sequences borne by microbes - including bacteria, archaea, viruses, and microbial eukaryotes - provides researchers with insights into the processes microbes carry out, their pathogenic traits, and new ways to use microorganisms in medicine and manufacturing. Until recently, sequencing entire microbial genomes was laborious and expensive, and the decision to sequence the genome of an organism was made on a case-by-case basis by individual researchers and funding agencies. Now, thanks to new technologies, the cost and effort of sequencing are within reach for even the smallest facilities, and sequencing the genomes of a significant fraction of microbial life may be possible. The availability of numerous microbial genomes will enable unprecedented insights into microbial evolution, function, and physiology. However, the current ad hoc approach to gathering sequence data has resulted in an unbalanced and highly biased sampling of microbial diversity. A well-coordinated, large-scale effort to target the breadth and depth of microbial diversity would have the greatest impact. The American Academy of Microbiology convened a colloquium to discuss the scientific benefits of engaging in a large-scale, taxonomically-based sequencing project. A group of individuals with expertise in microbiology, genomics, informatics, ecology, and evolution deliberated on the issues inherent in such an effort and generated a set of specific recommendations for how best to proceed. The vast majority of microbes are presently uncultured and, thus, pose significant challenges to such a taxonomically-based approach to sampling genome diversity. However, we have yet to even scratch the surface of the genomic diversity among cultured microbes. A coordinated sequencing effort of cultured organisms is an appropriate place to begin

  14. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
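
    For reference, a minimal sketch of Broyden's ("good") secant update, one of the models compared in the report, is given below on a toy two-equation system; the initial guess and test problem are illustrative.

```python
# A minimal sketch of Broyden's method for a nonlinear system; the test
# problem and starting point are illustrative assumptions.
import numpy as np

def broyden(F, x0, tol=1e-10, max_iter=100):
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                    # initial Jacobian approximation
    f = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(f) < tol:
            break
        dx = np.linalg.solve(B, -f)       # Newton-like step, approx. Jacobian
        x_new = x + dx
        f_new = F(x_new)
        df = f_new - f
        # rank-1 secant update: make B dx match the observed change df
        B += np.outer(df - B @ dx, dx) / (dx @ dx)
        x, f = x_new, f_new
    return x

# toy system: x^2 + y^2 = 4, x*y = 1 (no analytic Jacobian supplied)
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0]*v[1] - 1.0])
root = broyden(F, [2.0, 0.5])
print("root:", root, "residual:", F(root))
```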

  15. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical
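
    One widely used statistical technique in this probabilistic spirit is Granger causality; the sketch below, with an illustrative AR order and toy signals, asks whether the past of one series improves prediction of another.

```python
# A minimal Granger-causality sketch in the probabilistic-causality spirit
# of the review; the AR order and simulated signals are illustrative.
import numpy as np

def granger_strength(x, y, p=2):
    """Log ratio of residual variances of AR models for x, without and with
    the past of y. Positive values suggest y Granger-causes x."""
    X_own, X_full, target = [], [], []
    for t in range(p, len(x)):
        X_own.append(x[t - p:t])
        X_full.append(np.concatenate([x[t - p:t], y[t - p:t]]))
        target.append(x[t])
    X_own, X_full, target = map(np.asarray, (X_own, X_full, target))
    res_own = target - X_own @ np.linalg.lstsq(X_own, target, rcond=None)[0]
    res_full = target - X_full @ np.linalg.lstsq(X_full, target, rcond=None)[0]
    return np.log(res_own.var() / res_full.var())

rng = np.random.default_rng(5)
y = rng.standard_normal(2000)
x = 0.8 * np.roll(y, 1) + 0.2 * rng.standard_normal(2000)  # y drives x, lag 1

print("y -> x:", granger_strength(x, y))   # clearly positive
print("x -> y:", granger_strength(y, x))   # near zero
```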

  16. Large-scale ground motion simulation using GPGPU

    Science.gov (United States)

    Aoi, S.; Maeda, T.; Nishizawa, N.; Aoki, T.

    2012-12-01

    Huge computational resources are required to perform large-scale ground motion simulations with the 3-D finite difference method (FDM) for realistic and complex models with high accuracy. Furthermore, thousands of simulations are necessary to evaluate the variability of the assessment caused by uncertainty in the assumed source models of future earthquakes. To overcome the problem of restricted computational resources, we introduced GPGPU (general-purpose computing on graphics processing units), the technique of using a GPU as an accelerator for computation traditionally carried out on the CPU. We employed the CPU version of GMS (Ground motion Simulator; Aoi et al., 2004) as the original code and implemented the GPU calculation using CUDA (Compute Unified Device Architecture). GMS is a total system for seismic wave propagation simulation based on a 3-D FDM scheme using discontinuous grids (Aoi & Fujiwara, 1999), which includes the solver as well as preprocessor tools (parameter generation) and postprocessor tools (filtering, visualization, and so on). The computational model is decomposed in the two horizontal directions and each decomposed model is allocated to a different GPU. We evaluated the performance of our newly developed GPU version of GMS on TSUBAME 2.0, one of Japan's fastest supercomputers, operated by the Tokyo Institute of Technology. First we performed a strong-scaling test using a model with about 22 million grid points and achieved speed-ups of 3.2 and 7.3 times using 4 and 16 GPUs. Next, we examined a weak-scaling test where the model sizes (numbers of grid points) are increased in proportion to the degree of parallelism (number of GPUs). The result showed almost perfect linearity up to the simulation with 22 billion grid points using 1024 GPUs, where the calculation speed reached 79.7 TFlops, about 34 times faster than the CPU calculation using the same number

  17. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing

    International Nuclear Information System (INIS)

    1993-10-01

    This report provides the proceedings of a Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing that was held in Oak Ridge, Tennessee, on October 23-25, 1992. The meeting was jointly sponsored by the International Atomic Energy Agency (IAEA) and the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development. In particular, the International Working Group (IWG) on Life Management of Nuclear Power Plants (LMNPP) was the IAEA sponsor, and the Principal Working Group 3 (PWG-3) (Primary System Component Integrity) of the Committee for the Safety of Nuclear Installations (CSNI) was the NEA's sponsor. This meeting was preceded by two prior international activities that were designed to examine the state-of-the-art in fracture analysis capabilities and emphasized applications to the safety evaluation of nuclear power facilities. The first of those two activities was an IAEA Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing that was held at the Staatliche Materialprufungsanstalt (MPA) in Stuttgart, Germany, on May 25-27, 1988; the proceedings of that meeting were published in 1991. The second activity was the CSNI/PWG-3's Fracture Assessment Group's Project FALSIRE (Fracture Analyses of Large-Scale International Reference Experiments). The proceedings of the FALSIRE workshop that was held in Boston, Massachusetts, U.S.A., on May 8-10, 1990, were recently published by the Oak Ridge National Laboratory (ORNL). Those previous activities identified capabilities and shortcomings of various fracture analysis methods based on analyses of six available large-scale experiments. Different modes of fracture behavior, which ranged from brittle to ductile, were considered. In addition, geometry, size, constraint and multiaxial effects were considered. While generally good predictive capabilities were demonstrated for brittle fracture, issues were identified relative to predicting fracture behavior at higher

  18. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: Reducing tritium removal effectiveness; Energy recovery; Improving the tolerance of impurities; Use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  19. Accident of Large-scale Wind Turbines Disconnecting from Power Grid and Its Protection

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    There were many accidents in 2011 in which large-scale wind turbines disconnected from the power grid. Because single-phase-to-ground faults could not be correctly detected, they evolved into phase-to-phase faults. The phase-to-phase faults were isolated slowly, leading to low voltage, and wind turbines without sufficient low-voltage ride-through capability had to be disconnected from the grid. After some wind turbines were disconnected, overvoltage caused by the resulting reactive power surplus made more wind turbines disconnect. Based on the accident analysis, this paper presents solutions to the above problems, including travelling-wave-based single-phase-to-ground protection, adaptive low-voltage protection, integrated protection and control, and high-impedance fault detection. The solutions lay theoretical and technological foundations for preventing large-scale wind turbines from disconnecting from the operating power grid.

  20. Task-Management Method Using R-Tree Spatial Cloaking for Large-Scale Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Yan Li

    2017-12-01

    Full Text Available With the development of sensor technology and the popularization of the data-driven service paradigm, spatial crowdsourcing systems have become an important way of collecting map-based location data. However, large-scale task management and location privacy are important concerns for participants in spatial crowdsourcing. In this paper, we propose an R-tree spatial cloaking-based task-assignment method for large-scale spatial crowdsourcing. We use an estimated R-tree built from the requested crowdsourcing tasks to reduce the server-side insertion cost and enable scalability. By using Minimum Bounding Rectangle (MBR)-based spatially anonymous data without exact position data, the method preserves the location privacy of participants in a simple way. Our experiments show that the proposed method is faster than the existing method and remains efficient as the scale increases.
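
    A minimal sketch of the MBR-based cloaking idea is shown below: a worker reports only a bounding rectangle covering its true position plus nearby dummy positions, and the server matches tasks against rectangles; the plain-list lookup stands in for the paper's R-tree, and all coordinates are illustrative.

```python
# A minimal sketch of MBR-based spatial cloaking for task assignment;
# the R-tree bookkeeping of the paper is replaced by a plain list.
import numpy as np

rng = np.random.default_rng(6)

def cloak(true_pos, dummies):
    """MBR covering the worker plus some nearby (dummy) positions."""
    pts = np.vstack([true_pos, dummies])
    return pts.min(axis=0), pts.max(axis=0)      # (lower-left, upper-right)

def candidate_workers(task_pos, cloaks):
    """Server side: workers whose MBR contains the task location."""
    return [i for i, (lo, hi) in enumerate(cloaks)
            if np.all(lo <= task_pos) and np.all(task_pos <= hi)]

workers = rng.uniform(0, 10, size=(20, 2))
cloaks = [cloak(w, w + rng.uniform(-1, 1, size=(4, 2))) for w in workers]

task = np.array([5.0, 5.0])
print("candidate workers for task:", candidate_workers(task, cloaks))
```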

  1. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane-segmentation intelligent workshop is new, and there has been no research work in related fields at home or abroad. The mode of production is to be transformed from the existing Industry 2.0 (or partially Industry 3.0) pattern of "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation a great number of questions need to be settled on the management and technology sides, such as the evolution of the workshop structure, the development of intelligent equipment and changes in the business model. Along with them comes the reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane-segmentation intelligent workshop and analyses the workshop's working efficiency, which is significant for the next step of the transformation.

  2. A novel iron-lead redox flow battery for large-scale energy storage

    Science.gov (United States)

    Zeng, Y. K.; Zhao, T. S.; Zhou, X. L.; Wei, L.; Ren, Y. X.

    2017-04-01

    The redox flow battery (RFB) is one of the most promising large-scale energy storage technologies for the massive utilization of intermittent renewables, especially wind and solar energy. This work presents a novel redox flow battery that utilizes inexpensive and abundant Fe(II)/Fe(III) and Pb/Pb(II) redox couples as redox materials. Experimental results show that both the Fe(II)/Fe(III) and Pb/Pb(II) redox couples have fast electrochemical kinetics in methanesulfonic acid, and that the coulombic efficiency and energy efficiency of the battery are, respectively, as high as 96.2% and 86.2% at 40 mA cm⁻². Furthermore, the battery exhibits stable performance in terms of efficiencies and discharge capacities during the cycle test. The inexpensive redox materials, fast electrochemical kinetics and stable cycle performance make the present battery a promising candidate for large-scale energy storage applications.

  3. A Large Scale Problem Based Learning inter-European Student Satellite Construction Project

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Alminde, Lars; Bisgaard, Morten

    2006-01-01

    A Large Scale Problem Based Learning Inter-European Student Satellite Construction Project. This paper describes the pedagogical outcome of a large-scale PBL experiment. In January 2004 the ESA (European Space Agency) Education Office launched an ambitious project: let students from all over Europe build a satellite. The satellite was successfully launched on October 27th 2005 (http://www.express.space.aau.dk). The project was student driven, with student project responsibility, adding a lot of international experience and project management skills to the outcome of a more traditional one-semester, single-group project; one lesson was that electronic communication technology was vital within the project. Additionally, the SSETI EXPRESS project implied the following problems: it did not fit a standard semester (18 months for the satellite project compared to 5-6 months for a "normal" semester project), and there were difficulties in integrating the tasks...

  4. A European collaboration research programme to study and test large scale base isolated structures

    International Nuclear Information System (INIS)

    Renda, V.; Verzeletti, G.; Papa, L.

    1995-01-01

    The improvement of the technology of innovative anti-seismic mechanisms, such as those for base isolation and energy dissipation, requires testing capability for large-scale models of structures integrating these mechanisms. Such experimental tests are of primary importance for the validation of design rules and for establishing an advanced earthquake engineering practice for civil constructions of relevant interest. The Joint Research Centre of the European Commission offers the European Laboratory for Structural Assessment, located at Ispra, Italy, as a focal point for an international European collaborative research programme to test large-scale models of structures making use of innovative anti-seismic mechanisms. A collaboration contract, open to further contributions, has been signed with the Italian national working group on seismic isolation (Gruppo di Lavoro sull'Isolamento Sismico, GLIS), which includes the national research centre ENEA, the national electricity board ENEL, the industrial research centre ISMES and the isolator manufacturer ALGA. (author). 3 figs

  5. Large-scale compositional heterogeneity in the Earth's mantle

    Science.gov (United States)

    Ballmer, M.

    2017-12-01

    Seismic imaging of subducted Farallon and Tethys lithosphere in the lower mantle has been taken as evidence for whole-mantle convection, and efficient mantle mixing. However, cosmochemical constraints point to a lower-mantle composition that has a lower Mg/Si compared to upper-mantle pyrolite. Moreover, geochemical signatures of magmatic rocks indicate the long-term persistence of primordial reservoirs somewhere in the mantle. In this presentation, I establish geodynamic mechanisms for sustaining large-scale (primordial) heterogeneity in the Earth's mantle using numerical models. Mantle flow is controlled by rock density and viscosity. Variations in intrinsic rock density, such as due to heterogeneity in basalt or iron content, can induce layering or partial layering in the mantle. Layering can be sustained in the presence of persistent whole mantle convection due to active "unmixing" of heterogeneity in low-viscosity domains, e.g. in the transition zone or near the core-mantle boundary [1]. On the other hand, lateral variations in intrinsic rock viscosity, such as due to heterogeneity in Mg/Si, can strongly affect the mixing timescales of the mantle. In the extreme case, intrinsically strong rocks may remain unmixed through the age of the Earth, and persist as large-scale domains in the mid-mantle due to focusing of deformation along weak conveyor belts [2]. That large-scale lateral heterogeneity and/or layering can persist in the presence of whole-mantle convection can explain the stagnation of some slabs, as well as the deflection of some plumes, in the mid-mantle. These findings indeed motivate new seismic studies for rigorous testing of model predictions. [1] Ballmer, M. D., N. C. Schmerr, T. Nakagawa, and J. Ritsema (2015), Science Advances, doi:10.1126/sciadv.1500815. [2] Ballmer, M. D., C. Houser, J. W. Hernlund, R. Wentzcovitch, and K. Hirose (2017), Nature Geoscience, doi:10.1038/ngeo2898.

  6. Large-Scale Traveling Weather Systems in Mars’ Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-10-01

    Between late fall and early spring, Mars’ middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation as they serve as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively-lifted and radiatively-active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to the northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone in late winter and early spring presents in the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences amongst the simulations and these are presented.

  7. On the Phenomenology of an Accelerated Large-Scale Universe

    Directory of Open Access Journals (Sweden)

    Martiros Khurshudyan

    2016-10-01

    Full Text Available In this review paper, several new results towards the explanation of the accelerated expansion of the large-scale universe are discussed. Inflation, on the other hand, is the early-time accelerated era, so the universe is symmetric in the sense of accelerated expansion. The accelerated expansion of the universe is one of the long-standing problems in modern cosmology, and in physics in general. There are several well-defined approaches to solve this problem. One of them is the assumption that dark energy exists in the recent universe. It is believed that dark energy is responsible for antigravity, while dark matter has a gravitational nature and is responsible, in general, for structure formation. A different approach is an appropriate modification of general relativity, including, for instance, f(R) and f(T) theories of gravity. On the other hand, attempts to build theories of quantum gravity and assumptions about the existence of extra dimensions, possible variability of the gravitational constant and of the speed of light (among others) provide interesting modifications of general relativity applicable to problems of modern cosmology, too. In particular, two groups of cosmological models are discussed here. In the first group, the problem of the accelerated expansion of the large-scale universe is addressed with a new idea, named varying ghost dark energy. The second group contains cosmological models addressing the same problem with either new parameterizations of the equation-of-state parameter of dark energy (such as a varying polytropic gas) or nonlinear interactions between dark energy and dark matter. Moreover, for cosmological models involving varying ghost dark energy, massless particle creation in an appropriate radiation-dominated universe (when the background dynamics is due to general relativity) is demonstrated as well. Exploring the nature of the accelerated expansion of the large-scale universe involving generalized

  8. Solving large scale structure in ten easy steps with COLA

    Energy Technology Data Exchange (ETDEWEB)

    Tassev, Svetlin [Department of Astrophysical Sciences, Princeton University, 4 Ivy Lane, Princeton, NJ 08544 (United States); Zaldarriaga, Matias [School of Natural Sciences, Institute for Advanced Study, Olden Lane, Princeton, NJ 08540 (United States); Eisenstein, Daniel J., E-mail: stassev@cfa.harvard.edu, E-mail: matiasz@ias.edu, E-mail: deisenstein@cfa.harvard.edu [Center for Astrophysics, Harvard University, 60 Garden Street, Cambridge, MA 02138 (United States)

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
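
    The core idea, writing each trajectory as an analytic LPT part plus a residual that the N-body code integrates, can be illustrated in one dimension with the Zel'dovich (1LPT) part alone, as in the sketch below; the single-mode displacement field and the Einstein-de Sitter growth factor D(a) = a are illustrative assumptions.

```python
# A 1-D sketch of the trajectory split at the heart of COLA: x = x_LPT + dx.
# Only the analytic Zel'dovich part is shown; the residual integration that
# the N-body code would perform is omitted. All numbers are illustrative.
import numpy as np

N, L = 512, 100.0                            # particles, box size [Mpc/h]
q = np.linspace(0.0, L, N, endpoint=False)   # Lagrangian coordinates

A = 10.0                                     # displacement amplitude (toy)
psi = A * np.sin(2.0 * np.pi * q / L)        # single-mode displacement field

def x_lpt(a):
    """Zel'dovich (1LPT) trajectory x = q + D(a)*psi(q); D(a)=a in EdS."""
    return (q + a * psi) % L

for a in (0.02, 0.3, 1.0):
    rho, _ = np.histogram(x_lpt(a), bins=16, range=(0.0, L))
    print(f"a={a:4.2f}  max overdensity {rho.max() / (N / 16):.2f}x mean")
```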

  9. Large-Scale Traveling Weather Systems in Mars Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-01-01

    Between late fall and early spring, Mars' middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation as they serve as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively-lifted and radiatively-active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to the northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone in late winter and early spring presents in the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences amongst the simulations and these are presented.

  10. Nonlinear evolution of large-scale structure in the universe

    International Nuclear Information System (INIS)

    Frenk, C.S.; White, S.D.M.; Davis, M.

    1983-01-01

    Using N-body simulations we study the nonlinear development of primordial density perturbations in an Einstein-de Sitter universe. We compare the evolution of an initial distribution without small-scale density fluctuations to evolution from a random Poisson distribution. These initial conditions mimic the assumptions of the adiabatic and isothermal theories of galaxy formation. The large-scale structures which form in the two cases are markedly dissimilar. In particular, the correlation function ξ(r) and the visual appearance of our adiabatic (or "pancake") models match better the observed distribution of galaxies. This distribution is characterized by large-scale filamentary structure. Because the pancake models do not evolve in a self-similar fashion, the slope of ξ(r) steepens with time; as a result there is a unique epoch at which these models fit the galaxy observations. We find the ratio of cutoff length to correlation length at this time to be λ_min/r_0 = 5.1; its expected value in a neutrino-dominated universe is 4(Ωh)^-1 (H_0 = 100h km s^-1 Mpc^-1). At early epochs these models predict a negligible amplitude for ξ(r) and could explain the lack of measurable clustering in the Lyα absorption lines of high-redshift quasars. However, large-scale structure in our models collapses after z = 2. If this collapse precedes galaxy formation as in the usual pancake theory, galaxies formed uncomfortably recently. The extent of this problem may depend on the cosmological model used; the present series of experiments should be extended in the future to include models with Ω < 1

  11. Study of multi-functional precision optical measuring system for large scale equipment

    Science.gov (United States)

    Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi

    2017-10-01

    The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. The measurement of geometric parameters such as size, attitude and position therefore requires a measurement system with high precision, multiple functions, portability and other characteristics. However, existing measuring instruments, such as the laser tracker, total station and photogrammetry system, mostly have a single function and require station moving, among other shortcomings. The laser tracker needs to work with a cooperative target and can hardly meet the requirements of measurement in extreme environments. The total station is mainly used for outdoor surveying and mapping and hardly achieves the accuracy demanded in industrial measurement. The photogrammetry system can achieve wide-range multi-point measurement, but the measuring range is limited and the station must be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can both scan along a measurement path and track a cooperative target. The system is based on several key technologies, such as absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of a complex mechanical system and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures high measurement accuracy and the two-dimensional angle measuring module provides precise angle measurement. The system is suitable for the non-contact measurement of large-scale equipment; it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing capability of large-scale and high-end equipment.

  12. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgqvist, Emma M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gagne, Douglas A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hillesheim, Michael B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Walker, H. A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Jeff [Colorado School of Mines, Golden, CO (United States); Boak, Jeremy [Colorado School of Mines, Golden, CO (United States); Washington, Jeremy [Colorado School of Mines, Golden, CO (United States); Sharp, Cory [Colorado School of Mines, Golden, CO (United States)

    2017-11-03

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  13. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    In foundries, violent explosions occur occasionally when molten metal comes into contact with water. If similar explosions can occur with other materials, hazardous situations may arise for example in LNG marine transportation accidents, or in liquid cooled reactor incidents when molten UO2 contacts water or sodium coolant. Over the last 10 years a large body of experimental data has been obtained on the behaviour of small quantities of hot material in contact with a vaporisable coolant. Such experiments generally give low energy yields, despite producing fine fragmentation of the molten material. These events have been interpreted in terms of a wide range of phenomena such as violent boiling, liquid entrainment, bubble collapse, superheat, surface cracking and many others. Many of these studies have been aimed at understanding the small scale behaviour of the particular materials of interest. However, understanding the nature of the energetic events which were the original cause for concern may also be necessary to give confidence that violent events cannot occur for these materials in large scale situations. More recently, there has been a trend towards larger experiments and some of these have produced explosions of moderately high efficiency. Although occurrence of such large scale explosions can depend rather critically on initial conditions in a way which is not fully understood, there are signs that the interpretation of these events may be more straightforward than that of the single drop experiments. In the last two years several theoretical models for large scale explosions have appeared which attempt a self contained explanation of at least some stages of such high yield events: these have as their common feature a description of how a propagating breakdown of an initially quasi-stable distribution of materials is induced by the pressure and flow field caused by the energy release in adjacent regions. These models have led to the idea that for a full

  14. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2013-01-01

    The integration of large-scale wind generation has necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation is proposed in this paper. The first phase is the investment decision, while the second phase is the production optimization decision. A multi-objective PSO (MOPSO) algorithm is introduced to solve this optimization problem, which can accelerate the convergence and guarantee the diversity of the Pareto-optimal front set as well. The feasibility and effectiveness of the proposed bi-level planning approach and the MOPSO algorithm are demonstrated.
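
    A minimal sketch of a multi-objective PSO with a Pareto archive, the general type of algorithm referred to here, is given below on a classical two-objective toy problem; the velocity coefficients, archive policy and objectives are illustrative, not the paper's formulation.

```python
# A minimal MOPSO sketch: a swarm minimizes two competing objectives and
# keeps an archive of non-dominated solutions from which guides are drawn.
import numpy as np

rng = np.random.default_rng(7)

def objectives(x):
    """Toy bi-objective trade-off (Schaffer): f1 = x^2, f2 = (x-2)^2."""
    return np.array([x[0] ** 2, (x[0] - 2.0) ** 2])

def dominates(fa, fb):
    return np.all(fa <= fb) and np.any(fa < fb)

def update_archive(archive, x, f):
    """Keep only mutually non-dominated (position, objectives) pairs."""
    if any(dominates(af, f) for _, af in archive):
        return archive
    archive = [(ax, af) for ax, af in archive if not dominates(f, af)]
    archive.append((x.copy(), f))
    return archive

dim, n_particles, iters, lo, hi = 1, 30, 100, -4.0, 4.0
pos = rng.uniform(lo, hi, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([objectives(p) for p in pos])

archive = []
for x, f in zip(pos, pbest_f):
    archive = update_archive(archive, x, f)

for _ in range(iters):
    for i in range(n_particles):
        leader = archive[rng.integers(len(archive))][0]  # random Pareto guide
        r1, r2 = rng.random(dim), rng.random(dim)
        vel[i] = (0.5 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i])
                  + 1.5 * r2 * (leader - pos[i]))
        pos[i] = np.clip(pos[i] + vel[i], lo, hi)
        f = objectives(pos[i])
        if dominates(f, pbest_f[i]):
            pbest[i], pbest_f[i] = pos[i].copy(), f
        archive = update_archive(archive, pos[i], f)

print(len(archive), "non-dominated solutions approximating the Pareto front")
```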

  15. Safeguarding aspects of large-scale commercial reprocessing plants

    International Nuclear Information System (INIS)

    1979-03-01

    The paper points out that several solutions to the problems of safeguarding large-scale plants have been put forward: (1) Increased measurement accuracy. This does not remove the problem of timely detection. (2) Continuous in-process measurement. As yet unproven and likely to be costly. (3) More extensive use of containment and surveillance. The latter appears to be feasible but requires the incorporation of safeguards into plant design and sufficient redundancy to protect the operator's interests. The advantages of shifting the emphasis of safeguards philosophy from quantitative goals to the analysis of diversion strategies should also be considered

  16. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  17. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost...... and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...
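
    A minimal sketch of the basic grouping idea, hashing requests by origin and destination cells and splitting groups at vehicle capacity, is given below; the cell size, capacity and toy requests are assumptions, and the paper's scalable algorithms refine this considerably.

```python
# A minimal grid-based trip grouping sketch; cell size and capacity are
# illustrative, and this is only the naive baseline the paper generalizes.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(8)
CELL, CAPACITY = 1.0, 4                 # spatial resolution [km], seats

# each request: origin (x, y) and destination (x, y)
requests = rng.uniform(0, 10, size=(200, 4))

groups = defaultdict(list)
for rid, (ox, oy, dx, dy) in enumerate(requests):
    # requests sharing origin and destination cells are candidates to share
    key = (int(ox // CELL), int(oy // CELL), int(dx // CELL), int(dy // CELL))
    groups[key].append(rid)

trips = []
for rids in groups.values():
    for i in range(0, len(rids), CAPACITY):   # split groups at capacity
        trips.append(rids[i:i + CAPACITY])

shared = sum(1 for t in trips if len(t) > 1)
print(f"{len(requests)} requests -> {len(trips)} trips ({shared} shared)")
```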

  18. Large-Scale Analysis of Network Bistability for Human Cancers

    Science.gov (United States)

    Shiraishi, Tetsuya; Matsuyama, Shinako; Kitano, Hiroaki

    2010-01-01

    Protein–protein interaction and gene regulatory networks are likely to be locked in a state corresponding to a disease by the behavior of one or more bistable circuits exhibiting switch-like behavior. Sets of genes could be over-expressed or repressed when anomalies due to disease appear, and the circuits responsible for this over- or under-expression might persist for as long as the disease state continues. This paper shows how a large-scale analysis of network bistability for various human cancers can identify genes that can potentially serve as drug targets or diagnosis biomarkers. PMID:20628618
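
    A minimal sketch of the kind of bistable circuit such an analysis looks for, the classic two-gene mutual-repression toggle switch, is given below; the parameters and the Euler integrator are illustrative.

```python
# A minimal toggle-switch sketch: two mutually repressing genes settle into
# one of two stable expression states depending on the initial condition.
import numpy as np

def simulate(u0, v0, alpha=4.0, n=2.0, dt=0.01, steps=5000):
    u, v = u0, v0
    for _ in range(steps):                # forward Euler integration
        du = alpha / (1.0 + v ** n) - u   # gene u repressed by v
        dv = alpha / (1.0 + u ** n) - v   # gene v repressed by u
        u, v = u + dt * du, v + dt * dv
    return u, v

print("start (2, 0) ->", simulate(2.0, 0.0))   # settles with u high, v low
print("start (0, 2) ->", simulate(0.0, 2.0))   # settles with v high, u low
```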

  19. Development of Large-Scale Spacecraft Fire Safety Experiments

    DEFF Research Database (Denmark)

    Ruff, Gary A.; Urban, David L.; Fernandez-Pello, A. Carlos

    2013-01-01

    Exploration missions outside of low-Earth orbit will be longer than current missions and, accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Given our fundamental uncertainty about the behavior of fires in low gravity, large-scale testing is needed for the assessment of the spacecraft fire safety risk. The activity of this project is supported by an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The large-scale space flight experiment will be conducted in an Orbital Sciences...

  20. Large scale obscuration and related climate effects open literature bibliography

    International Nuclear Information System (INIS)

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

    Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest

  1. Large Scale Composite Manufacturing for Heavy Lift Launch Vehicles

    Science.gov (United States)

    Stavana, Jacob; Cohen, Leslie J.; Houseal, Keth; Pelham, Larry; Lort, Richard; Zimmerman, Thomas; Sutter, James; Western, Mike; Harper, Robert; Stuart, Michael

    2012-01-01

    Risk reduction for large-scale composite manufacturing is an important goal in producing lightweight components for heavy lift launch vehicles. NASA and an industry team successfully employed a building block approach using low-cost Automated Tape Layup (ATL) of autoclave and Out-of-Autoclave (OoA) prepregs. Several large, curved sandwich panels were fabricated at HITCO Carbon Composites. The aluminum honeycomb core sandwich panels are segments of a 1/16th arc from a 10 meter cylindrical barrel. Lessons learned highlight the manufacturing challenges of producing lightweight composite structures such as fairings for heavy lift launch vehicles.

  2. Large Scale Simulations of the Euler Equations on GPU Clusters

    KAUST Repository

    Liebmann, Manfred

    2010-08-01

    The paper investigates the scalability of a parallel Euler solver, using the Vijayasundaram method, on a GPU cluster with 32 Nvidia Geforce GTX 295 boards. The aim of this research is to enable large scale fluid dynamics simulations with up to one billion elements. We investigate communication protocols for the GPU cluster to compensate for the slow Gigabit Ethernet network between the GPU compute nodes and to maintain overall efficiency. A diesel engine intake port and a nozzle, meshed at different resolutions, give good real-world examples for the scalability tests on the GPU cluster. © 2010 IEEE.

  3. Large-scale structure in the universe: Theory vs observations

    International Nuclear Information System (INIS)

    Kashlinsky, A.; Jones, B.J.T.

    1990-01-01

    A variety of observations constrain models of the origin of large scale cosmic structures. We review here the elements of current theories and comment in detail on which of the current observational data provide the principal constraints. We point out that enough observational data have accumulated to constrain (and perhaps determine) the power spectrum of primordial density fluctuations over a very large range of scales. We discuss the theories in the light of observational data and focus on the potential of future observations in providing even (and ever) tighter constraints. (orig.)

  4. Current status of large-scale cryogenic gravitational wave telescope

    International Nuclear Information System (INIS)

    Kuroda, K; Ohashi, M; Miyoki, S; Uchiyama, T; Ishitsuka, H; Yamamoto, K; Kasahara, K; Fujimoto, M-K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Nagano, S; Tsunesada, Y; Zhu, Zong-Hong; Shintomi, T; Yamamoto, A; Suzuki, T; Saito, Y; Haruyama, T; Sato, N; Higashi, Y; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Aso, Y; Ueda, K-I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Tagoshi, H; Nakamura, T; Sasaki, M; Tanaka, T; Oohara, K; Takahashi, H; Miyakawa, O; Tobar, M E

    2003-01-01

    The large-scale cryogenic gravitational wave telescope (LCGT) project is the proposed advancement of TAMA, which will be able to detect the coalescences of binary neutron stars occurring in our galaxy. LCGT intends to detect coalescence events within about 240 Mpc, at a rate expected to range from 0.1 to several events per year. LCGT has Fabry-Perot cavities of 3 km baseline, and the mirrors are cooled down to a cryogenic temperature of 20 K. It is planned to be built underground in the Kamioka mine. This paper overviews the revision of the design and the current status of the R and D.

  5. Large scale obscuration and related climate effects open literature bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

    Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest.

  6. Properties of large-scale methane/hydrogen jet fires

    Energy Technology Data Exchange (ETDEWEB)

    Studer, E. [CEA Saclay, DEN, LTMF Heat Transfer and Fluid Mech Lab, 91 - Gif-sur-Yvette (France); Jamois, D.; Leroy, G.; Hebrard, J. [INERIS, F-60150 Verneuil En Halatte (France); Jallais, S. [Air Liquide, F-78350 Jouy En Josas (France); Blanchetiere, V. [GDF SUEZ, 93 - La Plaine St Denis (France)

    2009-12-15

    A future economy based on reduction of carbon-based fuels for power generation and transportation may consider hydrogen as a possible energy carrier. Extensive and widespread use of hydrogen might require a pipeline network. The alternatives might be the use of the existing natural gas network or the design of a dedicated network. Whatever the solution, mixing hydrogen with natural gas will substantially modify the consequences of accidents. The French National Research Agency (ANR) funded project HYDROMEL focuses on these critical questions. Within this project, large-scale jet fires have been studied experimentally and numerically. The main characteristics of these flames, including visible length, radiation fluxes and blowout, have been assessed. (authors)

  7. Less is more: regularization perspectives on large scale machine learning

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Deep learning based techniques provide a possible solution at the expense of theoretical guidance and, especially, of computational requirements. It is then a key challenge for large scale machine learning to devise approaches guaranteed to be accurate and yet computationally efficient. In this talk, we will consider a regularization perspective on machine learning, appealing to classical ideas in linear algebra and inverse problems to dramatically scale up nonparametric methods such as kernel methods, often dismissed because of prohibitive costs. Our analysis derives optimal theoretical guarantees while providing experimental results on par with or outperforming state of the art approaches.
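
    One concrete instance of this program is subsampled (Nystroem) kernel ridge regression, which trades exact kernel computations for a landmark-based approximation. The sketch below is our illustration of that standard technique under hypothetical parameter names; it is not the speaker's exact algorithm.

        # Nystroem kernel ridge regression: fit on m << n landmark points,
        # reducing cost from O(n^3) to roughly O(n m^2). Illustrative sketch.
        import numpy as np

        def gaussian_kernel(A, B, sigma=1.0):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def nystroem_krr_fit(X, y, m=100, lam=1e-3, sigma=1.0, seed=0):
            rng = np.random.default_rng(seed)
            idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
            landmarks = X[idx]
            Knm = gaussian_kernel(X, landmarks, sigma)          # (n, m)
            Kmm = gaussian_kernel(landmarks, landmarks, sigma)  # (m, m)
            # Regularized normal equations of the subsampled problem:
            A = Knm.T @ Knm + len(X) * lam * Kmm
            alpha = np.linalg.solve(A, Knm.T @ y)
            return landmarks, alpha

        def nystroem_krr_predict(Xnew, landmarks, alpha, sigma=1.0):
            return gaussian_kernel(Xnew, landmarks, sigma) @ alpha

    The regularization viewpoint enters twice: through the explicit penalty lam and through the number of landmarks m, which acts as a computational budget and a statistical regularizer at the same time.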

  8. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
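
    Giraph itself is programmed in Java against a vertex-centric API; the toy Python below only mimics that "think like a vertex" bulk-synchronous model on a single machine, using single-source shortest paths -- the canonical example in the Pregel/Giraph literature -- to show how supersteps, messages, and vertex values interact.

        # Vertex-centric BSP sketch of single-source shortest paths.
        # graph: {vertex: [(neighbor, edge_weight), ...]}
        import math

        def sssp(graph, source):
            value = {v: math.inf for v in graph}     # per-vertex state
            messages = {v: [] for v in graph}
            messages[source].append(0.0)
            while any(messages.values()):            # one pass = one superstep
                inbox, messages = messages, {v: [] for v in graph}
                for v in graph:                      # each vertex "computes"
                    if not inbox[v]:
                        continue                     # inactive this superstep
                    best = min(inbox[v])
                    if best < value[v]:
                        value[v] = best
                        for nbr, w in graph[v]:      # message the neighbors
                            messages[nbr].append(best + w)
            return value

    For example, sssp({'a': [('b', 1.0), ('c', 4.0)], 'b': [('c', 1.0)], 'c': []}, 'a') returns {'a': 0.0, 'b': 1.0, 'c': 2.0} after three supersteps.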

  9. Cosmological parameters from large scale structure - geometric versus shape information

    CERN Document Server

    Hamann, Jan; Lesgourgues, Julien; Rampf, Cornelius; Wong, Yvonne Y Y

    2010-01-01

    The matter power spectrum as derived from large scale structure (LSS) surveys contains two important and distinct pieces of information: an overall smooth shape and the imprint of baryon acoustic oscillations (BAO). We investigate the separate impact of these two types of information on cosmological parameter estimation, and show that for the simplest cosmological models, the broad-band shape information currently contained in the SDSS DR7 halo power spectrum (HPS) is by far superseded by geometric information derived from the baryonic features. An immediate corollary is that, contrary to popular belief, the upper limit on the neutrino mass m_ν ...

  10. Test on large-scale seismic isolation elements, 2

    International Nuclear Information System (INIS)

    Mazda, T.; Moteki, M.; Ishida, K.; Shiojiri, H.; Fujita, T.

    1991-01-01

    The seismic isolation test program of the Central Research Inst. of Electric Power Industry (CRIEPI), to apply seismic isolation to Fast Breeder Reactor (FBR) plants, was started in 1987. In this test program, demonstration testing of seismic isolation elements was considered one of the most important research items. Facilities for testing seismic isolation elements were built in the Abiko Research Laboratory of CRIEPI. Various tests of large-scale seismic isolation elements have been conducted to date. Many important test data for developing design technical guidelines were obtained. (author)

  11. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    International Nuclear Information System (INIS)

    Alvarez, Marcello; Baldauf, T.; Bond, J. Richard; Dalal, N.; Putter, R. D.; Dore, O.; Green, Daniel; Hirata, Chris; Huang, Zhiqi; Huterer, Dragan; Jeong, Donghui; Johnson, Matthew C.; Krause, Elisabeth; Loverde, Marilena; Meyers, Joel; Meeburg, Daniel; Senatore, Leonardo; Shandera, Sarah; Silverstein, Eva; Slosar, Anze; Smith, Kendrick; Zaldarriaga, Matias; Assassi, Valentin; Braden, Jonathan; Hajian, Amir; Kobayashi, Takeshi; Stein, George; Engelen, Alexander van

    2014-01-01

    The statistics of primordial curvature fluctuations are our window into the period of inflation, where these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Drastic improvements, however, should come from large-scale structure. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude f_NL^loc ...

  12. Design techniques for large scale linear measurement systems

    International Nuclear Information System (INIS)

    Candy, J.V.

    1979-03-01

    Techniques to design measurement schemes for systems modeled by large scale linear time invariant systems, i.e., physical systems modeled by a large number (> 5) of ordinary differential equations, are described. The techniques are based on transforming the physical system model to a coordinate system facilitating the design and then transforming back to the original coordinates. An example of a three-stage, four-species, extraction column used in the reprocessing of spent nuclear fuel elements is presented. The basic ideas are briefly discussed in the case of noisy measurements. An example using a plutonium nitrate storage vessel (reprocessing) with measurement uncertainty is also presented
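
    The transform-design-transform idea can be stated compactly. In the schematic below (our notation, not the report's), the model is first diagonalized so that measurement placement can be judged mode by mode:

        % Large linear model and its modal (decoupled) coordinates:
        \dot{x} = A x, \qquad y = C x, \qquad z = T^{-1} x
        \;\Longrightarrow\;
        \dot{z} = \Lambda z, \qquad y = (C T)\, z, \qquad \Lambda = T^{-1} A T \ \text{(diagonal)}.

    The measurement matrix C is designed in the decoupled z-coordinates, for instance to ensure observability of selected modes, and the resulting scheme is then mapped back to the original physical coordinates.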

  13. How Large-Scale Research Facilities Connect to Global Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn

    2013-01-01

    Policies for large-scale research facilities (LSRFs) often highlight their spillovers to industrial innovation and their contribution to the external connectivity of the regional innovation system hosting them. Arguably, the particular institutional features of LSRFs are conducive for collaborative...... research. However, based on data on publications produced in 2006–2009 at the Neutron Science Directorate of Oak Ridge National Laboratory in Tennessee (United States), we find that internationalization of its collaborative research is restrained by coordination costs similar to those characterizing other...

  14. Novel algorithm of large-scale simultaneous linear equations

    International Nuclear Information System (INIS)

    Fujiwara, T; Hoshi, T; Yamamoto, S; Sogabe, T; Zhang, S-L

    2010-01-01

    We review our recently developed methods of solving large-scale simultaneous linear equations and applications to electronic structure calculations both in one-electron theory and many-electron theory. This is the shifted COCG (conjugate orthogonal conjugate gradient) method based on the Krylov subspace, and the most important issue for applications is the shift equation and the seed switching method, which greatly reduce the computational cost. The applications to nano-scale Si crystals and the double orbital extended Hubbard model are presented.
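
    The cost saving rests on a basic identity: Krylov subspaces are invariant under scalar shifts of the matrix, so one set of basis vectors serves a whole family of shifted systems. Schematically (our notation):

        % Shift invariance of the Krylov subspace:
        \mathcal{K}_m(A, b) = \operatorname{span}\{ b, A b, \dots, A^{m-1} b \}
                            = \mathcal{K}_m(A + \sigma I, \, b),
        % hence the shifted family  (A + \sigma_k I) x_k = b  is solved with
        % matrix-vector products generated once for a single seed equation,
        % each additional shift costing only scalar recurrences.

    Seed switching addresses the practical wrinkle that the seed system may converge before the others: a not-yet-converged shift is then promoted to become the new seed so the shared recurrence can continue.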

  15. Optimization of large scale food production using Lean Manufacturing principles

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Breum, Gitte

    2009-01-01

    This paper discusses how the production principles of Lean Manufacturing (Lean) can be applied in a large-scale meal production. Lean principles are briefly presented, followed by a field study of how a kitchen at a Danish hospital has implemented Lean in the daily production. In the kitchen...... not be negatively affected by the rationalisation of production procedures. The field study shows that Lean principles can be applied in meal production and can result in increased production efficiency and systematic improvement of product quality without negative effects on the working environment. The results...... show that Lean can be applied and used to manage the production of meals in the kitchen....

  16. Committees and sponsors

    Science.gov (United States)

    2011-10-01

    International Advisory Committee: Richard F Casten (Yale, USA); Luiz Carlos Chamon (São Paulo, Brazil); Osvaldo Civitarese (La Plata, Argentina); Jozsef Cseh (ATOMKI, Hungary); Jerry P Draayer (LSU, USA); Alfredo Galindo-Uribarri (ORNL & UT, USA); James J Kolata (Notre Dame, USA); Jorge López (UTEP, USA); Joseph B Natowitz (Texas A & M, USA); Ma Esther Ortiz (IF-UNAM); Stuart Pittel (Delaware, USA); Andrés Sandoval (IF-UNAM); Adam Szczepaniak (Indiana, USA); Piet Van Isacker (GANIL, France); Michael Wiescher (Notre Dame, USA). Organizing Committee: Libertad Barrón-Palos, Chair (IF-UNAM); Roelof Bijker (ICN-UNAM); Ruben Fossion (ICN-UNAM); David Lizcano (ININ). Sponsors: Instituto de Ciencias Nucleares, UNAM; Instituto de Física, UNAM; Instituto Nacional de Investigaciones Nucleares; División de Física Nuclear de la SMF; Centro Latinoamericano de Física.

  17. Topographically Engineered Large Scale Nanostructures for Plasmonic Biosensing

    Science.gov (United States)

    Xiao, Bo; Pradhan, Sangram K.; Santiago, Kevin C.; Rutherford, Gugu N.; Pradhan, Aswini K.

    2016-04-01

    We demonstrate that a nanostructured metal thin film can achieve enhanced transmission efficiency and sharp resonances and use a large-scale and high-throughput nanofabrication technique for the plasmonic structures. The fabrication technique combines the features of nanoimprint and soft lithography to topographically construct metal thin films with nanoscale patterns. Metal nanogratings developed using this method show significantly enhanced optical transmission (up to a one-order-of-magnitude enhancement) and sharp resonances with full width at half maximum (FWHM) of ~15 nm in the zero-order transmission using an incoherent white light source. These nanostructures are sensitive to the surrounding environment, and the resonance can shift as the refractive index changes. We derive an analytical method using a spatial Fourier transformation to understand the enhancement phenomenon and the sensing mechanism. The use of real-time monitoring of protein-protein interactions in microfluidic cells integrated with these nanostructures is demonstrated to be effective for biosensing. The perpendicular transmission configuration and large-scale structures provide a feasible platform without sophisticated optical instrumentation to realize label-free surface plasmon resonance (SPR) sensing.

  18. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850 °C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100 °C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)
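
    For a sense of scale, a back-of-envelope bound is useful (our arithmetic, not figures from the paper): splitting water costs at least the enthalpy of formation of liquid water per mole of hydrogen produced.

        # Thermodynamic minimum for water splitting (illustrative arithmetic).
        DH_WATER_KJ_PER_MOL = 285.8   # enthalpy of formation, liquid water (HHV basis)
        M_H2_G_PER_MOL = 2.016        # molar mass of H2

        kj_per_kg = DH_WATER_KJ_PER_MOL / M_H2_G_PER_MOL * 1000.0
        kwh_per_kg = kj_per_kg / 3600.0
        print(f"minimum energy: {kwh_per_kg:.1f} kWh per kg of H2")  # ~39.4

    High-temperature routes are attractive precisely because, as temperature rises, a growing share of this requirement can be supplied as heat from the reactor rather than as electricity.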

  19. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s² ≈ 10⁻⁶ c² and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)⁴. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc⁻¹.
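
    The parameters mentioned above enter through an effective stress in the fluid equations. Schematically, at lowest order in the counterterms (sign and coefficient conventions differ between papers; this is our paraphrase, not the authors' exact expressions):

        % Continuity and Euler equations of the IR effective fluid:
        \partial_\tau \delta + \partial_i\!\left[(1+\delta)\, v^i\right] = 0,
        \qquad
        \partial_\tau v^i + \mathcal{H}\, v^i + v^j \partial_j v^i + \partial^i \phi
          = - c_s^2\, \partial^i \delta
            + \frac{c_{\mathrm{vis}}^2}{\mathcal{H}}\, \partial^2 v^i + \cdots

    The coefficients c_s² and c_vis² are not predicted within the effective theory; they absorb the UV sensitivity and are fixed by matching to N-body simulations, exactly as the abstract describes.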

  20. Systematic renormalization of the effective theory of Large Scale Structure

    International Nuclear Information System (INIS)

    Abolhasani, Ali Akbar; Mirbabayi, Mehrdad; Pajer, Enrico

    2016-01-01

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contribution of short distance perturbations to the large scale density contrast δ and momentum density π(k) scales as k² and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.