WorldWideScience

Sample records for involving large-scale integrated

  1. Feasibility of an energy conversion system in Canada involving large-scale integrated hydrogen production using solid fuels

    International Nuclear Information System (INIS)

    Gnanapragasam, Nirmal V.; Reddy, Bale V.; Rosen, Marc A.

    2010-01-01

    A large-scale hydrogen production system is proposed using solid fuels and designed to increase the sustainability of alternative energy forms in Canada, and the technical and economic aspects of the system within the Canadian energy market are examined. The work investigates the feasibility and constraints in implementing such a system within the energy infrastructure of Canada. The proposed multi-conversion and single-function system produces hydrogen in large quantities using energy from solid fuels such as coal, tar sands, biomass, municipal solid waste (MSW) and agricultural/forest/industrial residue. The proposed system involves significant technology integration, with various energy conversion processes (such as gasification, chemical looping combustion, anaerobic digestion, combustion power cycles-electrolysis and solar-thermal converters) interconnected to increase the utilization of solid fuels as much as feasible within cost, environmental and other constraints. The analysis involves quantitative and qualitative assessments based on (i) energy resources availability and demand for hydrogen, (ii) commercial viability of primary energy conversion technologies, (iii) academia, industry and government participation, (iv) sustainability and (v) economics. An illustrative example provides an initial road map for implementing such a system. (author)

  2. Large Scale System Safety Integration for Human Rated Space Vehicles

    Science.gov (United States)

    Massie, Michael J.

    2005-12-01

    Since the 1960s man has searched for ways to establish a human presence in space. Unfortunately, the development and operation of human spaceflight vehicles carry significant safety risks that are not always well understood. As a result, the countries with human space programs have felt the pain of loss of lives in the attempt to develop human space travel systems. Integrated System Safety is a process developed through years of experience (since before Apollo and Soyuz) as a way to assess the risks involved in space travel and prevent such losses. The intent of Integrated System Safety is to look at an entire program and put together all the pieces in such a way that the risks can be identified, understood and dispositioned by program management. This process has many inherent challenges, and they need to be explored, understood and addressed. In order to prepare a truly integrated analysis, safety professionals must gain a level of technical understanding of all of the project's pieces and how they interact. Next, they must find a way to present the analysis so the customer can understand the risks and make decisions about managing them. However, every organization in a large-scale project can have different ideas about what is or is not a hazard, what is or is not an appropriate hazard control, and what is or is not adequate hazard control verification. NASA provides some direction on these topics, but interpretations of those instructions can vary widely. Even more challenging is the fact that every individual and organization involved in a project has a different level of risk tolerance. When the discrete hazard controls of the contracts and agreements cannot be met, additional risk must be accepted. However, once one has left the arena of compliance with the known rules, there can no longer be specific ground rules on which to base a decision as to what is acceptable and what is not. The integrator must find common ground between all parties to achieve

  3. Electric vehicles and large-scale integration of wind power

    DEFF Research Database (Denmark)

    Liu, Wen; Hu, Weihao; Lund, Henrik

    2013-01-01

    ... with this imbalance and to reduce its high dependence on oil production. For this reason, it is interesting to analyse the extent to which transport electrification can further the renewable energy integration. This paper quantifies this issue in Inner Mongolia, where the share of wind power in the electricity supply was 6.5% in 2009 and which has the plan to develop large-scale wind power. The results show that electric vehicles (EVs) have the ability to balance the electricity demand and supply and to further the wind power integration. In the best case, the energy system with EV can increase wind power integration by 8%. The application of EVs benefits from saving both energy system cost and fuel cost. However, the negative consequences of decreasing energy system efficiency and increasing the CO2 emission should be noted when applying the hydrogen fuel cell vehicle (HFCV). The results also indicate...

  4. Challenges and options for large scale integration of wind power

    International Nuclear Information System (INIS)

    Tande, John Olav Giaever

    2006-01-01

    Challenges and options for large scale integration of wind power are examined. Immediate challenges are related to weak grids. Assessment of system stability requires numerical simulation; models are being developed, and validation is essential. Coordination of wind and hydro generation is key to allowing more wind power capacity in areas with limited transmission corridors. For the case-study grid, the allowable wind farm size increases from 50 to 200 MW, depending on technology and control. The real-life example from 8 January 2005 demonstrates that existing market-based mechanisms can handle large amounts of wind power. In wind integration studies it is essential to take account of the controllability of modern wind farms, the flexibility of the power system and the smoothing effect of geographically dispersed wind farms. Modern wind farms contribute to system adequacy; combining wind and hydro constitutes a win-win system. (ml)

  5. Multidimensional quantum entanglement with large-scale integrated optics.

    Science.gov (United States)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G

    2018-04-20

    The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
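
    As a point of reference, the generation stage of such a device targets what is commonly written as the d-dimensional maximally entangled bipartite state (here with d = 15); this is the standard textbook form, not a formula quoted from the paper:

      \[
      |\psi\rangle = \frac{1}{\sqrt{d}} \sum_{k=0}^{d-1} |k\rangle_A \, |k\rangle_B , \qquad d = 15 .
      \]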

  6. Multidimensional quantum entanglement with large-scale integrated optics

    DEFF Research Database (Denmark)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong

    2018-01-01

    The ability to control multidimensional quantum systems is key for the investigation of fundamental science and for the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimension up to 15 × 15 on a large-scale silicon-photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality and controllability of our multidimensional technology, and further exploit these abilities to demonstrate key quantum applications experimentally unexplored before, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies.

  7. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

    The major concern in electric grid operation is operating in the most economical and reliable fashion, to ensure affordability and continuity of electricity supply. This dissertation investigates four challenges that affect the reliability and economics of electric grid operation: 1. congestion of transmission lines; 2. transmission line expansion; 3. large-scale wind energy integration; and 4. optimal placement of Phasor Measurement Units (PMUs) for highest electric grid observability. Performing congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, it is necessary to evaluate the expansion of transmission line capacity against methods that ensure optimal electric grid operation: the expansion must enable grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject, and next-generation electric grids require novel methodologies for studying and managing it. We suggest a novel method of long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission line systems and the competitive nature of current grid operation, it is important for electric grid operators to determine how much transmission line capacity to add. The traditional questions requiring answers are where to add capacity, how much to add, and at which voltage level. Because of electric grid deregulation, transmission line expansion is more complicated, as it is now open to investors, whose main interest in building new transmission lines is to generate revenue. Adding new transmission capacity will help the system to relieve transmission congestion, create
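
    The PMU placement problem in item 4 is, at its core, a covering problem: a PMU at a bus observes that bus and its neighbours. Below is a minimal greedy sketch of this idea; the 6-bus network and the greedy heuristic are illustrative assumptions, not the dissertation's own formulation (which would typically be an integer program):

      # Greedy PMU placement sketch: choose buses so that every bus is observed
      # by at least one PMU (a PMU observes its own bus and all its neighbours).
      def greedy_pmu_placement(adjacency):
          unobserved = set(adjacency)      # buses not yet covered
          placed = []
          while unobserved:
              # pick the bus whose PMU would newly observe the most buses
              best = max(adjacency, key=lambda b: len(({b} | adjacency[b]) & unobserved))
              placed.append(best)
              unobserved -= {best} | adjacency[best]
          return placed

      # Illustrative 6-bus network (bus -> set of neighbouring buses).
      network = {1: {2, 3}, 2: {1, 4}, 3: {1, 4, 5}, 4: {2, 3, 6}, 5: {3}, 6: {4}}
      print(greedy_pmu_placement(network))   # -> [3, 4]: two PMUs observe all six buses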

  8. Properties Important To Mixing For WTP Large Scale Integrated Testing

    International Nuclear Information System (INIS)

    Koopman, D.; Martino, C.; Poirier, M.

    2012-01-01

    Large Scale Integrated Testing (LSIT) is being planned by Bechtel National, Inc. to address uncertainties in the full scale mixing performance of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. External review boards have raised questions regarding the overall representativeness of simulants used in previous mixing tests. Accordingly, WTP requested the Savannah River National Laboratory (SRNL) to assist with development of simulants for use in LSIT. Among the first tasks assigned to SRNL was to develop a list of waste properties that matter to pulse-jet mixer (PJM) mixing of WTP tanks. This report satisfies Commitment 5.2.3.1 of the Department of Energy Implementation Plan for Defense Nuclear Facilities Safety Board Recommendation 2010-2: physical properties important to mixing and scaling. In support of waste simulant development, the following two objectives are the focus of this report: (1) Assess physical and chemical properties important to the testing and development of mixing scaling relationships; (2) Identify the governing properties and associated ranges for LSIT to achieve the Newtonian and non-Newtonian test objectives. This includes the properties to support testing of sampling and heel management systems. The test objectives for LSIT relate to transfer and pump out of solid particles, prototypic integrated operations, sparger operation, PJM controllability, vessel level/density measurement accuracy, sampling, heel management, PJM restart, design and safety margin, Computational Fluid Dynamics (CFD) Verification and Validation (V and V) and comparison, performance testing and scaling, and high temperature operation. The slurry properties that are most important to Performance Testing and Scaling depend on the test objective and rheological classification of the slurry (i

  9. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2013-01-01

    ... necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation was proposed in this paper. The first phase is the investment decision, while the second phase is the production optimization decision. A multi-objective PSO (MOPSO) algorithm was introduced to solve this optimization problem, which can accelerate the convergence and guarantee the diversity of the Pareto-optimal front set as well. The feasibility and effectiveness of the proposed bi-level planning approach and the MOPSO...

  10. Solutions of large-scale electromagnetics problems involving dielectric objects with the parallel multilevel fast multipole algorithm.

    Science.gov (United States)

    Ergül, Özgür

    2011-11-01

    Fast and accurate solutions of large-scale electromagnetics problems involving homogeneous dielectric objects are considered. Problems are formulated with the electric and magnetic current combined-field integral equation and discretized with the Rao-Wilton-Glisson functions. Solutions are performed iteratively by using the multilevel fast multipole algorithm (MLFMA). For the solution of large-scale problems discretized with millions of unknowns, MLFMA is parallelized on distributed-memory architectures using a rigorous technique, namely, the hierarchical partitioning strategy. Efficiency and accuracy of the developed implementation are demonstrated on very large problems involving as many as 100 million unknowns.
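
    The practical point of MLFMA is the complexity reduction of the matrix-vector products inside the iterative solver; a rough order-of-magnitude comparison for the N ≈ 10⁸ unknowns quoted above (counts are indicative only, constants omitted):

      \[
      \text{direct product: } N^{2} \approx 10^{16} \text{ operations}, \qquad
      \text{MLFMA: } N \log_{2} N \approx 10^{8} \times 27 \approx 2.7 \times 10^{9} \text{ operations.}
      \]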

  11. An integrated system for large scale scanning of nuclear emulsions

    Energy Technology Data Exchange (ETDEWEB)

    Bozza, Cristiano, E-mail: kryss@sa.infn.it [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); D'Ambrosio, Nicola [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); De Lellis, Giovanni [University of Napoli and INFN, Complesso Universitario di Monte Sant'Angelo, via Cintia Ed. G, Napoli 80126 (Italy); De Serio, Marilisa [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Di Capua, Francesco [INFN Napoli, Complesso Universitario di Monte Sant'Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Crescenzo, Antonia [University of Napoli and INFN, Complesso Universitario di Monte Sant'Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Ferdinando, Donato [INFN Bologna, viale B. Pichat 6/2, Bologna 40127 (Italy); Di Marco, Natalia [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); Esposito, Luigi Salvatore [Laboratori Nazionali del Gran Sasso, now at CERN, Geneva (Switzerland); Fini, Rosa Anna [INFN Bari, via E. Orabona 4, Bari 70125 (Italy); Giacomelli, Giorgio [University of Bologna and INFN, viale B. Pichat 6/2, Bologna 40127 (Italy); Grella, Giuseppe [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); Ieva, Michela [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Kose, Umut [INFN Padova, via Marzolo 8, Padova (PD) 35131 (Italy); Longhin, Andrea; Mauri, Nicoletta [INFN Laboratori Nazionali di Frascati, via E. Fermi 40, Frascati (RM) 00044 (Italy); Medinaceli, Eduardo [University of Padova and INFN, via Marzolo 8, Padova (PD) 35131 (Italy); Monacelli, Piero [University of L'Aquila and INFN, via Vetoio Loc. Coppito, L'Aquila (AQ) 67100 (Italy); Muciaccia, Maria Teresa; Pastore, Alessandra [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); and others

    2013-03-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high-level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m² to tens of m², acquiring many billions of tracks, corresponding to several TB of data. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system has proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational database system is the backbone of the whole infrastructure, storing the data themselves and not only catalogues of data files, as is common practice; this makes it a unique case among high-energy physics DAQ systems. The logical organisation of the system is described, and a summary is given of the physics measurements that are readily available by automated processing.

  12. Large Scale Integration of Carbon Nanotubes in Microsystems

    DEFF Research Database (Denmark)

    Gjerde, Kjetil

    2007-01-01

    Carbon nanotubes have many properties that could be exploited in combination with traditional microsystems, most notably their superior mechanical and electrical properties. In this work, methods for large-scale integration of carbon nanotubes in microsystems are investigated, with a view to their application as mechan...

  13. Large scale grid integration of renewable energy sources

    CERN Document Server

    Moreno-Munoz, Antonio

    2017-01-01

    This book presents comprehensive coverage of the means to integrate renewable power, namely wind and solar power. It looks at new approaches to meet the challenges, such as increasing interconnection capacity among geographical areas, hybridisation of different distributed energy resources and building up demand response capabilities.

  14. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    Science.gov (United States)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of the country's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent reduction in emission intensity (with respect to 2005 levels), along with large-scale renewable energy targets (100 GW solar, 60 GW wind and 10 GW biomass energy by 2022), in the INDCs submitted under the Paris Agreement. But large-scale integration of renewable energy is a complex process that faces a number of problems, such as capital intensiveness, matching intermittent generation to load with limited storage capacity, and reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and to analyze its implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal- and gas-fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp-rate and minimum-generation constraints. The study analyzes India's electricity sector transition for the year 2022 under three scenarios: a base case (no renewable additions), an INDC scenario (100 GW solar, 60 GW wind, 10 GW biomass) and a low-RE scenario (50 GW solar, 30 GW wind), created to analyze the implications of large-scale integration of variable renewable energy. The results provide insights on the trade-offs involved in achieving mitigation targets and on the investment decisions involved. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.
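
    To illustrate the kind of dispatch constraint the study describes, the sketch below fills the residual load (load minus renewables) from thermal units in merit order under ramp-rate and minimum-generation limits. All unit data and hourly profiles are invented, and this is far simpler than the actual TIMES formulation:

      # Merit-order dispatch sketch with ramp-rate and minimum-generation limits.
      units = [            # (name, p_min MW, p_max MW, ramp MW/h); illustrative data
          ("coal", 200.0, 600.0, 120.0),
          ("gas",   50.0, 300.0, 200.0),
      ]
      load      = [700.0, 650.0, 620.0, 680.0]   # MW per hour
      renewable = [100.0, 180.0, 250.0, 150.0]   # MW wind + solar per hour

      prev = {name: p_min for name, p_min, _, _ in units}
      for hour, (d, re) in enumerate(zip(load, renewable)):
          residual = d - re                       # load left for thermal units
          schedule = {}
          for name, p_min, p_max, ramp in units:
              lo = max(p_min, prev[name] - ramp)  # cannot ramp down faster than this
              hi = min(p_max, prev[name] + ramp)  # cannot ramp up faster than this
              p = min(max(residual, lo), hi)      # clip the request to the feasible band
              schedule[name] = p
              residual -= p
          prev = schedule
          # residual > 0 means unserved load; residual < 0 means must-run surplus
          print(f"hour {hour}: {schedule}, mismatch {residual:+.0f} MW")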

  15. Large Scale Wind and Solar Integration in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Ernst, Bernhard; Schreirer, Uwe; Berster, Frank; Pease, John; Scholz, Cristian; Erbring, Hans-Peter; Schlunke, Stephan; Makarov, Yuri V.

    2010-02-28

    This report provides key information concerning the German experience with integrating 25 gigawatts of wind and 7 gigawatts of solar power capacity and mitigating their impacts on the electric power system. The report has been prepared based on information provided by Amprion GmbH and 50Hertz Transmission GmbH managers and engineers to Bonneville Power Administration (BPA) and Pacific Northwest National Laboratory representatives during their visit to Germany in October 2009. The trip and this report were sponsored by the BPA Technology Innovation office. Learning from the German experience could help Bonneville Power Administration engineers compare and evaluate potential new solutions for managing higher penetrations of wind energy resources in their control area. A broader dissemination of this experience will benefit wind and solar resource integration efforts in the United States.

  16. Electricity Prices, Large-Scale Renewable Integration, and Policy Implications

    OpenAIRE

    Kyritsis, Evangelos; Andersson, Jonas; Serletis, Apostolos

    2016-01-01

    This paper investigates the effects of intermittent solar and wind power generation on electricity price formation in Germany. We use daily data from 2010 to 2015, a period with profound modifications in the German electricity market, the most notable being the rapid integration of photovoltaic and wind power sources, as well as the phasing out of nuclear energy. In the context of a GARCH-in-Mean model, we show that both solar and wind power Granger cause electricity prices, that solar power generation reduces the volatility of electricity prices by scaling down the use of peak-load power plants, and that wind power generation increases the volatility of electricity prices by challenging electricity market flexibility.

  17. Vision for single flux quantum very large scale integrated technology

    Science.gov (United States)

    Silver, Arnold; Bunyk, Paul; Kleinsasser, Alan; Spargo, John

    2006-05-01

    Single flux quantum (SFQ) electronics is extremely fast and has very low on-chip power dissipation. SFQ VLSI is an excellent candidate for high-performance computing and other applications requiring extremely high-speed signal processing. Despite this, SFQ technology has generally not been accepted for system implementation. We argue that this is due, at least in part, to the use of outdated tools to produce SFQ circuits and chips. Assuming the use of tools equivalent to those employed in the semiconductor industry, we estimate the density of Josephson junctions, circuit speed, and power dissipation that could be achieved with SFQ technology. Today, CMOS lithography is at 90-65 nm with about 20 layers. Assuming equivalent technology, aggressively increasing the current density above 100 kA cm-2 to achieve junction speeds approximately 1000 GHz, and reducing device footprints by converting device profiles from planar to vertical, one could expect to integrate about 250 M Josephson junctions cm-2 into SFQ digital circuits. This should enable circuit operation with clock frequencies above 200 GHz and place approximately 20 K gates within a radius of one clock period. As a result, complete microprocessors, including integrated memory registers, could be fabricated on a single chip.

  18. Vision for single flux quantum very large scale integrated technology

    International Nuclear Information System (INIS)

    Silver, Arnold; Bunyk, Paul; Kleinsasser, Alan; Spargo, John

    2006-01-01

    Single flux quantum (SFQ) electronics is extremely fast and has very low on-chip power dissipation. SFQ VLSI is an excellent candidate for high-performance computing and other applications requiring extremely high-speed signal processing. Despite this, SFQ technology has generally not been accepted for system implementation. We argue that this is due, at least in part, to the use of outdated tools to produce SFQ circuits and chips. Assuming the use of tools equivalent to those employed in the semiconductor industry, we estimate the density of Josephson junctions, circuit speed, and power dissipation that could be achieved with SFQ technology. Today, CMOS lithography is at 90-65 nm with about 20 layers. Assuming equivalent technology, aggressively increasing the current density above 100 kA cm-2 to achieve junction speeds approximately 1000 GHz, and reducing device footprints by converting device profiles from planar to vertical, one could expect to integrate about 250 M Josephson junctions cm-2 into SFQ digital circuits. This should enable circuit operation with clock frequencies above 200 GHz and place approximately 20 K gates within a radius of one clock period. As a result, complete microprocessors, including integrated memory registers, could be fabricated on a single chip
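
    The "20 K gates within a radius of one clock period" figure can be checked with back-of-the-envelope numbers, assuming an on-chip signal speed of roughly one third the speed of light (the propagation speed is our assumption, not a value from the paper):

      \[
      r \approx v\,T = (10^{8}\,\mathrm{m/s})(5\,\mathrm{ps}) = 0.5\,\mathrm{mm}, \qquad
      \pi r^{2} \approx 7.9 \times 10^{-3}\,\mathrm{cm^{2}},
      \]
      \[
      7.9 \times 10^{-3}\,\mathrm{cm^{2}} \times 2.5 \times 10^{8}\,\mathrm{JJ\,cm^{-2}}
      \approx 2 \times 10^{6}\,\mathrm{JJ}
      \;\approx\; 2 \times 10^{4} \text{ gates at } {\sim}100\,\mathrm{JJ/gate}.
      \]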

  19. Power System Operation with Large Scale Wind Power Integration

    DEFF Research Database (Denmark)

    Suwannarat, A.; Bak-Jensen, B.; Chen, Z.

    2007-01-01

    The Danish power system is beginning to face the problems of integrating thousands of megawatts of wind power, whose production fluctuates stochastically with the natural wind. With wind power capacities increasing, the Danish Transmission System Operator (TSO) is faced with new challenges related to the uncertain nature of wind power. In this paper, models of the generation and control system are proposed and used to analyze the deviation of power exchange at the western Danish-German border, taking into account the fluctuating nature of wind power. The performance of the secondary control of the thermal power plants and of the spinning reserve control from the Combined Heat and Power (CHP) units in achieving active power balance under increased wind power penetration is presented.

  20. Integration of Large-scale Consumers in Smart Grid

    DEFF Research Database (Denmark)

    Rahnama, Samira

    The focus of this work is on industrial consumers. We propose a three-level hierarchical control framework, in which a so-called “Aggregator” is located between a number of flexible industrial demands and a grid operator. The aggregator is the heart of this setup, with the task of handling the energy/power services that can be derived from the demand that these consumers represent. The exact responsibility of the aggregator, however, can vary depending on several factors such as control strategies, demand types, provided services etc. This thesis addresses the aggregator design for a specific class of consumers. The work involves selecting... the industrial thermal loads. Our case studies are a supermarket refrigeration system and an HVAC chiller in conjunction with an ice storage, which are virtually connected to the aggregator. Practical results obtained from testing on real industrial consumers demonstrate the theoretical studies to a satisfactory...

  1. Electricity prices, large-scale renewable integration, and policy implications

    International Nuclear Information System (INIS)

    Kyritsis, Evangelos; Andersson, Jonas; Serletis, Apostolos

    2017-01-01

    This paper investigates the effects of intermittent solar and wind power generation on electricity price formation in Germany. We use daily data from 2010 to 2015, a period with profound modifications in the German electricity market, the most notable being the rapid integration of photovoltaic and wind power sources, as well as the phasing out of nuclear energy. In the context of a GARCH-in-Mean model, we show that both solar and wind power Granger cause electricity prices, that solar power generation reduces the volatility of electricity prices by scaling down the use of peak-load power plants, and that wind power generation increases the volatility of electricity prices by challenging electricity market flexibility. - Highlights: • We model the impact of solar and wind power generation on day-ahead electricity prices. • We discuss the different nature of renewables in relation to market design. • We explore the impact of renewables on the distributional properties of electricity prices. • Solar and wind reduce electricity prices but affect price volatility in the opposite way. • Solar decreases the probability of electricity price spikes, while wind increases it.
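
    The Granger-causality step of such a study can be reproduced with standard tools; the sketch below runs statsmodels' grangercausalitytests on a two-column frame of prices and renewable in-feed. The file name and column names are hypothetical, and the GARCH-in-Mean estimation itself would require a dedicated routine not shown here:

      import pandas as pd
      from statsmodels.tsa.stattools import grangercausalitytests

      # Hypothetical daily data set: day-ahead price plus wind and solar in-feed.
      df = pd.read_csv("german_market_2010_2015.csv", parse_dates=["date"])
      for driver in ["wind_infeed", "solar_infeed"]:
          # Null hypothesis: the driver's lags add nothing to a price model
          # that already contains the price's own lags.
          res = grangercausalitytests(df[["price", driver]].dropna(), maxlag=7)
          f_stat, p_value, _, _ = res[7][0]["ssr_ftest"]   # F-test at lag 7
          print(f"{driver} -> price: F = {f_stat:.2f}, p = {p_value:.4f}")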

  2. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field effect transistors have been emerging as ultrasensitive biosensors offering label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge due to time-consuming, complex and costly technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, constituted by randomly oriented silicon nanowires, are also studied. Compatible integration on the back-end of CMOS readout and promising electrical performance open new opportunities for sensing applications.

  3. Analysis for Large Scale Integration of Electric Vehicles into Power Grids

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Wang, Xiaoru

    2011-01-01

    Electric Vehicles (EVs) provide a significant opportunity for reducing the consumption of fossil energies and the emission of carbon dioxide. With more and more electric vehicles integrated in the power systems, it becomes important to study the effects of EV integration on the power systems, especially the low and middle voltage level networks. In the paper, the basic structure and characteristics of electric vehicles are introduced, and the possible impacts of large-scale integration of electric vehicles on the power systems, especially the advantages for the integration of renewable energies, are discussed. Finally, the research projects related to the large-scale integration of electric vehicles into the power systems are introduced, providing a reference for large-scale integration of electric vehicles into power grids.

  5. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    Science.gov (United States)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three-dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method reduces both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes benefiting from this approach through reduced development and design cycle time include: creation of analysis models for the aerodynamic discipline; vehicle-to-ground interface development; and documentation development for the vehicle assembly.

  6. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums, such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate and stabilize a large-scale system may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  7. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of VSC-based HVDC technology for future large-scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large-scale offshore wind power. This was done by bringing together the key industry stakeholders and competent research organisations in the project.

  8. Hypersingular integral equations, waveguiding effects in Cantorian Universe and genesis of large scale structures

    International Nuclear Information System (INIS)

    Iovane, G.; Giordano, P.

    2005-01-01

    In this work we introduce hypersingular integral equations and analyze a realistic model of gravitational waveguides on a Cantorian space-time. A waveguiding effect is considered with respect to the large-scale structure of the Universe, where structure formation appears as if it were a classically self-similar random process at all astrophysical scales. The result is that it seems we live in an El Naschie ε(∞) Cantorian space-time, where gravitational lensing and waveguiding effects can explain the appearing Universe. In particular, we consider filamentary and planar large-scale structures as possible refraction channels for electromagnetic radiation coming from cosmological structures. From this vision, supported by three numerical simulations, the Universe appears like a large set of self-similar adaptive mirrors. Consequently, an infinite Universe is just an optical illusion produced by mirroring effects connected with the large-scale structure of a finite, and not large, Universe.
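
    For context, a hypersingular integral is defined through the Hadamard finite part; the standard textbook definition for the 1/(t−x)² kernel (not a formula taken from the paper itself) reads:

      \[
      \mathrm{f.p.}\!\int_{a}^{b} \frac{\phi(t)}{(t-x)^{2}}\,\mathrm{d}t
      := \lim_{\varepsilon \to 0^{+}} \left[
      \int_{a}^{x-\varepsilon} \frac{\phi(t)}{(t-x)^{2}}\,\mathrm{d}t
      + \int_{x+\varepsilon}^{b} \frac{\phi(t)}{(t-x)^{2}}\,\mathrm{d}t
      - \frac{2\,\phi(x)}{\varepsilon} \right], \qquad a < x < b .
      \]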

  9. Large-scale offshore wind energy. Cost analysis and integration in the Dutch electricity market

    International Nuclear Information System (INIS)

    De Noord, M.

    1999-02-01

    The results of an analysis of the construction and integration costs of large-scale offshore wind energy (OWE) farms in 2010 are presented. The integration of these farms (1 and 5 GW) in the Dutch electricity distribution system is considered against the background of a liberalised electricity market, and a first step is taken towards determining the costs involved in solving integration problems. Three different types of foundations are examined: the mono-pile, the jacket and a new type of foundation, the concrete caisson pile, all single-turbine-single-support structures. For real offshore applications (>10 km offshore, at sea depths >20 m), the concrete caisson pile is regarded as the most suitable. The price/power ratios of wind turbines are analysed. It is assumed that in 2010 turbines in the power range of 3-5 MW are available; the main calculations have been conducted for a 3 MW turbine. The main choice in electrical infrastructure is between AC and DC. Calculations show that at distances of 30 km offshore and more, the use of HVDC will result in higher initial costs but lower operating costs. The share of operating and maintenance (O&M) costs in the kWh cost price is approximately 3.3%. To be able to compare the two farms, a base case is derived with a construction time of 10 years for both. The energy yield is calculated for an offshore wind regime with an annual mean wind speed of 9.0 m/s. Per 3 MW turbine this results in an annual energy production of approximately 12 GWh. The total farm efficiency amounts to 82%, resulting in a total farm capacity factor of 38%. With a required internal rate of return of 15%, the kWh cost price amounts to 0.24 DFl and 0.21 DFl for the 1 GW and 5 GW farms respectively in the base case. The required internal rate of return has a large effect on the kWh cost price, followed by the costs of subsystems. O&M costs have little effect on the cost price. Parameter studies show that a small cost reduction of 5% is possible when
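
    The quoted figures are mutually consistent, which can be verified directly: a 3 MW turbine producing 12 GWh per year runs at a turbine capacity factor of about 46%, and the 82% farm efficiency brings the farm capacity factor to the stated 38%:

      \[
      \mathrm{CF}_{\text{turbine}} = \frac{12\,\mathrm{GWh}}{3\,\mathrm{MW} \times 8760\,\mathrm{h}} \approx 0.457, \qquad
      \mathrm{CF}_{\text{farm}} \approx 0.457 \times 0.82 \approx 0.375 \approx 38\%.
      \]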

  10. Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker

    Science.gov (United States)

    Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong

    2017-10-01

    Large-scale components are widespread in the advanced manufacturing industry, and 3D profilometry plays a pivotal role in their quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and construction of the integrated system are introduced, and a mathematical model is established for the global data fusion. Subsequently, a flexible and robust method and mechanism are introduced for the establishment of the end coordinate system. Based on this method, a virtual robot noumenon is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is solved. A validation experiment was implemented to verify the proposed algorithms. First, the hand-eye transformation matrix was solved; then a car body rear was measured 16 times to verify the global data fusion algorithm, and its 3D shape was reconstructed successfully.
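
    The global data fusion reduces to chaining homogeneous transforms: a point in the scanner (camera) frame is mapped through the hand-eye transform and the tracker's pose measurement into the world frame. A minimal numpy sketch with placeholder poses (the identity rotations and translations below are invented, not calibration results):

      import numpy as np

      def transform(R, t):
          """Build a 4x4 homogeneous transform from rotation R and translation t."""
          T = np.eye(4)
          T[:3, :3], T[:3, 3] = R, t
          return T

      T_world_end = transform(np.eye(3), np.array([1.0, 0.2, 0.5]))  # tracker pose measurement
      T_end_cam   = transform(np.eye(3), np.array([0.0, 0.0, 0.1]))  # hand-eye calibration result
      T_world_cam = T_world_end @ T_end_cam                          # scanner pose in world frame

      p_cam = np.array([0.1, 0.0, 0.8, 1.0])   # homogeneous scan point, camera frame
      p_world = T_world_cam @ p_cam            # fused into the world frame
      print(p_world[:3])                       # -> [1.1 0.2 1.4]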

  11. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    Science.gov (United States)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission of wind power is gradually becoming the main means of improving wind power delivery and grid stability, but the integration of wind farms changes the sub-synchronous oscillation (SSO) damping characteristics of the synchronous generator system. Regarding the SSO problems caused by the integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI) and sub-synchronous resonance (SSR). SSO modelling and analysis methods are then categorized and compared by their applicable areas. Furthermore, the paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, the research prospects in this field are explored.

  12. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    The aim of this abstract is to give a short description of the essential ideas of the Danish national strategy for large-scale mapping of the groundwater resources. Emphasis is put on a description of the advantages obtained by combining the acquisition of spatially dense geophysical data covering large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples, to get an integrated and detailed description of the groundwater resources and their vulnerability. The development of more time-efficient and airborne geophysical data acquisition platforms (e.g. SkyTEM) has made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored...

  13. Dynamic model of frequency control in Danish power system with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2013-01-01

    This work evaluates the impact of large-scale integration of wind power in future power systems where 50% of the load demand can be met from wind power. The focus is on active power balance control, where the main source of power imbalance is an inaccurate wind speed forecast. In this study, a Danish power system model with a large share of wind power is developed, and a case study with an inaccurate wind power forecast is investigated. The goal of this work is to develop an adequate power system model that depicts the relevant dynamic features of the power plants and compensates for load-generation imbalances, caused by inaccurate wind speed forecasts, by an appropriate control of the active power production from power plants.
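
    A toy version of the active power balance problem can be written in a few lines: a wind forecast error acts as a power imbalance on a single-bus swing equation, and a PI secondary controller activates reserves to restore the frequency. All constants are illustrative assumptions, and the model in the study is far more detailed:

      import numpy as np

      H, f0, D = 5.0, 50.0, 1.0     # inertia constant (s), nominal frequency (Hz), load damping
      Kp, Ki = 2.0, 0.5             # assumed PI gains of the secondary control
      dt, T = 0.1, 60.0             # time step and horizon (s)

      imbalance = -0.05             # wind forecast error: 5% generation deficit (pu)
      f_dev = integral = reserve = 0.0
      for _ in np.arange(0.0, T, dt):
          p_err = imbalance + reserve - D * f_dev    # net power imbalance
          f_dev += dt * p_err * f0 / (2.0 * H)       # simplified swing equation
          integral += dt * f_dev
          reserve = -(Kp * f_dev + Ki * integral)    # reserves respond to frequency deviation
      print(f"frequency deviation after {T:.0f} s: {f_dev:.4f} Hz")  # ~0 once reserves cover the error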

  14. Large-scale building integrated photovoltaics field trial. First technical report - installation phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of the first eighteen months of the Large-Scale Building Integrated Photovoltaic Field Trial focussing on technical aspects. The project aims included increasing awareness and application of the technology, raising the UK capabilities in application of the technology, and assessing the potential for building integrated photovoltaics (BIPV). Details are given of technology choices; project organisation, cost, and status; and the evaluation criteria. Installations of BIPV described include University buildings, commercial centres, and a sports stadium, wildlife park, church hall, and district council building. Lessons learnt are discussed, and a further report covering monitoring aspects is planned.

  15. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  16. Integrated Technologies for Large-Scale Trapped-Ion Quantum Information Processing

    Science.gov (United States)

    Sorace-Agaskar, C.; Bramhavar, S.; Kharas, D.; Mehta, K. K.; Loh, W.; Panock, R.; Bruzewicz, C. D.; McConnell, R.; Ram, R. J.; Sage, J. M.; Chiaverini, J.

    2016-05-01

    Atomic ions trapped and controlled using electromagnetic fields hold great promise for practical quantum information processing due to their inherent coherence properties and controllability. However, to realize this promise, the ability to maintain and manipulate large-scale systems is required. We present progress toward the development of, and proof-of-principle demonstrations and characterization of, several technologies that can be integrated with ion-trap arrays on-chip to enable such scaling to practically useful sizes. Of particular use are integrated photonic elements for routing and focusing light throughout a chip without the need for free-space optics. The integration of CMOS electronics and photo-detectors for on-chip control and readout, and methods for monolithic fabrication and wafer-scale integration to incorporate these capabilities into tile-able 2D ion-trap array cells, are also explored.

  17. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    Science.gov (United States)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as means to diagnose the current status of student achievement in science and to compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously scored open-ended items are used pervasively in large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. The study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously scored open-ended items can be used to determine whether students hold normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of students' open-ended responses do open-ended items become a valid and reliable tool for assessing students' knowledge integration ability.
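
    For reference, the Rasch Partial Credit Model used in the analysis gives the probability that student n scores category k on item i in the following standard form (θ_n is the student ability, δ_ij the item step difficulties, M_i the maximum score of item i, and the j = 0 term of each sum is zero by convention):

      \[
      P(X_{ni} = k) =
      \frac{\exp \sum_{j=0}^{k} (\theta_n - \delta_{ij})}
           {\sum_{m=0}^{M_i} \exp \sum_{j=0}^{m} (\theta_n - \delta_{ij})},
      \qquad k = 0, 1, \ldots, M_i .
      \]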

  18. A Methodology for Integrated, Multiregional Life Cycle Assessment Scenarios under Large-Scale Technological Change.

    Science.gov (United States)

    Gibon, Thomas; Wood, Richard; Arvesen, Anders; Bergesen, Joseph D; Suh, Sangwon; Hertwich, Edgar G

    2015-09-15

    Climate change mitigation demands large-scale technological change on a global level and, if successfully implemented, will significantly affect how products and services are produced and consumed. In order to anticipate the life cycle environmental impacts of products under climate mitigation scenarios, we present the modeling framework of an integrated hybrid life cycle assessment model covering nine world regions. Life cycle assessment databases and multiregional input-output tables are adapted using forecasted changes in technology and resources up to 2050 under a 2 °C scenario. We call the result of this modeling "technology hybridized environmental-economic model with integrated scenarios" (THEMIS). As a case study, we apply THEMIS in an integrated environmental assessment of concentrating solar power. Life-cycle greenhouse gas emissions for this plant range from 33 to 95 g CO2 eq./kWh across different world regions in 2010, falling to 30-87 g CO2 eq./kWh in 2050. Using regional life cycle data yields insightful results. More generally, these results also highlight the need for systematic life cycle frameworks that capture the actual consequences and feedback effects of large-scale policies in the long term.
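
    The core calculation behind such an integrated hybrid model is the environmentally extended Leontief inverse: direct emission intensities applied to the total output needed to satisfy a final demand. A two-sector numpy sketch with invented coefficients (THEMIS itself is nine-region and far larger):

      import numpy as np

      A = np.array([[0.1, 0.2],     # inter-industry requirements per unit output
                    [0.3, 0.1]])
      C = np.array([0.5, 1.2])      # direct emissions per unit output (kg CO2-eq)
      y = np.array([1.0, 0.0])      # final demand: one unit from sector 1

      x = np.linalg.solve(np.eye(2) - A, y)   # total output x = (I - A)^-1 y
      print("life-cycle emissions:", C @ x)   # ~1.08 kg CO2-eq (direct + all upstream)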

  19. Large scale integration of intermittent renewable energy sources in the Greek power sector

    International Nuclear Information System (INIS)

    Voumvoulakis, Emmanouil; Asimakopoulou, Georgia; Danchev, Svetoslav; Maniatis, George; Tsakanikas, Aggelos

    2012-01-01

    As a member of the European Union, Greece has committed to achieve ambitious targets for the penetration of renewable energy sources (RES) in gross electricity consumption by 2020. Large scale integration of RES requires a suitable mixture of compatible generation units, in order to deal with the intermittency of wind velocity and solar irradiation. The scope of this paper is to examine the impact of large scale integration of intermittent energy sources, required to meet the 2020 RES target, on the generation expansion plan, the fuel mix and the spinning reserve requirements of the Greek electricity system. We perform hourly simulation of the intermittent RES generation to estimate residual load curves on a monthly basis, which are then inputted in a WASP-IV model of the Greek power system. We find that the decarbonisation effort, with the rapid entry of RES and the abolishment of the grandfathering of CO2 allowances, will radically transform the Greek electricity sector over the next 10 years, which has wide-reaching policy implications. - Highlights: ► Greece needs 8.8 to 9.3 GW additional RES installations by 2020. ► RES capacity credit varies between 12.2% and 15.3%, depending on interconnections. ► Without institutional changes, the reserve requirements will be more than double. ► New CCGT installed capacity will probably exceed the cost-efficient level. ► Competitive pressures should be introduced in segments other than day-ahead market.
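
    The hourly simulation step described above amounts to subtracting simulated wind and solar in-feed from load and sorting the result into a residual load duration curve, which a WASP-IV-type model then consumes. A sketch with synthetic profiles (all numbers invented):

      import numpy as np

      rng = np.random.default_rng(0)
      hours = 730                                        # roughly one month
      t = np.linspace(0.0, 2.0 * np.pi * 30, hours)      # ~30 daily cycles
      load  = 6000.0 + 1500.0 * np.sin(t)                # MW, diurnal demand swing
      wind  = 2500.0 * rng.beta(2, 5, hours)             # MW, skewed wind in-feed
      solar = np.maximum(0.0, 1300.0 * np.sin(t))        # MW, zero at night

      residual = load - wind - solar                     # what conventional units must serve
      ldc = np.sort(residual)[::-1]                      # residual load duration curve
      print(f"peak {ldc[0]:.0f} MW, minimum {ldc[-1]:.0f} MW")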

  20. Large-scale integration of wind power into different energy systems

    DEFF Research Database (Denmark)

    Lund, Henrik

    2005-01-01

    The paper presents the ability of different energy systems and regulation strategies to integrate wind power. The ability is expressed by the following three factors: the degree of electricity excess production caused by fluctuations in wind and Combined Heat and Power (CHP) heat demands, the ability to utilise wind power to reduce CO2 emission in the system, and the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed in the range of a wind power input from 0 to 100% of the electricity demand, based on the Danish energy system. The results of the analyses make it possible to compare short-term and long-term potentials of different strategies of large-scale integration of wind power.

  1. Deciphering the clinical effect of drugs through large-scale data integration

    DEFF Research Database (Denmark)

    Kjærulff, Sonny Kim

    This thesis presents the work carried out at the Center for Biological Sequence Analysis, Technical University of Denmark. The thesis includes four articles describing large-scale data integration and methods for the prediction of drug side-effects. Chapter 2 presents ChemProt, a novel disease chemical biology database. ChemProt integrates different chemical-protein annotation resources for disease-associated proteins and protein-protein interaction data. ChemProt is developed to assist in silico evaluation of environmental chemicals, natural products and approved drugs, as well as to aid... side-effect data have been implemented. Chapter 3 presents two articles that showcase the application of systems chemical biology approaches to understand and model drug side-effect data. The first approach applies machine learning methods to cluster side-effects, drugs, proteins and clinical outcomes in networks...

  2. Operation Modeling of Power Systems Integrated with Large-Scale New Energy Power Sources

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-10-01

    In most current methods of probabilistic power system production simulation, the output characteristics of new energy power generation (NEPG) have not been comprehensively considered. In this paper, the power output characteristics of wind power generation and photovoltaic power generation are first analyzed with statistical methods, using their historical operating data. Then the characteristic indexes and the filtering principle of the NEPG historical output scenarios are introduced, with a confidence level, and a calculation model for the credible capacity of NEPG is proposed. Based on this, taking the minimum production costs or the best energy-saving and emission-reduction effect as the optimization objective, a power system operation model with large-scale integration of NEPG is established, considering the power balance, the electricity balance and the peak balance. Besides, the constraints of the operating characteristics of different generation types, the maintenance schedule, the load reserve, the emergency reserve, the water abandonment and the transmission capacity between different areas are also considered. With the proposed model, operation simulations are carried out for the actual Northwest power grid of China, resolving the accommodation of new energy power under different system operating conditions. The simulation results verify the validity of the proposed operation model for accommodation analysis of a power system penetrated with large-scale NEPG.
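
    A minimal reading of the credible-capacity idea: the output level that historical renewable production exceeded with a chosen confidence level. The sketch below uses a synthetic output history and an assumed 95% confidence level; the paper's actual model filters output scenarios with characteristic indexes rather than taking a plain quantile:

      import numpy as np

      rng = np.random.default_rng(1)
      wind_output = 300.0 * rng.beta(2, 5, 8760)      # MW, synthetic hourly history
      confidence = 0.95                               # assumed confidence level
      credible = np.quantile(wind_output, 1.0 - confidence)
      print(f"output exceeded in {confidence:.0%} of hours: {credible:.1f} MW")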

  3. Energy System Analysis of Large-Scale Integration of Wind Power

    International Nuclear Information System (INIS)

    Lund, Henrik

    2003-11-01

    The paper presents the results of two research projects conducted by Aalborg University and financed by the Danish Energy Research Programme. Both projects include the development of models and system analysis with focus on large-scale integration of wind power into different energy systems. Market reactions and the ability to exploit exchange on the international market for electricity by locating exports in hours of high prices are included in the analyses. This paper focuses on results which are valid for energy systems in general. The paper presents the ability of different energy systems and regulation strategies to integrate wind power. The ability is expressed by three factors: one is the degree of electricity excess production caused by fluctuations in wind and CHP heat demands; the second is the ability to utilise wind power to reduce CO2 emissions in the system; and the third is the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed in the range of a wind power input from 0 to 100% of the electricity demand. Based on the Danish energy system, in which 50 per cent of the electricity demand is produced in CHP, a number of future energy systems with CO2 reduction potentials are analysed, i.e. systems with more CHP, systems using electricity for transportation (battery or hydrogen vehicles) and systems with fuel-cell technologies. For the present and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants into the regulation task of electricity balancing and grid stability, and investments in electric heating, heat pumps and heat storage capacity. Also the potential of energy management has been analysed. The results of the analyses make it possible to compare short-term and long-term potentials of different strategies of large-scale integration of wind power.

  4. Towards Large-Scale Fast Reprogrammable SOA-Based Photonic Integrated Switch Circuits

    Directory of Open Access Journals (Sweden)

    Ripalta Stabile

    2017-09-01

    Full Text Available Due to the exponentially increasing connectivity and bandwidth demand from the Internet, the most advanced examples of medium-scale fast reconfigurable photonic integrated switch circuits are offered by research carried out for data- and computer-communication applications, where network flexibility at a high speed and high connectivity are provided to suit network demand. Recently we have prototyped optical switching circuits using monolithic integration technology with up to several hundreds of integrated optical components per chip for high connectivity. In this paper, the current status of fast reconfigurable medium-scale indium phosphide (InP) integrated photonic switch matrices based on the use of semiconductor optical amplifier (SOA) gates is reviewed, focusing on broadband and cross-connecting monolithic implementations, granting a connectivity of up to sixteen input ports, sixteen output ports, and sixty-four channels, respectively. The opportunities for increasing connectivity, enabling nanosecond order reconfigurability, and introducing distributed optical power monitoring at the physical layer are highlighted. Complementary architectures based on resonant switching elements on the same material platform are also discussed for power-efficient switching. Performance projections related to the physical layer are presented and strategies for improvements are discussed in view of opening a route towards large-scale, power-efficient, fast reprogrammable photonic integrated switching circuits.

  5. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogenous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real-life example: the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource-constrained project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen...

  6. Impacts of large-scale offshore wind farm integration on power systems through VSC-HVDC

    DEFF Research Database (Denmark)

    Liu, Hongzhi; Chen, Zhe

    2013-01-01

    the impacts of integrating a large-scale offshore wind farm into the transmission system of a power grid through VSC-HVDC connection. The concerns are focused on steady-state voltage stability, dynamic voltage stability and transient angle stability. Simulation results based on an exemplary power system......, an offshore wind farm could have a capacity rating to hundreds of MWs or even GWs that is large enough to compete with conventional power plants. Thus the impacts of a large offshore wind farm on power system operation and security should be thoroughly studied and understood. This paper investigates......The potential of offshore wind energy has been commonly recognized and explored globally. Many countries have implemented and planned offshore wind farms to meet their increasing electricity demands and public environmental appeals, especially in Europe. With relatively less space limitation...

  7. Review of DC System Technologies for Large Scale Integration of Wind Energy Systems with Electricity Grids

    Directory of Open Access Journals (Sweden)

    Sheng Jie Shao

    2010-06-01

    Full Text Available The ever-increasing development and availability of power electronic systems is the underpinning technology that enables large-scale integration of wind generation plants with the electricity grid. As the size and power capacity of the wind turbine continue to increase, so does the need to place these significantly large structures at off-shore locations. DC grids and associated power transmission technologies provide opportunities for cost reduction and electricity grid impact minimization, as the bulk power is concentrated at a single point of entry. As a result, planning, optimization and impact can be studied and carefully controlled, minimizing the risk of the investment as well as power system stability issues. This paper discusses the key technologies associated with DC grids for offshore wind farm applications.

  8. Microfluidic very large-scale integration for biochips: Technology, testing and fault-tolerant design

    DEFF Research Database (Denmark)

    Araci, Ismail Emre; Pop, Paul; Chakrabarty, Krishnendu

    2015-01-01

    of this paper is on continuous-flow biochips, where the basic building block is a microvalve. By combining these microvalves, more complex units such as mixers, switches and multiplexers can be built, hence the name of the technology, “microfluidic Very Large-Scale Integration” (mVLSI). A roadblock......Microfluidic biochips are replacing conventional biochemical analyzers by integrating all the necessary functions for biochemical analysis using microfluidics. Biochips are used in many application areas, such as in vitro diagnostics, drug discovery, biotech and ecology. The focus...... presents the state-of-the-art in mVLSI platforms and emerging research challenges in the area of continuous-flow microfluidics, focusing on testing techniques and fault-tolerant design....

  9. Polymer optical circuits technology for large-scale integration of passive functions

    Science.gov (United States)

    Maalouf, Azar; Bosc, Dominique; Henrio, Frédéric; Haesaert, Séverine; Grosso, Philippe; Hardy, Isabelle; Gadonna, Michel

    2006-04-01

    Polymers are attractive for realizing integrated circuits, especially because they are very simple to process and are promising for low-cost devices. Moreover, besides the low-cost technology, the large accessible range of refractive indices could lead to large-scale integration, lowering fabrication costs. In some cases, it could be an alternative solution to semiconductor or inorganic dielectric technologies. Using standard UV photolithography, this work shows that it is possible to fabricate small guides in order to provide relatively high circuit densification. The refractive index contrast between optical core and cladding can be as high as 0.07, instead of 0.02 for the highest contrast in Ge-doped silica waveguides. Recently, this contrast has been increased to 0.11 at a wavelength of 1550 nm. These materials make possible the patterning of guides with radii of curvature smaller than 200 μm. Such curvatures open the way to functions based on microrings that potentially lead to compact wavelength multiplexers. With a view to controlling the fabrication of polymer waveguides, some features of the process are reported here. For example, shortcomings such as unsuitable worm-like film aspects are described and solutions are given, with requirements assigned to the raw materials. Mechanical and thermal properties of the polymers have to be adjusted to withstand integrated circuit processing. This paper also presents results concerning the realization of integrated passive microring resonators with this technology.

  10. The Cortlandt complex: evidence for large-scale liquid immiscibility involving granodiorite and diorite magmas

    Science.gov (United States)

    Bender, J. F.; Hanson, G. N.; Bence, A. E.

    1982-05-01

    Granodiorite and diorite plutons of the Rosetown complex, N.Y., which are associated with the nearby Cortlandt complex, have chemical and textural characteristics indicating that large-scale liquid immiscibility played a major role in their petrogenesis. Rare earth element, zirconium, niobium and phosphorus abundances are much greater in the diorite, precluding the possibility that the Rosetown diorite and granodiorite are related by fractional crystallization. The trace element data also eliminate the possibility that the granodiorite represents: (1) a partial melt of crustal rocks including basalt; (2) a granitic cumulate; or (3) a residue from an aqueous fluid derived either from a silicate melt or crustal rocks. Liquid immiscibility appears to be a viable model for the origin of the Rosetown granodiorite and iron-rich diorite. This model is supported by the following: (1) the major element compositions occur in a two-liquid field on a Greig diagram; (2) both bodies have similar Sr isotope compositions; (3) common phases in the two rock types have overlapping compositions; (4) the major and trace element data of the diorite and granodiorite are similar to the experimentally determined partition data of immiscible liquid pairs; and (5) possible ocelli of iron-rich diorite are found in the granodiorite.

  11. From Principles to Details: Integrated Framework for Architecture Modelling of Large Scale Software Systems

    Directory of Open Access Journals (Sweden)

    Andrzej Zalewski

    2013-06-01

    Full Text Available There exist numerous models of software architecture (box models, ADLs, UML, architectural decisions), architecture modelling frameworks (views), enterprise architecture frameworks and even standards recommending practice for architectural description. We show in this paper that there is still a gap between these rather abstract frameworks/standards and existing architecture models. Frameworks and standards define what should be modelled rather than which models should be used and how these models are related to each other. We intend to prove that a less abstract modelling framework is needed for the effective modelling of large-scale software-intensive systems. It should provide more precise guidance on the kinds of models to be employed and on how they should relate to each other. The paper defines principles that can serve as a basis for an integrated model. Finally, a structure for such a model is proposed. It comprises three layers: the upper one – architectural policy – reflects corporate policy and strategies in architectural terms; the middle one – system organisation pattern – represents the core structural concepts and their rationale at a given level of scope; the lower one contains detailed architecture models. Architectural decisions play an important role here: they model the core architectural concepts explaining detailed models, and they organise the entire integrated model and the relations between its submodels.

  12. Integrating large-scale data and RNA technology to protect crops from fungal pathogens

    Directory of Open Access Journals (Sweden)

    Ian Joseph Girard

    2016-05-01

    Full Text Available With a rapidly growing human population it is expected that plant science researchers and the agricultural community will need to increase food productivity using less arable land. This challenge is complicated by fungal pathogens and diseases, many of which can severely impact crop yield. Current measures to control fungal pathogens are either ineffective or have adverse effects on the agricultural enterprise. Thus, developing new strategies through research innovation to protect plants from pathogenic fungi is necessary to overcome these hurdles. RNA sequencing technologies are increasing our understanding of the underlying genes and gene regulatory networks mediating disease outcomes. The application of invigorating next generation sequencing strategies to study plant-pathogen interactions has and will provide unprecedented insight into the complex patterns of gene activity responsible for crop protection. However, questions remain about how biological processes in both the pathogen and the host are specified in space directly at the site of infection and over the infection period. The integration of cutting edge molecular and computational tools will provide plant scientists with the arsenal required to identify genes and molecules that play a role in plant protection. Large scale RNA sequence data can then be used to protect plants by targeting genes essential for pathogen viability in the production of stably transformed lines expressing RNA interference molecules, or through foliar applications of double stranded RNA.

  13. Integrating Large-Scale Data and RNA Technology to Protect Crops from Fungal Pathogens.

    Science.gov (United States)

    Girard, Ian J; Mcloughlin, Austein G; de Kievit, Teresa R; Fernando, Dilantha W G; Belmonte, Mark F

    2016-01-01

    With a rapidly growing human population it is expected that plant science researchers and the agricultural community will need to increase food productivity using less arable land. This challenge is complicated by fungal pathogens and diseases, many of which can severely impact crop yield. Current measures to control fungal pathogens are either ineffective or have adverse effects on the agricultural enterprise. Thus, developing new strategies through research innovation to protect plants from pathogenic fungi is necessary to overcome these hurdles. RNA sequencing technologies are increasing our understanding of the underlying genes and gene regulatory networks mediating disease outcomes. The application of invigorating next generation sequencing strategies to study plant-pathogen interactions has and will provide unprecedented insight into the complex patterns of gene activity responsible for crop protection. However, questions remain about how biological processes in both the pathogen and the host are specified in space directly at the site of infection and over the infection period. The integration of cutting edge molecular and computational tools will provide plant scientists with the arsenal required to identify genes and molecules that play a role in plant protection. Large scale RNA sequence data can then be used to protect plants by targeting genes essential for pathogen viability in the production of stably transformed lines expressing RNA interference molecules, or through foliar applications of double stranded RNA.

  14. A large-scale clinical validation of an integrated monitoring system in the emergency department.

    Science.gov (United States)

    Clifton, David A; Wong, David; Clifton, Lei; Wilson, Sarah; Way, Rob; Pullinger, Richard; Tarassenko, Lionel

    2013-07-01

    We consider an integrated patient monitoring system, combining electronic patient records with high-rate acquisition of patient physiological data. There remain many challenges in increasing the robustness of "e-health" applications to a level at which they are clinically useful, particularly in the use of automated algorithms used to detect and cope with artifact in data contained within the electronic patient record, and in analyzing and communicating the resultant data for reporting to clinicians. There is a consequential "plague of pilots," in which engineering prototype systems do not enter into clinical use. This paper describes an approach in which, for the first time, the Emergency Department (ED) of a major research hospital has adopted such systems for use during a large clinical trial. We describe the disadvantages of existing evaluation metrics when applied to such large trials, and propose a solution suitable for large-scale validation. We demonstrate that machine learning technologies embedded within healthcare information systems can provide clinical benefit, with the potential to improve patient outcomes in the busy environment of a major ED and other high-dependence areas of patient care.

  15. Integrated calibration of a 3D attitude sensor in large-scale metrology

    International Nuclear Information System (INIS)

    Gao, Yang; Lin, Jiarui; Yang, Linghui; Zhu, Jigui; Muelaner, Jody; Keogh, Patrick

    2017-01-01

    A novel calibration method is presented for a multi-sensor fusion system in large-scale metrology, which improves the calibration efficiency and reliability. The attitude sensor is composed of a pinhole prism, a converging lens, an area-array camera and a biaxial inclinometer. A mathematical model is established to determine its 3D attitude relative to a cooperative total station by using two vector observations from the imaging system and the inclinometer. There are two areas of unknown parameters in the measurement model that should be calibrated: the intrinsic parameters of the imaging model, and the transformation matrix between the camera and the inclinometer. An integrated calibration method using a three-axis rotary table and a total station is proposed. A single mounting position of the attitude sensor on the rotary table is sufficient to solve for all parameters of the measurement model. A correction technique for the reference laser beam of the total station is also presented to remove the need for accurate positioning of the sensor on the rotary table. Experimental verification has proved the practicality and accuracy of this calibration method. Results show that the mean deviations of attitude angles using the proposed method are less than 0.01°. (paper)
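
    The core of the measurement model above is recovering a 3D attitude from two vector observations (one from the imaging system, one from the inclinometer). A standard closed-form way to do this is the TRIAD construction, sketched below with invented vectors; this is a generic illustration, not the authors' exact model.

```python
# TRIAD: rotation matrix from two vector observations (generic illustration).
# b1, b2: directions measured in the sensor (body) frame;
# r1, r2: the same directions expressed in the reference frame.
import numpy as np

def triad(b1, b2, r1, r2):
    # Orthonormal triads in both frames, anchored on the more trusted pair b1/r1.
    tb1 = b1 / np.linalg.norm(b1)
    tb2 = np.cross(b1, b2); tb2 /= np.linalg.norm(tb2)
    tb3 = np.cross(tb1, tb2)
    tr1 = r1 / np.linalg.norm(r1)
    tr2 = np.cross(r1, r2); tr2 /= np.linalg.norm(tr2)
    tr3 = np.cross(tr1, tr2)
    # Rotation taking body-frame coordinates into reference-frame coordinates.
    return np.column_stack([tr1, tr2, tr3]) @ np.column_stack([tb1, tb2, tb3]).T

R = triad(np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 0.0]),
          np.array([0.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0]))
print(np.round(R, 3))   # here: a 90-degree rotation about the z axis
```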

  16. Integrated biodosimetry in large scale radiological events. Opportunities for civil military co-operation

    International Nuclear Information System (INIS)

    Port, M.; Eder, S.F.; Lamkowski, A.; Majewski, M.; Abend, M.

    2016-01-01

    Radiological events like large-scale radiological or nuclear accidents or terrorist attacks with radionuclide dispersal devices require rapid and precise medical classification ("triage") and medical management of a large number of patients. Estimates of the absorbed dose and in particular predictions of the radiation-induced health effects are mandatory for optimized allocation of limited medical resources and initiation of patient-centred treatment. Among the German Armed Forces Medical Services, the Bundeswehr Institute of Radiobiology offers a wide range of tools for the purpose of medical management to cope with different scenarios. The forward-deployable mobile Medical Task Force has access to state-of-the-art methodologies summarized into approaches such as physical dosimetry (including mobile gamma spectroscopy), clinical "dosimetry" (prodromi, H-Modul) and different means of biological dosimetry (e.g. dicentrics, high-throughput gene expression techniques, gamma-H2AX). The integration of these different approaches enables trained physicians of the Medical Task Force to assess individual health injuries as well as prognostic evaluation, considering modern treatment options. To enhance the capacity of single institutions, networking has been recognized as an important emergency response strategy. The capabilities of physical, biological and clinical "dosimetry" approaches, spanning from low up to high radiation exposures, will be discussed. Furthermore, civil-military opportunities for combined efforts will be demonstrated.

  17. Identifying gene-environment interactions in schizophrenia: contemporary challenges for integrated, large-scale investigations.

    Science.gov (United States)

    van Os, Jim; Rutten, Bart P; Myin-Germeys, Inez; Delespaul, Philippe; Viechtbauer, Wolfgang; van Zelst, Catherine; Bruggeman, Richard; Reininghaus, Ulrich; Morgan, Craig; Murray, Robin M; Di Forti, Marta; McGuire, Philip; Valmaggia, Lucia R; Kempton, Matthew J; Gayer-Anderson, Charlotte; Hubbard, Kathryn; Beards, Stephanie; Stilo, Simona A; Onyejiaka, Adanna; Bourque, Francois; Modinos, Gemma; Tognin, Stefania; Calem, Maria; O'Donovan, Michael C; Owen, Michael J; Holmans, Peter; Williams, Nigel; Craddock, Nicholas; Richards, Alexander; Humphreys, Isla; Meyer-Lindenberg, Andreas; Leweke, F Markus; Tost, Heike; Akdeniz, Ceren; Rohleder, Cathrin; Bumb, J Malte; Schwarz, Emanuel; Alptekin, Köksal; Üçok, Alp; Saka, Meram Can; Atbaşoğlu, E Cem; Gülöksüz, Sinan; Gumus-Akay, Guvem; Cihan, Burçin; Karadağ, Hasan; Soygür, Haldan; Cankurtaran, Eylem Şahin; Ulusoy, Semra; Akdede, Berna; Binbay, Tolga; Ayer, Ahmet; Noyan, Handan; Karadayı, Gülşah; Akturan, Elçin; Ulaş, Halis; Arango, Celso; Parellada, Mara; Bernardo, Miguel; Sanjuán, Julio; Bobes, Julio; Arrojo, Manuel; Santos, Jose Luis; Cuadrado, Pedro; Rodríguez Solano, José Juan; Carracedo, Angel; García Bernardo, Enrique; Roldán, Laura; López, Gonzalo; Cabrera, Bibiana; Cruz, Sabrina; Díaz Mesa, Eva Ma; Pouso, María; Jiménez, Estela; Sánchez, Teresa; Rapado, Marta; González, Emiliano; Martínez, Covadonga; Sánchez, Emilio; Olmeda, Ma Soledad; de Haan, Lieuwe; Velthorst, Eva; van der Gaag, Mark; Selten, Jean-Paul; van Dam, Daniella; van der Ven, Elsje; van der Meer, Floor; Messchaert, Elles; Kraan, Tamar; Burger, Nadine; Leboyer, Marion; Szoke, Andrei; Schürhoff, Franck; Llorca, Pierre-Michel; Jamain, Stéphane; Tortelli, Andrea; Frijda, Flora; Vilain, Jeanne; Galliot, Anne-Marie; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Bulzacka, Ewa; Charpeaud, Thomas; Tronche, Anne-Marie; De Hert, Marc; van Winkel, Ruud; Decoster, Jeroen; Derom, Catherine; Thiery, Evert; Stefanis, Nikos C; Sachs, Gabriele; Aschauer, Harald; Lasser, Iris; Winklbaur, Bernadette; Schlögelhofer, Monika; Riecher-Rössler, Anita; Borgwardt, Stefan; Walter, Anna; Harrisberger, Fabienne; Smieskova, Renata; Rapp, Charlotte; Ittig, Sarah; Soguel-dit-Piquard, Fabienne; Studerus, Erich; Klosterkötter, Joachim; Ruhrmann, Stephan; Paruch, Julia; Julkowski, Dominika; Hilboll, Desiree; Sham, Pak C; Cherny, Stacey S; Chen, Eric Y H; Campbell, Desmond D; Li, Miaoxin; Romeo-Casabona, Carlos María; Emaldi Cirión, Aitziber; Urruela Mora, Asier; Jones, Peter; Kirkbride, James; Cannon, Mary; Rujescu, Dan; Tarricone, Ilaria; Berardi, Domenico; Bonora, Elena; Seri, Marco; Marcacci, Thomas; Chiri, Luigi; Chierzi, Federico; Storbini, Viviana; Braca, Mauro; Minenna, Maria Gabriella; Donegani, Ivonne; Fioritti, Angelo; La Barbera, Daniele; La Cascia, Caterina Erika; Mulè, Alice; Sideli, Lucia; Sartorio, Rachele; Ferraro, Laura; Tripoli, Giada; Seminerio, Fabio; Marinaro, Anna Maria; McGorry, Patrick; Nelson, Barnaby; Amminger, G Paul; Pantelis, Christos; Menezes, Paulo R; Del-Ben, Cristina M; Gallo Tenan, Silvia H; Shuhama, Rosana; Ruggeri, Mirella; Tosato, Sarah; Lasalvia, Antonio; Bonetto, Chiara; Ira, Elisa; Nordentoft, Merete; Krebs, Marie-Odile; Barrantes-Vidal, Neus; Cristóbal, Paula; Kwapil, Thomas R; Brietzke, Elisa; Bressan, Rodrigo A; Gadelha, Ary; Maric, Nadja P; Andric, Sanja; Mihaljevic, Marina; Mirjanic, Tijana

    2014-07-01

    Recent years have seen considerable progress in epidemiological and molecular genetic research into environmental and genetic factors in schizophrenia, but methodological uncertainties remain with regard to validating environmental exposures, and the population risk conferred by individual molecular genetic variants is small. There are now also a limited number of studies that have investigated molecular genetic candidate gene-environment interactions (G × E), however, so far, thorough replication of findings is rare and G × E research still faces several conceptual and methodological challenges. In this article, we aim to review these recent developments and illustrate how integrated, large-scale investigations may overcome contemporary challenges in G × E research, drawing on the example of a large, international, multi-center study into the identification and translational application of G × E in schizophrenia. While such investigations are now well underway, new challenges emerge for G × E research from late-breaking evidence that genetic variation and environmental exposures are, to a significant degree, shared across a range of psychiatric disorders, with potential overlap in phenotype. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  18. Challenge of Primary Voltage Control in Large Scale Wind Integrated Power System: A Danish Power System Case Study

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2013-01-01

    Grid integration of Renewable Energy (RE) at large scale poses major challenges to the secure and stable operation of power systems. This paper presents the challenge of short circuit power and primary voltage control of a wind integrated power system where the majority of conventional generators...... of operational and future models of the western Danish power system has been presented to support the effectiveness of the demonstrated alternatives....... are replaced by wind generators. The impact of large-scale wind integration on fast reactive power support is studied in this paper. Considering both technical and economic aspects, alternatives to address the challenge of dynamic voltage support have also been demonstrated in this paper. A case study...

  19. Monolithic Ge-on-Si lasers for large-scale electronic–photonic integration

    International Nuclear Information System (INIS)

    Liu, Jifeng; Kimerling, Lionel C; Michel, Jurgen

    2012-01-01

    A silicon-based monolithic laser source has long been envisioned as a key enabling component for large-scale electronic–photonic integration in future generations of high-performance computation and communication systems. In this paper we present a comprehensive review of the development of monolithic Ge-on-Si lasers for this application. Starting with a historical review of light emission from the direct gap transition of Ge dating back to the 1960s, we focus on the rapid progress in band-engineered Ge-on-Si lasers in the past five years after a nearly 30-year gap in this research field. Ge has become an interesting candidate for active devices in Si photonics in the past decade due to its pseudo-direct gap behavior and compatibility with Si complementary metal oxide semiconductor (CMOS) processing. In 2007, we proposed combining tensile strain with n-type doping to compensate for the energy difference between the direct and indirect band gap of Ge, thereby achieving net optical gain for CMOS-compatible diode lasers. Here we systematically present theoretical modeling, material growth methods, spontaneous emission, optical gain, and lasing under optical and electrical pumping from band-engineered Ge-on-Si, culminating in recently demonstrated electrically pumped Ge-on-Si lasers with >1 mW output in the communication wavelength window of 1500–1700 nm. The broad gain spectrum enables on-chip wavelength division multiplexing. A unique feature of band-engineered pseudo-direct gap Ge light emitters is that the emission intensity increases with temperature, exactly opposite to conventional direct gap semiconductor light-emitting devices. This extraordinary thermal anti-quenching behavior greatly facilitates monolithic integration on Si microchips where temperatures can reach up to 80 °C during operation. The same band-engineering approach can be extended to other pseudo-direct gap semiconductors, allowing us to achieve efficient light emission at wavelengths previously

  20. Advancing flood risk analysis by integrating adaptive behaviour in large-scale flood risk assessments

    Science.gov (United States)

    Haer, T.; Botzen, W.; Aerts, J.

    2016-12-01

    In the last four decades the global population living in the 1/100-year flood zone has doubled from approximately 500 million to a little less than 1 billion people. Urbanization in low-lying, flood-prone cities further increases the exposed assets, such as buildings and infrastructure. Moreover, climate change will further exacerbate flood risk in the future. Accurate flood risk assessments are important to inform policy-makers and society about current and future flood risk levels. However, these assessments suffer from a major flaw in the way they estimate flood vulnerability and the adaptive behaviour of individuals and governments. Current flood risk projections commonly either assume that vulnerability remains constant or try to mimic vulnerability through an external scenario. Such a static approach leads to a misrepresentation of future flood risk, as humans respond adaptively to flood events, flood risk communication, and incentives to reduce risk. In our study, we integrate adaptive behaviour in a large-scale European flood risk framework through an agent-based modelling approach. This allows for the inclusion of heterogeneous agents, which dynamically respond to each other and a changing environment. We integrate state-of-the-art flood risk maps based on climate scenarios (RCPs) and socio-economic scenarios (SSPs) with government and household agents, which behave autonomously based on (micro-)economic behaviour rules. We show for the first time that excluding adaptive behaviour leads to a major misrepresentation of future flood risk. The methodology is applied to flood risk, but has similar implications for other research in the field of natural hazards. While more research is needed, this multi-disciplinary study advances our understanding of how future flood risk will develop.

  1. Using very large scale integrated optics (VLSIO) to create high-complexity optoelectronic components

    Science.gov (United States)

    West, Lawrence C.; Roberts, Charles W.; Piscani, Emil C.; Dubey, Madan; Jones, Kenneth A.; McLane, George F.

    1996-01-01

    Optics has the fundamental capability of dramatically improving computer performance via the reduction of capacitance for intrinsic high bandwidth communications and low power usage. Yet optical devices have not displaced silicon VLSI in any measure to date. The reason is clear. When placed into systems, the optical devices have not had significantly greater performance in equally complex information processing circuits and similarly low manufacturing cost. An approach demonstrated here uses the same system integration techniques that have been successful for silicon electronics, only applied to optics. Essential for creation of Very Large Scale Integrated Optics, with over 50,000 high speed logic gates per square centimeter, is a new class of Ultra High Confinement (UHC) waveguides. These waveguides are created with high index difference (as high as 4.0 to 1.0) between guide and cladding. The waveguides have been demonstrated with infrared cross sections less than 5% of a square free space wavelength. These waveguides can be manufactured today only in the mid- infrared, but the concepts should scale to the near-infrared as lithography improves. Waveguide corners have been designed and demonstrated with a bend radius of less than one free space wavelength. Resonators have been designed which have over 100 times smaller volume than VCSELs, yet efficiently interconnected laterally in high densities. A connector to the UHC waveguides has been developed and demonstrated using diffractive optical element arrays on the back side of the substrate. The coupler arrays can allow up to 10,000 Gaussian beam connections per square centimeter. This connectivity also has advantages for low-cost 3D packaging for reduced cost and thermal dissipation. Experimental results on the above concepts and components will be presented.

  2. An efficient concordant integrative analysis of multiple large-scale two-sample expression data sets.

    Science.gov (United States)

    Lai, Yinglei; Zhang, Fanni; Nayak, Tapan K; Modarres, Reza; Lee, Norman H; McCaffrey, Timothy A

    2017-12-01

    We have proposed a mixture model based approach to the concordant integrative analysis of multiple large-scale two-sample expression datasets. Since the mixture model is based on the transformed differential expression test P-values (z-scores), it is generally applicable to the expression data generated by either microarray or RNA-seq platforms. The mixture model is simple with three normal distribution components for each dataset to represent down-regulation, up-regulation and no differential expression. However, when the number of datasets increases, the model parameter space increases exponentially due to the component combination from different datasets. In this study, motivated by the well-known generalized estimating equations (GEEs) for longitudinal data analysis, we focus on the concordant components and assume that the proportions of non-concordant components follow a special structure. We discuss the exchangeable, multiset coefficient and autoregressive structures for model reduction, and their related expectation-maximization (EM) algorithms. Then, the parameter space is linear with the number of datasets. In our previous study, we have applied the general mixture model to three microarray datasets for lung cancer studies. We show that more gene sets (or pathways) can be detected by the reduced mixture model with the exchangeable structure. Furthermore, we show that more genes can also be detected by the reduced model. The Cancer Genome Atlas (TCGA) data have been increasingly collected. The advantage of incorporating the concordance feature has also been clearly demonstrated based on TCGA RNA sequencing data for studying two closely related types of cancer. Additional results are included in a supplemental file. Computer program R-functions are freely available at http://home.gwu.edu/∼ylai/research/Concordance. ylai@gwu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights
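
    The basic ingredient of the approach, before any of the reduced multi-dataset structures, is a three-component normal mixture over per-gene z-scores (down-regulated, null, up-regulated). A minimal single-dataset sketch with synthetic data is shown below; the component proportions and means are invented.

```python
# Three-component normal mixture on differential-expression z-scores
# (single dataset, synthetic data; illustrative only).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(0, 1, 8000),    # null genes
                    rng.normal(-3, 1, 1000),   # down-regulated
                    rng.normal(3, 1, 1000)]).reshape(-1, 1)  # up-regulated

gmm = GaussianMixture(n_components=3, random_state=0).fit(z)
order = np.argsort(gmm.means_.ravel())         # sort components: down, null, up
posterior = gmm.predict_proba(z)[:, order]     # per-gene membership probabilities
print(gmm.means_.ravel()[order])               # approximately [-3, 0, 3]
```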

  3. Large Scale Integration of Renewable Power Sources into the Vietnamese Power System

    Science.gov (United States)

    Kies, Alexander; Schyska, Bruno; Thanh Viet, Dinh; von Bremen, Lueder; Heinemann, Detlev; Schramm, Stefan

    2017-04-01

    The Vietnamese power system is expected to expand considerably in upcoming decades. Installed power capacities are projected to grow from 39 GW in 2015 to 129.5 GW by 2030. Installed wind power capacities are expected to grow to 6 GW (0.8 GW in 2015) and solar power capacities to 12 GW (0.85 GW in 2015). This goes hand in hand with an increase of the renewable penetration in the power mix from 1.3% from wind and photovoltaics (PV) in 2015 to 5.4% by 2030. The overall potential for wind power in Vietnam is estimated to be around 24 GW. Moreover, the up-scaling of renewable energy sources was formulated as one of the prioritized targets of the Vietnamese government in the National Power Development Plan VII. In this work, we investigate the transition of the Vietnamese power system towards high shares of renewables. For this purpose, we jointly optimise the expansion of renewable generation facilities for wind and PV, and the transmission grid, within renewable build-up pathways until 2030 and beyond. To simulate the Vietnamese power system and its generation from renewable sources, we use highly spatially and temporally resolved historical weather and load data and the open source modelling toolbox Python for Power System Analysis (PyPSA). We show that the highest potential of renewable generation for wind and PV is observed in southern Vietnam and discuss the resulting need for transmission grid extensions in dependence on the optimal pathway. Furthermore, we show that the smoothing effect of wind power has several considerable beneficial effects and that the Vietnamese hydro power potential can be efficiently used to provide balancing opportunities. This work is part of the R&D project "Analysis of the Large Scale Integration of Renewable Power into the Future Vietnamese Power System" (GIZ, 2016-2018).
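
    The abstract names the open-source toolbox PyPSA; a toy capacity-expansion run of the kind such studies perform might look like the sketch below. The bus name, costs, load and wind series are all invented, and depending on the PyPSA version the solve call is n.optimize() or the older n.lopf().

```python
# Toy PyPSA capacity expansion: one bus, extendable wind and gas
# (all names, costs and time series are invented for illustration).
import pypsa

n = pypsa.Network()
n.set_snapshots(range(4))
n.add("Bus", "vn_bus")
n.add("Load", "demand", bus="vn_bus", p_set=[90.0, 100.0, 80.0, 70.0])
n.add("Generator", "wind", bus="vn_bus", p_nom_extendable=True,
      capital_cost=1000.0, marginal_cost=0.0,
      p_max_pu=[0.3, 0.6, 0.7, 0.4])    # hourly wind availability, per unit
n.add("Generator", "gas", bus="vn_bus", p_nom_extendable=True,
      capital_cost=800.0, marginal_cost=50.0)

n.optimize()                            # older releases: n.lopf()
print(n.generators.p_nom_opt)           # cost-optimal installed capacities
```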

  4. Construction of large scale switch matrix by interconnecting integrated optical switch chips with EDFAs

    Science.gov (United States)

    Liao, Mingle; Wu, Baojian; Hou, Jianhong; Qiu, Kun

    2018-03-01

    Large-scale optical switches are essential components in optical communication networks. We aim to build a large-scale optical switch matrix by interconnecting silicon-based optical switch chips in a 3-stage Clos structure, where EDFAs are needed to compensate for the insertion loss of the chips. The optical signal-to-noise ratio (OSNR) performance of the resulting large-scale optical switch matrix is investigated for TE-mode light, and the experimental results are in agreement with the theoretical analysis. We build a 64×64 switch matrix using 16×16 optical switch chips, and the OSNR and receiver sensitivity can be improved by 0.6 dB and 0.2 dB, respectively, by optimizing the gain configuration of the EDFAs.
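
    As rough arithmetic for the 3-stage Clos composition mentioned above: a 64×64 fabric built from 16×16 edge chips needs 64/16 = 4 input switches and 4 output switches, plus a middle stage whose size sets the blocking behaviour. The sketch below counts modules under idealised textbook conditions; it is not necessarily the authors' exact construction.

```python
# Module count for a 64x64 three-stage Clos fabric from 16x16 chips (idealised).
# Clos(m, n, r): r edge switches of n x m on each side, m middle switches of r x r.
N, n = 64, 16
r = N // n                    # 4 edge switches per side
m_rearrangeable = n           # Slepian-Duguid condition: m >= n
m_strict = 2 * n - 1          # Clos condition: m >= 2n - 1

print(f"edge switches: {r} input-side + {r} output-side")
print(f"middle switches: >= {m_rearrangeable} (rearrangeably non-blocking), "
      f">= {m_strict} (strictly non-blocking)")
# With m = n = 16, each middle switch only needs r x r = 4 x 4 ports, so
# 16x16 chips are underused there: 4 + 16 + 4 = 24 chips in total.
print(f"total chips (rearrangeable case): {2 * r + m_rearrangeable}")
```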

  5. Large scale functional brain networks underlying temporal integration of audio-visual speech perception: An EEG study

    Directory of Open Access Journals (Sweden)

    G. Vinodh Kumar

    2016-10-01

    Full Text Available Observable lip movements of the speaker influence perception of auditory speech. A classical example of this influence is reported by listeners who perceive an illusory (cross-modal) speech sound (McGurk effect) when presented with incongruent audio-visual (AV) speech stimuli. Recent neuroimaging studies of AV speech perception accentuate the role of frontal, parietal and the integrative brain sites in the vicinity of the superior temporal sulcus (STS) for multisensory speech perception. However, if and how the network across the whole brain participates during multisensory perception processing remains an open question. We posit that a large-scale functional connectivity among the neural population situated in distributed brain sites may provide valuable insights involved in processing and fusing of AV speech. Varying the psychophysical parameters in tandem with electroencephalogram (EEG) recordings, we exploited the trial-by-trial perceptual variability of incongruent audio-visual (AV) speech stimuli to identify the characteristics of the large-scale cortical network that facilitates multisensory perception during synchronous and asynchronous AV speech. We evaluated the spectral landscape of EEG signals during multisensory speech perception at varying AV lags. Functional connectivity dynamics for all sensor pairs was computed using the time-frequency global coherence, the vector sum of pairwise coherence changes over time. During synchronous AV speech, we observed enhanced global gamma-band coherence and decreased alpha- and beta-band coherence underlying cross-modal (illusory) perception compared to unisensory perception around a temporal window of 300-600 ms following onset of stimuli. During asynchronous speech stimuli, a global broadband coherence was observed during cross-modal perception at earlier times along with pre-stimulus decreases of lower frequency power, e.g., alpha rhythms for positive AV lags and theta rhythms for negative AV
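
    The connectivity measure described above is built from pairwise spectral coherence between sensors. A minimal sketch for two synthetic channels sharing a 40 Hz (gamma-band) component is shown below; the study's time-resolved global coherence over all sensor pairs is not reproduced, and the sampling rate is an assumption.

```python
# Magnitude-squared coherence between two synthetic EEG-like channels.
# Illustrative only; the study aggregates coherence over all sensor pairs.
import numpy as np
from scipy.signal import coherence

fs = 250.0                                  # Hz, assumed sampling rate
t = np.arange(0.0, 10.0, 1.0 / fs)
rng = np.random.default_rng(1)
shared = np.sin(2 * np.pi * 40.0 * t)       # common 40 Hz (gamma) component
x = shared + rng.normal(0, 1, t.size)
y = shared + rng.normal(0, 1, t.size)

f, cxy = coherence(x, y, fs=fs, nperseg=512)
print(f[np.argmax(cxy)])                    # coherence peaks near 40 Hz
```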

  6. Delineating large-scale migratory connectivity of reed warblers using integrated multistate models

    NARCIS (Netherlands)

    Procházka, Petr; Hahn, Steffen; Rolland, Simon; van der Jeugd, Henk; Csörgő, Tibor; Jiguet, Frédéric; Mokwa, Tomasz; Liechti, Felix; Vangeluwe, Didier; Korner-Nievergelt, Fränzi

    2017-01-01

    Aim Assessing the extent of large-scale migratory connectivity is crucial for understanding the evolution of migratory systems and effective species conservation. It has been, however, difficult to elucidate the annual whereabouts of migratory populations of small animals across the annual cycle.

  7. Delta-Connected Cascaded H-Bridge Multilevel Converters for Large-Scale Photovoltaic Grid Integration

    DEFF Research Database (Denmark)

    Yu, Yifan; Konstantinou, Georgios; Townsend, Christopher D.

    2017-01-01

    The cascaded H-bridge (CHB) converter is becoming a promising candidate for use in next generation large-scale photovoltaic (PV) power plants. However, solar power generation in the three converter phase-legs can be significantly unbalanced, especially in a large geographically-dispersed plant...

  8. Design, development and integration of a large scale multiple source X-ray computed tomography system

    International Nuclear Information System (INIS)

    Malcolm, Andrew A.; Liu, Tong; Ng, Ivan Kee Beng; Teng, Wei Yuen; Yap, Tsi Tung; Wan, Siew Ping; Kong, Chun Jeng

    2013-01-01

    X-ray Computed Tomography (CT) allows visualisation of the physical structures in the interior of an object without physically opening or cutting it. This technology supports a wide range of applications in the non-destructive testing, failure analysis or performance evaluation of industrial products and components. Of the numerous factors that influence the performance characteristics of an X-ray CT system, the energy level in the X-ray spectrum to be used is one of the most significant. The ability of the X-ray beam to penetrate a given thickness of a specific material is directly related to the maximum available energy level in the beam. Higher energy levels allow penetration of thicker components made of denser materials. In response to local industry demand, and in support of ongoing research activity in the area of 3D X-ray imaging for industrial inspection, the Singapore Institute of Manufacturing Technology (SIMTech) engaged in the design, development and integration of a large-scale multiple-source X-ray computed tomography system based on X-ray sources operating at higher energies than previously available in the Institute. The system consists of a large-area direct digital X-ray detector (410 x 410 mm), a multiple-axis manipulator system, a 225 kV open tube microfocus X-ray source and a 450 kV closed tube millifocus X-ray source. The 225 kV X-ray source can be operated in either transmission or reflection mode. The body of the 6-axis manipulator system is fabricated from heavy-duty steel onto which high precision linear and rotary motors have been mounted in order to achieve high accuracy, stability and repeatability. A source-detector distance of up to 2.5 m can be achieved. The system is controlled by a proprietary X-ray CT operating system developed by SIMTech. The system currently can accommodate samples up to 0.5 x 0.5 x 0.5 m in size with weight up to 50 kg. These specifications will be increased to 1.0 x 1.0 x 1.0 m and 100 kg in the future.

  9. Experience of Integrated Safeguards Approach for Large-scale Hot Cell Laboratory

    International Nuclear Information System (INIS)

    Miyaji, N.; Kawakami, Y.; Koizumi, A.; Otsuji, A.; Sasaki, K.

    2010-01-01

    The Japan Atomic Energy Agency (JAEA) has been operating a large-scale hot cell laboratory, the Fuels Monitoring Facility (FMF), located near the experimental fast reactor Joyo at the Oarai Research and Development Center (JNC-2 site). The FMF conducts post irradiation examinations (PIE) of fuel assemblies irradiated in Joyo. The assemblies are disassembled and non-destructive examinations, such as X-ray computed tomography tests, are carried out. Some of the fuel pins are cut into specimens and destructive examinations, such as ceramography and X-ray micro analyses, are performed. Following PIE, the tested material, in the form of a pin or segments, is shipped back to a Joyo spent fuel pond. In some cases, after reassembly of the examined irradiated fuel pins is completed, the fuel assemblies are shipped back to Joyo for further irradiation. For the IAEA to apply the integrated safeguards approach (ISA) to the FMF, a new verification system on material shipping and receiving process between Joyo and the FMF has been established by the IAEA under technical collaboration among the Japan Safeguard Office (JSGO) of MEXT, the Nuclear Material Control Center (NMCC) and the JAEA. The main concept of receipt/shipment verification under the ISA for JNC-2 site is as follows: under the IS, the FMF is treated as a Joyo-associated facility in terms of its safeguards system because it deals with the same spent fuels. Verification of the material shipping and receiving process between Joyo and the FMF can only be applied to the declared transport routes and transport casks. The verification of the nuclear material contained in the cask is performed with the method of gross defect at the time of short notice random interim inspections (RIIs) by measuring the surface neutron dose rate of the cask, filled with water to reduce radiation. The JAEA performed a series of preliminary tests with the IAEA, the JSGO and the NMCC, and confirmed from the standpoint of the operator that this

  10. Optimal Siting and Sizing of Energy Storage System for Power Systems with Large-scale Wind Power Integration

    DEFF Research Database (Denmark)

    Zhao, Haoran; Wu, Qiuwei; Huang, Shaojun

    2015-01-01

    This paper proposes algorithms for optimal siting and sizing of an Energy Storage System (ESS) for the operation planning of power systems with large-scale wind power integration. The ESS in this study aims to mitigate the wind power fluctuations during the interval between two rolling Economic...... optimal siting and sizing of storage units throughout the network. These questions are investigated using an IEEE benchmark system...

  11. Scalability of Semi-Implicit Time Integrators for Nonhydrostatic Galerkin-based Atmospheric Models on Large Scale Cluster

    Science.gov (United States)

    2011-01-01

    James F. Kelly and Francis... present performance statistics to explain the scalability behavior. Keywords: atmospheric models, time integrators, MPI, scalability, performance. ... moving toward the nonhydrostatic regime. The nonhydrostatic atmospheric models, which run at resolutions finer than 10 km, possess fast-moving

  12. Self-encapsulated silver metallization and low-k polyimide for ultra-large-scale integration

    Science.gov (United States)

    Zou, Yuelin Lee

    An integrated approach has been taken to study the interconnect materials system and develop processes related to silver encapsulation, the metal dry etch, diffusion barrier and adhesion promoter, and low-dielectric-constant (low-k) polyimide materials. The objective is to develop a processing scheme for the low-resistance metal and the low-k dielectrics, and to use fabricated test structures to demonstrate better performance, especially in electromigration and dielectric insulation, than the current industry standard for ultra-large-scale integration (ULSI) applications. Silver is an attractive choice of low-resistivity metal to replace aluminum alloy interconnects. A thin diffusion barrier for Ag metallization can be formed by nitridation of Ti using Ag/Ti bilayers in an ammonia ambient. A linear-parabolic model was proposed to describe the kinetics of the nitridation reaction. The Ti-nitride grows fast initially, and a linear kinetics is presumed in a reaction-limiting step. After 15 min annealing, the nitride growth slows down and follows a parabolic growth kinetics. Silver films in Ag/Ti bilayer structures exhibited a strong texture component and a near-bamboo grain structure. In contrast, Ag films with randomly oriented grains were observed on Cr underlayers. The encapsulated silver layers had minimal residual Ti accumulations. X-ray analysis confirmed the absence of intermetallic phase transformation. Therefore, resistivity values of about 2 μΩ·cm, comparable to the bulk value, were obtained for the encapsulated Ag bilayer films. A process to dry-etch Ag metallization for microelectronic applications was reported for the first time. Silver films oxidize readily in the oxygen plasma to form silver oxides, which subsequently crack and spall away due to incorporation of Ag into the oxide lattice. As a result, the oxide formed can be etched away in the reactive ion etch (RIE) chamber. The silver pattern prepared by the dry etching technique demonstrated
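
    The linear-parabolic kinetics mentioned for the Ti nitridation have the familiar Deal-Grove form; a generic statement of that law is sketched below (the paper's fitted constants are not reproduced, and the notation is illustrative).

```latex
% Generic linear-parabolic (Deal-Grove-type) growth law for layer thickness x(t):
\[
  x^{2} + A\,x = B\,(t + \tau), \qquad
  x(t) \approx
  \begin{cases}
    \dfrac{B}{A}\,t, & t \ll A^{2}/4B \quad \text{(reaction-limited, linear)}\\[6pt]
    \sqrt{B\,t},     & t \gg A^{2}/4B \quad \text{(diffusion-limited, parabolic)}
  \end{cases}
\]
```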

  13. Fast large-scale clustering of protein structures using Gauss integrals

    DEFF Research Database (Denmark)

    Harder, Tim; Borg, Mikael; Boomsma, Wouter

    2011-01-01

    -workers – and subsequently performing K-means clustering. Conclusions: Compared to current methods, Pleiades dramatically improves on the time needed to perform clustering, and can cluster a significantly larger number of structures, while providing state-of-the-art results. The number of low energy structures generated...... trajectories. Results: We present Pleiades, a novel approach to clustering protein structures with a rigorous mathematical underpinning. The method approximates clustering based on the root mean square deviation by first mapping structures to Gauss integral vectors – which were introduced by Røgen and co......Motivation: Clustering protein structures is an important task in structural bioinformatics. De novo structure prediction, for example, often involves a clustering step for finding the best prediction. Other applications include assigning proteins to fold families and analyzing molecular dynamics
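
    A minimal sketch of the clustering step described above, assuming each structure has already been mapped to a fixed-length Gauss-integral descriptor vector; computing the descriptors themselves (the Røgen-style Gauss integrals) is not shown, and the array below simply stands in for them.

```python
# K-means over precomputed per-structure descriptor vectors (illustrative).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
descriptors = rng.normal(size=(5000, 30))   # stand-in Gauss integral vectors

km = KMeans(n_clusters=50, n_init=10, random_state=0).fit(descriptors)
labels = km.labels_                         # cluster id per structure
largest = np.bincount(labels).argmax()      # e.g. the most populated cluster
print(largest, int((labels == largest).sum()))
```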

  14. Construction of a large scale integrated map of macrophage pathogen recognition and effector systems

    Directory of Open Access Journals (Sweden)

    O'Sullivan Maire

    2010-05-01

    Full Text Available Abstract Background: In an effort to better understand the molecular networks that underpin macrophage activation, we have been assembling a map of relevant pathways. Manual curation of the published literature was carried out in order to define the components of these pathways and the interactions between them. This information has been assembled into a large integrated directional network and represented graphically using the modified Edinburgh Pathway Notation (mEPN) scheme. Results: The diagram includes detailed views of the toll-like receptor (TLR) pathways, other pathogen recognition systems, NF-kappa-B, apoptosis, interferon signalling, MAP-kinase cascades, MHC antigen presentation and proteasome assembly, as well as selected views of the transcriptional networks they regulate. The integrated pathway includes a total of 496 unique proteins, the complexes formed between them and the processes in which they are involved. This produces a network of 2,170 nodes connected by 2,553 edges. Conclusions: The pathway diagram is a navigable visual aid for displaying a consensus view of the pathway information available for these systems. It is also a valuable resource for computational modelling and an aid in the interpretation of functional genomics data. We envisage that this work will be of value to those interested in macrophage biology and also contribute to the ongoing Systems Biology community effort to develop a standard notation scheme for the graphical representation of biological pathways.

  15. ARRA-Multi-Level Energy Storage and Controls for Large-Scale Wind Energy Integration

    Energy Technology Data Exchange (ETDEWEB)

    David Wenzhong Gao

    2012-09-30

    intelligent controller that increases battery life within hybrid energy storage systems for wind application was developed. Comprehensive studies have been conducted and simulation results are analyzed. A permanent magnet synchronous generator, coupled with a variable speed wind turbine, is connected to a power grid (14-bus system). A rectifier, a DC-DC converter and an inverter are used to provide a complete model of the wind system. An Energy Storage System (ESS) is connected to a DC-link through a DC-DC converter. An intelligent controller is applied to the DC-DC converter to help the Voltage Source Inverter (VSI) regulate output power and also to control the operation of the battery and supercapacitor. This ensures a longer lifetime for the batteries. The detailed model is simulated in PSCAD/EMTP. Additionally, economic analysis has been done for different methods that can reduce the wind power output fluctuation. These methods are wind power curtailment, dump loads, a battery energy storage system (BESS) and a hybrid energy storage system (HESS). From the results, application of a single advanced HESS can save more money for wind turbine owners. Generally the income would be the same for most methods because the wind does not change and maximum power point tracking can be applied to most systems. On the other hand, the cost is the key point. For short-term use and small wind turbines, the BESS is the cheapest applicable method, while for large-scale wind turbines and wind farms the application of an advanced HESS would be the best method to reduce the power fluctuation. The key outcomes of this project include a new intelligent controller that can reduce the energy exchanged between the battery and DC-link, reduce charging/discharging cycles, reduce depth of discharge and increase the time interval between charge/discharge, and lower battery temperature. This improves the overall lifetime of battery energy storage. Additionally, a new design method based on probability helps optimize the

  16. Integration of Large-Scale Optimization and Game Theory for Sustainable Water Quality Management

    Science.gov (United States)

    Tsao, J.; Li, J.; Chou, C.; Tung, C.

    2009-12-01

    Sustainable water quality management requires total mass control of pollutant discharge, based on the principles of not exceeding the assimilative capacity of a river and of equity among generations. The stream assimilative capacity is the carrying capacity of a river for the maximum waste load that does not violate the water quality standard, and the spirit of total mass control is to optimize the waste load allocation among subregions. Toward the goal of sustainable watershed development, this study uses large-scale optimization theory to optimize profit and to find the marginal values of loadings as a reference for a fair price, and then determines how the whole watershed can reach equilibrium through water quality trading. In addition, game theory plays an important role in maximizing both individual and collective profits. The study shows that a water quality trading market is viable in some situations and can leave all participants better off.
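
    A toy version of the waste-load-allocation problem sketched above: maximise total discharger profit subject to the stream's assimilative capacity, with the dual value of the capacity constraint playing the role of the marginal (fair) price the abstract refers to. All numbers are invented, and reading the duals this way requires scipy's HiGHS solver.

```python
# Toy waste load allocation under an assimilative-capacity cap (invented data).
import numpy as np
from scipy.optimize import linprog

profit = np.array([40.0, 25.0, 55.0])   # profit per unit load, per discharger
impact = np.array([1.0, 0.6, 1.4])      # water-quality impact per unit load
capacity = 100.0                        # assimilative capacity of the reach

# linprog minimises, so negate profit; each discharger capped at 60 units.
res = linprog(-profit,
              A_ub=impact.reshape(1, -1), b_ub=[capacity],
              bounds=[(0.0, 60.0)] * 3, method="highs")
print(res.x)                            # optimal loads per discharger
print(res.ineqlin.marginals)            # dual of the cap; |value| ~ marginal price
```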

  17. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1,126 deaths and displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy for coping with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it has been suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. Yet adaptive behaviour towards flood risk reduction, and the interaction between governments, insurers, and individuals, has hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed that includes agent representatives for the administrative stakeholders of European Member States, insurer and reinsurer markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach, this study is a first contribution to overcoming the limitations of traditional large-scale flood risk models, in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.
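
    As a sketch of the household-agent logic such a model might contain — a minimal, hypothetical rule set; the actual European ABM, its behaviour models and its damage functions are far richer:

```python
# Minimal agent-based sketch of household flood adaptation: each year a
# household compares the expected avoided damage of a protective measure
# (plus any insurance premium rebate) against the measure's annualized
# cost. All parameter values are hypothetical illustrations.
import random

class Household:
    def __init__(self, asset_value):
        self.asset_value = asset_value
        self.protected = False

    def decide(self, flood_prob, damage_frac, annual_cost,
               protection_factor, premium_rebate):
        if self.protected:
            return
        expected_damage = flood_prob * damage_frac * self.asset_value
        avoided = expected_damage * protection_factor
        if avoided + premium_rebate > annual_cost:
            self.protected = True

def simulate(years=30, n=1000, flood_prob=0.01):
    households = [Household(random.uniform(1e5, 5e5)) for _ in range(n)]
    for year in range(years):
        p = flood_prob * (1 + 0.02 * year)  # rising hazard, e.g. climate change
        for h in households:
            h.decide(flood_prob=p, damage_frac=0.3, annual_cost=800.0,
                     protection_factor=0.4, premium_rebate=150.0)
    return sum(h.protected for h in households) / n

if __name__ == "__main__":
    random.seed(1)
    print(f"share of protected households: {simulate():.2%}")
```

    Coupling such agents to a flood damage model then closes the loop: protective decisions lower simulated damages, which in turn feed back into next year's risk perception and premiums.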

  18. Large scale continuous integration and delivery : Making great software better and faster

    NARCIS (Netherlands)

    Stahl, Daniel

    2017-01-01

    Since the inception of continuous integration, and later continuous delivery, the methods of producing software in the industry have changed dramatically over the last two decades. Automated, rapid and frequent compilation, integration, testing, analysis, packaging and delivery of new software

  19. Terahertz imaging technique and application in large scale integrated circuit failure inspection

    Science.gov (United States)

    Di, Zhi-gang; Yao, Jian-quan; Jia, Chun-rong; Xu, De-gang; Bing, Pi-bin; Yang, Peng-fei; Zheng, Yi-bo

    2010-11-01

    Terahertz radiation, a comparatively new optical source, usually refers to electromagnetic waves with frequencies between 0.1 THz and 10 THz, a region of the electromagnetic spectrum lying in the gap between microwaves and infrared light. With the development of laser techniques, quantum well techniques and compound semiconductor techniques, many new terahertz techniques have been pioneered, motivated in part by the vast range of possible applications for terahertz imaging, sensing, and spectroscopy. THz imaging techniques are introduced: THz imaging yields not only intensity images but also phase information in the frequency domain. Consequently, images of suspicious objects such as concealed metal weapons are much sharper and more readily identified when imaged with THz imaging scanners. On this basis, the application of THz imaging to nondestructive examination, and more concretely to large-scale integrated circuit failure inspection, is described; the key techniques of this application are introduced and future prospects are discussed. With the development of related THz technologies, it can be concluded that THz imaging technology has a promising application outlook.

  20. An integrated platform for large-scale data collection and precise perturbation of live Drosophila embryos.

    Science.gov (United States)

    Levario, Thomas J; Zhao, Charles; Rouse, Tel; Shvartsman, Stanislav Y; Lu, Hang

    2016-02-11

    Understanding the fundamental principles governing embryogenesis is a key goal of developmental biology. Direct observation of embryogenesis via in vivo live imaging is vital to understanding it; yet tedious sample preparation makes it difficult to acquire the large-scale imaging data that are often required to overcome experimental and biological noise in quantitative studies. Furthermore, it is often difficult, and sometimes impossible, to incorporate environmental perturbation for understanding developmental responses to external stimuli. To address this issue, we have developed a method for high-throughput imaging of live embryos, delivery of precise environmental perturbations, and unbiased data extraction. This platform includes a microfluidic device optimized both for live embryos and for precise perturbations of the microenvironment of the developing embryos. In addition, we developed software for simple yet accurate automated segmentation of fluorescent images and automated data extraction. Using a quantitative assessment, we find that embryos develop normally within the microfluidic device. Finally, we show an application of the high-throughput assay for monitoring developmental responses to an external stimulus: anoxia-induced developmental arrest in Drosophila embryos. With slight modifications, the method developed in this work can be applied to many other models of development and other stimulus-response behaviors during development.

  1. Integral large scale experiments on hydrogen combustion for severe accident code validation-HYCOM

    International Nuclear Information System (INIS)

    Breitung, W.; Dorofeev, S.; Kotchourko, A.; Redlinger, R.; Scholtyssek, W.; Bentaib, A.; L'Heriteau, J.-P.; Pailhories, P.; Eyink, J.; Movahed, M.; Petzold, K.-G.; Heitsch, M.; Alekseev, V.; Denkevits, A.; Kuznetsov, M.; Efimenko, A.; Okun, M.V.; Huld, T.; Baraldi, D.

    2005-01-01

    A joint research project was carried out in the EU Fifth Framework Programme concerning hydrogen risk in nuclear power plants. The goals were: firstly, to create a new database of results on hydrogen combustion experiments in the slow to turbulent combustion regimes; secondly, to validate the partners' CFD and lumped-parameter codes against the experimental data, and to evaluate suitable parameter sets for application calculations; thirdly, to conduct a benchmark exercise by applying the codes to the full-scale analysis of a postulated hydrogen combustion scenario in a light water reactor containment after a core melt accident. The paper describes the work programme of the project and the partners' activities. Significant progress has been made in the experimental area, where test series in medium- and large-scale facilities have been carried out with a focus on specific effects of scale, multi-compartment geometry, heat losses and venting. The data were used for the validation of the partners' CFD and lumped-parameter codes, which included blind predictive calculations and pre- and post-test intercomparison exercises. Finally, a benchmark exercise was conducted by applying the codes to the full-scale analysis of a hydrogen combustion scenario. The comparison and assessment of the results of the validation phase and of the challenging containment calculation exercise give deep insight into the quality, capabilities and limits of the CFD and lumped-parameter tools currently in use at various research laboratories.

  2. Involvement of herbal medicine as a cause of mesenteric phlebosclerosis: results from a large-scale nationwide survey.

    Science.gov (United States)

    Shimizu, Seiji; Kobayashi, Taku; Tomioka, Hideo; Ohtsu, Kensei; Matsui, Toshiyuki; Hibi, Toshifumi

    2017-03-01

    Mesenteric phlebosclerosis (MP) is a rare disease characterized by venous calcification extending from the colonic wall to the mesentery, with chronic ischemic changes in the intestine resulting from impaired venous return. It is an idiopathic disease, but increasing attention has been paid to the potential involvement of herbal medicine, or Kampo, in its etiology. Until now there have been only scattered case reports; no large-scale studies had been conducted to unravel the clinical characteristics and etiology of the disease. A nationwide survey was conducted using questionnaires to assess possible etiology (particularly the involvement of herbal medicine), clinical manifestations, disease course, and treatment of MP. Data from 222 patients were collected. Among the 169 patients (76.1%) whose history of herbal medicine use was obtained, 147 (87.0%) had used herbal medicines. The use of herbal medicines containing sanshishi (gardenia fruit, Gardenia jasminoides Ellis) was reported in 119 of 147 patients (81.0%). The use of herbal medicine containing sanshishi was thus confirmed in 70.4% of the 169 patients whose herbal medicine history was obtained. The duration of sanshishi use ranged from 3 to 51 years (mean 13.6 years). Patients who discontinued sanshishi showed a better outcome than those who continued it. The use of herbal medicine containing sanshishi is associated with the etiology of MP. Although it may not be the sole causative factor, gastroenterologists need to be aware of the potential risk of herbal medicine containing sanshishi for the development of MP.

  3. The application of J integral to measure cohesive laws in materials undergoing large scale yielding

    DEFF Research Database (Denmark)

    Sørensen, Bent F.; Goutianos, Stergios

    2015-01-01

    We explore the possibility of determining cohesive laws by the J-integral approach for materials having non-linear stress-strain behaviour (e.g. polymers and composites) by the use of a DCB sandwich specimen, consisting of stiff elastic beams bonded to the non-linear test material, loaded with pure bending moments. For a wide range of parameters of the non-linear material, the plastic unloading during crack extension is small, resulting in J integral values (fracture resistance) that deviate at most 15% from the work of the cohesive traction. Thus the method can be used to extract cohesive laws directly from experiments without any presumption about their shape. Finally, the DCB sandwich specimen was also analysed using the I integral to quantify the overestimation of the steady-state fracture resistance obtained with the J-integral-based method.
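
    For reference, the attraction of this specimen geometry is that the J integral has a closed form, and the cohesive law follows by differentiating the measured fracture resistance with respect to the end-opening. These are standard results for a homogeneous symmetric DCB under pure bending moments (the sandwich version adds the compliance of the bonded layer); B is the specimen width, h the beam height, E and ν the elastic constants:

```latex
% J integral of a symmetric DCB loaded by pure bending moments M (plane strain)
J = \frac{12\,(1-\nu^{2})\,M^{2}}{E\,B^{2}\,h^{3}},
\qquad
% cohesive law recovered from the measured resistance curve J_R(\delta^*)
\sigma(\delta^{*}) = \frac{\partial J_{R}}{\partial \delta^{*}} .
```

    Because J depends only on the applied moments and elastic constants, no crack-length measurement is needed, which is what makes the direct extraction of σ(δ*) from experiments practical.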

  4. The application of J integral to measure cohesive laws under large-scale yielding

    DEFF Research Database (Denmark)

    Goutianos, Stergios; Sørensen, Bent F.

    2016-01-01

    A method is developed to obtain the mode I cohesive law of elastic-plastic materials using a Double Cantilever Beam sandwich specimen loaded with pure bending moments. The approach is based on the validity of the J integral for materials having a non-linear stress-strain relationship without...

  5. 75 FR 24742 - In the Matter of Certain Large Scale Integrated Circuit Semiconductor Chips and Products...

    Science.gov (United States)

    2010-05-05

    ... Semiconductor, Xiqing Integrated Semiconductor, Manufacturing Site, No. 15 Xinghua Road, Xiqing Economic... Malaysia Sdn. Bhd., NO. 2 Jalan SS 8/2, Free Industrial Zone, Sungai Way, 47300 Petaling Jaya, Selengor, Malaysia. Freescale Semiconductor Pte. Ltd., 7 Changi South Street 2, 03-00, Singapore 486415. Freescale...

  6. Integration of large-scale heat pumps in the district heating systems of Greater Copenhagen

    DEFF Research Database (Denmark)

    Bach, Bjarne; Werling, Jesper; Ommen, Torben Schmidt

    2016-01-01

    This study analyses the technical and private economic aspects of integrating a large capacity of electric driven HP (heat pumps) in the Greater Copenhagen DH (district heating) system, which is an example of a state-of-the-art large district heating system with many consumers and suppliers...

  7. Integrating scientific knowledge into large-scale restoration programs: the CALFED Bay-Delta Program experience

    Science.gov (United States)

    Taylor, K.A.; Short, A.

    2009-01-01

    Integrating science into resource management activities is a goal of the CALFED Bay-Delta Program, a multi-agency effort to address water supply reliability, ecological condition, drinking water quality, and levees in the Sacramento-San Joaquin Delta of northern California. Under CALFED, many different strategies were used to integrate science, including interaction between the research and management communities, public dialogues about scientific work, and peer review. This paper explores ways science was (and was not) integrated into CALFED's management actions and decision systems through three narratives describing different patterns of scientific integration and application in CALFED. Though a collaborative process and certain organizational conditions may be necessary for developing new understandings of the system of interest, we find that those factors are not sufficient for translating that knowledge into management actions and decision systems. We suggest that the application of knowledge may be facilitated or hindered by (1) differences in the objectives, approaches, and cultures of scientists operating in the research community and those operating in the management community and (2) other factors external to the collaborative process and organization.

  8. Large-scale integration of wind power into the existing Chinese energy system

    DEFF Research Database (Denmark)

    Liu, Wen; Lund, Henrik; Mathiesen, Brian Vad

    2011-01-01

    This paper presents the ability of the existing Chinese energy system to integrate wind power and explores how the Chinese energy system needs to prepare itself in order to integrate more fluctuating renewable energy in the future. With this purpose in mind, a model of the Chinese energy system has been constructed using EnergyPLAN, based on the year 2007, which has then been used for investigating three issues. Firstly, the accuracy of the model itself has been examined; then the maximum feasible wind power penetration in the existing energy system has been identified; finally, barriers … stability, the maximum feasible wind power penetration in the existing Chinese energy system is approximately 26% from both technical and economic points of view. A fuel efficiency decrease occurred when increasing wind power penetration in the system, due to its rigid power supply structure and the task

  9. GIGGLE: a search engine for large-scale integrated genome analysis

    Science.gov (United States)

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-01-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation. PMID:29309061
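
    The kind of query GIGGLE answers can be illustrated with a toy overlap counter. This is a pure-Python illustration of the concept only — GIGGLE itself uses a unified index over billions of intervals and ranks hits by statistical significance, and should be used via the tool at the URL above; the annotation sets and query below are hypothetical:

```python
# Toy GIGGLE-style query: for each indexed annotation set, count intervals
# overlapping the query regions, then rank the sets by overlap count.
from bisect import bisect_left, bisect_right

def build_index(intervals):
    """Pre-sort starts and ends of half-open intervals [s, e)."""
    return (sorted(s for s, _ in intervals),
            sorted(e for _, e in intervals))

def count_overlaps(index, q_start, q_end):
    starts, ends = index
    # overlapping = all - (entirely left of query) - (entirely right of query)
    return bisect_left(starts, q_end) - bisect_right(ends, q_start)

annotation_sets = {                      # hypothetical annotation files
    "tf_chipseq": [(5, 20), (30, 45), (100, 140)],
    "enhancers":  [(10, 15), (200, 260)],
}
query = [(12, 35), (120, 130)]

indexes = {name: build_index(iv) for name, iv in annotation_sets.items()}
ranking = sorted(
    ((sum(count_overlaps(ix, s, e) for s, e in query), name)
     for name, ix in indexes.items()),
    reverse=True,
)
for hits, name in ranking:
    print(f"{name}: {hits} overlapping intervals")
```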

  11. Perspectives on Clinical Informatics: Integrating Large-Scale Clinical, Genomic, and Health Information for Clinical Care

    Directory of Open Access Journals (Sweden)

    In Young Choi

    2013-12-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project drove the development of technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both the clinical and biology domains is expected to enable personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. A representative standardization effort, the Clinical Bioinformatics Ontology (CBO), aims to provide uniquely identified concepts covering molecular pathology terminologies. Since individual genome data can readily be used to predict current and future health status, safeguards to ensure confidentiality should be considered. In this paper, we focus on the informatics aspects of integrating the EMR community and the BI community by identifying opportunities, challenges, and approaches to provide the best possible care for our patients and the population.

  12. Large-scale 3-D modeling by integration of resistivity models and borehole data through inversion

    DEFF Research Database (Denmark)

    Foged, N.; Marker, Pernille Aabye; Christiansen, A. V.

    2014-01-01

    … and the borehole data set in one variable. Finally, we use k-means clustering to generate a 3-D model of the subsurface structures. We apply the procedure to the Norsminde survey in Denmark, integrating approximately 700 boreholes and more than 100,000 resistivity models from an airborne survey … in the parameterization of the 3-D model covering 156 km². The final five-cluster 3-D model differentiates between clay materials and different high-resistivity materials from information held in the resistivity models and borehole observations, respectively.
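
    A minimal sketch of the clustering step follows. The feature construction here is hypothetical — the study first inverts borehole and resistivity information into a common translator variable — but the grouping of grid cells into a fixed number of structural classes is as described:

```python
# Sketch of the final clustering step: combine a per-cell resistivity value
# with a borehole-derived clay fraction and group cells into five 3-D
# structural classes, as in the five-cluster model described above.
# Feature values are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_cells = 10_000
log_resistivity = rng.normal(loc=1.5, scale=0.5, size=n_cells)  # log10(ohm-m)
clay_fraction = rng.beta(2, 2, size=n_cells)                    # 0..1, boreholes

X = StandardScaler().fit_transform(
    np.column_stack([log_resistivity, clay_fraction]))

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))   # number of grid cells per structural class
```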

  13. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks

    Directory of Open Access Journals (Sweden)

    Raja Jurdak

    2008-11-01

    Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge for environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities located in geographically dispersed areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium-range wireless mesh networks and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high-level policies that can adapt the resolution of information collected at the sensors, set preferred performance targets for their application, and run a wide range of queries and analyses on both real-time and historical data. All system components and processes are described in this paper.

  14. Implicit Particle Filter for Power System State Estimation with Large Scale Renewable Power Integration.

    Science.gov (United States)

    Uzunoglu, B.; Hussaini, Y.

    2017-12-01

    The implicit particle filter is a sequential Monte Carlo method for data assimilation that guides the particles to high-probability regions via an implicit step. It optimizes a nonlinear cost function that can be inherited from legacy assimilation routines. Dynamic state estimation for near-real-time applications in power systems is becoming increasingly important with the integration of variable wind and solar power generation. New advanced state estimation tools intended to replace the older generation of state estimators should, in addition to providing a general framework for handling complexity, be able to address legacy software and integrate it into a common mathematical framework, meeting the power industry's need for cautious, evolutionary change rather than a wholesale revolutionary one, while addressing nonlinearity and non-normal behaviour. This work implements the implicit particle filter as a tool for estimating the states of a power system and presents the first implicit particle filter application study on power system state estimation. The implicit particle filter is introduced into power systems, and simulations are presented for a three-node benchmark power system. The performance of the filter on the presented problem is analyzed and the results are presented.
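
    For orientation, the standard bootstrap (sampling-importance-resampling) particle filter that the implicit variant builds on is sketched below on a toy scalar system; the implicit particle filter differs in replacing the blind proposal with an optimization-guided implicit step per particle. The model and noise values are hypothetical, not the paper's three-node benchmark:

```python
# Bootstrap particle filter on a scalar nonlinear state-space model.
# The implicit particle filter shares this propagate/weight/resample
# skeleton but steers each particle toward the high-probability region
# of the posterior by solving a small optimization problem.
import numpy as np

rng = np.random.default_rng(42)
T, N = 50, 500                 # time steps, particles
q, r = 0.1, 0.2                # process and measurement noise std

def f(x, t):                   # nonlinear state transition (toy model)
    return 0.5 * x + 2.0 * np.cos(0.3 * t)

def h(x):                      # nonlinear measurement function
    return x ** 3 / 20.0

# Simulate a true trajectory and noisy measurements.
x_true = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x_true[t] = f(x_true[t - 1], t) + q * rng.standard_normal()
    y[t] = h(x_true[t]) + r * rng.standard_normal()

particles = rng.standard_normal(N)
estimates = np.zeros(T)
for t in range(1, T):
    particles = f(particles, t) + q * rng.standard_normal(N)   # propagate
    w = np.exp(-0.5 * ((y[t] - h(particles)) / r) ** 2)        # weight
    w /= w.sum()
    estimates[t] = np.dot(w, particles)                        # posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]          # resample

print("RMSE:", np.sqrt(np.mean((estimates - x_true) ** 2)))
```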

  15. Development of hospital-integrated large-scale PACS in Seoul National University Hospital

    Science.gov (United States)

    Kim, JongHyo; Yeon, Kyoung M.; Han, Man Chung; Lee, Dong Hyuk; Cho, Han I.

    1997-05-01

    The SNUH has started a PACS project with three main goals: to develop a fully hospital-integrated PACS, to develop a cost-effective PACS using an open systems architecture, and to extend PACS' role to advanced applications such as image-guided surgery, multimedia-assisted education and research. In order to achieve these goals, we have designed a PACS architecture which takes advantage of client-server computing, a high-speed communication network, the computing power of up-to-date high-end PCs, and advanced image compression methods. We have installed an ATM-based communication network in the radiology department and in-patient wards, and implemented DICOM-compliant acquisition modules, image storage and management servers, and high-resolution display workstations based on high-end PCs and the Microsoft Windows 95 and Windows NT operating systems. The SNUH PACS is in partial-scale operation now and will be expanded to full scale by the end of 1998.

  16. Large-scale integration of renewable energy into international electricity markets

    DEFF Research Database (Denmark)

    Lund, Henrik

    2004-01-01

    The paper presents the ability of different energy systems and regulation strategies to integrate renewable energy sources (RES) into the electricity supply system. The fluctuating electricity production from renewable energy must interact with the rest of the production units in order to make it possible for the system to secure a balance between supply and demand. At the same time, most European electricity systems are in the process of being transformed into competitive electricity markets. Already today, the annual share of wind power in the western part of Denmark is nearly 20 percent, which has led to excess electricity production and thus low prices on the Nord Pool electricity market. This paper describes how such problems can be avoided by the introduction of flexible energy systems, including changes in the regulation of power plants and investments in heat pumps and heat storage.

  17. Some Effects of integrated Production Planning in Large-scale Kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    An integrated production system is a system that views the production, distribution, serving and ordering of meals as one uninterrupted supply chain stretching from the patient to the kitchen. To meet the stakeholders' demands of flexibility, productivity, low cost, freshness etc., production must consist of as few steps as possible. When using the philosophies of Lean Manufacturing, a concept obeying the above constraints can be created. The focus is turned to time and work planning in production, core competences in production, and how to meet consumer demands. This means that the system is able to handle both consumer demands for flexible, freshly prepared menus and food service managers' demands for up-to-date production systems.

  18. Non-destructive screening method for radiation hardened performance of large scale integration

    International Nuclear Information System (INIS)

    Zhou Dong; Xi Shanbin; Guo Qi; Ren Diyuan; Li Yudong; Sun Jing; Wen Lin

    2013-01-01

    The space radiation environment can induce radiation damage in electronic devices. As the performance of commercial devices is generally superior to that of radiation-hardened devices, it is desirable to select, from among commercial devices, those with good radiation-hardened performance; applying such devices in space systems could improve system reliability. Combining mathematical regression analysis with different physical stressing experiments, we investigated a non-destructive screening method for the radiation-hardened performance of integrated circuits. The relationship between changes in typical parameters and the radiation performance of the circuit was discussed, and the irradiation-sensitive parameters were identified. A multiple linear regression equation for predicting radiation performance was established. Finally, the regression equations under stress conditions were verified by actual irradiation. The results show that the reliability and accuracy of the non-destructive screening method can be improved by combining mathematical regression analysis with practical stressing experiments. (authors)
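
    A minimal sketch of such a multiple-regression predictor follows. The stress parameters, response variable and coefficients below are synthetic placeholders; in the actual method the sensitive parameters and the regression coefficients are determined experimentally:

```python
# Fit a multiple linear regression predicting a radiation-response proxy
# from parameter shifts measured under non-destructive electrical/thermal
# stress, then screen a new device. All data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(7)
n = 40
X = rng.normal(size=(n, 3))                 # stress-test parameter shifts
true_coeffs = np.array([0.8, -0.3, 0.5])    # hypothetical sensitivities
y = X @ true_coeffs + 0.1 * rng.standard_normal(n)

A = np.column_stack([X, np.ones(n)])        # design matrix with intercept
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", coeffs[:-1], "intercept:", coeffs[-1])

# Predict the radiation response of a new device from its stress data;
# devices below a chosen threshold would pass the screening.
x_new = np.array([0.2, -0.1, 0.4])
print("predicted radiation response:", x_new @ coeffs[:-1] + coeffs[-1])
```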

  19. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA.

    Science.gov (United States)

    Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P

    2014-01-01

    Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  20. Fossil fleet transition with fuel changes and large scale variable renewable integration

    Energy Technology Data Exchange (ETDEWEB)

    James, Revis [Electric Power Research Institute, Palo Alto, CA (United States); Hesler, Stephen [Electric Power Research Institute, Palo Alto, CA (United States); Bistline, John [Electric Power Research Institute, Palo Alto, CA (United States)

    2015-03-31

    Variability in demand as seen by grid-connected dispatchable generators can increase due to factors such as greater production from variable generation assets (for example, wind and solar), increased reliance on demand response or customer-driven automation, and aggregation of loads. This variability results in a need for these generators to operate in a range of different modes, collectively referred to as "flexible operations." This study is designed to inform power companies, researchers, and policymakers of the scope and trends in increasing levels of flexible operations, as well as the reliability challenges and impacts for dispatchable assets. Background: Because there is rarely a direct monetization of the value of operational flexibility, the decision to provide such flexibility typically depends on unit- and region-specific decisions made by asset owners. It is very likely that much greater and more widespread flexible-operations capability will be needed, due to increased variability in demand seen by grid-connected generators, uncertainty regarding investment in new units to provide adequate operational flexibility, and the retirement of older, uncontrolled subcritical pulverized coal units. Objective: To enhance understanding of the technical challenges and operational impacts associated with the dispatchable assets needed to increase operational flexibility and support variable demand. Approach: The study consists of three elements: a literature review of relevant prior studies, analysis of detailed scenarios for the evolution of the future fleet over the next 35 years, and an engineering assessment of the degree and scope of technical challenges associated with the transformation to the future fleet. The study approach integrated two key elements rarely brought together in a single analysis: 1) long-term capacity planning, which enables modeling of unit retirements and new asset investments, and 2) unit commitment analysis, which permits examination of

  1. Atypical language laterality is associated with large-scale disruption of network integration in children with intractable focal epilepsy.

    Science.gov (United States)

    Ibrahim, George M; Morgan, Benjamin R; Doesburg, Sam M; Taylor, Margot J; Pang, Elizabeth W; Donner, Elizabeth; Go, Cristina Y; Rutka, James T; Snead, O Carter

    2015-04-01

    Epilepsy is associated with disruption of integration in distributed networks, together with altered localization for functions such as expressive language. The relation between atypical network connectivity and altered localization is unknown. In the current study, we tested whether atypical expressive language laterality was associated with altered large-scale network integration in children with medically intractable localization-related epilepsy (LRE). Twenty-three right-handed children (age range 8-17) with medically intractable LRE performed a verb generation task in fMRI. Language network activation was identified and the laterality index (LI) was calculated within the pars triangularis and pars opercularis. Resting-state data from the same cohort were subjected to independent component analysis. Dual regression was used to identify associations between resting-state integration and LI values. Higher positive values of the LI, indicating typical language localization, were associated with stronger functional integration of various networks including the default mode network (DMN). The normally symmetric resting-state networks showed a pattern of lateralized connectivity mirroring that of language function. The association between atypical language localization and network integration implies a widespread disruption of neural network development. These findings may inform the interpretation of localization studies by providing novel insights into the reorganization of neural networks in epilepsy.
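
    The laterality index is conventionally computed from activation in homologous left- and right-hemisphere regions of interest (here, pars triangularis and pars opercularis). A common form — the abstract does not spell out the exact variant used — is:

```latex
\mathrm{LI} \;=\; \frac{A_{L} - A_{R}}{A_{L} + A_{R}},
\qquad -1 \le \mathrm{LI} \le 1,
```

    where A_L and A_R are the activated voxel counts (or summed activation) in the left and right regions; positive values indicate typical left-lateralized language, negative values atypical right-lateralized language.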

  2. A refined regional modeling approach for the Corn Belt - Experiences and recommendations for large-scale integrated modeling

    Science.gov (United States)

    Panagopoulos, Yiannis; Gassman, Philip W.; Jha, Manoj K.; Kling, Catherine L.; Campbell, Todd; Srinivasan, Raghavan; White, Michael; Arnold, Jeffrey G.

    2015-05-01

    Nonpoint source pollution from agriculture is the main source of nitrogen and phosphorus in the stream systems of the Corn Belt region in the Midwestern US. This region comprises two large river basins, the intensely row-cropped Upper Mississippi River Basin (UMRB) and Ohio-Tennessee River Basin (OTRB), which are considered the key contributing areas for the Northern Gulf of Mexico hypoxic zone according to the US Environmental Protection Agency. Thus, in this area it is of utmost importance to ensure that intensive agriculture for food, feed and biofuel production can coexist with a healthy water environment. To address these objectives within a river basin management context, an integrated modeling system has been constructed with the hydrologic Soil and Water Assessment Tool (SWAT) model, capable of estimating river basin responses to alternative cropping and/or management strategies. To improve modeling performance compared to previous studies and provide a spatially detailed basis for scenario development, this SWAT Corn Belt application incorporates a greatly refined subwatershed structure based on 12-digit hydrologic units, or 'subwatersheds', as defined by the US Geological Survey. The model setup, calibration and validation are time-demanding and challenging tasks for these large systems, given the scale-intensive data requirements and the need to ensure the reliability of flow and pollutant load predictions at multiple locations. Thus, the objectives of this study are both to comprehensively describe this large-scale modeling approach, providing estimates of pollution and crop production in the region, and to present the strengths and weaknesses of integrated modeling at such a large scale, along with how it can be improved on the basis of the current modeling structure and results. The predictions were based on a semi-automatic hydrologic calibration approach for large-scale and spatially detailed modeling studies, with the use of the Sequential

  3. Real-time adaptive ramp metering : phase I, MILOS proof of concept (multi-objective, integrated, large-scale, optimized system).

    Science.gov (United States)

    2006-12-01

    Over the last several years, researchers at the University of Arizona's ATLAS Center have developed an adaptive ramp metering system referred to as MILOS (Multi-Objective, Integrated, Large-Scale, Optimized System). The goal of this project is ...

  4. Integrating an agent-based model into a large-scale hydrological model for evaluating drought management in California

    Science.gov (United States)

    Sheffield, J.; He, X.; Wada, Y.; Burek, P.; Kahil, M.; Wood, E. F.; Oppenheimer, M.

    2017-12-01

    California has endured record-breaking drought since winter 2011 and will likely experience more severe and persistent droughts in the coming decades under a changing climate. At the same time, human water management practices can also affect drought frequency and intensity, which underscores the importance of human behaviour in effective drought adaptation and mitigation. Currently, although a few large-scale hydrological and water resources models (e.g., PCR-GLOBWB) consider human water use and management practices (e.g., irrigation, reservoir operation, groundwater pumping), none of them includes the dynamic feedback between local human behaviours/decisions and the natural hydrological system. It is, therefore, vital to integrate social and behavioural dimensions into current hydrological modeling frameworks. This study applies the agent-based modeling (ABM) approach and couples it with a large-scale hydrological model (the Community Water Model, CWatM) in order to achieve a balanced representation of social, environmental and economic factors and a more realistic representation of the bi-directional interactions and feedbacks in coupled human and natural systems. We focus on drought management in California and consider two types of agents, (groups of) farmers and state management authorities, whose objectives are assumed to be maximizing net crop profit and maintaining sufficient water supply, respectively. Farmers' behaviours are linked with local agricultural practices such as cropping patterns and deficit irrigation. More precisely, farmers' decisions are incorporated into CWatM across different time scales, from the daily irrigation amount, through seasonal/annual decisions on crop types and irrigated area, to long-term investment in irrigation infrastructure. This simulation-based optimization framework is further applied by performing different sets of scenarios to investigate and evaluate the effectiveness

  5. A polyacrylamide microbead-integrated chip for the large-scale manufacture of ready-to-use esiRNA.

    Science.gov (United States)

    Huang, Huang; Chang, Qing; Sun, Changhong; Yin, Shenyi; Li, Juan; Xi, Jianzhong Jeff

    2011-03-21

    Endoribonuclease-prepared siRNAs (esiRNAs) have the advantages of cost-effectiveness and lower off-target effects compared with chemically synthesized siRNAs. However, the current manufacture of esiRNA is a complex process, requiring expensive instruments and demanding skills to accomplish the transfer, purification, quantification and normalization of liquid samples. These requirements significantly hamper the application of esiRNAs on a large scale. In this study, we present a polymer microbead-integrated chip capable of large-scale manufacture of esiRNA in a convenient and robust manner. This chip is able to perform the amplification, transcription and enzymatic digestion of targets on a polymer scaffold, thus simplifying the transfer and purification steps. Notably, this chip can readily tailor and normalize the amount of esiRNA product by controlling the number of DNA probes and the number of amplification cycles. Thus the esiRNA, also referred to as gel-esiRNA, can be immediately applied to loss-of-function studies without any further treatment. The silencing specificity and efficiency of gel-esiRNAs were assessed at the transcriptional, translational and cell-functional levels. All data from real-time PCR, Western blot assays and FACS clearly supported that the gel-esiRNA produced gene silencing as specific and effective as that generated by the conventional approach. We believe that this approach provides a more robust and cost-effective way to manufacture esiRNAs, promising both more intensive and more extensive applications of these heterogeneous RNA strands.

  6. The use of public participation and economic appraisal for public involvement in large-scale hydropower projects: Case study of the Nam Theun 2 Hydropower Project

    International Nuclear Information System (INIS)

    Mirumachi, Naho; Torriti, Jacopo

    2012-01-01

    Gaining public acceptance is one of the main issues with large-scale low-carbon projects such as hydropower development. The World Commission on Dams has recommended that, to gain public acceptance, public involvement is necessary in the decision-making process. Given that international financial institutions are financially significant actors in the planning and implementation of large-scale hydropower projects in developing-country contexts, the paper examines the ways in which public involvement may be influenced by them. Using the case study of the Nam Theun 2 Hydropower Project in Laos, the paper analyses how public involvement facilitated by the Asian Development Bank had a bearing on procedural and distributional justice. The paper analyses the extent of public participation and the assessment of the full social and environmental costs of the project in the Cost-Benefit Analysis conducted during the project appraisal stage. It is argued that while efforts were made to involve the public, several factors influenced procedural and distributional justice: the late contribution of the Asian Development Bank in the project appraisal stage, and the treatment of non-market values and of the discount rate used to calculate the full social and environmental costs. - Highlights: ► Public acceptance of large-scale hydropower projects is examined. ► Both procedural and distributional justice are important for public acceptance. ► International financial institutions can influence the level of public involvement. ► Public involvement benefits consideration of non-market values and discount rates.

  7. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ('the Task') and to assess whether involvement in the Task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large-scale solar purchasing to the development of the UK active solar heating market, and to evaluate the level of interest in large-scale solar purchasing amongst potential large-scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large-scale active solar heating purchasing activity within the UK. (author)

  8. [Patient satisfaction of women involved in the pilot program of large scale mammography in the Ariana area of Tunisia].

    Science.gov (United States)

    Zeghal, D; Mahjoub, S; Zakraoui, M A; Ben Aissa, R; Zaanouni, E; Lazaar, I; Mbarek, F; Ouechtati, A; Zouari, F; Boussen, H; Gueddana, N

    2009-07-01

    To evaluate the degree of satisfaction of women included in the large-scale mammography program for breast cancer screening in the Ariana governorate of Tunisia. Among the women screened by mammography, we contacted 112 patients who had a positive screening result requiring histological confirmation. We administered a questionnaire covering the invitation, the clinical examination, the announcement of results and the therapeutic management. The average age of patients was 49 years; 64% had a primary education level. Eighty women (71.4%) were satisfied with the screening process and the method of announcement. The main cause of dissatisfaction for patients with a cancer diagnosis was delay in, and difficult access to, adjuvant treatments. Among patients who had a histological diagnosis, 47.3% had malignant disease (53 cases) versus 37.5% with benign disease (42 cases). All patients who had a reassuring pathological result were satisfied at the end of the screening program. The psychosocial impact of screening must be considered in the development of new programs. The waiting time and the announcement of results are essential factors by which to judge the success of the project, because the quality of follow-up and adherence to screening depend on patient satisfaction.

  9. Economics of intermittent renewable energy sources: four essays on large-scale integration into European power systems

    International Nuclear Information System (INIS)

    Henriot, Arthur

    2014-01-01

    This thesis centres on issues of economic efficiency arising from the large-scale development of intermittent renewable energy sources (RES) in Europe. The flexible resources that are necessary to cope with their specificities (variability, low predictability, site-specificity) are already known, but adequate signals are required to foster efficient operation of, and investment in, these resources. A first question is to what extent intermittent RES can remain out of the market at times when they are the main driver of investment and operation in power systems. A second question is whether the current market design is adapted to their specificities. These two questions are tackled in four distinct contributions. The first chapter is a critical literature review. This analysis introduces and confronts two (often implicit) paradigms for RES integration. It then identifies and discusses a set of evolutions required to develop a market design adapted to the large-scale development of RES, such as new definitions of the products exchanged and reorganisation of the sequence of electricity markets. In the second chapter, an analytical model is used to assess the potential of intra-day markets as a flexibility provider to intermittent RES with low production predictability. This study highlights and demonstrates how the potential of intra-day markets is heavily dependent on the evolution of forecast errors. The third chapter focuses on the benefits of curtailing production from intermittent RES as a tool to smooth out their variability and reduce overall generation costs. Another analytical model is employed to anatomise the relationship between these benefits and a set of pivotal parameters. Special attention is also paid to the allocation of these benefits between the different stakeholders. In the fourth chapter, a numerical simulation is used to evaluate the ability of the European transmission system operators to tackle the investment wave required in order to

  10. Team-Based Redesign as a Large-Scale Change: Applying Theory to the Implementation of Integrated Product Teams

    National Research Council Canada - National Science Library

    Hocevar, Susan P; Owen, Walter E

    1998-01-01

    This article draws from existing theory and research related to both large-scale change and team-based organization design to identify critical issues that must be explicitly managed to achieve...

  11. Three-dimensional packaging of very large scale integrated optics (VLSIO) for high-complexity optical systems

    Science.gov (United States)

    West, Lawrence C.; Roberts, Charles W.; Piscani, Emil C.; Dubey, Madan; Jones, Kenneth A.; McLane, George F.

    1996-03-01

    Optics has the fundamental capability of dramatically improving computer performance via the reduction of capacitance for intrinsic high bandwidth communications and low power usage. Yet optical devices have not displaced silicon VLSI in any measure to date. The reason is clear. When placed into systems, the optical devices have not had significantly greater performance in equally complex information processing circuits and similarly low manufacturing cost. An approach demonstrated here uses the same system integration techniques that have been successful for silicon electronics, only applied to optics. Essential for creation of very large scale integrated optics (VLSIO), with over 50,000 high speed logic gates per square centimeter, is a new class of ultra high confinement (UHC) waveguides. These waveguides are created with high index difference (as high as 4.0 to 1.0) between guide and cladding. The waveguides have been demonstrated with infrared cross sections less than 5% of a square free space wavelength. These waveguides can be manufactured today only in the mid-infrared, but the concepts should scale to the near-infrared as lithography improves. Waveguide corners have been designed and demonstrated with a bend radius of less than one free space wavelength. Resonators have been designed which have over 100 times smaller volume than VCSELs, yet efficiently inter-connected laterally in high densities. A connector to the UHC waveguides has been developed and demonstrated using diffractive optical element arrays on the back side of the substrate. The coupler arrays can allow up to 10,000 Gaussian beam connections per square centimeter. This connectivity also has advantages for low cost three dimensional packaging for reduced cost and thermal dissipation. Experimental results on the above concepts and components are presented.

  12. Life cycle energy and greenhouse gas analysis of a large-scale vertically integrated organic dairy in the United States.

    Science.gov (United States)

    Heller, Martin C; Keoleian, Gregory A

    2011-03-01

    In order to manage strategies to curb climate change, systemic benchmarking at a variety of production scales and methods is needed. This study is the first life cycle assessment (LCA) of a large-scale, vertically integrated organic dairy in the United States. Data collected at Aurora Organic Dairy farms and processing facilities were used to build an LCA model for benchmarking the greenhouse gas (GHG) emissions and energy consumption across the entire milk production system, from organic feed production to post-consumer waste disposal. Energy consumption and greenhouse gas emissions for the entire system (averaged over two years of analysis) were 18.3 MJ per liter of packaged fluid milk and 2.3 kg CO2-equiv per liter of packaged fluid milk, respectively. Methane emissions from enteric fermentation and manure management account for 27% of total system GHG emissions. Transportation represents 29% of the total system energy use and 15% of the total GHG emissions. Utilization of renewable energy at the farms, processing plant, and major transport legs could lead to a 16% reduction in system energy use and 6.4% less GHG emissions. Sensitivity and uncertainty analysis reveal that alternative meat coproduct allocation methods can lead to a 2.2% and 7.5% increase in overall system energy and GHG, respectively. The feed inventory data source can influence system energy use by -1% to +10% and GHG emissions by -4.6% to +9.2%, and uncertainties in diffuse emission factors contribute -13% to +25% to GHG emissions.
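
    The sensitivity to meat coproduct allocation comes down to simple arithmetic: whatever share of farm-stage burdens is assigned to culled cows and calves leaving the system is removed from the per-liter milk result. A hedged sketch with hypothetical stage values, not the study's inventory:

```python
# Illustration of coproduct allocation sensitivity in a milk LCA.
# Farm-stage burdens are split between milk and meat coproducts by an
# allocation factor; processing and transport burdens stay with milk.
# Stage totals below are hypothetical placeholders.
def milk_ghg_per_liter(farm_ghg, post_farm_ghg, liters, milk_alloc):
    """kg CO2-equiv per liter of packaged milk for a given allocation."""
    return (farm_ghg * milk_alloc + post_farm_ghg) / liters

farm, post_farm, liters = 2.0e6, 0.6e6, 1.0e6  # kg CO2e, kg CO2e, liters

for alloc in (1.00, 0.95, 0.85):  # no allocation vs. mass/economic variants
    print(f"milk allocation {alloc:.0%}: "
          f"{milk_ghg_per_liter(farm, post_farm, liters, alloc):.2f} kg CO2e/L")
```

    Running this shows the per-liter footprint moving by several percent between allocation choices, which is the same order of effect as the 7.5% GHG sensitivity reported above.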

  13. Techno-economic Modeling of the Integration of 20% Wind and Large-scale Energy Storage in ERCOT by 2030

    Energy Technology Data Exchange (ETDEWEB)

    Baldick, Ross; Webber, Michael; King, Carey; Garrison, Jared; Cohen, Stuart; Lee, Duehee

    2012-12-21

    This study's objective is to examine interrelated technical and economic avenues for the Electric Reliability Council of Texas (ERCOT) grid to incorporate up to and over 20% wind generation by 2030. Our specific interests are to look at the factors that will affect the implementation of both high level of wind power penetration (> 20% generation) and installation of large scale storage.

  14. TradeWind. Integrating wind. Developing Europe's power market for the large-scale integration of wind power. Final report

    Energy Technology Data Exchange (ETDEWEB)

    2009-02-15

    Based on a single European grid and power market system, the TradeWind project explores to what extent large-scale wind power integration challenges could be addressed by reinforcing interconnections between Member States in Europe. Additionally, the project looks at the conditions required for a sound power market design that ensures a cost-effective integration of wind power at EU level. In this way, the study addresses two issues of key importance for the future integration of renewable energy, namely the weak interconnectivity levels between control zones and the inflexibility and fragmented nature of the European power market. Work on critical transmission paths and interconnectors is slow for a variety of reasons including planning and administrative barriers, lack of public acceptance, insufficient economic incentives for TSOs, and the lack of a joint European approach by the key stakeholders. (au)

  15. An integrated, indicator framework for assessing large-scale variations and change in seasonal timing and phenology (Invited)

    Science.gov (United States)

    Betancourt, J. L.; Weltzin, J. F.

    2013-12-01

    As part of an effort to develop an Indicator System for the National Climate Assessment (NCA), the Seasonality and Phenology Indicators Technical Team (SPITT) proposed an integrated, continental-scale framework for understanding and tracking seasonal timing in physical and biological systems. The framework shares several metrics with the EPA's National Climate Change Indicators. The SPITT framework includes a comprehensive suite of national indicators to track conditions, anticipate vulnerabilities, and facilitate intervention or adaptation to the extent possible. Observed, modeled, and forecasted seasonal timing metrics can inform a wide spectrum of decisions on federal, state, and private lands in the U.S., and will be pivotal for international mitigation and adaptation efforts. Humans use calendars both to understand the natural world and to plan their lives. Although the seasons are familiar concepts, we lack a comprehensive understanding of how variability arises in the timing of seasonal transitions in the atmosphere, and how variability and change translate and propagate through hydrological, ecological and human systems. For example, the contributions of greenhouse warming and natural variability to secular trends in seasonal timing are difficult to disentangle, including earlier spring transitions from winter (strong westerlies) to summer (weak easterlies) patterns of atmospheric circulation; shifts in the annual phasing of daily temperature means and extremes; advanced timing of snow and ice melt and soil thaw at higher latitudes and elevations; and earlier start and longer duration of the growing and fire seasons. The SPITT framework aims to relate spatiotemporal variability in surface climate to (1) large-scale modes of natural climate variability and greenhouse gas-driven climatic change, and (2) spatiotemporal variability in hydrological, ecological and human responses and impacts. The hierarchical framework relies on ground and satellite observations

  16. 8th international workshop on large-scale integration of wind power into power systems as well as on transmission networks for offshore wind farms. Proceedings

    International Nuclear Information System (INIS)

    Betancourt, Uta; Ackermann, Thomas

    2009-01-01

    Within the 8th International Workshop on Large-Scale Integration of Wind Power into Power Systems as well as on Transmission Networks for Offshore Wind Farms at 14th to 15th October, 2009 in Bremen (Federal Republic of Germany), lectures and posters were presented to the following sessions: (1) Keynote session and panel; (2) Grid integration studies and experience: Europe; (3) Connection of offshore wind farms; (4) Wind forecast; (5) High voltage direct current (HVDC); (6) German grid code issues; (7) Offshore grid connection; (8) Grid integration studies and experience: North America; (9) SUPWIND - Decision support tools for large scale integration of wind; (10) Windgrid - Wind on the grid: An integrated approach; (11) IEA Task 25; (12) Grid code issues; (13) Market Issues; (14) Offshore Grid; (15) Modelling; (16) Wind power and storage; (17) Power system balancing; (18) Wind turbine performance; (19) Modelling and offshore transformer.

  17. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study in 2005 for its members to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the electrolysis modules currently available was established. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis is evaluated. (authors)

  18. Large-scale integration of renewable and distributed generation of electricity in Spain: Current situation and future needs

    International Nuclear Information System (INIS)

    Cossent, Rafael; Gómez, Tomás; Olmos, Luis

    2011-01-01

    Similar to other European countries, mechanisms for the promotion of electricity generation from renewable energy sources (RESs) and combined heat and power (CHP) production have caused a significant growth in distributed generation (DG) in Spain. Low DG/RES penetration levels do not have a major impact on electricity systems. However, several problems arise as DG shares increase. Smarter distribution grids are deemed necessary to facilitate DG/RES integration. This involves modifying the way distribution networks are currently planned and operated. Furthermore, DG and demand should also adopt a more active role. This paper reviews the current situation of DG/RES in Spain including penetration rates, support payments for DG/RES, level of market integration, economic regulation of Distribution System Operators (DSOs), smart metering implementation, grid operation and planning, and incentives for DSO innovation. This paper identifies several improvements that could be made to the treatment of DG/RES. Key aspects of an efficient DG/RES integration are identified and several regulatory changes specific to the Spanish situation are recommended. - Highlights: ► Substantial DG/RES penetration levels are foreseen for the coming years in Spain. ► Integrating such amount of DG/RES in electricity markets and networks is challenging. ► We review key regulatory aspects that may affect DG/RES integration in Spain. ► Several recommendations aimed at easing DG/RES integration in Spain are provided. ► Market integration and the transition towards smarter grids are deemed key issues.

  19. Comparison of zero-sequence injection methods in cascaded H-bridge multilevel converters for large-scale photovoltaic integration

    DEFF Research Database (Denmark)

    Yu, Yifan; Konstantinou, Georgios; Townsend, Christopher David

    2017-01-01

    Photovoltaic (PV) power generation levels in the three phases of a multilevel cascaded H-bridge (CHB) converter can be significantly unbalanced, owing to different irradiance levels and ambient temperatures over a large-scale solar PV power plant. Injection of a zero-sequence voltage is required...... to maintain three-phase balanced grid currents with unbalanced power generation. This study theoretically compares power balance capabilities of various zero-sequence injection methods based on two metrics which can be easily generalised for all CHB applications to PV systems. Experimental results based...
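
    The sketch below illustrates the underlying idea of fundamental-frequency zero-sequence injection: given unbalanced per-phase PV powers and balanced grid currents, a common zero-sequence voltage phasor is chosen so that each phase leg delivers its own generation. The current and power figures are assumptions for illustration; the paper's two metrics and its experimental setup are not reproduced here.

      import numpy as np

      # Balanced grid currents (phasors); magnitude is an assumed value.
      I = 10.0
      i_ph = I * np.exp(1j * np.deg2rad([0.0, -120.0, 120.0]))

      P_gen = np.array([9000.0, 7000.0, 5000.0])  # assumed unbalanced PV power (W)
      dP = P_gen - P_gen.mean()   # power shift each phase leg must absorb via V0

      # A common zero-sequence phasor V0 changes phase-k power by Re{V0*conj(i_k)}.
      # Stack real/imag parts and solve the (consistent) system in least squares.
      A = np.column_stack([i_ph.conj().real, (1j * i_ph.conj()).real])
      x, *_ = np.linalg.lstsq(A, dP, rcond=None)
      v0 = x[0] + 1j * x[1]

      print("zero-sequence voltage: %.1f V at %.1f deg"
            % (abs(v0), np.degrees(np.angle(v0))))
      # Sanity check: realised per-phase power shifts match the targets.
      print("realised dP:", np.real(v0 * i_ph.conj()).round(1))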

  20. Large-scale brain networks in affective and social neuroscience: Towards an integrative functional architecture of the brain

    Science.gov (United States)

    Barrett, Lisa Feldman; Satpute, Ajay

    2013-01-01

    Understanding how a human brain creates a human mind ultimately depends on mapping psychological categories and concepts to physical measurements of neural response. Although it has long been assumed that emotional, social, and cognitive phenomena are realized in the operations of separate brain regions or brain networks, we demonstrate that it is possible to understand the body of neuroimaging evidence using a framework that relies on domain general, distributed structure-function mappings. We review current research in affective and social neuroscience and argue that the emerging science of large-scale intrinsic brain networks provides a coherent framework for a domain-general functional architecture of the human brain. PMID:23352202

  1. Large-Scale Nonlinear Lumped and Integrated Field Simulations of Top-Orthogonal-to-Bottom-Electrode CMUT Architectures.

    Science.gov (United States)

    Ceroici, Chris; Zemp, Roger J

    2017-07-01

    Capacitive micromachined ultrasonic transducers (CMUTs) promise many advantages over traditional piezoelectric transducers such as the potential to construct large, cost-effective 2-D arrays. To avoid wiring congestion issues associated with fully wired arrays, top-orthogonal-to-bottom electrode (TOBE) CMUT array architectures have proven to be a more practical alternative, using only 2N wires for an N × N array. Optimally designing a TOBE CMUT array is a significant challenge due to the range of parameters from the device level up to the operating conditions of the entire array. Since testing many design variations can be prohibitively expensive, a simulation approach accounting for both the small and large-scale array characteristics of TOBE arrays is essential. In this paper, we demonstrate large-scale TOBE CMUT array simulations using a nonlinear CMUT lumped-circuit model. We investigate the performance of the array with different CMUT design parameters and array operating conditions. These simulated results are then compared with measurements of TOBE arrays fabricated using a sacrificial release process.
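
    A minimal example of the kind of nonlinear lumped model underlying such simulations is sketched below: a single CMUT cell treated as a mass-spring-damper membrane driven by a parallel-plate electrostatic force. All device parameters are assumed values chosen for a ~10 MHz cell, not those of the fabricated TOBE arrays.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Single-cell nonlinear CMUT lumped model; all parameters are assumptions.
      eps0 = 8.854e-12
      area = (20e-6) ** 2          # effective electrode area (m^2), assumed
      g0 = 100e-9                  # vacuum gap (m), assumed
      m, k, b = 1e-11, 4e4, 2e-4   # mass (kg), stiffness (N/m), damping (N*s/m)

      def v_drive(t):
          # Assumed DC bias plus a small AC drive near resonance (~10 MHz)
          return 40.0 + 5.0 * np.sin(2 * np.pi * 5e6 * t)

      def rhs(t, y):
          x, vel = y                  # displacement toward substrate, velocity
          gap = max(g0 - x, 1e-9)     # crude guard against full collapse
          f_el = eps0 * area * v_drive(t) ** 2 / (2.0 * gap ** 2)
          return [vel, (f_el - k * x - b * vel) / m]

      sol = solve_ivp(rhs, (0.0, 2e-6), [0.0, 0.0], max_step=1e-9)
      print("peak membrane displacement: %.2f nm" % (1e9 * sol.y[0].max()))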

  2. Data Portal for the Library of Integrated Network-based Cellular Signatures (LINCS) program: integrated access to diverse large-scale cellular perturbation response data

    Science.gov (United States)

    Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay

    2018-01-01

    Abstract The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, LINCS Data Registry (LDR). LINCS data served on the LDP contains extensive metadata and curated annotations. We highlight the features of the LDP user interface that is designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462

  3. The importance of large-scale integral experiments in improving nuclear safety. Or why I approved the FP-2 experiment

    International Nuclear Information System (INIS)

    Vaughan, J.W. Jr.

    1991-01-01

    The thrust of this discussion is (1) to review the events leading up to the significant final fission product release test, FP-2, (2) to share the inside story of why I approved the special conditions essential to the success of the experiment, (3) to assess the relative importance of large scale experiments in contributing to nuclear safety, and (4) to share some views as to the value of these efforts in the current nuclear energy debate. The energy debate and the decisions as to whether or not to use nuclear power plants or fossil plants for electrical power generation will be settled more by public and economic debate than by the technical community. However, there is no chance of favorable consideration of nuclear power without a sound technical base and a continuing demonstration that we can safely and reliably operate nuclear power plants

  4. Integral imaging-based large-scale full-color 3-D display of holographic data by using a commercial LCD panel.

    Science.gov (United States)

    Dong, Xiao-Bin; Ai, Ling-Yu; Kim, Eun-Soo

    2016-02-22

    We propose a new type of integral imaging-based large-scale full-color three-dimensional (3-D) display of holographic data based on direct ray-optical conversion of holographic data into elemental images (EIs). In the proposed system, a 3-D scene is modeled as a collection of depth-sliced object images (DOIs), and three-color hologram patterns for that scene are generated by interfering each color DOI with a reference beam, and summing them all based on Fresnel convolution integrals. From these hologram patterns, full-color DOIs are reconstructed, and converted into EIs using a ray mapping-based direct pickup process. These EIs are then optically reconstructed to be a full-color 3-D scene with perspectives on the depth-priority integral imaging (DPII)-based 3-D display system employing a large-scale LCD panel. Experiments with a test video confirm the feasibility of the proposed system in the practical application fields of large-scale holographic 3-D displays.
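
    The hologram-generation step described above can be sketched in a few lines: each depth-sliced object image is Fresnel-propagated to the hologram plane, interfered with a reference beam, and the per-slice patterns are summed. Grid size, wavelength, depths and the toy scenes below are assumptions, not the paper's experimental parameters.

      import numpy as np

      N, pitch, wl = 512, 8e-6, 532e-9          # samples, pixel pitch, wavelength
      fx = np.fft.fftfreq(N, d=pitch)
      FX, FY = np.meshgrid(fx, fx)

      def fresnel(field, z):
          # Fresnel propagation by the FFT transfer-function method
          H = np.exp(-1j * np.pi * wl * z * (FX ** 2 + FY ** 2))
          return np.fft.ifft2(np.fft.fft2(field) * H)

      # Two toy depth slices: a square at 0.10 m and a bar at 0.12 m
      doi1 = np.zeros((N, N)); doi1[200:312, 200:312] = 1.0
      doi2 = np.zeros((N, N)); doi2[100:140, 50:460] = 1.0
      slices = [(doi1, 0.10), (doi2, 0.12)]

      ref = np.exp(1j * 2 * np.pi * np.arange(N)[None, :] * pitch
                   * np.sin(np.deg2rad(1.0)) / wl)   # tilted plane reference beam

      hologram = np.zeros((N, N))
      for doi, z in slices:
          obj = fresnel(doi.astype(complex), z)
          hologram += np.abs(obj + ref) ** 2         # interference pattern

      # Reconstruct the first slice by back-propagation from the hologram
      rec = fresnel(hologram * ref.conj(), -0.10)
      print("reconstruction energy:", float(np.sum(np.abs(rec) ** 2)))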

  5. Assessing Impact of Large-Scale Distributed Residential HVAC Control Optimization on Electricity Grid Operation and Renewable Energy Integration

    Science.gov (United States)

    Corbin, Charles D.

    Demand management is an important component of the emerging Smart Grid, and a potential solution to the supply-demand imbalance occurring increasingly as intermittent renewable electricity is added to the generation mix. Model predictive control (MPC) has shown great promise for controlling HVAC demand in commercial buildings, making it an ideal solution to this problem. MPC is believed to hold similar promise for residential applications, yet very few examples exist in the literature despite a growing interest in residential demand management. This work explores the potential for residential buildings to shape electric demand at the distribution feeder level in order to reduce peak demand, reduce system ramping, and increase load factor using detailed sub-hourly simulations of thousands of buildings coupled to distribution power flow software. More generally, this work develops a methodology for the directed optimization of residential HVAC operation using a distributed but directed MPC scheme that can be applied to today's programmable thermostat technologies to address the increasing variability in electric supply and demand. Case studies incorporating varying levels of renewable energy generation demonstrate the approach and highlight important considerations for large-scale residential model predictive control.
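
    A minimal sketch of the idea, under strong simplifying assumptions (a first-order thermal model of one house, a known price profile, electric heating only), is shown below: model predictive control then reduces to a small linear program that schedules heating power while holding indoor temperature inside a comfort band. All coefficients are assumed values.

      import numpy as np
      from scipy.optimize import linprog

      H = 24                           # horizon (hours)
      a, b = 0.9, 2.0                  # retention per step, degC per kW; assumed
      T0 = 20.0
      Tout = np.full(H, 0.0)           # outdoor temperature (degC), assumed
      price = 0.2 + 0.1 * np.sin(2 * np.pi * np.arange(H) / 24)  # EUR/kWh

      # Indoor temperature is affine in the heating powers P[0..H-1]:
      #   T[k+1] = a*T[k] + (1 - a)*Tout[k] + b*P[k]  ->  T = base + M @ P
      base = np.empty(H)
      M = np.zeros((H, H))
      prev = T0
      for k in range(H):
          base[k] = a * prev + (1 - a) * Tout[k]     # zero-heating trajectory
          M[k, :k + 1] = b * a ** (k - np.arange(k + 1))
          prev = base[k]

      Tmin, Tmax, Pmax = 19.0, 23.0, 3.0
      A_ub = np.vstack([M, -M])                      # comfort band constraints
      b_ub = np.concatenate([Tmax - base, base - Tmin])
      res = linprog(price, A_ub=A_ub, b_ub=b_ub, bounds=[(0, Pmax)] * H)
      print("feasible:", res.success, " heating cost: %.2f EUR" % res.fun)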

  6. NeuroCa: integrated framework for systematic analysis of spatiotemporal neuronal activity patterns from large-scale optical recording data.

    Science.gov (United States)

    Jang, Min Jee; Nam, Yoonkey

    2015-07-01

    Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of [Formula: see text] neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements.
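
    The detection step can be illustrated on a single synthetic trace: compute dF/F0, smooth, and flag frames where the frame-to-frame rise exceeds a noise-scaled threshold. This toy sketch stands in for NeuroCa's more elaborate algorithms; all parameters and the synthetic events are assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      fs = 10.0
      t = np.arange(0.0, 60.0, 1.0 / fs)
      trace = 100.0 + rng.normal(0.0, 1.0, t.size)     # baseline fluorescence
      for onset in (5.0, 20.0, 42.0):                  # synthetic calcium events
          idx = t >= onset
          trace[idx] += 20.0 * np.exp(-(t[idx] - onset) / 1.5)

      f0 = np.median(trace)
      dff = (trace - f0) / f0                          # dF/F0
      smooth = np.convolve(dff, np.ones(5) / 5, mode="same")

      rise = np.diff(smooth, prepend=smooth[0])        # frame-to-frame increase
      sigma = np.median(np.abs(rise)) / 0.6745         # robust noise estimate
      above = rise > 4.0 * sigma
      onsets = np.flatnonzero(above & ~np.r_[False, above[:-1]])  # run starts
      print("detected event times (s):", np.round(t[onsets], 1))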

  7. Shortening feedback time in continuous integration environment in large-scale embedded software development with test selection

    OpenAIRE

    Koivuniemi, J. (Jarmo)

    2017-01-01

    Abstract  Continuous integration is one of the Extreme Programming practices and is used in agile software development to provide rapid feedback and to have a working system at all times. In continuous integration, a developer commits code to the project's mainline at least once a day, which triggers an automated build and tests. Large projects can struggle with continuous integration because with a growing code base the number of test...

  8. Role of National Support Policy in the large-scale integration of DER into the European electricity market

    DEFF Research Database (Denmark)

    ten Donkelaar, Michael; Klinge Jacobsen, Henrik

    2008-01-01

    This report concerns a study of the DER support schemes in the different EU Member States, their effectiveness and if necessary how these might be moulded to become more cost-effective in the future to integrate much larger shares of DER in the European electricity supply system. The report is part...... of a set of reports on DER integration issues and together they present a full and complete report on key issues of policy support, required changes in regulation and other issues that hamper more DER integration in supply....

  9. Fully implicit solution of large-scale non-equilibrium radiation diffusion with high order time integration

    International Nuclear Information System (INIS)

    Brown, Peter N.; Shumaker, Dana E.; Woodward, Carol S.

    2005-01-01

    We present a solution method for fully implicit radiation diffusion problems discretized on meshes having millions of spatial zones. This solution method makes use of high order in time integration techniques, inexact Newton-Krylov nonlinear solvers, and multigrid preconditioners. We explore the advantages and disadvantages of high order time integration methods for the fully implicit formulation on both two- and three-dimensional problems with tabulated opacities and highly nonlinear fusion source terms
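
    A toy stand-in for one such fully implicit step is sketched below: a backward-Euler update of a 1-D nonlinear diffusion equation solved with SciPy's inexact Newton-Krylov solver. The diffusion law, grid and tolerances are assumptions; tabulated opacities, fusion source terms and multigrid preconditioning are omitted.

      import numpy as np
      from scipy.optimize import newton_krylov

      # One backward-Euler step of u_t = d/dx( u^3 du/dx ) with zero-flux ends.
      n, dt = 200, 1e-4
      dx = 1.0 / n
      x = np.linspace(0.0, 1.0, n)
      u_old = 1.0 + np.exp(-200.0 * (x - 0.5) ** 2)   # temperature-like field

      def residual(u):
          d_face = 0.5 * (u[1:] ** 3 + u[:-1] ** 3)   # face-averaged D(u) = u^3
          flux = d_face * np.diff(u) / dx
          div = np.zeros_like(u)
          div[1:-1] = np.diff(flux) / dx
          div[0] = flux[0] / dx                       # homogeneous Neumann ends
          div[-1] = -flux[-1] / dx
          return u - u_old - dt * div

      u_new = newton_krylov(residual, u_old, method="lgmres", f_tol=1e-8)
      print("max change per step: %.3e" % np.abs(u_new - u_old).max())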

  10. Evaluating the implementation of a national disclosure policy for large-scale adverse events in an integrated health care system: identification of gaps and successes

    Directory of Open Access Journals (Sweden)

    Elizabeth M. Maguire

    2016-11-01

    Abstract Background: Many healthcare organizations have developed disclosure policies for large-scale adverse events, including the Veterans Health Administration (VA). This study evaluated VA's national large-scale disclosure policy and identified gaps and successes in its implementation. Methods: Semi-structured qualitative interviews were conducted with leaders, hospital employees, and patients at nine sites to elicit their perceptions of recent large-scale adverse event notifications and the national disclosure policy. Data were coded using the constructs of the Consolidated Framework for Implementation Research (CFIR). Results: We conducted 97 interviews. Insights included how to handle the communication of large-scale disclosures through multiple levels of a large healthcare organization and manage ongoing communications about the event with employees. Of the 5 CFIR constructs and 26 sub-constructs assessed, seven were prominent in interviews. Leaders and employees specifically mentioned key problem areas involving (1) networks and communications during disclosure, (2) organizational culture, (3) engagement of external change agents during disclosure, and (4) a need for reflecting on and evaluating the policy implementation and the disclosure itself. Patients shared (5) preferences for personal outreach by phone in place of the current use of certified letters. All interviewees discussed (6) issues with execution and (7) costs of the disclosure. Conclusions: CFIR analysis reveals key problem areas that need to be addressed during disclosure, including timely communication patterns throughout the organization, establishing a supportive culture prior to implementation, using patient-approved, effective communication strategies during disclosures, providing follow-up support for employees and patients, and sharing lessons learned.

  11. Real-time impact of power balancing on power system operation with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2017-01-01

    A highly wind power integrated power system requires continuous active power regulation to tackle the power imbalances resulting from wind power forecast errors. The active power balance is maintained in real-time with the automatic generation control and also from the control room, where...... power system model. The power system model takes the hour-ahead regulating power plan from the power balancing model and the generation and power exchange capacities for the year 2020 into account. The real-time impact of power balancing in a highly wind power integrated power system is assessed...

  12. LARGE SCALE GLAZED

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    World famous architects today challenge the exposure of concrete in their architecture. It is my hope to be able to complement these. I try to develop new aesthetic potentials for concrete and ceramics, in large scales that have not been seen before in the ceramic area. It is expected to resul...

  13. SuperGrid or SmartGrid: Competing strategies for large-scale integration of intermittent renewables?

    DEFF Research Database (Denmark)

    Blarke, Morten; M. Jenkins, Bryan

    2013-01-01

    strategies to evolve in parallel, but in different territories, or with strategic integration, so that one strategy does not undermine the feasibility of the other. A strategic zoning strategy is introduced from which attentive societies as well as the global community stand to benefit. The analysis includes...

  14. Integration of Technology, Curriculum, and Professional Development for Advancing Middle School Mathematics: Three Large-Scale Studies

    Science.gov (United States)

    Roschelle, Jeremy; Shechtman, Nicole; Tatar, Deborah; Hegedus, Stephen; Hopkins, Bill; Empson, Susan; Knudsen, Jennifer; Gallagher, Lawrence P.

    2010-01-01

    The authors present three studies (two randomized controlled experiments and one embedded quasi-experiment) designed to evaluate the impact of replacement units targeting student learning of advanced middle school mathematics. The studies evaluated the SimCalc approach, which integrates an interactive representational technology, paper curriculum,…

  15. A J integral based method to measure fracture resistance and cohesive laws in materials exhibiting large scale plasticity

    DEFF Research Database (Denmark)

    Sørensen, Bent F.; Goutianos, Stergios

    2014-01-01

    A method is developed to extract the fracture resistance and mode I cohesive law of nonlinear elastic-plastic materials using a Double Cantilever Beam (DCB) sandwich specimen loaded with pure bending moments. The method is based on the J integral, which is valid for materials having a non-linear s...... The approach is used to measure the fracture resistance experimentally and determine the mode I cohesive law, including its shape.
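
    The core relation can be sketched numerically: for a symmetric, homogeneous DCB loaded with pure bending moments M on each arm, the J integral is J = 12 M^2 / (b^2 h^3 E) (neglecting sandwich-layup corrections), and a cohesive law follows by differentiating the fracture resistance with respect to the end-opening. The geometry, modulus and the synthetic M(delta) record below are made-up values for illustration only.

      import numpy as np

      E, b, h = 70e9, 0.03, 0.005      # arm modulus (Pa), width (m), arm height (m); assumed

      def j_from_moment(M):
          # Path-independent result for a symmetric DCB under pure moments
          return 12.0 * M ** 2 / (b ** 2 * h ** 3 * E)

      # Synthetic test record: end-opening delta* (m) and applied moment (N*m)
      delta = np.linspace(0.0, 1e-3, 50)
      M = 8.0 * np.sqrt(delta / delta[-1]) + 0.5     # made-up rising R-curve data

      J = j_from_moment(M)
      sigma = np.gradient(J, delta)                  # cohesive law: sigma = dJ/d(delta*)
      print("steady-state J_ss = %.1f J/m^2" % J[-1])
      print("peak cohesive stress = %.2f MPa" % (sigma.max() / 1e6))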

  16. SuperGrid or SmartGrid: Competing strategies for large-scale integration of intermittent renewables?

    International Nuclear Information System (INIS)

    Blarke, Morten B.; Jenkins, Bryan M.

    2013-01-01

    This paper defines and compares two strategies for integrating intermittent renewables: SuperGrid and SmartGrid. While conventional energy policy suggests that these strategies may be implemented alongside each other, the paper identifies significant technological and socio-economic conflicts of interest between the two. The article identifies differences between a domestic strategy for the integration of intermittent renewables, vis-à-vis the SmartGrid, and a cross-system strategy, vis-à-vis the SuperGrid. Policy makers and transmission system operators must understand the need for both strategies to evolve in parallel, but in different territories, or with strategic integration, so that one strategy does not undermine the feasibility of the other. A strategic zoning strategy is introduced from which attentive societies as well as the global community stand to benefit. The analysis includes a paradigmatic case study from West Denmark which supports the hypothesis that these strategies are mutually exclusive. The case study shows that increasing cross-system transmission capacity jeopardizes the feasibility of SmartGrid technology investments. A political effort is required for establishing dedicated SmartGrid innovation zones, while also redefining infrastructure to avoid the narrow focus on grids and cables. SmartGrid Investment Trusts could be supported from reallocation of planned transmission grid investments to provide for the equitable development of SmartGrid strategies. - Highlights: • Compares SuperGrid and SmartGrid strategies for integrating intermittent renewables. • Identifies technological and socio-economic conflicts of interest between the two. • Proposes a strategic zoning strategy allowing for both strategies to evolve. • Presents a paradigmatic case study showing that strategies are mutually exclusive. • Proposes dedicated SmartGrid innovation zones and SmartGrid investment trusts

  17. Ontology Design Patterns: Bridging the Gap Between Local Semantic Use Cases and Large-Scale, Long-Term Data Integration

    Science.gov (United States)

    Shepherd, Adam; Arko, Robert; Krisnadhi, Adila; Hitzler, Pascal; Janowicz, Krzysztof; Chandler, Cyndy; Narock, Tom; Cheatham, Michelle; Schildhauer, Mark; Jones, Matt; Raymond, Lisa; Mickle, Audrey; Finin, Tim; Fils, Doug; Carbotte, Suzanne; Lehnert, Kerstin

    2015-04-01

    Integrating datasets for new use cases is one of the common drivers for adopting semantic web technologies. Even though linked data principles enable this type of activity over time, the task of reconciling new ontological commitments for newer use cases can be daunting. This situation was faced by the Biological and Chemical Oceanography Data Management Office (BCO-DMO) as it sought to integrate its existing linked data with other data repositories to address newer scientific use cases as a partner in the GeoLink Project. To achieve a successful integration with other GeoLink partners, BCO-DMO's metadata would need to be described using the new ontologies developed by the GeoLink partners - a situation that could impact semantic inferencing, pre-existing software and external users of BCO-DMO's linked data. This presentation describes the process of how GeoLink is bridging the gap between local, pre-existing ontologies to achieve scientific metadata integration for all its partners through the use of ontology design patterns. GeoLink, an NSF EarthCube Building Block, brings together experts from the geosciences, computer science, and library science in an effort to improve discovery and reuse of data and knowledge. Its participating repositories include content from field expeditions, laboratory analyses, journal publications, conference presentations, theses/reports, and funding awards that span scientific studies from marine geology to marine ecology and biogeochemistry to paleoclimatology. GeoLink's outcomes include a set of reusable ontology design patterns (ODPs) that describe core geoscience concepts, a network of Linked Data published by participating repositories using those ODPs, and tools to facilitate discovery of related content in multiple repositories.

  18. Integrating the NEPA 216 process with large-scale privatization projects under the US Department of Energy

    International Nuclear Information System (INIS)

    Eccleston, C.H.

    1994-05-01

    The US Department of Energy (DOE) is considering the possibility of replacing the existing Hanford Site 200 Area steam system through a privatization effort. Such an action would be subject to the requirements of the National Environmental Policy Act (NEPA) of 1969. Section 216 of the DOE NEPA Implementation Procedures (216 Process) provides a specific mechanism for integrating the DOE procurement process with NEPA compliance requirements

  19. Smart Dual-Mode Heat Pump With HP2Grid Functionality To Support Large-Scale Integration Of Intermittent Renewables

    DEFF Research Database (Denmark)

    Carmo, Carolina; Blarke, Morten Boje

    2013-01-01

    and compare the core technology options that are involved in the design of this concept, including high-pressure heat pumps, advanced thermal storage materials, and small-scale Rankine cycles with natural working fluids. The new concept – called HP2Grid - effectively represents state-of-the-art in terms...

  20. System-Level Modeling and Synthesis Techniques for Flow-Based Microfluidic Very Large Scale Integration Biochips

    DEFF Research Database (Denmark)

    Minhass, Wajid Hassan

    Microfluidic biochips integrate different biochemical analysis functionalities on-chip and offer several advantages over the conventional biochemical laboratories. In this thesis, we focus on the flow-based biochips. The basic building block of such a chip is a valve which can be fabricated at very...... to reduce the macro-assembly around the chip and enhance chip scalability, we propose an approach for the biochip pin count minimization. We also propose a throughput maximization scheme for the cell culture mVLSI biochips, saving time and reducing costs. We have extensively evaluated the proposed...

  1. SOLID-DER. Reaching large-scale integration of Distributed Energy Resources in the enlarged European electricity market

    International Nuclear Information System (INIS)

    Van Oostvoorn, F.; Ten Donkelaar, M.

    2007-05-01

    The integration of DER (distributed energy resources) into the European electricity networks has become a key issue for energy producers, network operators, policy makers and the R and D community. In some countries it has already created a number of challenges for the stability of the electricity supply system, thereby creating new barriers for further expansion of the share of DER in supply. On the other hand, in many Member States there is still a lack of awareness and understanding of the possible benefits and role of DER in the electricity system, while environmental goals and security of supply issues increasingly call for solutions that DER could provide in the future. The project SOLID-DER, a Coordination Action, will assess the barriers to further integration of DER, overcome both the lack of awareness of the benefits of DER solutions and the fragmentation in EU R and D results by consolidating all European DER research activities, and report on its common findings. In particular, awareness of DER solutions and benefits will be raised in the new Member States, thereby addressing their specific issues and barriers and incorporating them into the existing EU DER R and D community. The SOLID-DER Coordination Action will run from November 2005 to October 2008

  2. Rapid implementation of an integrated large-scale HIV counseling and testing, malaria, and diarrhea prevention campaign in rural Kenya.

    Directory of Open Access Journals (Sweden)

    Eric Lugada

    BACKGROUND: Integrated disease prevention in low-resource settings can increase coverage, equity and efficiency in controlling high-burden infectious diseases. A public-private partnership with the Ministry of Health, CDC, Vestergaard Frandsen and CHF International implemented a one-week integrated multi-disease prevention campaign. METHOD: Residents of Lurambi, Western Kenya were eligible for participation. The aim was to offer services to at least 80% of those aged 15-49. 31 temporary sites in strategically dispersed locations offered: HIV counseling and testing, 60 male condoms, an insecticide-treated bednet, a household water filter for women or an individual filter for men, and, for those testing positive, a 3-month supply of cotrimoxazole and referral for follow-up care and treatment. FINDINGS: Over 7 days, 47,311 people attended the campaign, with a 96% uptake of the multi-disease preventive package. Of these, 99.7% were tested for HIV (87% in the target 15-49 age group); 80% had previously never tested. 4% of those tested were positive, 61% were women (5% of women and 3% of men); 6% had median CD4 counts of 541 cells/µL (IQR: 356, 754). 386 certified counselors attended to an average of 17 participants per day, consistent with recommended national figures for mass campaigns. Among women, HIV infection varied by age and was more likely with an ended marriage (e.g. widowed vs. never married, OR 3.91; 95% CI 2.87-5.34) and lack of occupation. In men, quantitatively stronger relationships were found (e.g. widowed vs. never married, OR 7.0; 95% CI 3.5-13.9). Always using condoms with a non-steady partner was more common among HIV-infected women participants who knew their status compared to those who did not (OR 5.4; 95% CI 2.3-12.8). CONCLUSION: Through integrated campaigns it is feasible to efficiently cover large proportions of eligible adults in rural underserved communities with multiple disease preventive services, simultaneously achieving...

  3. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  4. The transport sectors potential contribution to the flexibility in the power sector required by large-scale wind power integration

    DEFF Research Database (Denmark)

    Nørgård, Per Bromand; Lund, H.; Mathiesen, B.V.

    2007-01-01

    In 2006, the Danish Society of Engineers developed a visionary plan for the Danish energy system in 2030. The paper presents and qualifies selected parts of the analyses, illustrating the transport sector's potential to contribute to the flexibility in the power sector necessary for large-scale integration of renewable energy in the power system, specifically wind power. In the plan, 20% of the road transport is based on electricity and 20% on bio-fuels. This, together with other initiatives, allows for up to 55-60% wind power penetration in the power system. A fleet of 0.5 million electric vehicles in Denmark in 2030, connected to the grid 50% of the time, represents an aggregated flexible power capacity of 1-1.5 GW and an energy capacity of 10-150 GWh.
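
    The quoted capacity figures can be checked with simple arithmetic, assuming a plausible per-vehicle charging power (our assumption, not the plan's):

      # Back-of-the-envelope check on the abstract's aggregated EV flexibility.
      # Fleet size, connection share and quoted capacities come from the
      # abstract; the per-vehicle charging power range is our assumption.
      fleet = 0.5e6                  # electric vehicles in the 2030 plan
      connected = 0.5                # share connected to the grid at any time

      for kw in (4.0, 6.0):          # assumed charging power per connected vehicle
          print(f"{kw:.0f} kW per connected vehicle -> "
                f"{fleet * connected * kw / 1e6:.2f} GW aggregated")

      # Implied usable energy per vehicle for the quoted 10-150 GWh capacity
      for gwh in (10.0, 150.0):
          print(f"{gwh:.0f} GWh over the fleet -> "
                f"{gwh * 1e6 / fleet:.0f} kWh per vehicle")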

  5. Wind power integration into the automatic generation control of power systems with large-scale wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Altin, Müfit

    2014-01-01

    Transmission system operators have an increased interest in the active participation of wind power plants (WPP) in the power balance control of power systems with large wind power penetration. The emphasis in this study is on the integration of WPPs into the automatic generation control (AGC......) of the power system. The present paper proposes a coordinated control strategy for the AGC between combined heat and power plants (CHPs) and WPPs to enhance the security and the reliability of a power system operation in the case of a large wind power penetration. The proposed strategy, described...... and exemplified for the future Danish power system, takes the hour-ahead regulating power plan for generation and power exchange with neighbouring power systems into account. The performance of the proposed strategy for coordinated secondary control is assessed and discussed by means of simulations for different...

  6. Wind power integration into the automatic generation control of power systems with large-scale wind power

    Directory of Open Access Journals (Sweden)

    Abdul Basit

    2014-10-01

    Transmission system operators have an increased interest in the active participation of wind power plants (WPP) in the power balance control of power systems with large wind power penetration. The emphasis in this study is on the integration of WPPs into the automatic generation control (AGC) of the power system. The present paper proposes a coordinated control strategy for the AGC between combined heat and power plants (CHPs) and WPPs to enhance the security and the reliability of a power system operation in the case of a large wind power penetration. The proposed strategy, described and exemplified for the future Danish power system, takes the hour-ahead regulating power plan for generation and power exchange with neighbouring power systems into account. The performance of the proposed strategy for coordinated secondary control is assessed and discussed by means of simulations for different possible future scenarios, when wind power production in the power system is high and conventional production from CHPs is at a minimum level. The investigation results of the proposed control strategy have shown that the WPPs can actively help the AGC, and reduce the real-time power imbalance in the power system, by down regulating their production when CHPs are unable to provide the required response.
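
    A toy simulation of the coordination idea is sketched below: ramp-limited CHP units follow the AGC setpoint first, and wind plants down-regulate only to absorb the remaining generation surplus. The gains, ramp limits and imbalance profile are assumptions, not the Danish 2020 power system model.

      import numpy as np

      rng = np.random.default_rng(1)
      steps = 360                                # one hour at 10 s resolution
      imbalance = np.cumsum(rng.normal(0.0, 0.5, steps))  # MW surplus (+) from wind error

      chp = np.zeros(steps)                      # CHP secondary response (MW)
      wpp = np.zeros(steps)                      # WPP down-regulation (MW, <= 0)
      chp_ramp = 0.3                             # assumed CHP ramp limit per step (MW)

      for k in range(1, steps):
          target = -imbalance[k]                 # total correction the AGC asks for
          # CHPs follow the AGC setpoint under their ramp-rate limit
          chp[k] = chp[k - 1] + np.clip(target - chp[k - 1], -chp_ramp, chp_ramp)
          # WPPs absorb any remaining surplus by down-regulating (never up)
          wpp[k] = -max(0.0, imbalance[k] + chp[k])

      print("max |imbalance| without AGC:      %5.1f MW" % np.abs(imbalance).max())
      print("max residual, CHP only:           %5.1f MW" % np.abs(imbalance + chp).max())
      print("max residual, CHP + wind support: %5.1f MW" % np.abs(imbalance + chp + wpp).max())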

  7. Zero boil-off methods for large-scale liquid hydrogen tanks using integrated refrigeration and storage

    Science.gov (United States)

    Notardonato, W. U.; Swanger, A. M.; E Fesmire, J.; Jumper, K. M.; Johnson, W. L.; Tomsik, T. M.

    2017-12-01

    NASA has completed a series of tests at the Kennedy Space Center to demonstrate the capability of using integrated refrigeration and storage (IRAS) to remove energy from a liquid hydrogen (LH2) tank and control the state of the propellant. A primary test objective was the keeping and storing of the liquid in a zero boil-off state, so that the total heat leak entering the tank is removed by a cryogenic refrigerator with an internal heat exchanger. The LH2 is therefore stored and kept with zero losses for an indefinite period of time. The LH2 tank is a horizontal cylindrical geometry with a vacuum-jacketed, multilayer insulation system and a capacity of 125,000 liters. The closed-loop helium refrigeration system was a Linde LR1620 capable of 390W cooling at 20K (without any liquid nitrogen pre-cooling). Three different control methods were used to obtain zero boil-off: temperature control of the helium refrigerant, refrigerator control using the tank pressure sensor, and duty cycling (on/off) of the refrigerator as needed. Summarized are the IRAS design approach, zero boil-off control methods, and results of the series of zero boil-off tests.
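
    The third control method, duty cycling, can be sketched as a simple deadband loop on tank pressure, as below. The heat leak and the pressure-response coefficient are assumptions; only the 390 W refrigeration capacity comes from the abstract.

      # On/off zero boil-off control: run the refrigerator when pressure hits
      # the top of a deadband, stop at the bottom. All values except the 390 W
      # lift are assumptions, not the measured behaviour of the 125,000 L tank.
      heat_leak_w = 250.0              # assumed steady heat leak into the tank
      lift_w = 390.0                   # refrigerator cooling capacity at 20 K
      kpa_per_kj = 1e-4                # assumed pressure rise per kJ of net heat
      p, p_on, p_off = 100.0, 100.2, 99.8   # kPa: state and deadband limits
      dt = 60.0                        # control time step (s)

      cooler_on = False
      on_time = 0
      n = 7 * 24 * 60                  # one week at one-minute resolution
      for _ in range(n):
          net_w = heat_leak_w - (lift_w if cooler_on else 0.0)
          p += kpa_per_kj * net_w * dt / 1000.0     # W*s -> kJ -> kPa
          if p >= p_on:
              cooler_on = True
          elif p <= p_off:
              cooler_on = False
          on_time += cooler_on

      print("duty cycle: %.0f%% (steady state would be %.0f%%)"
            % (100.0 * on_time / n, 100.0 * heat_leak_w / lift_w))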

  8. Robust and rapid algorithms facilitate large-scale whole genome sequencing downstream analysis in an integrative framework.

    Science.gov (United States)

    Li, Miaoxin; Li, Jiang; Li, Mulin Jun; Pan, Zhicheng; Hsu, Jacob Shujui; Liu, Dajiang J; Zhan, Xiaowei; Wang, Junwen; Song, Youqiang; Sham, Pak Chung

    2017-05-19

    Whole genome sequencing (WGS) is a promising strategy to unravel variants or genes responsible for human diseases and traits. However, there is a lack of robust platforms for a comprehensive downstream analysis. In the present study, we first proposed three novel algorithms, sequence gap-filled gene feature annotation, bit-block encoded genotypes and sectional fast access to text lines to address three fundamental problems. The three algorithms then formed the infrastructure of a robust parallel computing framework, KGGSeq, for integrating downstream analysis functions for whole genome sequencing data. KGGSeq has been equipped with a comprehensive set of analysis functions for quality control, filtration, annotation, pathogenic prediction and statistical tests. In the tests with whole genome sequencing data from 1000 Genomes Project, KGGSeq annotated several thousand more reliable non-synonymous variants than other widely used tools (e.g. ANNOVAR and SNPEff). It took only around half an hour on a small server with 10 CPUs to access genotypes of ∼60 million variants of 2504 subjects, while a popular alternative tool required around one day. KGGSeq's bit-block genotype format used 1.5% or less space to flexibly represent phased or unphased genotypes with multiple alleles and achieved a speed of over 1000 times faster to calculate genotypic correlation. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
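
    The idea behind bit-block encoded genotypes can be illustrated with a simple two-bits-per-genotype packing (0, 1 or 2 alternate-allele copies, 3 = missing; four samples per byte). KGGSeq's actual format also covers phased and multi-allelic genotypes, which this sketch does not.

      import numpy as np

      def pack(genotypes):
          # Pack genotype codes 0-3 into two bits each, four per byte
          g = np.asarray(genotypes, dtype=np.uint8)
          pad = (-g.size) % 4
          g = np.concatenate([g, np.full(pad, 3, dtype=np.uint8)])  # pad = missing
          g = g.reshape(-1, 4)
          return g[:, 0] | (g[:, 1] << 2) | (g[:, 2] << 4) | (g[:, 3] << 6)

      def unpack(packed, n):
          p = np.asarray(packed, dtype=np.uint8)
          out = np.empty((p.size, 4), dtype=np.uint8)
          for i in range(4):
              out[:, i] = (p >> (2 * i)) & 0b11
          return out.reshape(-1)[:n]

      g = np.array([0, 1, 2, 0, 3, 2, 1])
      packed = pack(g)
      print("bytes used:", packed.nbytes, "for", g.size, "genotypes")
      assert np.array_equal(unpack(packed, g.size), g)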

  9. CRNET: An efficient sampling approach to infer functional regulatory networks by integrating large-scale ChIP-seq and time-course RNA-seq data.

    Science.gov (United States)

    Chen, Xi; Gu, Jinghua; Wang, Xiao; Jung, Jin-Gyoung; Wang, Tian-Li; Hilakivi-Clarke, Leena; Clarke, Robert; Xuan, Jianhua

    2017-12-21

    NGS techniques have been widely applied in genetic and epigenetic studies. Multiple ChIP-seq and RNA-seq profiles can now be jointly used to infer functional regulatory networks (FRNs). However, existing methods suffer from either oversimplified assumption on transcription factor (TF) regulation or slow convergence of sampling for FRN inference from large-scale ChIP-seq and time-course RNA-seq data. We developed an efficient Bayesian integration method (CRNET) for FRN inference using a two-stage Gibbs sampler to estimate iteratively hidden TF activities and the posterior probabilities of binding events. A novel statistic measure that jointly considers regulation strength and regression error enables the sampling process of CRNET to converge quickly, thus making CRNET very efficient for large-scale FRN inference. Experiments on synthetic and benchmark data showed a significantly improved performance of CRNET when compared with existing methods. CRNET was applied to breast cancer data to identify FRNs functional at promoter or enhancer regions in breast cancer MCF-7 cells. Transcription factor MYC is predicted as a key functional factor in both promoter and enhancer FRNs. We experimentally validated the regulation effects of MYC on CRNET-predicted target genes using appropriate RNAi approaches in MCF-7 cells. R scripts of CRNET are available at http://www.cbil.ece.vt.edu/software.htm. xuan@vt.edu. Supplementary data are available at Bioinformatics online.
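
    A much-reduced sketch of a two-stage Gibbs sampler in this spirit is given below: stage one samples a hidden TF activity given the current binding indicators, stage two samples each candidate binding event given the activity. One TF, unit regulation weights and Gaussian noise are simplifying assumptions; CRNET's joint regulation-strength/regression-error statistic and its convergence machinery are not reproduced.

      import numpy as np

      rng = np.random.default_rng(2)
      T, G = 20, 30                          # time points, candidate target genes
      a_true = rng.normal(0.0, 1.0, T)       # hidden TF activity profile
      z_true = rng.random(G) < 0.4           # which candidate bindings are functional
      Y = np.outer(z_true, a_true) + rng.normal(0.0, 0.5, (G, T))

      sigma2, tau2, prior = 0.25, 1.0, 0.4   # noise var, activity prior var, P(z=1)
      logit_prior = np.log(prior / (1.0 - prior))
      z = np.ones(G, dtype=bool)
      for sweep in range(200):
          # Stage 1: activity | z (conjugate Gaussian posterior, per time point)
          n = int(z.sum())
          var = 1.0 / (n / sigma2 + 1.0 / tau2)
          a = rng.normal(var * Y[z].sum(axis=0) / sigma2, np.sqrt(var))
          # Stage 2: each z_g | activity (Bernoulli from the likelihood ratio)
          ll1 = -0.5 * ((Y - a) ** 2).sum(axis=1) / sigma2   # gene regulated by TF
          ll0 = -0.5 * (Y ** 2).sum(axis=1) / sigma2         # gene not regulated
          logit = np.clip(ll1 - ll0 + logit_prior, -50.0, 50.0)
          z = rng.random(G) < 1.0 / (1.0 + np.exp(-logit))

      print("binding calls matching truth: %d of %d" % (int((z == z_true).sum()), G))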

  10. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  11. Very Large Scale Integration of Nano-Patterned YBa2Cu3O7-delta Josephson Junctions in a Two-Dimensional Array

    Energy Technology Data Exchange (ETDEWEB)

    Cybart, Shane A; Anton, Steven; Wu, Stephen; Clarke, John; Dynes, Robert

    2009-09-01

    Very large scale integration of Josephson junctions in a two-dimensional series-parallel array has been achieved by ion irradiating a YBa2Cu3O7-δ film through slits in a nano-fabricated mask created with electron beam lithography and reactive ion etching. The mask consisted of 15,820 high-aspect-ratio (20:1), 35-nm wide slits that restricted the irradiation in the film below to form Josephson junctions. Characterizing each parallel segment k, containing 28 junctions, with a single critical current I_ck, we found a standard deviation in I_ck of about 16%.

  12. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are described from tests of the material's resistance to non-ductile fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  13. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... based on measurements on the Marstal plant, Denmark, and through comparison with published and unpublished data from other plants. Evaluations of the thermal, economical and environmental performance are reported, based on experiences from the last decade. For detailed designing, a computer simulation...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors...

  14. Evaluating the implementation of a national disclosure policy for large-scale adverse events in an integrated health care system: identification of gaps and successes

    OpenAIRE

    Maguire, Elizabeth M.; Bokhour, Barbara G.; Wagner, Todd H.; Asch, Steven M.; Gifford, Allen L.; Gallagher, Thomas H.; Durfee, Janet M.; Martinello, Richard A.; Elwy, A. Rani

    2016-01-01

    Background Many healthcare organizations have developed disclosure policies for large-scale adverse events, including the Veterans Health Administration (VA). This study evaluated VA's national large-scale disclosure policy and identifies gaps and successes in its implementation. Methods Semi-structured qualitative interviews were conducted with leaders, hospital employees, and patients at nine sites to elicit their perceptions of recent large-scale adverse events notifications and the nation...

  15. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  16. Large scale integration of flexible non-volatile, re-addressable memories using P(VDF-TrFE) and amorphous oxide transistors

    International Nuclear Information System (INIS)

    Gelinck, Gerwin H; Cobb, Brian; Van Breemen, Albert J J M; Myny, Kris

    2015-01-01

    Ferroelectric polymers and amorphous metal oxide semiconductors have emerged as important materials for re-programmable non-volatile memories and high-performance, flexible thin-film transistors, respectively. However, realizing sophisticated transistor memory arrays has proven to be a challenge, and demonstrating reliable writing to and reading from such a large scale memory has thus far not been demonstrated. Here, we report an integration of ferroelectric, P(VDF-TrFE), transistor memory arrays with thin-film circuitry that can address each individual memory element in that array. n-type indium gallium zinc oxide is used as the active channel material in both the memory and logic thin-film transistors. The maximum process temperature is 200 °C, allowing plastic films to be used as substrate material. The technology was scaled up to 150 mm wafer size, and offers good reproducibility, high device yield and low device variation. This forms the basis for successful demonstration of memory arrays, read and write circuitry, and the integration of these. (paper)

  17. An Efficient Large-Scale Retroviral Transduction Method Involving Preloading the Vector into a RetroNectin-Coated Bag with Low-Temperature Shaking

    Science.gov (United States)

    Dodo, Katsuyuki; Chono, Hideto; Saito, Naoki; Tanaka, Yoshinori; Tahara, Kenichi; Nukaya, Ikuei; Mineno, Junichi

    2014-01-01

    In retroviral vector-mediated gene transfer, transduction efficiency can be hampered by inhibitory molecules derived from the culture fluid of virus producer cell lines. To remove these inhibitory molecules to enable better gene transduction, we had previously developed a transduction method using a fibronectin fragment-coated vessel (i.e., the RetroNectin-bound virus transduction method). In the present study, we developed a method that combined RetroNectin-bound virus transduction with low-temperature shaking and applied this method in manufacturing autologous retroviral-engineered T cells for adoptive transfer gene therapy in a large-scale closed system. Retroviral vector was preloaded into a RetroNectin-coated bag and incubated at 4°C for 16 h on a reciprocating shaker at 50 rounds per minute. After the supernatant was removed, activated T cells were added to the bag. The bag transduction method has the advantage of increasing transduction efficiency, as simply flipping over the bag during gene transduction facilitates more efficient utilization of the retroviral vector adsorbed on the top and bottom surfaces of the bag. Finally, we performed validation runs of endoribonuclease MazF-modified CD4+ T cell manufacturing for HIV-1 gene therapy and T cell receptor-modified T cell manufacturing for MAGE-A4 antigen-expressing cancer gene therapy and achieved over 200-fold (≥10^10) and 100-fold (≥5×10^9) expansion, respectively. In conclusion, we demonstrated that the large-scale closed transduction system is highly efficient for retroviral vector-based T cell manufacturing for adoptive transfer gene therapy, and this technology is expected to be amenable to automation and improve current clinical gene therapy protocols. PMID:24454964

  18. Integrated biodosimetry in large scale radiological events. Opportunities for civil military co-operation; Integrierte Biodosimetrie bei radiologischen Grossschadensereignissen. Moeglichkeiten fuer zivil-militaerische Zusammenarbeit

    Energy Technology Data Exchange (ETDEWEB)

    Port, M.; Eder, S.F.; Lamkowski, A.; Majewski, M.; Abend, M. [Institut fuer Radiobiologie der Bundeswehr, Muenchen (Germany)

    2016-07-01

    Radiological events such as large-scale radiological or nuclear accidents, or terrorist attacks with radionuclide dispersal devices, require rapid and precise medical classification ('triage') and medical management of a large number of patients. Estimates of the absorbed dose and, in particular, predictions of radiation-induced health effects are mandatory for optimized allocation of limited medical resources and initiation of patient-centred treatment. Within the German Armed Forces Medical Service, the Bundeswehr Institute of Radiobiology offers a wide range of tools for medical management in different scenarios. The forward-deployable mobile Medical Task Force has access to state-of-the-art methodologies summarized into approaches such as physical dosimetry (including mobile gamma spectroscopy), clinical 'dosimetry' (prodromi, H-Modul) and different means of biological dosimetry (e.g. dicentrics, high-throughput gene expression techniques, gamma-H2AX). The integration of these different approaches enables trained physicians of the Medical Task Force to assess individual health injuries and to make prognostic evaluations, considering modern treatment options. To enhance the capacity of single institutions, networking has been recognized as an important emergency response strategy. The capabilities of physical, biological and clinical 'dosimetry' approaches, spanning from low up to high radiation exposures, will be discussed. Furthermore, civil-military opportunities for combined efforts will be demonstrated.

  19. ANEMOS: Development of a next generation wind power forecasting system for the large-scale integration of onshore and offshore wind farms.

    Science.gov (United States)

    Kariniotakis, G.; Anemos Team

    2003-04-01

    Objectives: Accurate forecasting of the wind energy production up to two days ahead is recognized as a major contribution to reliable large-scale wind power integration. Especially in a liberalized electricity market, prediction tools enhance the position of wind energy compared to other forms of dispatchable generation. ANEMOS is a new 3.5-year R&D project supported by the European Commission that brings together research organizations and end-users with substantial experience in the domain. The project aims to develop advanced forecasting models that will substantially outperform current methods. Emphasis is given to situations like complex terrain and extreme weather conditions, as well as to offshore prediction, for which no specific tools currently exist. The prediction models will be implemented in a software platform and installed for online operation at onshore and offshore wind farms by the end-users participating in the project. Approach: The paper presents the methodology of the project. Initially, the prediction requirements are identified according to the profiles of the end-users. The project develops prediction models based on both a physical and an alternative statistical approach. Research on physical models gives emphasis to techniques for use in complex terrain and the development of prediction tools based on CFD techniques, advanced model output statistics or high-resolution meteorological information. Statistical models (i.e. based on artificial intelligence) are developed for downscaling, power curve representation, upscaling for prediction at regional or national level, etc. A benchmarking process is set up to evaluate the performance of the developed models and to compare them with existing ones using a number of case studies. The synergy between statistical and physical approaches is examined to identify promising areas for further improvement of forecasting accuracy. Appropriate physical and statistical prediction models are also developed for...
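
    The benchmarking idea can be illustrated with the standard reference model, persistence: any candidate forecast is scored by its RMSE improvement over simply propagating the last observation. The synthetic power series and the shrinkage "model" below are assumptions purely for demonstrating the evaluation protocol, not an ANEMOS model.

      import numpy as np

      rng = np.random.default_rng(3)
      n, phi, mean = 2000, 0.97, 0.5
      power = np.empty(n)
      power[0] = mean
      for k in range(1, n):          # mean-reverting toy "wind power" series
          power[k] = np.clip(phi * power[k - 1] + (1 - phi) * mean
                             + rng.normal(0.0, 0.02), 0.0, 1.0)

      h = 6                          # forecast horizon (steps ahead)
      actual = power[h:]
      persistence = power[:-h]       # the reference every model must beat

      alpha = phi ** h               # optimal AR(1) shrinkage for this toy series
      model = mean + alpha * (persistence - mean)

      rmse = lambda f: float(np.sqrt(np.mean((f - actual) ** 2)))
      print("persistence RMSE: %.4f" % rmse(persistence))
      print("model RMSE:       %.4f  (%.1f%% improvement)"
            % (rmse(model), 100 * (1 - rmse(model) / rmse(persistence))))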

  20. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  1. The Integrated Use of DMSP-OLS Nighttime Light and MODIS Data for Monitoring Large-Scale Impervious Surface Dynamics: A Case Study in the Yangtze River Delta

    Directory of Open Access Journals (Sweden)

    Zhenfeng Shao

    2014-09-01

    The timely and reliable estimation of imperviousness is essential for the scientific understanding of human-Earth interactions. Due to its unique capacity for capturing artificial light luminosity and its long-term data records, the Defense Meteorological Satellite Program's (DMSP) Operational Line-scan System (OLS) nighttime light (NTL) imagery offers an appealing opportunity for continuously characterizing impervious surface area (ISA) at regional and continental scales. Although different levels of success have been achieved, critical challenges still remain in the literature. ISA results generated by DMSP-OLS NTL alone suffer from limitations due to systemic defects of the sensor. Moreover, the majority of developed methodologies seldom consider spatial heterogeneity, which is a key issue in coarse imagery applications. In this study, we proposed a novel method for multi-temporal ISA estimation. This method is based on a linear regression model developed between the sub-pixel ISA fraction and a multi-source index with the integrated use of DMSP-OLS NTL and MODIS NDVI. In contrast with traditional regression analysis, we incorporated spatial information into the regression model to obtain spatially adaptive coefficients at the per-pixel level. To produce multi-temporal ISA maps using a mono-temporal reference dataset, temporally stable samples were extracted for model training and validation. We tested the proposed method in the Yangtze River Delta and generated annual ISA fraction maps for the decade 2000-2009. According to our assessments, the proposed method exhibited substantial improvements compared with the standard linear regression model and provided a feasible way to monitor large-scale impervious surface dynamics.
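
    A bare-bones version of the regression idea is sketched below: combine NTL and NDVI into a single index (bright, sparsely vegetated pixels score high) and fit a linear model from the index to reference ISA fractions. The paper fits spatially adaptive per-pixel coefficients; this sketch fits one global pair, and all arrays are synthetic stand-ins.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 5000
      ntl = rng.uniform(0, 63, n)           # DMSP-OLS digital numbers (0-63)
      ndvi = rng.uniform(0.05, 0.9, n)

      # Multi-source index: normalised NTL damped by vegetation cover
      index = (ntl / 63.0) * (1.0 - ndvi)
      # Synthetic "reference" sub-pixel ISA fractions for model training
      isa_ref = np.clip(0.9 * index + rng.normal(0, 0.05, n), 0.0, 1.0)

      slope, intercept = np.polyfit(index, isa_ref, 1)
      isa_est = np.clip(slope * index + intercept, 0.0, 1.0)
      r = np.corrcoef(isa_est, isa_ref)[0, 1]
      print(f"ISA = {slope:.2f} * index + {intercept:.2f}   (r = {r:.2f})")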

  2. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  3. On the conceptual design of large-scale process & energy infrastructure systems integrating flexibility, reliability, availability, maintainability and economics (FRAME) performance metrics

    NARCIS (Netherlands)

    Ajah, A.N.

    2009-01-01

    The environment in which large-scale process and energy infrastructure systems operate is becoming more dynamic and subject to various uncertainties and disturbances. These challenge the engineers and designers to provide solutions and designs that are not only adaptive to a wide range of future

  4. The Systematic Integration of Very Large Scale Integrated Circuit Computer-Aided Design Tools into a Toolkit Optimized for Academic Applications.

    Science.gov (United States)

    1984-12-01

    line (DIP) IC package. 1965: Robert Widlar, a designer with Fairchild, develops the first practical integrated circuit operational amplifier (opamp), the uA709. Widlar also designed the uA702, uA710, and uA741 opamps. 1966: Autonetics... There was resistance to the implementation of automated design aids [17,40]. The reluctance to use CAD programs was partly due to numerous software fail...

  5. Large-Scale Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  6. Large scale homing in honeybees.

    Directory of Open Access Journals (Sweden)

    Mario Pahl

    Full Text Available Honeybee foragers frequently fly several kilometres to and from vital resources, and communicate those locations to their nest mates by a symbolic dance language. Research has shown that they achieve this feat by memorizing landmarks and the skyline panorama, using the sun and polarized skylight as compasses and by integrating their outbound flight paths. In order to investigate the capacity of the honeybees' homing abilities, we artificially displaced foragers to novel release spots at various distances up to 13 km in the four cardinal directions. Returning bees were individually registered by a radio frequency identification (RFID) system at the hive entrance. We found that homing rate, homing speed and the maximum homing distance depend on the release direction. Bees released in the east were more likely to find their way back home, and returned faster than bees released in any other direction, due to the familiarity of global landmarks seen from the hive. Our findings suggest that such large scale homing is facilitated by global landmarks acting as beacons, and possibly the entire skyline panorama.

  7. Ecogrid EU - a large scale smart grids demonstration of real time market-based integration of numerous small DER and DR

    DEFF Research Database (Denmark)

    Ding, Yi; Nyeng, Preben; Ostergaard, Jacob

    2012-01-01

    This paper provides an overview of the Ecogrid EU project, a large-scale demonstration project on the Danish island of Bornholm. It provides Europe with a fast-track evolution towards smart grid dissemination and deployment in the distribution network. The objective of Ecogrid EU is to demonstrate real-time market-based integration of numerous small-scale distributed energy resources and demand response: customers will be equipped with demand-response devices with smart controllers and smart meters, allowing them to respond to real-time prices based on their pre-programmed demand-response preferences.

  8. Integration of 18 GW Wind Energy into the Energy Market. Practical Experiences in Germany. Experiences with large-scale integration of wind power into power systems

    International Nuclear Information System (INIS)

    Krauss, C.; Graeber, B.; Lange, M.; Focken, U.

    2006-01-01

    This work describes the integration of 18 GW of wind power into the German energy market. The focus lies on reporting practical experiences concerning the use of wind energy in Germany within the framework of the renewable energy act (EEG) and the immediate exchange of wind power between the four German grid control areas. Due to the EEG, the demand for monitoring the current energy production of wind farms and for short-term predictions of wind power has significantly increased and opened a broader market for these services. In particular for trading on the intraday market, ultra-short-term predictions with lead times of 1 to 10 hours require different approaches than the usual day-ahead predictions, because the large numerical meteorological models are not sufficiently optimized for very short time horizons. It is shown that for this range a combination of a statistical and a deterministic model leads to significant improvements and stable results, as it unites the characteristics of the current wind power production with the synoptic-scale meteorological situation. Possible concepts for balancing the remaining differences between predicted and actual wind power generation are discussed. As wind power prediction errors and load forecasting errors are uncorrelated, benefits can arise from combined balancing. Finally, practical experiences with wind power fluctuations and large forecast errors are presented.
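
    As a toy illustration of the statistical/deterministic combination described above, the sketch below blends a persistence-based estimate with an NWP-derived forecast, shifting weight toward the NWP model as the lead time grows. The exponential weighting and its time constant are assumptions for illustration, not the operational German scheme.

        # Illustrative blend of an ultra-short-term statistical estimate
        # (persistence of current production) with a deterministic NWP forecast.
        import numpy as np

        def blended_forecast(p_now, p_nwp, lead_hours, tau_h=6.0):
            # p_now: current measured wind power (MW); p_nwp: NWP-based forecast
            # per lead time (MW); weight on persistence decays with lead time.
            lead_hours = np.asarray(lead_hours, dtype=float)
            w = np.exp(-lead_hours / tau_h)
            return w * p_now + (1.0 - w) * np.asarray(p_nwp, dtype=float)

        # Example for lead times of 1-10 hours:
        # blended_forecast(950.0, np.linspace(900.0, 700.0, 10), np.arange(1, 11))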

  9. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  10. SMILE: experimental results of the WP4 PTS large scale test performed on a component in terms of cracked cylinder involving warm pre-stress

    International Nuclear Information System (INIS)

    Kerkhof, K.; Bezdikian, G.; Moinereau, D.; Dahl, A; Wadier, Y.; Gilles, P.; Keim, E.; Chapuliot, S.; Taylor, N.; Lidbury, D.; Sharples, J.; Budden, P.; Siegele, D.; Nagel, G.; Bass, R.; Emond, D.

    2005-01-01

    The Reactor Pressure Vessel (RPV) is an essential component, which is liable to limit the lifetime of PWR plants. The assessment of defects in RPVs subjected to pressurized thermal shock (PTS) transients made at a European level generally does not consider the beneficial effect of the load history (warm pre-stress, WPS). The SMILE project - Structural Margin Improvements in aged embrittled RPV with Load history Effects - aims to provide sufficient elements to demonstrate, model and validate the beneficial WPS effect. It also aims to harmonize the different approaches in the national codes and standards regarding the inclusion of the WPS effect in RPV structural integrity assessment. The project includes significant experimental work on WPS type experiments with C(T) specimens and a PTS type transient experiment on a large component. This paper deals with the results of the PTS type transient experiment on a component-like specimen subjected to WPS loading, the so-called Validation Test, carried out within the framework of work package WP4. The test specimen is a cylindrical thick-walled specimen with a thickness of 40 mm and an outer diameter of 160 mm, provided with an internal fully circumferential crack with a depth of about 15 mm. The specified load path is of the Load-Cool-Unload-Fracture (LCUF) type. No crack initiation occurred during cooling (thermal shock loading), although the loading path crossed the fracture toughness curve in the transition region. The benefit of the WPS effect on final re-loading up to fracture in the lower shelf region was clearly shown: the fracture load during reloading in the lower shelf region was significantly higher than the crack initiation values of the original material in the lower shelf region. The post-test fractographic evaluation showed that the fracture mode was predominantly cleavage fracture, with some secondary cracks emanating from the major crack. (authors)

  11. Large-scale nanophotonic phased array.

    Science.gov (United States)

    Sun, Jie; Timurdogan, Erman; Yaacobi, Ami; Hosseini, Ehsan Shah; Watts, Michael R

    2013-01-10

    Electromagnetic phased arrays at radio frequencies are well known and have enabled applications ranging from communications to radar, broadcasting and astronomy. The ability to generate arbitrary radiation patterns with large-scale phased arrays has long been pursued. Although it is extremely expensive and cumbersome to deploy large-scale radiofrequency phased arrays, optical phased arrays have a unique advantage in that the much shorter optical wavelength holds promise for large-scale integration. However, the short optical wavelength also imposes stringent requirements on fabrication. As a consequence, although optical phased arrays have been studied with various platforms and recently with chip-scale nanophotonics, all of the demonstrations so far are restricted to one-dimensional or small-scale two-dimensional arrays. Here we report the demonstration of a large-scale two-dimensional nanophotonic phased array (NPA), in which 64 × 64 (4,096) optical nanoantennas are densely integrated on a silicon chip within a footprint of 576 μm × 576 μm with all of the nanoantennas precisely balanced in power and aligned in phase to generate a designed, sophisticated radiation pattern in the far field. We also show that active phase tunability can be realized in the proposed NPA by demonstrating dynamic beam steering and shaping with an 8 × 8 array. This work demonstrates that a robust design, together with state-of-the-art complementary metal-oxide-semiconductor technology, allows large-scale NPAs to be implemented on compact and inexpensive nanophotonic chips. In turn, this enables arbitrary radiation pattern generation using NPAs and therefore extends the functionalities of phased arrays beyond conventional beam focusing and steering, opening up possibilities for large-scale deployment in applications such as communication, laser detection and ranging, three-dimensional holography and biomedical sciences, to name just a few.

  12. Integrating SMOS brightness temperatures with a new conceptual spatially distributed hydrological model for improving flood and drought predictions at large scale.

    Science.gov (United States)

    Hostache, Renaud; Rains, Dominik; Chini, Marco; Lievens, Hans; Verhoest, Niko E. C.; Matgen, Patrick

    2017-04-01

    Motivated by climate change and its impact on the scarcity or excess of water in many parts of the world, several agencies and research institutions have taken initiatives in monitoring and predicting the hydrologic cycle at a global scale. Such a monitoring/prediction effort is important for understanding the vulnerability to extreme hydrological events and for providing early warnings. This can be based on an optimal combination of hydro-meteorological models and remote sensing, in which satellite measurements can be used as forcing or calibration data or for regularly updating the model states or parameters. Many advances have been made in these domains, and the near future will bring new opportunities with respect to remote sensing as a result of the increasing number of spaceborne sensors enabling the large scale monitoring of water resources. Beyond these advances, there is currently a tendency to refine and further complicate physically-based hydrologic models to better capture the hydrologic processes at hand. However, this may not necessarily be beneficial for large-scale hydrology, as computational efforts increase significantly. As a matter of fact, a novel thematic science question to be investigated is whether a flexible conceptual model can match the performance of a complex physically-based model for hydrologic simulations at large scale. In this context, the main objective of this study is to investigate how innovative techniques that allow for the estimation of soil moisture from satellite data can help in reducing errors and uncertainties in large scale conceptual hydro-meteorological modelling. A spatially distributed conceptual hydrologic model has been set up based on recent developments of the SUPERFLEX modelling framework. As it requires limited computational efforts, this model enables early warnings for large areas. Using as forcings the ERA-Interim public dataset and coupled with the CMEM radiative transfer model

  13. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate, formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  14. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  15. Effect of a package of integrated demand- and supply-side interventions on facility delivery rates in rural Bangladesh: Implications for large-scale programs.

    Directory of Open Access Journals (Sweden)

    Sayedur Rahman

    Full Text Available According to the Bangladesh Demographic and Health Survey 2014, only approximately 37 percent of women deliver in a health facility. Among the eight administrative divisions of Bangladesh, the facility delivery rate is lowest in the Sylhet division (22.6 percent), where we assessed the effect of integrated supply- and demand-side interventions on the facility-based delivery rate. Population-based cohort data of pregnant women from an ongoing maternal and newborn health improvement study being conducted in a population of ~120,000 in Sylhet district were used. The study required collection and processing of biological samples immediately after delivery. Therefore, the project assembled various strategies to increase institutional delivery rates. The supply-side intervention included capacity expansion of the health facilities through service provider refresher training, 24/7 service coverage, additions of drugs and supplies, and incentives to the providers. The demand-side component involved financial incentives to cover expenses, a provision of emergency transport, and referral support to a tertiary-level hospital. We conducted a before-and-after observational study to assess the impact of the intervention in a total of 1,861 deliveries between December 2014 and November 2016. Overall, implementation of the intervention package was associated with a 52.6 percentage point increase in the proportion of facility-based deliveries, from a baseline rate of 25.0 percent to 77.6 percent in 24 months. We observed lower rates of institutional deliveries when only supply-side interventions were implemented. The proportion rose to 47.1 percent and continued increasing when the project emphasized addressing the financial barriers to accessing obstetric care in a health facility. An integrated supply- and demand-side intervention was associated with a substantial increase in institutional delivery. The package can be tailored to identify which combination of

  16. Effect of a package of integrated demand- and supply-side interventions on facility delivery rates in rural Bangladesh: Implications for large-scale programs.

    Science.gov (United States)

    Rahman, Sayedur; Choudhury, Aziz Ahmed; Khanam, Rasheda; Moin, Syed Mamun Ibne; Ahmed, Salahuddin; Begum, Nazma; Shoma, Nurun Naher; Quaiyum, Md Abdul; Baqui, Abdullah H

    2017-01-01

    According to the Bangladesh Demographic and Health Survey 2014, only approximately 37 percent of women deliver in a health facility. Among the eight administrative divisions of Bangladesh, the facility delivery rate is lowest in the Sylhet division (22.6 percent) where we assessed the effect of integrated supply- and demand-side interventions on the facility-based delivery rate. Population-based cohort data of pregnant women from an ongoing maternal and newborn health improvement study being conducted in a population of ~120,000 in Sylhet district were used. The study required collection and processing of biological samples immediately after delivery. Therefore, the project assembled various strategies to increase institutional delivery rates. The supply-side intervention included capacity expansion of the health facilities through service provider refresher training, 24/7 service coverage, additions of drugs and supplies, and incentives to the providers. The demand-side component involved financial incentives to cover expenses, a provision of emergency transport, and referral support to a tertiary-level hospital. We conducted a before-and-after observational study to assess the impact of the intervention in a total of 1,861 deliveries between December 2014 and November 2016. Overall, implementation of the intervention package was associated with a 52.6 percentage point increase in the proportion of facility-based deliveries, from a baseline rate of 25.0 percent to 77.6 percent in 24 months. We observed lower rates of institutional deliveries when only supply-side interventions were implemented. The proportion rose to 47.1 percent and continued increasing when the project emphasized addressing the financial barriers to accessing obstetric care in a health facility. An integrated supply- and demand-side intervention was associated with a substantial increase in institutional delivery. The package can be tailored to identify which combination of interventions may

  17. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  18. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    Examples include the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel and the black, curved, smooth concrete surfaces of Zaha Hadid's Ordrupgård. Furthermore, one can point to initiatives such as "Synlig beton" (visible concrete), presented on the website www.synligbeton.dk, and Spæncom's aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research-development project "Lasting large scale glazed concrete formwork," which I am working on at DTU's Department of Architectural Engineering, will be able to complement these. It is a project in which I try to develop new aesthetic potentials for concrete, at large scales that have not been seen before in the ceramic area. It is expected to result in new types of large scale and very thin, glazed concrete façades in building. If such are introduced in an architectural context as exposed surfaces...

  19. Development of large scale riverine terrain-bathymetry dataset by integrating NHDPlus HR with NED,CoNED and HAND data

    Science.gov (United States)

    Li, Z.; Clark, E. P.

    2017-12-01

    Large scale and fine resolution riverine bathymetry data is critical for flood inundation modeling but not available over the continental United States (CONUS). Previously we implemented bankfull hydraulic geometry based approaches to simulate bathymetry for individual rivers using NHDPlus v2.1 data and the 10 m National Elevation Dataset (NED). USGS has recently developed High Resolution NHD data (NHDPlus HR Beta) (USGS, 2017), and this enhanced dataset has a significantly improved spatial correspondence with the 10 m DEM. In this study, we used this high resolution data, specifically NHDFlowline and NHDArea, to create bathymetry/terrain for CONUS river channels and floodplains. A software package, NHDPlus Inundation Modeler v5.0 Beta, was developed for this project as an Esri ArcGIS hydrological analysis extension. With the updated tools, the raw 10 m DEM was first hydrologically treated to remove artificial blockages (e.g., overpasses, bridges and even roadways) using low-pass moving window filters. Cross sections were then automatically constructed along each flowline to extract elevation from the hydrologically treated DEM. In this study, river channel shapes were approximated using quadratic curves to reduce uncertainties from the commonly used trapezoids. We calculated underwater channel elevations at each cross-section sampling point using bankfull channel dimensions that were estimated from physiographic province/division based regression equations (Bieger et al. 2015). These elevation points were then interpolated to generate a bathymetry raster. The simulated bathymetry raster was integrated with the USGS NED and the Coastal National Elevation Database (CoNED) (wherever available) to make a seamless terrain-bathymetry dataset. Channel bathymetry was also integrated into the HAND (Height Above Nearest Drainage) dataset to improve large scale inundation modeling. The generated terrain-bathymetry was processed at the Watershed Boundary Dataset Hydrologic Unit 4 (WBDHU4) level.
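
    The quadratic-curve channel approximation lends itself to a compact sketch. The following illustration (assuming a symmetric parabolic cross section; this is not the NHDPlus Inundation Modeler code) computes below-water elevations from regression-estimated bankfull width and depth.

        # Sketch: parabolic channel bed elevations across one cross section.
        import numpy as np

        def parabolic_bed(bank_elev, bankfull_width, bankfull_depth, n_pts=21):
            # z(x) is a parabola reaching bank elevation at both banks and
            # sitting bankfull_depth below it at the channel centerline.
            x = np.linspace(-bankfull_width / 2.0, bankfull_width / 2.0, n_pts)
            z = bank_elev - bankfull_depth * (1.0 - (2.0 * x / bankfull_width) ** 2)
            return x, z

        # e.g., with bankfull geometry from the regional regression equations:
        # x, z = parabolic_bed(bank_elev=102.4, bankfull_width=35.0, bankfull_depth=2.1)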

  20. Large-scale integration of off-shore wind power and regulation strategies of cogeneration plants in the Danish electricity system

    DEFF Research Database (Denmark)

    Østergaard, Poul Alberg

    2005-01-01

    The article analyses how the amount of small-scale CHP plants and heat pumps, and the regulation strategies of these, affect the quantity of off-shore wind power that may be integrated into the Danish electricity supply.

  1. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

    To establish planar biomimetic membranes across large scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro-structured aperture arrays and could incorporate peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further developments of sensitive biosensor assays...

  2. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R&D.

  3. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The fundamental technological resources in network technologies are analysed for scalability; here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its main focus. Here the general perception of the nature and role in society of large scale networks as a fundamental infrastructure is analysed. This analysis focuses on the effects of the technical DDN projects and on the perception of network infrastructure as expressed by key decision makers...

  4. Predictors of Information Technology Integration in Secondary Schools: Evidence from a Large Scale Study of More than 30,000 Students.

    Science.gov (United States)

    Hew, Khe Foon; Tan, Cheng Yong

    2016-01-01

    The present study examined the predictors of information technology (IT) integration in secondary school mathematics lessons. The predictors pertained to IT resource availability in schools, school contextual/institutional variables, accountability pressure faced by schools, subject culture in mathematics, and mathematics teachers' pedagogical beliefs and practices. Data from 32,256 secondary school students from 2,519 schools in 16 developed economies who participated in the Program for International Student Assessment (PISA) 2012 were analyzed using hierarchical linear modeling (HLM). Results showed that after controlling for student-level (gender, prior academic achievement and socioeconomic status) and school-level (class size, number of mathematics teachers) variables, students in schools with more computers per student, with more IT resources, with higher levels of IT curricular expectations, with an explicit policy on the use of IT in mathematics, whose teachers believed in student-centered teaching-learning, and whose teachers provided more problem-solving activities in class reported higher levels of IT integration. On the other hand, students who studied in schools with more positive teacher-related school learning climate, and with more academically demanding parents reported lower levels of IT integration. Student-related school learning climate, principal leadership behaviors, schools' public posting of achievement data, tracking of school's achievement data by administrative authorities, and pedagogical and curricular differentiation in mathematics lessons were not related to levels of IT integration. Put together, the predictors explained a total of 15.90% of the school-level variance in levels of IT integration. In particular, school IT resource availability, and mathematics teachers' pedagogical beliefs and practices stood out as the most important determinants of IT integration in mathematics lessons.
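
    For readers unfamiliar with HLM, the sketch below shows the two-level structure of such an analysis (students nested within schools) as a random-intercept mixed model in Python's statsmodels. The file and variable names are hypothetical, and PISA's plausible values and sampling weights are omitted for brevity.

        # Two-level model: student-level predictors plus school-level predictors,
        # with a random intercept for each school.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("pisa2012_extract.csv")  # hypothetical data extract

        model = smf.mixedlm(
            "it_integration ~ gender + prior_achievement + ses"
            " + class_size + n_math_teachers + computers_per_student"
            " + it_resources + it_curricular_expectations",
            data=df,
            groups=df["school_id"],  # students nested within schools
        )
        print(model.fit().summary())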

  5. Predictors of Information Technology Integration in Secondary Schools: Evidence from a Large Scale Study of More than 30,000 Students.

    Directory of Open Access Journals (Sweden)

    Khe Foon Hew

    Full Text Available The present study examined the predictors of information technology (IT) integration in secondary school mathematics lessons. The predictors pertained to IT resource availability in schools, school contextual/institutional variables, accountability pressure faced by schools, subject culture in mathematics, and mathematics teachers' pedagogical beliefs and practices. Data from 32,256 secondary school students from 2,519 schools in 16 developed economies who participated in the Program for International Student Assessment (PISA) 2012 were analyzed using hierarchical linear modeling (HLM). Results showed that after controlling for student-level (gender, prior academic achievement and socioeconomic status) and school-level (class size, number of mathematics teachers) variables, students in schools with more computers per student, with more IT resources, with higher levels of IT curricular expectations, with an explicit policy on the use of IT in mathematics, whose teachers believed in student-centered teaching-learning, and whose teachers provided more problem-solving activities in class reported higher levels of IT integration. On the other hand, students who studied in schools with more positive teacher-related school learning climate, and with more academically demanding parents reported lower levels of IT integration. Student-related school learning climate, principal leadership behaviors, schools' public posting of achievement data, tracking of school's achievement data by administrative authorities, and pedagogical and curricular differentiation in mathematics lessons were not related to levels of IT integration. Put together, the predictors explained a total of 15.90% of the school-level variance in levels of IT integration. In particular, school IT resource availability, and mathematics teachers' pedagogical beliefs and practices stood out as the most important determinants of IT integration in mathematics lessons.

  6. Data integration for European marine biodiversity research: creating a database on benthos and plankton to study large-scale patterns and long-term changes

    NARCIS (Netherlands)

    Vandepitte, L.; Vanhoorne, B.; Kraberg, A.; Anisimova, N.; Antoniadou, C.; Araújo, R.; Bartsch, I.; Beker, B.; Benedetti-Cecchi, L.; Bertocci, I.; Cochrane, S.J.; Cooper, K.; Craeymeersch, J.A.; Christou, E.; Crisp, D.J.; Dahle, S.; de Boissier, M.; De Kluijver, M.; Denisenko, S.; De Vito, D.; Duineveld, G.; Escaravage, V.L.; Fleischer, D.; Fraschetti, S.; Giangrande, A.; Heip, C.H.R.; Hummel, H.; Janas, U.; Karez, R.; Kedra, M.; Kingston, P.; Kuhlenkamp, R.; Libes, M.; Martens, P.; Mees, J.; Mieszkowska, N.; Mudrak, S.; Munda, I.; Orfanidis, S.; Orlando-Bonaca, M.; Palerud, R.; Rachor, E.; Reichert, K.; Rumohr, H.; Schiedek, D.; Schubert, P.; Sistermans, W.C.H.; Sousa Pinto, I.S.; Southward, A.J.; Terlizzi, A.; Tsiaga, E.; Van Beusekom, J.E.E.; Vanden Berghe, E.; Warzocha, J.; Wasmund, N.; Weslawski, J.M.; Widdicombe, C.; Wlodarska-Kowalczuk, M.; Zettler, M.L.

    2010-01-01

    The general aim of setting up a central database on benthos and plankton was to integrate long-, medium- and short-term datasets on marine biodiversity. Such a database makes it possible to analyse species assemblages and their changes on spatial and temporal scales across Europe. Data collation

  7. A CMOS-compatible large-scale monolithic integration of heterogeneous multi-sensors on flexible silicon for IoT applications

    KAUST Repository

    Nassar, Joanna M.

    2017-02-07

    We report CMOS technology enabled fabrication and system-level integration of a flexible bulk silicon (100) based multi-sensor platform which can simultaneously sense pressure, temperature, strain and humidity under various physical deformations. We also show an advanced wearable version for monitoring body vitals, which can enable advanced healthcare for IoT applications.

  8. A new approach to e-commerce customs control in China: Integrated supply chain - A practical application towards large-scale data pipeline implementation

    NARCIS (Netherlands)

    R. Hu (Rong); Y.H. Tan (Yao Hua); F. Heijmann (Frank)

    2016-01-01

    textabstractDevelopments in e-commerce are presenting new challenges in customs control and China's customs agency is developing a new approach to how it addresses these challenges. China Customs has adopted an integrated supply chain approach to secure international trade lanes and to facilitate

  9. A new approach to e-commerce customs control in China: Integrated supply chain : A practical application towards large-scale data pipeline implementation

    NARCIS (Netherlands)

    Hu, R.; Tan, Y.; Heijmann, F.

    2016-01-01

    Developments in e-commerce are presenting new challenges in customs control and China's customs agency is developing a new approach to how it addresses these challenges. China Customs has adopted an integrated supply chain approach to secure international trade lanes and to facilitate legitimate

  10. Identifying barriers to large-scale integration of variable renewable electricity into the electricity market : A literature review of market design

    NARCIS (Netherlands)

    Hu, J.|info:eu-repo/dai/nl/412681994; Harmsen, R.|info:eu-repo/dai/nl/195454278; Crijns-Graus, Wina|info:eu-repo/dai/nl/308005015; Worrell, E.|info:eu-repo/dai/nl/106856715; van den Broek, M.A.|info:eu-repo/dai/nl/092946895

    For reaching the 2 °C climate target, the robust growth of electricity generation from variable renewable energy sources (VRE) in the power sector is expected to continue. Accommodation of the power system to the variable, uncertain and locational-dependent outputs of VRE causes integration costs.

  11. CPTAC researchers report first large-scale integrated proteomic and genomic analysis of a human cancer | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    Investigators from the National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) who comprehensively analyzed 95 human colorectal tumor samples, have determined how gene alterations identified in previous analyses of the same samples are expressed at the protein level. The integration of proteomic and genomic data, or proteogenomics, provides a more comprehensive view of the biological features that drive cancer than genomic analysis alone and may help identify the most important targets for cancer detection and intervention.

  12. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abroad.

  13. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation

    Science.gov (United States)

    Qin, Changbo; Jia, Yangwen; Su, Z.(Bob); Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-01-01

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distributions within the Haihe basin differ. The remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of and information from model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems. PMID:27879946
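
    The assimilation step itself is the standard extended Kalman filter update. A minimal sketch follows; the state vector, observation operator and error covariances here are placeholders, not the actual WEP-L/SEBS configuration.

        # One EKF analysis step: correct the model state with an observation.
        import numpy as np

        def ekf_update(x, P, y_obs, h, H, R):
            # x: model state (n,); P: state error covariance (n, n)
            # y_obs: observation (m,); h: observation operator, h(x) -> (m,)
            # H: Jacobian of h at x (m, n); R: observation error covariance (m, m)
            S = H @ P @ H.T + R                     # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
            x_a = x + K @ (y_obs - h(x))            # analysis (updated) state
            P_a = (np.eye(len(x)) - K @ H) @ P      # analysis covariance
            return x_a, P_a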

  14. ScipionCloud: An integrative and interactive gateway for large scale cryo electron microscopy image processing on commercial and academic clouds.

    Science.gov (United States)

    Cuenca-Alba, Jesús; Del Cano, Laura; Gómez Blanco, Josué; de la Rosa Trevín, José Miguel; Conesa Mingo, Pablo; Marabini, Roberto; S Sorzano, Carlos Oscar; Carazo, Jose María

    2017-10-01

    New instrumentation for cryo electron microscopy (cryoEM) has significantly increased both data collection rate and data quality, creating bottlenecks at the image processing level. The current image processing model, in which acquired images are moved from the data source (electron microscope) to desktops or local clusters for processing, is encountering many practical limitations. However, computing may also take place in distributed and decentralized environments. In this way, the cloud is a new form of accessing computing and storage resources on demand. Here, we evaluate how this new computational paradigm can be effectively used by extending our current integrative framework for image processing, creating ScipionCloud. This new development has resulted in a full installation of Scipion in both public and private clouds, accessible as public "images" with all the required cryoEM software preinstalled, requiring just a Web browser to access all graphical user interfaces. We have profiled the performance of different configurations on Amazon Web Services and the European Federated Cloud, always on architectures incorporating GPUs, and compared them with a local facility. We have also analyzed the economic convenience of different scenarios, so that cryoEM scientists have a clearer picture of the setup best suited to their needs and budgets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Large-scale linear programs in planning and prediction.

    Science.gov (United States)

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints or robust optimization.
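
    For concreteness, a textbook chance-constrained linear program (an illustrative formulation, not necessarily the one used in this report) replaces an uncertain constraint by a probabilistic one:

        \min_{x} \; c^\top x \quad \text{s.t.} \quad \Pr\{ a^\top x \le b \} \ge 1 - \varepsilon .

    If the uncertain row is Gaussian, a \sim \mathcal{N}(\bar{a}, \Sigma), this has the deterministic second-order-cone equivalent

        \bar{a}^\top x + \Phi^{-1}(1 - \varepsilon) \, \lVert \Sigma^{1/2} x \rVert_2 \le b ,

    which remains convex for \varepsilon \le 1/2.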

  16. Large-scale galaxy bias

    Science.gov (United States)

    Jeong, Donghui; Desjacques, Vincent; Schmidt, Fabian

    2018-01-01

    Here, we briefly introduce the key results of the recent review (arXiv:1611.09787), whose abstract is as follows. This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy (or halo) statistics. We then review the excursion set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.

  17. Large-scale galaxy bias

    Science.gov (United States)

    Desjacques, Vincent; Jeong, Donghui; Schmidt, Fabian

    2018-02-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy statistics. We then review the excursion-set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.
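
    Schematically, the perturbative bias expansion reviewed here takes the form (lowest orders only, following the conventions of the review):

        \delta_g(\mathbf{x}, \tau) = \sum_{O} b_O(\tau) \, O(\mathbf{x}, \tau)
        = b_1 \delta + \frac{b_2}{2} \delta^2 + b_{K^2} (K_{ij})^2 + \cdots ,

    where \delta is the matter density contrast, K_{ij} is the tidal field, and the bias parameters b_O absorb the small-scale physics of galaxy formation.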

  18. Lightweight computational steering of very large scale molecular dynamics simulations

    International Nuclear Information System (INIS)

    Beazley, D.M.

    1996-01-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages
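
    The steering pattern described above can be summarized in a few lines: a scripting layer drives a compiled simulation kernel step by step, so users can inspect and modify state between steps. In the sketch below the md module is hypothetical; the actual system wrapped production C code through automatically generated extension bindings.

        # Conceptual steering loop: the script controls, queries and adjusts a
        # long-running MD simulation implemented in C.
        import md  # hypothetical C-extension module exposing the MD kernel

        sim = md.System("restart.chk")        # attach to existing simulation data
        for step in range(1000):
            sim.advance(10)                   # run 10 MD timesteps in compiled code
            if step % 10 == 0:
                t = sim.temperature()         # query state from the script
                print(f"step {step}: T = {t:.1f} K")
                if t > 2000.0:                # steer: adjust parameters on the fly
                    sim.rescale_velocities(0.95)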

  19. Handbook of Large-Scale Random Networks

    CERN Document Server

    Bollobas, Bela; Miklos, Dezso

    2008-01-01

    Covers various aspects of large-scale networks, including mathematical foundations and rigorous results of random graph theory, modeling and computational aspects of large-scale networks, as well as areas in physics, biology, neuroscience, sociology and technical areas

  20. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized, though they are considered.

  1. Introducing Large-Scale Innovation in Schools

    Science.gov (United States)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  2. UAV Data Processing for Large Scale Topographical Mapping

    Directory of Open Access Journals (Sweden)

    W. Tampubolon

    2014-06-01

    Full Text Available Large scale topographical mapping in third world countries is a prominent challenge in geospatial industries nowadays. On one side the demand is significantly increasing, while on the other hand it is constrained by the limited budgets available for mapping projects. Since the advent of Act Nr. 4/yr. 2011 on Geospatial Information in Indonesia, large scale topographical mapping has been a high priority for supporting nationwide development, e.g. detailed spatial planning. Usually large scale topographical mapping relies on conventional aerial survey campaigns in order to provide high resolution 3D geospatial data sources. Having grown widely as a leisure hobby, aero models in the form of the so-called Unmanned Aerial Vehicle (UAV) bring up alternative semi-photogrammetric aerial data acquisition possibilities suitable for a relatively small Area of Interest (AOI), i.e. <5,000 hectares. For detailed spatial planning purposes in Indonesia this area size can be used as a mapping unit since it usually concentrates on the sub-district (kecamatan) level. In this paper different camera and processing software systems will be further analyzed to identify the optimum UAV data acquisition campaign components in combination with the data processing scheme. The selected AOI covers the cultural heritage of Borobudur Temple, one of the Seven Wonders of the World. A detailed accuracy assessment will concentrate on the object features of the temple in the first place. Feature compilation involving planimetric objects (2D) and digital terrain models (3D) will be integrated in order to provide Digital Elevation Models (DEM) as the main interest of the topographic mapping activity. In this research, incorporating the optimum number of GCPs in the UAV photo data processing increases the accuracy along with its high resolution of 5 cm Ground Sampling Distance (GSD). Finally this result will be used as the benchmark for alternative

  3. Large-scale solar heating

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Advanced Energy Systems

    1998-10-01

    Solar heating market is growing in many European countries and annually installed collector area has exceeded one million square meters. There are dozens of collector manufacturers and hundreds of firms making solar heating installations in Europe. One tendency in solar heating is towards larger systems. These can be roof integrated, consisting of some tens or hundreds of square meters of collectors, or they can be larger centralized solar district heating plants consisting of a few thousand square meters of collectors. The increase of size can reduce the specific investments of solar heating systems, because e.g. the costs of some components (controllers, pumps, and pipes), planning and installation can be smaller in larger systems. The solar heat output can also be higher in large systems, because more advanced technique is economically viable

  4. Contamination cannot explain the lack of large-scale power in the cosmic microwave background radiation

    International Nuclear Information System (INIS)

    Bunn, Emory F.; Bourdon, Austin

    2008-01-01

    Several anomalies appear to be present in the large-angle cosmic microwave background (CMB) anisotropy maps of the Wilkinson Microwave Anisotropy Probe. One of these is a lack of large-scale power. Because the data otherwise match standard models extremely well, it is natural to consider perturbations of the standard model as possible explanations. We show that, as long as the source of the perturbation is statistically independent of the source of the primary CMB anisotropy, no such model can explain this large-scale power deficit. On the contrary, any such perturbation always reduces the probability of obtaining any given low value of large-scale power. We rigorously prove this result when the lack of large-scale power is quantified with a quadratic statistic, such as the quadrupole moment. When a statistic based on the integrated square of the correlation function is used instead, we present strong numerical evidence in support of the result. The result applies to models in which the geometry of spacetime is perturbed (e.g., an ellipsoidal universe) as well as explanations involving local contaminants, undiagnosed foregrounds, or systematic errors. Because the large-scale power deficit is arguably the most significant of the observed anomalies, explanations that worsen this discrepancy should be regarded with great skepticism, even if they help in explaining other anomalies such as multipole alignments.
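
    The core of the argument can be stated in one line: for statistically independent fields, angular power spectra add. For the quadrupole, for example,

        C_2^{\mathrm{obs}} = C_2^{\mathrm{CMB}} + C_2^{\mathrm{contam}} \ge C_2^{\mathrm{CMB}} ,

    so an independent perturbation raises the expected large-scale power, making a given low observed value less probable rather than more.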

  5. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large proportion may impact on the control of power system balance and thus deviations in the power system...... frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system is a strong concern for secure and stable grid operation. To ensure stable power system operation, the evolving power system has...... to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with the large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  6. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  7. Material design of plasma-enhanced chemical vapour deposition SiCH films for low-k cap layers in the further scaling of ultra-large-scale integrated devices-Cu interconnects.

    Science.gov (United States)

    Shimizu, Hideharu; Nagano, Shuji; Uedono, Akira; Tajima, Nobuo; Momose, Takeshi; Shimogaki, Yukihiro

    2013-10-01

    Cap layers for Cu interconnects in ultra-large-scale integrated devices (ULSIs), with a low dielectric constant (k-value) and strong barrier properties against Cu and moisture diffusion, are required for the future further scaling of ULSIs. There is a trade-off, however, between reducing the k-value and maintaining strong barrier properties. Using quantum mechanical simulations and other theoretical computations, we have designed ideal dielectrics: SiCH films with Si-C2H4-Si networks. Such films were estimated to have low porosity and low k; thus they are the key to realizing a cap layer with a low k and strong barrier properties against diffusion. For fabricating these ideal SiCH films, we designed four novel precursors: isobutyl trimethylsilane, diisobutyl dimethylsilane, 1,1-divinylsilacyclopentane and 5-silaspiro[4.4]nonane, based on quantum chemical calculations, because such fabrication is difficult by controlling only the process conditions in plasma-enhanced chemical vapor deposition (PECVD) using conventional precursors. We demonstrated that SiCH films prepared using these newly designed precursors had large amounts of Si-C2H4-Si networks and strong barrier properties. The pore structure of these films was then analyzed by positron annihilation spectroscopy, revealing that these SiCH films actually had low porosity, as designed. These results validate our material and precursor design concepts for developing a PECVD process capable of fabricating a low-k cap layer.

  8. Material design of plasma-enhanced chemical vapour deposition SiCH films for low-k cap layers in the further scaling of ultra-large-scale integrated devices-Cu interconnects

    Directory of Open Access Journals (Sweden)

    Hideharu Shimizu, Shuji Nagano, Akira Uedono, Nobuo Tajima, Takeshi Momose and Yukihiro Shimogaki

    2013-01-01

    Full Text Available Cap layers for Cu interconnects in ultra-large-scale integrated devices (ULSIs), with a low dielectric constant (k-value) and strong barrier properties against Cu and moisture diffusion, are required for the future further scaling of ULSIs. There is a trade-off, however, between reducing the k-value and maintaining strong barrier properties. Using quantum mechanical simulations and other theoretical computations, we have designed ideal dielectrics: SiCH films with Si–C2H4–Si networks. Such films were estimated to have low porosity and low k; thus they are the key to realizing a cap layer with a low k and strong barrier properties against diffusion. For fabricating these ideal SiCH films, we designed four novel precursors: isobutyl trimethylsilane, diisobutyl dimethylsilane, 1,1-divinylsilacyclopentane and 5-silaspiro[4.4]nonane, based on quantum chemical calculations, because such fabrication is difficult by controlling only the process conditions in plasma-enhanced chemical vapor deposition (PECVD) using conventional precursors. We demonstrated that SiCH films prepared using these newly designed precursors had large amounts of Si–C2H4–Si networks and strong barrier properties. The pore structure of these films was then analyzed by positron annihilation spectroscopy, revealing that these SiCH films actually had low porosity, as designed. These results validate our material and precursor design concepts for developing a PECVD process capable of fabricating a low-k cap layer.

  9. Integrating Data Streams from in-situ Measurements, Social Networks and Satellite Earth Observation to Augment Operational Flood Monitoring and Forecasting: the 2017 Hurricane Season in the Americas as a Large-scale Test Case

    Science.gov (United States)

    Matgen, P.; Pelich, R.; Brangbour, E.; Bruneau, P.; Chini, M.; Hostache, R.; Schumann, G.; Tamisier, T.

    2017-12-01

    Hurricanes Harvey, Irma and Maria generated large streams of heterogeneous data, coming notably from three main sources: imagery (satellite and aircraft), in-situ measurement stations and social media. Interpreting these data streams brings critical information to develop, validate and update prediction models. The study addresses existing gaps in the joint extraction of disaster risk information from multiple data sources and its usefulness for reducing the predictive uncertainty of large-scale flood inundation models. Satellite EO data, most notably the free-of-charge data streams generated by the Copernicus program, provided a wealth of high-resolution imagery covering the large areas affected. Our study focuses on the mapping of flooded areas from a sequence of Sentinel-1 SAR imagery using a classification algorithm recently implemented on the European Space Agency's Grid Processing On Demand environment. The end-to-end processing chain provided fast access to all relevant imagery and effective processing for near-real-time analyses. The classification algorithm was applied on pairs of images to rapidly and automatically detect, record and disseminate all observable changes of water bodies. Disaster information was also retrieved from photos as well as texts contributed on social networks, and the study shows how this information may complement EO and in-situ data and augment information content. As social media data are noisy and difficult to geo-localize, different techniques are being developed to automatically infer associated semantics and geotags. The presentation provides a cross-comparison between the hazard information obtained from the three data sources. We provide examples of how the generated database of geo-localized disaster information was finally integrated into a large-scale hydrodynamic model of the Colorado River emptying into Matagorda Bay on the Gulf of Mexico in order to reduce its predictive uncertainty. We describe the
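    The paper's water-body change detection runs as an operational chain on ESA's Grid Processing On Demand environment; as a much-simplified, hedged stand-in for what such a detector does, a log-ratio threshold on a co-registered Sentinel-1 image pair might look like this (the threshold value and the plain thresholding approach are assumptions, not the authors' algorithm):

```python
import numpy as np

def new_water_mask(sigma0_before: np.ndarray,
                   sigma0_after: np.ndarray,
                   threshold_db: float = -3.0) -> np.ndarray:
    """Flag pixels whose SAR backscatter dropped sharply between acquisitions.

    Open water is smooth and backscatters little, so newly flooded land
    typically shows a strong sigma0 decrease. Inputs are co-registered
    linear-power intensity images; returns a boolean mask of candidate
    new water pixels.
    """
    eps = 1e-10  # guard against log of zero
    log_ratio_db = 10.0 * np.log10((sigma0_after + eps) / (sigma0_before + eps))
    return log_ratio_db < threshold_db
```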

  10. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  11. Large scale molecular simulations of nanotoxicity.

    Science.gov (United States)

    Jimenez-Cruz, Camilo A; Kang, Seung-gu; Zhou, Ruhong

    2014-01-01

    The widespread use of nanomaterials in biomedical applications has been accompanied by an increasing interest in understanding their interactions with tissues, cells, and biomolecules, and in particular, in how they might affect the integrity of cell membranes and proteins. In this mini-review, we present a summary of some of the recent studies on this important subject, especially from the point of view of large scale molecular simulations. The carbon-based nanomaterials and noble metal nanoparticles are the main focus, with additional discussions on quantum dots and other nanoparticles as well. The driving forces for adsorption of fullerenes, carbon nanotubes, and graphene nanosheets onto proteins or cell membranes are found to be mainly hydrophobic interactions and the so-called π-π stacking (between aromatic rings), while for the noble metal nanoparticles the long-range electrostatic interactions play a bigger role. More interestingly, there is also growing evidence showing that nanotoxicity can have implications in the de novo design of nanomedicine. For example, the endohedral metallofullerenol Gd@C₈₂(OH)₂₂ is shown to inhibit tumor growth and metastasis by inhibiting the enzyme MMP-9, and graphene is shown to disrupt bacteria cell membranes by insertion/cutting as well as destructive extraction of lipid molecules. These recent findings have provided a better understanding of nanotoxicity at the molecular level and also suggested therapeutic potential in using the cytotoxicity of nanoparticles against cancer or bacteria cells. © 2014 Wiley Periodicals, Inc.

  12. Large scale digital atlases in neuroscience

    Science.gov (United States)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and, increasingly, its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining these data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and, in addition to atlases of the human brain, includes high-quality atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases employ probabilistic and multimodal techniques, as well as sophisticated visualization software, to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome-wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  13. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena

  14. SDI Large-Scale System Technology Study

    National Research Council Canada - National Science Library

    1986-01-01

    .... This coordination is addressed by the Battle Management function. The algorithms and technologies required to support Battle Management are the subject of the SDC Large Scale Systems Technology Study...

  15. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach...... into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial...... and discuss three challenges to address when dealing with large-scale systems development....

  16. Enhancing microelectronics education with large-scale student projects

    OpenAIRE

    Rumpf, Clemens; Lidtke, Aleksander; Weddell, Alex; Maunder, Rob

    2016-01-01

    This paper discusses the benefits of using large-scale projects, involving many groups of students with different backgrounds, in the education of undergraduate microelectronics engineering students. The benefits of involving students in large, industry-like projects are first briefly reviewed. The organisation of undergraduate programmes is presented, and it is described how students can be involved in such large projects, while maintaining compatibility with undergraduate programmes. The ge...

  17. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai; Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  18. Newton Methods for Large Scale Problems in Machine Learning

    Science.gov (United States)

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  19. Resolute large scale mining company contribution to health services of

    African Journals Online (AJOL)

    Introduction: In 1995 the Tanzanian Government reformed the mining industry and the new policy allowed the involvement of multinational companies, but the communities living near new large-scale gold mines were expected to benefit from the industry in terms of socio-economic development, health, education, employment, safe drinking ...

  20. Firebrands and spotting ignition in large-scale fires

    Science.gov (United States)

    Eunmo Koo; Patrick J. Pagni; David R. Weise; John P. Woycheese

    2010-01-01

    Spotting ignition by lofted firebrands is a significant mechanism of fire spread, as observed in many large-scale fires. The role of firebrands in fire propagation and the important parameters involved in spot fire development are studied. Historical large-scale fires, including wind-driven urban and wildland conflagrations and post-earthquake fires are given as...

  1. Buried Waste Integrated Demonstration stakeholder involvement model

    International Nuclear Information System (INIS)

    Kaupanger, R.M.; Kostelnik, K.M.; Milam, L.M.

    1994-04-01

    The Buried Waste Integrated Demonstration (BWID) is a program funded by the US Department of Energy (DOE) Office of Technology Development. BWID supports the applied research, development, demonstration, and evaluation of a suite of advanced technologies that together form a comprehensive remediation system for the effective and efficient remediation of buried waste. Stakeholder participation in the DOE Environmental Management decision-making process is critical to remediation efforts. Appropriate mechanisms for communication with the public, private sector, regulators, elected officials, and others are being aggressively pursued by BWID to permit informed participation. This document summarizes public outreach efforts during FY-93 and presents a strategy for expanded stakeholder involvement during FY-94

  2. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations however further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal fields induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  3. Large scale structure statistics: Finite volume effects

    Science.gov (United States)

    Colombi, S.; Bouchet, F. R.; Schaeffer, R.

    1994-01-01

    We study finite volume effects on the count probability distribution function P_N(l) and the averaged Q-body correlations ξ̄_Q (2 ≤ Q ≤ 5). These statistics are computed for cubic cells of size l. We use as an example the case of the matter distribution of a cold dark matter (CDM) universe involving approximately 3 x 10^5 particles. The main effect of the finiteness of the sampled volume is to induce an abrupt cut-off on the function P_N(l) at large N. This clear signature makes an analysis of the consequences easy, and one can envisage a correction procedure. As a matter of fact, we demonstrate how an unfair sample can strongly affect the estimates of the functions ξ̄_Q for Q ≥ 3 (and decrease the measured zero of the two-body correlation function). We propose a method to correct for this artifact, or at least to evaluate the corresponding errors. We show that the correlations are systematically underestimated by direct measurements. We find that, once corrected, the statistical properties of the CDM universe appear compatible with the scaling relation S_Q ≡ ξ̄_Q / ξ̄_2^(Q-1) = constant with respect to scale, in the non-linear regime; this was not the case with direct measurements. However, we note a deviation from scaling at scales close to the correlation length. It is probably due to the transition between the highly non-linear regime and the weakly correlated regime, where the functions S_Q also seem to present a plateau. We apply the same procedure to simulations with hot dark matter (HDM) and white noise initial conditions, with similar results. Our method thus provides the first accurate measurement of the normalized skewness, S_3, and the normalized kurtosis, S_4, for three typical models of large-scale structure formation in an expanding universe.
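    The moments used in such an analysis follow from standard counts-in-cells relations; a minimal numpy sketch of the shot-noise-corrected estimators for ξ̄_2, ξ̄_3 and S_3 at one cell size (simplified relative to the paper's full treatment, and without the finite-volume correction it proposes) is:

```python
import numpy as np

def s3_from_counts(counts: np.ndarray) -> float:
    """Estimate S3 = xibar3 / xibar2**2 from counts-in-cells at one cell size.

    Uses the standard shot-noise-corrected moment relations
        <(N - Nbar)**2> = Nbar + Nbar**2 * xibar2
        <(N - Nbar)**3> = Nbar + 3 * Nbar**2 * xibar2 + Nbar**3 * xibar3
    where `counts` is a flat array of galaxy counts in cubic cells of size l.
    """
    nbar = counts.mean()
    mu2 = np.mean((counts - nbar) ** 2)
    mu3 = np.mean((counts - nbar) ** 3)
    xibar2 = (mu2 - nbar) / nbar**2
    xibar3 = (mu3 - nbar - 3.0 * nbar**2 * xibar2) / nbar**3
    return xibar3 / xibar2**2
```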

  4. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  5. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Experience in operating and developing a large-scale computerized system shows that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases

  6. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such a reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).
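    As a hedged sketch of the dimensionality-reduction step described above (using the unsupervised scikit-learn variant as a stand-in for the supervised kernel PCA in the study; the file name and preprocessing are assumptions):

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans

# Hypothetical input: one row per day, columns are the flattened grid of
# vertically integrated moisture flux divergence over the study region.
flux_fields = np.load("moisture_flux_divergence_daily.npy")  # (n_days, n_gridpoints)

# Embed the fields in a low-dimensional space with an RBF-kernel PCA.
embedding = KernelPCA(n_components=3, kernel="rbf").fit_transform(flux_fields)

# Cluster the reduced space to group the circulation patterns preceding floods.
labels = KMeans(n_clusters=4, n_init=10).fit_predict(embedding)
```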

  7. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165. Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöehe Solar Observatory of the University of Graz, A-9521 Treffen,. Austria. e-mail: schroll@solobskh.ac.at.

  8. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects such as the construction of roads, railways and other civil engineering (water)works are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  9. Inflation, large scale structure and particle physics

    Indian Academy of Sciences (India)

    We review experimental and theoretical developments in inflation and its application to structure formation, including the curvaton idea. We then discuss a particle physics model of supersymmetric hybrid inflation at the intermediate scale in which the Higgs scalar field is responsible for large scale structure, show how such ...

  10. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  11. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  12. Large scale photovoltaic field trials. Second technical report: monitoring phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This report provides an update on the Large-Scale Building Integrated Photovoltaic Field Trials (LS-BIPV FT) programme commissioned by the Department of Trade and Industry (now the Department for Business, Enterprise and Regulatory Reform; BERR). It provides detailed profiles of the 12 projects making up this programme, which is part of the UK programme on photovoltaics and has run in parallel with the Domestic Field Trial. These field trials aim to record the experience and use the lessons learnt to raise awareness of, and confidence in, the technology and increase UK capabilities. The projects involved: the visitor centre at the Gaia Energy Centre in Cornwall; a community church hall in London; council offices in West Oxfordshire; a sports science centre at Gloucester University; the visitor centre at Cotswold Water Park; the headquarters of the Insolvency Service; a Welsh Development Agency building; an athletics centre in Birmingham; a research facility at the University of East Anglia; a primary school in Belfast; and Barnstaple civic centre in Devon. The report describes the aims of the field trials, monitoring issues, performance, observations and trends, lessons learnt and the results of occupancy surveys.

  13. Reorganizing Complex Network to Improve Large-Scale Multiagent Teamwork

    Directory of Open Access Journals (Sweden)

    Yang Xu

    2014-01-01

    Full Text Available Large-scale multiagent teamwork has become popular in various domains. Similar to human society infrastructure, agents coordinate with only some of the others, in a peer-to-peer complex network structure. Their organization has been proven to be a key factor influencing their performance. To expedite team performance, we identify three key factors. First, complex network effects may be able to promote team performance. Second, coordination interactions are always routed from their sources toward capable agents. Although they can be transferred across the network via different paths, their sources and sinks depend on the intrinsic nature of the team, which is independent of the network connections. In addition, the agents involved in the same plan often form a subteam and communicate with each other more frequently. Therefore, if the interactions between agents can be statistically recorded, we are able to set up an integrated network adjustment algorithm by combining the three key factors. Based on our abstracted teamwork simulations and the coordination statistics, we implemented the adaptive reorganization algorithm. The experimental results support our design: the reorganized network is more capable of coordinating heterogeneous agents.

  14. Fractals and cosmological large-scale structure

    Science.gov (United States)

    Luo, Xiaochun; Schramm, David N.

    1992-01-01

    Observations of galaxy-galaxy and cluster-cluster correlations as well as other large-scale structure can be fit with a 'limited' fractal with dimension D of about 1.2. This is not a 'pure' fractal out to the horizon: the distribution shifts from power law to random behavior at some large scale. If the observed patterns and structures are formed through an aggregation growth process, the fractal dimension D can serve as an interesting constraint on the properties of the stochastic motion responsible for limiting the fractal structure. In particular, it is found that the observed fractal should have grown from two-dimensional sheetlike objects such as pancakes, domain walls, or string wakes. This result is generic and does not depend on the details of the growth process.
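    A fractal dimension of the kind quoted above can be estimated from pair counts, since the mean number of neighbours within radius r scales as r^D for a fractal point set. A minimal sketch (the choice of radii and the plain least-squares fit are illustrative assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree

def correlation_dimension(points: np.ndarray, radii: np.ndarray) -> float:
    """Estimate the correlation dimension D of a point set.

    For a fractal distribution the pair count C(r) within distance r scales
    as r**D, so D is the slope of log C(r) versus log r.
    points: (n, 3) array of positions; radii: increasing scales to probe.
    """
    tree = cKDTree(points)
    pair_counts = tree.count_neighbors(tree, radii).astype(float)
    slope, _intercept = np.polyfit(np.log(radii), np.log(pair_counts), 1)
    return slope
```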

  15. Large scale processing of dielectric electroactive polymers

    DEFF Research Database (Denmark)

    Vudayagiri, Sindhu

    Efficient processing techniques are vital to the success of any manufacturing industry. The processing techniques determine the quality of the products and thus to a large extent the performance and reliability of the products that are manufactured. The dielectric electroactive polymer (DEAP) technology is relatively new and is in the initial stages of development, with no established large-scale manufacturing techniques. Danfoss Polypower A/S has set up a large-scale manufacturing process to make thin-film DEAP transducers. The DEAP transducers developed by Danfoss Polypower consist...... of microstructured elastomer surfaces on which the compliant metallic electrodes are sputtered, thus enabling large strains of the non-stretchable metal electrode. Thin microstructured polydimethylsiloxane (PDMS) films are quintessential in DEAP technology due to the scaling of their actuation strain with the reciprocal

  16. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  17. Large-scale computer-aided design

    OpenAIRE

    Adeli, Hojjat

    1997-01-01

    The author and his associates have been working on creating novel design theories and computational models with two broad objectives: automation and optimization. This paper is a summary of the author's Keynote Lecture based on research done by the author and his associates recently. Novel neurocomputing algorithms are presented for large-scale computer-aided design and optimization. This research demonstrates how a new level is achieved in design automation through the ingenious use and...

  18. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  19. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  20. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  1. Large scale inhomogeneities and the cosmological principle

    International Nuclear Information System (INIS)

    Lukacs, B.; Meszaros, A.

    1984-12-01

    The compatibility of cosmological principles and possible large-scale inhomogeneities of the Universe is discussed. It seems that the strongest symmetry principle which is still compatible with reasonable inhomogeneities is a full conformal symmetry in the 3-space defined by the cosmological velocity field, but even in such a case the standard model is isolated from the inhomogeneous ones when the whole evolution is considered. (author)

  2. The consistency problems of large scale structure

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1986-01-01

    Studies of the early universe are reviewed, with emphasis on galaxy formation, dark matter and the generation of large scale structure. The paper was presented at the conference on "The early universe and its evolution", Erice, Italy, 1986. Dark matter, Big Bang nucleosynthesis, baryonic halos, flatness arguments, the cosmological constant, galaxy formation, neutrinos plus strings or explosions, and string models are all discussed. (U.K.)

  3. Large-Scale Visual Data Analysis

    Science.gov (United States)

    Johnson, Chris

    2014-04-01

    Modern high performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future high performance visualization research challenges and opportunities.

  4. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  5. A study to solve the variability of wind generation through integration of large-scale hydraulic generation; Um estudo para resolver a variabilidade da geracao eolica atraves da integracao em larga escala com geracao hidraulica

    Energy Technology Data Exchange (ETDEWEB)

    Emmerik, Emanuel Leonardus van; Steinberger, Johann Michael; Aredes, Mauricio [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEE/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Eletrica

    2010-07-01

    The optimal deployment of wind generation together with hydro generation is being investigated as a viable option to help resolve the constraints that lie ahead as a consequence of the trend toward tapping the Brazilian Amazon basin for the expansion of generating facilities. This work focuses on that line of research: it shows the value of feasibility studies on using hydro generation to offset the variability of wind generation when wind is deployed on a large scale. Preliminary results are presented for the variability of wind generation over various cycles and for the variability of hydropower availability. (author)

  6. Application of simplified models to CO2 migration and immobilization in large-scale geological systems

    KAUST Repository

    Gasda, Sarah E.

    2012-07-01

    Long-term stabilization of injected carbon dioxide (CO 2) is an essential component of risk management for geological carbon sequestration operations. However, migration and trapping phenomena are inherently complex, involving processes that act over multiple spatial and temporal scales. One example involves centimeter-scale density instabilities in the dissolved CO 2 region leading to large-scale convective mixing that can be a significant driver for CO 2 dissolution. Another example is the potentially important effect of capillary forces, in addition to buoyancy and viscous forces, on the evolution of mobile CO 2. Local capillary effects lead to a capillary transition zone, or capillary fringe, where both fluids are present in the mobile state. This small-scale effect may have a significant impact on large-scale plume migration as well as long-term residual and dissolution trapping. Computational models that can capture both large and small-scale effects are essential to predict the role of these processes on the long-term storage security of CO 2 sequestration operations. Conventional modeling tools are unable to resolve sufficiently all of these relevant processes when modeling CO 2 migration in large-scale geological systems. Herein, we present a vertically-integrated approach to CO 2 modeling that employs upscaled representations of these subgrid processes. We apply the model to the Johansen formation, a prospective site for sequestration of Norwegian CO 2 emissions, and explore the sensitivity of CO 2 migration and trapping to subscale physics. Model results show the relative importance of different physical processes in large-scale simulations. The ability of models such as this to capture the relevant physical processes at large spatial and temporal scales is important for prediction and analysis of CO 2 storage sites. © 2012 Elsevier Ltd.

  7. Computational Approach to large Scale Process Optimization through Pinch Analysis

    Directory of Open Access Journals (Sweden)

    Nasser Al-Azri

    2015-08-01

    Full Text Available Since its debut in the last quarter of the twentieth century, pinch technology has become an established tool for efficient and cost-effective engineering process design. This method allows the integration of mass and heat streams in such a way that minimizes waste and the external purchase of mass and utilities. Moreover, integrating process streams internally will minimize fuel consumption and hence carbon emissions to the atmosphere. This paper discusses a programmable approach to the design of mass and heat exchange networks that can be used easily for large-scale engineering processes.
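    At the heart of such a programmable approach sits the problem-table (heat cascade) algorithm; the sketch below shows its core steps for heat integration only (the stream data and ΔTmin value are illustrative, and a mass-exchange network would need the analogous composition-based cascade):

```python
def heat_cascade(streams, dt_min=10.0):
    """Problem-table algorithm: minimum utilities and pinch temperature.

    streams: list of (T_supply, T_target, CP) with CP in kW/K. Hot streams
    (cooling down) are shifted by -dt_min/2 and cold streams by +dt_min/2,
    so any feasible match keeps at least dt_min of driving force.
    """
    shifted = [(t_in - dt_min / 2, t_out - dt_min / 2, cp) if t_in > t_out
               else (t_in + dt_min / 2, t_out + dt_min / 2, cp)
               for t_in, t_out, cp in streams]

    bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)
    cascade = [0.0]
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0
        for t_in, t_out, cp in shifted:
            if min(t_in, t_out) <= lo and max(t_in, t_out) >= hi:
                # Hot streams release heat (+), cold streams absorb it (-).
                net += cp * (hi - lo) * (1.0 if t_in > t_out else -1.0)
        cascade.append(cascade[-1] + net)

    q_hot = -min(min(cascade), 0.0)          # heat added at the top interval
    feasible = [q_hot + q for q in cascade]
    q_cold = feasible[-1]
    pinch = bounds[feasible.index(min(feasible))]  # shifted pinch temperature
    return q_hot, q_cold, pinch

# Two hot and two cold streams (illustrative textbook-style data).
print(heat_cascade([(250, 40, 0.15), (200, 80, 0.25),
                    (20, 180, 0.20), (140, 230, 0.30)]))
```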

  8. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with the majority of images having a proportion larger than one, but less than e.g. the golden ratio. Furthermore, more images have the inverse proportion, meaning that portrait paintings are more common than landscape paintings. The inverse is true for photographs, i.e. more landscape than portrait format...

  9. Large scale phononic metamaterials for seismic isolation

    Energy Technology Data Exchange (ETDEWEB)

    Aravantinos-Zafiris, N. [Department of Sound and Musical Instruments Technology, Ionian Islands Technological Educational Institute, Stylianou Typaldou ave., Lixouri 28200 (Greece); Sigalas, M. M. [Department of Materials Science, University of Patras, Patras 26504 (Greece)

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to compute the band structures of the proposed metamaterials.

  10. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives are overlapped within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
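    A hedged sketch of the style-selection rule implied above (the density thresholds and style names are illustrative assumptions, not the authors' values):

```python
def choose_graph_style(num_nodes: int, region_area_px: float) -> str:
    """Pick a drawing style for a screen region from its nodal density."""
    density = num_nodes / region_area_px  # nodes per pixel of display area
    if density < 0.0005:
        return "node-link"         # sparse: full node-link diagram fits
    if density < 0.005:
        return "icicle"            # denser: compact space-filling layout
    return "density-heatmap"       # too crowded: aggregate into a heatmap
```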

  11. Large scale phononic metamaterials for seismic isolation

    International Nuclear Information System (INIS)

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-01-01

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to compute the band structures of the proposed metamaterials

  12. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  13. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRPs) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRPs alike. Nevertheless, most projects and universities are challenged with little experience of how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There are a variety of measures suited to supporting universities in international recruitment. These include e.g. institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups, and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes in order to be able to recruit, support and keep the brightest heads for the project.

  14. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates a new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  15. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). These observations have led us to a great deal of consensus on the cosmological model, the so-called LambdaCDM, and to tight constraints on the cosmological parameters of the model. On the other hand, this advancement in cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, quantified by its direction and amplitude. We measured a huge dipolar modulation in the CMB, which mainly originates from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a large number of well-identified objects. In this thesis, we explore the measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE sources matched with 2MASS. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for the detection of the kinematic dipole in future surveys.
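    The basic estimator behind such a measurement can be written as a least-squares fit of a monopole-plus-dipole model to pixelized source counts; a minimal sketch (equal-area pixels and the simple linear model are assumptions, ignoring masks and shot-noise weighting):

```python
import numpy as np

def fit_dipole(pixel_dirs: np.ndarray, counts: np.ndarray):
    """Least-squares fit of N(nhat) = Nbar * (1 + d . nhat).

    pixel_dirs: (npix, 3) unit vectors of equal-area sky pixels.
    counts: (npix,) source counts per pixel.
    Returns the dipole amplitude |d| and its direction (unit vector).
    """
    # Design matrix: a constant monopole plus the three dipole components.
    design = np.column_stack([np.ones(len(counts)), pixel_dirs])
    coeffs, *_ = np.linalg.lstsq(design, counts, rcond=None)
    d = coeffs[1:] / coeffs[0]  # normalize by the fitted monopole Nbar
    amp = np.linalg.norm(d)
    return amp, d / amp
```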

  16. Large-scale Intelligent Transporation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and of Traffic Management Centers (TMCs). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  17. Large-scale impacts of hydroelectric development

    International Nuclear Information System (INIS)

    Rosenberg, D.M.; Bodaly, R.A.; Hecky, R.E.; Rudd, J.W.M.; Berkes, F.; Kelly, C.A.

    1997-01-01

    A study was conducted in which the cumulative environmental effects of mega-hydroelectric development projects such as the James Bay development in Canada, the Sardar Sarovar development in India and the Three Gorges development in China were examined. The extent of flooding as a result of these projects and of many others around the world was presented. The study showed that several factors are responsible for methyl mercury (MeHg) bioaccumulation in reservoirs. The study also revealed that reservoirs can be a significant source of greenhouse gas emissions. Boreal forests in particular, when flooded, become a strong source of greenhouse gases to the atmosphere. This results from the fact that after flooding a boreal forest changes from being a small carbon sink to a large source of carbon to the atmosphere, due to stimulated microbial production of CO2 and CH4 by decomposition of plant tissues and peat. This increased decomposition also results in an increase of another microbial activity, namely the methylation of inorganic mercury to the much more toxic MeHg. Selected examples of the downstream effects of altered flows caused by large-scale hydroelectric developments world-wide were summarized. A similar tabulation provided examples of social impacts of relocation of people necessitated by large-scale hydroelectric development. 209 refs., 10 tabs., 3 figs

  18. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming, manual undertaking. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  19. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed by using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using algorithms written in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. To test the performance of the package, the Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interactions were each solved with the GPU implementation. Comparing the solver running on a single CPU with the one on a GPU showed the GPU version to be up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
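
    As a rough illustration of the semi-implicit Fourier method underlying the package, the sketch below advances the Allen-Cahn equation, dφ/dt = -M(φ³ - φ - κ∇²φ), on a periodic grid: the nonlinear bulk term is treated explicitly and the gradient term implicitly in Fourier space. It is written with NumPy; on a GPU one would swap in a drop-in array library such as CuPy (an assumption for illustration, not the paper's CUDA implementation).

      import numpy as np

      N, dx, dt, M, kappa = 128, 1.0, 0.1, 1.0, 1.0
      phi = 0.1 * (np.random.rand(N, N) - 0.5)      # small random initial field

      k = 2.0 * np.pi * np.fft.fftfreq(N, d=dx)
      k2 = k[:, None]**2 + k[None, :]**2            # squared wavenumber |k|^2

      for step in range(1000):
          nonlinear = phi**3 - phi                  # bulk driving force, explicit
          phi_hat = np.fft.fft2(phi)
          # Laplacian term treated implicitly for stability:
          phi_hat = (phi_hat - dt * M * np.fft.fft2(nonlinear)) \
                    / (1.0 + dt * M * kappa * k2)
          phi = np.real(np.fft.ifft2(phi_hat))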

  20. Generalized Hermite-Hadamard type inequalities involving fractional integral operators.

    Science.gov (United States)

    Set, Erhan; Noor, Muhammed Aslam; Awan, Muhammed Uzair; Gözpinar, Abdurrahman

    2017-01-01

    In this article, a new general integral identity involving generalized fractional integral operators is established. With the help of this identity, new Hermite-Hadamard type inequalities are obtained for functions whose absolute values of derivatives are convex. As a consequence, the main results of this paper generalize the existing Hermite-Hadamard type inequalities involving the Riemann-Liouville fractional integral.
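
    For reference, the classical Hermite-Hadamard inequality for a convex function f on [a, b], the Riemann-Liouville fractional integrals, and the known fractional version of the inequality (due to Sarikaya et al.) read, in LaTeX notation:

      f\!\left(\frac{a+b}{2}\right) \le \frac{1}{b-a}\int_a^b f(x)\,dx \le \frac{f(a)+f(b)}{2},

      J_{a+}^{\alpha} f(x) = \frac{1}{\Gamma(\alpha)}\int_a^x (x-t)^{\alpha-1} f(t)\,dt,\qquad
      J_{b-}^{\alpha} f(x) = \frac{1}{\Gamma(\alpha)}\int_x^b (t-x)^{\alpha-1} f(t)\,dt,

      f\!\left(\frac{a+b}{2}\right) \le \frac{\Gamma(\alpha+1)}{2(b-a)^{\alpha}}
      \left[ J_{a+}^{\alpha} f(b) + J_{b-}^{\alpha} f(a) \right] \le \frac{f(a)+f(b)}{2}.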

  1. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

    In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop-type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2, design improvements aimed at further cost reduction and at establishing the plant concept have been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics identified in the previous fiscal year were examined and the plant concept was modified. Furthermore, fundamental specifications of the main systems and components were set and the economics were evaluated. In addition, for the interim evaluation of the candidate concepts of the FBR fuel cycle, cost effectiveness and achievability of the development goal were evaluated and data for the three large-scale reactor candidate concepts were prepared. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which promises to satisfy the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and to resolve the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection for narrowing down candidate concepts at the end of Phase 2. (author)

  2. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    Linton, M.; Khalatbari, A.

    2011-01-01

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large-scale epidemiological study has been launched by Fukushima Medical University. It concerns the two million inhabitants of the Fukushima Prefecture. On the national level and with the support of public funds, medical care and follow-up, as well as systematic controls, are foreseen, notably to check the thyroids of 360,000 young people under 18 years of age and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low measured values, and because they know that some parts of the area are at least as contaminated as the area around Chernobyl, some people are reluctant to go back home.

  3. Large scale study of tooth enamel

    International Nuclear Information System (INIS)

    Bodart, F.; Deconninck, G.; Martin, M.T.

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces over a large-scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analyzed using PIXE, backscattering and nuclear reaction techniques. The results were analyzed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population. (author)

  4. On the Phenomenology of an Accelerated Large-Scale Universe

    Directory of Open Access Journals (Sweden)

    Martiros Khurshudyan

    2016-10-01

    Full Text Available In this review paper, several new results towards the explanation of the accelerated expansion of the large-scale universe are discussed. Inflation, on the other hand, is the early-time accelerated era, and the universe is symmetric in the sense of accelerated expansion. The accelerated expansion is one of the long-standing problems in modern cosmology, and in physics in general. There are several well-defined approaches to solving this problem. One of them is the assumption that dark energy exists in the recent universe. It is believed that dark energy is responsible for antigravity, while dark matter has gravitational nature and is responsible, in general, for structure formation. A different approach is an appropriate modification of general relativity including, for instance, f(R) and f(T) theories of gravity. On the other hand, attempts to build theories of quantum gravity and assumptions about the existence of extra dimensions, possible variability of the gravitational constant and of the speed of light (among others) provide interesting modifications of general relativity applicable to problems of modern cosmology, too. In particular, two groups of cosmological models are discussed here. In the first group the problem of the accelerated expansion of the large-scale universe is discussed involving a new idea, named varying ghost dark energy. The second group contains cosmological models addressing the same problem involving either new parameterizations of the equation-of-state parameter of dark energy (like the varying polytropic gas), or nonlinear interactions between dark energy and dark matter. Moreover, for cosmological models involving varying ghost dark energy, massless particle creation in an appropriate radiation dominated universe (when the background dynamics is due to general relativity) is demonstrated as well. Exploring the nature of the accelerated expansion of the large-scale universe involving generalized

  5. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    Full Text Available In this study, LIDAR DEM data were used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority), considering the minimum-area condition of cartographic generalization, in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually against a hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory depictions of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales can be obtained with the proposed methodology, including generalization, using LIDAR DEM.
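
    The majority-based generalization step can be reproduced outside a GIS; the minimal Python sketch below reimplements the idea with SciPy (the study itself uses the Focal Statistics (Majority) tool, so this is an equivalent stand-in, not the authors' workflow).

      import numpy as np
      from scipy import ndimage

      def focal_majority(landform, size=5):
          """Replace each cell by the most frequent class in its neighborhood."""
          def majority(window):
              values, counts = np.unique(window, return_counts=True)
              return values[np.argmax(counts)]
          return ndimage.generic_filter(landform, majority, size=size)

      primary = np.random.randint(0, 10, size=(200, 200))  # stand-in classified DEM
      generalized = focal_majority(primary, size=5)        # generalized landform map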

  6. [Stress management in large-scale establishments].

    Science.gov (United States)

    Fukasawa, Kenji

    2002-07-01

    Due to a recent dramatic change in industrial structures in Japan, the role of large-scale enterprises is changing. Mass production used to be the major income source of companies, but the emphasis has now shifted to high-value-added products, including software development. As a consequence of highly competitive inter-corporate development, there are various sources of job stress which induce health problems in employees, especially those involved in development or management. Simply obeying the law or offering medical care is not enough to manage these problems. Occupational health staff need to act according to the type of disorder and provide care with support from the supervisor and the Personnel Division. For training, development and the consultation system, occupational health staff must also work with the Personnel Division and Safety Division, and be approved by management supervisors.

  7. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The design of the mechanical structure of the device covers the optical axis design, the drive part, the fixture device and the wheel design. The design of the control system covers the hardware and the software: the hardware mainly uses a single-chip system, and the software implements the processing for the photoelectric autocollimator and the automatic data acquisition. The device can acquire vertical measurement data automatically. The reliability of the device is verified by experimental comparison. The conclusion meets the requirements of the right-angle test procedure.

  8. Modelling large-scale hydrogen infrastructure development

    International Nuclear Information System (INIS)

    De Groot, A.; Smit, R.; Weeda, M.

    2005-08-01

    In modelling a possible H2 infrastructure development, the following questions are answered in this presentation: How could the future demand for H2 develop in the Netherlands? And in which year, and where, would it be economically viable to construct a H2 infrastructure in the Netherlands? The conclusions are that: a model for describing a possible future H2 infrastructure has been successfully developed; the model is strongly regional and time dependent; the decrease of fuel cell cost appears to be a sensitive parameter for the development of H2 demand; the cost margin between large-scale and small-scale H2 production is a main driver for the development of a H2 infrastructure; and a H2 infrastructure seems economically viable in the Netherlands starting from the year 2022.

  9. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog-to-digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. The circuits are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining an adequate signal-to-noise ratio. Noise, in both the amplitude and time-jitter senses, is held sufficiently low that conversions with 10-bit charge resolution and 12-bit time resolution are achieved.

  10. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
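
    For orientation, the quadratic programming subproblem solved at each SQP iterate x_k has the standard form below (LaTeX notation); the reduced-Hessian idea of the abstract amounts to approximating only Z_k^T H_k Z_k, where the columns of Z_k span the null space of the working-set constraint Jacobian:

      \min_{p}\; \nabla f(x_k)^T p + \tfrac{1}{2}\, p^T H_k\, p
      \quad \text{s.t.}\quad c_E(x_k) + A_E(x_k)\, p = 0,\qquad
      c_I(x_k) + A_I(x_k)\, p \ge 0 .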

  11. A Modeling & Simulation Implementation Framework for Large-Scale Simulation

    Directory of Open Access Journals (Sweden)

    Song Xiao

    2012-10-01

    Full Text Available Classical High Level Architecture (HLA) systems face development problems due to their lack of support for fine-grained component integration and interoperation in large-scale complex simulation applications. To address this issue efficiently, an extensible, reusable and composable simulation framework is proposed. To promote reusability from coarse-grained federates to fine-grained components, this paper proposes a modelling & simulation framework which consists of a component-based architecture, modelling methods, and simulation services to support and simplify the construction of complex simulation applications. Moreover, a standard process and simulation tools are developed to ensure the rapid and effective development of simulation applications.

  12. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    survival and recruitment estimates from the French CES scheme to assess the relative contributions of survival and recruitment to overall population changes. He develops a novel approach to modelling survival rates from such multi–site data by using within–year recaptures to provide a covariate of between–year recapture rates. This provided parsimonious models of variation in recapture probabilities between sites and years. The approach provides promising results for the four species investigated and can potentially be extended to similar data from other CES/MAPS schemes. The final paper by Blandine Doligez, David Thomson and Arie van Noordwijk (Doligez et al., 2004) illustrates how large-scale studies of population dynamics can be important for evaluating the effects of conservation measures. Their study is concerned with the reintroduction of White Stork populations to the Netherlands, where a re–introduction programme started in 1969 had resulted in a breeding population of 396 pairs by 2000. They demonstrate the need to consider a wide range of models in order to account for potential age, time, cohort and “trap–happiness” effects. As the data are based on resightings, such trap–happiness must reflect some form of heterogeneity in resighting probabilities. Perhaps surprisingly, the provision of supplementary food did not influence survival, but it may have had an indirect effect via the alteration of migratory behaviour. Spatially explicit modelling of data gathered at many sites inevitably results in starting models with very large numbers of parameters. The problem is often complicated further by having relatively sparse data at each site, even where the total amount of data gathered is very large. Both Julliard (2004) and Doligez et al. (2004) give explicit examples of problems caused by needing to handle very large numbers of parameters and show how they overcame them for their particular data sets. Such problems involve both the choice of appropriate

  13. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    likelihood that a change in the activity of one neuronal population affects the activity in another. We argue that these measures access the inherently probabilistic nature of causal influences in the brain, and are thus better suited for large-scale brain network analysis than are DC-based measures. Our work is consistent with recent advances in the philosophical study of probabilistic causality, which originated from inherent conceptual problems with deterministic regularity theories. It also resonates with concepts of stochasticity that were involved in establishing modern physics. In summary, we argue that probabilistic causality is a conceptually appropriate foundation for describing neural causality in the brain.

  14. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large-scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) = v²/2 − M cos x − P cos k(x−t), which describes the motion of one particle in two electrostatic waves. A renormalization transformation T_r is described which acts as a microscope that focusses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, T_r yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance, the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation describing the stability of a discrete family of cycles is derived. When combined with T_r, it allows one to prove the link between KAM tori and nearby cycles, conjectured by J. Greene, and, in particular, to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)
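
    For reference, Chirikov's standard mapping mentioned above takes the form below (LaTeX notation), with large-scale stochasticity setting in near K ≈ 0.97 according to Greene's criterion:

      p_{n+1} = p_n + K \sin\theta_n, \qquad
      \theta_{n+1} = \theta_n + p_{n+1} \pmod{2\pi}.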

  15. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ'_S, with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary one. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large-scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the more closely mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ'_S in the linear regime, and generally in the nonlinear regime because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter

  16. Food appropriation through large scale land acquisitions

    International Nuclear Information System (INIS)

    Cristina Rulli, Maria; D’Odorico, Paolo

    2014-01-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300–550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190–370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations. (letter)

  17. Large-scale tides in general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Ip, Hiu Yan; Schmidt, Fabian, E-mail: iphys@mpa-garching.mpg.de, E-mail: fabians@mpa-garching.mpg.de [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the 'separate universe' paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  18. Food appropriation through large scale land acquisitions

    Science.gov (United States)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations.

  19. Dynamic Modeling, Optimization, and Advanced Control for Large Scale Biorefineries

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail

    Second generation biorefineries transform agricultural wastes into biochemicals with higher added value, e.g. bioethanol, which is thought to become a primary component in liquid fuels [1]. Extensive endeavors have been conducted to make the production process feasible on a large scale, and recent ... real-time monitoring. The Inbicon biorefinery converts wheat straw into bioethanol utilizing steam, enzymes, and genetically modified yeast. The biomass is first pretreated in a steam-pressurized and continuous thermal reactor where lignin is relocated and hemicellulose partially hydrolyzed such that cellulose becomes more accessible to enzymes. The biorefinery is integrated with a nearby power plant following the Integrated Biomass Utilization System (IBUS) principle for reducing steam costs [4]. During the pretreatment, by-products are also created, such as organic acids, furfural, and pseudo-lignin, which act...

  20. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous

  1. Large scale molecular dynamics simulations of nuclear pasta

    Science.gov (United States)

    Horowitz, C. J.; Berry, D.; Briggs, C.; Chapman, M.; Clark, E.; Schneider, A.

    2014-09-01

    We report large-scale molecular dynamics simulations of nuclear pasta using from 50,000 to more than 3,000,000 nucleons. We use a simple phenomenological two-nucleon potential that reproduces nuclear saturation. We find a complex "nuclear waffle" phase in addition to more conventional rod, plate, and sphere phases. We also find long-lived topological defects involving screw-like dislocations that may reduce the electrical and thermal conductivity of lasagna phases. From the MD trajectories we calculate a variety of quantities including the static structure factor, dynamical response function, shear modulus and breaking strain. Supported in part by DOE Grants No. DE-FG02-87ER40365 (Indiana University) and No. DE-SC0008808 (NUCLEI SciDAC Collaboration).
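
    A minimal sketch of the molecular dynamics machinery involved (velocity-Verlet integration of a pairwise potential) is given below in Python; the Gaussian stand-in potential and all parameters are illustrative assumptions, not the paper's two-nucleon potential.

      import numpy as np

      def forces(pos, eps=1.0, sigma=1.0):
          """Pairwise repulsive forces for V(r) = eps * exp(-r^2 / sigma^2)."""
          disp = pos[:, None, :] - pos[None, :, :]    # r_i - r_j for all pairs
          r2 = (disp**2).sum(-1) + np.eye(len(pos))   # pad diagonal to avoid 0/0
          v = eps * np.exp(-r2 / sigma**2)
          f = (2.0 / sigma**2) * v[..., None] * disp  # F_i = -grad_i V, per pair
          f[np.eye(len(pos), dtype=bool)] = 0.0       # no self-force
          return f.sum(axis=1)

      def verlet_step(pos, vel, dt=0.01, mass=1.0):
          """One velocity-Verlet time step."""
          f = forces(pos)
          vel_half = vel + 0.5 * dt * f / mass
          pos_new = pos + dt * vel_half
          vel_new = vel_half + 0.5 * dt * forces(pos_new) / mass
          return pos_new, vel_new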

  2. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  3. Developing Large-Scale Bayesian Networks by Composition

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, we investigate the use of Bayesian networks to construct large-scale diagnostic systems. In particular, we consider the development of large-scale...

  4. Ecohydrological modeling for large-scale environmental impact assessment.

    Science.gov (United States)

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach-level accuracy similar to regional-scale models, thereby allowing for impact assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in development of adaptive neuro-fuzzy inference systems (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Large scale dynamics of protoplanetary discs

    Science.gov (United States)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other hand, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the disk and the magnetic field is mediated by electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences on planetary formation. As a second step, I study the launching of disk winds via a global model of a stratified disk embedded in a warm atmosphere. This model is the first to compute non-ideal effects from

  6. Large-scale fuel cycle centres

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The US Nuclear Regulatory Commission (NRC) has considered the nuclear energy centre concept for fuel cycle plants in the Nuclear Energy Centre Site Survey 1975 (NECSS-75) Rep. No. NUREG-0001, an important study mandated by the US Congress in the Energy Reorganization Act of 1974 which created the NRC. For this study, the NRC defined fuel cycle centres as consisting of fuel reprocessing and mixed-oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle centre sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000 MW(e). The types of fuel cycle facilities located at the fuel cycle centre permit the assessment of the role of fuel cycle centres in enhancing the safeguarding of strategic special nuclear materials - plutonium and mixed oxides. Siting fuel cycle centres presents a smaller problem than siting reactors. A single reprocessing plant of the scale projected for use in the USA (1500-2000 t/a) can reprocess fuel from reactors producing 50,000-65,000 MW(e). Only two or three fuel cycle centres of the upper limit size considered in the NECSS-75 would be required in the USA by the year 2000. The NECSS-75 fuel cycle centre evaluation showed that large-scale fuel cycle centres present no real technical siting difficulties from a radiological effluent and safety standpoint. Some construction economies may be achievable with fuel cycle centres, which offer opportunities to improve waste-management systems. Combined centres consisting of reactors and fuel reprocessing and mixed-oxide fuel fabrication plants were also studied in the NECSS. Such centres can eliminate shipment not only of Pu but also of mixed-oxide fuel. Increased fuel cycle costs result from implementation of combined centres unless the fuel reprocessing plants are commercial-sized. Development of Pu-burning reactors could reduce any economic penalties of combined centres. The need for effective fissile

  7. Statistical Analysis of Large-Scale Structure of Universe

    Science.gov (United States)

    Tugay, A. V.

    While galaxy cluster catalogs were compiled many decades ago, other structural elements of the cosmic web have been detected with confidence only in the most recent works. For example, extragalactic filaments have been described by velocity fields and by the SDSS galaxy distribution in recent years. The large-scale structure of the Universe could also be mapped in the future using ATHENA observations in X-rays and SKA in the radio band. Until detailed observations are available for most of the volume of the Universe, some integral statistical parameters can be used for its description. Methods such as the galaxy correlation function, the power spectrum, statistical moments and peak statistics are commonly used for this purpose. The parameters of the power spectrum and other statistics are important for constraining models of dark matter, dark energy, inflation and brane cosmology. In the present work we describe the growth of large-scale density fluctuations in the one- and three-dimensional cases with Fourier harmonics of hydrodynamical parameters. As a result we obtain a power-law relation for the matter power spectrum.
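
    As an illustration of one such integral statistic, the Python sketch below estimates the spherically averaged power spectrum P(k) of an overdensity field; the binning choices are illustrative assumptions, not the paper's pipeline.

      import numpy as np

      def power_spectrum(delta, boxsize=1.0, nbins=16):
          """Spherically averaged P(k) of the overdensity field delta."""
          n = delta.shape[0]
          pk3d = np.abs(np.fft.fftn(delta))**2 / delta.size
          k = 2.0 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
          kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
          kmag = np.sqrt(kx**2 + ky**2 + kz**2)
          bins = np.linspace(0.0, kmag.max(), nbins + 1)
          idx = np.digitize(kmag.ravel(), bins)
          # average the 3-D power in spherical shells (empty shells give NaN)
          pk = np.array([pk3d.ravel()[idx == i].mean() if np.any(idx == i)
                         else np.nan for i in range(1, nbins + 1)])
          return 0.5 * (bins[1:] + bins[:-1]), pk

      delta = np.random.randn(32, 32, 32)   # stand-in Gaussian density field
      k_centers, pk = power_spectrum(delta)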

  8. The predictability of large-scale wind-driven flows

    Directory of Open Access Journals (Sweden)

    A. Mahadevan

    2001-01-01

    Full Text Available The singular values associated with optimally growing perturbations to stationary and time-dependent solutions for the general circulation in an ocean basin provide a measure of the rate at which solutions with nearby initial conditions begin to diverge, and hence, a measure of the predictability of the flow. In this paper, the singular vectors and singular values of stationary and evolving examples of wind-driven, double-gyre circulations in different flow regimes are explored. By changing the Reynolds number in simple quasi-geostrophic models of the wind-driven circulation, steady, weakly aperiodic and chaotic states may be examined. The singular vectors of the steady state reveal some of the physical mechanisms responsible for optimally growing perturbations. In time-dependent cases, the dominant singular values show significant variability in time, indicating strong variations in the predictability of the flow. When the underlying flow is weakly aperiodic, the dominant singular values co-vary with integral measures of the large-scale flow, such as the basin-integrated upper ocean kinetic energy and the transport in the western boundary current extension. Furthermore, in a reduced gravity quasi-geostrophic model of a weakly aperiodic, double-gyre flow, the behaviour of the dominant singular values may be used to predict a change in the large-scale flow, a feature not shared by an analogous two-layer model. When the circulation is in a strongly aperiodic state, the dominant singular values no longer vary coherently with integral measures of the flow. Instead, they fluctuate in a very aperiodic fashion on mesoscale time scales. The dominant singular vectors then depend strongly on the arrangement of mesoscale features in the flow and the evolved forms of the associated singular vectors have relatively short spatial scales. These results have several implications. In weakly aperiodic, periodic, and stationary regimes, the mesoscale energy

  9. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  10. Literature Review: Herbal Medicine Treatment after Large-Scale Disasters.

    Science.gov (United States)

    Takayama, Shin; Kaneko, Soichiro; Numata, Takehiro; Kamiya, Tetsuharu; Arita, Ryutaro; Saito, Natsumi; Kikuchi, Akiko; Ohsawa, Minoru; Kohayagawa, Yoshitaka; Ishii, Tadashi

    2017-01-01

    Large-scale natural disasters, such as earthquakes, tsunamis, volcanic eruptions, and typhoons, occur worldwide. After the Great East Japan earthquake and tsunami, our medical support operation's experiences suggested that traditional medicine might be useful for treating the various symptoms of the survivors. However, little information is available regarding herbal medicine treatment in such situations. Considering that further disasters will occur, we performed a literature review and summarized the traditional medicine approaches for treatment after large-scale disasters. We searched PubMed and Cochrane Library for articles written in English, and Ichushi for those written in Japanese. Articles published before 31 March 2016 were included. Keywords "disaster" and "herbal medicine" were used in our search. Among studies involving herbal medicine after a disaster, we found two randomized controlled trials investigating post-traumatic stress disorder (PTSD), three retrospective investigations of trauma or common diseases, and seven case series or case reports of dizziness, pain, and psychosomatic symptoms. In conclusion, herbal medicine has been used to treat trauma, PTSD, and other symptoms after disasters. However, few articles have been published, likely due to the difficulty in designing high quality studies in such situations. Further study will be needed to clarify the usefulness of herbal medicine after disasters.

  11. Large-scale assembly of colloidal particles

    Science.gov (United States)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention of large-area, low-cost color reflective displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  12. Variability of load and net load in case of large scale distributed wind power

    OpenAIRE

    Holttinen, Hannele; Kiviluoma, J.; Estanqueiro, Ana; Gómez-Lázaro, E.; Raw, Barry; Dobschinski, Jan; Meibon, Peter; Lannoye, Eamonn; Aigner, Tobias; Wan, Yih H.; Milligan, Michael

    2011-01-01

    Large scale wind power production and its variability is one of the major inputs to wind integration studies. This paper analyses measured data from large scale wind power production. Comparisons of variability are made across several variables: time scale (10-60 minute ramp rates),number of wind farms, and simulated vs. modeled data. Ramp rates for Wind power production, Load (total system load) and Net load (load minus wind power production) demonstrate how wind power increases the...

  13. Variability of load and net load in case of large scale distributed wind power

    OpenAIRE

    Holttinen, Hannele; Kiviluoma, Juha; Estanqueiro, Ana; Aigner, Tobias; Wan, Yih-Huei; Milligan, Michael R.

    2010-01-01

    Large scale wind power production and its variability is one of the major inputs to wind integration studies. This paper analyses measured data from large scale wind power production. Comparisons of variability are made across several variables: time scale (10-60 minute ramp rates), number of wind farms, and simulated vs. modeled data. Ramp rates for Wind power production, Load (total system load) and Net load (load minus wind power production) demonstrate how wind power increases the net loa...

  14. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design, problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full-scale team involving the development and operations sides of the company, two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary comp

  15. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    A power balancing strategy based on Douglas-Rachford splitting is proposed as a control method for large-scale integration of flexible consumers in a Smart Grid. The total power consumption is controlled through a negotiation procedure between all units and a coordinating system level. The balancing
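
    To make the splitting idea concrete, the Python sketch below solves a toy balancing problem, quadratic "discomfort" costs for each flexible unit coupled by a total-demand constraint, with a Douglas-Rachford iteration; the cost model and parameters are illustrative assumptions, not the paper's negotiation scheme.

      import numpy as np

      a = np.array([1.0, 2.0, 0.5])      # discomfort weights of three flexible units
      c = np.array([3.0, 1.0, 2.0])      # preferred consumption of each unit
      demand = 4.0                       # total consumption requested by the grid

      def prox_f(v, lam=1.0):
          """Prox of the separable local costs f_i(x) = a_i/2 * (x - c_i)^2."""
          return (v + lam * a * c) / (1.0 + lam * a)

      def project_balance(v):
          """Projection onto the coupling constraint sum(x) = demand."""
          return v + (demand - v.sum()) / v.size

      z = np.zeros_like(c)
      for _ in range(200):               # Douglas-Rachford iteration
          x = prox_f(z)                  # local, per-unit step
          y = project_balance(2 * x - z) # system-level balancing step
          z = z + y - x

      print(prox_f(z), prox_f(z).sum())  # balanced consumptions, summing to demand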

  16. Deep Feature Learning and Cascaded Classifier for Large Scale Data

    DEFF Research Database (Denmark)

    Prasoon, Adhish

    This thesis focuses on voxel/pixel classification based approaches for image segmentation. The main application is segmentation of articular cartilage in knee MRIs. The first major contribution of the thesis deals with large-scale machine learning problems. Many medical imaging problems need huge amounts of training data to cover sufficient biological variability. Learning methods that scale badly with the number of training data points cannot be used in such scenarios, which may restrict the usage of many powerful classifiers having excellent generalization ability. We propose a cascaded classifier which ... We also explore the deep learning approach of convolutional neural networks (CNNs) for segmenting three-dimensional medical images, learning features from data rather than having a predefined feature set. We propose a novel system integrating three 2D CNNs, which have a one-to-one association with the xy, yz and zx planes of the 3D image.
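
    A minimal sketch of the tri-planar input extraction is given below in Python/NumPy; the patch size and helper name are illustrative, not the thesis implementation.

      import numpy as np

      def triplanar_patches(volume, x, y, z, half=14):
          """Return the three 2-D patches (xy, yz, zx) centered on voxel (x, y, z)."""
          xy = volume[x - half:x + half, y - half:y + half, z]
          yz = volume[x, y - half:y + half, z - half:z + half]
          zx = volume[x - half:x + half, y, z - half:z + half]
          return xy, yz, zx

      mri = np.random.rand(64, 64, 64)               # stand-in for a knee MRI volume
      patches = triplanar_patches(mri, 32, 32, 32)   # inputs to the three 2D CNNs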

  17. Complex modular structure of large-scale brain networks

    Science.gov (United States)

    Valencia, M.; Pastor, M. A.; Fernández-Seara, M. A.; Artieda, J.; Martinerie, J.; Chavez, M.

    2009-06-01

    Modular structure is ubiquitous among real-world networks from related proteins to social groups. Here we analyze the modular organization of brain networks at a large scale (voxel level) extracted from functional magnetic resonance imaging signals. By using a random-walk-based method, we unveil the modularity of brain webs and show modules with a spatial distribution that matches anatomical structures with functional significance. The functional role of each node in the network is studied by analyzing its patterns of inter- and intramodular connections. Results suggest that the modular architecture constitutes the structural basis for the coexistence of functional integration of distant and specialized brain areas during normal brain activities at rest.

  18. Sample-Starved Large Scale Network Analysis

    Science.gov (United States)

    2016-05-05

    Hero, "Multi-centrality graph PCA and its application to cyberintrusion detection," IEEE Intl. Conference on Acoustics, Speech and Signal Processing ... grant. 1. A. Hero was plenary speaker at the Future Directions in Compressive Sensing and Sensing-Processing Integration workshop at Duke University (sponsored by the Office of the Secretary of Defense (OSD)), Jan 2016, entitled "The need for new theory and new models." 2. A. Hero gave plenary speaker

  19. Thermal power generation projects "Large Scale Solar Heating" (EU THERMIE projects)

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large-Scale Solar Heating" programme for a Europe-wide development of the subject technology. The demonstration programme that followed was judged favourably by the experts but was not immediately (1996) accepted for financial subsidies. In November 1997 the EU commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid-1997 a smaller project had already been approved, which had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and is mainly carried out for the transfer of technology. (orig.)

  20. Bonus algorithm for large scale stochastic nonlinear programming problems

    CERN Document Server

    Diwekar, Urmila

    2015-01-01

    This book presents the details of the BONUS algorithm and its real-world applications in areas like sensor placement in large-scale drinking water networks, sensor placement in advanced power systems, water management in power systems, and capacity expansion of energy systems. A generalized method for stochastic nonlinear programming, based on a sampling-based approach for uncertainty analysis and statistical reweighting to obtain probability information, is demonstrated in this book. Stochastic optimization problems are difficult to solve since they involve dealing with optimization and uncertainty loops. There are two fundamental approaches used to solve such problems. The first comprises decomposition techniques; the second identifies problem-specific structures and transforms the problem into a deterministic nonlinear programming problem. These techniques have significant limitations on either the type of objective function or the underlying distributions of the uncertain variables. Moreover, these ...

  1. Magnetization of fluid phonons and large-scale curvature perturbations

    CERN Document Server

    Giovannini, Massimo

    2014-01-01

    The quasinormal mode of a gravitating and magnetized fluid in a spatially flat, isotropic and homogeneous cosmological background is derived in the presence of the fluid sources of anisotropic stress and of the entropic fluctuations of the plasma. The obtained gauge-invariant description involves a system of two coupled differential equations whose physical content is analyzed in all the most relevant situations. The Cauchy problem of large-scale curvature perturbations during the radiation dominated stage of expansion can be neatly formulated and its general solution is shown to depend on five initial data assigned when the relevant physical wavelengths are larger than the particle horizon. The consequences of this approach are explored.

  2. Technologies and challenges in large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Engholm-Keller, Kasper; Larsen, Martin Røssel

    2013-01-01

    Phosphorylation, the reversible addition of a phosphate group to amino acid side chains of proteins, is a fundamental regulator of protein activity, stability, and molecular interactions. Most cellular processes, such as inter- and intracellular signaling, protein synthesis, degradation, and apoptosis, rely on phosphorylation. This PTM is thus involved in many diseases, rendering localization and assessment of the extent of phosphorylation of major scientific interest. MS-based phosphoproteomics, which aims at describing all phosphorylation sites in a specific type of cell, tissue, or organism, has become the main technique for discovery and characterization of phosphoproteins in a nonhypothesis driven fashion. In this review, we describe methods for state-of-the-art MS-based analysis of protein phosphorylation as well as the strategies employed in large-scale phosphoproteomic experiments

  3. Analysis using large-scale ringing data

    OpenAIRE

    Baillie, S. R.; Doherty, P. F.

    2004-01-01

    Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchro...

  4. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    While this state of affairs is perfectly understandable, given the scarcity of ground truth, it is an obstacle to science and progress in ... memory copy, e.g., are reasonable attack points. If the goal is to inject divide-by-zero, then arithmetic operations involving division will be ... fundamentally dynamic, we must take care to choose inputs to maximize code coverage. To run the program, we load it as a virtual CD into a PANDA virtual machine

  5. Large-scale ATLAS production on EGEE

    CERN Document Server

    Espinal, X; Walker, R

    2008-01-01

    In preparation for first data at the LHC, a series of Data Challenges of increasing scale and complexity has been performed. Large quantities of simulated data have been produced on three different Grids, integrated into the ATLAS production system. During 2006, the emphasis moved towards providing stable continuous production, as is required in the immediate run-up to first data, and thereafter. Here, we discuss the experience of the production done on EGEE resources, using submission based on the gLite WMS, CondorG and a system using Condor Glide-ins. The overall wall-time efficiency of around 90% is largely independent of the submission method, and the dominant source of wasted CPU comes from data-handling issues. The efficiency of grid job submission is significantly worse than this, and the glide-in method benefits greatly from factorising this out.

  6. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach is the IBA Emscher Park in the Ruhr area in Germany. Over a 10-year period (1988-1998), more than 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines within the development of our urban landscapes. At the same time, urban and landscape designers are confronted with new methodological problems. Within a strategic transformation perspective, the formulation of the design problem or brief becomes an integrated part of the design process. This paper ...

  7. Climatological context for large-scale coral bleaching

    Science.gov (United States)

    Barton, A. D.; Casey, K. S.

    2005-12-01

    Large-scale coral bleaching was first observed in 1979 and has occurred throughout virtually all of the tropics since that time. Severe bleaching may result in the loss of live coral and in a decline of the integrity of the impacted coral reef ecosystem. Despite the extensive scientific research and increased public awareness of coral bleaching, uncertainties remain about the past and future of large-scale coral bleaching. In order to reduce these uncertainties and place large-scale coral bleaching in its longer-term climatological context, specific criteria and methods for using historical sea surface temperature (SST) data to examine coral bleaching-related thermal conditions are proposed by analyzing three 132-year SST reconstructions: ERSST, HadISST1, and GISST2.3b. These methodologies are applied to case studies at Discovery Bay, Jamaica (77.27°W, 18.45°N), Sombrero Reef, Florida, USA (81.11°W, 24.63°N), Academy Bay, Galápagos, Ecuador (90.31°W, 0.74°S), Pearl and Hermes Reef, Northwest Hawaiian Islands, USA (175.83°W, 27.83°N), Midway Island, Northwest Hawaiian Islands, USA (177.37°W, 28.25°N), Davies Reef, Australia (147.68°E, 18.83°S), and North Male Atoll, Maldives (73.35°E, 4.70°N). The results of this study show that (1) the historical SST data provide a useful long-term record of thermal conditions in reef ecosystems, giving important insight into the thermal history of coral reefs, and (2) while coral bleaching and anomalously warm SSTs have occurred over much of the world in recent decades, case studies in the Caribbean, Northwest Hawaiian Islands, and parts of other regions such as the Great Barrier Reef exhibited SST conditions and cumulative thermal stress prior to 1979 that were comparable to those observed during the strong, frequent coral bleaching events since 1979. This climatological context and knowledge of past environmental conditions in reef ecosystems may foster a better understanding of how coral reefs will
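
    As a concrete illustration of "cumulative thermal stress": the sketch below follows the spirit of NOAA's Degree Heating Weeks metric; the paper proposes its own criteria, and the window, threshold and data here are assumptions.

```python
import numpy as np

def cumulative_thermal_stress(sst_weekly, climatological_max, window=12):
    """Sum of weekly SST exceedances of at least 1 degC above the
    climatological maximum, accumulated over a trailing window."""
    hotspot = sst_weekly - climatological_max
    stress = np.where(hotspot >= 1.0, hotspot, 0.0)
    return np.convolve(stress, np.ones(window), mode="full")[:len(stress)]

# Hypothetical 52-week SST series with a warming trend
sst = 27.0 + np.linspace(0.0, 2.0, 52) + np.random.default_rng(1).normal(0.0, 0.3, 52)
print(cumulative_thermal_stress(sst, climatological_max=28.0).max())
```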

  8. Networking in a Large-Scale Distributed Agile Project

    OpenAIRE

    Moe, Nils Brede; Šmite, Darja; Šāblis, Aivars; Börjesson, Anne-Lie; Andréasson, Pia

    2014-01-01

    Context: In large-scale distributed software projects the expertise may be scattered across multiple locations. Goal: We describe and discuss a large-scale distributed agile project at Ericsson, a multinational telecommunications company headquartered in Sweden. The project is distributed across four development locations (one in Sweden, one in Korea and two in China) and employs 17 teams. In such a large-scale environment the challenge is to have as few dependencies between teams as possible,...

  9. Large Scale Software Building with CMake in ATLAS

    Science.gov (United States)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.
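
    The partial-rebuild logic mentioned above reduces to a graph reachability problem: rebuild the changed package plus everything that transitively depends on it. A minimal sketch follows (the package names and edges are hypothetical, not ATLAS's actual dependency graph):

```python
from collections import deque

# Hypothetical map from a package to the packages that depend on it
dependents = {
    "EventData": ["Tracking", "Calorimetry"],
    "Tracking": ["Reconstruction"],
    "Calorimetry": ["Reconstruction"],
    "Reconstruction": [],
}

def rebuild_set(changed):
    """Packages to rebuild after a single-package change."""
    out, queue = set(), deque([changed])
    while queue:
        pkg = queue.popleft()
        if pkg not in out:
            out.add(pkg)
            queue.extend(dependents.get(pkg, []))
    return out

print(sorted(rebuild_set("EventData")))
# ['Calorimetry', 'EventData', 'Reconstruction', 'Tracking']
```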

  10. Using a framework to implement large-scale innovation in medical education with the intent of achieving sustainability.

    Science.gov (United States)

    Hudson, Judith N; Farmer, Elizabeth A; Weston, Kathryn M; Bushnell, John A

    2015-01-16

    Particularly when undertaken on a large scale, implementing innovation in higher education poses many challenges. Sustaining the innovation requires early adoption of a coherent implementation strategy. Using an example from clinical education, this article describes a process used to implement a large-scale innovation with the intent of achieving sustainability. Desire to improve the effectiveness of undergraduate medical education has led to growing support for a longitudinal integrated clerkship (LIC) model. This involves a move away from the traditional clerkship of 'block rotations' with frequent changes in disciplines, to a focus upon clerkships with longer duration and opportunity for students to build sustained relationships with supervisors, mentors, colleagues and patients. A growing number of medical schools have adopted the LIC model for a small percentage of their students. At a time when increasing medical school numbers and class sizes are leading to competition for clinical supervisors, it is however a daunting challenge to provide a longitudinal clerkship for an entire medical school class. This challenge is presented to illustrate the strategy used to implement sustainable large-scale innovation. A strategy to implement and build a sustainable longitudinal integrated community-based clerkship experience for all students was derived from a framework arising from Roberto and Levesque's research in business. The framework's four core processes: chartering, learning, mobilising and realigning, provided guidance in preparing and rolling out the 'whole of class' innovation. Roberto and Levesque's framework proved useful for identifying the foundations of the implementation strategy, with special emphasis on the relationship building required to implement such an ambitious initiative. Although this was innovation in a new School, it required change within the school, the wider university and the health community. Challenges encountered included some resistance to

  11. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large-scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large-scale batch fixed-column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large-scale displacement chromatography to be performed on a continuous basis (CDC). Such large-scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion-exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed-column chromatography.

  12. Identifying large-scale brain networks in fragile X syndrome.

    Science.gov (United States)

    Hall, Scott S; Jiang, Heidi; Reiss, Allan L; Greicius, Michael D

    2013-11-01

    Fragile X syndrome (FXS) is an X-linked neurogenetic disorder characterized by a cognitive and behavioral phenotype resembling features of autism spectrum disorder. Until now, research has focused largely on identifying regional differences in brain structure and function between individuals with FXS and various control groups. Very little is known about the large-scale brain networks that may underlie the cognitive and behavioral symptoms of FXS. The objective was to identify large-scale, resting-state networks in FXS that differ from control individuals matched on age, IQ, and severity of behavioral and cognitive symptoms, in a cross-sectional, in vivo neuroimaging study conducted in an academic medical center. Participants (aged 10-23 years) included 17 males and females with FXS and 16 males and females serving as controls. Measures comprised univariate voxel-based morphometric analyses, fractional amplitude of low-frequency fluctuations (fALFF) analysis, and group-independent component analysis with dual regression. Patients with FXS showed decreased functional connectivity in the salience, precuneus, left executive control, language, and visuospatial networks compared with controls. Decreased fALFF in the bilateral insular, precuneus, and anterior cingulate cortices also was found in patients with FXS compared with control participants. Furthermore, fALFF in the left insular cortex was significantly positively correlated with IQ in patients with FXS. Decreased gray matter density, resting-state connectivity, and fALFF converged in the left insular cortex in patients with FXS. Fragile X syndrome results in widespread reductions in functional connectivity across multiple cognitive and affective brain networks. Converging structural and functional abnormalities in the left insular cortex, a region also implicated in individuals diagnosed with autism spectrum disorder, suggests that insula integrity and connectivity may be compromised in FXS. This method could prove useful in establishing an imaging
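
    For readers unfamiliar with fALFF: it is conventionally defined as the ratio of the amplitude spectrum within the low-frequency band (roughly 0.01-0.08 Hz) to that of the full spectrum of a voxel's time series. A minimal sketch of that standard definition follows (illustrative; not the study's exact preprocessing pipeline):

```python
import numpy as np

def falff(ts, tr, low=0.01, high=0.08):
    """Fractional ALFF: low-frequency amplitude divided by total amplitude."""
    ts = ts - ts.mean()
    amp = np.abs(np.fft.rfft(ts))
    freqs = np.fft.rfftfreq(len(ts), d=tr)
    band = (freqs >= low) & (freqs <= high)
    return amp[band].sum() / amp[1:].sum()  # skip the DC bin

# Hypothetical BOLD time series: 200 volumes at TR = 2 s
ts = np.random.default_rng(0).normal(size=200)
print(falff(ts, tr=2.0))
```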

  13. Superconductivity for Large Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    R. Fair; W. Stautner; M. Douglass; R. Rajput-Ghoshal; M. Moscinski; P. Riley; D. Wagner; J. Kim; S. Hou; F. Lopez; K. Haran; J. Bray; T. Laskaris; J. Rochford; R. Duckworth

    2012-10-12

    A conceptual design has been completed for a 10MW superconducting direct drive wind turbine generator employing low temperature superconductors for the field winding. Key technology building blocks from the GE Wind and GE Healthcare businesses have been transferred across to the design of this concept machine. Wherever possible, conventional technology and production techniques have been used in order to support the case for commercialization of such a machine. Appendices A and B provide further details of the layout of the machine and the complete specification table for the concept design. Phase 1 of the program has allowed us to understand the trade-offs between the various sub-systems of such a generator and its integration with a wind turbine. A Failure Modes and Effects Analysis (FMEA) and a Technology Readiness Level (TRL) analysis have been completed resulting in the identification of high risk components within the design. The design has been analyzed from a commercial and economic point of view and Cost of Energy (COE) calculations have been carried out with the potential to reduce COE by up to 18% when compared with a permanent magnet direct drive 5MW baseline machine, resulting in a potential COE of 0.075 $/kWh. Finally, a top-level commercialization plan has been proposed to enable this technology to be transitioned to full volume production. The main body of this report will present the design processes employed and the main findings and conclusions.
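
    As a quick back-of-the-envelope check (editorial illustration, assuming the quoted 18% reduction applies directly to the 0.075 $/kWh figure), the implied COE of the 5 MW permanent-magnet baseline is about 0.091 $/kWh:

```python
coe_superconducting = 0.075  # $/kWh, concept machine (from the abstract)
baseline = coe_superconducting / (1 - 0.18)
print(f"implied baseline COE: {baseline:.4f} $/kWh")  # ~0.0915
```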

  14. Certain Inequalities Involving the Fractional q-Integral Operators

    Directory of Open Access Journals (Sweden)

    Dumitru Baleanu

    2014-01-01

    We establish some inequalities involving the Saigo fractional q-integral operator in the theory of quantum calculus by using the two parameters of deformation, q1 and q2, whose special cases are shown to yield corresponding inequalities associated with the Riemann-Liouville and Kober fractional q-integral operators, respectively. Furthermore, we also consider their relevance to other related known results.

  15. VLSI (Very Large Scale Integrated Circuits) Device Reliability Models.

    Science.gov (United States)

    1984-12-01

    components have been particularly effective on phased array radars, including Cobra Dane, Pave Paws, Cobra Judy and AN/TPS-59. In spite of the large number ... [fragment of a manufacturer survey table:] ... Quincy, MA; 55. California Devices (San Jose, CA) - Promised Data; 56. Micro-Pac Industries (Garland, TX) - Promised Data; 57. Teledyne Philbrick - No Data Available

  16. The Maryland Large-Scale Integrated Neurocognitive Architecture

    Science.gov (United States)

    2008-03-01

    intelligence methods for movement control are particularly relevant [Rodriguez, 2005]. These "bottom-up" approaches typically start with a distributed ... Systems (PDCS 2003), Marina Del Rey, California (2003), pp. 574-582. Poeppel, D. and G. Hickok (2004). Towards a new functional anatomy of language ... algorithm. Proceedings of the IEEE Conference on Neural Networks. Rodriguez A. & Reggia J. Collective Movement Teams for Cooperative Problem Solving

  17. Advanced Fabrication Processes for Superconducting Very Large Scale Integrated Circuits

    Science.gov (United States)

    2015-10-13

    Josephson junctions," J. Appl. Phys., vol. 104, no. 2, July 2008, Art. ID 023909. [9] S.K. Tolpygo and D. Amparo, "Fabrication-process-induced ... J. Appl. Phys., vol. 107, no. 7, 073906, Apr. 2010. [11] S.K. Tolpygo, D. Amparo, R.T. Hunt, J.A. Vivalda, and D.T. Yohannes, "Diffusion stop ... June 2011. [12] D. Amparo and S.K. Tolpygo, "Investigation of the role of H in fabrication-process-induced variations of Nb/Al/AlOx/Nb Josephson

  18. Grid Integration Issues for Large Scale Wind Power Plants (WPPs)

    DEFF Research Database (Denmark)

    Wu, Qiuwei; Xu, Zhao; Østergaard, Jacob

    2010-01-01

    The penetration level of wind power into power systems over the world has been increasing very fast in the last few years and is still keeping a fast growth rate. It is just a matter of time before wind power becomes comparable to conventional power generation. Therefore, many transmission system operators (TSOs) over the world have come up with grid codes that request wind power plants (WPPs) to have more or less the same operating capability as conventional power plants. Grid code requirements from other TSOs are under development. This paper covers the steady state ...

  19. Copper metallization for current very large scale integration.

    Science.gov (United States)

    Jiang, Q; Zhu, Y F; Zhao, M

    2011-06-01

    As silicon technology scaling progresses to the 32 nm node and beyond, the design for the propagation of electromagnetic signals becomes increasingly important owing to unyielding constraints on interconnect delay. Because of its high conductivity and electromigration resistance, Cu is now the interconnect material in current VLSI. To ensure signal propagation via the Cu interconnects despite the continuing reduction in interconnect width, related issues on Cu interconnects, such as electron scattering at surfaces and grain boundaries, electromigration failure and surface oxidation, still need to be further understood and addressed. In addition, the performance of low-k dielectrics and reliable barrier structures, which are also important parts of the device, needs to be further improved to minimize signal delay and to prevent interdiffusion of the different materials, respectively. On the basis of the paper published in Recent Patents on Nanotechnology 2007; 1: 193-209, this review focuses on recent patents and studies on Cu metallization, including Cu interconnect wires, low-k dielectrics and related barrier materials as well as manufacturing techniques in VLSI, which are among the most essential concerns in the microelectronics industry and decide the further development of VLSI. This review will benefit the design of Cu metallization in current VLSI.
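
    The delay argument above comes down to the first-order RC model of an interconnect: resistance scales with resistivity and shrinks with cross-section, while capacitance scales with the dielectric constant of the surrounding insulator, which is why high-conductivity Cu and low-k dielectrics attack the two factors of the product (a standard textbook relation, not specific to this review):

```latex
\tau \;\sim\; R\,C, \qquad R \;=\; \frac{\rho\, L}{w\, h}, \qquad C \;\propto\; k\,\varepsilon_0\, L
```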

  20. Large-scale turbulence structures in shallow separating flows

    NARCIS (Netherlands)

    Talstra, H.

    2011-01-01

    The Ph.D. thesis “Large-scale turbulence structures in shallow separating flows” by Harmen Talstra is the result of a Ph.D. research project on large-scale shallow-flow turbulence, which has been performed in the Environmental Fluid Mechanics Laboratory at Delft University of Technology. The

  1. Advances in Modelling of Large Scale Coastal Evolution

    NARCIS (Netherlands)

    Stive, M.J.F.; De Vriend, H.J.

    1995-01-01

    The attention for climate change impact on the world's coastlines has established large scale coastal evolution as a topic of wide interest. Some more recent advances in this field, focusing on the potential of mathematical models for the prediction of large scale coastal evolution, are discussed.

  2. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    were the main reasons for the failure of large-scale jatropha plantations in Ethiopia. The findings in Chapter 3 show that when the use of family labour is combined with easy access to credit and technology, outgrowers on average achieve higher productivity than that obtained on large-scale plantations...

  3. The Large Scale Magnetic Field and Sunspot Cycles

    Indian Academy of Sciences (India)

    tribpo

    We report on the correlation between the large-scale magnetic field and sunspot cycles during the last 80 years that was found by Makarov et al. (1999) and Makarov & Tlatov (2000) in Hα spherical harmonics of the large-scale magnetic field for 1915-1999. The sum of intensities of the low modes l = 1 and 3, A(t), was used ...

  4. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords: sol-gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction: Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  5. ACTIVE DIMENSIONAL CONTROL OF LARGE-SCALED STEEL STRUCTURES

    Directory of Open Access Journals (Sweden)

    Radosław Rutkowski

    2013-09-01

    The article discusses the issues of dimensional control in the construction of large-scale steel structures. The main focus is on the analysis of manufacturing tolerances. The article presents a procedure for using tolerance analysis in the design and manufacture of large-scale steel structures. The proposed solution could significantly improve the manufacturing process.
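
    One standard piece of such a tolerance analysis is the stack-up of a chain of dimensions, comparing the worst-case sum with the statistical root-sum-of-squares estimate. The sketch below is the generic textbook calculation with made-up numbers, not necessarily the article's procedure:

```python
import math

# Hypothetical chain of dimensional tolerances (mm) in a welded assembly
tolerances = [1.5, 2.0, 1.0, 2.5]

worst_case = sum(tolerances)                    # every deviation adds up
rss = math.sqrt(sum(t**2 for t in tolerances))  # statistical (RSS) stack-up
print(f"worst case: +/-{worst_case:.1f} mm, RSS: +/-{rss:.1f} mm")
```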

  6. Linking crop yield anomalies to large-scale atmospheric circulation in Europe.

    Science.gov (United States)

    Ceglar, Andrej; Turco, Marco; Toreti, Andrea; Doblas-Reyes, Francisco J

    2017-06-15

    Understanding the effects of climate variability and extremes on crop growth and development represents a necessary step to assess the resilience of agricultural systems to changing climate conditions. This study investigates the links between the large-scale atmospheric circulation and crop yields in Europe, providing the basis to develop seasonal crop yield forecasting and thus enabling a more effective and dynamic adaptation to climate variability and change. Four dominant modes of large-scale atmospheric variability have been used: the North Atlantic Oscillation, Eastern Atlantic, Scandinavian and Eastern Atlantic-Western Russia patterns. Large-scale atmospheric circulation explains on average 43% of inter-annual winter wheat yield variability, ranging between 20% and 70% across countries. As for grain maize, the average explained variability is 38%, ranging between 20% and 58%. Spatially, the skill of the developed statistical models strongly depends on the impact of large-scale atmospheric variability on weather at the regional level, especially during the most sensitive growth stages of flowering and grain filling. Our results also suggest that preceding atmospheric conditions might provide an important source of predictability, especially for maize yields in south-eastern Europe. Since the seasonal predictability of large-scale atmospheric patterns is generally higher than that of surface weather variables (e.g. precipitation) in Europe, seasonal crop yield prediction could benefit from the integration of derived statistical models exploiting the dynamical seasonal forecast of large-scale atmospheric circulation.
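
    The kind of statistical model described above can be as simple as a least-squares regression of yield anomalies on seasonal circulation indices. The sketch below uses synthetic data; the coefficients, series length and noise level are assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(2)
n_years = 30
# Columns: hypothetical seasonal NAO, EA, SCA, EAWR index values
X = rng.normal(size=(n_years, 4))
y = X @ np.array([0.5, -0.3, 0.2, 0.1]) + rng.normal(0.0, 0.5, n_years)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # anomalies are centered, so no intercept
r2 = 1 - np.sum((y - X @ beta) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"explained yield variance: {r2:.0%}")
```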

  7. Evaluation of an Integrated Approach Involving Chemical and ...

    African Journals Online (AJOL)

    Chemical and bio-remediation measures for the detoxification of pollutants such as cyanide and heavy metals in mine tailings effluent have been developed over the years. The study sought to evaluate the decrease in the concentrations of Cu, Zn, Fe, Cd, As and Pb through the integration of the processes involving ...

  8. Integrals involving functions of the type (WS)sup(q)

    International Nuclear Information System (INIS)

    Srivastava, D.K.

    1981-10-01

    Analytical expressions for integrals involving functions of the Woods-Saxon type raised to the power of q are given. These are expected to be of immediate application in optical model studies and for obtaining various moments of the potential having such shapes. (author)
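
    For reference, the Woods-Saxon form factor and the generic shape of the integrals in question are shown below (R is the half-density radius, a the surface diffuseness; the moment order n and the exact integrands treated in the report may differ):

```latex
f(r) \;=\; \left[\,1 + \exp\!\left(\frac{r - R}{a}\right)\right]^{-1},
\qquad
I_n(q) \;=\; \int_0^{\infty} r^{\,n}\,\bigl[f(r)\bigr]^{q}\, dr
```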

  9. Safety aspects of large-scale combustion of hydrogen

    Energy Technology Data Exchange (ETDEWEB)

    Edeskuty, F.J.; Haugh, J.J.; Thompson, R.T.

    1986-01-01

    Recent hydrogen-safety investigations have studied the possible large-scale effects of phenomena such as the accumulation of combustible hydrogen-air mixtures in large, confined volumes. Of particular interest are safe methods for the disposal of the hydrogen and the pressures which can arise from its confined combustion. Consequently, tests of the confined combustion of hydrogen-air mixtures were conducted in a 2100 m³ volume. These tests show that continuous combustion, as the hydrogen is generated, is a safe method for its disposal. It also has been seen that, for hydrogen concentrations up to 13 vol%, it is possible to predict the maximum pressures that can occur upon ignition of premixed hydrogen-air atmospheres. In addition, information has been obtained concerning the survivability of the equipment that is needed to recover from an accident involving hydrogen combustion. An accident that involved the inadvertent mixing of hydrogen and oxygen gases in a tube trailer gave evidence that under the proper conditions hydrogen combustion can transit to a detonation. If detonation occurs, the pressures which can be experienced are much higher, although short in duration.

  10. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  11. An Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have described the large-scale communication architecture in the IOT in detail. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  12. Large-Scale Seismic Test Program at Hualien, Taiwan

    International Nuclear Information System (INIS)

    Tang, H.T.; Graves, H.L.; Yeh, Y.S.

    1991-01-01

    The Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, is a follow-on to the soil-structure interaction (SSI) experiments at Lotung, Taiwan. The planned SSI studies will be performed at a stiff soil site in Hualien, Taiwan, that historically has had slightly more destructive earthquakes than Lotung. The objectives of the LSST project are as follows: to obtain earthquake-induced SSI data at a stiff soil site having similar prototypical nuclear power plant soil conditions; to confirm the findings and methodologies validated against the Lotung soft soil SSI data for prototypical plant condition applications; to further validate the technical basis of realistic SSI analysis approaches; and to further support the resolution of the USI A-40 Seismic Design Criteria issue. These objectives will be accomplished through an integrated and carefully planned experimental program consisting of: soil characterization, test model design and field construction, instrumentation layout and deployment, in-situ geophysical information collection, forced vibration test, and synthesis of results and findings. The LSST is a joint effort among many interested parties. EPRI and Taipower are the organizers of the program and have the lead in planning and managing the program.

  13. Static micromixers based on large-scale industrial mixer geometry.

    Science.gov (United States)

    Bertsch, A; Heimgartner, S; Cousseau, P; Renaud, P

    2001-09-01

    Mixing liquids at the micro-scale is difficult because the low Reynolds numbers in microchannels and in microreactors prohibit the use of conventional mixing techniques based on mechanical actuators and on inducing turbulence. Static mixers can be used to solve this mixing problem. This paper presents micromixers with geometries very close to conventional large-scale static mixers used in the chemical and food-processing industry. Two kinds of geometries have been studied. The first type is composed of a series of stationary rigid elements that form intersecting channels to split, rearrange and combine component streams. The second type is composed of a series of short helix elements arranged in pairs, each pair comprising a right-handed and a left-handed element arranged alternately in a pipe. Micromixers of both types have been designed by CAD and manufactured with the integral microstereolithography process, a new microfabrication technique that allows the manufacturing of complex three-dimensional objects in polymers. The realized mixers have been tested experimentally. Numerical simulations of these micromixers using the computational fluid dynamics (CFD) program FLUENT are used to evaluate the mixing efficiency. With a low pressure drop and good mixing efficiency, these truly three-dimensional micromixers can be used for mixing reactants or liquids containing cells in many microTAS applications.
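
    For orientation, the low-Reynolds-number argument can be made concrete: with water flowing at about 1 mm/s through a 100 µm channel (generic microfluidic values, not the paper's), Re is around 0.1, far below the turbulent transition near 2300, so the flow is strictly laminar:

```latex
\mathrm{Re} \;=\; \frac{\rho\, u\, D_h}{\mu}
\;=\; \frac{10^{3}\,\mathrm{kg\,m^{-3}} \times 10^{-3}\,\mathrm{m\,s^{-1}} \times 10^{-4}\,\mathrm{m}}{10^{-3}\,\mathrm{Pa\,s}}
\;=\; 0.1
```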

  14. Flat-Land Large-Scale Electricity Storage (FLES

    Directory of Open Access Journals (Sweden)

    Schalij R.

    2012-10-01

    Growth of renewable sources requires a smarter electricity grid, integrating multiple solutions for large-scale storage. Pumped storage is still the most valid option. The capacity of existing facilities is not sufficient to accommodate future renewable resources. New locations for additional pumped storage capacity are scarce. Mountainous areas mostly are remote and do not allow construction of large facilities for ecological reasons. In the Netherlands underground solutions were studied for many years. The use of (former) coal mines was rejected after scientific research. Further research showed that solid rock formations below the (unstable) coal layers can be harnessed to excavate the lower water reservoir for pumped storage, making an innovative underground solution possible. A complete plan was developed, with a capacity of 1400 MW (8 GWh daily output) and a head of 1400 m. It is technically and economically feasible. Compared to conventional pumped storage it has significantly less impact on the environment. Less vulnerable locations are eligible. The reservoir on the surface (only one instead of two) is relatively small. It offers also a solution for other European countries. The Dutch studies provide a valuable basis for new locations.
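
    A quick consistency check of the quoted figures (editorial illustration, ignoring turbine and pump losses): storing 8 GWh at a 1400 m head requires on the order of 2 million cubic metres of water per daily cycle, which supports the claim of a relatively small surface reservoir:

```python
E = 8e9 * 3600                       # 8 GWh daily output, in joules
rho, g, head = 1000.0, 9.81, 1400.0  # water density, gravity, head (SI units)
volume = E / (rho * g * head)        # ideal (lossless) water volume
print(f"{volume / 1e6:.1f} million m^3")  # ~2.1
```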

  15. Implicit solvers for large-scale nonlinear problems

    International Nuclear Information System (INIS)

    Keyes, D E; Reynolds, D; Woodward, C S

    2006-01-01

    Computational scientists are grappling with increasingly complex, multi-rate applications that couple such physical phenomena as fluid dynamics, electromagnetics, radiation transport, chemical and nuclear reactions, and wave and material propagation in inhomogeneous media. Parallel computers with large storage capacities are paving the way for high-resolution simulations of coupled problems; however, hardware improvements alone will not prove enough to enable simulations based on brute-force algorithmic approaches. To accurately capture nonlinear couplings between dynamically relevant phenomena, often while stepping over rapid adjustments to quasi-equilibria, simulation scientists are increasingly turning to implicit formulations that require a discrete nonlinear system to be solved for each time step or steady state solution. Recent advances in iterative methods have made fully implicit formulations a viable option for solution of these large-scale problems. In this paper, we overview one of the most effective iterative methods, Newton-Krylov, for nonlinear systems and point to software packages with its implementation. We illustrate the method with an example from magnetically confined plasma fusion and briefly survey other areas in which implicit methods have bestowed important advantages, such as allowing high-order temporal integration and providing a pathway to sensitivity analyses and optimization. Lastly, we overview algorithm extensions under development motivated by current SciDAC applications.
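
    One widely available implementation of the Newton-Krylov method discussed above is SciPy's newton_krylov. A minimal usage sketch on a toy discretized nonlinear boundary-value problem follows (the equation and parameters are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    """Discrete diffusion with a cubic source term and Dirichlet ends."""
    r = np.zeros_like(u)
    r[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2] - 0.1 * u[1:-1] ** 3
    r[0], r[-1] = u[0] - 1.0, u[-1]  # boundary conditions u(0)=1, u(1)=0
    return r

u0 = np.linspace(1.0, 0.0, 50)  # initial guess: linear ramp
sol = newton_krylov(residual, u0, method="lgmres", f_tol=1e-8)
print(abs(residual(sol)).max())  # residual norm at the solution
```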

  16. Large-scale navigational map in a mammal.

    Science.gov (United States)

    Tsoar, Asaf; Nathan, Ran; Bartan, Yoav; Vyssotski, Alexei; Dell'Omo, Giacomo; Ulanovsky, Nachum

    2011-09-13

    Navigation, the ability to reach desired goal locations, is critical for animals and humans. Animal navigation has been studied extensively in birds, insects, and some marine vertebrates and invertebrates, yet we are still far from elucidating the underlying mechanisms in other taxonomic groups, especially mammals. Here we report a systematic study of the mechanisms of long-range mammalian navigation. High-resolution global positioning system tracking of bats revealed high, fast, and very straight commuting flights of Egyptian fruit bats (Rousettus aegyptiacus) from their cave to remote fruit trees. Bats returned to the same individual trees night after night. When displaced 44 km south, bats homed directly to one of two goal locations--familiar fruit tree or cave--ruling out beaconing, route-following, or path-integration mechanisms. Bats released 84 km south, within a deep natural crater, were initially disoriented (but eventually left the crater toward the home direction and homed successfully), whereas bats released at the crater-edge top homed directly, suggesting navigation guided primarily by distal visual landmarks. Taken together, these results provide evidence for a large-scale "cognitive map" that enables navigation of a mammal within its visually familiar area, and they also demonstrate the ability to home back when translocated outside the visually familiar area.

  17. HTS cables open the window for large-scale renewables

    International Nuclear Information System (INIS)

    Geschiere, A; Willen, D; Piga, E; Barendregt, P

    2008-01-01

    In a realistic approach to future energy consumption, the effects of sustainable power sources and the effects of growing welfare with increased use of electricity need to be considered. These factors lead to an increased transfer of electric energy over the networks. A dominant part of the energy need will come from expanded large-scale renewable sources. To use them efficiently over Europe, large energy transits between different countries are required. Bottlenecks in the existing infrastructure will be avoided by strengthening the network. For environmental reasons more infrastructure will be built underground. Nuon is studying HTS technology as a component to solve these challenges. This technology offers a tremendously large power transport capacity as well as the possibility to reduce short-circuit currents, making integration of renewables easier. Furthermore, power transport will be possible at lower voltage levels, giving the opportunity to upgrade the existing network while re-using it. This will result in large cost savings while meeting the future energy challenges. In a 6 km backbone structure in Amsterdam, Nuon wants to install a 50 kV HTS Triax cable for a significant increase of the transport capacity, while developing its capabilities. Nevertheless, several barriers have to be overcome.

  18. Construction Claim Types and Causes for a Large-Scale Hydropower Project in Bhutan

    Directory of Open Access Journals (Sweden)

    Bonaventura H.W. Hadikusumo

    2015-01-01

    Hydropower construction projects are complex and uncertain, have long gestational periods and involve several parties. Furthermore, they require the integration of different components (Civil, Mechanical and Electrical) to work together as a single unit. These projects require highly specialised designs, detailed plans and specifications, high-risk construction methods, effective management, skilful supervision and close coordination. Thus, claims are common in such projects. These claims are undesirable because they require significant time and resources to resolve and cause adversarial relationships among the parties involved. Therefore, it is in the common interest of all involved parties to prevent, minimise, or resolve claims as amicably as possible. Identifying common claim types and their causes is essential in devising techniques to minimise and avoid them in future projects. This report details a case study performed on a large-scale hydropower project in Bhutan. The findings of this case study indicate that differing site conditions are the major contributor to impact and change claims and that 95% of total claims can be settled by negotiation, whereas 5% can be settled by arbitration.

  19. Episodic memory in aspects of large-scale brain networks

    Directory of Open Access Journals (Sweden)

Woorim Jeong

    2015-08-01

    Understanding human episodic memory in aspects of large-scale brain networks has become one of the central themes in neuroscience over the last decade. Traditionally, episodic memory was regarded as relying mostly on medial temporal lobe (MTL) structures. However, recent studies have suggested the involvement of a more widely distributed cortical network and the importance of its interactive roles in the memory process. Both direct and indirect neuro-modulations of the memory network have been tried in experimental treatments of memory disorders. In this review, we focus on the functional organization of the MTL and other neocortical areas in episodic memory. Task-related neuroimaging studies together with lesion studies suggested that specific sub-regions of the MTL are responsible for specific components of memory. However, recent studies have emphasized that connectivity within MTL structures and even their network dynamics with other cortical areas are essential in the memory process. Resting-state functional network studies also have revealed that memory function is subserved not only by the MTL system but also by a distributed network, particularly the default-mode network. Furthermore, researchers have begun to investigate memory networks throughout the entire brain, not restricted to the specific resting-state network. Altered patterns of functional connectivity among distributed brain regions were observed in patients with memory impairments. Recently, studies have shown that brain stimulation may impact memory through modulating functional networks, carrying future implications of a novel interventional therapy for memory impairment.

  20. Episodic memory in aspects of large-scale brain networks

    Science.gov (United States)

    Jeong, Woorim; Chung, Chun Kee; Kim, June Sic

    2015-01-01

    Understanding human episodic memory in aspects of large-scale brain networks has become one of the central themes in neuroscience over the last decade. Traditionally, episodic memory was regarded as relying mostly on medial temporal lobe (MTL) structures. However, recent studies have suggested the involvement of a more widely distributed cortical network and the importance of its interactive roles in the memory process. Both direct and indirect neuro-modulations of the memory network have been tried in experimental treatments of memory disorders. In this review, we focus on the functional organization of the MTL and other neocortical areas in episodic memory. Task-related neuroimaging studies together with lesion studies suggested that specific sub-regions of the MTL are responsible for specific components of memory. However, recent studies have emphasized that connectivity within MTL structures and even their network dynamics with other cortical areas are essential in the memory process. Resting-state functional network studies also have revealed that memory function is subserved not only by the MTL system but also by a distributed network, particularly the default-mode network (DMN). Furthermore, researchers have begun to investigate memory networks throughout the entire brain, not restricted to the specific resting-state network (RSN). Altered patterns of functional connectivity (FC) among distributed brain regions were observed in patients with memory impairments. Recently, studies have shown that brain stimulation may impact memory through modulating functional networks, carrying future implications of a novel interventional therapy for memory impairment. PMID:26321939

  1. Reviving large-scale projects; La relance des grands chantiers

    Energy Technology Data Exchange (ETDEWEB)

    Desiront, A.

    2003-06-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890-foot-long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a powerhouse with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are on the order of $2 billion, of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy and one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to

  2. Food security through large scale investments in agriculture

    Science.gov (United States)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large-scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large-scale land acquisitions. We

  3. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  4. The ENIGMA Consortium: large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    Thompson, P.M.; Stein, J.L.; Medland, S.E.; Hibar, D.P.; Arias-Vásquez, A.; Renteria, M.E.; Toro, R.; Jahanshad, N.; Schumann, G.; Franke, B.; Wright, M.J.; Martin, N.G.; Agartz, I.; Lopez de Alda, M.; Alhusaini, S.; Boomsma, D.I.; Brouwer, R.M.; de Geus, E.J.C.; den Braber, A.; Fedko, I.O.; Hottenga, J.J.; Hulshoff Pol, H.E.; Montgomery, G.W.; Penninx, B.W.J.H.; Milaneschi, Y.; Schnack, H.G.; van t Ent, D.; Westlye, L.T.; Whalley, H.C.; Whelan, C.D.; White, T.; Winkler, A.M.; Wittfeld, K.; Woldehawariat, G.; Wolf, C.; Zilles, D.; Zwiers, M.P.; Thalamuthu, A.; Schofield, P.R.; Freimer, N.B.; Lawrence, N.S.; Drevets, W.

    2014-01-01

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience,

  5. Can Large Scale Land Acquisition for Agro-Development in Indonesia be Managed Sustainably?

    NARCIS (Netherlands)

    Obidzinski, K.; Takahashi, I.; Dermawan, A.; Komarudin, H.; Andrianto, A.

    2013-01-01

    This paper explores the impacts of large scale land acquisition for agro-development by analyzing the Merauke Integrated Food and Energy Estate (MIFEE) in Indonesia. It also examines the potential for MIFEE to meet sustainability requirements under RSPO, ISPO, and FSC. The plantation development

  6. The ENIGMA Consortium : large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    Thompson, Paul M.; Stein, Jason L.; Medland, Sarah E.; Hibar, Derrek P.; Vasquez, Alejandro Arias; Renteria, Miguel E.; Toro, Roberto; Jahanshad, Neda; Schumann, Gunter; Franke, Barbara; Wright, Margaret J.; Martin, Nicholas G.; Agartz, Ingrid; Alda, Martin; Alhusaini, Saud; Almasy, Laura; Almeida, Jorge; Alpert, Kathryn; Andreasen, Nancy C.; Andreassen, Ole A.; Apostolova, Liana G.; Appel, Katja; Armstrong, Nicola J.; Aribisala, Benjamin; Bastin, Mark E.; Bauer, Michael; Bearden, Carrie E.; Bergmann, Orjan; Binder, Elisabeth B.; Blangero, John; Bockholt, Henry J.; Boen, Erlend; Bois, Catherine; Boomsma, Dorret I.; Booth, Tom; Bowman, Ian J.; Bralten, Janita; Brouwer, Rachel M.; Brunner, Han G.; Brohawn, David G.; Buckner, Randy L.; Buitelaar, Jan; Bulayeva, Kazima; Bustillo, Juan R.; Calhoun, Vince D.; Cannon, Dara M.; Cantor, Rita M.; Carless, Melanie A.; Caseras, Xavier; Cavalleri, Gianpiero L.; Chakravarty, M. Mallar; Chang, Kiki D.; Ching, Christopher R. K.; Christoforou, Andrea; Cichon, Sven; Clark, Vincent P.; Conrod, Patricia; Coppola, Giovanni; Crespo-Facorro, Benedicto; Curran, Joanne E.; Czisch, Michael; Deary, Ian J.; de Geus, Eco J. C.; den Braber, Anouk; Delvecchio, Giuseppe; Depondt, Chantal; de Haan, Lieuwe; de Zubicaray, Greig I.; Dima, Danai; Dimitrova, Rali; Djurovic, Srdjan; Dong, Hongwei; Donohoe, Gary; Duggirala, Ravindranath; Dyer, Thomas D.; Ehrlich, Stefan; Ekman, Carl Johan; Elvsashagen, Torbjorn; Emsell, Louise; Erk, Susanne; Espeseth, Thomas; Fagerness, Jesen; Fears, Scott; Fedko, Iryna; Fernandez, Guillen; Fisher, Simon E.; Foroud, Tatiana; Fox, Peter T.; Francks, Clyde; Frangou, Sophia; Frey, Eva Maria; Frodl, Thomas; Frouin, Vincent; Garavan, Hugh; Giddaluru, Sudheer; Glahn, David C.; Godlewska, Beata; Goldstein, Rita Z.; Gollub, Randy L.; Grabe, Hans J.; Grimm, Oliver; Gruber, Oliver; Guadalupe, Tulio; Gur, Raquel E.; Gur, Ruben C.; Goering, Harald H. H.; Hagenaars, Saskia; Hajek, Tomas; Hall, Geoffrey B.; Hall, Jeremy; Hardy, John; Hartman, Catharina A.; Hass, Johanna; Hatton, Sean N.; Haukvik, Unn K.; Hegenscheid, Katrin; Heinz, Andreas; Hickie, Ian B.; Ho, Beng-Choon; Hoehn, David; Hoekstra, Pieter J.; Hollinshead, Marisa; Holmes, Avram J.; Homuth, Georg; Hoogman, Martine; Hong, L. Elliot; Hosten, Norbert; Hottenga, Jouke-Jan; Pol, Hilleke E. Hulshoff; Hwang, Kristy S.; Jack, Clifford R.; Jenkinson, Mark; Johnston, Caroline; Joensson, Erik G.; Kahn, Rene S.; Kasperaviciute, Dalia; Kelly, Sinead; Kim, Sungeun; Kochunov, Peter; Koenders, Laura; Kraemer, Bernd; Kwok, John B. J.; Lagopoulos, Jim; Laje, Gonzalo; Landen, Mikael; Landman, Bennett A.; Lauriello, John; Lawrie, Stephen M.; Lee, Phil H.; Le Hellard, Stephanie; Lemaitre, Herve; Leonardo, Cassandra D.; Li, Chiang-shan; Liberg, Benny; Liewald, David C.; Liu, Xinmin; Lopez, Lorna M.; Loth, Eva; Lourdusamy, Anbarasu; Luciano, Michelle; Macciardi, Fabio; Machielsen, Marise W. 
J.; MacQueen, Glenda M.; Malt, Ulrik F.; Mandl, Rene; Manoach, Dara S.; Martinot, Jean-Luc; Matarin, Mar; Mather, Karen A.; Mattheisen, Manuel; Mattingsdal, Morten; Meyer-Lindenberg, Andreas; McDonald, Colm; McIntosh, Andrew M.; McMahon, Francis J.; McMahon, Katie L.; Meisenzahl, Eva; Melle, Ingrid; Milaneschi, Yuri; Mohnke, Sebastian; Montgomery, Grant W.; Morris, Derek W.; Moses, Eric K.; Mueller, Bryon A.; Munoz Maniega, Susana; Muehleisen, Thomas W.; Mueller-Myhsok, Bertram; Mwangi, Benson; Nauck, Matthias; Nho, Kwangsik; Nichols, Thomas E.; Nilsson, Lars-Goeran; Nugent, Allison C.; Nyberg, Lars; Olvera, Rene L.; Oosterlaan, Jaap; Ophoff, Roel A.; Pandolfo, Massimo; Papalampropoulou-Tsiridou, Melina; Papmeyer, Martina; Paus, Tomas; Pausova, Zdenka; Pearlson, Godfrey D.; Penninx, Brenda W.; Peterson, Charles P.; Pfennig, Andrea; Phillips, Mary; Pike, G. Bruce; Poline, Jean-Baptiste; Potkin, Steven G.; Puetz, Benno; Ramasamy, Adaikalavan; Rasmussen, Jerod; Rietschel, Marcella; Rijpkema, Mark; Risacher, Shannon L.; Roffman, Joshua L.; Roiz-Santianez, Roberto; Romanczuk-Seiferth, Nina; Rose, Emma J.; Royle, Natalie A.; Rujescu, Dan; Ryten, Mina; Sachdev, Perminder S.; Salami, Alireza; Satterthwaite, Theodore D.; Savitz, Jonathan; Saykin, Andrew J.; Scanlon, Cathy; Schmaal, Lianne; Schnack, Hugo G.; Schork, Andrew J.; Schulz, S. Charles; Schuer, Remmelt; Seidman, Larry; Shen, Li; Shoemaker, Jody M.; Simmons, Andrew; Sisodiya, Sanjay M.; Smith, Colin; Smoller, Jordan W.; Soares, Jair C.; Sponheim, Scott R.; Sprooten, Emma; Starr, John M.; Steen, Vidar M.; Strakowski, Stephen; Strike, Lachlan; Sussmann, Jessika; Saemann, Philipp G.; Teumer, Alexander; Toga, Arthur W.; Tordesillas-Gutierrez, Diana; Trabzuni, Daniah; Trost, Sarah; Turner, Jessica; Van den Heuvel, Martijn; van der Wee, Nic J.; van Eijk, Kristel; van Erp, Theo G. M.; van Haren, Neeltje E. M.; van 't Ent, Dennis; van Tol, Marie-Jose; Hernandez, Maria C. Valdes; Veltman, Dick J.; Versace, Amelia; Voelzke, Henry; Walker, Robert; Walter, Henrik; Wang, Lei; Wardlaw, Joanna M.; Weale, Michael E.; Weiner, Michael W.; Wen, Wei; Westlye, Lars T.; Whalley, Heather C.; Whelan, Christopher D.; White, Tonya; Winkler, Anderson M.; Wittfeld, Katharina; Woldehawariat, Girma; Wolf, Christiane; Zilles, David; Zwiers, Marcel P.; Thalamuthu, Anbupalam; Schofield, Peter R.; Freimer, Nelson B.; Lawrence, Natalia S.; Drevets, Wayne

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience,

  7. The ENIGMA Consortium: Large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    P.M. Thompson (Paul); J.L. Stein; S.E. Medland (Sarah Elizabeth); D.P. Hibar (Derrek); A.A. Vásquez (Arias); M.E. Rentería (Miguel); R. Toro (Roberto); N. Jahanshad (Neda); G. Schumann (Gunter); B. Franke (Barbara); M.J. Wright (Margaret); N.G. Martin (Nicholas); I. Agartz (Ingrid); M. Alda (Martin); S. Alhusaini (Saud); L. Almasy (Laura); K. Alpert (Kathryn); N.C. Andreasen; O.A. Andreassen (Ole); L.G. Apostolova (Liana); K. Appel (Katja); N.J. Armstrong (Nicola); B. Aribisala (Benjamin); M.E. Bastin (Mark); M. Bauer (Michael); C.E. Bearden (Carrie); Ø. Bergmann (Ørjan); E.B. Binder (Elisabeth); J. Blangero (John); H.J. Bockholt; E. Bøen (Erlend); M. Bois (Monique); D.I. Boomsma (Dorret); T. Booth (Tom); I.J. Bowman (Ian); L.B.C. Bralten (Linda); R.M. Brouwer (Rachel); H.G. Brunner; D.G. Brohawn (David); M. Buckner; J.K. Buitelaar (Jan); K. Bulayeva (Kazima); J. Bustillo; V.D. Calhoun (Vince); D.M. Cannon (Dara); R.M. Cantor; M.A. Carless (Melanie); X. Caseras (Xavier); G. Cavalleri (Gianpiero); M.M. Chakravarty (M. Mallar); K.D. Chang (Kiki); C.R.K. Ching (Christopher); A. Christoforou (Andrea); S. Cichon (Sven); V.P. Clark; P. Conrod (Patricia); D. Coppola (Domenico); B. Crespo-Facorro (Benedicto); J.E. Curran (Joanne); M. Czisch (Michael); I.J. Deary (Ian); E.J.C. de Geus (Eco); A. den Braber (Anouk); G. Delvecchio (Giuseppe); C. Depondt (Chantal); L. de Haan (Lieuwe); G.I. de Zubicaray (Greig); D. Dima (Danai); R. Dimitrova (Rali); S. Djurovic (Srdjan); H. Dong (Hongwei); D.J. Donohoe (Dennis); A. Duggirala (Aparna); M.D. Dyer (Matthew); S.M. Ehrlich (Stefan); C.J. Ekman (Carl Johan); T. Elvsåshagen (Torbjørn); L. Emsell (Louise); S. Erk; T. Espeseth (Thomas); J. Fagerness (Jesen); S. Fears (Scott); I. Fedko (Iryna); G. Fernandez (Guillén); S.E. Fisher (Simon); T. Foroud (Tatiana); P.T. Fox (Peter); C. Francks (Clyde); S. Frangou (Sophia); E.M. Frey (Eva Maria); T. Frodl (Thomas); V. Frouin (Vincent); H. Garavan (Hugh); S. Giddaluru (Sudheer); D.C. Glahn (David); B. Godlewska (Beata); R.Z. Goldstein (Rita); R.L. Gollub (Randy); H.J. Grabe (Hans Jörgen); O. Grimm (Oliver); O. Gruber (Oliver); T. Guadalupe (Tulio); R.E. Gur (Raquel); R.C. Gur (Ruben); H.H.H. Göring (Harald); S. Hagenaars (Saskia); T. Hajek (Tomas); G.B. Hall (Garry); J. Hall (Jeremy); J. Hardy (John); C.A. Hartman (Catharina); J. Hass (Johanna); W. Hatton; U.K. Haukvik (Unn); K. Hegenscheid (Katrin); J. Heinz (Judith); I.B. Hickie (Ian); B.C. Ho (Beng ); D. Hoehn (David); P.J. Hoekstra (Pieter); M. Hollinshead (Marisa); A.J. Holmes (Avram); G. Homuth (Georg); M. Hoogman (Martine); L.E. Hong (L.Elliot); N. Hosten (Norbert); J.J. Hottenga (Jouke Jan); H.E. Hulshoff Pol (Hilleke); K.S. Hwang (Kristy); C.R. Jack Jr. (Clifford); S. Jenkinson (Sarah); C. Johnston; E.G. Jönsson (Erik); R.S. Kahn (René); D. Kasperaviciute (Dalia); S. Kelly (Steve); S. Kim (Shinseog); P. Kochunov (Peter); L. Koenders (Laura); B. Krämer (Bernd); J.B.J. Kwok (John); J. Lagopoulos (Jim); G. Laje (Gonzalo); M. Landén (Mikael); B.A. Landman (Bennett); J. Lauriello; S. Lawrie (Stephen); P.H. Lee (Phil); S. Le Hellard (Stephanie); H. Lemaître (Herve); C.D. Leonardo (Cassandra); C.-S. Li (Chiang-shan); B. Liberg (Benny); D.C. Liewald (David C.); X. Liu (Xinmin); L.M. Lopez (Lorna); E. Loth (Eva); A. Lourdusamy (Anbarasu); M. Luciano (Michelle); F. MacCiardi (Fabio); M.W.J. Machielsen (Marise); G.M. MacQueen (Glenda); U.F. Malt (Ulrik); R. Mandl (René); D.S. Manoach (Dara); J.-L. Martinot (Jean-Luc); M. Matarin (Mar); R. Mather; M. 
Mattheisen (Manuel); M. Mattingsdal (Morten); A. Meyer-Lindenberg; C. McDonald (Colm); A.M. McIntosh (Andrew); F.J. Mcmahon (Francis J); K.L. Mcmahon (Katie); E. Meisenzahl (Eva); I. Melle (Ingrid); Y. Milaneschi (Yuri); S. Mohnke (Sebastian); G.W. Montgomery (Grant); D.W. Morris (Derek W); E.K. Moses (Eric); B.A. Mueller (Bryon ); S. Muñoz Maniega (Susana); T.W. Mühleisen (Thomas); B. Müller-Myhsok (Bertram); B. Mwangi (Benson); M. Nauck (Matthias); K. Nho (Kwangsik); T.E. Nichols (Thomas); L.G. Nilsson; A.C. Nugent (Allison); L. Nyberg (Lisa); R.L. Olvera (Rene); J. Oosterlaan (Jaap); R.A. Ophoff (Roel); M. Pandolfo (Massimo); M. Papalampropoulou-Tsiridou (Melina); M. Papmeyer (Martina); T. Paus (Tomas); Z. Pausova (Zdenka); G. Pearlson (Godfrey); B.W.J.H. Penninx (Brenda); C.P. Peterson (Charles); A. Pfennig (Andrea); M. Phillips (Mary); G.B. Pike (G Bruce); J.B. Poline (Jean Baptiste); S.G. Potkin (Steven); B. Pütz (Benno); A. Ramasamy (Adaikalavan); J. Rasmussen (Jerod); M. Rietschel (Marcella); M. Rijpkema (Mark); S.L. Risacher (Shannon); J.L. Roffman (Joshua); R. Roiz-Santiañez (Roberto); N. Romanczuk-Seiferth (Nina); E.J. Rose (Emma); N.A. Royle (Natalie); D. Rujescu (Dan); M. Ryten (Mina); P.S. Sachdev (Perminder); A. Salami (Alireza); T.D. Satterthwaite (Theodore); J. Savitz (Jonathan); A.J. Saykin (Andrew); C. Scanlon (Cathy); L. Schmaal (Lianne); H. Schnack (Hugo); N.J. Schork (Nicholas); S.C. Schulz (S.Charles); R. Schür (Remmelt); L.J. Seidman (Larry); L. Shen (Li); L. Shoemaker (Lawrence); A. Simmons (Andrew); S.M. Sisodiya (Sanjay); C. Smith (Colin); J.W. Smoller; J.C. Soares (Jair); S.R. Sponheim (Scott); R. Sprooten (Roy); J.M. Starr (John); V.M. Steen (Vidar); S. Strakowski (Stephen); V.M. Strike (Vanessa); J. Sussmann (Jessika); P.G. Sämann (Philipp); A. Teumer (Alexander); A.W. Toga (Arthur); D. Tordesillas-Gutierrez (Diana); D. Trabzuni (Danyah); S. Trost (Sarah); J. Turner (Jessica); M. van den Heuvel (Martijn); N.J. van der Wee (Nic); K.R. van Eijk (Kristel); T.G.M. van Erp (Theo G.); N.E.M. van Haren (Neeltje E.); D. van 't Ent (Dennis); M.J.D. van Tol (Marie-José); M.C. Valdés Hernández (Maria); D.J. Veltman (Dick); A. Versace (Amelia); H. Völzke (Henry); R. Walker (Robert); H.J. Walter (Henrik); L. Wang (Lei); J.M. Wardlaw (J.); M.E. Weale (Michael); M.W. Weiner (Michael); W. Wen (Wei); L.T. Westlye (Lars); H.C. Whalley (Heather); C.D. Whelan (Christopher); T.J.H. White (Tonya); A.M. Winkler (Anderson); K. Wittfeld (Katharina); G. Woldehawariat (Girma); A. Björnsson (Asgeir); D. Zilles (David); M.P. Zwiers (Marcel); A. Thalamuthu (Anbupalam); J.R. Almeida (Jorge); C.J. Schofield (Christopher); N.B. Freimer (Nelson); N.S. Lawrence (Natalia); D.A. Drevets (Douglas)

    2014-01-01

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in …

  8. Robust scene stitching in large scale mobile mapping

    OpenAIRE

    Schouwenaars, Filip; Timofte, Radu; Van Gool, Luc

    2013-01-01

    Schouwenaars F., Timofte R., Van Gool L., ''Robust scene stitching in large scale mobile mapping'', 24th British machine vision conference - BMVC 2013, 11 pp., September 9-13, 2013, Bristol, United Kingdom.

  9. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984, in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  10. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamos. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.
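
    For context (standard MHD background, not part of the record itself): the magnetic field in such simulations evolves under the induction equation, and growth of the large-scale field means that the transfer of energy from the velocity field outweighs ohmic dissipation at the largest scales:

        \frac{\partial \mathbf{B}}{\partial t} = \nabla \times (\mathbf{u} \times \mathbf{B}) + \eta \nabla^{2}\mathbf{B},
        \qquad
        \frac{dE_{B}}{dt} = T_{u \to B} - \eta \int_{V} |\nabla \times \mathbf{B}|^{2}\, dV,

    where \eta is the magnetic diffusivity and T_{u \to B} denotes the net transfer rate from kinetic to magnetic energy; the shell-to-shell analysis in the record resolves T_{u \to B} by wavenumber.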

  11. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  12. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  13. Large-scale event extraction from literature with multi-level gene normalization.

    Directory of Open Access Journals (Sweden)

    Sofie Van Landeghem

    Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from …

  14. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  15. Large-scale event extraction from literature with multi-level gene normalization.

    Science.gov (United States)

    Van Landeghem, Sofie; Björne, Jari; Wei, Chih-Hsuan; Hakala, Kai; Pyysalo, Sampo; Ananiadou, Sophia; Kao, Hung-Yu; Lu, Zhiyong; Salakoski, Tapio; Van de Peer, Yves; Ginter, Filip

    2013-01-01

    Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from http…
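
    To make the idea of a normalized event concrete, here is an illustrative sketch of the shape such a record could take (hypothetical field names and example values; this is not the actual EVEX/API schema):

        # One extracted, normalized biomolecular event (illustrative only).
        # The Entrez Gene IDs shown are real identifiers for the two human genes.
        event = {
            "type": "Binding",
            "trigger_word": "interacts",
            "participants": [
                {"mention": "p53",  "normalized": {"symbol": "TP53", "entrez": 7157}},
                {"mention": "MDM2", "normalized": {"symbol": "MDM2", "entrez": 4193}},
            ],
            "confidence": 0.87,  # hypothetical score
        }

    The point of multi-level normalization is that the same mention can be linked at several granularities (canonical symbol, unique gene/protein, gene family), so downstream curation can pick the level it needs.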

  16. Coupled binary embedding for large-scale image retrieval.

    Science.gov (United States)

    Zheng, Liang; Wang, Shengjin; Tian, Qi

    2014-08-01

    Visual matching is a crucial step in image retrieval based on the bag-of-words (BoW) model. In the baseline method, two keypoints are considered as a matching pair if their SIFT descriptors are quantized to the same visual word. However, the SIFT visual word has two limitations. First, it loses most of its discriminative power during quantization. Second, SIFT only describes the local texture feature. Both drawbacks impair the discriminative power of the BoW model and lead to false positive matches. To tackle this problem, this paper proposes to embed multiple binary features at the indexing level. To model correlation between features, a multi-IDF scheme is introduced, through which different binary features are coupled into the inverted file. We show that matching verification methods based on binary features, such as Hamming embedding, can be effectively incorporated in our framework. As an extension, we explore the fusion of a binary color feature into image retrieval. The joint integration of the SIFT visual word and binary features greatly enhances the precision of visual matching, reducing the impact of false positive matches. Our method is evaluated through extensive experiments on four benchmark datasets (Ukbench, Holidays, DupImage, and MIR Flickr 1M). We show that our method significantly improves the baseline approach. In addition, large-scale experiments indicate that the proposed method requires acceptable memory usage and query time compared with other approaches. Further, when the global color feature is integrated, our method yields competitive performance with the state of the art.
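
    As an illustration of the binary-feature verification idea described above, here is a minimal Python sketch (the function names, signature length and threshold are illustrative assumptions, not the paper's implementation): two keypoints quantized to the same visual word are accepted as a match only if their binary signatures also agree to within a small Hamming distance.

        def hamming_distance(sig_a: int, sig_b: int) -> int:
            # Number of differing bits between two binary signatures.
            return bin(sig_a ^ sig_b).count("1")

        def verify_matches(query_feats, index_feats, threshold=24):
            # query_feats / index_feats: lists of (visual_word_id, signature) pairs.
            # In a real system the signatures live in an inverted file keyed by
            # visual word; the quadratic scan below just keeps the sketch short.
            matches = []
            for word_q, sig_q in query_feats:
                for word_i, sig_i in index_feats:
                    if word_q == word_i and hamming_distance(sig_q, sig_i) <= threshold:
                        matches.append((word_q, sig_q, sig_i))
            return matches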

  17. A functional integral inclusion involving Carathéodories

    Directory of Open Access Journals (Sweden)

    Bapurao Dhage

    2003-09-01

    In this paper the existence of extremal solutions of a functional integral inclusion involving Carathéodory is proved under certain monotonicity conditions. Applications are given to some initial and boundary value problems of ordinary differential inclusion for proving the existence of extremal solutions. Our results generalize the results of Dhage [8] under weaker conditions and complement the results of O'Regan [16].

  18. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  19. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses on … to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems.

  20. Achieving Agility and Stability in Large-Scale Software Development

    Science.gov (United States)

    2013-01-16

    Fragmentary record of a 2013 Carnegie Mellon University Software Engineering Institute (SEI) virtual event presentation, 'Achieving Agility and Stability in Large-Scale Software Development', given by a member of the Research, Technology, and System Solutions Program at the SEI engaged in work on large scale agile development; the material includes an audience poll on the software development process currently in use (e.g., agile development using Scrum, XP practices, or test-driven development).

  1. Integrated Transport Planning Framework Involving Combined Utility Regret Approach

    DEFF Research Database (Denmark)

    Wang, Yang; Monzon, Andres; Di Ciommo, Floridea

    2014-01-01

    Sustainable transport planning requires an integrated approach involving strategic planning, impact analysis, and multicriteria evaluation. This study aimed at relaxing the utility-based decision-making assumption by newly embedding anticipated-regret and combined utility regret decision mechanisms in a framework for integrated transport planning. The framework consisted of a two-round Delphi survey, an integrated land use and transport model for Madrid, and multicriteria analysis. Results show that (a) the regret-based ranking has a similar mean but larger variance than the utility-based ranking does, (b) …; …-based multicriteria analyses result in different rankings of policy packages; and (e) the combined utility regret ranking is more informative compared with the utility-based or the regret-based ranking.

  2. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size (such as the number of degrees of freedom), the advent of computer graphics has made possible very realistic representation of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  3. Entirely renewable energy-based electricity supply system (small scale and large scale)

    Energy Technology Data Exchange (ETDEWEB)

    Zahedi, A. [Monash University, Caulfield (Australia). Division of Electrical and Computer Systems Engineering

    1996-09-01

    Our future energy needs will be supplied by a combination of many different sources, ranging from small wind turbines that provide power for a single house to central power stations that feed power into the national grid on a very large scale. Computer control systems will integrate the performance of all these systems to make sure that as much power as possible comes from environmentally friendlier sources. As alternative sources become more widely available, small scale systems meeting local needs may start to replace current large scale central power stations. The author is investigating the feasibility of an entirely renewable energy-based electricity supply system. The developed system has many applications: it can be used as a small scale power system for Remote Area Power Supply (wind energy/battery or solar energy/battery), as well as on a large scale for interconnection with the national grid. (Author)

  4. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  5. Output regulation of large-scale hydraulic networks with minimal steady state power consumption

    NARCIS (Netherlands)

    Jensen, Tom Nørgaard; Wisniewski, Rafał; De Persis, Claudio; Kallesøe, Carsten Skovmose

    2014-01-01

    An industrial case study involving a large-scale hydraulic network is examined. The hydraulic network underlies a district heating system with an arbitrary number of end-users. The problem of output regulation is addressed along with an optimization criterion for the control. The fact that the …

  6. Incremental development of large-scale human-robot teamwork in disaster response environments

    NARCIS (Netherlands)

    Greeff, J. de; Smets, N.J.J.M.; Hindriks, K.; Neerincx, M.A.; Kruijff-Korbayová, I.

    2017-01-01

    We report on the latest large-scale disaster-response exercise conducted by our project, which involves a robotic system with both ground robots (UGVs) and aerial robots (UAVs). In particular, we focus on aspects related to human-robot teaming and the uptake of new technology by end-users.

  7. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    Science.gov (United States)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  8. Parallel Computational Fluid Dynamics 2007 : Implementations and Experiences on Large Scale and Grid Computing

    CERN Document Server

    2009-01-01

    At the 19th Annual Conference on Parallel Computational Fluid Dynamics held in Antalya, Turkey, in May 2007, the most recent developments and implementations of large-scale and grid computing were presented. This book, comprised of the invited and selected papers of this conference, details those advances, which are of particular interest to CFD and CFD-related communities. It also offers the results related to applications of various scientific and engineering problems involving flows and flow-related topics. Intended for CFD researchers and graduate students, this book is a state-of-the-art presentation of the relevant methodology and implementation techniques of large-scale computing.

  9. Large scale test facilities to assure future turbine-generator reliability

    International Nuclear Information System (INIS)

    Quick, S.L.

    1976-01-01

    Turbine-generators have been a major source of nuclear power station outages and, therefore, of loss of revenue to utilities. The cost penalties of outages are increasing. Solution of the complex problems involved in improving the availability of turbines and generators will be aided by new test facilities costing over $10M. The uses of a large-scale turbine test facility, and a companion large-scale generator test facility, both in operation as part of a long-term programme of reliability improvement, are discussed. (author)

  10. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions, and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made.
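
    As standard background on discrete fracture flow (not taken from the record itself), flow in a single fracture is commonly idealized by the parallel-plate 'cubic law', in which the flow rate per unit fracture width scales with the cube of the aperture:

        q = -\frac{b^{3}}{12\,\mu}\, \nabla p,

    where b is the fracture aperture, \mu is the fluid dynamic viscosity and \nabla p is the pressure gradient along the fracture; summing such contributions over a connected fracture network is one way an equivalent large-scale permeability can be estimated.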

  11. Advances in Large-Scale Solar Heating and Long Term Storage in Denmark

    DEFF Research Database (Denmark)

    Heller, Alfred

    2000-01-01

    According to information from the European Large-Scale Solar Heating Network (see http://www.hvac.chalmers.se/cshp/), the area of installed solar collectors for large-scale application in Europe is approximately 8 million m2, corresponding to about 4000 MW thermal power. Of the total 51 plants, 11 are equipped with long-term storage. In Denmark, 7 plants are installed, comprising approx. 18,000 m2 of collector area, with new plants planned. The development of these plants and the involved technologies will be presented in this paper, with a focus on the improvements for Danish Central Solar Heating Plants servicing District Heating and related developments in large-scale thermal storage. Central solar heating today is a mature and economically realistic solution for district heating based on a renewable source. The cost of solar collectors has decreased by nearly ¼ during …

  12. Climate Response to Large-Scale Wind Farms

    Science.gov (United States)

    Malyshev, S. L.; Pacala, S. W.; Keith, D. W.; Denkenberger, D. C.; Roy, S. B.; Shevliakova, E.

    2003-12-01

    Enhanced reliance on wind energy is one of the possible ways to solve the greenhouse gas problem. However, because wind farms modify the interaction between atmosphere and surface, there is a possibility that wind energy might also change the global climate if developed on a scale large enough to make material reductions in greenhouse emissions. To investigate the possible effect of large wind farm arrays on climate, we used a version of the GFDL AGCM with prescribed climatological SST and ice extent. A series of 20-year integrations included a range of spatial coverage and several different parameterizations for wind farms, as well as controls. Presented here are the results of the experiments with the largest wind farm arrays considered, where significant parts of Europe, North America, and China are covered. The total wind farm area in this case is about 10% of the land surface; the resulting dissipation of kinetic energy on wind farms ranges from 12 to 19 TW (compared to current global primary energy consumption of 12 TW). The results of the simulations show similar effects on climate despite differences in parameterization for the wind farms. The global-average climate reaction is negligible, but regional effects are not. The effect of wind farms on annual mean surface air temperature reaches a regional maximum of the order of 1 degree, which is smaller than the 2XCO2 signal. The temperature response is strongest in Eurasia, where it is characterized by cooling in northern mid-latitudes of the entire continent and warming in the south in all seasons except JJA, when the pattern is much weaker. The response of precipitation does not show an obvious large-scale pattern and is likely not statistically significant.

  13. Planning and executing complex large-scale exercises.

    Science.gov (United States)

    McCormick, Lisa C; Hites, Lisle; Wakelee, Jessica F; Rucks, Andrew C; Ginter, Peter M

    2014-01-01

    Increasingly, public health departments are designing and engaging in complex operations-based full-scale exercises to test multiple public health preparedness response functions. The Department of Homeland Security's Homeland Security Exercise and Evaluation Program (HSEEP) supplies benchmark guidelines that provide a framework for both the design and the evaluation of drills and exercises; however, the HSEEP framework does not seem to have been designed to manage the development and evaluation of multiple, operations-based, parallel exercises combined into one complex large-scale event. Lessons learned from the planning of the Mississippi State Department of Health Emergency Support Function-8 involvement in National Level Exercise 2011 were used to develop an expanded exercise planning model that is HSEEP compliant but accounts for increased exercise complexity and is more functional for public health. The Expanded HSEEP (E-HSEEP) model was developed through changes in the HSEEP exercise planning process in the areas of Exercise Plan, Controller/Evaluator Handbook, Evaluation Plan, and After Action Report and Improvement Plan development. The E-HSEEP model was tested and refined during the planning and evaluation of Mississippi's state-level Emergency Support Function-8 exercises in 2012 and 2013. As a result of using the E-HSEEP model, the Mississippi State Department of Health was able to capture strengths, lessons learned, and areas for improvement, and to identify microlevel issues that may have been missed using the traditional HSEEP framework. The South Central Preparedness and Emergency Response Learning Center is working to create an Excel-based E-HSEEP tool that will allow practice partners to build a database to track corrective actions and conduct many different types of analyses and comparisons.

  14. Patient involvement in the safety of care: an integrative review

    Directory of Open Access Journals (Sweden)

    Thaynara de Oliveira Silva

    2016-06-01

    The aim of this integrative review was to survey the strategies adopted by health institutions that involve patients in care as a barrier to prevent incidents. A search was conducted in the MEDLINE, LILACS, CINAHL and PubMed databases using the descriptors 'patient safety', 'iatrogenic', 'medical error' and 'involvement'. The review included studies in full text published between 2003 and March 2016 in English, Spanish or Portuguese. It was found that effective communication and the development of patients' autonomy are the most advocated strategies. The level of evidence of the studies was limited to levels four and six. The assessment or description of institutional practices involving patients in their safety emerged as a gap in scientific knowledge. The impact of this review is to demonstrate the need for randomized studies to identify effective interventions, directing health institutions towards change in the organizational culture, focusing on safety and patient-centered care.

  15. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  16. Acoustic Studies of the Large Scale Ocean Circulation

    Science.gov (United States)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.

  17. Privacy Preserving Large-Scale Rating Data Publishing

    Directory of Open Access Journals (Sweden)

    Xiaoxun Sun

    2013-02-01

    Large scale rating data usually contains ratings of both sensitive and non-sensitive issues, and the ratings of sensitive issues are part of personal privacy. Even when survey participants do not reveal any of their ratings, their survey records are potentially identifiable by using information from other public sources. In order to protect the privacy in large-scale rating data, it is important to propose new privacy principles which consider the properties of the rating data. Moreover, given the privacy principle, how to efficiently determine whether the rating data satisfies the required privacy principle is crucial as well. Furthermore, if the privacy principle is not satisfied, an efficient method is needed to securely publish the large-scale rating data. In this paper, all these problems are addressed.
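
    The paper develops privacy principles specific to rating data; as a generic illustration of what 'efficiently determine whether the rating data satisfies the required privacy principle' can look like, here is a minimal k-anonymity-style check in Python (illustrative names; plain k-anonymity is a stand-in for the paper's own principles):

        from collections import Counter

        def is_k_anonymous(records, quasi_identifier_cols, k):
            # records: list of dicts mapping column name -> rating value.
            # Every combination of non-sensitive (quasi-identifier) ratings
            # must be shared by at least k records, or some record is linkable.
            groups = Counter(
                tuple(r[c] for c in quasi_identifier_cols) for r in records
            )
            return all(count >= k for count in groups.values())

    A publisher would run such a check before release and, if it fails, generalize or suppress values until it passes.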

  18. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises to prepare hard rocks for excavation a drilling and blasting method is used. With the approach of mining operations to settlements the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts the scientific staff of Siberian State Industrial University carried out expertise for coal mines and iron ore enterprises. Determination of the magnitude of surface seismic vibrations caused by mass explosions was performed using seismic receivers, an analog-digital converter with recording on a laptop. The registration results of surface seismic vibrations during production of more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth’s surface vibrations are determined. The safety evaluation of seismic effect was carried out according to the permissible value of vibration velocity. For cases with exceedance of permissible values recommendations were developed to reduce the level of seismic impact.

  19. Adaptive Fault-Tolerant Control of Uncertain Nonlinear Large-Scale Systems With Unknown Dead Zone.

    Science.gov (United States)

    Chen, Mou; Tao, Gang

    2016-08-01

    In this paper, an adaptive neural fault-tolerant control scheme is proposed and analyzed for a class of uncertain nonlinear large-scale systems with unknown dead zone and external disturbances. To tackle the unknown nonlinear interaction functions in the large-scale system, the radial basis function neural network (RBFNN) is employed to approximate them. To further handle the unknown approximation errors and the effects of the unknown dead zone and external disturbances, which are integrated as compounded disturbances, corresponding disturbance observers are developed for their estimation. Based on the outputs of the RBFNN and the disturbance observer, the adaptive neural fault-tolerant control scheme is designed for uncertain nonlinear large-scale systems by using a decentralized backstepping technique. The closed-loop stability of the adaptive control system is rigorously proved via Lyapunov analysis and satisfactory tracking performance is achieved under the integrated effects of unknown dead zone, actuator fault, and unknown external disturbances. Simulation results of a mass-spring-damper system are given to illustrate the effectiveness of the proposed adaptive neural fault-tolerant control scheme for uncertain nonlinear large-scale systems.
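
    To make the RBFNN approximation step concrete, here is a minimal Python sketch (the Gaussian basis, the center grid and the gradient-style update are illustrative assumptions; the paper itself derives Lyapunov-based adaptive laws):

        import numpy as np

        def rbf_features(x, centers, width):
            # Gaussian basis functions phi_i(x) = exp(-||x - c_i||^2 / width^2).
            d2 = np.sum((centers - x) ** 2, axis=1)
            return np.exp(-d2 / width ** 2)

        def f_hat(x, W, centers, width):
            # Linear-in-the-weights approximator f_hat(x) = W^T phi(x), used to
            # model an unknown smooth nonlinearity over a compact set.
            return W @ rbf_features(x, centers, width)

        def update_weights(W, x, tracking_error, centers, width, gamma=0.05):
            # Illustrative gradient-style adaptation of the weight estimate.
            return W + gamma * tracking_error * rbf_features(x, centers, width)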

  20. A mixed-methods study of system-level sustainability of evidence-based practices in 12 large-scale implementation initiatives.

    Science.gov (United States)

    Scudder, Ashley T; Taber-Thomas, Sarah M; Schaffner, Kristen; Pemberton, Joy R; Hunter, Leah; Herschell, Amy D

    2017-12-07

    In recent decades, evidence-based practices (EBPs) have been broadly promoted in community behavioural health systems in the United States of America, yet reported EBP penetration rates remain low. Determining how to systematically sustain EBPs in complex, multi-level service systems has important implications for public health. This study examined factors impacting the sustainability of parent-child interaction therapy (PCIT) in large-scale initiatives in order to identify potential predictors of sustainment. A mixed-methods approach to data collection was used. Qualitative interviews and quantitative surveys examining sustainability processes and outcomes were completed by participants from 12 large-scale initiatives. Sustainment strategies fell into nine categories, including infrastructure, training, marketing, integration and building partnerships. Strategies involving integration of PCIT into existing practices and quality monitoring predicted sustainment, while financing also emerged as a key factor. The reported factors and strategies impacting sustainability varied across initiatives; however, integration into existing practices, monitoring quality and financing appear central to high levels of sustainability of PCIT in community-based systems. More detailed examination of the progression of specific activities related to these strategies may aid in identifying priorities to include in strategic planning of future large-scale initiatives. ClinicalTrials.gov ID NCT02543359; Protocol number PRO12060529.

  1. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    Science.gov (United States)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or are a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus as a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.
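
    The two basic properties cited above can be stated compactly (standard definitions, not quoted from the record):

        H_{m} = \int_{V} \mathbf{A} \cdot \mathbf{B}\, dV,
        \qquad
        \mathbf{B} = \nabla \times \mathbf{A},
        \qquad
        \frac{dH_{m}}{dt} = -2\eta \int_{V} \mathbf{J} \cdot \mathbf{B}\, dV,

    so for small magnetic diffusivity \eta the helicity H_m is nearly conserved even while magnetic energy dissipates, which is why a system with fixed helicity relaxes its helical structure to the largest available scale.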

  2. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and the associated quantification of uncertainty have become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca…

  3. Large-scale liquid scintillation detectors for solar neutrinos

    Energy Technology Data Exchange (ETDEWEB)

    Benziger, Jay B.; Calaprice, Frank P. [Princeton University Princeton, Princeton, NJ (United States)

    2016-04-15

    Large-scale liquid scintillation detectors are capable of providing spectral yields of the low energy solar neutrinos. These detectors require > 100 tons of liquid scintillator with high optical and radiopurity. In this paper requirements for low-energy neutrino detection by liquid scintillation are specified and the procedures to achieve low backgrounds in large-scale liquid scintillation detectors for solar neutrinos are reviewed. The designs, operations and achievements of Borexino, KamLAND and SNO+ in measuring the low-energy solar neutrino fluxes are reviewed. (orig.)

  4. Large Scale Anomalies of the Cosmic Microwave Background with Planck

    DEFF Research Database (Denmark)

    Frejsel, Anne Mette

    This thesis focuses on the large scale anomalies of the Cosmic Microwave Background (CMB) and their possible origins. The investigations consist of two main parts. The first part is on statistical tests of the CMB, and the consistency of both maps and power spectrum. We find that the Planck data … Here we find evidence that the Planck CMB maps contain residual radiation in the loop areas, which can be linked to some of the large scale CMB anomalies: the point-parity asymmetry, the alignment of quadrupole and octupole, and the dipole modulation.

  5. The survey of large-scale query classification

    Science.gov (United States)

    Zhou, Sanduo; Cheng, Kefei; Men, Lijun

    2017-04-01

    In recent years, much research has been done on query classification. The paper reviews recent research on query classification in detail, mainly covering the sources of query logs, the category systems, the feature extraction methods, the classification methods and the evaluation methodology. It then discusses the issues of large-scale query classification and methods for addressing them in combination with big data analysis systems. The review shows that several problems and challenges remain, such as the lack of an authoritative classification system and evaluation methodology, the efficiency of the feature extraction methods, uncertainty about performance on large-scale query logs, and further query classification on big data platforms.

  6. The CLASSgal code for Relativistic Cosmological Large Scale Structure

    CERN Document Server

    Di Dio, Enea; Lesgourgues, Julien; Durrer, Ruth

    2013-01-01

    We present some accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum Cl(z1,z2) and the corresponding correlation function xi(theta,z1,z2) in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
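
    For reference (a standard relation, not quoted from the record), the two quantities the code outputs are Legendre transforms of each other:

        \xi(\theta, z_{1}, z_{2}) = \sum_{\ell} \frac{2\ell + 1}{4\pi}\, C_{\ell}(z_{1}, z_{2})\, P_{\ell}(\cos\theta),

    where P_\ell are Legendre polynomials, so computing either the angular power spectrum C_\ell(z_1, z_2) or the correlation function \xi gives equivalent information at the linear level.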

  7. [Issues of large scale tissue culture of medicinal plant].

    Science.gov (United States)

    Lv, Dong-Mei; Yuan, Yuan; Zhan, Zhi-Lai

    2014-09-01

    In order to increase the yield and quality of medicinal plants and enhance the competitiveness of the medicinal plant industry in our country, this paper analyzed the status, problems and countermeasures of large-scale tissue culture of medicinal plants. Although biotechnology is one of the most efficient and promising means of producing medicinal plants, it still has problems such as the stability of the material, the safety of transgenic medicinal plants and the optimization of culture conditions. Establishing a sound evaluation system according to the characteristics of the medicinal plant is the key measure to assure the sustainable development of large-scale tissue culture of medicinal plants.

  8. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising with high wind power penetration. This paper presents a review of the electricity storage technologies relevant for large power systems. The paper also presents an estimation of the economic feasibility of electricity storage using the west Danish power market area as a case.

  9. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    This paper investigates top flange fatigue damage in large-scale wind turbine generators. It establishes a finite element model of the top flange connection system with the finite element analysis software MSC.Marc/Mentat, analyzes its fatigue strain, implements load simulation of the flange fatigue working condition with Bladed software, acquires the flange fatigue load spectrum with the rain-flow counting method and, finally, carries out fatigue analysis of the top flange with the fatigue analysis software MSC.Fatigue and the Palmgren-Miner linear cumulative damage theory. The result provides a new approach to flange fatigue analysis of large-scale wind turbine generators and possesses practical engineering value.
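
    For reference (the standard statement of the Palmgren-Miner theory named above, not quoted from the record), the linear cumulative damage rule sums fractional fatigue usage over the load spectrum obtained from rain-flow counting:

        D = \sum_{i} \frac{n_{i}}{N_{i}},

    where n_i is the number of counted cycles in stress bin i and N_i is the number of cycles to failure at that stress level; failure is predicted when the damage sum D reaches 1.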

  10. Large-Scale Spray Releases: Initial Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.

    2012-12-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and

  11. Integrating education, training and communication for public involvement in EIA

    International Nuclear Information System (INIS)

    Oprea, Irina; Oprea, Marcel; Guta, Cornelia; Guta, Vasilica

    2003-01-01

    We are moving towards a globalized world, and this involves the integration of every activity and every person. Public involvement in the development process is essential, taking into account that any objective will affect people and that negative feedback could influence the result of the investment. Generally, the public can be influenced by amplification of negatively evaluated consequences, resulting in psychosocial effects leading to illness or anxieties. This problem can be resolved by public access to information provided by experts. A real-time interactive communication system is proposed as an open tool to facilitate decision-making through access to rapid and reliable information. The main task of the system is to collect, process, display and exchange information relative to environmental impact assessment (EIA), to provide assistance and to receive specific opinions, and it is also proposed for public understanding of the field. The integration of education and training will mitigate the barriers which may inhibit the interaction and communication process. Increased learning will assure specialist-public interaction and a good information flow for knowledge exchange. The paper will outline key approaches to reaching agreement on the importance of the public educational process. The impact of development will be made available to the public, revealing positive consequences such as increased employment and income. An effective way to avoid negative reactions consists of extensive consultation to identify the concerns and needs of the public, and access to suggestive and attractive programs for education and training. The system is developed as a modern information module, integrated into complex international management systems. It can be placed everywhere, and everybody can access the facilities for education, world experience and training. Providing a real-time response to citizen concerns, the system represents an economical and rapid way to mitigate the …

  12. Cost Overruns in Large-scale Transportation Infrastructure Projects

    DEFF Research Database (Denmark)

    Cantarelli, Chantal C; Flyvbjerg, Bent; Molin, Eric J. E

    2010-01-01

    Managing large-scale transportation infrastructure projects is difficult due to frequent misinformation about the costs, which results in large cost overruns that often threaten overall project viability. This paper investigates the explanations for cost overruns that are given in the literature.

  13. Fractals and the Large-Scale Structure in the Universe

    Indian Academy of Sciences (India)

    Fractals and the Large-Scale Structure in the Universe - Introduction and Basic Concepts. A K Mittal, T R Seshadri. General Article, Resonance – Journal of Science Education, Volume 7, Issue 2, February 2002, pp 6-19.

  14. New Visions for Large Scale Networks: Research and Applications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This paper documents the findings of the March 12-14, 2001 Workshop on New Visions for Large-Scale Networks: Research and Applications. The workshop's objectives were...

  15. Chain Analysis for large-scale Communication systems

    NARCIS (Netherlands)

    Grijpink, Jan

    2010-01-01

    The chain concept is introduced to explain how large-scale information infrastructures so often fail and sometimes even backfire. Next, the assessment framework of the doctrine of Chain-computerisation and its chain analysis procedure are outlined. In this procedure chain description precedes

  16. Water Implications of Large-Scale Land Acquisitions in Ghana

    Directory of Open Access Journals (Sweden)

    Timothy Olalekan Williams

    2012-06-01

    The paper offers recommendations which can help the government to achieve its stated objective of developing a "policy framework and guidelines for large-scale land acquisitions by both local and foreign investors for biofuels that will protect the interests of investors and the welfare of Ghanaian farmers and landowners".

  17. Origin of large-scale cell structure in the universe

    International Nuclear Information System (INIS)

    Zel'dovich, Y.B.

    1982-01-01

    A qualitative explanation is offered for the characteristic global structure of the universe, wherein "black" regions devoid of galaxies are surrounded on all sides by closed, comparatively thin, "bright" layers populated by galaxies. The interpretation rests on some very general arguments regarding the growth of large-scale perturbations in a cold gas

  18. Large-Scale Machine Learning for Classification and Search

    Science.gov (United States)

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  19. Fractals and the Large-Scale Structure in the Universe

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 7; Issue 4. Fractals and the Large-Scale Structure in the Universe - Is the Cosmological Principle Valid? A K Mittal T R Seshadri. General Article Volume 7 Issue 4 April 2002 pp 39-47 ...

  20. Evaluating Large-scale National Public Management Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Morten Balle

    This article explores differences and similarities between two evaluations of large-scale administrative reforms which were carried out in the 2000s: The evaluation of the Norwegian NAV reform (EVANAV) and the evaluation of the Danish Local Government Reform (LGR). We provide a comparative analysis...

  1. Invertebrates or iron: does large-scale opencast mining impact ...

    African Journals Online (AJOL)

    The results were, however, confounded by the fact that the resting eggs of pan inhabitants could remain dormant in the sediment for decades, suggesting that ... Similarly, the preservation of conservation areas and a landscape-wide management system were proposed to ensure that large-scale ecological processes are not ...

  2. Reconsidering Replication: New Perspectives on Large-Scale School Improvement

    Science.gov (United States)

    Peurach, Donald J.; Glazer, Joshua L.

    2012-01-01

    The purpose of this analysis is to reconsider organizational replication as a strategy for large-scale school improvement: a strategy that features a "hub" organization collaborating with "outlet" schools to enact school-wide designs for improvement. To do so, we synthesize a leading line of research on commercial replication to construct a…

  3. The large scale microwave background anisotropy in decaying particle cosmology

    International Nuclear Information System (INIS)

    Panek, M.

    1987-06-01

    We investigate the large-scale anisotropy of the microwave background radiation in cosmological models with decaying particles. The observed value of the quadrupole moment combined with other constraints gives an upper limit on the redshift of the decay z_d < 3-5. 12 refs., 2 figs

  4. Large Scale Magnetic Fields: Density Power Spectrum in Redshift ...

    Indian Academy of Sciences (India)

    ...magnetic fields to have significant impact on the large-scale structure at present. Magnetic fields of a more recent ... are produced at the time of inflation in the very early universe. Larger surveys like the on-going ... fields and their impact on the redshift-space power spectrum and give our main results. In section 4 we summarize our ...

  5. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein-coding sequences require a number of inter-related preparatory steps, from finding gene families, to generating alignments and phylogenetic trees, to assessing selective pressure variation. Each phase of these analyses can present significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML, in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  6. Dual Decomposition for Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Vandenberghe, Lieven

    2013-01-01

    Dual decomposition is applied to power balancing of flexible thermal storage units. The centralized large-scale problem is decomposed into smaller subproblems and solved locally by each unit in the Smart Grid. Convergence is achieved by coordinating the units' consumption through a negotiation procedure with the dual variables.
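
    The negotiation the abstract refers to can be illustrated with a toy dual-ascent loop. The sketch below is a minimal illustration of dual decomposition, not the paper's actual formulation: the quadratic unit costs, the step size and the power target are all hypothetical, and each local subproblem is solved in closed form.

    import numpy as np

    # Dual-decomposition sketch: N flexible units must jointly meet a power
    # target. Each unit minimizes a local quadratic cost plus the price term
    # lam * p_i; a coordinator adjusts the price (the dual variable) until
    # total consumption matches the target.
    N = 5                           # number of flexible units (illustrative)
    a = np.linspace(1.0, 3.0, N)    # cost curvature: cost_i(p) = 0.5 * a_i * p**2
    target = 10.0                   # total power to balance
    lam, step = 0.0, 0.2            # price (dual variable) and ascent step size

    for _ in range(200):
        p = lam / a                 # unit i solves min_p 0.5*a_i*p**2 - lam*p locally
        imbalance = target - p.sum()
        lam += step * imbalance     # negotiation: price update by dual ascent
        if abs(imbalance) < 1e-6:
            break

    print("price:", lam, "allocations:", p, "total:", p.sum())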

  7. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Recent increases in commodity prices have led some governments and private investors to purchase or lease large tracts of land in foreign countries for producing their own food and biofuel. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  8. Temporal Variation of Large Scale Flows in the Solar Interior ...

    Indian Academy of Sciences (India)

    [Figure 2: Zonal and meridional components of the time-dependent residual velocity at a few selected depths, plotted as contours of constant velocity in the longitude-latitude plane; the left panels show the zonal component.]

  9. Description of a Large-Scale Micro-Teaching Program.

    Science.gov (United States)

    Webb, Clark; And Others

    This report describes the implementation of a large-scale program at Brigham Young University to provide for at least one microteaching experience for each of 730 students enrolled in a beginning education course. A definition of microteaching (the creation of a miniature teaching situation under controlled conditions) and the elements which make…

  10. Small and large scale genomic DNA isolation protocol for chickpea ...

    African Journals Online (AJOL)

    Small and large scale genomic DNA isolation protocol for chickpea (Cicer arietinum L.), suitable for molecular marker and transgenic analyses. ... Chickpea is an important food legume crop with high nutritional value. The lack of an appropriate DNA isolation protocol is a limiting factor for any molecular studies of this crop.

  11. The Large-Scale Structure of Scientific Method

    Science.gov (United States)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  12. The interaction of large scale and mesoscale environment leading to ...

    Indian Academy of Sciences (India)

    The interaction of large scale and mesoscale environment leading to formation of intense thunderstorms over Kolkata. Part I: Doppler radar and satellite observations. P Mukhopadhyay, M Mahakur and H A K Singh. Journal of Earth System Science, Volume 118, Issue 5, October 2009, pp ...

  13. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The objective of this project is to test whether the Food and Agriculture Organization's Voluntary Guidelines on the Responsible Governance of Tenure of Land, Fisheries and Forests in the Context of National Food Security can help increase accountability for large-scale land acquisitions in Mali, Nigeria, Uganda, and South ...

  14. Participatory Design and the Challenges of Large-Scale Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    With its 10th biannual anniversary conference, Participatory Design (PD) is leaving its teens and must now be considered ready to join the adult world. In this article we encourage the PD community to think big: PD should engage in large-scale information-systems development and opt for a PD...

  15. Temporal Variation of Large Scale Flows in the Solar Interior ...

    Indian Academy of Sciences (India)

    We attempt to detect short-term temporal variations in the rotation rate and other large-scale velocity fields in the outer part of the solar convection zone using the ring diagram technique applied to Michelson Doppler Imager (MDI) data. The measured velocity field shows variations of about 10 m/s on the scale of ...

  16. A Large-Scale Earth and Ocean Phenomenon

    Indian Academy of Sciences (India)

    Tsunamis - A Large-Scale Earth and Ocean Phenomenon. Satish R Shetye. General Article, Resonance – Journal of Science Education, Volume 10, Issue 2, February 2005, pp. 8-19.

  17. Large-Scale Innovation and Change in UK Higher Education

    Science.gov (United States)

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  18. Factors Influencing Uptake of a Large Scale Curriculum Innovation.

    Science.gov (United States)

    Adey, Philip S.

    Educational research has all too often failed to be implemented on a large-scale basis. This paper describes the multiplier effect of a professional development program for teachers and for trainers in the United Kingdom, and how that program was developed, monitored, and evaluated. Cognitive Acceleration through Science Education (CASE) is a…

  19. Symmetry in stochasticity: Random walk models of large-scale ...

    Indian Academy of Sciences (India)

    This paper describes the insights gained from the excursion set approach, in which various questions about the phenomenology of large-scale structure formation can be mapped to problems associated with the first crossing distribution of appropriately defined barriers by random walks. Much of this is summarized in R K ...

  20. Solving large scale crew scheduling problems by using iterative partitioning

    NARCIS (Netherlands)

    E.J.W. Abbink (Erwin)

    2008-01-01

    This paper deals with large-scale crew scheduling problems arising at the Dutch railway operator, Netherlands Railways (NS). NS operates about 30,000 trains a week. All these trains need a driver and a certain number of conductors. No available crew scheduling algorithm can solve such

  1. Solving Large Scale Crew Scheduling Problems in Practice

    NARCIS (Netherlands)

    E.J.W. Abbink (Erwin); L. Albino; T.A.B. Dollevoet (Twan); D. Huisman (Dennis); J. Roussado; R.L. Saldanha

    2010-01-01

    This paper deals with large-scale crew scheduling problems arising at the Dutch railway operator, Netherlands Railways (NS). NS operates about 30,000 trains a week. All these trains need a driver and a certain number of guards. Some labor rules restrict the duties of a certain crew base

  2. The Large Scale Magnetic Field and Sunspot Cycles

    Indian Academy of Sciences (India)

    J. Astrophys. Astr. (2000) 21, 161-162. The Large Scale Magnetic Field and Sunspot Cycles. V. I. Makarov & A. G. Tlatov, Kislovodsk Solar Station of the Pulkovo Observatory, Kislovodsk 357700, P.O. Box 145, Russia. E-mail: makarov@gao.spb.ru. Key words: Sun: magnetic field, sunspots, solar cycle. Extended abstract.

  3. Large-Scale Networked Virtual Environments: Architecture and Applications

    Science.gov (United States)

    Lamotte, Wim; Quax, Peter; Flerackers, Eddy

    2008-01-01

    Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…

  4. Information Tailoring Enhancements for Large-Scale Social Data

    Science.gov (United States)

    2016-09-26

    ...improved usability and navigation, (iii) improved the computational framework of Scraawl, (iv) enhanced Named Entity Recognition (NER), and (v) ... Keywords: information tailoring, large-scale analysis, OSINT. Reported tasks include upgrading the Scraawl computational framework to increase robustness.

  5. Large-Scale Assessments and Educational Policies in Italy

    Science.gov (United States)

    Damiani, Valeria

    2016-01-01

    Despite Italy's extensive participation in most large-scale assessments, their actual influence on Italian educational policies is less easy to identify. The present contribution aims at highlighting and explaining reasons for the weak and often inconsistent relationship between international surveys and policy-making processes in Italy.…

  6. Large Scale Magnetic Fields: Density Power Spectrum in Redshift ...

    Indian Academy of Sciences (India)

    2016-01-27

    Our analysis shows that if these magnetic fields originated in the early universe, then it is possible to construct models for which the shape of the power spectrum agrees with the large-scale slope of the observed power spectrum. However, requiring compatibility with observed CMBR anisotropies, the ...

  7. The Large Scale Structure: Polarization Aspects R. F. Pizzo

    Indian Academy of Sciences (India)

    The Large Scale Structure: Polarization Aspects. R. F. Pizzo, ASTRON, Postbus 2, 7990 AA Dwingeloo, The Netherlands. E-mail: pizzo@astron.nl. Abstract: Polarized radio emission is detected at various scales in the Universe. In this document, I will briefly review our knowledge on polar-...

  8. Fractals and the Large-Scale Structure in the Universe

    Indian Academy of Sciences (India)

    ...from Harish-Chandra Research Institute, Allahabad. Areas of his interest include cosmic microwave background radiation, large-scale structures in the Universe and the application of fractals to these. A K Mittal and T R Seshadri. During the last decade it has been argued by some investigators that the distribution of galax-...

  9. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the collaboration activities for FY2003 on Large Scale Computer Simulation Research was held on January 15-16, 2004 at the Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  10. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large-scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain many career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  11. Large scale synthesis and characterization of Ni nanoparticles by ...

    Indian Academy of Sciences (India)

    Large scale synthesis and characterization of Ni nanoparticles by solution reduction method. Huazhi Wang, Xinli Kou, Jie Zhang and Jiangong Li. Bulletin of Materials Science, Volume 31, Issue 1, February 2008, pp. 97-100 ...

  12. Planck intermediate results XLII. Large-scale Galactic magnetic fields

    DEFF Research Database (Denmark)

    Adam, R.; Ade, P. A. R.; Alves, M. I. R.

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured...

  13. Large-Scale Systems Control Design via LMI Optimization

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2015-01-01

    Vol. 44, No. 3 (2015), pp. 247-253. ISSN 1392-124X. Keywords: combinatorial linear matrix inequalities, large-scale systems, decentralized control. Subject: Control Systems Theory. Impact factor: 0.633 (2015)

  14. Optimization of FTA technology for large scale plant DNA isolation ...

    African Journals Online (AJOL)

    2006-05-02

    Key words: FTA™, large-scale, DNA sampling, field set-up, marker-assisted selection. ... application. The FTA™ classic card (Whatman Inc., Clifton, NJ) is a Whatman paper that has been impregnated with a patented chemical formulation that lyses cells, ... bands for both normal agarose (data not shown) and

  15. A large-scale industrial CT's data transfer system

    International Nuclear Information System (INIS)

    Chen Xuesong

    2004-01-01

    The large-scale industrial CT generates a large amount of data when it operates. To guarantee reliable real-time transfer of those data, the author designs a scheme based on WLAN technology, and resolves the bottleneck caused by the data-rate limitation by using multi-thread technology. (author)
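
    The multi-thread remedy can be pictured as a bounded buffer drained by several sender threads in parallel. The sketch below is a generic producer-consumer illustration under assumed names (block labels, queue size, thread count), not the author's actual design; the network send is a stand-in.

    import queue
    import threading

    # A bounded queue absorbs the rate mismatch between acquisition and the
    # link; several sender threads drain it concurrently.
    buf = queue.Queue(maxsize=64)
    NUM_SENDERS = 4

    def acquire(n_blocks=100):
        for i in range(n_blocks):
            buf.put(f"block-{i}")        # stand-in for a slice of projection data
        for _ in range(NUM_SENDERS):
            buf.put(None)                # poison pills stop the senders

    def send():
        while True:
            block = buf.get()
            if block is None:
                break
            # stand-in for a WLAN socket send of `block`

    senders = [threading.Thread(target=send) for _ in range(NUM_SENDERS)]
    for t in senders:
        t.start()
    acquire()
    for t in senders:
        t.join()
    print("transfer complete")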

  16. A Large Scale Code Resolution Service Network in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Xiangzhan Yu

    2012-11-01

    In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large-scale application scenarios a code resolution service faces serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services, and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. It is analyzed that integrating SkipNet-OCRS into our resolution service network can meet our proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS.
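
    The uniform namespace invites partitioning resolution records across services by hashing. The sketch below illustrates that general idea with plain consistent hashing; it is not SkipNet-OCRS's routing (SkipNet additionally preserves name locality for administrative control), and the service names and product code are hypothetical.

    import bisect
    import hashlib

    # Consistent-hashing sketch: each resolution service owns the arc of the
    # hash ring that precedes it; a code is resolved by the first service
    # whose hash follows the code's hash.
    def h(key: str) -> int:
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    services = ["ocrs-a.example", "ocrs-b.example", "ocrs-c.example"]
    ring = sorted((h(s), s) for s in services)
    keys = [k for k, _ in ring]

    def resolve(code: str) -> str:
        """Return the service responsible for a product code."""
        i = bisect.bisect(keys, h(code)) % len(ring)
        return ring[i][1]

    print(resolve("urn:product:0614141:12345"))   # hypothetical product code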

  17. Real-Time Large Scale 3d Reconstruction by Fusing Kinect and Imu Data

    Science.gov (United States)

    Huai, J.; Zhang, Y.; Yilmaz, A.

    2015-08-01

    Kinect-style RGB-D cameras have been used to build large-scale dense 3D maps for indoor environments. These maps can serve many purposes such as robot navigation and augmented reality. However, generating dense 3D maps of large-scale environments is still very challenging. In this paper, we present a mapping system for 3D reconstruction that fuses measurements from a Kinect and an inertial measurement unit (IMU) to estimate motion. Our major achievements include: (i) large-scale consistent 3D reconstruction is realized by volume shifting and loop closure; (ii) the coarse-to-fine iterative closest point (ICP) algorithm, the SIFT odometry, and IMU odometry are combined to robustly and precisely estimate pose. In particular, ICP runs routinely to track the Kinect motion. If ICP fails in planar areas, the SIFT odometry provides incremental motion estimates. If both ICP and the SIFT odometry fail, e.g., upon abrupt motion or inadequate features, the incremental motion is estimated by the IMU. Additionally, the IMU also observes the roll and pitch angles, which can reduce long-term drift of the sensor assembly. In experiments on a consumer laptop, our system estimates motion at 8 Hz on average while integrating color images into the local map and saving volumes of meshes concurrently. Moreover, it is immune to tracking failures, and has smaller drift than state-of-the-art systems in large-scale reconstruction.
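
    The tracking logic described above is a fidelity-ordered fallback cascade. The sketch below shows only that control flow: the three estimators are trivial stand-ins that return 4x4 pose increments (or None on failure), not real ICP, SIFT odometry or IMU integration.

    import numpy as np

    def icp_track(frame):       # stand-in: fails in feature-poor planar areas
        return None if frame.get("planar") else np.eye(4)

    def sift_odometry(frame):   # stand-in: fails on abrupt motion or blur
        return None if frame.get("blurred") else np.eye(4)

    def imu_integrate(frame):   # stand-in: always available, but drifts
        return np.eye(4)

    def estimate_increment(frame):
        # Try the estimators in decreasing order of fidelity.
        for estimator in (icp_track, sift_odometry, imu_integrate):
            inc = estimator(frame)
            if inc is not None:
                return inc, estimator.__name__

    pose = np.eye(4)
    for frame in ({"planar": False}, {"planar": True}, {"planar": True, "blurred": True}):
        inc, used = estimate_increment(frame)
        pose = pose @ inc          # accumulate the pose increment
        print("used:", used)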

  18. Botswana water and surface energy balance research program. Part 2: Large scale moisture and passive microwaves

    Science.gov (United States)

    Vandegriend, A. A.; Owe, M.; Chang, A. T. C.

    1992-01-01

    The Botswana water and surface energy balance research program was developed to study and evaluate the integrated use of multispectral satellite remote sensing for monitoring the hydrological status of the Earth's surface. The research program consisted of two major, mutually related components: a surface energy balance modeling component, built around an extensive field campaign; and a passive microwave research component, which consisted of a retrospective study of large-scale moisture conditions and Nimbus scanning multichannel microwave radiometer signatures. The integrated approach of both components is explained in general, and activities performed within the passive microwave research component are summarized. The microwave theory is discussed, taking into account: soil dielectric constant, emissivity, soil roughness effects, vegetation effects, optical depth, single scattering albedo, and wavelength effects. The study site is described. The soil moisture data and its processing are considered. The relation between observed large-scale soil moisture and normalized brightness temperatures is discussed. Vegetation characteristics and inverse modeling of soil emissivity are considered.

  19. The Hamburg large scale geostrophic ocean general circulation model. Cycle 1

    International Nuclear Information System (INIS)

    Maier-Reimer, E.; Mikolajewicz, U.

    1992-02-01

    The rationale for the Large Scale Geostrophic ocean circulation model (LSG-OGCM) is based on the observations that for a large scale ocean circulation model designed for climate studies, the relevant characteristic spatial scales are large compared with the internal Rossby radius throughout most of the ocean, while the characteristic time scales are large compared with the periods of gravity modes and barotropic Rossby wave modes. In the present version of the model, the fast modes have been filtered out by a conventional technique of integrating the full primitive equations, including all terms except the nonlinear advection of momentum, by an implicit time integration method. The free surface is also treated prognostically, without invoking a rigid lid approximation. The numerical scheme is unconditionally stable and has the additional advantage that it can be applied uniformly to the entire globe, including the equatorial and coastal current regions. (orig.)
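
    The filtering of fast modes by implicit integration can be seen on a single stiff mode. The sketch below illustrates the numerical principle only (it is not the LSG code): for y' = -k*y, forward Euler diverges once k*dt exceeds 2, while backward Euler damps the mode for any step size.

    # Forward vs. backward Euler on the fast mode y' = -k*y.
    k, dt, steps = 50.0, 0.1, 40            # k*dt = 5: far beyond the explicit limit
    y_exp = y_imp = 1.0
    for _ in range(steps):
        y_exp = y_exp + dt * (-k * y_exp)   # forward Euler: factor |1 - k*dt| = 4 > 1
        y_imp = y_imp / (1.0 + k * dt)      # backward Euler: factor 1/(1 + k*dt) < 1

    print(f"forward Euler:  {y_exp:.3e} (diverged)")
    print(f"backward Euler: {y_imp:.3e} (fast mode damped)")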

  20. WAMS Based Intelligent Operation and Control of Modern Power System with large Scale Renewable Energy Penetration

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain

    Electricity demand worldwide is growing, driven mainly by growing industrial activities and the widening of access to consumers in the developing world. On the other hand, the limitations of conventional sources of energy generation, coupled with substantial financial and regulatory incentives for alternative energy systems driven by the pressure to reduce carbon emissions, have stimulated a renewal of interest in wind power. The combined effect of growing demand and an increasing level of intermittent wind energy penetration, coupled with a deregulated market, has pushed the power system to operate close to its ... Due to the intermittent nature and lack of adequate controllability of wind generation, large-scale integration of wind energy compromises the security of the power system. Therefore, WAMS-based security assessment has been proposed to assess the steady-state and dynamic security of large-scale wind-integrated power systems ...

  1. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability, and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large scale physics detector ...

  2. Professional formation through personal involvement and value integration.

    Science.gov (United States)

    Haugland, Britt Øvrebø; Lassen, Rasmus M; Giske, Tove

    2018-03-01

    Formation is an important part of nursing education, and it is the responsibility of nurse educators to facilitate learning situations that provide students with opportunities for personal discovery. Studies have shown that awareness of one's own vulnerability can be a source of professional maturation and courageous action. The study setting is a Christian university that emphasises its value base through the perspective of diakonia in the nursing programme. Diakonia is understood as the provision of caring. Two hundred and forty-five pages of reflective journals from 124 third-year students were analysed with qualitative content analysis. The main theme of the study was Professional formation through personal involvement and value integration. Four categories emerged: 1) Diakonia as a guide to professional compassion; 2) Consciousness of one's own values; 3) The urge to act courageously; and 4) Choosing to spend the time available. The article discusses how students can integrate values in their professional lives by using all senses when learning in real-life situations and by using systematic reflection alone and together with others. Professional formation is an ongoing process, and we have found that mandatory participation, reiteration and progression are important conditions for such formation to occur. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    OpenAIRE

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-makin...

  4. Large scale waste combustion projects. A study of financial structures and sensitivities

    International Nuclear Information System (INIS)

    Brandler, A.

    1993-01-01

    The principal objective of the study was to determine the key contractual and financial aspects of large scale energy-from-waste projects, and to provide the necessary background information on financing to appreciate the approach lenders take when they consider financing waste combustion projects. An integral part of the study has been the preparation of a detailed financial model, incorporating all major financing parameters, to assess the economic and financial viability of typical waste combustion projects. (author)

  5. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    Science.gov (United States)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong

  6. Random access in large-scale DNA data storage.

    Science.gov (United States)

    Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin

    2018-03-01

    Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data on a large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data) in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.

  7. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of the tensor product design matrix can be impossible due to time and memory constraints, and previously considered design-matrix-free algorithms do not scale well with the dimension of the parameter vector. A new design-matrix-free algorithm is proposed for computing the penalized maximum likelihood estimate for GLAMs, which, in particular, handles nondifferentiable penalty functions. The proposed algorithm is implemented and available via the R package glamlasso. It combines several ideas...
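
    The "design matrix free" idea rests on the standard Kronecker identity (A ⊗ B) vec(X) = vec(B X A^T) (with column-major vec), which applies a tensor-product design matrix through two small matrix products instead of one enormous one. The NumPy check below verifies the identity on random matrices; it is a generic illustration, not code from glamlasso.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 30))
    B = rng.standard_normal((50, 20))
    X = rng.standard_normal((20, 30))                # coefficients on the array grid

    direct  = np.kron(A, B) @ X.flatten(order="F")   # forms the 2000 x 600 matrix
    matfree = (B @ X @ A.T).flatten(order="F")       # never forms it

    print(np.allclose(direct, matfree))              # True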

  8. Large scale EMF in current sheets induced by tearing modes

    Science.gov (United States)

    Mizerski, Krzysztof A.

    2018-02-01

    An extension of the analysis of resistive instabilities of a sheet pinch from the famous work by Furth et al (1963, Phys. Fluids 6, 459) is presented here, to study the mean electromotive force (EMF) generated by the developing instability. In a Cartesian configuration and in the presence of a current sheet, the boundary layer technique is first used to obtain global, matched asymptotic solutions for the velocity and magnetic field, and then the solutions are used to calculate the large-scale EMF in the system. It is reported that in the bulk the curl of the mean EMF is linear in j_0 · B_0, a simple pseudo-scalar quantity constructed from the large-scale quantities.

  9. Mathematical programming methods for large-scale topology optimization problems

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana

    for the classical minimum compliance problem. Two state-of-the-art optimization algorithms are investigated and implemented for this structural topology optimization problem. A Sequential Quadratic Programming method (TopSQP) and an interior point method (TopIP) are developed, exploiting the specific mathematical structure of the problem. In both solvers, information from the exact Hessian is considered. A robust iterative method is implemented to efficiently solve large-scale linear systems. Both TopSQP and TopIP show successful results in terms of convergence, number of iterations, and objective function values. Thanks to the iterative method implemented, TopIP is able to solve large-scale problems with more than three million degrees of freedom.

  10. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    Science.gov (United States)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

    Destiny is a simple, direct, low-cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65 m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize 23000 SN Ia events over the redshift interval 0.4 < z < 1.7. Destiny will be used in its third year as a high-resolution, wide-field imager to conduct a weak lensing survey covering >1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  11. Large-scale innovation and change in UK higher education

    Directory of Open Access Journals (Sweden)

    Stephen Brown

    2013-09-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ technology to deliver such changes. Key lessons that emerged from these experiences are reviewed covering themes of pervasiveness, unofficial systems, project creep, opposition, pressure to deliver, personnel changes and technology issues. The paper argues that collaborative approaches to project management offer greater prospects of effective large-scale change in universities than either management-driven top-down or more champion-led bottom-up methods. It also argues that while some diminution of control over project outcomes is inherent in this approach, this is outweighed by potential benefits of lasting and widespread adoption of agreed changes.

  12. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.

  13. The Large-scale Effect of Environment on Galactic Conformity

    Science.gov (United States)

    Sun, Shuangpeng; Guo, Qi; Wang, Lan; Wang, Jie; Gao, Liang; Lacey, Cedric G.; Pan, Jun

    2018-04-01

    We use a volume-limited galaxy sample from the SDSS Data Release 7 to explore the dependence of galactic conformity on the large-scale environment, measured on ~4 Mpc scales. We find that the star formation activity of neighbour galaxies depends more strongly on the environment than on the activity of their primary galaxies. In under-dense regions most neighbour galaxies tend to be active, while in over-dense regions neighbour galaxies are mostly passive, regardless of the activity of their primary galaxies. At a given stellar mass, passive primary galaxies reside in higher density regions than active primary galaxies, leading to the apparently strong conformity signal. The dependence of the activity of neighbour galaxies on environment can be explained by the corresponding dependence of the fraction of satellite galaxies. Similar results are found for galaxies in a semi-analytical model, suggesting that no new physics is required to explain the observed large-scale conformity.

  14. Volume measurement study for large scale input accountancy tank

    International Nuclear Information System (INIS)

    Uchikoshi, Seiji; Watanabe, Yuichi; Tsujino, Takeshi

    1999-01-01

    Large Scale Tank Calibration (LASTAC) facility, including an experimental tank which has the same volume and structure as the input accountancy tank of the Rokkasho Reprocessing Plant (RRP), was constructed at the Nuclear Material Control Center of Japan. Demonstration experiments have been carried out to evaluate the precision of solution volume measurement and to establish a procedure for highly accurate pressure measurement in a large-scale tank with a dip-tube bubbler probe system, to be applied to the input accountancy tank of RRP. The solution volume in a tank is determined by substituting the solution level into a calibration function obtained in advance, which expresses the relation between the solution level and the volume in the tank. Precise solution volume measurement therefore needs a precise calibration function that is determined carefully. The LASTAC calibration experiments using pure water showed good reproducibility. (J.P.N.)
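
    The calibration-function step is simple to sketch: fit a low-order curve to (level, volume) pairs recorded during incremental additions, then convert measured levels into volumes. The numbers below are hypothetical, not LASTAC data.

    import numpy as np

    levels  = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 2.5])   # m (hypothetical)
    volumes = np.array([0.8, 2.1, 4.3, 6.4, 8.6, 10.7])  # m^3 (hypothetical)

    coef = np.polyfit(levels, volumes, deg=2)            # calibration function V(h)

    def volume_from_level(h_m):
        # Level measured by the dip-tube bubbler system -> solution volume.
        return float(np.polyval(coef, h_m))

    print(f"V(1.2 m) = {volume_from_level(1.2):.2f} m^3")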

  15. Model for large scale circulation of nuclides in nature, 1

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, Teruaki

    1988-12-01

    A model for the large-scale circulation of nuclides was developed, and a computer code named COCAIN was written to simulate this circulation system-dynamically. The natural environment considered in the present paper consists of 2 atmospheres, 8 geospheres and 2 lithospheres. The biosphere is composed of 4 types of edible plants, 5 types of cattle and their products, 4 water biota and 16 human organs. The biosphere is assumed to receive nuclides from the natural environment mentioned above. Using COCAIN, two numerical case studies were carried out: one on nuclear pollution in nature from the radioactive nuclides originating in past nuclear bomb tests, and the other on the response of the environment and biota to a pulse injection of nuclides into one compartment. The former case study verified that the model can explain the observations well and properly simulate the large-scale circulation of nuclides in nature.
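
    A system-dynamics circulation model of this kind reduces to coupled first-order transfer equations between compartments, dx/dt = K x. The sketch below is a minimal three-compartment illustration with hypothetical rate constants, far smaller than COCAIN's compartment set.

    import numpy as np

    # Transfer-rate matrix K (1/yr, hypothetical); each column sums to zero,
    # so the total inventory is conserved.
    K = np.array([
        [-0.30,  0.00,  0.00],   # atmosphere: loses to the geosphere
        [ 0.30, -0.05,  0.01],   # geosphere: fed by atmosphere, exchanges with biota
        [ 0.00,  0.05, -0.01],   # biota: fed by geosphere, returns to it
    ])

    x = np.array([1.0, 0.0, 0.0])     # pulse injection into the atmosphere
    dt, t_end = 0.01, 50.0
    for _ in range(int(t_end / dt)):
        x = x + dt * (K @ x)          # explicit Euler step

    print("inventories after 50 yr:", x, "total:", x.sum())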

  16. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we proposed the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
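
    The low-rank approximation supplied by the prototypes can be sketched with the standard Nystrom construction, K ≈ K_nm K_mm^{-1} K_mn, so storage scales with the number of prototypes m rather than n. The code below is a generic illustration of that idea, not the PVM implementation; the RBF kernel, uniform sampling and sizes are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((2000, 10))           # n = 2000 points
    m = 100                                       # number of prototypes
    P = X[rng.choice(len(X), m, replace=False)]   # prototype vectors

    def rbf(A, B, gamma=0.1):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    K_nm = rbf(X, P)                              # n x m cross-kernel
    K_mm = rbf(P, P) + 1e-8 * np.eye(m)           # m x m (jittered for stability)

    # Low-rank factor L with K ~ L @ L.T, built without forming the n x n K.
    L = K_nm @ np.linalg.cholesky(np.linalg.inv(K_mm))
    print(L.shape)                                # (2000, 100)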

  17. Performance of Grey Wolf Optimizer on large scale problems

    Science.gov (United States)

    Gupta, Shubham; Deep, Kusum

    2017-01-01

    Numerous nature-inspired optimization techniques have been proposed in the literature for solving nonlinear continuous optimization problems, and they can be applied to real-life problems where conventional techniques cannot. The Grey Wolf Optimizer is one such technique, which has been gaining popularity over the last two years. The objective of this paper is to investigate the performance of the Grey Wolf Optimization algorithm on large-scale optimization problems. The algorithm is implemented on five common scalable problems appearing in the literature, namely the Sphere, Rosenbrock, Rastrigin, Ackley and Griewank functions. The dimensions of these problems are varied from 50 to 1000. The results indicate that the Grey Wolf Optimizer is a powerful nature-inspired optimization algorithm for large-scale problems, except on Rosenbrock, which is a unimodal function.
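
    For reference, the standard Grey Wolf Optimizer update moves each wolf toward a randomized combination of the three best solutions (alpha, beta, delta) while an exploration parameter a decays from 2 to 0. The sketch below applies it to the Sphere function; the population size, iteration count and bounds are illustrative choices, not the paper's settings.

    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):
        return float((x ** 2).sum())

    dim, n_wolves, iters = 50, 30, 500
    lb, ub = -100.0, 100.0
    X = rng.uniform(lb, ub, (n_wolves, dim))

    for t in range(iters):
        fitness = np.array([sphere(x) for x in X])
        alpha, beta, delta = X[np.argsort(fitness)[:3]]
        a = 2.0 * (1.0 - t / iters)          # exploration parameter: 2 -> 0
        for i in range(n_wolves):
            cand = []
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                cand.append(leader - A * np.abs(C * leader - X[i]))
            X[i] = np.clip(np.mean(cand, axis=0), lb, ub)

    print("best value:", min(sphere(x) for x in X))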

  18. Reliability Evaluation considering Structures of a Large Scale Wind Farm

    DEFF Research Database (Denmark)

    Shin, Je-Seok; Cha, Seung-Tae; Wu, Qiuwei

    2012-01-01

    Wind energy is one of the most widely used renewable energy resources. Wind power has been connected to the grid as large-scale wind farms made up of dozens of wind turbines, and the scale of wind farms has increased further recently. Due to the intermittent and variable wind source, reliability evaluation of wind farms is necessarily required. Also, because a large-scale offshore wind farm has a long repair time and a high repair cost as well as a high investment cost, it is essential to take the economic aspect into account. One method to efficiently build and operate a wind farm is to construct a wind farm which is able to enhance the capability of delivering power instead of controlling an uncontrollable output of wind power. Therefore, this paper introduces a method to evaluate reliability depending upon the structure of the wind farm and to reflect the result in the planning stage of the wind farm.

  19. A survey on routing protocols for large-scale wireless sensor networks.

    Science.gov (United States)

    Li, Changle; Zhang, Hanxiao; Hao, Binbin; Li, Jiandong

    2011-01-01

    With the advances in micro-electronics, wireless sensor devices have been made much smaller and more integrated, and large-scale wireless sensor networks (WSNs), based on the cooperation among a significant number of nodes, have become a hot topic. "Large-scale" mainly means either a large area or a high node density. Accordingly, routing protocols must scale well as the network extends in scope and node density increases. A sensor node is normally energy-limited and cannot be recharged, and thus its energy consumption has a significant effect on the scalability of the protocol. To the best of our knowledge, currently the mainstream methods to solve the energy problem in large-scale WSNs are hierarchical routing protocols. In a hierarchical routing protocol, all the nodes are divided into several groups with different assignment levels. The nodes within the high level are responsible for data aggregation and management work, and the low-level nodes for sensing their surroundings and collecting information. Hierarchical routing protocols are proven to be more energy-efficient than flat ones, in which all the nodes play the same role, especially in terms of data aggregation and the flooding of control packets. With a focus on the hierarchical structure, in this paper we provide an insight into routing protocols designed specifically for large-scale WSNs. According to their different objectives, the protocols are generally classified based on different criteria such as control overhead reduction, energy consumption mitigation and energy balance. In order to give a comprehensive understanding of each protocol, we highlight their innovative ideas, describe the underlying principles in detail and analyze their advantages and disadvantages. Moreover, a comparison of the routing protocols is conducted to demonstrate the differences between them in terms of message complexity, memory requirements, localization, data aggregation, clustering manner and
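
    As one concrete instance of the hierarchical idea, the classic LEACH election rule rotates the cluster-head role: in round r, a node that has not yet served in the current epoch becomes a head with probability T = p / (1 - p * (r mod 1/p)), where p is the desired head fraction. The sketch below implements just that election step; the node count and head fraction are illustrative.

    import random

    p = 0.05                     # desired fraction of cluster heads per round
    period = int(1 / p)          # epoch length: every node leads once per epoch
    nodes = [{"id": i, "was_head": False} for i in range(100)]

    def elect_heads(nodes, r):
        if r % period == 0:      # new epoch: everyone becomes eligible again
            for n in nodes:
                n["was_head"] = False
        T = p / (1 - p * (r % period))
        heads = []
        for n in nodes:
            if not n["was_head"] and random.random() < T:
                n["was_head"] = True
                heads.append(n["id"])
        return heads

    for r in range(3):
        print(f"round {r}: cluster heads = {elect_heads(nodes, r)}")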

  20. System Recovery in Large-Scale Distributed Storage Systems

    OpenAIRE

    Aga, Svein

    2008-01-01

    This report aims to describe and improve a system recovery process in large-scale storage systems. Inevitably, a recovery process results in the system being loaded with internal replication of data, and will extensively utilize several storage nodes. Such internal load can be categorized and generalized into a maintenance workload class. Obviously, a storage system will have external clients which also introduce load into the system. This can be users altering their data, uploading new cont...

  1. Foundations of Large-Scale Multimedia Information Management and Retrieval

    CERN Document Server

    Chang, Edward Y

    2011-01-01

    "Foundations of Large-Scale Multimedia Information Management and Retrieval - Mathematics of Perception" covers knowledge representation and semantic analysis of multimedia data and scalability in signal extraction, data mining, and indexing. The book is divided into two parts: Part I - Knowledge Representation and Semantic Analysis focuses on the key components of mathematics of perception as it applies to data management and retrieval. These include feature selection/reduction, knowledge representation, semantic analysis, distance function formulation for measuring similarity, and

  2. Large-Scale Physical Separation of Depleted Uranium from Soil

    Science.gov (United States)

    2012-09-01

    ...unweathered depleted uranium rods illustrating the formation of uranyl oxides and salts. Unfired penetrator rods can range from 10 to 50 cm in length ... specific area ratio (as thin sections, fine particles, or molten states). Uranium in finely divided form is prone to ignition. Uranium also has an ... [ERDC/EL TR-12-25, Army Range Technology Program: Large-Scale Physical Separation of Depleted Uranium from Soil]

  3. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    Selected Large-Scale Projects. Common Name: Orion. Project Update: The President proposed cancellation of the Constellation Program, including the Orion ... fiscal year 2010. NASA remains poised to leverage Constellation assets to contribute to future exploration beyond low-Earth orbit. Orion Crew... Assessed projects include the Orbiting Carbon Observatory 2 (OCO-2), the Orion Crew Exploration Vehicle, the Radiation Belt Storm Probes (RBSP), and Soil Moisture Active and Passive (SMAP).

  4. Large scale 2D spectral compressed sensing in continuous domain

    KAUST Repository

    Cai, Jian-Feng

    2017-06-20

    We consider the problem of spectral compressed sensing in continuous domain, which aims to recover a 2-dimensional spectrally sparse signal from partially observed time samples. The signal is assumed to be a superposition of s complex sinusoids. We propose a semidefinite program for the 2D signal recovery problem. Our model is able to handle large scale 2D signals of size 500 × 500, whereas traditional approaches only handle signals of size around 20 × 20.

  5. How Large-Scale Research Facilities Connect to Global Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn

    2013-01-01

    Policies for large-scale research facilities (LSRFs) often highlight their spillovers to industrial innovation and their contribution to the external connectivity of the regional innovation system hosting them. Arguably, the particular institutional features of LSRFs are conducive for collaborati...... with domestic universities or government laboratories. Policies conceiving LSRFs as “knowledge attractors” therefore should consider the complementarities between research at a LSRF and in its academic context at a regional or national level....

  6. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    Science.gov (United States)

    1985-10-07

    Massachusetts Institute of Technology, Artificial Intelligence Laboratory, 545 Technology Square, Cambridge.

  7. Large-scale prediction of drug-target relationships

    DEFF Research Database (Denmark)

    Kuhn, Michael; Campillos, Mónica; González, Paula

    2008-01-01

    ..., but also provides a more global view on drug-target relations. Here we review recent attempts to apply large-scale computational analyses to predict novel interactions of drugs and targets from molecular and cellular features. In this context, we quantify the family-dependent probability of two proteins binding the same ligand as a function of their sequence similarity. We finally discuss how phenotypic data could help to expand our understanding of the complex mechanisms of drug action.

  8. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
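
    The "reduce then sample" strategy can be caricatured in a few lines: run a Metropolis chain whose likelihood evaluations call a cheap reduced model in place of the full forward solve. Everything in the sketch below (the two models, the datum, the priors) is a toy stand-in for the project's large-scale inverse problems.

    import numpy as np

    rng = np.random.default_rng(2)

    def full_model(theta):        # expensive forward map (stand-in, shown for contrast)
        return np.sin(theta) + 0.1 * theta ** 3

    def reduced_model(theta):     # cheap surrogate, accurate near theta = 0
        return theta + 0.1 * theta ** 3

    data, sigma = 0.45, 0.1

    def log_post(theta, model):   # Gaussian likelihood + standard-normal prior
        return -0.5 * ((model(theta) - data) / sigma) ** 2 - 0.5 * theta ** 2

    theta, samples = 0.0, []
    for _ in range(20000):
        prop = theta + 0.3 * rng.standard_normal()
        # Accept/reject with the *reduced* model: no full solve per step.
        if np.log(rng.random()) < log_post(prop, reduced_model) - log_post(theta, reduced_model):
            theta = prop
        samples.append(theta)

    print("posterior mean (surrogate chain):", np.mean(samples[2000:]))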

  9. Experimental simulation of microinteractions in large scale explosions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, X.; Luo, R.; Yuen, W.W.; Theofanous, T.G. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    This paper presents data and analysis of recent experiments conducted in the SIGMA-2000 facility to simulate microinteractions in large scale explosions. Specifically, the fragmentation behavior of a high temperature molten steel drop under high pressure (beyond critical) conditions is investigated. The current data demonstrate, for the first time, the effect of high pressure in suppressing the thermal effect of fragmentation under supercritical conditions. The results support the microinteractions idea, and the ESPROSE.m prediction of fragmentation rate. (author)

  10. Primordial large-scale electromagnetic fields from gravitoelectromagnetic inflation

    International Nuclear Information System (INIS)

    Membiela, Federico Agustin; Bellini, Mauricio

    2009-01-01

    We investigate the origin and evolution of primordial electric and magnetic fields in the early universe, when the expansion is governed by a cosmological constant Λ₀. Using the gravitoelectromagnetic inflationary formalism with A₀ = 0, we obtain the power spectra of large-scale magnetic fields and of the inflaton field fluctuations during inflation. A very important fact is that our formalism is naturally non-conformally invariant.

  11. Partitioning Large Scale Deep Belief Networks Using Dropout

    OpenAIRE

    Huang, Yanping; Zhang, Sai

    2015-01-01

    Deep learning methods have shown great promise in many practical applications, ranging from speech recognition, visual object recognition, to text processing. However, most of the current deep learning methods suffer from scalability problems for large-scale applications, forcing researchers or users to focus on small-scale problems with fewer parameters. In this paper, we consider a well-known machine learning model, deep belief networks (DBNs) that have yielded impressive classification per...

  12. Exploring the technical challenges of large-scale lifelogging

    OpenAIRE

    Gurrin, Cathal; Smeaton, Alan F.; Qiu, Zhengwei; Doherty, Aiden R.

    2013-01-01

    Ambiently and automatically maintaining a lifelog is an activity that may help individuals track their lifestyle, learning, health and productivity. In this paper we motivate and discuss the technical challenges of developing real-world lifelogging solutions, based on seven years of experience. The gathering, organisation, retrieval and presentation challenges of large-scale lifelogging are discussed and we show how this can be achieved and the benefits that may accrue.

  13. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
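
    For readers unfamiliar with the nondimensionalization mentioned above, the sketch below computes the standard nondimensional heat release rate Q* and a flame-height estimate from the widely used open-literature Heskestad correlation. Both the correlation and the input numbers are illustrative textbook forms, not the fits derived from the Phoenix tests.

    ```python
    # Nondimensional heat release rate Q* and a textbook flame-height estimate.
    # The Heskestad correlation below is the standard open-literature form, not
    # the Phoenix-derived fit; the heat release rate is an assumed placeholder.
    import math

    def q_star(Q_dot_W, D_m, rho=1.2, cp=1005.0, T_inf=293.0, g=9.81):
        """Q* = Q / (rho * cp * T_inf * sqrt(g) * D^(5/2))."""
        return Q_dot_W / (rho * cp * T_inf * math.sqrt(g) * D_m**2.5)

    def heskestad_flame_height(Q_dot_W, D_m):
        """L/D = 3.7 * Q*^(2/5) - 1.02 (Heskestad correlation)."""
        return D_m * (3.7 * q_star(Q_dot_W, D_m) ** 0.4 - 1.02)

    D = 21.0    # pool diameter, m (one of the Phoenix test sizes)
    Q = 2.0e9   # assumed total heat release rate, W (illustrative only)
    print(f"Q* = {q_star(Q, D):.3f}, flame height ~ {heskestad_flame_height(Q, D):.1f} m")
    ```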

  14. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  15. Accuracy control in ultra-large-scale electronic structure calculation

    OpenAIRE

    Hoshi, Takeo

    2007-01-01

    Numerical aspects are investigated in ultra-large-scale electronic structure calculation. Accuracy control methods in process (molecular-dynamics) calculation are focused. Flexible control methods are proposed so as to control variational freedoms, automatically at each time step, within the framework of generalized Wannier state theory. The method is demonstrated in silicon cleavage simulation with 10^2-10^5 atoms. The idea is of general importance among process calculations and is also used...

  16. Large scale particle image velocimetry with helium filled soap bubbles

    Science.gov (United States)

    Bosbach, Johannes; Kühn, Matthias; Wagner, Claus

    2009-03-01

    The application of Particle Image Velocimetry (PIV) to the measurement of flows on large scales is challenging but necessary, especially for the investigation of convective air flows. Combining helium-filled soap bubbles as tracer particles with high-power Q-switched solid state lasers as light sources allows conducting PIV on scales of the order of several square meters. The technique was applied to mixed convection in a full scale double aisle aircraft cabin mock-up for the validation of Computational Fluid Dynamics simulations.
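
    The core operation in any PIV processing chain, large scale or not, is locating the displacement peak of the cross-correlation between two interrogation windows. The following sketch demonstrates the FFT-based form on a synthetic shift; it is a minimal illustration, not the authors' processing code, and real systems add sub-pixel peak fitting and outlier validation.

    ```python
    # Core of PIV processing: find the displacement peak of the circular
    # cross-correlation between two interrogation windows, computed via FFT.
    import numpy as np

    rng = np.random.default_rng(1)
    w1 = rng.random((64, 64))                        # interrogation window at time t
    w2 = np.roll(w1, shift=(3, -5), axis=(0, 1))     # same tracers displaced by (3, -5)

    # Circular cross-correlation: the peak index equals the displacement.
    corr = np.fft.ifft2(np.fft.fft2(w2) * np.conj(np.fft.fft2(w1))).real
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)
    # Map indices in [0, N) to signed displacements in [-N/2, N/2).
    dy = iy - corr.shape[0] if iy > corr.shape[0] // 2 else iy
    dx = ix - corr.shape[1] if ix > corr.shape[1] // 2 else ix
    print(dy, dx)   # -> 3 -5
    ```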

  17. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    Science.gov (United States)

    2015-09-30

    The goal of this research is to develop and implement a new method for estimating blue and fin whale density that is effective over large scales: a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse sensors.

  18. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutronic sensor in three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description of it are given, together with the results obtained so far in this domain. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale switching is described and the test results are given. In this large scale channel, the data processing method is analogue. - To become independent of the problems generated by analogue processing of the fluctuation signal, a digital data processing method is tested and its validity is confirmed. The results obtained on a test system built according to this method are given and a preliminary plan for further research is defined [fr

  19. Primordial quantum nonequilibrium and large-scale cosmic anomalies

    Science.gov (United States)

    Colin, Samuel; Valentini, Antony

    2015-08-01

    We study incomplete relaxation to quantum equilibrium at long wavelengths, during a preinflationary phase, as a possible explanation for the reported large-scale anomalies in the cosmic microwave background. Our scenario makes use of the de Broglie-Bohm pilot-wave formulation of quantum theory, in which the Born probability rule has a dynamical origin. The large-scale power deficit could arise from incomplete relaxation for the amplitudes of the primordial perturbations. We show, by numerical simulations for a spectator scalar field, that if the preinflationary era is radiation dominated then the deficit in the emerging power spectrum will have a characteristic shape (an inverse-tangent dependence on wave number k, with oscillations). It is found that our scenario is able to produce a power deficit in the observed region and of the observed (approximate) magnitude for an appropriate choice of cosmological parameters. We also discuss the large-scale anisotropy, which might arise from incomplete relaxation for the phases of the primordial perturbations. We present numerical simulations for phase relaxation, and we show how to define characteristic scales for amplitude and phase nonequilibrium. The extent to which the data might support our scenario is left as a question for future work. Our results suggest that we have a potentially viable model that might explain two apparently independent cosmic anomalies by means of a single mechanism.

  20. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show this, we adapted an auditory paradigm developed by Sussman, Ritter, and Vaughan (Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, 1998) and Sussman and Gumenyuk (Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523, 2005) to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the human visual sensory system, revealed that visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition, supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity.
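
    The two sequence types are easy to reproduce. The sketch below generates the fixed (SSSSD) and randomized stimulus orders at the same 20% deviant rate; it is only an illustration of the paradigm's stimulus logic, not the authors' presentation software.

    ```python
    # Generate the two stimulus orders described above: deviants (D) at every
    # fifth position versus randomly interleaved at the same 20% rate.
    import random

    def fixed_sequence(n_trials):
        # Deviant at every 5th position: SSSSD SSSSD ...
        return ["D" if (i + 1) % 5 == 0 else "S" for i in range(n_trials)]

    def randomized_sequence(n_trials, p_deviant=0.2, seed=42):
        rng = random.Random(seed)
        return ["D" if rng.random() < p_deviant else "S" for _ in range(n_trials)]

    print("".join(fixed_sequence(20)))       # SSSSDSSSSDSSSSDSSSSD
    print("".join(randomized_sequence(20)))  # ~20% D, randomly placed
    ```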

  1. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
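
    The optimization at the heart of such a tool reduces, in its simplest form, to a weighted multi-criteria score over co-registered rasters. The sketch below illustrates that reduction with synthetic rasters and made-up weights; the actual criteria, weighting scheme, and algorithm of the NREL tool are not reproduced here.

    ```python
    # Weighted multi-criteria siting in miniature: normalize criterion rasters,
    # combine with user weights, mask exclusions, rank cells. All data synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    shape = (100, 100)
    solar = rng.random(shape)            # e.g. annual irradiance (higher is better)
    slope = rng.random(shape)            # terrain slope (lower is better)
    grid_dist = rng.random(shape)        # distance to transmission (lower is better)
    protected = rng.random(shape) < 0.1  # exclusion mask (e.g. protected areas)

    def normalize(a):
        return (a - a.min()) / (a.max() - a.min())

    weights = {"solar": 0.5, "slope": 0.2, "grid": 0.3}   # user-defined priorities
    score = (weights["solar"] * normalize(solar)
             + weights["slope"] * (1 - normalize(slope))
             + weights["grid"] * (1 - normalize(grid_dist)))
    score[protected] = -np.inf           # hard exclusions never rank

    best = np.unravel_index(np.argsort(score, axis=None)[-5:], shape)
    print(list(zip(*best)))              # five top-ranked candidate cells
    ```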

  2. Learning Short Binary Codes for Large-scale Image Retrieval.

    Science.gov (United States)

    Liu, Li; Yu, Mengyang; Shao, Ling

    2017-03-01

    Large-scale visual information retrieval has become an active research area in this big data era. Recently, hashing/binary coding algorithms have proved to be effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracies. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually code lengths shorter than 100 bits) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of data, MCR can generate a one-bit binary code for each dimension and simultaneously rank the discriminative separability of each bit according to the proposed cost function. Only top-ranked bits with minimum cost-values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR can achieve comparable performance to state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.
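
    A simplified reading of the approach: threshold each feature dimension to get one candidate bit, score every bit for discriminative separability, and keep only the top-ranked bits as the short code. The sketch below uses a crude class-separability proxy in place of the paper's cost function, so it illustrates the selection pipeline rather than MCR itself.

    ```python
    # Per-dimension bit generation and ranking, simplified: one bit per feature
    # dimension, scored by a stand-in separability measure (not the MCR cost).
    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.standard_normal((1000, 256))        # features: n_samples x n_dims
    y = rng.integers(0, 10, size=1000)          # labels used only to score bits

    bits = (X > np.median(X, axis=0)).astype(np.uint8)   # one bit per dimension

    def separability(bit_col, labels):
        # Proxy cost: how far per-class mean bits deviate from the global mean.
        return np.mean([abs(bit_col[labels == c].mean() - bit_col.mean())
                        for c in np.unique(labels)])

    scores = np.array([separability(bits[:, j], y) for j in range(bits.shape[1])])
    code_length = 64
    selected = np.argsort(scores)[-code_length:]         # top-ranked dimensions
    short_codes = bits[:, selected]                      # final 64-bit codes
    print(short_codes.shape)                             # (1000, 64)
    ```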

  3. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks at a large scale, scalability and computational efficiency are desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
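
    The bilateral rank-1 update can be illustrated on a toy problem: projecting a symmetric matrix onto the positive semidefinite cone. In the sketch below the iterate is replaced by a*X + b*vv^T, with v the leading eigenvector of the descent direction; a coarse grid search stands in for BILGO's exact coefficient optimization, so this is a schematic of the update, not the published algorithm.

    ```python
    # Toy bilateral rank-1 update: X <- a*X + b*v v^T, where v is the leading
    # eigenvector of the descent direction. Problem and coefficient search are
    # simplified stand-ins (PSD projection), not the paper's solver.
    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.standard_normal((50, 50))
    A = (A + A.T) / 2                      # target symmetric matrix

    X = np.zeros_like(A)                   # PSD iterate, f(X) = 0.5*||X - A||_F^2
    for it in range(200):
        grad = X - A
        vals, vecs = np.linalg.eigh(-grad) # leading eigenvector of -grad
        v = vecs[:, -1]
        if vals[-1] <= 1e-10:              # no positive curvature direction left
            break
        R = np.outer(v, v)
        # Bilateral coefficients via a coarse grid search (exact in BILGO).
        _, a, b = min(((0.5 * np.linalg.norm(a * X + b * R - A) ** 2, a, b)
                       for a in np.linspace(0.0, 1.0, 11)
                       for b in np.linspace(0.0, 5.0, 51)), key=lambda t: t[0])
        X = a * X + b * R                  # stays PSD: scaled PSD plus rank-1 PSD

    print(np.linalg.norm(X - A))           # approaches the PSD projection of A
    ```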

  4. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
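
    The data structure itself is simple to sketch on the CPU: a map from pixel to a list of depth-sorted pathline fragments, so filtering and recoloring become image-space operations that never touch the original flow data. The Python below is a schematic stand-in for the GPU linked lists described in the thesis.

    ```python
    # CPU sketch of the per-pixel linked-list idea: each screen pixel keeps a
    # list of pathline fragments (depth, pathline id, scalar value). The GPU
    # version uses atomically appended linked lists; this is a plain dict.
    from collections import defaultdict

    framebuffer = defaultdict(list)   # (x, y) -> [(depth, pathline_id, value), ...]

    def rasterize(fragments):
        for x, y, depth, line_id, value in fragments:
            framebuffer[(x, y)].append((depth, line_id, value))

    def shade(pixel, keep=lambda frag: True, colormap=lambda v: v):
        # Composite front-to-back over the filtered, depth-sorted fragment list.
        frags = sorted((f for f in framebuffer[pixel] if keep(f)), key=lambda f: f[0])
        return [colormap(value) for depth, line_id, value in frags]

    rasterize([(10, 20, 0.3, 1, 0.9), (10, 20, 0.1, 2, 0.4)])
    # Re-filter without re-reading the dataset, e.g. show only pathline 2:
    print(shade((10, 20), keep=lambda f: f[1] == 2))
    ```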

  5. Robust regression for large-scale neuroimaging studies.

    Science.gov (United States)

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies.
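
    The basic contrast the paper exploits is easy to demonstrate: on data with a small fraction of gross outliers, an M-estimator such as Huber's barely moves while ordinary least squares is dragged off target. The sketch below uses scikit-learn on synthetic data; it illustrates the principle only and is unrelated to the cohorts analyzed in the paper.

    ```python
    # Ordinary least squares versus a robust (Huber) fit on data with a few
    # gross outliers, as a stand-in for artifact-laden imaging measurements.
    import numpy as np
    from sklearn.linear_model import HuberRegressor, LinearRegression

    rng = np.random.default_rng(11)
    n = 500
    X = rng.standard_normal((n, 1))
    y = 2.0 * X[:, 0] + 0.5 * rng.standard_normal(n)
    outliers = rng.random(n) < 0.05
    y[outliers] += 20 * rng.standard_normal(outliers.sum())   # gross artifacts

    ols = LinearRegression().fit(X, y)
    huber = HuberRegressor().fit(X, y)
    print(f"OLS slope:   {ols.coef_[0]:.3f}")     # dragged by outliers
    print(f"Huber slope: {huber.coef_[0]:.3f}")   # close to the true value 2.0
    ```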

  6. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge to traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtimes are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the construction of the similarity matrix and the affinity propagation algorithm. A memory-shared architecture is used to construct the similarity matrix, and a distributed system is taken for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate scheme of data partitioning and reduction is designed in our method in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
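
    The two-stage layout described above, parallel construction of the similarity matrix followed by affinity propagation on the precomputed similarities, can be sketched on a single machine as below; a process pool and scikit-learn stand in for the authors' shared-memory and distributed implementations.

    ```python
    # Stage 1: build the similarity matrix in parallel (process pool stands in
    # for the shared-memory architecture). Stage 2: run affinity propagation on
    # the precomputed similarities. Data and sizes are synthetic placeholders.
    import numpy as np
    from multiprocessing import Pool
    from sklearn.cluster import AffinityPropagation

    rng = np.random.default_rng(2)
    data = rng.standard_normal((300, 20))

    def similarity_row(i):
        # Negative squared Euclidean distance, the usual AP similarity.
        return -np.sum((data - data[i]) ** 2, axis=1)

    if __name__ == "__main__":
        with Pool(4) as pool:                     # parallel similarity stage
            S = np.array(pool.map(similarity_row, range(len(data))))
        ap = AffinityPropagation(affinity="precomputed", random_state=0).fit(S)
        print(len(set(ap.labels_)), "clusters")
    ```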

  7. Large Scale Multimedia Production Management: from Strategic Planning to Six Sigma

    OpenAIRE

    Amorim, Joni A.; de-Siqueira, Jose Macario; Martínez-Sáez, Antonio

    2012-01-01

    [EN] Project portfolio management of large scale multimedia production emerges today as a challenge both for the enrichment of traditional classroom teaching and for distance education. Strategic planning of projects involves developing methodologies, reference models and processes while organising project management offices (PMOs) with a view to optimising the use of available resources in an organisation. In this way, this paper presents a proposal of a project management model for d...

  8. The multilevel fast multipole algorithm (MLFMA) for solving large-scale computational electromagnetics problems

    CERN Document Server

    Ergul, Ozgur

    2014-01-01

    The Multilevel Fast Multipole Algorithm (MLFMA) for Solving Large-Scale Computational Electromagnetic Problems provides a detailed and instructional overview of implementing MLFMA. The book: presents a comprehensive treatment of the MLFMA algorithm, including basic linear algebra concepts, recent developments on the parallel computation, and a number of application examples; covers solutions of electromagnetic problems involving dielectric objects and perfectly-conducting objects; discusses applications including scattering from airborne targets, scattering from red

  9. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
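
    Broyden's method is compact enough to show in full. The sketch below implements the "good" Broyden rank-1 Jacobian update for a small nonlinear system; large-scale practice would replace the dense matrix with the limited-memory form discussed in the report.

    ```python
    # Minimal "good" Broyden iteration for F(x) = 0: the Jacobian approximation
    # B is updated from observed (dx, dF) pairs instead of being recomputed,
    # which is what lets codes without an exact Jacobian still converge.
    import numpy as np

    def broyden(F, x0, tol=1e-10, max_iter=100):
        x = np.asarray(x0, dtype=float)
        B = np.eye(len(x))                     # initial Jacobian guess
        Fx = F(x)
        for _ in range(max_iter):
            if np.linalg.norm(Fx) < tol:
                break
            dx = np.linalg.solve(B, -Fx)       # quasi-Newton step
            x_new = x + dx
            Fx_new = F(x_new)
            dF = Fx_new - Fx
            # Good Broyden rank-1 update: B <- B + (dF - B dx) dx^T / (dx^T dx)
            B += np.outer(dF - B @ dx, dx) / (dx @ dx)
            x, Fx = x_new, Fx_new
        return x

    # Example: solve x0^2 + x1^2 = 4, x0 - x1 = 1 without coding a Jacobian.
    root = broyden(lambda x: np.array([x[0]**2 + x[1]**2 - 4, x[0] - x[1] - 1]),
                   x0=[2.0, 0.5])
    print(root)
    ```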

  10. Part-Based Deep Hashing for Large-Scale Person Re-Identification.

    Science.gov (United States)

    Zhu, Fuqing; Kong, Xiangwei; Zheng, Liang; Fu, Haiyan; Tian, Qi

    2017-10-01

    Large-scale is a trend in person re-identification (re-id). It is important that real-time search be performed in a large gallery. While previous methods mostly focus on discriminative learning, this paper attempts to integrate deep learning and hashing into one framework to evaluate the efficiency and accuracy for large-scale person re-id. We integrate spatial information for discriminative visual representation by partitioning the pedestrian image into horizontal parts. Specifically, Part-based Deep Hashing (PDH) is proposed, in which batches of triplet samples are employed as the input of the deep hashing architecture. Each triplet sample contains two pedestrian images (or parts) with the same identity and one pedestrian image (or part) of a different identity. A triplet loss function is employed with a constraint that the Hamming distance of pedestrian images (or parts) with the same identity is smaller than that of those with different identities. In the experiment, we show that the proposed PDH method yields very competitive re-id accuracy on the large-scale Market-1501 and Market-1501+500K datasets.
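
    The part-based triplet idea can be sketched in a few lines of PyTorch: split each image into horizontal parts, embed each part, and apply a triplet margin loss per part. The toy encoder and tensor shapes below are placeholders, not the paper's network; binarization by sign at the end stands in for the hashing layer.

    ```python
    # Part-based triplet training in miniature: per-part embeddings pulled
    # together for same identities, pushed apart otherwise. Toy encoder only.
    import torch
    import torch.nn as nn

    P = 4                                   # number of horizontal parts
    embed = nn.Sequential(nn.Flatten(), nn.LazyLinear(64))   # toy part encoder
    triplet = nn.TripletMarginLoss(margin=1.0)

    def part_embeddings(images):            # images: (B, C, H, W)
        parts = torch.chunk(images, P, dim=2)      # split along image height
        return [embed(p) for p in parts]           # P tensors of shape (B, 64)

    anchor = torch.randn(8, 3, 128, 64)     # same identity as positive
    positive = torch.randn(8, 3, 128, 64)
    negative = torch.randn(8, 3, 128, 64)   # different identity

    loss = sum(triplet(a, p, n) for a, p, n in zip(part_embeddings(anchor),
                                                   part_embeddings(positive),
                                                   part_embeddings(negative)))
    loss.backward()
    codes = [e.sign() for e in part_embeddings(anchor)]   # binarize for retrieval
    print(loss.item(), codes[0].shape)
    ```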

  11. Research and development of safeguards measures for the large scale reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Masahiro; Sato, Yuji; Yokota, Yasuhiro; Masuda, Shoichiro; Kobayashi, Isao; Uchikoshi, Seiji; Tsutaki, Yasuhiro; Nidaira, Kazuo [Nuclear Material Control Center, Tokyo (Japan)

    1994-12-31

    The Government of Japan agreed on the safeguards concepts for a commercial size reprocessing plant under the bilateral agreement for cooperation between Japan and the United States. In addition, the LASCAR forum on large scale reprocessing plant safeguards obtained fruitful results in the spring of 1992. The research and development of safeguards measures for the Rokkasho Reprocessing Plant should proceed with full regard to the concepts described in both documents. Basically, the material accountancy and monitoring system should be established based on NRTA and other measures in order to meet the timeliness goal for plutonium, together with an unattended-mode inspection approach based on an integrated containment/surveillance system coupled with radiation monitoring in order to reduce the inspection effort. NMCC has been studying the following measures for large scale reprocessing plant safeguards: (1) a radiation gate monitor and integrated surveillance system; (2) near real time shipper and receiver difference monitoring; (3) a near real time material accountancy system operated for the bulk handling area; (4) a volume measurement technique for a large scale input accountancy vessel; (5) an in-process inventory estimation technique applied to process equipment such as the pulse column and evaporator; (6) a solution transfer monitoring approach applied to buffer tanks in the chemical process; and (7) a timely analysis technique such as a hybrid K-edge densitometer operated in the on-site laboratory. (J.P.N.)
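
    A minimal illustration of what near real time accountancy computes: the per-period material balance (MUF, material unaccounted for) from inventories, receipts and shipments, and a trend test on its cumulative sum. All numbers below are invented and the alarm test is deliberately naive; a real NRTA system propagates full measurement uncertainties.

    ```python
    # Toy near-real-time accountancy: per-period material balance (MUF) and a
    # naive 3-sigma trend test on its cumulative sum. Invented numbers only.
    import numpy as np

    rng = np.random.default_rng(8)
    periods = 30
    receipts = np.full(periods, 10.0)        # kg received per balance period
    shipments = np.full(periods, 9.8)        # kg shipped per balance period
    true_inventory = 100.0 + np.cumsum(receipts - shipments)
    measured_inv = true_inventory + 0.3 * rng.standard_normal(periods)  # meas. error

    # MUF_t = beginning inventory + receipts - shipments - ending inventory
    inv_prev = np.concatenate([[100.0], measured_inv[:-1]])
    muf = inv_prev + receipts - shipments - measured_inv
    cumuf = np.cumsum(muf)

    sigma = 0.3 * np.sqrt(2)                 # rough per-period MUF sigma
    for t, c in enumerate(cumuf):
        if abs(c) > 3 * sigma * np.sqrt(t + 1):   # naive 3-sigma trend test
            print(f"period {t}: cumulative MUF {c:.2f} kg exceeds alarm limit")
    print(f"final cumulative MUF: {cumuf[-1]:.2f} kg")
    ```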

  12. Computation of Large-Scale Structure Jet Noise Sources With Weak Nonlinear Effects Using Linear Euler

    Science.gov (United States)

    Dahl, Milo D.; Hixon, Ray; Mankbadi, Reda R.

    2003-01-01

    An approximate technique is presented for the prediction of the large-scale turbulent structure sound source in a supersonic jet. A linearized Euler equations code is used to solve for the flow disturbances within and near a jet with a given mean flow. Assuming a normal mode composition for the wave-like disturbances, the linear radial profiles are used in an integration of the Navier-Stokes equations. This results in a set of ordinary differential equations representing the weakly nonlinear self-interactions of the modes along with their interaction with the mean flow. Solutions are then used to correct the amplitude of the disturbances that represent the source of large-scale turbulent structure sound in the jet.

  13. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    Science.gov (United States)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  14. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  15. PathlinesExplorer — Image-based exploration of large-scale pathline fields

    KAUST Repository

    Nagoor, Omniah H.

    2015-10-25

    PathlinesExplorer is a novel image-based tool, which has been designed to visualize large scale pathline fields on a single computer [7]. PathlinesExplorer integrates the explorable images (EI) technique [4] with the order-independent transparency (OIT) method [2]. What makes this method different is that it allows users to handle large data on a single workstation. Although it is a view-dependent method, PathlinesExplorer combines both exploration and modification of visual aspects without re-accessing the original huge data. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method, it is possible to filter, color-code, and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improve the performance and scalability of our approach.

  16. A European collaboration research programme to study and test large scale base isolated structures

    International Nuclear Information System (INIS)

    Renda, V.; Verzeletti, G.; Papa, L.

    1995-01-01

    The improvement of the technology of innovative anti-seismic mechanisms, such as those for base isolation and energy dissipation, requires testing capabilities for large scale models of structures integrated with these mechanisms. These kinds of experimental tests are of primary importance for the validation of design rules and the setting up of an advanced earthquake engineering for civil constructions of relevant interest. The Joint Research Centre of the European Commission offers the European Laboratory for Structural Assessment, located at Ispra, Italy, as a focal point for an international European collaboration research programme to test large scale models of structures making use of innovative anti-seismic mechanisms. A collaboration contract, open to other future contributions, has been signed with the Italian national working group on seismic isolation (Gruppo di Lavoro sull'Isolamento Sismico, GLIS), which includes the national research centre ENEA, the national electricity board ENEL, the industrial research centre ISMES and the isolator producer ALGA. (author). 3 figs

  17. Method of multi-sensor data Association based on large scale

    Directory of Open Access Journals (Sweden)

    Xu-hui Lan

    2017-01-01

    Full Text Available In multi-sensor fusion systems, large differences in the accuracy of the detection information result in high uncertainty when correlating heterogeneous information. This paper proposes a multi-sensor data correlation method based on large scale, which combines evidence theory and multi-factor fuzzy integrated decision theory for information correlation. To address the uncertainty of information correlation and the difficulty of obtaining evidence, the method first combines the uncertain information evidence, and then obtains evidence from the multi-factor fuzzy integrated decision membership degree function. Test results show that correlating heterogeneous information with this method can overcome the uncertainty of evidence combination and reduce the error and missed-correlation rates.
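
    The evidence-combination step the method builds on is Dempster's rule. The sketch below implements the standard rule for two mass functions over a small frame of discernment; the sensor masses are invented for illustration.

    ```python
    # Dempster's rule of combination for two mass functions over a frame of
    # discernment. Focal elements are frozensets; the masses are illustrative.
    from itertools import product

    def dempster_combine(m1, m2):
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb            # mass assigned to the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict: sources cannot be combined")
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    # Two sensors reporting on whether tracks T1/T2 refer to the same object.
    m_radar = {frozenset({"same"}): 0.7, frozenset({"same", "diff"}): 0.3}
    m_ir = {frozenset({"same"}): 0.6, frozenset({"diff"}): 0.1,
            frozenset({"same", "diff"}): 0.3}
    print(dempster_combine(m_radar, m_ir))
    ```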

  18. Solving large scale structure in ten easy steps with COLA

    International Nuclear Information System (INIS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-01-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed
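
    The decomposition behind the method, as the abstract describes it, can be written compactly; the notation below is schematic (the paper works in its own time variable with the appropriate cosmological factors), but it shows why the integrator only has to resolve the residual displacement.

    ```latex
    % Schematic COLA split: the trajectory is decomposed as x = x_LPT + \delta x,
    % so the equation of motion d^2 x / dt^2 = -\nabla\Phi becomes an equation
    % for the residual alone:
    \[
      x(t) = x_{\mathrm{LPT}}(t) + \delta x(t), \qquad
      \frac{d^{2}\,\delta x}{dt^{2}} = -\nabla\Phi - \frac{d^{2} x_{\mathrm{LPT}}}{dt^{2}},
    \]
    % where the last term is supplied analytically by the LPT growth factors,
    % leaving the N-body solver responsible only for the small-scale residual.
    ```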

  19. Large scale obscuration and related climate effects open literature bibliography

    International Nuclear Information System (INIS)

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

    Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects this modified interest.

  20. Large-scale biophysical evaluation of protein PEGylation effects

    DEFF Research Database (Denmark)

    Vernet, Erik; Popa, Gina; Pozdnyakova, Irina

    2016-01-01

    PEGylation is the most widely used method to chemically modify protein biopharmaceuticals, but surprisingly limited public data is available on the biophysical effects of protein PEGylation. Here we report the first large-scale study, with site-specific mono-PEGylation of 15 different proteins...... and characterization of 61 entities in total using a common set of analytical methods. Predictions of molecular size were typically accurate in comparison with actual size determined by size-exclusion chromatography (SEC) or dynamic light scattering (DLS). In contrast, there was no universal trend regarding the effect...