WorldWideScience

Sample records for research program large-scale

  1. Possible research program on a large scale nuclear pressure vessel

    International Nuclear Information System (INIS)

    1983-01-01

    The structural integrity of nuclear pressure vessels is currently one of the main issues in the field of nuclear plant safety. An international study group investigated the feasibility of a ''possible research program'' on a full-scale (1:1) LWR pressure vessel. This report presents the study group's work. The research programs completed or under way in various countries of the European Community are presented (phase 1 of the study). The main characteristics of the vessels considered for the program, and an evaluation of the activities required to make them available, are listed. Research topic priorities from the interested countries are summarized in tables (phase 2), together with a critical review of the topics by the study group. Proposals for possible experimental programs, and combinations of these programs, are then presented, solely as examples of potentially useful research activities. The documents pertaining to the results of the phase 1 inquiry performed by the study group are reported in the appendix.

  2. Large scale computing in the Energy Research Programs

    International Nuclear Information System (INIS)

    1991-05-01

    The Energy Research Supercomputer Users Group (ERSUG) comprises all investigators using resources of the Department of Energy Office of Energy Research supercomputers. At the December 1989 meeting held at Florida State University (FSU), the ERSUG executive committee determined that the continuing rapid advances in computational science and computer technology demanded a reassessment of the role computational science should play in meeting DOE's commitments. Initial studies were to be performed for four subdivisions: (1) Basic Energy Sciences (BES) and Applied Mathematical Sciences (AMS), (2) Fusion Energy, (3) High Energy and Nuclear Physics, and (4) Health and Environmental Research. The first two subgroups produced formal subreports that provided a basis for several sections of this report. Additional information provided in the AMS/BES subreport is included as Appendix C in an abridged form that eliminates most duplication. Additionally, each member of the executive committee was asked to contribute area-specific assessments, which are included in the next section. In the following sections, brief assessments are given for specific areas, a conceptual model is proposed in which the entire computational effort for energy research is best viewed as one giant nationwide computer, and specific recommendations are made for the appropriate evolution of the system.

  3. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological subsystems: while large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy, and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and the rational background that are characteristic of their sociological environment. Large-scale research can neither produce innovative goods with regard to profitability, nor can it hope for full recognition by the basic-research-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it be judged successfully by the critical standards of established social science, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of the institutionalization of science: 1) external control, 2) the form of organization, and 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de]

  4. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large-scale research projects (LSRPs) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for recruiting the brightest minds has increased, effective internationalization measures have become a hot topic for universities and LSRPs alike. Nevertheless, most projects and universities have little experience in conducting these measures and in making internationalization a cost-efficient and useful activity. Furthermore, such undertakings must continually be justified to the project PIs as important, valuable tools for improving the capacity of the project and the research location. A variety of measures are suited to supporting universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for researcher mobility, and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced effectively. On this poster we display a number of internationalization measures for various target groups and identify interfaces at which project management, university administration, researchers, and international partners can work together, exchange information, and improve processes in order to recruit, support, and retain the brightest minds for a project.

  5. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
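
    The thesis above describes a purpose-built reduced-Hessian SQP code built on MINOS. As a minimal illustration of the problem class it targets, the sketch below solves a small nonlinear program with SciPy's SLSQP, a general-purpose SQP-type solver; the toy objective, constraints, and tolerances are our own assumptions, not taken from the thesis.

```python
import numpy as np
from scipy.optimize import minimize

# Minimize a nonlinear objective subject to one nonlinear equality
# and one inequality constraint -- the general NLP form addressed
# by SQP methods.
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] ** 2 + x[1] ** 2 - 4.0},  # circle of radius 2
    {"type": "ineq", "fun": lambda x: x[1] - x[0]},                  # require x1 >= x0
]

res = minimize(objective, x0=[0.5, 1.0], method="SLSQP", constraints=constraints)
```

    Here the optimum is the point on the radius-2 circle closest to (1, 2.5); the inequality is inactive at the solution, which SLSQP detects through its active-set QP subproblems.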

  6. Large-scale linear programs in planning and prediction.

    Science.gov (United States)

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...
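
    The abstract is truncated, but the modeling idea it names -- handling uncertainty via chance constraints -- is often approximated by enforcing the constraint on sampled scenarios, which reduces the problem to an ordinary (larger) LP. A minimal sketch under our own toy data, not from the report:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Chance constraint P(a^T x <= 1) >= 1 - eps, approximated by enforcing
# the constraint for N sampled coefficient vectors a ~ N([1, 1], 0.05^2 I).
N = 200
A = rng.normal(loc=[1.0, 1.0], scale=0.05, size=(N, 2))
b = np.ones(N)

# Maximize x1 + x2 over the sampled constraints (linprog minimizes, so negate).
res = linprog(c=[-1.0, -1.0], A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
x = res.x
```

    Because some sampled coefficient vectors exceed the nominal [1, 1], the scenario solution is strictly more conservative than the nominal one (x1 + x2 < 1).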

  7. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  8. How Large-Scale Research Facilities Connect to Global Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn

    2013-01-01

    Policies for large-scale research facilities (LSRFs) often highlight their spillovers to industrial innovation and their contribution to the external connectivity of the regional innovation system hosting them. Arguably, the particular institutional features of LSRFs are conducive to collaborative research. However, based on data on publications produced in 2006–2009 at the Neutron Science Directorate of Oak Ridge National Laboratory in Tennessee (United States), we find that internationalization of its collaborative research is restrained by coordination costs similar to those characterizing other...

  9. Large-Scale Seismic Test Program at Hualien, Taiwan

    International Nuclear Information System (INIS)

    Tang, H.T.; Graves, H.L.; Chen, P.C.

    1992-01-01

    The Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, is a follow-on to the soil-structure interaction (SSI) experiments at Lotung, Taiwan. The planned SSI studies will be performed at a stiff soil site in Hualien that historically has experienced somewhat more destructive earthquakes than Lotung. The LSST is a joint effort among many interested parties. The Electric Power Research Institute (EPRI) and Taipower are the organizers of the program and have the lead in planning and managing it. Other organizations participating in the LSST program are the US Nuclear Regulatory Commission, the Central Research Institute of Electric Power Industry, the Tokyo Electric Power Company, the Commissariat A L'Energie Atomique, Electricite de France, and Framatome. The LSST was initiated in January 1990 and is envisioned to be five years in duration. Based on the assumption of stiff soil, confirmed by soil boring and geophysical results, the test model was designed to provide data needed for SSI studies covering free-field input, nonlinear soil response, non-rigid-body SSI, torsional response, kinematic interaction, spatial incoherency, and other effects. Taipower had the lead in the design of the test model and received significant input from other LSST members. Questions raised by LSST members concerned embedment effects, model stiffness, base shear, and openings for equipment. This paper describes progress in site preparation, in design and construction of the model, and in development of an instrumentation plan.

  10. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the FY2003 collaboration activities on Large Scale Computer Simulation Research was held January 15-16, 2004, at the Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies, and other related topics were presented. (author)

  11. New Visions for Large Scale Networks: Research and Applications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This paper documents the findings of the March 12-14, 2001 Workshop on New Visions for Large-Scale Networks: Research and Applications. The workshop's objectives were...

  12. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks at large scale, scalability and computational efficiency are desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy for solving general semidefinite programs on large-scale datasets. Compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, and thus it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low-rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
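
    To make the bilateral rank-1 update concrete, here is a small self-contained sketch in the same spirit (not the authors' code): it minimizes a smooth convex function over the spectrahedron {X PSD, tr X = 1} by combining the current iterate with the rank-1 matrix built from the leading eigenvector of the negative gradient, choosing the combination coefficient by exact line search. The toy objective, dimensions, and iteration count are our own assumptions.

```python
import numpy as np

def bilateral_sdp(A, iters=500):
    """Minimize ||X - A||_F^2 over {X PSD, tr X = 1} using bilateral
    rank-1 updates X <- (1 - b) X + b vv^T."""
    n = A.shape[0]
    X = np.eye(n) / n                        # feasible starting point
    for _ in range(iters):
        G = 2.0 * (X - A)                    # gradient of the objective
        # Leading eigenvector of -G, i.e. eigenvector of the smallest
        # eigenvalue of G: the best feasible rank-1 atom.
        _, V = np.linalg.eigh(G)
        v = V[:, 0]
        D = np.outer(v, v) - X               # bilateral search direction
        denom = np.sum(D * D)
        if denom < 1e-14:
            break
        # Exact line search: f(X + b D) is quadratic in b.
        b = np.clip(-np.sum(G * D) / (2.0 * denom), 0.0, 1.0)
        X = X + b * D                        # stays PSD with unit trace
    return X

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4))
A = (M + M.T) / 2.0                          # random symmetric target
X = bilateral_sdp(A)
```

    Each update is a convex combination of two PSD, unit-trace matrices, so feasibility is preserved for free; this is the structural trick that lets such methods avoid full eigendecompositions of the iterate.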


  14. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    Science.gov (United States)

    1985-10-07

    (Scanned DTIC report; the OCR text of the abstract is not recoverable. The document is a Massachusetts Institute of Technology Artificial Intelligence Laboratory report, 545 Technology Square, Cambridge, by G. Agha et al.)

  15. Programs on large scale applications of superconductivity in Japan

    International Nuclear Information System (INIS)

    Yasukochi, K.; Ogasawara, T.

    1974-01-01

    The history of large-scale applications of superconductivity in Japan is reported. Experimental work on superconducting magnet systems for high energy physics has just begun. The programs are described in five categories: 1) MHD power generation systems, 2) superconducting rotating machines, 3) cryogenic power transmission systems, 4) magnetically levitated transportation, and 5) applications to high energy physics experiments. The development of a big superconducting magnet for a 1,000 kW class generator was set up as a target of the first seven-year plan, which came to an end in 1972; the work continues for three years from 1973 with a budget of 900 million yen. In the second-phase plan, a prototype MHD generator is under discussion. Fuji Electric Co. is contemplating a plan to develop a synchronous generator with an inner rotating field. The total budget for the future plans of the superconducting power transmission system amounts to 20 billion yen for a first period of approximately 8 to 9 years. Several points are characteristic of JNR's research and development efforts: 1) linear motor drive with the active side on the ground, 2) a loop track, and 3) combined test runs of maglev and LSM. A field test at a speed of 500 km/hr on a 7 km track is scheduled for 1975, with operation targeted for 1985. A 12 GeV proton synchrotron is now under construction for studies in high energy physics, and a three-ring intersecting storage accelerator is discussed as a future plan. (Iwakiri, K.)

  16. Bonus algorithm for large scale stochastic nonlinear programming problems

    CERN Document Server

    Diwekar, Urmila

    2015-01-01

    This book presents the details of the BONUS algorithm and its real-world applications in areas such as sensor placement in large-scale drinking water networks, sensor placement in advanced power systems, water management in power systems, and capacity expansion of energy systems. A generalized method for stochastic nonlinear programming, based on a sampling approach to uncertainty analysis and statistical reweighting to obtain probability information, is demonstrated in this book. Stochastic optimization problems are difficult to solve because they couple an optimization loop with an uncertainty loop. There are two fundamental approaches to solving such problems: the first uses decomposition techniques, and the second identifies problem-specific structures and transforms the problem into a deterministic nonlinear programming problem. These techniques have significant limitations on either the type of objective function or the underlying distributions of the uncertain variables. Moreover, these ...
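
    The reweighting idea named above -- reusing one base sample to estimate expectations under changed input distributions, instead of re-running the model -- can be sketched generically as self-normalized importance reweighting. The toy model and distributions below are our own illustration, not the BONUS code itself:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Base sample drawn once from the base uncertainty distribution N(0, 1).
x = rng.normal(0.0, 1.0, size=200_000)
f = x ** 2                        # "expensive" model output, evaluated once

def reweighted_mean(mu, sigma):
    """Estimate E[f(X)] for X ~ N(mu, sigma) by reweighting the base
    sample with density ratios instead of re-sampling the model."""
    w = norm.pdf(x, mu, sigma) / norm.pdf(x, 0.0, 1.0)
    w /= w.sum()                  # self-normalized importance weights
    return float(np.sum(w * f))

est = reweighted_mean(0.5, 0.8)   # true value is 0.5**2 + 0.8**2 = 0.89
```

    The estimate is cheap to recompute for any candidate decision that shifts the input distribution, which is exactly why reweighting helps inside a stochastic-optimization loop.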

  17. Planning under uncertainty solving large-scale stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G. (Stanford Univ., CA (United States). Dept. of Operations Research; Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft)

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming but until recently seemed intractable due to their size. Recent advances in both solution algorithms and computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results for large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multicomputer, and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
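
    As a minimal illustration of the two-stage recourse model discussed above, the "extensive form" of a tiny stochastic LP can be solved directly with an LP solver; the report's methodology adds decomposition and importance sampling on top of exactly this formulation. The instance below is our own toy example:

```python
from scipy.optimize import linprog

# Two-stage stochastic LP: choose capacity x now (unit cost 1); after
# demand d_s is revealed, cover any shortfall y_s at unit cost 4.
#   minimize  x + sum_s p_s * 4 * y_s
#   s.t.      x + y_s >= d_s,   x >= 0,  y_s >= 0
demands = [1.0, 2.0, 3.0]          # equally likely demand scenarios
p = 1.0 / len(demands)

# Extensive form over variables [x, y_1, y_2, y_3]; linprog uses A_ub @ z <= b_ub.
c = [1.0] + [4.0 * p] * 3
A_ub = [[-1.0, -1.0,  0.0,  0.0],
        [-1.0,  0.0, -1.0,  0.0],
        [-1.0,  0.0,  0.0, -1.0]]
b_ub = [-d for d in demands]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
```

    Since the expected shortfall penalty (4/3 per unit for the highest scenario) exceeds the capacity cost, hedging fully at x = 3 is optimal here; a deterministic model built on the mean demand of 2 would understate the right capacity.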

  18. Large scale indenter test program to measure sub gouge displacements

    Energy Technology Data Exchange (ETDEWEB)

    Been, Ken; Lopez, Juan [Golder Associates Inc, Houston, TX (United States); Sancio, Rodolfo [MMI Engineering Inc., Houston, TX (United States)

    2011-07-01

    Installing submarine pipelines in an ice-covered offshore environment is very challenging. Precautions must be taken, such as burying the pipelines to protect them from ice movement caused by gouging. The estimation of subgouge displacements is a key factor in pipeline design for ice-gouged environments. This paper investigates a method to measure subgouge displacements. An experimental program was implemented in an open field to produce large-scale idealized gouges in engineered soil beds (sand and clay). The horizontal force required to produce the gouge, the subgouge displacements in the soil, and the strain imposed by these displacements on a buried model pipeline were monitored. The results showed that, for a given keel, the gouge depth was inversely proportional to undrained shear strength in clay. The measured subgouge displacements did not show a relationship with gouge depth, width, or soil density in the sand and clay tests.

  19. Large-Scale Seismic Test Program at Hualien, Taiwan

    International Nuclear Information System (INIS)

    Tang, H.T.; Graves, H.L.; Yeh, Y.S.

    1991-01-01

    The Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, is a follow-on to the soil-structure interaction (SSI) experiments at Lotung, Taiwan. The planned SSI studies will be performed at a stiff soil site in Hualien that historically has experienced somewhat more destructive earthquakes than Lotung. The objectives of the LSST project are as follows: to obtain earthquake-induced SSI data at a stiff soil site having soil conditions similar to those of a prototypical nuclear power plant; to confirm the findings and methodologies validated against the Lotung soft-soil SSI data for prototypical plant condition applications; to further validate the technical basis of realistic SSI analysis approaches; and to further support the resolution of the USI A-40 Seismic Design Criteria issue. These objectives will be accomplished through an integrated and carefully planned experimental program consisting of soil characterization, test model design and field construction, instrumentation layout and deployment, in-situ geophysical information collection, forced vibration tests, and synthesis of results and findings. The LSST is a joint effort among many interested parties. EPRI and Taipower are the organizers of the program and have the lead in planning and managing it.

  20. Distributed system for large-scale remote research

    International Nuclear Information System (INIS)

    Ueshima, Yutaka

    2002-01-01

    In advanced photon research, large-scale simulations and high-resolution observations are powerful tools. In both numerical and real experiments, real-time visualization and steering is regarded as a promising method of data analysis; this approach works well for one-off analyses or low-cost experiments and simulations. In research on an unknown problem, however, the output data must be analyzed many times, because a conclusive analysis is rarely reached in a single pass. Consequently, output data should be filed so that they can be referenced and analyzed at any time. To support such research, automatic functions are needed for transporting data files from the data generator to data storage, analyzing data, tracking the history of data handling, and so on. The supporting system will be a functionally distributed system. (author)

  1. Research on large-scale wind farm modeling

    Science.gov (United States)

    Ma, Longfei; Zhang, Baoqun; Gong, Cheng; Jiao, Ran; Shi, Rui; Chi, Zhongjun; Ding, Yifeng

    2017-01-01

    Because of the intermittent and fluctuating nature of wind energy, a large-scale wind farm connected to the grid affects the power system quite differently from a traditional power plant. It is therefore necessary to establish an effective wind farm model to simulate and analyze both the influence of wind farms on the grid and the transient characteristics of the wind turbines when the grid is at fault; this in turn requires an effective wind turbine generator (WTG) model. As the doubly-fed VSCF wind turbine is currently the mainstream type, this article first surveys the research progress on doubly-fed VSCF wind turbines and then describes the detailed process of building the model. It then reviews common wind farm modeling methods and points out the problems encountered. Since WAMS is widely used in power systems, online parameter identification of the wind farm model based on the output characteristics of the wind farm becomes possible; the article focuses on this new idea of identification-based modeling of large wind farms, which can be realized by two concrete methods.

  2. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  3. ECOLOGICAL RESEARCH IN THE LARGE-SCALE BIOSPHERE–ATMOSPHERE EXPERIMENT IN AMAZONIA: EARLY RESULTS.

    Science.gov (United States)

    M. Keller; A. Alencar; G. P. Asner; B. Braswell; M. Bustamante; E. Davidson; T. Feldpausch; E. Fernandes; M. Goulden; P. Kabat; B. Kruijt; F. Luizao; S. Miller; D. Markewitz; A. D. Nobre; C. A. Nobre; N. Priante Filho; H. Rocha; P. Silva Dias; C. von Randow; G. L. Vourlitis

    2004-01-01

    The Large-scale Biosphere–Atmosphere Experiment in Amazonia (LBA) is a multinational, interdisciplinary research program led by Brazil. Ecological studies in LBA focus on how tropical forest conversion, regrowth, and selective logging influence carbon storage, nutrient dynamics, trace gas fluxes, and the prospect for sustainable land use in the Amazon region. Early...

  4. Ecological research in the large-scale biosphere-atmosphere experiment in Amazonia: early results

    NARCIS (Netherlands)

    Keller, M.; Alencar, A.; Asner, G.P.; Braswell, B.; Bustamante, M.; Davidson, E.; Feldpausch, T.; Fernandes, E.; Goulden, M.; Kabat, P.; Kruijt, B.; Luizão, F.; Miller, S.; Markewitz, D.; Nobre, A.D.; Nobre, C.A.; Priante Filho, N.; Rocha, da H.; Silva Dias, P.; Randow, von C.; Vourlitis, G.L.

    2004-01-01

    The Large-scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multinational, interdisciplinary research program led by Brazil. Ecological studies in LBA focus on how tropical forest conversion, regrowth, and selective logging influence carbon storage, nutrient dynamics, trace gas fluxes,

  5. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  6. Interior Point Methods for Large-Scale Nonlinear Programming

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2005-01-01

    Roč. 20, č. 4-5 (2005), s. 569-582 ISSN 1055-6788 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords : nonlinear programming * interior point methods * KKT systems * indefinite preconditioners * filter methods * algorithms Subject RIV: BA - General Mathematics Impact factor: 0.477, year: 2005

  7. Evolution of an invasive species research program and implications for large-scale management of a non-native, invasive plant pathogen

    Science.gov (United States)

    Christopher A. Lee; Janice M. Alexander; Susan J. Frankel; Yana Valachovic

    2012-01-01

    We conducted a research needs assessment (RNA) in 2010 to gather opinions of "experts" and a larger public on research priorities for Phytophthora ramorum, the pathogen that causes sudden oak death in forest trees and Ramorum blight in ornamental plants. We place these 2010 findings in context with findings of similar P. ramorum...

  8. European research school on large scale solar thermal – SHINE

    DEFF Research Database (Denmark)

    Bales, Chris; Forteza, Pau Joan Cortés; Furbo, Simon

    2014-01-01

    The Solar Heat Integration NEtwork (SHINE) is a European research school in which 13 PhD students in solar thermal technologies are funded by the EU Marie Curie program. It has five PhD course modules as well as workshops and seminars dedicated to PhD students both within the project and outside of it. The SHINE research activities focus on large solar heating systems and new applications: district heating, industrial processes, and new storage systems. The scope of this paper is systems for district heating, for which there are five PhD students, three at universities and two...

  9. Parallelizing Gene Expression Programming Algorithm in Enabling Large-Scale Classification

    Directory of Open Access Journals (Sweden)

    Lixiong Xu

    2017-01-01

Full Text Available As one of the most effective function mining algorithms, the Gene Expression Programming (GEP) algorithm has been widely used in classification, pattern recognition, prediction, and other research fields. Based on self-evolution, GEP is able to mine an optimal function for dealing with further complicated tasks. However, in big data research, GEP encounters a low-efficiency issue due to its long mining process. To improve the efficiency of GEP in big data research, especially for processing large-scale classification tasks, this paper presents a parallelized GEP algorithm using the MapReduce computing model. The experimental results show that the presented algorithm is scalable and efficient for processing large-scale classification tasks.
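The abstract does not reproduce the MapReduce details; a minimal sketch of the map/reduce split it describes (partition the training data, evaluate each candidate program's error per chunk in the map phase, sum the partial errors in the reduce phase) might look like this, with a thread pool standing in for a Hadoop cluster and hypothetical candidate expressions in place of decoded gene strings:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical candidate "programs" (real GEP decodes these from gene strings).
candidates = [
    lambda x: x * x,
    lambda x: 2 * x + 1,
    lambda x: x + 3,
]

# Training data for the target function f(x) = x^2.
data = [(x, x * x) for x in range(-10, 11)]

def chunked(seq, n):
    """Partition the data set, one chunk per map task."""
    k = max(1, len(seq) // n)
    return [seq[i:i + k] for i in range(0, len(seq), k)]

def map_task(prog, chunk):
    """Map phase: squared error of one program on one chunk."""
    return sum((prog(x) - y) ** 2 for x, y in chunk)

def fitness(prog, chunks, pool):
    """Reduce phase: sum the partial errors from all chunks."""
    return sum(pool.map(lambda c: map_task(prog, c), chunks))

with ThreadPoolExecutor(max_workers=4) as pool:
    chunks = chunked(data, 4)
    errors = [fitness(p, chunks, pool) for p in candidates]

best = candidates[errors.index(min(errors))]   # the x*x candidate fits exactly
```

Because the per-chunk errors are independent, the map tasks parallelize cleanly; the reduce step is a cheap sum, which is the property the paper exploits at cluster scale.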

  10. Research on the Construction Management and Sustainable Development of Large-Scale Scientific Facilities in China

    Science.gov (United States)

    Guiquan, Xi; Lin, Cong; Xuehui, Jin

    2018-05-01

As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee for economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology and national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.

  11. New Approaches for Very Large-Scale Integer Programming

    Science.gov (United States)

    2016-06-24

The focus of this project is new computational... heuristics for integer programs in order to rapidly improve dual bounds. 2. Choosing good branching variables in branch-and-bound algorithms for MIP. 3... programming, algorithms, parallel processing, machine learning, heuristics

  12. Mathematical programming methods for large-scale topology optimization problems

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana

for mechanical problems, but has rapidly extended to many other disciplines, such as fluid dynamics and biomechanical problems. However, the novelty and improvement of optimization methods have been very limited. It is, indeed, necessary to develop new optimization methods to improve the final designs......, and at the same time, reduce the number of function evaluations. Nonlinear optimization methods, such as sequential quadratic programming and interior point solvers, have scarcely been embraced by the topology optimization community. Thus, this work is focused on the introduction of this kind of second...... for the classical minimum compliance problem. Two of the state-of-the-art optimization algorithms are investigated and implemented for this structural topology optimization problem. A sequential quadratic programming method (TopSQP) and an interior point method (TopIP) are developed, exploiting the specific mathematical...

  13. Research and management issues in large-scale fire modeling

    Science.gov (United States)

    David L. Peterson; Daniel L. Schmoldt

    2000-01-01

    In 1996, a team of North American fire scientists and resource managers convened to assess the effects of fire disturbance on ecosystems and to develop scientific recommendations for future fire research and management activities. These recommendations - elicited with the Analytic Hierarchy Process - include numerically ranked scientific and managerial questions and...

  14. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  15. The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.

    Science.gov (United States)

    Pang, Haotian; Liu, Han; Vanderbei, Robert

    2014-02-01

    We develop an R package fastclime for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L 1 Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.

  16. Research status and needs for shear tests on large-scale reinforced concrete containment elements

    International Nuclear Information System (INIS)

    Oesterle, R.G.; Russell, H.G.

    1982-01-01

    Reinforced concrete containments at nuclear power plants are designed to resist forces caused by internal pressure, gravity, and severe earthquakes. The size, shape, and possible stress states in containments produce unique problems for design and construction. A lack of experimental data on the capacity of reinforced concrete to transfer shear stresses while subjected to biaxial tension has led to cumbersome if not impractical design criteria. Research programs recently conducted at the Construction Technology Laboratories and at Cornell University indicate that design criteria for tangential, peripheral, and radial shear are conservative. This paper discusses results from recent research and presents tentative changes for shear design provisions of the current United States code for containment structures. Areas where information is still lacking to fully verify new design provisions are discussed. Needs for further experimental research on large-scale specimens to develop economical, practical, and reliable design criteria for resisting shear forces in containment are identified. (orig.)

  17. PANDA: a Large Scale Multi-Purpose Test Facility for LWR Safety Research

    Energy Technology Data Exchange (ETDEWEB)

    Dreier, Joerg; Paladino, Domenico; Huggenberger, Max; Andreani, Michele [Laboratory for Thermal-Hydraulics, Nuclear Energy and Safety Research Department, Paul Scherrer Institut (PSI), CH-5232 Villigen PSI (Switzerland); Yadigaroglu, George [ETH Zuerich, Technoparkstrasse 1, Einstein 22- CH-8005 Zuerich (Switzerland)

    2008-07-01

PANDA is a large-scale, multi-purpose thermal-hydraulics test facility built and operated by PSI. Due to its modular structure, PANDA provides flexibility for a variety of applications, ranging from integral containment system investigations, primary system tests and component experiments to large-scale separate-effects tests. For many applications the experimental results are used directly, for example for concept demonstrations or for the characterisation of phenomena or components, but all the experimental data generated in the various test campaigns is unique and was and/or will be widely used for the validation and improvement of a variety of computer codes for reactor safety analysis, including codes with 3D capabilities. The paper provides an overview of the completed and on-going research programs performed in the PANDA facility in the different areas of application, including the main results and conclusions of the investigations. In particular, the advanced passive containment cooling system concept investigations for the SBWR, ESBWR and SWR1000 are presented in relation to various aspects, and the main findings are summarised. Finally, the goals, planned investigations and expected results of the on-going OECD project SETH-2 are presented. (authors)

  18. PANDA: a Large Scale Multi-Purpose Test Facility for LWR Safety Research

    International Nuclear Information System (INIS)

    Dreier, Joerg; Paladino, Domenico; Huggenberger, Max; Andreani, Michele; Yadigaroglu, George

    2008-01-01

PANDA is a large-scale, multi-purpose thermal-hydraulics test facility built and operated by PSI. Due to its modular structure, PANDA provides flexibility for a variety of applications, ranging from integral containment system investigations, primary system tests and component experiments to large-scale separate-effects tests. For many applications the experimental results are used directly, for example for concept demonstrations or for the characterisation of phenomena or components, but all the experimental data generated in the various test campaigns is unique and was and/or will be widely used for the validation and improvement of a variety of computer codes for reactor safety analysis, including codes with 3D capabilities. The paper provides an overview of the completed and on-going research programs performed in the PANDA facility in the different areas of application, including the main results and conclusions of the investigations. In particular, the advanced passive containment cooling system concept investigations for the SBWR, ESBWR and SWR1000 are presented in relation to various aspects, and the main findings are summarised. Finally, the goals, planned investigations and expected results of the on-going OECD project SETH-2 are presented. (authors)

  19. Model design for Large-Scale Seismic Test Program at Hualien, Taiwan

    International Nuclear Information System (INIS)

    Tang, H.T.; Graves, H.L.; Chen, P.C.

    1991-01-01

The Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, is a follow-on to the soil-structure interaction (SSI) experiments at Lotung, Taiwan. The planned SSI studies will be performed at a stiff soil site in Hualien, Taiwan, which historically has had slightly more destructive earthquakes than Lotung. The LSST is a joint effort among many interested parties. The Electric Power Research Institute (EPRI) and Taipower are the organizers of the program and have the lead in planning and managing it. Other organizations participating in the LSST program are the US Nuclear Regulatory Commission (NRC), the Central Research Institute of Electric Power Industry (CRIEPI), the Tokyo Electric Power Company (TEPCO), the Commissariat A L'Energie Atomique (CEA), Electricite de France (EdF) and Framatome. The LSST was initiated in January 1990 and is envisioned to be five years in duration. Based on the assumption of stiff soil, confirmed by soil boring and geophysical results, the test model was designed to provide data needed for SSI studies covering: free-field input, nonlinear soil response, non-rigid-body SSI, torsional response, kinematic interaction, spatial incoherency and other effects. Taipower had the lead in the design of the test model and received significant input from other LSST members. Questions raised by LSST members concerned embedment effects, model stiffness, base shear, and openings for equipment. This paper describes progress in site preparation, design and construction of the model, and development of an instrumentation plan

  20. 5 years of experience with a large-scale mentoring program for medical students

    Directory of Open Access Journals (Sweden)

    Pinilla, Severin

    2015-02-01

Full Text Available In this paper we present our 5-year experience with a large-scale mentoring program for undergraduate medical students at the Ludwig Maximilians-Universität Munich (LMU). We implemented a two-tiered program with a peer-mentoring concept for preclinical students and a 1:1-mentoring concept for clinical students, aided by a fully automated online-based matching algorithm. Approximately 20-30% of each student cohort participates in our voluntary mentoring program. Defining ideal program evaluation strategies, recruiting mentors from beyond the academic environment and accounting for the mentoring network reality remain challenging. We conclude that a two-tiered program is well accepted by students and faculty. In addition, online-based matching seems to be effective for large-scale mentoring programs.
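The abstract does not specify which matching algorithm the program uses; one standard choice for automated 1:1 assignment from ranked preferences is Gale-Shapley stable matching, sketched here with hypothetical student and mentor names:

```python
def stable_match(student_prefs, mentor_prefs):
    """Gale-Shapley: students propose in preference order, mentors hold the best proposal."""
    free = list(student_prefs)                   # students not yet matched
    next_choice = {s: 0 for s in student_prefs}  # index of next mentor to propose to
    match = {}                                   # mentor -> student (tentative)
    rank = {m: {s: i for i, s in enumerate(p)}   # each mentor's ranking of students
            for m, p in mentor_prefs.items()}
    while free:
        s = free.pop(0)
        m = student_prefs[s][next_choice[s]]
        next_choice[s] += 1
        if m not in match:
            match[m] = s
        elif rank[m][s] < rank[m][match[m]]:     # mentor prefers the new proposer
            free.append(match[m])
            match[m] = s
        else:
            free.append(s)                       # rejected; will try next mentor
    return {s: m for m, s in match.items()}

# Hypothetical preference lists.
students = {"ana": ["m1", "m2"], "ben": ["m1", "m2"]}
mentors  = {"m1": ["ben", "ana"], "m2": ["ana", "ben"]}
pairs = stable_match(students, mentors)          # {"ben": "m1", "ana": "m2"}
```

The result is stable: no student and mentor would both prefer each other over their assigned partners, which is a desirable property for any automated mentoring match.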

  1. Investigating and Stimulating Primary Teachers' Attitudes Towards Science: Summary of a Large-Scale Research Project

    Science.gov (United States)

    Walma van der Molen, Juliette; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical concept of attitude, methodological flaws in…

  2. Investigating and stimulating primary teachers’ attitudes towards science: Summary of a large-scale research project

    NARCIS (Netherlands)

    Walma van der Molen, Julie Henriëtte; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical

  3. The Need for Large-Scale, Longitudinal Empirical Studies in Middle Level Education Research

    Science.gov (United States)

    Mertens, Steven B.; Caskey, Micki M.; Flowers, Nancy

    2016-01-01

    This essay describes and discusses the ongoing need for large-scale, longitudinal, empirical research studies focused on middle grades education. After a statement of the problem and concerns, the essay describes and critiques several prior middle grades efforts and research studies. Recommendations for future research efforts to inform policy…

  4. A note on solving large-scale zero-one programming problems

    NARCIS (Netherlands)

    Adema, Jos J.

    1988-01-01

    A heuristic for solving large-scale zero-one programming problems is provided. The heuristic is based on the modifications made by H. Crowder et al. (1983) to the standard branch-and-bound strategy. First, the initialization is modified. The modification is only useful if the objective function

  5. Large-scale budget applications of mathematical programming in the Forest Service

    Science.gov (United States)

    Malcolm Kirby

    1978-01-01

Mathematical programming applications in the Forest Service, U.S. Department of Agriculture, are growing. They are being used for widely varying problems: budgeting, land use planning, timber transport, road maintenance and timber harvest planning. Large-scale applications are being made in budgeting. The model that is described can be used by developing economies....

  6. Large-scale research in the Federal Republic of Germany. Pt. 4

    International Nuclear Information System (INIS)

    Mock, W.

    1986-01-01

The name is misleading: in the biggest of 13 large-scale research institutions, the KFA Nuclear Research Centre Juelich, nuclear research is now only one sphere of activity among many, besides such other areas as computer science, materials research, and environmental research. This change in the areas of main emphasis constitutes the successful attempt - or so it seems up to now - of a 'research dinosaur' to respond to the demands of an altered 'research landscape'. (orig.) [de]

  7. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming.

    Science.gov (United States)

    Moreau, Thomas; Evans, Amanda L; Vasquez, Louella; Tijssen, Marloes R; Yan, Ying; Trotter, Matthew W; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M; Pask, Dean C; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H; Pedersen, Roger A; Ghevaert, Cedric

    2016-04-07

The production of megakaryocytes (MKs)--the precursors of blood platelets--from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90% reaching up to 2 × 10^5 mature MKs per input hPSC. Functional platelets are generated throughout the culture allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable to both in vitro production of platelets for transfusion and basic research in MK and platelet biology.

  8. Finite Mixture Multilevel Multidimensional Ordinal IRT Models for Large Scale Cross-Cultural Research

    Science.gov (United States)

    de Jong, Martijn G.; Steenkamp, Jan-Benedict E. M.

    2010-01-01

    We present a class of finite mixture multilevel multidimensional ordinal IRT models for large scale cross-cultural research. Our model is proposed for confirmatory research settings. Our prior for item parameters is a mixture distribution to accommodate situations where different groups of countries have different measurement operations, while…

  9. Large-scale seismic test for soil-structure interaction research in Hualien, Taiwan

    International Nuclear Information System (INIS)

    Ueshima, T.; Kokusho, T.; Okamoto, T.

    1995-01-01

It is important to evaluate dynamic soil-structure interaction more accurately in the aseismic design of important facilities such as nuclear power plants. A large-scale model structure, about 1/4 the scale of a commercial nuclear power plant, was constructed on gravelly layers in seismically active Hualien, Taiwan. This international joint project is called 'the Hualien LSST Project', where 'LSST' is short for Large-Scale Seismic Test. This paper describes the research tasks and responsibilities in this Project, the progress of the construction work and research tasks along the time-line, and the main results obtained up to now. (J.P.N.)

  10. Modular, object-oriented redesign of a large-scale Monte Carlo neutron transport program

    International Nuclear Information System (INIS)

    Moskowitz, B.S.

    2000-01-01

    This paper describes the modular, object-oriented redesign of a large-scale Monte Carlo neutron transport program. This effort represents a complete 'white sheet of paper' rewrite of the code. In this paper, the motivation driving this project, the design objectives for the new version of the program, and the design choices and their consequences will be discussed. The design itself will also be described, including the important subsystems as well as the key classes within those subsystems

  11. A Heuristic Approach to Author Name Disambiguation in Bibliometrics Databases for Large-scale Research Assessments

    NARCIS (Netherlands)

    D'Angelo, C.A.; Giuffrida, C.; Abramo, G.

    2011-01-01

    National exercises for the evaluation of research activity by universities are becoming regular practice in ever more countries. These exercises have mainly been conducted through the application of peer-review methods. Bibliometrics has not been able to offer a valid large-scale alternative because

  12. National studies on recidivism: an inventory of large-scale recidivism research in 33 European countries

    NARCIS (Netherlands)

    Wartna, B.S.J.; Nijssen, L.T.J.

    2006-01-01

    Measuring recidivism is an established method for examining the effects of penal interventions. Over the last decades the automation of police and judiciary data has opened up opportunities to do large-scale recidivism research. The WODC has made an inventory of the studies that are carried out in

  13. Assessing Programming Costs of Explicit Memory Localization on a Large Scale Shared Memory Multiprocessor

    Directory of Open Access Journals (Sweden)

    Silvio Picano

    1992-01-01

Full Text Available We present detailed experimental work involving a commercially available large-scale shared-memory multiple instruction stream-multiple data stream (MIMD) parallel computer having a software-controlled cache coherence mechanism. To make effective use of such an architecture, the programmer is responsible for designing the program's structure to match the underlying multiprocessor's capabilities. We describe the techniques used to exploit our multiprocessor (the BBN TC2000) on a network simulation program, showing the resulting performance gains and the associated programming costs. We show that an efficient implementation relies heavily on the user's ability to explicitly manage the memory system.

  14. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.

  15. A European collaboration research programme to study and test large scale base isolated structures

    International Nuclear Information System (INIS)

    Renda, V.; Verzeletti, G.; Papa, L.

    1995-01-01

Improving the technology of innovative anti-seismic mechanisms, such as those for base isolation and energy dissipation, requires the capability to test large-scale models of structures integrated with these mechanisms. Such experimental tests are of primary importance for the validation of design rules and the development of advanced earthquake engineering for civil constructions of relevant interest. The Joint Research Centre of the European Commission offers the European Laboratory for Structural Assessment, located at Ispra, Italy, as a focal point for an international European collaboration research programme to test large-scale models of structures making use of innovative anti-seismic mechanisms. A collaboration contract, open to other future contributions, has been signed with the national Italian working group on seismic isolation (Gruppo di Lavoro sull'Isolamento Sismico, GLIS), which includes the national research centre ENEA, the national electricity board ENEL, the industrial research centre ISMES and the isolator producer ALGA. (author). 3 figs

  16. Prediction of monthly rainfall on homogeneous monsoon regions of India based on large scale circulation patterns using Genetic Programming

    Science.gov (United States)

    Kashid, Satishkumar S.; Maity, Rajib

    2012-08-01

Summary: Prediction of Indian Summer Monsoon Rainfall (ISMR) is of vital importance for the Indian economy, and it has remained a great challenge for hydro-meteorologists due to inherent complexities in the climatic system. Large-scale atmospheric circulation patterns from the tropical Pacific Ocean (ENSO) and the tropical Indian Ocean (EQUINOO) are established to influence the Indian Summer Monsoon Rainfall. The information in these two large-scale atmospheric circulation patterns, in terms of their indices, is used to model the complex relationship between Indian Summer Monsoon Rainfall and the ENSO and EQUINOO indices. However, extracting the signal from such large-scale indices for modeling such complex systems is significantly difficult. Rainfall predictions have been made for 'All India' as one unit, as well as for five 'homogeneous monsoon regions of India' defined by the Indian Institute of Tropical Meteorology. The 'Artificial Intelligence' tool Genetic Programming (GP) has been employed for modeling this problem. The Genetic Programming approach is found to capture the complex relationship between monthly Indian Summer Monsoon Rainfall and the large-scale atmospheric circulation pattern indices ENSO and EQUINOO. Research findings of this study indicate that GP-derived monthly rainfall forecasting models that use large-scale atmospheric circulation information are successful in predicting All India Summer Monsoon Rainfall with a correlation coefficient as high as 0.866, which appears attractive for such a complex system. A separate analysis is carried out for All India Summer Monsoon Rainfall for India as one unit, and for the five homogeneous monsoon regions, based on the ENSO and EQUINOO indices of March, April and May only, performed at the end of May. In this case, All India Summer Monsoon Rainfall could be predicted with a correlation coefficient of 0.70, with somewhat lower Correlation Coefficient (C.C.) values for different
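The paper's GP configuration is not given in the abstract; the following toy symbolic-regression GP (random expression trees, mean-squared-error fitness, mutation with elitist selection) illustrates the mechanism on synthetic data standing in for the ENSO/EQUINOO predictors:

```python
import random

random.seed(42)

# Expression trees: a terminal ("x" or a constant) or (op, left, right).
OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b, "*": lambda a, b: a * b}
TERMINALS = ["x", 1.0, 2.0]

def random_tree(depth=3):
    """Grow a random expression tree of bounded depth."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mse(tree, data):
    err = sum((evaluate(tree, x) - y) ** 2 for x, y in data) / len(data)
    # Map numeric overflow (inf/nan) to worst possible fitness.
    return err if err == err and err != float("inf") else float("inf")

def mutate(tree):
    """Replace a random subtree with a freshly grown one."""
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return random_tree(2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

# Synthetic predictand: y = 2x + 1 (not the paper's rainfall series).
data = [(float(x), 2.0 * x + 1.0) for x in range(-5, 6)]

pop = [random_tree() for _ in range(50)]
history = []
for gen in range(30):
    pop.sort(key=lambda t: mse(t, data))
    history.append(mse(pop[0], data))
    survivors = pop[:10]                 # elitism: the best trees survive intact
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

best = min(pop, key=lambda t: mse(t, data))
```

Because of elitism the best fitness never worsens between generations; a real GEP/GP setup would add crossover, richer function sets and the actual circulation indices as inputs.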

  17. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

The large-scale reflood test, undertaken with a view to ensuring the safety of light water reactors, was started in fiscal 1976 based on the special account act for power source development promotion measures, under entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents through joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  18. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

The large-scale ship plane-segmentation intelligent workshop is a new development, and there has been no research work in related fields at home or abroad. The mode of production needs to be transformed from the existing Industry 2.0 (or partly Industry 3.0) level, that is, from "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great many tasks need to be determined in terms of management and technology, such as workshop structure evolution, development of intelligent equipment and changes in the business model; along with these comes the reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane-segmentation intelligent workshop and analyzes the workshop's working efficiency, which is significant for the next step of the transformation.

  19. Research on unit commitment with large-scale wind power connected power system

    Science.gov (United States)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

Large-scale integration of wind power generators into the power grid brings severe challenges to power system economic dispatch due to stochastic volatility. Unit commitment including wind farms is analyzed in two parts: modeling and solution methods. The structures and characteristics of the models are summarized after classifying them according to their objective functions and constraints. Finally, the issues still to be solved and possible directions of future research and development are discussed, adapted to the requirements of the electricity market, energy-saving generation dispatch and the smart grid, providing a reference for researchers and practitioners in this field.
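As a concrete, drastically simplified illustration of the unit commitment problem this record surveys, the sketch below exhaustively searches per-hour on/off decisions for a hypothetical three-unit system; it ignores wind uncertainty and the inter-hour coupling (ramping, minimum up/down times) that real formulations must model:

```python
from itertools import product

# Hypothetical units: (fixed cost per hour when on, marginal cost, capacity in MW).
units = [(10.0, 2.0, 50.0),   # cheap baseload
         (5.0, 4.0, 40.0),    # mid-merit
         (2.0, 8.0, 30.0)]    # expensive peaker
demand = [60.0, 90.0, 110.0, 70.0]  # MW per hour

def hour_cost(on_mask, load):
    """Dispatch committed units in merit order; return cost, or None if infeasible."""
    committed = [u for u, on in zip(units, on_mask) if on]
    if sum(cap for _, _, cap in committed) < load:
        return None
    cost, remaining = 0.0, load
    for fixed, marginal, cap in sorted(committed, key=lambda u: u[1]):
        out = min(cap, remaining)
        cost += fixed + marginal * out
        remaining -= out
    return cost

def commit(demand):
    """Exhaustive per-hour search (no ramping or min up/down constraints)."""
    schedule, total = [], 0.0
    for load in demand:
        options = [(hour_cost(mask, load), mask)
                   for mask in product([0, 1], repeat=len(units))]
        best_cost, best_mask = min((c, m) for c, m in options if c is not None)
        schedule.append(best_mask)
        total += best_cost
    return schedule, total

schedule, total = commit(demand)
```

With 2^N commitment states per hour the enumeration explodes quickly, which is why practical unit commitment (with or without wind) is solved as a mixed-integer program, as the surveyed literature does.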

  20. Research and development of safeguards measures for the large scale reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Masahiro; Sato, Yuji; Yokota, Yasuhiro; Masuda, Shoichiro; Kobayashi, Isao; Uchikoshi, Seiji; Tsutaki, Yasuhiro; Nidaira, Kazuo [Nuclear Material Control Center, Tokyo (Japan)

    1994-12-31

The Government of Japan agreed on the safeguards concepts for a commercial-size reprocessing plant under the bilateral agreement for cooperation between Japan and the United States. In addition, LASCAR, the forum on large-scale reprocessing plant safeguards, obtained fruitful results in the spring of 1992. The research and development of safeguards measures for the Rokkasho Reprocessing Plant should proceed with full regard to the concepts described in both documents. Basically, the material accountancy and monitoring system should be established based on NRTA and other measures in order to meet the timeliness goal for plutonium, together with an unattended-mode inspection approach based on an integrated containment/surveillance system coupled with radiation monitoring in order to reduce inspection effort. NMCC has been studying the following measures for large-scale reprocessing plant safeguards: (1) a radiation gate monitor and integrated surveillance system; (2) near-real-time Shipper and Receiver Difference monitoring; (3) a near-real-time material accountancy system operated for the bulk handling area; (4) a volume measurement technique for a large input accountancy vessel; (5) an in-process inventory estimation technique applied to process equipment such as the pulse column and evaporator; (6) a solution transfer monitoring approach applied to buffer tanks in the chemical process; and (7) a timely analysis technique such as a hybrid K-edge densitometer operated in the on-site laboratory. (J.P.N.)

  1. Impact of Large Scale Energy Efficiency Programs On Consumer Tariffs and Utility Finances in India

    Energy Technology Data Exchange (ETDEWEB)

    Abhyankar, Nikit; Phadke, Amol

    2011-01-20

    Large-scale EE programs would modestly increase tariffs but significantly reduce consumers' electricity bills. However, the primary benefit of EE programs is a significant reduction in power shortages, which might make these programs politically acceptable even if tariffs increase. To increase political support, utilities could pursue programs that result in minimal tariff increases. This can be achieved in four ways: (a) focus only on low-cost programs (such as replacing electric water heaters with gas water heaters); (b) sell power conserved through the EE program to the market at a price higher than the cost of peak power purchases; (c) focus on programs where a partial utility subsidy of incremental capital cost might work; and (d) increase the number of participating consumers by offering a basket of EE programs to fit all consumer subcategories and tariff tiers. Large-scale EE programs can result in consistently negative cash flows and significantly erode the utility's overall profitability. If the utility is facing shortages, the cash flow is very sensitive to the marginal tariff of the unmet demand. This will have an important bearing on the choice of EE programs in Indian states where low-paying rural and agricultural consumers form the majority of the unmet demand. These findings clearly call for a flexible, sustainable solution to the cash-flow management issue. One option is to include a mechanism like FAC in the utility incentive mechanism. Another sustainable solution might be to build the net program cost and revenue loss into the utility's revenue requirement, and thus into consumer tariffs, up front. However, the latter approach requires institutionalization of EE as a resource. The utility incentive mechanisms would be able to address the utility disincentive of forgone long-run return but have a minor impact on consumer benefits. Fundamentally, providing incentives for EE programs to make them comparable to supply

  2. Talking About The Smokes: a large-scale, community-based participatory research project.

    Science.gov (United States)

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. The processes described cover consultation and approval; partnerships and research agreements; communication; funding; ethics and consent; and data ownership and the benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships in fostering shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  3. Mining the mind research network: a novel framework for exploring large scale, heterogeneous translational neuroscience research data sources.

    Directory of Open Access Journals (Sweden)

    Henry Jeremy Bockholt

    2010-04-01

    A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such a NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from 7 different institutions. The MRN tools permit the collection, management, reporting and efficient use of large scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining.

  4. Mining the Mind Research Network: A Novel Framework for Exploring Large Scale, Heterogeneous Translational Neuroscience Research Data Sources

    Science.gov (United States)

    Bockholt, Henry J.; Scully, Mark; Courtney, William; Rachakonda, Srinivas; Scott, Adam; Caprihan, Arvind; Fries, Jill; Kalyanam, Ravi; Segall, Judith M.; de la Garza, Raul; Lane, Susan; Calhoun, Vince D.

    2009-01-01

    A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such a NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining. PMID:20461147

  5. Quantifying expert consensus against the existence of a secret, large-scale atmospheric spraying program

    Science.gov (United States)

    Shearer, Christine; West, Mick; Caldeira, Ken; Davis, Steven J.

    2016-08-01

    Nearly 17% of people in an international survey said they believed the existence of a secret large-scale atmospheric spraying program (SLAP) to be true or partly true. SLAP is commonly referred to as ‘chemtrails’ or ‘covert geoengineering’, and has led to a number of websites purporting to show evidence of widespread chemical spraying linked to negative impacts on human health and the environment. To address these claims, we surveyed two groups of experts (atmospheric chemists with expertise in condensation trails, and geochemists working on atmospheric deposition of dust and pollution) to scientifically evaluate for the first time the claims of SLAP theorists. Results show that 76 of the 77 scientists (98.7%) that took part in this study said they had not encountered evidence of a SLAP, and that the data cited as evidence could be explained through other factors, including well-understood physics and chemistry associated with aircraft contrails and atmospheric aerosols. Our goal is not to sway those already convinced that there is a secret, large-scale spraying program, who often reject counter-evidence as further proof of their theories, but rather to establish a source of objective science that can inform public discourse.

  6. Results of research and development in large-scale research centers as an innovation source for firms

    International Nuclear Information System (INIS)

    Theenhaus, R.

    1978-01-01

    The twelve large-scale research centres of the Federal Republic of Germany, with their 16,000 employees, represent a considerable scientific and technical potential. Cooperation with industry on large-scale projects has already become very close, and the associated know-how flow and contributions to innovation are well established. The first successful steps towards utilizing the results of basic research, of spin-offs, and of research and development work, as well as the provision of services, are encouraging. However, a number of detailed problems remain that can only be solved jointly by all parties concerned, in particular industry and the large-scale research centres. (orig./RW) [de

  7. BREEDER: a microcomputer program for financial analysis of a large-scale prototype breeder reactor

    International Nuclear Information System (INIS)

    Giese, R.F.

    1984-04-01

    This report describes BREEDER, a microcomputer-based, single-project financial analysis program. BREEDER is a user-friendly model designed to facilitate frequent and rapid analyses of the financial implications associated with alternative design and financing strategies for electric generating plants, and for large-scale prototype breeder (LSPB) reactors in particular. The model has proved to be a useful tool in establishing cost goals for LSPB reactors. The program is available on floppy disks for use on an IBM personal computer (or IBM compatible) running under PC-DOS, or on a Kaypro II transportable computer running under CP/M (and many other CP/M machines). The report documents version 1.5 of BREEDER and contains a user's guide. The report also includes a general overview of BREEDER, a summary of hardware requirements, a definition of all required program inputs, a description of all algorithms used in performing the construction-period and operation-period analyses, and a summary of all available reports. The appendixes contain a complete source-code listing, a cross-reference table, a sample interactive session, several sample runs, and additional documentation of the net-equity program option
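The construction-period and operation-period analyses that BREEDER documents can be illustrated with a toy calculation of the same flavor. The interest-during-construction accrual and fixed-charge-rate levelization below are generic textbook methods with invented figures, not the report's actual algorithms.

```python
def capitalized_cost(overnight_cost, years, rate):
    """Spread the overnight cost evenly over the build and accrue
    interest during construction (IDC) on the growing balance."""
    balance = 0.0
    spend = overnight_cost / years
    for _ in range(years):
        balance = (balance + spend) * (1.0 + rate)
    return balance

def levelized_cost_per_mwh(capital, fixed_charge_rate, om_per_year, mwh_per_year):
    """Annual revenue requirement (capital charge + O&M) per MWh generated."""
    annual = capital * fixed_charge_rate + om_per_year
    return annual / mwh_per_year

# Invented example: $3B overnight cost, 8-year build at 8% financing,
# a 1350 MW plant at 75% capacity factor, $90M/yr O&M, 12% fixed charge rate.
cap = capitalized_cost(3.0e9, years=8, rate=0.08)
lcoe = levelized_cost_per_mwh(cap, 0.12, 9.0e7, 1350 * 8760 * 0.75)
```

The long construction period is what makes IDC dominate: here the capitalized cost ends up well above the $3B overnight figure before the plant earns its first dollar, which is why financing strategy matters so much for a prototype breeder.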

  8. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions, based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance, by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  9. Obtaining large-scale funding for empowerment-oriented qualitative research: a report from personal experience.

    Science.gov (United States)

    Padgett, Deborah K; Henwood, Benjamin F

    2009-06-01

    Obtaining funding for qualitative research remains a challenge despite greater openness to methodological pluralism. Such hurdles are presumably compounded when the proposed study employs empowerment theory, rendering it susceptible to charges of elevating ideology over rigor. This article draws on the authors' experience in securing large-scale funding for an empowerment-oriented qualitative study of homeless mentally ill adults. Lessons learned include the importance of weaving empowerment theory into the proposal's "argument," and of infusing empowerment values into study protocols while simultaneously paying close attention to rigorous and transparent methods. Additional benefits accrue from having prior relationships with study sites and being willing to revise and resubmit proposals whenever possible. Though representing a fraction of all externally funded projects in the United States, qualitative research has tremendous untapped potential for success in this competitive arena: success that need not entail surrendering a commitment to empowerment values.

  10. First Joint Workshop on Energy Management for Large-Scale Research Infrastructures

    CERN Document Server

    2011-01-01

    CERN, ERF (European Association of National Research Facilities) and ESS (European Spallation Source) announce the first Joint Workshop on Energy Management for Large-Scale Research Infrastructures. The event will take place on 13-14 October 2011 at the ESS office in Sparta, Lund, Sweden. The workshop will bring together international experts on energy and representatives from laboratories and future projects all over the world in order to identify the challenges and best practice with respect to energy efficiency and optimization, solutions and implementation, as well as to review the challenges represented by potential future technical solutions and the tools for effective collaboration. Further information at: http://ess-scandinavia.eu/general-information

  11. Comparison of particle swarm optimization and dynamic programming for large scale hydro unit load dispatch

    International Nuclear Information System (INIS)

    Cheng Chuntian; Liao Shengli; Tang Zitian; Zhao Mingyan

    2009-01-01

    Dynamic programming (DP) is one of the classic and sophisticated optimization methods that have been successfully applied to the problem of hydro unit load dispatch (HULD). However, DP faces the curse of dimensionality as the number of units and the installed generating capacity of a hydropower station increase. With the appearance of huge hydropower stations similar to the Three Gorges, with 26 generators of 700 MW each, it is hard to apply DP to the large scale HULD problem, and it is crucial to seek other optimization techniques in order to improve operation quality and efficiency. Unlike most of the literature on power generation scheduling, which focuses on comparing novel PSO algorithms with other techniques, this paper emphasizes a comparative study of PSO and DP based on a case hydropower station. The objective is to identify an effective and feasible method for current and future large scale hydropower stations in China. The paper first compares the performance of PSO and DP using a sample load curve of the Wujiangdu hydropower plant, located upstream on the Yangtze River in China, which contains five units with an installed capacity of 1250 MW. Next, the effect of different load intervals and unit numbers on the optimal results and efficiency of the two methods is examined. The comparison results show that PSO is feasible for HULD. Furthermore, we simulated the effect of the magnitude of unit number and load capacity on the optimal results and computation time. The simulation comparisons show that PSO has a great advantage over DP in efficiency and will be one of the effective methods for the HULD problem of huge hydropower stations.
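A minimal PSO for one dispatch interval can look like the sketch below (a generic illustration, not the paper's implementation): the plant load is split among units to minimise total water discharge, with invented quadratic flow curves and a penalty term enforcing the load balance.

```python
import random
random.seed(1)  # reproducible run for this illustration

# Invented flow curves q_i(p) = a*p^2 + b*p + c per unit, discharge vs power.
FLOW = [(0.02, 1.0, 5.0), (0.025, 0.9, 6.0), (0.018, 1.1, 4.0)]
P_LOAD, P_MAX = 600.0, 250.0  # plant load (MW) and per-unit output cap

def discharge(p):
    pen = 1e3 * abs(sum(p) - P_LOAD)  # penalty when unit outputs miss the load
    return sum(a * x * x + b * x + c for (a, b, c), x in zip(FLOW, p)) + pen

def pso(n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    dim = len(FLOW)
    pos = [[random.uniform(0, P_MAX) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=discharge)
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                # Clamp each unit's output to its feasible range.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], 0.0), P_MAX)
            if discharge(pos[i]) < discharge(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=discharge)
    return gbest

best = pso()
print(sum(best))  # ~600 MW once the penalty drives the load balance
```

Unlike DP, the swarm's cost does not grow combinatorially with the number of units, which is the efficiency advantage the comparison study reports.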

  12. Comparison of particle swarm optimization and dynamic programming for large scale hydro unit load dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Cheng Chuntian, E-mail: ctcheng@dlut.edu.c [Department of Civil and Hydraulic Engineering, Dalian University of Technology, 116024 Dalian (China); Liao Shengli; Tang Zitian [Department of Civil and Hydraulic Engineering, Dalian University of Technology, 116024 Dalian (China); Zhao Mingyan [Department of Environmental Science and Engineering, Tsinghua University, 100084 Beijing (China)

    2009-12-15

    Dynamic programming (DP) is one of the classic and sophisticated optimization methods that have been successfully applied to the problem of hydro unit load dispatch (HULD). However, DP faces the curse of dimensionality as the number of units and the installed generating capacity of a hydropower station increase. With the appearance of huge hydropower stations similar to the Three Gorges, with 26 generators of 700 MW each, it is hard to apply DP to the large scale HULD problem, and it is crucial to seek other optimization techniques in order to improve operation quality and efficiency. Unlike most of the literature on power generation scheduling, which focuses on comparing novel PSO algorithms with other techniques, this paper emphasizes a comparative study of PSO and DP based on a case hydropower station. The objective is to identify an effective and feasible method for current and future large scale hydropower stations in China. The paper first compares the performance of PSO and DP using a sample load curve of the Wujiangdu hydropower plant, located upstream on the Yangtze River in China, which contains five units with an installed capacity of 1250 MW. Next, the effect of different load intervals and unit numbers on the optimal results and efficiency of the two methods is examined. The comparison results show that PSO is feasible for HULD. Furthermore, we simulated the effect of the magnitude of unit number and load capacity on the optimal results and computation time. The simulation comparisons show that PSO has a great advantage over DP in efficiency and will be one of the effective methods for the HULD problem of huge hydropower stations.

  14. Can wide consultation help with setting priorities for large-scale biodiversity monitoring programs?

    Directory of Open Access Journals (Sweden)

    Frédéric Boivin

    Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs with respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals with expected interest and knowledge about biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted (1) a monitoring design covering the entire territory and focusing on natural habitats; and (2) a focus on species related to ecosystem services, on threatened species and on invasive species. The only demographic characteristic related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small.

  15. Can wide consultation help with setting priorities for large-scale biodiversity monitoring programs?

    Science.gov (United States)

    Boivin, Frédéric; Simard, Anouk; Peres-Neto, Pedro

    2014-01-01

    Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs with respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals with expected interest and knowledge about biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted (1) a monitoring design covering the entire territory and focusing on natural habitats; and (2) a focus on species related to ecosystem services, on threatened species and on invasive species. The only demographic characteristic related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small.

  16. From efficacy research to large-scale impact on undernutrition: the role of organizational cultures.

    Science.gov (United States)

    Pelletier, David; Pelto, Gretel

    2013-11-01

    Undernutrition in low-income countries is receiving unprecedented attention at global and national levels due to the convergence of many forces, including strong evidence concerning its magnitude, consequences, and potential solutions and effective advocacy by many organizations. The translation of this attention into large-scale reductions in undernutrition at the country level requires the alignment and support of many organizations in the development and implementation of a coherent policy agenda for nutrition, including the strengthening of operational and strategic capacities and a supportive research agenda. However, many countries experience difficulties achieving such alignment. This article uses the concept of organizational culture to better understand some of the reasons for these difficulties. This concept is applied to the constellation of organizations that make up the "National Nutrition Network" in a given country and to some of the individual organizations within that network, including academic institutions that conduct research on undernutrition. We illustrate this concept through a case study involving a middle-income country. We conclude that efforts to align organizations in support of coherent nutrition agendas should do the following: 1) make intentional and sustained efforts to foster common understanding, shared learning, and socialization of new members and other elements of a shared culture among partners; 2) seek a way to frame problems and solutions in a fashion that enables individual organizations to secure some of their particular interests by joining the effort; and 3) not only advocate for the importance of nutrition but also insist that high-level officials hold organizations accountable for aligning in support of common-interest solutions (through some elements of a common culture) that can be effective and appropriate in the national context. We further conclude that a culture change is needed within academic departments if the

  17. Integrating scientific knowledge into large-scale restoration programs: the CALFED Bay-Delta Program experience

    Science.gov (United States)

    Taylor, K.A.; Short, A.

    2009-01-01

    Integrating science into resource management activities is a goal of the CALFED Bay-Delta Program, a multi-agency effort to address water supply reliability, ecological condition, drinking water quality, and levees in the Sacramento-San Joaquin Delta of northern California. Under CALFED, many different strategies were used to integrate science, including interaction between the research and management communities, public dialogues about scientific work, and peer review. This paper explores ways science was (and was not) integrated into CALFED's management actions and decision systems through three narratives describing different patterns of scientific integration and application in CALFED. Though a collaborative process and certain organizational conditions may be necessary for developing new understandings of the system of interest, we find that those factors are not sufficient for translating that knowledge into management actions and decision systems. We suggest that the application of knowledge may be facilitated or hindered by (1) differences in the objectives, approaches, and cultures of scientists operating in the research community and those operating in the management community and (2) other factors external to the collaborative process and organization.

  18. Cohort Profile of The GOALS Study: A Large-scale Research of Physical Activity in Dutch Students

    NARCIS (Netherlands)

    De Groot, Renate; Van Dijk, Martin; Kirschner, Paul A.

    2016-01-01

    The GOALS study (Grootschalig Onderzoek naar Activiteiten van Limburgse Scholieren [Large-scale Research of Activities in Dutch Students]) was set up to investigate possible associations between different forms of physical activity and inactivity with cognitive performance, academic achievement and

  19. Collaborative mining and interpretation of large-scale data for biomedical research insights.

    Directory of Open Access Journals (Sweden)

    Georgia Tsiliki

    Biomedical research is becoming increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing the available large-scale volumes of complex, multi-faceted data residing in different sources. In line with research indicating that, in spite of recent advances in data mining and computational analysis, humans can easily detect patterns that computer algorithms may have difficulty finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision-making processes. User experience shows that the platform enables more informed and quicker decisions by displaying aggregated information according to users' needs, while also exploiting the associated human intelligence.

  20. A Review of Research on Large Scale Modern Vertical Axis Wind Turbines at Uppsala University

    Directory of Open Access Journals (Sweden)

    Senad Apelfröjd

    2016-07-01

    This paper presents a review of over a decade of research on Vertical Axis Wind Turbines (VAWTs) conducted at Uppsala University. The paper presents, among other things, an overview of the 200 kW VAWT located in Falkenberg, Sweden, as well as a description of the work done on the 12 kW prototype VAWT in Marsta, Sweden. Several key aspects have been tested and successfully demonstrated at our two experimental research sites. The VAWT research effort has been aimed at developing a robust large scale VAWT technology based on an electrical control system with a direct driven energy converter. This approach allows for a simplification in which most or all of the control of the turbines can be managed by the electrical converter system, reducing investment costs and the need for maintenance. The concept features an H-rotor that is omnidirectional with respect to wind direction, meaning that it can extract energy from all wind directions without the need for a yaw system. The turbine is connected to a direct driven permanent magnet synchronous generator (PMSG), located at ground level, that is specifically developed to control and extract power from the turbine. The research is ongoing and aims for a multi-megawatt VAWT in the near future.

  1. Comparisons of benthic filter feeder communities before and after a large-scale capital dredging program.

    Science.gov (United States)

    Abdul Wahab, Muhammad Azmi; Fromont, Jane; Gomez, Oliver; Fisher, Rebecca; Jones, Ross

    2017-09-15

    Changes in turbidity, sedimentation and light over a two-year large-scale capital dredging program at Onslow, northwestern Australia, were quantified to assess their effects on filter feeder communities, in particular sponges. Community functional morphological composition was quantified using towed video surveys, while dive surveys allowed for assessments of species composition and chlorophyll content. Onslow is relatively diverse, recording 150 sponge species. The area was naturally turbid (mean P80 of 1.1 NTU), with inshore sites recording 6.5× higher turbidity than offshore localities, likely influenced by the Ashburton River discharge. Turbidity and sedimentation increased by up to 146% and 240% through dredging, respectively, with corresponding decreases in light levels. The effects of dredging were variable, and despite existing caveats (i.e. a bleaching event and the passage of a cyclone), the persistence of sponges and the absence of a pronounced response post-dredging suggest environmental filtering or passive adaptation acquired pre-dredging may have benefited these communities. Copyright © 2017. Published by Elsevier Ltd.

  2. Punctuated equilibrium in the large-scale evolution of programming languages†

    Science.gov (United States)

    Valverde, Sergi; Solé, Ricard V.

    2015-01-01

    The analogies and differences between biological and cultural evolution have been explored by evolutionary biologists, historians, engineers and linguists alike. Two well-known domains of cultural change are language and technology. Both share some traits relating the evolution of species, but technological change is very difficult to study. A major challenge in our way towards a scientific theory of technological evolution is how to properly define evolutionary trees or clades and how to weight the role played by horizontal transfer of information. Here, we study the large-scale historical development of programming languages, which have deeply marked social and technological advances in the last half century. We analyse their historical connections using network theory and reconstructed phylogenetic networks. Using both data analysis and network modelling, it is shown that their evolution is highly uneven, marked by innovation events where new languages are created out of improved combinations of different structural components belonging to previous languages. These radiation events occur in a bursty pattern and are tied to novel technological and social niches. The method can be extrapolated to other systems and consistently captures the major classes of languages and the widespread horizontal design exchanges, revealing a punctuated evolutionary path. PMID:25994298
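    The network view used in this record can be illustrated with a toy sketch: languages as nodes, "influenced-by" links as directed edges, bursts read off yearly creation counts, and horizontal transfer visible as languages that combine two or more parent designs. The edge list below is a small, well-known subset chosen for illustration, not the paper's dataset.

```python
# Toy influence network of programming languages (illustrative subset).
influences = {            # language: (year introduced, languages it draws on)
    "Fortran":   (1957, []),
    "Lisp":      (1958, []),
    "Algol":     (1958, ["Fortran"]),
    "C":         (1972, ["Algol"]),
    "Smalltalk": (1972, ["Lisp"]),
    "C++":       (1983, ["C", "Smalltalk"]),
    "Java":      (1995, ["C++", "Smalltalk"]),
    "Scala":     (2004, ["Java", "Lisp"]),
}

def creation_bursts(data):
    """Count new languages per year; radiation events show up as
    years in which several languages appear at once."""
    years = {}
    for _, (year, _) in data.items():
        years[year] = years.get(year, 0) + 1
    return years

def hybrid_languages(data):
    """Languages combining two or more parent designs, i.e. the
    horizontal design exchanges emphasised in the abstract."""
    return [lang for lang, (_, parents) in data.items() if len(parents) >= 2]
```

    Even on this tiny subset, the two simple measures recover the qualitative picture: bursts of simultaneous creation and widespread multi-parent hybrids.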


  4. Large Scale Computing and Storage Requirements for Basic Energy Sciences Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Wasserman, Harvey

    2011-03-31

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility supporting research within the Department of Energy's Office of Science. NERSC provides high-performance computing (HPC) resources to approximately 4,000 researchers working on about 400 projects. In addition to hosting large-scale computing facilities, NERSC provides the support and expertise scientists need to use HPC systems effectively and efficiently. In February 2010, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR) and DOE's Office of Basic Energy Sciences (BES) held a workshop to characterize HPC requirements for BES research through 2013. The workshop was part of NERSC's legacy of anticipating users' future needs and deploying the necessary resources to meet these demands. Workshop participants reached a consensus on several key findings, in addition to achieving the workshop's goal of collecting and characterizing computing requirements. The key requirements for scientists conducting research in BES are: (1) larger allocations of computational resources; (2) continued support for standard application software packages; (3) adequate job turnaround time and throughput; and (4) guidance and support for using future computer architectures. This report expands upon these key points and presents others. Several 'case studies' are included as significant representative samples of the needs of science teams within BES. Research teams' scientific goals, computational methods of solution, current and 2013 computing requirements, and special software and support needs are summarized in these case studies. Also included are researchers' strategies for computing in the highly parallel, 'multi-core' environment that is expected to dominate HPC architectures over the next few years. NERSC has strategic plans and initiatives already underway that address key workshop findings. This report includes a

  5. The Hualien Large-Scale Seismic Test for soil-structure interaction research

    International Nuclear Information System (INIS)

    Tang, H.T.; Stepp, J.C.; Cheng, Y.H.

    1991-01-01

    A Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, has been initiated with the primary objective of obtaining earthquake-induced SSI data at a stiff soil site whose soil conditions are similar to those of prototypical nuclear power plants. Preliminary soil boring, geophysical testing, and ambient and earthquake-induced ground motion monitoring have been conducted to understand the experiment site conditions. More refined field and laboratory tests will be conducted, such as the state-of-the-art freezing sampling technique and the large penetration test (LPT) method, to characterize the soil constitutive behavior. The test model to be constructed will be similar to the Lotung model. The instrumentation layout will be designed to provide data for studies of SSI, spatial incoherence, soil stability, foundation uplifting, ground motion wave field and structural response. A consortium consisting of EPRI, Taipower, CRIEPI, TEPCO, CEA, EdF and Framatome has been established to carry out the project. It is envisaged that the Hualien SSI array will be ready to record earthquakes by the middle of 1992. The recording is scheduled to last five years. (author)

  6. Commercial applications of large-scale Research and Development computer simulation technologies

    International Nuclear Information System (INIS)

    Kuok Mee Ling; Pascal Chen; Wen Ho Lee

    1998-01-01

    The potential commercial applications of two large-scale R and D computer simulation technologies are presented. One such technology is based on the numerical solution of the hydrodynamics equations, and is embodied in the two-dimensional Eulerian code EULE2D, which solves the hydrodynamic equations with various models for the equation of state (EOS), constitutive relations and fracture mechanics. EULE2D is an R and D code originally developed to design and analyze conventional munitions for anti-armor penetration, such as shaped charges, explosively formed projectiles, and kinetic energy rods. Simulated results agree very well with actual experiments. A commercial application presented here is the design and simulation of shaped charges for oil and gas well bore perforation. The other R and D simulation technology is based on the numerical solution of Maxwell's partial differential equations of electromagnetics in space and time, and is implemented in the three-dimensional code FDTD-SPICE, which solves Maxwell's equations in the time domain with finite differences in the three spatial dimensions and calls SPICE for information when nonlinear active devices are involved. The FDTD method has been used in radar cross-section modeling of military aircraft and many other electromagnetic phenomena. The coupling of the FDTD method with SPICE, a popular circuit and device simulation program, provides a powerful tool for the simulation and design of microwave and millimeter-wave circuits containing nonlinear active semiconductor devices. A commercial application of FDTD-SPICE presented here is the simulation of a two-element active antenna system. The simulation results and the experimental measurements are in excellent agreement. (Author)
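    The FDTD method mentioned in this record can be illustrated with a minimal one-dimensional sketch: electric and magnetic fields live on a staggered grid and are updated in leapfrog fashion (normalized units, Courant number 0.5, reflecting grid edges). This is illustrative only; FDTD-SPICE couples a full 3D version of such updates to a circuit simulator.

```python
import math

def fdtd_1d(nz=200, nt=300, src=50):
    """One-dimensional FDTD (Yee) sketch in normalized units."""
    ez = [0.0] * nz   # electric field samples
    hy = [0.0] * nz   # magnetic field samples, staggered half a cell
    for n in range(nt):
        # update H from the spatial difference of E (curl equation)
        for k in range(nz - 1):
            hy[k] += 0.5 * (ez[k + 1] - ez[k])
        # update E from the spatial difference of H
        for k in range(1, nz):
            ez[k] += 0.5 * (hy[k] - hy[k - 1])
        # soft Gaussian source injected at one grid point
        ez[src] += math.exp(-((n - 30) ** 2) / 100.0)
    return ez
```

    The Courant factor of 0.5 keeps the explicit scheme stable; the pulse splits into two waves travelling away from the source and reflects at the untreated grid edges.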

  7. Neurite, a finite difference large scale parallel program for the simulation of electrical signal propagation in neurites under mechanical loading.

    Directory of Open Access Journals (Sweden)

    Julián A García-Grajales

    With the growing body of research on traumatic brain injury and spinal cord injury, computational neuroscience has recently focused its modeling efforts on neuronal functional deficits following mechanical loading. However, in most of these efforts, cell damage is generally characterized only by purely mechanistic criteria: functions of quantities such as stress, strain or their corresponding rates. The modeling of functional deficits in neurites as a consequence of macroscopic mechanical insults has rarely been explored. In particular, a quantitative mechanically based model of electrophysiological impairment in neuronal cells, Neurite, has only very recently been proposed. In this paper, we present the implementation details of this model: a finite difference parallel program for simulating electrical signal propagation along neurites under mechanical loading. Following the application of a macroscopic strain at a given strain rate produced by a mechanical insult, Neurite is able to simulate the resulting neuronal electrical signal propagation, and thus the corresponding functional deficits. The simulation of the coupled mechanical and electrophysiological behaviors requires computationally expensive calculations that increase in complexity as the network of simulated cells grows. The solvers implemented in Neurite (explicit and implicit) were therefore parallelized using graphics processing units in order to reduce the simulation costs of large-scale scenarios. Cable theory and Hodgkin-Huxley models were implemented to account for the electrophysiological passive and active regions of a neurite, respectively, whereas a coupled mechanical model accounting for the neurite's mechanical behavior within its surrounding medium was adopted as a link between electrophysiology and mechanics. This paper provides the details of the parallel implementation of Neurite, along with three different application examples: a long myelinated axon
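    The passive part of the cable-theory model mentioned in the abstract can be sketched with a simple explicit finite-difference scheme in normalized units (membrane time constant and length constant both set to 1, a constant current injected at the near end). Neurite itself additionally couples Hodgkin-Huxley kinetics and a mechanical model and runs its solvers on GPUs; the constants and boundary treatment here are illustrative assumptions.

```python
def passive_cable(nx=101, nt=5000, dx=0.1, dt=0.004, i_inj=1.0):
    """Explicit finite-difference integration of the normalized passive
    cable equation  dV/dt = d2V/dx2 - V,  with current injected at x=0
    and a sealed (no-flux) far end. dt/dx**2 = 0.4 <= 0.5 keeps the
    explicit diffusion update stable."""
    v = [0.0] * nx
    for _ in range(nt):
        vn = v[:]
        for i in range(1, nx - 1):
            d2v = (v[i + 1] - 2.0 * v[i] + v[i - 1]) / dx**2
            vn[i] = v[i] + dt * (d2v - v[i])      # diffusion minus leak
        vn[0] = vn[1] + i_inj * dx                # injected-current boundary
        vn[-1] = vn[-2]                           # sealed far end
        v = vn
    return v
```

    At steady state the voltage decays roughly as exp(-x) along the cable, the classic electrotonic length-constant profile.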

  8. Maps4Science - National Roadmap for Large-Scale Research Facilities 2011 (NWO Application form)

    NARCIS (Netherlands)

    Van Oosterom, P.J.M.; Van der Wal, T.; De By, R.A.

    2011-01-01

    The Netherlands is historically known as one of the world's best-measured countries. It is continuing this tradition today with unequalled new datasets, such as the nationwide large-scale topographic map and our unique digital height map (nationwide coverage; ten very accurate 3D points for every Dutch m2)

  9. Linking the GLOBE Program With NASA and NSF Large-Scale Experiments

    Science.gov (United States)

    Filmer, P. E.

    2005-12-01

    NASA and the NSF, the sponsoring Federal agencies for the GLOBE Program, are seeking the participation of science teams who are working at the cutting edge of Earth systems science in large integrated Earth systems science programs. Connecting the GLOBE concept and structure with NASA and NSF's leading Earth systems science programs will give GLOBE schools and students access to top scientists, and expose them to programs that have been designated as scientific priorities. Students, teachers, parents, and their communities will be able to see how scientists of many disciplines work together to learn about the Earth system. The GLOBE solicitation released by the NSF targets partnerships between GLOBE and NSF/NASA-funded integrated Earth systems science programs. This presentation will focus on the goals and requirements of the NSF solicitation. Proponents will be expected to provide ways for the GLOBE community to interact with a group of scientists from their science programs as part of a wider joint Earth systems science educational strategy (the sponsoring agencies', GLOBE's, and the proposing programs'). Teams proposing to this solicitation must demonstrate: - A focus on direct connections with major NSF Geosciences and/or Polar Programs and/or NASA Earth-Sun research programs that are related to Earth systems science; - A demonstrable benefit to GLOBE and to NSF Geosciences and/or Polar Programs or NASA Earth-Sun education goals (providing access to program researchers and data, working with GLOBE in setting up campaigns where possible, using tested GLOBE or non-GLOBE protocols to the greatest extent possible, actively participating in the wider GLOBE community including schools, among other goals); - An international component; - How the existing educational efforts of the large science program will coordinate with GLOBE; - An Earth systems science education focus, rather than a GLOBE protocol-support focus; - A rigorous evaluation and assessment component

  10. An inertia-free filter line-search algorithm for large-scale nonlinear programming

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Nai-Yuan; Zavala, Victor M.

    2016-02-15

    We present a filter line-search algorithm that does not require inertia information of the linear system. This feature enables the use of a wide range of linear algebra strategies and libraries, which is essential to tackle large-scale problems on modern computing architectures. The proposed approach performs curvature tests along the search step to detect negative curvature and to trigger convexification. We prove that the approach is globally convergent and we implement the approach within a parallel interior-point framework to solve large-scale and highly nonlinear problems. Our numerical tests demonstrate that the inertia-free approach is as efficient as inertia detection via symmetric indefinite factorizations. We also demonstrate that the inertia-free approach can lead to reductions in solution time because it reduces the amount of convexification needed.
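    The inertia-free idea described in this abstract can be sketched in a few lines: solve the (possibly indefinite) linear system, test curvature along the computed step, and add a multiple of the identity (convexification) until the test passes. This is a toy sketch with assumed parameter names and values, not the authors' filter line-search interior-point implementation.

```python
import numpy as np

def convexified_step(W, g, delta0=1e-4, rho=10.0, max_tries=30):
    """Solve W d = -g, convexifying W with delta*I until the curvature
    test d^T (W + delta*I) d > 0 holds along the step d."""
    n = W.shape[0]
    delta = 0.0
    for _ in range(max_tries):
        M = W + delta * np.eye(n)
        try:
            d = np.linalg.solve(M, -g)
        except np.linalg.LinAlgError:
            d = None                              # singular system: convexify more
        if d is not None and d @ M @ d > 0.0:     # curvature test passed
            return d, delta                       # d is a descent direction
        delta = delta0 if delta == 0.0 else rho * delta   # trigger convexification
    raise RuntimeError("convexification failed")
```

    The point of the curvature test is that it never needs the inertia (eigenvalue signs) of the factorized matrix, so any linear solver can be plugged in.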

  11. Research on precision grinding technology of large scale and ultra thin optics

    Science.gov (United States)

    Zhou, Lian; Wei, Qiancai; Li, Jie; Chen, Xianhua; Zhang, Qinghua

    2018-03-01

    The flatness and parallelism errors of large-scale, ultra-thin optics have an important influence on subsequent polishing efficiency and accuracy. In order to realize high-precision grinding of such elements, a low-deformation vacuum chuck was designed first, which was used to clamp the optics with high supporting rigidity over the full aperture. The optics was then ground flat under vacuum adsorption. After machining, the vacuum system was turned off. The form error of the optics was measured on-machine with a displacement sensor after elastic recovery. The flatness error was then converged to high accuracy by compensation machining, with tool trajectories generated from the measurement result. To obtain high parallelism, the optics was turned over and compensation-ground using the form error of the vacuum chuck. Finally, a grinding experiment on a large-scale, ultra-thin fused silica optic with dimensions of 430 mm × 430 mm × 10 mm was performed. The best P-V flatness of the optic was below 3 μm, and the parallelism was below 3″. This machining technique has been applied in batch grinding of large-scale, ultra-thin optics.

  12. Research on fatigue behavior and residual stress of large-scale cruciform welding joint with groove

    International Nuclear Information System (INIS)

    Zhao, Xiaohui; Liu, Yu; Liu, Yong; Gao, Yuan

    2014-01-01

    Highlights: • The fatigue behavior of a large-scale cruciform welded joint with groove was studied. • The longitudinal residual stress of the large-scale cruciform welded joint was measured by the contour method. • The fatigue fracture mechanism of the large-scale cruciform welded joint with groove was analyzed. - Abstract: The fatigue fracture behavior of a 30 mm thick Q460C-Z steel cruciform welded joint with groove was investigated. The fatigue test results indicated that the fatigue strength of the 30 mm thick Q460C-Z steel cruciform welded joint with groove can reach fatigue class FAT80 (80 MPa). The fatigue crack of the failed specimen initiated from the weld toe. Microcracks were also found in the fusion zones of the fatigue failure specimen, caused by weld quality and weld metal integrity issues resulting from the multi-pass welds. A two-dimensional map of the longitudinal residual stress of the 30 mm thick Q460C-Z steel cruciform welded joint with groove was obtained using the contour method. The stress map indicated that the longitudinal residual stress is the largest in the weld center

  13. Improved decomposition–coordination and discrete differential dynamic programming for optimization of large-scale hydropower system

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Ouyang, Shuo; Ding, Xiaoling; Chen, Lu

    2014-01-01

    Highlights: • Optimization of a large-scale hydropower system in the Yangtze River basin. • Improved decomposition–coordination and discrete differential dynamic programming. • Generating the initial solution randomly to reduce generation time. • Proposing a relative coefficient for more power generation. • Proposing an adaptive bias corridor technology to enhance convergence speed. - Abstract: With the construction of major hydro plants, more and more large-scale hydropower systems are gradually taking shape, which poses a challenge for optimizing these systems. Optimization of a large-scale hydropower system (OLHS), which determines the water discharges or water levels of all hydro plants so as to maximize total power generation subject to many constraints, is a high-dimensional, nonlinear and coupled complex problem. In order to solve the OLHS problem effectively, an improved decomposition–coordination and discrete differential dynamic programming (IDC–DDDP) method is proposed in this paper. A strategy of generating the initial solution randomly is adopted to reduce generation time. Meanwhile, a relative coefficient based on maximum output capacity is proposed for more power generation. Moreover, an adaptive bias corridor technology is proposed to enhance convergence speed. The proposed method is applied to long-term optimal dispatch of a large-scale hydropower system (LHS) in the Yangtze River basin. Compared to other methods, IDC–DDDP has competitive performance in both total power generation and convergence speed, which provides a new method to solve the OLHS problem
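    The DDDP component of such methods can be illustrated for a single toy reservoir: dynamic programming is run over a narrow "corridor" of discrete storage values around an incumbent trajectory, the corridor is recentred on each improved trajectory, and its width is shrunk between iterations. The head-times-release power proxy and all parameters below are illustrative assumptions, not the paper's formulation.

```python
def dddp(inflow, s0, s_min, s_max, step=1.0, iters=30):
    """Toy discrete differential dynamic programming for one reservoir."""
    T = len(inflow)
    traj = [s0] * (T + 1)                      # incumbent storage trajectory

    def power(storage, release):               # crude proxy: head x release
        return release * storage

    for _ in range(iters):
        # corridor: three candidate storages per stage around the incumbent
        grids = [[min(max(traj[t] + d, s_min), s_max) for d in (-step, 0.0, step)]
                 for t in range(T + 1)]
        grids[0] = [s0]                        # initial storage is fixed
        labels = {s0: (0.0, [s0])}             # storage -> (value, path)
        for t in range(T):
            nxt = {}
            for s, (val, path) in labels.items():
                for s2 in grids[t + 1]:
                    release = s + inflow[t] - s2
                    if release < 0:            # infeasible: can't create water
                        continue
                    v2 = val + power(s2, release)
                    if s2 not in nxt or v2 > nxt[s2][0]:
                        nxt[s2] = (v2, path + [s2])
            labels = nxt
        best_val, traj = max(labels.values())  # recentre corridor on best path
        step *= 0.7                            # shrink the corridor
    return best_val, traj
```

    Shrinking the corridor trades exploration for refinement; the adaptive bias corridor in the paper plays a similar convergence-accelerating role.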

  14. Cohort Profile of the Goals Study: A Large-Scale Research of Physical Activity in Dutch Students

    Science.gov (United States)

    de Groot, Renate H. M.; van Dijk, Martin L.; Kirschner, Paul A.

    2015-01-01

    The GOALS study (Grootschalig Onderzoek naar Activiteiten van Limburgse Scholieren [Large-scale Research of Activities in Dutch Students]) was set up to investigate possible associations between different forms of physical activity and inactivity with cognitive performance, academic achievement and mental well-being. It was conducted at a…

  15. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out … model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors …). Simulation programs are proposed as a control-support tool for daily operation and performance prediction of central solar heating plants. Finally, the CSDHP technology is put into perspective with respect to alternatives, and a short discussion of the barriers to and breakthrough of the technology is given.

  16. A research on the excavation, support, and environment control of large scale underground space

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Pil Chong; Kwon, Kwang Soo; Jeong, So Keul [Korea Institute of Geology Mining and Materials, Taejon (Korea, Republic of)

    1995-12-01

    With the growing necessity of underground space due to the deficiency of above-ground space, the size and shape of underground structures tend to be complex and diverse. This complexity and variety force the development of new techniques for rock mass classification, excavation and supporting of underground space, and monitoring and control of the underground environment. All these techniques should be applied together to make the underground space comfortable. To achieve this, efforts have been made in 5 different areas: research on underground space design and stability analysis; research on techniques for excavation of rock by controlled blasting; research on the development of a monitoring system to forecast the rock behaviour of underground space; research on an environment inspection system for closed spaces; and research on dynamic analysis of airflow and environmental control in large geo-spaces. The 5 main achievements are improvement of the existing structure analysis program (EXCRACK) to consider the deformation and failure characteristics of rock joints; development of a new blasting design (SK-cut); prediction of ground vibration through the newly proposed wave propagation equation; development and in-situ application of a rock mass deformation monitoring system and data acquisition software; and trial manufacture of the environment inspection system for closed spaces. If these techniques are applied to the development of underground space, they will bring prevention of industrial disasters, reduced construction costs, domestic production of monitoring systems, improved tunnel stability, reduced royalty payments, and upgraded domestic technologies. (Abstract Truncated)

  17. Research highlights from a large scale residential monitoring study in a hot climate

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Danny S. [Florida Solar Energy Center, Cocoa, FL (United States)

    2003-10-01

    A utility load research project has monitored a large number of residences in Central Florida, collecting detailed end-use data. The monitoring was performed to better estimate the impact of a load control program, as well as to obtain improved appliance energy load profiles. The monitoring measured total load as well as a number of electrical end-uses on a 15 min basis. The measured end-uses included space cooling, heating, water heating, range and cooking, clothes drying, and swimming pool electricity use and demand. The project identified a number of influences on electrical demand that are not commonly described. (Author)

  18. Research on the impacts of large-scale electric vehicles integration into power grid

    Science.gov (United States)

    Su, Chuankun; Zhang, Jian

    2018-06-01

    Because of their special energy supply mode, electric vehicles can improve the efficiency of energy utilization and reduce pollution of the environment, and are therefore receiving more and more attention. However, the charging behavior of electric vehicles is random and intermittent. If electric vehicles charge in an uncoordinated manner at large scale, they put great pressure on the structure and operation of the power grid and affect its safe and economic operation. With the development of vehicle-to-grid (V2G) technology, the study of the charging and discharging characteristics of electric vehicles is of great significance for improving the safe operation of the power grid and the efficiency of energy utilization.
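    The grid pressure caused by uncoordinated charging can be illustrated with a toy simulation: if every vehicle starts charging on arrival, individual sessions pile up into an evening peak on the feeder. The arrival-time distribution and charger rating are illustrative assumptions.

```python
import random

def uncoordinated_load(n_ev=1000, p_kw=7.0, hours=3, seed=1):
    """Hourly feeder load (kW) if every EV charges immediately on arrival."""
    random.seed(seed)
    load = [0.0] * 24
    for _ in range(n_ev):
        arrival = int(random.gauss(18, 2)) % 24   # arrivals cluster around 18:00
        for h in range(arrival, arrival + hours):
            load[h % 24] += p_kw                  # each session adds p_kw for `hours`
    return load
```

    Shifting the same energy to off-peak hours (smart or V2G-coordinated charging) would flatten exactly this evening peak.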

  19. Biomass Gasification - A synthesis of technical barriers and current research issues for deployment at large scale

    Energy Technology Data Exchange (ETDEWEB)

    Heyne, Stefan [Chalmers Univ. of Technology, Gothenburg (Sweden); Liliedahl, Truls [KTH, Royal Inst. of Technology, Stockholm (Sweden); Marklund, Magnus [Energy Technology Centre, Piteå (Sweden)

    2013-09-01

    Thermal gasification at large scale for cogeneration of power and heat and/or production of fuels and materials is a main pathway for sustainable deployment of biomass resources. However, so far no such full-scale production exists, and biomass gasification projects remain at the pilot or demonstration scale. This report focuses on the key critical technology challenges for the large-scale deployment of the following biomass-based gasification concepts: direct Fluidized Bed Gasification (FBG), Entrained Flow Gasification (EFG) and indirect Dual Fluidized Bed Gasification (DFBG). The main content of this report is based on responses to a questionnaire from a number of experts in biomass gasification. The survey was composed of a number of more or less specific questions on technical barriers for the three gasification concepts considered. To formalise the questionnaire, the concept of Technology Readiness Level (TRL 1-9) was used to grade the level of technical maturity of the different sub-processes within the three generic biomass gasification technologies. For direct fluidized bed gasification (FBG), the technology is already available at commercial scale as air-blown technology, and thus air-blown FBG gasification may be reckoned a mature technology. The remaining technical challenge is the conversion to operation on oxygen, with the final goal of producing chemicals or transport fuels. Tar reduction in particular, and gas cleaning and upgrading in general, are by far the most frequently named technical issues considered problematic. Other important aspects are problems that may occur when operating on low-grade fuels, i.e. low-cost fuels. These problems include bed agglomeration/ash sintering as well as alkali fouling. Even the preparation and feeding of these low-grade fuels tend to be problematic and require further development to be used on a commercial scale. Furthermore, efficient char conversion is mentioned by

  20. Research on Large-Scale Road Network Partition and Route Search Method Combined with Traveler Preferences

    Directory of Open Access Journals (Sweden)

    De-Xin Yu

    2013-01-01

    Combined with an improved Pallottino parallel algorithm, this paper proposes a large-scale route search method that considers travelers' route choice preferences, and the urban road network is decomposed effectively into multiple layers. Using generalized travel time as the road impedance function, the method builds a new multilayer, multitasking road network data storage structure with object-oriented class definitions. The proposed path search algorithm is then verified using the real road network of Guangzhou city as an example. Through sensitivity experiments, we make a comparative analysis of the proposed path search method against current advanced optimal path algorithms. The results demonstrate that the proposed method can increase road network search efficiency by more than 16% under different search proportion requests, node numbers, and computing process numbers, respectively. Therefore, this method is a great breakthrough in the field of urban road network guidance.
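    The core of such a route search can be sketched as a label-setting shortest-path search in which edge impedance is a generalized travel time mixing in-vehicle time with a preference-weighted toll. Pallottino's algorithm proper is a label-correcting scheme; plain Dijkstra is used here for brevity, and the graph encoding is an assumption.

```python
import heapq

def best_route(graph, origin, dest, toll_weight=0.5):
    """Dijkstra over edges (neighbor, minutes, toll) with generalized
    cost = minutes + toll_weight * toll; toll_weight encodes the
    traveler's preference for cheap versus fast routes."""
    dist = {origin: 0.0}
    prev = {}
    pq = [(0.0, origin)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dest:
            break
        if d > dist.get(u, float("inf")):
            continue                              # stale queue entry
        for v, minutes, toll in graph.get(u, ()):
            nd = d + minutes + toll_weight * toll # generalized travel time
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [], dest                         # rebuild path from labels
    while node in prev:
        path.append(node)
        node = prev[node]
    return [origin] + path[::-1], dist.get(dest)
```

    Varying `toll_weight` reproduces the preference effect: the same network yields different optimal routes for toll-averse and time-averse travelers.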

  1. Large-scale automated analysis of news media: a novel computational method for obesity policy research.

    Science.gov (United States)

    Hamad, Rita; Pomeranz, Jennifer L; Siddiqi, Arjumand; Basu, Sanjay

    2015-02-01

    Analyzing news media allows obesity policy researchers to understand popular conceptions about obesity, which is important for targeting health education and policies. A persistent dilemma is that investigators have to read and manually classify thousands of individual news articles to identify how obesity and obesity-related policy proposals may be described to the public in the media. A machine learning method called "automated content analysis" that permits researchers to train computers to "read" and classify massive volumes of documents was demonstrated. 14,302 newspaper articles that mentioned the word "obesity" during 2011-2012 were identified. Four states that vary in obesity prevalence and policy (Alabama, California, New Jersey, and North Carolina) were examined. The reliability of an automated program to categorize the media's framing of obesity as an individual-level problem (e.g., diet) and/or an environmental-level problem (e.g., obesogenic environment) was tested. The automated program performed similarly to human coders. The proportion of articles with individual-level framing (27.7-31.0%) was higher than the proportion with neutral (18.0-22.1%) or environmental-level framing (16.0-16.4%) across all states and over the entire study period (P < 0.05). The feasibility of large-scale automated analysis of news media for obesity policy research was demonstrated. © 2014 The Obesity Society.
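    The kind of supervised classification behind automated content analysis can be sketched with a tiny multinomial Naive Bayes classifier: train on hand-coded articles, then label the rest of the corpus automatically. The miniature corpus and labels below are invented for illustration; the study used a dedicated automated coding tool, not this code.

```python
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: list of (text, label) hand-coded examples."""
    counts, priors = defaultdict(Counter), Counter()
    for text, label in docs:
        priors[label] += 1
        counts[label].update(text.lower().split())
    return counts, priors

def classify(text, counts, priors):
    """Pick the label maximizing log P(label) + sum log P(word|label),
    with add-one (Laplace) smoothing for unseen words."""
    vocab = {w for c in counts.values() for w in c}
    best, best_lp = None, -math.inf
    for label, prior in priors.items():
        total = sum(counts[label].values())
        lp = math.log(prior / sum(priors.values()))
        for w in text.lower().split():
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

    Scaled up to thousands of articles and validated against human coders, this is the workflow the abstract describes: a small hand-coded training set bootstraps classification of the full corpus.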

  2. Managing sensitive phenotypic data and biomaterial in large-scale collaborative psychiatric genetic research projects: practical considerations.

    Science.gov (United States)

    Demiroglu, S Y; Skrowny, D; Quade, M; Schwanke, J; Budde, M; Gullatz, V; Reich-Erkelenz, D; Jakob, J J; Falkai, P; Rienhoff, O; Helbing, K; Heilbronner, U; Schulze, T G

    2012-12-01

    Large-scale collaborative research will be a hallmark of future psychiatric genetic research. Ideally, both academic and non-academic institutions should be able to participate in such collaborations to allow for the establishment of very large samples in a straightforward manner. Any such endeavor requires an easy-to-implement information technology (IT) framework. Here we present the requirements for a centralized framework and describe how they can be met through a modular IT toolbox.

  3. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    Science.gov (United States)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope-and-interface approach and applying it to modern three-dimensional computer-aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope-and-interface method reduces both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes benefiting from this approach, through reduced development and design cycle time, include: creation of analysis models for the aerodynamics discipline; vehicle-to-ground interface development; and documentation development for the vehicle assembly.

  4. Large scale seismic test research at Hualien site in Taiwan. Results of site investigation and characterization of the foundation ground

    International Nuclear Information System (INIS)

    Okamoto, Toshiro; Kokusho, Takeharu; Nishi, Koichi

    1998-01-01

    An international joint research program called ''HLSST'' is under way. A Large-Scale Seismic Test (LSST) is being conducted to investigate Soil-Structure Interaction (SSI) during large earthquakes in the field in Hualien, a high-seismicity region of Taiwan. A 1/4-scale model building was constructed on the excavated gravelly ground, and backfill material of crushed stones was placed around the model plant. The model building and the foundation ground were extensively instrumented to monitor structure and ground response. To accurately evaluate SSI during earthquakes, geotechnical investigations and forced vibration tests were performed throughout the construction process, namely before and after the base excavation, after the structure construction, and after the backfilling. The main results are as follows. (1) The distribution of the mechanical properties of the gravelly soil was measured by various techniques, including penetration tests and PS-logging, and it was found that the shear wave velocities (Vs) change clearly depending on the overburden pressures during the construction process. (2) By measuring Vs in the surrounding soils, it was found that Vs is smaller there than at almost the same depth in farther locations. Discussion is made further on the numerical soil model for SSI analysis. (author)

  5. Icing Simulation Research Supporting the Ice-Accretion Testing of Large-Scale Swept-Wing Models

    Science.gov (United States)

    Yadlin, Yoram; Monnig, Jaime T.; Malone, Adam M.; Paul, Bernard P.

    2018-01-01

    The work summarized in this report is a continuation of NASA's Large-Scale, Swept-Wing Test Articles Fabrication; Research and Test Support for NASA IRT contract (NNC10BA05-NNC14TA36T) performed by Boeing under the NASA Research and Technology for Aerospace Propulsion Systems (RTAPS) contract. In the study conducted under RTAPS, a series of icing tests in the Icing Research Tunnel (IRT) was conducted to characterize ice formations on large-scale swept wings representative of modern commercial transport airplanes. The outcome of that campaign was a large database of ice-accretion geometries that can be used for subsequent aerodynamic evaluation in other experimental facilities and for validation of ice-accretion prediction codes.

  6. Understanding the Front-end of Large-scale Engineering Programs

    DEFF Research Database (Denmark)

    Lucae, Sebastian; Rebentisch, Eric; Oehmen, Josef

    2014-01-01

    Large engineering programs like sociotechnical infrastructure constructions of airports, plant constructions, or the development of radically innovative, high-tech industrial products such as electric vehicles or aircraft are affected by a number of serious risks, and subsequently commonly suffer...... from large cost overruns. Significant problems in program execution can be traced back to practices performed, or more frequently not performed, in the so-called “fuzzy front end” of the program. The lack of sufficient and effective efforts in the early stages of a program can result in unstable......, unclear and incomplete requirements, unclear roles and responsibilities within the program organization, insufficient planning, and unproductive tensions between program management and systems engineering. This study intends to clarify the importance of up-front planning to improve program performance...

  7. A Novel Large-scale Mentoring Program for Medical Students based on a Quantitative and Qualitative Needs Analysis

    Science.gov (United States)

    von der Borch, Philip; Dimitriadis, Konstantinos; Störmann, Sylvère; Meinel, Felix G.; Moder, Stefan; Reincke, Martin; Tekian, Ara; Fischer, Martin R.

    2011-01-01

    Purpose: Mentoring plays an important role in students' performance and career. The authors of this study assessed the need for mentoring among medical students and established a novel large-scale mentoring program at Ludwig-Maximilians-University (LMU) Munich School of Medicine. Methods: Needs assessment was conducted using a survey distributed to all students at the medical school (n=578 of 4,109 students, return rate 14.1%). In addition, the authors held focus groups with selected medical students (n=24) and faculty physicians (n=22). All students signing up for the individual mentoring completed a survey addressing their expectations (n=534). Results: Needs assessment revealed that 83% of medical students expressed overall satisfaction with the teaching at LMU. In contrast, only 36.5% were satisfied with how the faculty supports their individual professional development and 86% of students voiced a desire for more personal and professional support. When asked to define the role of a mentor, 55.6% "very much" wanted their mentors to act as counselors, arrange contacts for them (36.4%), and provide ideas for professional development (28.1%). Topics that future mentees "very much" wished to discuss included research (56.6%), final year electives (55.8%) and experiences abroad (45.5%). Conclusions: Based on the strong desire for mentoring among medical students, the authors developed a novel two-tiered system that introduces one-to-one mentoring for students in their clinical years and offers society-based peer mentoring for pre-clinical students. One year after launching the program, more than 300 clinical students had experienced one-to-one mentoring and 1,503 students and physicians were involved in peer mentoring societies. PMID:21818236

  8. Practical experience from the Office of Adolescent Health's large scale implementation of an evidence-based Teen Pregnancy Prevention Program.

    Science.gov (United States)

    Margolis, Amy Lynn; Roper, Allison Yvonne

    2014-03-01

    After 3 years of experience overseeing the implementation and evaluation of evidence-based teen pregnancy prevention programs in a diversity of populations and settings across the country, the Office of Adolescent Health (OAH) has learned numerous lessons through practical application and new experiences. These lessons and experiences are applicable to those working to implement evidence-based programs on a large scale. The lessons described in this paper focus on what it means for a program to be implementation ready, the role of the program developer in replicating evidence-based programs, the importance of a planning period to ensure quality implementation, the need to define and measure fidelity, and the conditions necessary to support rigorous grantee-level evaluation. Published by Elsevier Inc.

  9. Research into condensed matter using large-scale apparatus. Physics, chemistry, biology. Progress report 1992-1995. Summarizing reports

    International Nuclear Information System (INIS)

    1996-01-01

    Activities for research into condensed matter have been supported by the German BMBF with approx. 102 million Deutschmarks in the years 1992 through 1995. These financial means have been distributed among 314 research projects in the fields of physics, chemistry, biology, materials science, and other fields, which all rely on the intensive utilization of photon and particle beams generated in large-scale apparatus of institutions for basic research. The volume in hand first gives information of a general kind and statistical data on the distribution of financial means, for a number of priority research projects. The project reports are summarizing reports on the progress achieved in the various projects. (CB) [de

  10. The relevance of large scale environmental research infrastructures from the point of view of Ethics: the case of EMSO

    Science.gov (United States)

    Favali, Paolo; Beranzoli, Laura; Best, Mairi; Franceschini, PierLuigi; Materia, Paola; Peppoloni, Silvia; Picard, John

    2014-05-01

    EMSO (European Multidisciplinary Seafloor and Water Column Observatory) is a large-scale European Research Infrastructure (RI). It is a geographically distributed infrastructure composed of several deep-seafloor and water-column observatories, which will be deployed at key sites in European waters, spanning from the Arctic, through the Atlantic and Mediterranean, to the Black Sea, with the basic scientific objective of real-time, long-term monitoring of environmental processes related to the interaction between the geosphere, biosphere and hydrosphere. EMSO is one of the environmental RIs on the ESFRI roadmap. The ESFRI Roadmap identifies new RIs of pan-European importance that correspond to the long-term needs of European research communities. EMSO will be the sub-sea segment of the EU's large-scale Earth Observation program, Copernicus (previously known as GMES - Global Monitoring for Environment and Security) and will significantly enhance the observational capabilities of European member states. An open data policy compliant with the recommendations being developed within the GEOSS initiative (Global Earth Observation System of Systems) will allow for shared use of the infrastructure and the exchange of scientific information and knowledge. The processes that occur in the oceans have a direct impact on human societies; it is therefore crucial to improve our understanding of how they operate and interact. To encompass the breadth of these major processes, sustained and integrated observations are required that appreciate the interconnectedness of atmospheric, surface ocean, biological pump, deep-sea, and solid-Earth dynamics and that can address: • natural and anthropogenic change; • interactions between ecosystem services, biodiversity, biogeochemistry, physics, and climate; • impacts of exploration and extraction of energy, minerals, and living resources; • geo-hazard early warning capability for earthquakes, tsunamis, gas-hydrate release, and slope

  11. Large-scale User Facility Imaging and Scattering Techniques to Facilitate Basic Medical Research

    International Nuclear Information System (INIS)

    Miller, Stephen D.; Bilheux, Jean-Christophe; Gleason, Shaun Scott; Nichols, Trent L.; Bingham, Philip R.; Green, Mark L.

    2011-01-01

    proposals submitted through the user programs operated by each facility. Imaging human and animal tissue occurs but is not routine in most places, and strict procedures must be followed to do so. However, research communities are burgeoning in a number of biomedical areas, and protein crystallography research is well rooted in the X-ray and neutron scattering communities. Novel here is the forward-looking work on neutron imaging with potential medical and biomedical applications. Thus the national laboratories provide a research environment with capabilities and a culture conducive to exploring new methods and techniques suited to new frontiers in medical and biomedical imaging.

  12. Evaluating the potential for large-scale fracturing at a disposal vault: an example using the underground research laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Martin, C D; Chandler, N A; Brown, Anton

    1994-09-01

    The potential for large-scale fracturing (> 10 m²) around a nuclear fuel waste disposal vault is investigated in this report. The disposal vault is assumed to be located at a depth of 500 m in the plutonic rocks of the Canadian Shield. The rock mass surrounding the disposal vault is considered to have similar mechanical properties and in situ stress conditions to that found at a depth of 420 m at the Underground Research Laboratory. Theoretical, experimental and field evidence shows that Mode I fractures propagate in a plane perpendicular to σ₃ and only if the tensile stress at the tip of the advancing crack is sufficient to overcome the tensile strength of the rock. Because the stress state at a depth of 500 m or more is compressive, and will very probably stay so during the 10,000 year life of the disposal vault, there does not appear to be any mechanism which could propagate large-scale Mode I fracturing in the rock mass surrounding the vault. In addition, because σ₃ is near-vertical, any Mode I fracture propagation that might occur would be in a horizontal plane. The development of either Mode I or large-scale shear fractures would require a drastic change in the compressive in situ stress state at the depth of the disposal vault. The stresses developed as a result of both thermal and glacial loading do not appear sufficient to cause new fracturing. Glacial loading would reduce the shear stresses in the rock mass and hence improve the stability of the rock mass surrounding the vault. Thus, it is not feasible that large-scale fracturing would occur over the 10,000 year life of a disposal vault in the Canadian Shield, at depths of 500 m or greater, where the compressive stress state is similar to that found at the Underground Research Laboratory. 107 refs., 44 figs.

  13. Evaluating the potential for large-scale fracturing at a disposal vault: an example using the underground research laboratory

    International Nuclear Information System (INIS)

    Martin, C.D.; Chandler, N.A.; Brown, Anton.

    1994-09-01

    The potential for large-scale fracturing (> 10 m²) around a nuclear fuel waste disposal vault is investigated in this report. The disposal vault is assumed to be located at a depth of 500 m in the plutonic rocks of the Canadian Shield. The rock mass surrounding the disposal vault is considered to have similar mechanical properties and in situ stress conditions to that found at a depth of 420 m at the Underground Research Laboratory. Theoretical, experimental and field evidence shows that Mode I fractures propagate in a plane perpendicular to σ₃ and only if the tensile stress at the tip of the advancing crack is sufficient to overcome the tensile strength of the rock. Because the stress state at a depth of 500 m or more is compressive, and will very probably stay so during the 10,000 year life of the disposal vault, there does not appear to be any mechanism which could propagate large-scale Mode I fracturing in the rock mass surrounding the vault. In addition, because σ₃ is near-vertical, any Mode I fracture propagation that might occur would be in a horizontal plane. The development of either Mode I or large-scale shear fractures would require a drastic change in the compressive in situ stress state at the depth of the disposal vault. The stresses developed as a result of both thermal and glacial loading do not appear sufficient to cause new fracturing. Glacial loading would reduce the shear stresses in the rock mass and hence improve the stability of the rock mass surrounding the vault. Thus, it is not feasible that large-scale fracturing would occur over the 10,000 year life of a disposal vault in the Canadian Shield, at depths of 500 m or greater, where the compressive stress state is similar to that found at the Underground Research Laboratory. 107 refs., 44 figs.

  14. Distributed Semidefinite Programming with Application to Large-scale System Analysis

    DEFF Research Database (Denmark)

    Khoshfetrat Pakazad, Sina; Hansson, Anders; Andersen, Martin S.

    2017-01-01

    Distributed algorithms for solving coupled semidefinite programs (SDPs) commonly require many iterations to converge. They also put high computational demand on the computational agents. In this paper we show that in case the coupled problem has an inherent tree structure, it is possible to devise...

  15. Safe Patient Handling and Mobility: Development and Implementation of a Large-Scale Education Program.

    Science.gov (United States)

    Lee, Corinne; Knight, Suzanne W; Smith, Sharon L; Nagle, Dorothy J; DeVries, Lori

    This article addresses the development, implementation, and evaluation of an education program for safe patient handling and mobility at a large academic medical center. The ultimate goal of the program was to increase safety during patient mobility/transfer and reduce nursing staff injury from lifting/pulling. This comprehensive program was designed on the basis of the principles of prework, application, and support at the point of care. A combination of online learning, demonstration, skill evaluation, and coaching at the point of care was used to achieve the goal. Specific roles and responsibilities were developed to facilitate implementation. It took 17 master trainers, 88 certified trainers, 176 unit-based trainers, and 98 coaches to put 3706 nurses and nursing assistants through the program. Evaluations indicated both an increase in knowledge about safe patient handling and an increased ability to safely mobilize patients. The challenge now is sustainability of safe patient-handling practices and the growth and development of trainers and coaches.

  16. Results and lessons learned from UMANG program: A large scale community-managed supplementary feeding program in India

    International Nuclear Information System (INIS)

    Chockalingham, David; Gnanaraj, Grana Pu Selvi; Indriani, Esther

    2014-01-01

    feeding program called "UMANG" (Urgent Management & Action for Nutrition Growth) was developed and implemented across 84 ADPs. Through this program a malnourished child receives additional feeding (one full meal and a healthy snack) beyond what is provided at home and through the Government-run Anganwadi Centre (an Indian policy providing a free mid-day meal to children, though recent reviews show varying degrees of quality and attendance). The UMANG menu meets one third of the daily requirement of children using locally available, low-cost nutritious food provided for a period of 90 days. Through UMANG, mothers were educated and trained in healthy cooking, feeding, and caring practices. Between October 2012 and May 2013, as many as 24,154 children were enrolled in UMANG, and 44% graduated to normal nutritional status at the end of the 90-day program. Review of the program revealed that UMANG has increased mothers' knowledge of malnutrition, contributed to the formation of common interest groups, and enhanced the coordination of frontline workers in addressing malnutrition. The presentation will highlight lessons learned from the 90-day implementation of this large-scale community-managed supplementary feeding program. (author)

  17. DupTree: a program for large-scale phylogenetic analyses using gene tree parsimony.

    Science.gov (United States)

    Wehe, André; Bansal, Mukul S; Burleigh, J Gordon; Eulenstein, Oliver

    2008-07-01

    DupTree is a new software program for inferring rooted species trees from collections of gene trees using the gene tree parsimony approach. The program implements a novel algorithm that significantly improves upon the run time of standard search heuristics for gene tree parsimony, and enables the first truly genome-scale phylogenetic analyses. In addition, DupTree allows users to examine alternate rootings and to weight the reconciliation costs for gene trees. DupTree is an open source project written in C++. DupTree for Mac OS X, Windows, and Linux along with a sample dataset and an on-line manual are available at http://genome.cs.iastate.edu/CBL/DupTree
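    The duplication cost that gene tree parsimony minimizes can be sketched with a minimal LCA-mapping routine: each gene-tree node is mapped to the lowest common ancestor of its leaves' species in the species tree, and a node that maps no deeper than one of its children implies a gene duplication. This is a hand-built toy illustration, not DupTree's actual algorithm or data structures:

```python
# Toy gene-duplication count via LCA mapping (not DupTree's implementation).

def lca(parent, a, b):
    """LCA in a rooted tree given a child -> parent map."""
    ancestors = {a}
    while a in parent:
        a = parent[a]
        ancestors.add(a)
    while b not in ancestors:
        b = parent[b]
    return b

def duplications(gene_tree, species_parent, leaf_species):
    """gene_tree: nested 2-tuples of leaf names. Returns duplication count."""
    count = 0
    def walk(node):
        nonlocal count
        if isinstance(node, str):
            return leaf_species[node]
        left, right = walk(node[0]), walk(node[1])
        m = lca(species_parent, left, right)
        if m in (left, right):  # node maps no deeper than a child: duplication
            count += 1
        return m
    walk(gene_tree)
    return count

# Species tree ((A,B),C), encoded as child -> parent
sp_parent = {"A": "AB", "B": "AB", "AB": "ABC", "C": "ABC"}
# Gene tree ((a1,b),(a2,c)) with two gene copies sampled from species A
gene = (("a1", "b"), ("a2", "c"))
species_of = {"a1": "A", "b": "B", "a2": "A", "c": "C"}
print(duplications(gene, sp_parent, species_of))  # one duplication at the root
```

    Gene tree parsimony then searches over candidate species trees for the one minimizing this count summed over all gene trees; DupTree's contribution is making that search fast enough for genome-scale inputs.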

  18. Impact of large-scale energy efficiency programs on utility finances and consumer tariffs in India

    International Nuclear Information System (INIS)

    Abhyankar, Nikit; Phadke, Amol

    2012-01-01

    The objective of this paper is to analyze the effect on utility finances and consumer tariffs of implementing utility-funded demand-side energy efficiency (EE) programs in India. We use the state of Delhi as a case study. We estimate that by 2015, the electric utilities in Delhi can potentially save nearly 14% of total sales. We examine the impacts on utility finances and consumer tariffs by developing scenarios that account for variations in the following factors: (a) incentive mechanisms for mitigating the financial risk of utilities, (b) whether utilities fund the EE programs only partially, (c) whether utilities sell the conserved electricity into spot markets and (d) the level of power shortages utilities are facing. We find that average consumer tariff would increase by 2.2% although consumers participating in EE programs benefit from reduction in their electricity consumption. While utility incentive mechanisms can mitigate utilities’ risk of losing long-run returns, they cannot address the risk of consistently negative cash flow. In case of power shortages, the cash flow risk is amplified (reaching up to 57% of utilities annual returns) and is very sensitive to marginal tariffs of consumers facing power shortages. We conclude by proposing solutions to mitigate utility risks. - Highlights: ► We model implementation of energy efficiency (EE) programs in Delhi, India. ► We examine the impact on utility finances and consumer tariffs from 2012 to 2015. ► We find that average consumer tariffs increase but participating consumers benefit. ► Existing regulatory mechanisms cannot address utilities’ risk of negative cash flow. ► Frequent true-ups or ex-ante revenue adjustment is required to address such risk.
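    The core mechanism behind the tariff increase — largely fixed utility costs recovered over fewer kWh sold, plus the program's own cost — can be shown with a back-of-envelope calculation. All figures below are hypothetical and are not the paper's Delhi estimates:

```python
# Back-of-envelope sketch of why utility-funded efficiency raises the
# average tariff: fixed costs are spread over fewer kWh sold. Hypothetical
# numbers only, not taken from the Delhi case study.
def average_tariff(fixed_cost, variable_cost_per_kwh, sales_kwh, program_cost=0.0):
    """Revenue requirement divided by energy sold."""
    return (fixed_cost + program_cost + variable_cost_per_kwh * sales_kwh) / sales_kwh

base = average_tariff(fixed_cost=40e9, variable_cost_per_kwh=4.0, sales_kwh=20e9)
# 14% of sales saved by EE programs, plus the programs' own cost
ee = average_tariff(40e9, 4.0, 20e9 * 0.86, program_cost=2e9)
print(f"average tariff rises {100 * (ee / base - 1):.1f}%")
```

    Participating consumers can still come out ahead because their consumption falls by more than the per-kWh tariff rises, which matches the paper's finding that average tariffs increase while participants benefit.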

  19. Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs

    Science.gov (United States)

    Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.

    2016-07-01

    Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. The amount of sampling required, in terms of both the number of images and
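    The images-versus-points trade-off described above can be illustrated with a toy Monte Carlo simulation: when cover varies between images (patchiness), spreading a fixed number of scored points across more images reduces the dominant between-image variance. The patchiness model and all numbers here are invented, not taken from the study:

```python
# Toy Monte Carlo sketch of the images-vs-points trade-off in point-count
# scoring of benthic imagery. Illustrative numbers only.
import random
import statistics

random.seed(1)

def survey(n_images, n_points, image_covers):
    """Estimate mean cover: sample images, score random points in each."""
    sampled = random.sample(image_covers, n_images)
    hits = sum(sum(random.random() < c for _ in range(n_points)) for c in sampled)
    return hits / (n_images * n_points)

# Simulated transect: true cover varies between images (patchy biota)
covers = [max(0.0, min(1.0, random.gauss(0.2, 0.1))) for _ in range(500)]

# Same total effort (500 scored points), two allocations
few_images = [survey(10, 50, covers) for _ in range(2000)]
many_images = [survey(50, 10, covers) for _ in range(2000)]

print(f"10 images x 50 pts: sd = {statistics.stdev(few_images):.4f}")
print(f"50 images x 10 pts: sd = {statistics.stdev(many_images):.4f}")
```

    The second allocation yields the smaller standard deviation, mirroring the paper's conclusion that precision is best gained by sampling more images rather than scoring more points per image.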

  20. Multicontroller: an object programming approach to introduce advanced control algorithms for the GCS large scale project

    CERN Document Server

    Cabaret, S; Coppier, H; Rachid, A; Barillère, R; CERN. Geneva. IT Department

    2007-01-01

    The GCS (Gas Control System) project team at CERN uses a Model Driven Approach with a Framework - UNICOS (UNified Industrial COntrol System) - based on PLC (Programmable Logic Controller) and SCADA (Supervisory Control And Data Acquisition) technologies. The first UNICOS versions were able to provide a PID (Proportional Integral Derivative) controller, whereas the Gas Systems required more advanced control strategies. The MultiController is a new UNICOS object which provides the following advanced control algorithms: Smith Predictor, PFC (Predictive Function Control), RST* and GPC (Global Predictive Control). Its design is based on a monolithic entity with a global structure definition which is able to capture the desired set of parameters of any specific control algorithm supported by the object. The SCADA system -- PVSS -- supervises the MultiController operation. The PVSS interface provides users with a supervision faceplate; in particular it links any MultiController with recipes: the GCS experts are ab...

  1. The assessment of the readiness of five countries to implement child maltreatment prevention programs on a large scale.

    Science.gov (United States)

    Mikton, Christopher; Power, Mick; Raleva, Marija; Makoae, Mokhantso; Al Eissa, Majid; Cheah, Irene; Cardia, Nancy; Choo, Claire; Almuneef, Maha

    2013-12-01

    This study aimed to systematically assess the readiness of five countries - Brazil, the Former Yugoslav Republic of Macedonia, Malaysia, Saudi Arabia, and South Africa - to implement evidence-based child maltreatment prevention programs on a large scale. To this end, it applied a recently developed method called Readiness Assessment for the Prevention of Child Maltreatment, based on two parallel 100-item instruments. The first measures the knowledge, attitudes, and beliefs concerning child maltreatment prevention of key informants; the second, completed by child maltreatment prevention experts using all available data in the country, produces a more objective assessment of readiness. The instruments cover all of the main aspects of readiness including, for instance, availability of scientific data on the problem, legislation and policies, will to address the problem, and material resources. Key informant scores ranged from 31.2 (Brazil) to 45.8/100 (the Former Yugoslav Republic of Macedonia) and expert scores from 35.2 (Brazil) to 56/100 (Malaysia). Major gaps identified in almost all countries included a lack of professionals with the skills, knowledge, and expertise to implement evidence-based child maltreatment programs and of institutions to train them; inadequate funding, infrastructure, and equipment; extreme rarity of outcome evaluations of prevention programs; and a lack of national prevalence surveys of child maltreatment. In sum, the five countries are in a low to moderate state of readiness to implement evidence-based child maltreatment prevention programs on a large scale. Such an assessment of readiness - the first of its kind - allows gaps to be identified and then addressed to increase the likelihood of program success. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Web-Enabled Analysis of a Large-Scale Deactivation and Decommissioning Program

    International Nuclear Information System (INIS)

    Bollinger, James S.; Austin, William E.

    2008-01-01

    From the mid-1950's through the 1980's, the U.S. Department of Energy's Savannah River Site produced nuclear materials for the weapons stockpile, for medical and industrial applications, and for space exploration. Although SRS has a continuing defense-related mission, the overall site mission is now oriented toward environmental restoration and management of legacy chemical and nuclear waste. With the change in mission, SRS no longer has a need for much of the infrastructure developed to support the weapons program. This excess infrastructure, which includes over 1000 facilities, will be decommissioned and demolished over the forthcoming years. A comprehensive environmental management plan was developed in 2003 detailing the schedule and sequence for decommissioning and demolition (D and D) activities. Implementation of this plan was anticipated to have a significant impact at SRS; therefore, a sophisticated web-enabled mapping capability was developed to allow SRS management, employees, and other stakeholders, to view the contents of the plan in an interactive fashion. Web-enabled mapping allows users to view the overall impact of the plan from a unique geographic perspective that can be quickly updated to reflect changes on the ground. Development of the web-based D and D mapping and management system provides SRS personnel the ability to see the detailed contents of the facilities D and D plan in a geographic context and highly interactive environment. As the plan changes due to internal and external dynamic factors, updates can be quickly made so that the latest and most accurate information is available. Information content can be easily managed by the user in the application, allowing change in granularity with the click of a mouse button. As a result, the use of GIS and web-enabled mapping has had a revolutionary impact on the development and dissemination of D and D information at SRS

  3. Optimizing Implementation of Obesity Prevention Programs: A Qualitative Investigation Within a Large-Scale Randomized Controlled Trial.

    Science.gov (United States)

    Kozica, Samantha L; Teede, Helena J; Harrison, Cheryce L; Klein, Ruth; Lombard, Catherine B

    2016-01-01

    The prevalence of obesity in rural and remote areas is elevated in comparison to urban populations, highlighting the need for interventions targeting obesity prevention in these settings. Implementing evidence-based obesity prevention programs is challenging. This study aimed to investigate factors influencing the implementation of obesity prevention programs, including adoption, program delivery, community uptake, and continuation, specifically within rural settings. Nested within a large-scale randomized controlled trial, a qualitative exploratory approach was adopted, with purposive sampling techniques utilized, to recruit stakeholders from 41 small rural towns in Australia. In-depth semistructured interviews were conducted with clinical health professionals, health service managers, and local government employees. Open coding was completed independently by 2 investigators and thematic analysis undertaken. In-depth interviews revealed that obesity prevention programs were valued by the rural workforce. Program implementation is influenced by interrelated factors across: (1) contextual factors and (2) organizational capacity. Key recommendations to manage the challenges of implementing evidence-based programs focused on reducing program delivery costs, aided by the provision of a suite of implementation and evaluation resources. Informing the scale-up of future prevention programs, stakeholders highlighted the need to build local rural capacity through developing supportive university partnerships, generating local program ownership and promoting active feedback to all program partners. We demonstrate that the rural workforce places a high value on obesity prevention programs. Our results inform the future scale-up of obesity prevention programs, providing an improved understanding of strategies to optimize implementation of evidence-based prevention programs. © 2015 National Rural Health Association.

  4. Proceedings of joint meeting of the 6th simulation science symposium and the NIFS collaboration research 'large scale computer simulation'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-03-01

The joint meeting of the 6th Simulation Science Symposium and the NIFS Collaboration Research 'Large Scale Computer Simulation' was held on December 12-13, 2002 at the National Institute for Fusion Science, with the aim of promoting interdisciplinary collaboration across various fields of computer simulation. The meeting, attended by more than 40 people, comprised 11 invited and 22 contributed papers, whose topics extended beyond fusion science to related fields such as astrophysics, earth science, fluid dynamics, molecular dynamics, and computer science. (author)

  5. Geomorphological research of large-scale slope instability at Machu Picchu, Peru

    Czech Academy of Sciences Publication Activity Database

    Vilímek, V.; Zvelebil, J.; Klimeš, Jan; Patzelt, Z.; Astete, F.V.; Kachlík, F.; Hartvich, Filip

    2007-01-01

Vol. 89, No. 3-4 (2007), pp. 241-257. ISSN 0169-555X. Institutional research plan: CEZ:AV0Z30460519. Keywords: natural hazard; Machu Picchu; landslides. Subject RIV: DB - Geology; Mineralogy. Impact factor: 1.854 (2007). www.elsevier.com/locate/geomorphology

  6. Large Scale Computing and Storage Requirements for Biological and Environmental Research

    Energy Technology Data Exchange (ETDEWEB)

    DOE Office of Science, Biological and Environmental Research Program Office (BER),

    2009-09-30

In May 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Biological and Environmental Research (BER) held a workshop to characterize HPC requirements for BER-funded research over the subsequent three to five years. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing -- combined with new requirements for collaborative data manipulation and analysis -- will demand ever increasing computing, storage, network, visualization, reliability and service richness from NERSC. This report expands upon these key points and adds others. It also presents a number of "case studies" as significant representative samples of the needs of science teams within BER. Workshop participants were asked to codify their requirements in this "case study" format, summarizing their science goals, methods of solution, current and 3-5 year computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, "multi-core" environment that is expected to dominate HPC architectures over the next few years.

  7. IDRC-supported research on large-scale land acquisitions in Africa

    International Development Research Centre (IDRC) Digital Library (Canada)

    Anna Russell

    2015-12-11

    Dec 11, 2015 ... Using action research to build greater accountability ... IDRC now wishes to capitalize on existing investments, and deepen the impact of ... consultation, access to information and meaningful participation with respect to the public decision- ..... account for the management of land was found to be needed to.

  8. Impetus for TESLA ! DESY welcomes decision of the Federal Research Ministry on large-scale facilities

    CERN Multimedia

    2003-01-01

    Based on the decision published today by the Federal Minister of Education and Research, Edelgard Bulmahn, the preparations for TESLA will now enter a new phase. For the X-ray laser project, the first step will be to work out the financial, technical and organizational framework with the interested European partners (1 page).

  9. Methodology for a GIS-based damage assessment for researchers following large scale disasters

    Science.gov (United States)

    Crawford, Patrick Shane

    The 1990s were designated the International Decade for Natural Disaster Reduction by the United Nations General Assembly. This push for decrease of loss of life, property destruction, and social and economic disruption brought advancements in disaster management, including damage assessment. Damage assessment in the wake of natural and manmade disasters is a useful tool for government agencies, insurance companies, and researchers. As technologies evolve damage assessment processes constantly evolve as well. Alongside the advances in Geographic Information Systems (GIS), remote sensing, and Global Positioning System (GPS) technology, as well as the growing awareness of the needs of a standard operating procedure for GIS-based damage assessment and a need to make the damage assessment process as quick and accurate as possible, damage assessment procedures are becoming easier to execute and the results are becoming more accurate and robust. With these technological breakthroughs, multi-disciplinary damage assessment reconnaissance teams have become more efficient in their assessment methods through better organization and more robust through addition of new datasets. Damage assessment personnel are aided by software tools that offer high-level analysis and increasingly rapid damage assessment methods. GIS software has advanced the damage assessment methods of these teams by combining remotely sensed aerial imagery, GPS, and other technologies to expand the uses of the data. GIS allows researchers to use aerial imagery to show field collected data in the geographic location that it was collected so that information can be revisited, measurements can be taken, and data can be disseminated to other researchers and the public. The GIS-based data available to the reconnaissance team includes photographs of damage, worksheets, calculations, voice messages collected while studying the affected area, and many other datasets which are based on the type of disaster and the

  10. Nutritional implications of organic conversion in large scale food service preliminary results from Core Organic research

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; He, Chen

The discussion about nutritional advantages of organic consumption has traditionally focused on the properties of the food itself. Studies have shown, however, that a change of consumption patterns towards organic food seems to induce changed dietary patterns. The current research was part of the iPOPY study and was conducted to investigate whether such changes can be found in school food settings; in other words, do organic food schemes at school and related curricular activities help to create environments that are supportive of healthier eating among children? The research was carried out among school food coordinators in public schools in Denmark, Finland, Germany, and Italy. A questionnaire was adapted to fit the different languages and food cultures of the countries. The data suggest that schools with organic supply tend to develop organisational environments that are more supportive of healthy eating.

  11. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    Science.gov (United States)

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and ultimately routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity, and has recently been implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to a large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research will be discussed, along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD to the described evaluation framework are presented for the purposes of informing the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly controlled clinical research setting. Constructs

  12. Communicating the promise, risks, and ethics of large-scale, open space microbiome and metagenome research.

    Science.gov (United States)

    Shamarina, Daria; Stoyantcheva, Iana; Mason, Christopher E; Bibby, Kyle; Elhaik, Eran

    2017-10-04

    The public commonly associates microorganisms with pathogens. This suspicion of microorganisms is understandable, as historically microorganisms have killed more humans than any other agent while remaining largely unknown until the late seventeenth century with the works of van Leeuwenhoek and Kircher. Despite our improved understanding regarding microorganisms, the general public are apt to think of diseases rather than of the majority of harmless or beneficial species that inhabit our bodies and the built and natural environment. As long as microbiome research was confined to labs, the public's exposure to microbiology was limited. The recent launch of global microbiome surveys, such as the Earth Microbiome Project and MetaSUB (Metagenomics and Metadesign of Subways and Urban Biomes) project, has raised ethical, financial, feasibility, and sustainability concerns as to the public's level of understanding and potential reaction to the findings, which, done improperly, risk negative implications for ongoing and future investigations, but done correctly, can facilitate a new vision of "smart cities." To facilitate improved future research, we describe here the major concerns that our discussions with ethics committees, community leaders, and government officials have raised, and we expound on how to address them. We further discuss ethical considerations of microbiome surveys and provide practical recommendations for public engagement.

  13. A radiation service centre for research and large-scale irradiation

    International Nuclear Information System (INIS)

    Offermann, B.P.; Hofmann, E.G.

    1978-01-01

In the near future radiation processing of food may change from the present laboratory scale to large-scale industrial application. This step will require large irradiation facilities with high flexibility, a safe dose control system and simple food-handling systems. Some design parameters of such an irradiation facility have already been realized at the AEG-Telefunken Radiation Service Centre in Wedel. This centre came into operation in autumn 1976. It is equipped with one research-type high-power X-ray unit (200 kV/32 mA) and one industrial-type electron accelerator (1500 kV/37.5 kW). Handling systems are available for radiation crosslinking of wire and cable insulations and of plastic films, and for irradiation treatment of components, parts of different types and coatings, as well as of sewage sludge and waste water. Some of these handling systems can be used for food irradiation too. Other handling systems will be added sometime later. As an additional service the Company's existing material and environmental testing laboratory will be available. The centre is already being used by many interested companies to investigate the effects of radiation on a broad range of organic and inorganic materials, to develop special processing equipment, to process supplied products and to perform R and D work and contracts. The service centre fills an existing gap and will have an impact on the commercialization of radiation processing techniques in Europe. (author)

  14. The new large-scale international facility for antiproton and ion research in Europe, FAIR

    International Nuclear Information System (INIS)

    Rosner, Guenther

    2012-01-01

Full text: FAIR is currently the largest project in nuclear and particle physics worldwide, with investment costs of 1.6B euro in its first phase. It was founded by Finland, France, Germany, India, Poland, Romania, Russia, Slovenia and Sweden in October 2010. The facility will provide the international scientific community with a unique and technically innovative particle accelerator system to perform cutting-edge research in the sciences concerned with the basic structure of matter: nuclear and particle physics, atomic and anti-matter physics, high density plasma physics, and applications in condensed matter physics, biology and the bio-medical sciences. The workhorse of FAIR will be a 1.1 km circumference double ring of rapidly cycling 100 and 300 Tm synchrotrons, which will be used to produce high intensity secondary beams of anti-protons and very short-lived radioactive ions. A subsequent suite of cooler and storage rings will deliver anti-proton and heavy-ion beams of unprecedented quality in intensity and resolution. Large experimental facilities are presently being prototyped by the APPA, CBM, NuSTAR and PANDA Collaborations, to be used by a global community of more than 3000 scientists from 2018. (author)

  15. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    Science.gov (United States)

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

Predictive modelling in drug discovery is challenging to automate, as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems lack the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system, which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
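The flow-based idea underlying the approach, that tasks are components whose named outputs are explicitly wired to downstream inputs, can be sketched without any workflow engine. The task names and data below are invented for illustration, and the real SciLuigi API differs:

```python
# Minimal flow-based-programming-style pipeline sketch: each task declares
# which upstream tasks feed it, so the dependency graph is an explicit wiring
# of components rather than a hard-coded call chain.

class Task:
    def __init__(self, name, func, inputs=()):
        self.name = name          # task identifier
        self.func = func          # callable doing the actual work
        self.inputs = inputs      # upstream tasks whose outputs we consume
        self.output = None        # filled in when the task runs

    def run(self):
        # Resolve upstream dependencies depth-first, caching their outputs,
        # then feed those outputs into this task's function.
        upstream = [t.run() if t.output is None else t.output for t in self.inputs]
        self.output = self.func(*upstream)
        return self.output

# Hypothetical three-step modelling pipeline: load -> featurize -> "train".
load = Task("load", lambda: [(0.1, 0), (0.9, 1), (0.4, 0), (0.8, 1)])
featurize = Task("featurize", lambda rows: [(x * 10, y) for x, y in rows], (load,))
train = Task("train", lambda rows: sum(y for _, y in rows) / len(rows), (featurize,))

print(train.run())  # → 0.5 (fraction of positives in this toy "model")
```

Running the sink task pulls the whole graph, which is what lets a real system schedule independent branches on a cluster.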

  16. Collaborative-Large scale Engineering Assessment Networks for Environmental Research: The Overview

    Science.gov (United States)

    Moo-Young, H.

    2004-05-01

    cyber-infrastructure; 3) A Mechanism for multidisciplinary research and education activities designed to exploit the output of the instrumented sites and networked information technology, to formulate engineering and policy options directed toward the protection, remediation, and restoration of stressed environments and sustainability of environmental resources; and 4) A Collaboration among engineers, natural and social scientists, educators, policy makers, industry, non-governmental organizations, the public, and other stakeholders.

  17. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  18. Effect Of A Large-Scale Social Franchising And Telemedicine Program On Childhood Diarrhea And Pneumonia Outcomes In India.

    Science.gov (United States)

    Mohanan, Manoj; Babiarz, Kimberly S; Goldhaber-Fiebert, Jeremy D; Miller, Grant; Vera-Hernández, Marcos

    2016-10-01

Despite the rapid growth of social franchising, there is little evidence on its population impact in the health sector. Similar in many ways to private-sector commercial franchising, social franchising can be found in sectors with a social objective, such as health care. This article evaluates the World Health Partners (WHP) Sky program, a large-scale social franchising and telemedicine program in Bihar, India. We studied appropriate treatment for childhood diarrhea and pneumonia and associated health care outcomes. We used multivariate difference-in-differences models to analyze data on 67,950 children ages five and under in 2011 and 2014. We found that the WHP-Sky program did not improve rates of appropriate treatment or reduce disease prevalence. Both provider participation and service use among target populations were low. Our results do not imply that social franchising cannot succeed; instead, they underscore the importance of understanding factors that explain variation in the performance of social franchises. Our findings also highlight, for donors and governments in particular, the importance of conducting rigorous impact evaluations of new and potentially innovative health care delivery programs before investing in scaling them up. Published by Project HOPE—The People-to-People Health Foundation, Inc.
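The difference-in-differences estimator used in evaluations of this kind compares the change in an outcome among the treated group with the change among controls, netting out any common time trend. The rates below are invented for illustration, not the study's data:

```python
# Difference-in-differences: the treatment effect is the change in the treated
# group's mean outcome minus the change in the control group's mean outcome.
# All four group means here are hypothetical.

def did(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Return the difference-in-differences estimate from four group means."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical appropriate-treatment rates (fractions), baseline vs follow-up.
effect = did(treat_pre=0.30, treat_post=0.38, ctrl_pre=0.29, ctrl_post=0.36)
print(round(effect, 3))  # → 0.01: almost no effect beyond the secular trend
```

In practice this is estimated in a regression with covariates (hence "multivariate"), but the identifying comparison is exactly this double difference.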

  19. The Breathmobile Program: structure, implementation, and evolution of a large-scale, urban, pediatric asthma disease management program.

    Science.gov (United States)

    Jones, Craig A; Clement, Loran T; Hanley-Lopez, Jean; Morphew, Tricia; Kwong, Kenny Yat Choi; Lifson, Francene; Opas, Lawrence; Guterman, Jeffrey J

    2005-08-01

    Despite more than a decade of education and research-oriented intervention programs, inner city children with asthma continue to engage in episodic "rescue" patterns of healthcare and experience a disproportionate level of morbidity. The aim of this study was to establish and evaluate a sustainable community-wide pediatric asthma disease management program designed to shift inner city children in Los Angeles from acute episodic care to regular preventive care in accordance with national standards. In 1995 the Southern California Chapter of the Asthma and Allergy Foundation of America (AAFA), the Los Angeles County Department of Health Services (LAC DHS), and the Los Angeles Unified School District (LAUSD) established an agreement to initiate and sustain the Breathmobile Program. This program includes automated case identification, mobile school-based clinics, and highly structured clinical encounters supported by an advanced information technology solution. Interdisciplinary teams of asthma care specialists provide regular and ongoing care to children at school and county clinic sites over a wide geographic area of urban Los Angeles. Each team operates in a specially equipped mobile clinic (Breathmobile), efficiently moving a structured healthcare process to school and county clinic sites with large numbers of children. Demographic, clinical, and participation data is tracked carefully in an electronic medical record system. Program operations, clinical oversight, and patient tracking are centralized at a care coordination center. Clinical operations and methods have been replicated in fixed specialty clinic sites at the Los Angeles County + University of Southern California Medical Center. Clinical and process measures are regularly evaluated to assure quality, plan iterative improvement, and support evidence-based care. Four Breathmobiles deliver ongoing care at more than 90 school sites. 
The program has engaged over five thousand patients and their families in a

  20. Research Guidelines in the Era of Large-scale Collaborations: An Analysis of Genome-wide Association Study Consortia

    Science.gov (United States)

    Austin, Melissa A.; Hair, Marilyn S.; Fullerton, Stephanie M.

    2012-01-01

    Scientific research has shifted from studies conducted by single investigators to the creation of large consortia. Genetic epidemiologists, for example, now collaborate extensively for genome-wide association studies (GWAS). The effect has been a stream of confirmed disease-gene associations. However, effects on human subjects oversight, data-sharing, publication and authorship practices, research organization and productivity, and intellectual property remain to be examined. The aim of this analysis was to identify all research consortia that had published the results of a GWAS analysis since 2005, characterize them, determine which have publicly accessible guidelines for research practices, and summarize the policies in these guidelines. A review of the National Human Genome Research Institute’s Catalog of Published Genome-Wide Association Studies identified 55 GWAS consortia as of April 1, 2011. These consortia were comprised of individual investigators, research centers, studies, or other consortia and studied 48 different diseases or traits. Only 14 (25%) were found to have publicly accessible research guidelines on consortia websites. The available guidelines provide information on organization, governance, and research protocols; half address institutional review board approval. Details of publication, authorship, data-sharing, and intellectual property vary considerably. Wider access to consortia guidelines is needed to establish appropriate research standards with broad applicability to emerging forms of large-scale collaboration. PMID:22491085

  1. [Results and evaluation of 3 years of a large scale mammography program in the Ariana area of Tunisia].

    Science.gov (United States)

    Bouchlaka, A; Ben Abdallah, M; Ben Aissa, R; Zaanouni, E; Kribi, L; Smida, S; M'barek, F; Ben Hamida, A; Boussen, H; Gueddana, N

    2009-07-01

To assess and analyse the results of three years of large-scale mammography screening for breast cancer in the Ariana region of Tunisia. The program, managed by the National Office of Family and Population, targeted women aged 49 to 69 residing in an area with a population of 459,700 inhabitants, including 52,729 women in the target age range. Screening comprised a clinical breast examination and a mammography with two views, frontal and external profile. Women were invited at their residence, informed at reproductive health centres or primary health care centres, or heard of the program through an acquaintance. Informed consent was obtained from women who wished to take part in the screening. Over three years, 9,093 mammograms were performed, of which 8,244 were retained for analysis, a participation rate of 9.6%. The recall rate for suspicious findings was 18.1% overall and 13.1% among women over 50. The surgical biopsy rate was 0.5% and the positive predictive value 45.5%. The average time between screening and delivery of the result was 9.7 days, longer when a complementary assessment was required (61.7 days). In total, 40 cancers were detected by the program, a crude detection rate of 4.9 per thousand, in line with the recommendations. The detected cancers included a high proportion of invasive cancers, among them subclinical cancers.

  2. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could run on 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the electrolysis modules currently available was drawn up. A review of the large-scale electrolysis plants installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers were discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis was evaluated. (authors)

  3. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (broadband, multi-utility, ...) and (2) bigger units with larger networks and more customers. Research by SINTEF Energy Research shows that, so far, the approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  4. Process evaluation and assessment of use of a large scale water filter and cookstove program in Rwanda

    Directory of Open Access Journals (Sweden)

    Christina K. Barstow

    2016-07-01

    financed, public health intervention can achieve high levels of initial adoption and usage of household level water filtration and improved cookstoves at a large scale.

  5. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real-world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, the Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abroad.

  6. Research on a Small Signal Stability Region Boundary Model of the Interconnected Power System with Large-Scale Wind Power

    Directory of Open Access Journals (Sweden)

    Wenying Liu

    2015-03-01

For the interconnected power system with large-scale wind power, the problem of small signal stability has become the bottleneck restricting the sending-out of wind power as well as the security and stability of the whole power system. Addressing this issue, this paper establishes a small signal stability region boundary model of the interconnected power system with large-scale wind power based on catastrophe theory, providing a new method for analyzing small signal stability. Firstly, we analyzed the typical characteristics and the mathematical model of the interconnected power system with wind power and pointed out that conventional methods cannot directly identify the topological properties of small signal stability region boundaries. Secondly, to address this problem we adopted catastrophe theory to establish a small signal stability region boundary model of the interconnected power system with large-scale wind power in two-dimensional power injection space, and extended it to obtain the boundary model in multidimensional power injection space. Thirdly, we analyzed qualitatively the changes in the topological properties of the small signal stability region boundary caused by large-scale wind power integration. Finally, we built simulation models in DIgSILENT/PowerFactory, and the simulation results verified the correctness and effectiveness of the proposed model.
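The criterion behind any small signal stability region is that the linearized system dx/dt = Ax is stable exactly when every eigenvalue of the state matrix A has a negative real part; the boundary is where an eigenvalue crosses the imaginary axis. For a real 2x2 state matrix this reduces to trace(A) < 0 and det(A) > 0 (the Routh-Hurwitz conditions). A minimal check, with invented state matrices:

```python
# Small-signal stability check for a 2x2 linearized system dx/dt = A x.
# All eigenvalues of a real 2x2 matrix have negative real parts exactly when
# trace(A) < 0 and det(A) > 0 (Routh-Hurwitz conditions for second order).

def is_small_signal_stable_2x2(a11, a12, a21, a22):
    trace = a11 + a22
    det = a11 * a22 - a12 * a21
    return trace < 0 and det > 0

# Hypothetical state matrices: a damped system, and one pushed across the
# stability region boundary (positive trace, i.e. negative damping).
print(is_small_signal_stable_2x2(-1.0, 2.0, -3.0, -0.5))  # → True
print(is_small_signal_stable_2x2(1.0, 2.0, -3.0, -0.5))   # → False
```

The paper's contribution is characterizing how the boundary of this stable set behaves in power injection space; the pointwise test above is the condition that the boundary separates.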

  7. The Contribution of International Large-Scale Assessments to Educational Research: Combining Individual and Institutional Data Sources

    Science.gov (United States)

    Strietholt, Rolf; Scherer, Ronny

    2018-01-01

The present paper aims to discuss how data from international large-scale assessments (ILSAs) can be utilized and combined, even with other existing data sources, in order to monitor educational outcomes and study the effectiveness of educational systems. We consider different purposes of linking data, namely, extending outcome measures,…

  8. Extraction of relations between genes and diseases from text and large-scale data analysis: implications for translational research.

    Science.gov (United States)

    Bravo, Àlex; Piñero, Janet; Queralt-Rosinach, Núria; Rautschka, Michael; Furlong, Laura I

    2015-02-21

    Current biomedical research needs to leverage and exploit the large amount of information reported in scientific publications. Automated text mining approaches, in particular those aimed at finding relationships between entities, are key for identification of actionable knowledge from free text repositories. We present the BeFree system aimed at identifying relationships between biomedical entities with a special focus on genes and their associated diseases. By exploiting morpho-syntactic information of the text, BeFree is able to identify gene-disease, drug-disease and drug-target associations with state-of-the-art performance. The application of BeFree to real-case scenarios shows its effectiveness in extracting information relevant for translational research. We show the value of the gene-disease associations extracted by BeFree through a number of analyses and integration with other data sources. BeFree succeeds in identifying genes associated to a major cause of morbidity worldwide, depression, which are not present in other public resources. Moreover, large-scale extraction and analysis of gene-disease associations, and integration with current biomedical knowledge, provided interesting insights on the kind of information that can be found in the literature, and raised challenges regarding data prioritization and curation. We found that only a small proportion of the gene-disease associations discovered by using BeFree is collected in expert-curated databases. Thus, there is a pressing need to find alternative strategies to manual curation, in order to review, prioritize and curate text-mining data and incorporate it into domain-specific databases. We present our strategy for data prioritization and discuss its implications for supporting biomedical research and applications. BeFree is a novel text mining system that performs competitively for the identification of gene-disease, drug-disease and drug-target associations. 
Our analyses show that mining only a
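The relation-extraction task BeFree addresses can be illustrated with a much simpler baseline. The sketch below is a sentence-level co-occurrence filter, not BeFree's morpho-syntactic kernel method, and the tiny gene/disease dictionaries are invented for the example:

```python
import re

# Minimal sentence-level co-occurrence baseline for gene-disease association
# extraction. Illustrative only: BeFree itself uses morpho-syntactic kernel
# methods, and real systems use large curated dictionaries, not these two sets.
GENES = {"BDNF", "SLC6A4"}
DISEASES = {"depression", "major depressive disorder"}

def cooccurring_pairs(text):
    """Return (gene, disease) pairs that co-occur within one sentence."""
    pairs = set()
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        genes = {g for g in GENES if g in sentence}
        diseases = {d for d in DISEASES if d in sentence.lower()}
        pairs.update((g, d) for g in genes for d in diseases)
    return pairs

text = ("BDNF polymorphisms have been linked to depression. "
        "SLC6A4 encodes the serotonin transporter.")
print(cooccurring_pairs(text))  # {('BDNF', 'depression')}
```

Co-occurrence recalls many spurious pairs; the abstract's point is that syntax-aware methods like BeFree filter these while still scaling to the whole literature.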

  9. An evaluation of two large scale demand side financing programs for maternal health in India: the MATIND study protocol

    Directory of Open Access Journals (Sweden)

    Sidney Kristi

    2012-08-01

Full Text Available Abstract Background High maternal mortality in India is a serious public health challenge. Demand side financing interventions have emerged as a strategy to promote access to emergency obstetric care. Two such state run programs, Janani Suraksha Yojana (JSY) and Chiranjeevi Yojana (CY), were designed and implemented to reduce financial access barriers that preclude women from obtaining emergency obstetric care. JSY, a conditional cash transfer, awards money directly to a woman who delivers in a public health facility. This will be studied in Madhya Pradesh province. CY, a voucher based program, empanels private obstetricians in Gujarat province, who are reimbursed by the government to perform deliveries of socioeconomically disadvantaged women. The programs have been in operation for the last seven years. Methods/designs The study outlined in this protocol will assess and compare the influence of the two programs on various aspects of maternal health care including trends in program uptake, institutional delivery rates, maternal and neonatal outcomes, quality of care, experiences of service providers and users, and cost effectiveness. The study will collect primary data using a combination of qualitative and quantitative methods, including facility level questionnaires, observations, a population based survey, in-depth interviews, and focus group discussions. Primary data will be collected in three districts of each province. The research will take place at three levels: the state health departments, obstetric facilities in the districts and among recently delivered mothers in the community. Discussion The protocol is a comprehensive assessment of the performance and impact of the programs and an economic analysis. It will fill existing evidence gaps in the scientific literature including access and quality to services, utilization, coverage and impact. The implementation of the protocol will also generate evidence to facilitate decision making

  10. An evaluation of two large scale demand side financing programs for maternal health in India: the MATIND study protocol.

    Science.gov (United States)

    Sidney, Kristi; de Costa, Ayesha; Diwan, Vishal; Mavalankar, Dileep V; Smith, Helen

    2012-08-27

High maternal mortality in India is a serious public health challenge. Demand side financing interventions have emerged as a strategy to promote access to emergency obstetric care. Two such state run programs, Janani Suraksha Yojana (JSY) and Chiranjeevi Yojana (CY), were designed and implemented to reduce financial access barriers that preclude women from obtaining emergency obstetric care. JSY, a conditional cash transfer, awards money directly to a woman who delivers in a public health facility. This will be studied in Madhya Pradesh province. CY, a voucher based program, empanels private obstetricians in Gujarat province, who are reimbursed by the government to perform deliveries of socioeconomically disadvantaged women. The programs have been in operation for the last seven years. The study outlined in this protocol will assess and compare the influence of the two programs on various aspects of maternal health care including trends in program uptake, institutional delivery rates, maternal and neonatal outcomes, quality of care, experiences of service providers and users, and cost effectiveness. The study will collect primary data using a combination of qualitative and quantitative methods, including facility level questionnaires, observations, a population based survey, in-depth interviews, and focus group discussions. Primary data will be collected in three districts of each province. The research will take place at three levels: the state health departments, obstetric facilities in the districts and among recently delivered mothers in the community. The protocol is a comprehensive assessment of the performance and impact of the programs and an economic analysis. It will fill existing evidence gaps in the scientific literature including access and quality to services, utilization, coverage and impact. 
The implementation of the protocol will also generate evidence to facilitate decision making among policy makers and program managers who currently work with or

  11. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
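The contrast with multi-hypothesis tracking can be illustrated with the cheapest alternative: greedy nearest-neighbour association, which keeps one hypothesis per track instead of an exponentially growing hypothesis tree. This is an illustrative sketch, not one of the report's evaluated algorithms; the gate threshold and coordinates are invented:

```python
import math

def greedy_associate(tracks, detections, gate=5.0):
    """Greedy nearest-neighbour data association: closest track-detection
    pairs are matched first, each detection claimed at most once, and only
    inside a distance gate. One hypothesis per track, versus the exponential
    hypothesis tree of a multi-hypothesis tracker."""
    candidates = sorted(
        ((math.dist(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        key=lambda c: c[0])
    pairs, matched_tracks, claimed = [], set(), set()
    for dist, ti, di in candidates:
        if dist <= gate and ti not in matched_tracks and di not in claimed:
            pairs.append((ti, di))
            matched_tracks.add(ti)
            claimed.add(di)
    return pairs

tracks = [(0.0, 0.0), (10.0, 10.0)]
detections = [(9.5, 10.5), (1.0, -0.5), (40.0, 40.0)]
print(greedy_associate(tracks, detections))  # [(1, 0), (0, 1)]
```

The far detection at (40, 40) falls outside every gate and starts no match; in a dense closely-spaced-object scene this greedy choice is exactly where accuracy is lost, which is why the report weighs costlier alternatives.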

  12. Survey and research for the enhancement of large-scale technology development 1. Japan's large-scale technology development and the effects; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 1. Nippon no daikibo gijutsu kaihatsu to sono koka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

A survey is conducted into the effects of projects implemented under the large-scale industrial technology research and development system. In the development of 'ultraperformance computers,' each of the technologies is being widely utilized, and the data service system of Nippon Telegraph and Telephone Public Corporation and the large computer (HITAC8800) owe much of their success to the fruits of the development effort. In the development of the 'desulfurization technology,' the fruits are in use by Tokyo Electric Power Co., Inc., and Chubu Electric Power Co., Inc., incorporated into their desulfurization systems. Although no practical plant has been built on the basis of the 'great-depth remotely controlled submarine oil drilling rig,' its oceanic technologies and control methods are being utilized in various fields. The 'seawater desalination and by-product utilization' technologies have reached the top level in the world thanks to the resulting manufacture of concrete evaporators and related technologies; eleven plants have been completed utilizing the fruits of the development. In the field of the 'electric vehicle,' no commercialization is in progress because of problems in cost effectiveness, though remarkable improvement has been achieved in terms of performance; technologies for weight reduction, semiconductor devices, battery parts and components, etc., are being utilized in many fields. (NEDO)

  13. Self-Report Measures of the Home Learning Environment in Large Scale Research: Measurement Properties and Associations with Key Developmental Outcomes

    Science.gov (United States)

    Niklas, Frank; Nguyen, Cuc; Cloney, Daniel S.; Tayler, Collette; Adams, Raymond

    2016-01-01

    Favourable home learning environments (HLEs) support children's literacy, numeracy and social development. In large-scale research, HLE is typically measured by self-report survey, but there is little consistency between studies and many different items and latent constructs are observed. Little is known about the stability of these items and…

  14. Components of an effective large scale program for the prevention of inherited hemoglobin disorders; the paradigm of Greece

    Directory of Open Access Journals (Sweden)

    D. Loukopoulos

    2012-12-01

Full Text Available Large scale prevention programs for thalassemia major or sickle cell disease have already been set up in several places with a high frequency of the deleterious genes. The Greek health authorities realized the magnitude of the problem and allowed the creation of a National Thalassemia Center in 1972. The incidence of thalassemia in Greece varies from 1-2 per cent up to 15 per cent, the mean being around 8 per cent. With an annual number of births around 100,000, if no prevention measures are taken, the expected yearly number of newborns with thalassemia major in Greece should be of the order of 100-120. To these one should add a few dozen sickle cell patients, homozygotes or compound HbS/β-thalassemia heterozygotes. The total number of patients with thalassemia major now surviving is estimated at 4,000, plus another 600-800 patients with sickle cell disease. The National Thalassemia Center defined a network of peripheral Thalassemia Units in the major regional hospitals of the country and let them provide free carrier identification to couples requesting the test. When both partners were identified as carriers, they were given preliminary information locally and were referred to the Central Laboratory in Athens for further genetic counselling and, if so decided, prenatal diagnosis. Prenatal diagnosis was provided initially by fetoscopy and fetal blood biosynthesis; this approach was soon replaced by chorionic villi sampling and molecular techniques. The number of prenatal diagnoses carried out yearly over the last decade appears to cover the needs; the number of positive diagnoses is very close to the expected 25%, which also excludes overdiagnosis. The overall evaluation of the program is reflected in the number of infants who were admitted to the pediatric clinics of the country in need of transfusion over the years the program was functioning. In fact, over the past years this number has steadily decreased to approximately 10 missed

  15. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    Science.gov (United States)

    Rutledge, Robert G

    2011-03-02

    Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
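The core LRE idea can be sketched with a simplified amplification model (an assumption for illustration, not the LRE Analyzer's implementation): if per-cycle efficiency declines linearly with fluorescence, a linear regression of efficiency against fluorescence recovers the maximal efficiency at the intercept, with no standard curve:

```python
# Simplified LRE sketch. Assumed logistic model: per-cycle efficiency
# E_c = Emax * (1 - F/Fmax), so E_c is linear in F and the regression
# intercept estimates Emax. The real LRE Analyzer works on raw SYBR Green I
# fluorescence readings; the profile here is simulated.
def simulate_profile(f0=0.01, fmax=100.0, emax=0.9, cycles=60):
    """Generate a sigmoidal amplification profile, one reading per cycle."""
    profile = [f0]
    for _ in range(cycles):
        f = profile[-1]
        profile.append(f * (1.0 + emax * (1.0 - f / fmax)))
    return profile

def lre_fit(profile):
    """Least-squares line through (F_{c-1}, E_c) points from the profile."""
    xs = profile[:-1]
    ys = [(b - a) / a for a, b in zip(profile, profile[1:])]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope  # intercept ~ Emax, slope ~ -Emax/Fmax

emax_est, slope = lre_fit(simulate_profile())
print(round(emax_est, 3))  # 0.9
```

Once Emax is known, target quantity can be back-calculated from any fluorescence reading in the central region, which is what eliminates the standard curve.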

  16. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    Directory of Open Access Journals (Sweden)

    Robert G Rutledge

    Full Text Available BACKGROUND: Linear regression of efficiency (LRE introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. FINDINGS: Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. CONCLUSIONS: The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.

  17. Increasing condom use and declining STI prevalence in high-risk MSM and TGs: evaluation of a large-scale prevention program in Tamil Nadu, India.

    Science.gov (United States)

    Subramanian, Thilakavathi; Ramakrishnan, Lakshmi; Aridoss, Santhakumar; Goswami, Prabuddhagopal; Kanguswami, Boopathi; Shajan, Mathew; Adhikary, Rajat; Purushothaman, Girish Kumar Chethrapilly; Ramamoorthy, Senthil Kumar; Chinnaswamy, Eswaramurthy; Veeramani, Ilaya Bharathy; Paranjape, Ramesh Shivram

    2013-09-17

This paper presents an evaluation of Avahan, a large scale HIV prevention program that was implemented using peer-mediated strategies, condom distribution and sexually transmitted infection (STI) clinical services among high-risk men who have sex with men (HR-MSM) and male to female transgender persons (TGs) in the high-prevalence state of Tamil Nadu in southern India. Two rounds of large scale cross-sectional bio-behavioural surveys among HR-MSM and TGs and routine program monitoring data were used to assess changes in program coverage, condom use and prevalence of STIs (including HIV) and their association to program exposure. The Avahan program for HR-MSM and TGs in Tamil Nadu was significantly scaled up and contacts by peer educators reached 77 percent of the estimated denominator by the end of the program's fourth year. Exposure to the program increased between the two rounds of surveys for both HR-MSM (from 66 percent to 90 percent; AOR = 4.6; p Tamil Nadu achieved a high coverage, resulting in improved condom use by HR-MSM with their regular and commercial male partners. Declining STI prevalence and stable HIV prevalence reflect the positive effects of the prevention strategy. Outcomes from the program logic model indicate the effectiveness of the program for HR-MSM and TGs in Tamil Nadu.

  18. Transitioning a Large Scale HIV/AIDS Prevention Program to Local Stakeholders: Findings from the Avahan Transition Evaluation.

    Directory of Open Access Journals (Sweden)

    Sara Bennett

Full Text Available Between 2009 and 2013 the Bill and Melinda Gates Foundation transitioned its HIV/AIDS prevention initiative in India from being a stand-alone program outside of government to being fully government funded and implemented. We present an independent prospective evaluation of the transition. The evaluation drew upon (1) a structured survey of transition readiness in a sample of 80 targeted HIV prevention programs prior to transition; (2) a structured survey assessing institutionalization of program features in a sample of 70 targeted intervention (TI) programs, one year post-transition; and (3) case studies of 15 TI programs. Transition was conducted in 3 rounds. While the 2009 transition round was problematic, subsequent rounds were implemented more smoothly. In the 2011 and 2012 transition rounds, Avahan programs were well prepared for transition, with the large majority of TI program staff trained for transition, high alignment with government clinical, financial and managerial norms, and strong government commitment to the program. One year post-transition there were significant program changes, but these were largely perceived positively. Notable negative changes were: limited flexibility in program management, delays in funding, commodity stock outs, and community member perceptions of a narrowing in program focus. Service coverage outcomes were sustained at least six months post-transition. The study suggests that significant investments in transition preparation contributed to a smooth transition and sustained service coverage. Notwithstanding, there were substantive program changes post-transition. Five key lessons for transition design and implementation are identified.

  19. Model Research of Gas Emissions From Lignite and Biomass Co-Combustion in a Large Scale CFB Boiler

    Directory of Open Access Journals (Sweden)

    Krzywański Jarosław

    2014-06-01

Full Text Available The paper is focused on combustion modelling of a large-scale circulating fluidised bed (CFB) boiler during coal and biomass co-combustion. Numerical computation results for the co-combustion of three solid biomass fuels with lignite are presented in the paper. The results of the calculation showed that some reactions in the previously established kinetics equations for coal combustion had to be modified, as the combustion conditions change with the fuel blend composition. The obtained CO2, CO, SO2 and NOx emissions lie within ±20% of the experimental data. Experimental data was obtained from co-combustion tests of forest biomass, sunflower husk, willow and lignite carried out on the atmospheric 261 MWe COMPACT CFB boiler operated in PGE Turow Power Station in Poland. The energy fraction of biomass in the fuel blend was 7 wt%, 10 wt% and 15 wt%. The measured emissions of CO, SO2 and NOx (i.e. NO + NO2) are also shown in the paper. For all types of biomass added to the fuel blends, the emission of the gaseous pollutants was lower than that for coal combustion.
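The ±20% agreement criterion amounts to a relative-error band check, which can be stated in a few lines; the emission values below are invented for illustration, not the boiler-test data:

```python
def within_band(modelled, measured, band=0.20):
    """True when the modelled emission lies within ±band of the measurement."""
    return abs(modelled - measured) <= band * measured

# Illustrative concentrations only (e.g. mg/Nm3); not taken from the paper.
cases = {"CO": (110.0, 100.0), "SO2": (95.0, 90.0), "NOx": (145.0, 115.0)}
for species, (modelled, measured) in cases.items():
    print(species, "within ±20%:", within_band(modelled, measured))
```

In this invented set the NOx prediction misses the band (a 26% relative error), the kind of case that forced the kinetics equations to be modified for blended fuels.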

  20. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  1. Cost analysis of large-scale implementation of the 'Helping Babies Breathe' newborn resuscitation-training program in Tanzania.

    Science.gov (United States)

    Chaudhury, Sumona; Arlington, Lauren; Brenan, Shelby; Kairuki, Allan Kaijunga; Meda, Amunga Robson; Isangula, Kahabi G; Mponzi, Victor; Bishanga, Dunstan; Thomas, Erica; Msemo, Georgina; Azayo, Mary; Molinier, Alice; Nelson, Brett D

    2016-12-01

Helping Babies Breathe (HBB) has become the gold standard globally for training birth-attendants in neonatal resuscitation in low-resource settings in efforts to reduce early newborn asphyxia and mortality. The purpose of this study was to conduct a first-ever activity-based cost analysis of at-scale HBB program implementation and initial follow-up in a large region of Tanzania, and to evaluate costs of national scale-up as one component of a multi-method external evaluation of the implementation of HBB at scale in Tanzania. We used activity-based costing to examine budget expense data during the two-month implementation and follow-up of HBB in one of the target regions. Activity-cost centers included administrative, initial training (including resuscitation equipment), and follow-up training expenses. Sensitivity analysis was utilized to project cost scenarios incurred to achieve countrywide expansion of the program across all mainland regions of Tanzania and to model costs of program maintenance over one and five years following initiation. Total costs for the Mbeya Region were $202,240, with the highest proportion due to initial training and equipment (45.2%), followed by central program administration (37.2%), and follow-up visits (17.6%). Within Mbeya, 49 training sessions were undertaken, involving the training of 1,341 health providers from 336 health facilities in eight districts. To similarly expand the HBB program across the 25 regions of mainland Tanzania, the total economic cost is projected to be around $4,000,000 (around $600 per facility). Following sensitivity analyses, the estimated total for initial rollout across all of Tanzania lies between $2,934,793 and $4,309,595. In order to maintain the program nationally under the current model, it is estimated it would cost $2,019,115 for a further one year and $5,640,794 for a further five years of ongoing program support. 
HBB implementation is a relatively low-cost intervention with potential for high impact on perinatal
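The cost figures quoted above can be sanity-checked with simple arithmetic; the sketch below reproduces the per-facility estimate from the abstract's own numbers:

```python
# Figures quoted in the abstract for the Mbeya Region implementation.
total_usd = 202_240
shares = {
    "initial training and equipment": 0.452,
    "central program administration": 0.372,
    "follow-up visits": 0.176,
}
assert abs(sum(shares.values()) - 1.0) < 1e-9  # the three shares cover the budget
for activity, share in shares.items():
    print(f"{activity}: ${share * total_usd:,.0f}")

facilities = 336
print(f"per facility: ${total_usd / facilities:,.0f}")  # per facility: $602
```

The $602 per facility in Mbeya is consistent with the abstract's "around $600 per facility" figure for the projected national rollout.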

  2. Extrinsic Motivation for Large-Scale Assessments: A Case Study of a Student Achievement Program at One Urban High School

    Science.gov (United States)

    Emmett, Joshua; McGee, Dean

    2013-01-01

    The purpose of this case study was to discover the critical attributes of a student achievement program, known as "Think Gold," implemented at one urban comprehensive high school as part of the improvement process. Student achievement on state assessments improved during the period under study. The study draws upon perspectives on…

  3. Effectiveness of Large-Scale Chagas Disease Vector Control Program in Nicaragua by Residual Insecticide Spraying Against Triatoma dimidiata.

    Science.gov (United States)

    Yoshioka, Kota; Nakamura, Jiro; Pérez, Byron; Tercero, Doribel; Pérez, Lenin; Tabaru, Yuichiro

    2015-12-01

    Chagas disease is one of the most serious health problems in Latin America. Because the disease is transmitted mainly by triatomine vectors, a three-phase vector control strategy was used to reduce its vector-borne transmission. In Nicaragua, we implemented an indoor insecticide spraying program in five northern departments to reduce house infestation by Triatoma dimidiata. The spraying program was performed in two rounds. After each round, we conducted entomological evaluation to compare the vector infestation level before and after spraying. A total of 66,200 and 44,683 houses were sprayed in the first and second spraying rounds, respectively. The entomological evaluation showed that the proportion of houses infested by T. dimidiata was reduced from 17.0% to 3.0% after the first spraying, which was statistically significant (P vector control strategies, and implementation of sustainable vector surveillance. © The American Society of Tropical Medicine and Hygiene.

  4. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  5. Fast and accurate solution for the SCUC problem in large-scale power systems using adapted binary programming and enhanced dual neural network

    International Nuclear Information System (INIS)

    Shafie-khah, M.; Moghaddam, M.P.; Sheikh-El-Eslami, M.K.; Catalão, J.P.S.

    2014-01-01

Highlights: • A novel hybrid method based on decomposition of SCUC into QP and BP problems is proposed. • An adapted binary programming and an enhanced dual neural network model are applied. • The proposed EDNN is exactly convergent to the global optimal solution of QP. • An AC power flow procedure is developed for including contingency/security issues. • It is suited for large-scale systems, providing both accurate and fast solutions. - Abstract: This paper presents a novel hybrid method for solving the security constrained unit commitment (SCUC) problem. The proposed formulation requires much less computation time in comparison with other methods while assuring the accuracy of the results. Furthermore, the framework provided here allows including an accurate description of warmth-dependent startup costs, valve point effects, multiple fuel costs, forbidden zones of operation, and AC load flow bounds. To solve the nonconvex problem, an adapted binary programming method and an enhanced dual neural network model are utilized as optimization tools, and a procedure for AC power flow modeling is developed for including contingency/security issues, as new contributions to earlier studies. Unlike classical SCUC methods, the proposed method solves the unit commitment problem and complies with the network limits simultaneously. In addition to conventional test systems, a real-world large-scale power system with 493 units has been used to fully validate the effectiveness of the novel hybrid method proposed.
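The combinatorial core of unit commitment that motivates such hybrid methods can be seen in a toy example. The sketch below brute-forces a three-unit commitment with merit-order dispatch; it is illustrative only (invented costs, no security constraints) and does not reproduce the paper's adapted binary programming or dual neural network:

```python
from itertools import combinations

# Toy unit commitment by exhaustive enumeration of the binary on/off
# decisions. Each unit: (name, fixed_cost, marginal_cost, capacity).
# All values are invented for illustration.
UNITS = [("A", 5.0, 2.0, 50.0), ("B", 8.0, 1.0, 80.0), ("C", 3.0, 4.0, 30.0)]
DEMAND = 100.0

def dispatch_cost(units, demand):
    """Commitment (fixed) cost plus merit-order economic dispatch."""
    cost = sum(u[1] for u in units)
    remaining = demand
    for _, _, marginal, capacity in sorted(units, key=lambda u: u[2]):
        served = min(capacity, remaining)
        cost += marginal * served
        remaining -= served
    return cost

# Enumerate every commitment whose capacity can meet demand (2^n subsets:
# this exponential growth is why real SCUC needs smarter decompositions).
feasible = [s for r in range(1, len(UNITS) + 1)
            for s in combinations(UNITS, r)
            if sum(u[3] for u in s) >= DEMAND]
best = min(feasible, key=lambda s: dispatch_cost(s, DEMAND))
print(dispatch_cost(best, DEMAND), sorted(u[0] for u in best))  # 133.0 ['A', 'B']
```

With 493 units the 2^493 commitment space makes enumeration hopeless, which is the gap the paper's QP/BP decomposition is designed to close.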

  6. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  7. Evaluation of the clinical implementation of a large-scale online e-learning program on venous blood specimen collection guideline practices.

    Science.gov (United States)

    Willman, Britta; Grankvist, Kjell; Bölenius, Karin

    2018-05-11

When performed erroneously, the venous blood specimen collection (VBSC) practice steps patient identification, test request management and test tube labeling are at high risk of jeopardizing patient safety. VBSC educational programs intended to minimize the risk of harm to patients are therefore needed. In this study, we evaluate the efficiency of a large-scale online e-learning program on personnel's adherence to VBSC practices and their experience of the e-learning program. An interprofessional team transformed an implemented traditional VBSC education program into an online e-learning program developed to stimulate reflection, with focus on the high-risk practice steps. We used questionnaires to evaluate the effect of the e-learning program on personnel's self-reported adherence to VBSC practices, compared to questionnaire surveys before and after introduction of the traditional education program. We used content analysis to evaluate the participants' free text experience of the VBSC e-learning program. Adherence to the VBSC guideline high-risk practice steps generally increased following the implementation of a traditional educational program followed by an e-learning program. We however found a negative trend over the years regarding participation rates and the practice to always send/sign the request form following the introduction of an electronic request system. The participants were in general content with the VBSC e-learning program. Properly designed e-learning programs on VBSC practices surpass traditional educational programs in usefulness and functionality. Inclusion of questionnaires in the e-learning program is necessary for follow-up of participants' practices and educational program efficiency.

  8. Factors Influencing Uptake of a Large Scale Curriculum Innovation.

    Science.gov (United States)

    Adey, Philip S.

    Educational research has all too often failed to be implemented on a large-scale basis. This paper describes the multiplier effect of a professional development program for teachers and for trainers in the United Kingdom, and how that program was developed, monitored, and evaluated. Cognitive Acceleration through Science Education (CASE) is a…

  9. The application of two-step linear temperature program to thermal analysis for monitoring the lipid induction of Nostoc sp. KNUA003 in large scale cultivation.

    Science.gov (United States)

    Kang, Bongmun; Yoon, Ho-Sung

    2015-02-01

Recently, microalgae have been considered as a renewable source for fuel production because their production is nonseasonal and may take place on nonarable land. Despite all of these advantages, microalgal oil production is significantly affected by environmental factors. Furthermore, large variability remains an important problem in the measurement of algal productivity and in compositional analysis, especially of the total lipid content. Thus, there is considerable interest in accurate determination of total lipid content during the biotechnological process. For these reasons, various high-throughput technologies were suggested for accurate measurement of total lipids contained in microorganisms, especially oleaginous microalgae. In addition, more advanced technologies were employed to quantify the total lipids of the microalgae without a pretreatment. However, these methods are difficult to apply to wet-form microalgae obtained from large-scale production. In the present study, thermal analysis performed with a two-step linear temperature program was applied to measure the heat evolved in the temperature range from 310 to 351 °C by Nostoc sp. KNUA003 obtained from large-scale cultivation. We then examined the relationship between the heat evolved in 310-351 °C (HE) and the total lipid content of wet Nostoc cells cultivated in a raceway. As a result, a linear relationship was determined between the HE value and the total lipid content of Nostoc sp. KNUA003. In particular, there was a linear correlation of 98% between the HE value and the total lipid content of the tested microorganism. Based on this relationship, the total lipid content converted from the heat evolved by wet Nostoc sp. KNUA003 could be used for monitoring its lipid induction in large-scale cultivation. Copyright © 2014 Elsevier Inc. All rights reserved.
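The calibration step implied by the 98% linear relationship can be sketched as an ordinary least-squares fit of lipid content against HE; the calibration pairs below are synthetic, not the study's measurements:

```python
# Synthetic calibration pairs: heat evolved HE (per unit sample) vs total
# lipid content (% dry weight). Constructed to lie on lipid = 0.05*HE + 2,
# purely to illustrate the fit; the study's real data are not reproduced.
he = [100.0, 200.0, 300.0, 400.0]
lipid = [7.0, 12.0, 17.0, 22.0]

n = len(he)
mx, my = sum(he) / n, sum(lipid) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(he, lipid))
         / sum((x - mx) ** 2 for x in he))
intercept = my - slope * mx

# Coefficient of determination for the fit (the study reports ~98%).
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(he, lipid))
ss_tot = sum((y - my) ** 2 for y in lipid)
r2 = 1.0 - ss_res / ss_tot

print(round(slope, 3), round(intercept, 3), round(r2, 3))  # 0.05 2.0 1.0

# Monitoring step: convert a new HE reading into a lipid estimate.
predicted_lipid = slope * 250.0 + intercept
print(round(predicted_lipid, 2))  # 14.5
```

Once calibrated, each HE reading from routine thermal analysis of wet raceway samples converts directly into a lipid estimate, which is the monitoring use the abstract proposes.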

  10. Learning from the design and implementation of large-scale programs to improve infant and young child feeding.

    Science.gov (United States)

    Baker, Jean; Sanghvi, Tina; Hajeebhoy, Nemat; Abrha, Teweldebrhan Hailu

    2013-09-01

    Improving and sustaining infant and young child feeding (IYCF) practices requires multiple interventions reaching diverse target groups over a sustained period of time. These interventions, together with improved maternal nutrition, are the cornerstones for realizing a lifetime of benefits from investing in nutrition during the 1000-day period. Summarize major lessons from Alive & Thrive's work to improve IYCF in three diverse settings--Bangladesh, Ethiopia, and Vietnam. Draw lessons from reports, studies, surveys, routine monitoring, and discussions on the drivers of successful design and implementation of IYCF strategies. Teaming up with carefully selected implementing partners with strong commitment is a critical first step. As programs move to implementation at scale, strategic systems strengthening is needed to avoid operational bottlenecks. Adequate IYCF counseling takes more than training; it requires rational task allocation, substantial follow-up, and recognition of frontline workers. Investing in community demand for IYCF services should be prioritized, specifically through social mobilization and relevant media for multiple audiences. Design of behavior change communication and its implementation must be flexible and responsive to shifts in society's use of media and other social changes. Private-sector creative agencies and media companies are well equipped to market IYCF. Scaling up core IYCF interventions and maintaining quality are facilitated by national-level coordinating and information exchange mechanisms using evidence on quality and coverage. It is possible to deliver quality IYCF interventions at scale, while creating new knowledge, tools, and approaches that can be adapted by others.

  11. News from heat-pump research - Large-scale heat pumps, components, heat pumps and solar heating; News aus der Waermepumpen-Forschung - Gross-Waermepumpen, Komponenten, Waermepumpe und Solar

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-06-15

    These proceedings summarise the presentations made at the 16{sup th} annual meeting held by the Swiss Federal Office of Energy's Heat Pump Research Program in Burgdorf, Switzerland. The proceedings include contributions on large-scale heat pumps, components and the activities of the heat pump promotion society. A summary of targets and trends in energy research in general is presented and an overview of the heat pump market in 2009 and future perspectives is given. International work within the framework of the International Energy Agency's heat pump group is reviewed, including solar - heat pump combinations. Field-monitoring and the analysis of large-scale heat pumps are discussed and the importance of the use of correct concepts in such installations is stressed. Large-scale heat pumps with carbon dioxide as working fluid are looked at, as are output-regulated air/water heat pumps. Efficient system solutions with heat pumps used both to heat and to cool are discussed. Deep geothermal probes and the potential offered by geothermal probes using carbon dioxide as a working fluid are discussed. The proceedings are rounded off with a list of useful addresses.

  12. Proceedings of joint meeting of the 6th simulation science symposium and the NIFS collaboration research 'large scale computer simulation'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-03-01

    The joint meeting of the 6th Simulation Science Symposium and the NIFS Collaboration Research 'Large Scale Computer Simulation' was held on December 12-13, 2002 at the National Institute for Fusion Science, with the aim of promoting interdisciplinary collaboration in various fields of computer simulation. The meeting, attended by more than 40 people, consisted of 11 invited and 22 contributed papers, whose topics extended not only to fusion science but also to related fields such as astrophysics, earth science, fluid dynamics, molecular dynamics, and computer science. (author)

  13. Research data management support for large-scale, long-term, interdisciplinary collaborative research centers with a focus on environmental sciences

    Science.gov (United States)

    Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.

    2017-12-01

    Science conducted in collaborative, cross-institutional research projects requires active sharing of research ideas, data, documents and further information in a well-managed, controlled and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange and re-use of data as scientific output. Additionally, publications, conference contributions, reports, pictures etc. should be managed. Both knowledge and data sharing are essential to create synergies. Within its coordinated programme `Collaborative Research Center' (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects) that address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, an RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) `Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation' (www.tr32.de). The experience gained has been used to arrange RDM services for the CRC1211 `Earth - Evolution at the Dry Limit' (www.crc1211.de), funded since 2016. In both projects, scientists from various disciplines collect heterogeneous data in field campaigns or through modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) was designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211's needs (www.crc1211db.uni-koeln.de) in 2016. Both

  14. Surface-subsurface flow modeling: an example of large-scale research at the new NEON user facility

    Science.gov (United States)

    Powell, H.; McKnight, D. M.

    2009-12-01

    Climate change is predicted to alter surface-subsurface interactions in freshwater ecosystems. These interactions are hypothesized to control nutrient release at diel and seasonal time scales, which may then exert control over epilithic algal growth rates. The mechanisms underlying shifts in complex physical-chemical-biological patterns can be elucidated by long-term observations at sites that span hydrologic and climate gradients across the continent. Development of the National Ecological Observatory Network (NEON) will provide researchers the opportunity to investigate continental-scale patterns by combining investigator-driven measurements with Observatory data. NEON is a national-scale research platform for analyzing and understanding the impacts of climate change, land-use change, and invasive species on ecology. NEON features sensor networks and experiments, linked by advanced cyberinfrastructure to record and archive ecological data for at least 30 years. NEON partitions the United States into 20 ecoclimatic domains. Each domain hosts one fully instrumented Core Aquatic site in a wildland area and one Relocatable site, which aims to capture ecologically significant gradients (e.g. landuse, nitrogen deposition, urbanization). In the current definition of NEON there are 36 Aquatic sites: 30 streams/rivers and 6 ponds/lakes. Each site includes automated, in-situ sensors for groundwater elevation and temperature; stream flow (discharge and stage); pond water elevation; atmospheric chemistry (Tair, barometric pressure, PAR, radiation); and surface water chemistry (DO, Twater, conductivity, pH, turbidity, cDOM, nutrients). Groundwater and surface water sites shall be regularly sampled for selected chemical and isotopic parameters. 
The hydrologic and geochemical monitoring design provides basic information on water and chemical fluxes in streams and ponds and between groundwater and surface water, which is intended to support investigator-driven modeling studies

  15. Using Large-Scale Linkage Data to Evaluate the Effectiveness of a National Educational Program on Antithrombotic Prescribing and Associated Stroke Prevention in Primary Care.

    Science.gov (United States)

    Liu, Zhixin; Moorin, Rachael; Worthington, John; Tofler, Geoffrey; Bartlett, Mark; Khan, Rabia; Zuo, Yeqin

    2016-10-13

    The National Prescribing Service (NPS) MedicineWise Stroke Prevention Program, which was implemented nationally in 2009-2010 in Australia, sought to improve antithrombotic prescribing in stroke prevention using dedicated interventions that target general practitioners. This study evaluated the impact of the NPS MedicineWise Stroke Prevention Program on antithrombotic prescribing and primary stroke hospitalizations. This population-based time series study used administrative health data linked to 45 and Up Study participants with a high risk of cardiovascular disease (CVD) to assess the possible impact of the NPS MedicineWise program on first-time aspirin prescriptions and primary stroke-related hospitalizations. Time series analysis showed that the NPS MedicineWise program was significantly associated with increased first-time prescribing of aspirin (P=0.03) and decreased hospitalizations for primary ischemic stroke (P=0.03) in the at-risk study population (n=90 023). First-time aspirin prescription was correlated with a reduction in the rate of hospitalization for primary stroke (P=0.02). Following intervention, the number of first-time aspirin prescriptions increased by 19.8% (95% confidence interval, 1.6-38.0), while the number of first-time stroke hospitalizations decreased by 17.3% (95% confidence interval, 1.8-30.0). Consistent with NPS MedicineWise program messages for the high-risk CVD population, the NPS MedicineWise Stroke Prevention Program (2009) was associated with increased initiation of aspirin and a reduced rate of hospitalization for primary stroke. The findings suggest that the provision of evidence-based multifaceted large-scale educational programs in primary care can be effective in changing prescriber behavior and positively impacting patient health outcomes. © 2016 The Authors and NPS MedicineWise. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
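    The kind of evaluation the abstract describes, testing whether a program changed the level of a prescribing time series, is commonly done with segmented (interrupted time series) regression. The sketch below uses simulated data and a deliberately simplified model (underlying trend plus a post-intervention level change); it is not the study's actual data or analysis.

```python
# Hedged sketch of interrupted time series (segmented regression) analysis.
# Data are simulated: monthly counts with a trend and a jump at the
# intervention month. The design matrix has intercept, trend, and a
# post-intervention level-change indicator.
import numpy as np

rng = np.random.default_rng(0)
n, intervention = 48, 24                 # 48 months; program starts at month 24
t = np.arange(n)
post = (t >= intervention).astype(float)

# Simulated monthly first-time prescriptions: baseline + trend + a +20 jump
y = 100 + 0.5 * t + 20 * post + rng.normal(0, 3, n)

# Ordinary least squares: y ≈ b0 + b1*t + b2*post
X = np.column_stack([np.ones(n), t, post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = coef[2]
print(f"estimated post-intervention level change: {level_change:.1f}")
```

    A fuller analysis would also include a slope-change term and account for autocorrelation, but the level-change coefficient above is the basic quantity behind statements like "prescriptions increased by 19.8% following the intervention".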

  16. Operations Research techniques in the management of large-scale reforestation programs

    Science.gov (United States)

    Joseph Buongiorno; D.E. Teeguarden

    1978-01-01

    A reforestation planning system for the Douglas-fir region of the Western United States is described. Part of the system is a simulation model to predict plantation growth and to determine economic thinning regimes and rotation ages as a function of site characteristics, initial density, reforestation costs, and management constraints. A second model estimates the...

  17. Large-scale Homogenization of Bulk Materials in Mammoth Silos

    NARCIS (Netherlands)

    Schott, D.L.

    2004-01-01

    This doctoral thesis concerns the large-scale homogenization of bulk materials in mammoth silos. The objective of this research was to determine the best stacking and reclaiming method for homogenization in mammoth silos. For this purpose a simulation program was developed to estimate the

  18. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research so far shows that approaches to large-scale network management may be structured around three main strategies: centralization, decentralization and outsourcing. This article is part of a planned series.

  19. Numerical and experimental simulation of accident processes using KMS large-scale test facility under the program of training university students for nuclear power industry

    International Nuclear Information System (INIS)

    Aniskevich, Yu.N.

    2005-01-01

    The KMS large-scale test facility is being constructed at the NITI site and is designed to model accident processes in VVER reactor plants and to provide experimental data for the safety analysis of both existing and future NPPs. Phase I of KMS is at the completion stage: a containment model of 2000 m3 volume intended for experimental simulation of heat and mass transfer of steam-gas mixtures and aerosols inside the containment. Phase II will incorporate a reactor model (1:27 scale) and will be used for analysing a number of events, including primary and secondary LOCA. The KMS program for background training of university students in the nuclear field will include: participation in the development and application of experiment procedures; preparation and conduct of experiments; pre-test and post-test calculations with different computer codes; on-the-job training as operators of experiment scenarios; and training of specialists in measurement and data acquisition technologies. (author)

  20. 'You should at least ask'. The expectations, hopes and fears of rare disease patients on large-scale data and biomaterial sharing for genomics research.

    Science.gov (United States)

    McCormack, Pauline; Kole, Anna; Gainotti, Sabina; Mascalzoni, Deborah; Molster, Caron; Lochmüller, Hanns; Woods, Simon

    2016-10-01

    Among the myriad articles about participants' opinions of genomics research, the views of a distinct group, people with a rare disease (RD), are unknown. It is important to understand whether their opinions differ from those of the general public by virtue of having a rare disease and the vulnerabilities inherent in this. Here we document RD patients' attitudes to participation in genomics research, particularly around large-scale, international data and biosample sharing. This work is unique in exploring the views of people with a range of rare disorders from many different countries. The authors work within an international, multidisciplinary consortium, RD-Connect, which has developed an integrated platform connecting databases, registries, biobanks and clinical bioinformatics for RD research. Focus groups were conducted with 52 RD patients from 16 countries. Using a scenario-based approach, participants were encouraged to raise topics relevant to their own experiences, rather than these being determined by the researcher. Issues include wide data sharing, and consent for new uses of historic samples and for children. Focus group members are positively disposed towards research and towards allowing data and biosamples to be shared internationally. Expressions of trust and attitudes to risk are often affected by the nature of the RD which they have experience of, as well as by regulatory and cultural practices in their home country. Participants are concerned about data security and misuse. There is an acute recognition of the vulnerability inherent in having a RD and the possibility that open knowledge of this could lead to discrimination.

  1. Improving healthcare systems' disclosures of large-scale adverse events: a Department of Veterans Affairs leadership, policymaker, research and stakeholder partnership.

    Science.gov (United States)

    Elwy, A Rani; Bokhour, Barbara G; Maguire, Elizabeth M; Wagner, Todd H; Asch, Steven M; Gifford, Allen L; Gallagher, Thomas H; Durfee, Janet M; Martinello, Richard A; Schiffner, Susan; Jesse, Robert L

    2014-12-01

    The Department of Veterans Affairs (VA) mandates disclosure of large-scale adverse events to patients, even if risk of harm is not clearly present. Concerns about past disclosures warranted further examination of the impact of this policy. Through a collaborative partnership between VA leaders, policymakers, researchers and stakeholders, the objective was to empirically identify critical aspects of disclosure processes as a first step towards improving future disclosures. Semi-structured interviews were conducted with participants at nine VA facilities where recent disclosures took place. Ninety-seven stakeholders participated in the interviews: 38 employees, 28 leaders (from facilities, regions and national offices), 27 Veteran patients and family members, and four congressional staff members. Facility and regional leaders were interviewed by telephone, followed by a two-day site visit where employees, patients and family members were interviewed face-to-face. National leaders and congressional staff also completed telephone interviews. Interviews were analyzed using rapid qualitative assessment processes. Themes were mapped to the stages of the Crisis and Emergency Risk Communication model: pre-crisis, initial event, maintenance, resolution and evaluation. Many areas for improvement during disclosure were identified, such as preparing facilities better (pre-crisis), creating rapid communications, modifying disclosure language, addressing perceptions of harm, reducing complexity, and seeking assistance from others (initial event), managing communication with other stakeholders (maintenance), minimizing effects on staff and improving trust (resolution), and addressing facilities' needs (evaluation). Through the partnership, five recommendations to improve disclosures during each stage of communication have been widely disseminated throughout the VA using non-academic strategies. Some improvements have been made; other recommendations will be addressed through

  2. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement (''the Task'') and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  3. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays, digital sky surveys and long-duration, high-resolution numerical simulations using high-performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., the formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom-designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  4. Data Portal for the Library of Integrated Network-based Cellular Signatures (LINCS) program: integrated access to diverse large-scale cellular perturbation response data

    Science.gov (United States)

    Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay

    2018-01-01

    Abstract The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, LINCS Data Registry (LDR). LINCS data served on the LDP contains extensive metadata and curated annotations. We highlight the features of the LDP user interface that is designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462

  5. Survey and research on how large-scale technological development should be in the future; Kongo no ogata gijutsu kaihatsu no hoko ni tsuite no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1982-03-01

    Tasks to be subjected to research and development under the large-scale industrial technology research and development system are discussed. Mentioned in the fields of resources and foods are a submarine metal sulfide mining system, a submarine oil development system for ice-covered sea areas, an all-weather type useful vegetable automatic production system, etc. Mentioned in the fields of social development, security, and disaster prevention are a construction work robot, shelter system technologies, disaster control technologies in case of mega-scale disasters, etc. Mentioned in the fields of health, welfare, and education are biomimetics, biosystems, cancer diagnosis and treatment systems, etc. Mentioned in the field of commodity distribution, service, and software are a computer security system, an unmanned collection and distribution system, etc. Mentioned in the field of process conversion are aluminum refining, synzyme technologies for precise synthesis, etc. Mentioned in the field of data processing are optical computers, bioelectronics, etc. Various tasks are pointed out also in the fields of aviation, space, ocean, and machining. (NEDO)

  6. Do you kiss your mother with that mouth? An authentic large-scale undergraduate research experience in mapping the human oral microbiome.

    Science.gov (United States)

    Wang, Jack T H; Daly, Joshua N; Willner, Dana L; Patil, Jayee; Hall, Roy A; Schembri, Mark A; Tyson, Gene W; Hugenholtz, Philip

    2015-05-01

    Clinical microbiology testing is crucial for the diagnosis and treatment of community and hospital-acquired infections. Laboratory scientists need to utilize technical and problem-solving skills to select from a wide array of microbial identification techniques. The inquiry-driven laboratory training required to prepare microbiology graduates for this professional environment can be difficult to replicate within undergraduate curricula, especially in courses that accommodate large student cohorts. We aimed to improve undergraduate scientific training by engaging hundreds of introductory microbiology students in an Authentic Large-Scale Undergraduate Research Experience (ALURE). The ALURE aimed to characterize the microorganisms that reside in the healthy human oral cavity-the oral microbiome-by analyzing hundreds of samples obtained from student volunteers within the course. Students were able to choose from selective and differential culture media, Gram-staining, microscopy, as well as polymerase chain reaction (PCR) and 16S rRNA gene sequencing techniques, in order to collect, analyze, and interpret novel data to determine the collective oral microbiome of the student cohort. Pre- and postsurvey analysis of student learning gains across two iterations of the course (2012-2013) revealed significantly higher student confidence in laboratory skills following the completion of the ALURE (p < 0.05 using the Mann-Whitney U-test). Learning objectives on effective scientific communication were also met through effective student performance in laboratory reports describing the research outcomes of the project. The integration of undergraduate research in clinical microbiology has the capacity to deliver authentic research experiences and improve scientific training for large cohorts of undergraduate students.

  7. Spatial variation in foraging behaviour of a marine top predator (Phoca vitulina) determined by a large-scale satellite tagging program.

    Directory of Open Access Journals (Sweden)

    Ruth J Sharples

    The harbour seal (Phoca vitulina) is a widespread marine predator in Northern Hemisphere waters. British populations have been subject to rapid declines in recent years. Food supply or inter-specific competition may be implicated, but basic ecological data are lacking and there are few studies of harbour seal foraging distribution and habits. In this study, satellite tagging conducted at the major seal haul-outs around the British Isles showed both that seal movements were highly variable among individuals and that foraging strategy appears to be specialized within particular regions. We investigated whether these apparent differences could be explained by individual-level factors, by modelling measures of trip duration and distance travelled as a function of size, sex and body condition. However, these were not found to be good predictors of foraging trip duration or distance, which instead were best predicted by tagging region, time of year and inter-trip duration. We therefore propose that local habitat conditions, and the constraints they impose, are the major determinants of foraging movements. Specifically, the distance from suitable haul-out locations to profitable feeding grounds may dictate foraging strategy and behaviour. Accounting for proximity to productive foraging resources is likely to be an important component of understanding population processes. Despite more extensive offshore movements than expected, there was also marked fidelity to the local haul-out region, with limited connectivity between study regions. These empirical observations of regional exchange at short time scales demonstrate the value of large-scale electronic tagging programs for robust characterization of at-sea foraging behaviour at a wide spatial scale.

  8. Survey and research for the enhancement of large-scale technology development 2. How large-scale technology development should be in the future; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 2. Kongo no ogata gijutsu kaihatsu no arikata

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    A survey was conducted on the subject by interviewing people engaged in industrial technology development at the entrusted businesses participating in the large-scale industrial technology research and development system, and people of relevant experience or academic background involved in promoting the projects. The following needs for improvement are pointed out: that the competition principle, based for example on parallel development, be introduced; that research-on-research be practiced for effective selection of tasks; that midway evaluation be strengthened, since prior evaluation is difficult; that efforts be made to organize new industries utilizing the fruits of large-scale industrial technology so as to create markets rather than induce economic conflicts; and that the transfer of technologies from the private sector to the public sector be enhanced. Studies are also made of reviewing the research-conducting systems; utilizing the strength of private-sector research and development efforts; raising awareness of industrial property rights; and diffusing the large-scale project system. In this connection, problems are pointed out, requests are submitted, and remedial measures and suggestions are presented. (NEDO)

  9. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy-intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN's commitment to providing tangible answers to these questions was sealed at the first workshop on energy management for large-scale scientific infrastructures, held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large-scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  10. Large-scale hydropower system optimization using dynamic programming and object-oriented programming: the case of the Northeast China Power Grid.

    Science.gov (United States)

    Li, Ji-Qing; Zhang, Yu-Shan; Ji, Chang-Ming; Wang, Ai-Jing; Lund, Jay R

    2013-01-01

    This paper examines long-term optimal operation using dynamic programming for a large hydropower system of 10 reservoirs in Northeast China. Besides considering flow and hydraulic head, the optimization explicitly includes time-varying electricity market prices to maximize benefit. Two techniques are used to reduce the 'curse of dimensionality' of dynamic programming with many reservoirs. Discrete differential dynamic programming (DDDP) reduces the search space and computer memory needed. Object-oriented programming (OOP) and the ability to dynamically allocate and release memory with the C++ language greatly reduces the cumulative effect of computer memory for solving multi-dimensional dynamic programming models. The case study shows that the model can reduce the 'curse of dimensionality' and achieve satisfactory results.
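    As a rough illustration of the backward dynamic programming recursion underlying such reservoir models, one might write the following. This is a single reservoir with coarsely discretized storage and a toy head/benefit model; the authors' DDDP and 10-reservoir C++ formulation is far more elaborate, and every number below is a hypothetical placeholder.

```python
# Minimal backward DP sketch for one reservoir: choose a release at each
# stage to maximize revenue under time-varying prices. Storage is discrete;
# excess water spills; storage level crudely stands in for hydraulic head.
T = 4                           # planning stages
S_MAX = 4                       # maximum discrete storage level
inflow = [2, 1, 2, 1]           # hypothetical inflow per stage
price = [1.0, 1.5, 1.2, 2.0]    # hypothetical electricity price per stage

def benefit(release, storage, t):
    # Revenue grows with release and (as a head proxy) with storage.
    return price[t] * release * (1 + 0.1 * storage)

# value[t][s] = best benefit-to-go from stage t with storage s
value = [[0.0] * (S_MAX + 1) for _ in range(T + 1)]
policy = [[0] * (S_MAX + 1) for _ in range(T)]

for t in range(T - 1, -1, -1):
    for s in range(S_MAX + 1):
        best, best_r = float("-inf"), 0
        for r in range(0, s + inflow[t] + 1):        # feasible releases
            s_next = min(s + inflow[t] - r, S_MAX)   # water above capacity spills
            v = benefit(r, s, t) + value[t + 1][s_next]
            if v > best:
                best, best_r = v, r
        value[t][s] = best
        policy[t][s] = best_r

print("optimal value starting from an empty reservoir:", value[0][0])
```

    DDDP then shrinks the search by iterating this recursion only over a narrow "corridor" of storage levels around the current best trajectory, which is what makes many-reservoir problems tractable in memory and time.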

  11. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    Today, there is a lot of focus on the aesthetic potential of concrete surfaces, both globally and locally. World-famous architects such as Herzog & de Meuron, Zaha Hadid, Richard Meier and David Chipperfield challenge the exposure of concrete in their architecture. In Denmark, this trend can be seen in the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel, and in the black curved smooth concrete surfaces of Zaha Hadid’s Ordrupgård. Furthermore, one can point to initiatives such as “Synlig beton” (visible concrete), presented on the website www.synligbeton.dk, and Spæncom’s aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research-development project “Lasting large scale glazed concrete formwork,” which I am working on at DTU’s Department of Architectural Engineering, will be able to complement these. It is a project where I...

  12. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features that distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. While there are simplifying factors in these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities that lead to the formation of large eddies is also key to understanding the behavior of large-scale fires. Here, modeling tools, including RANS- and LES-based computational fluid dynamics codes, can be effectively exploited to investigate the fluid flow phenomena. The latter are well suited to representing the turbulent motions, but a number of challenges remain in their practical application. Massively parallel computational resources are likely to be necessary to address the complex coupled phenomena to the level of detail required.
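    The "low source Froude number" regime can be illustrated with the conventional nondimensional heat release rate Q* = Q̇ / (ρ∞ cp T∞ √(gD) D²), a standard measure of source Froude number for pool fires. The sketch below uses illustrative ambient properties and a nominal hydrocarbon heat release rate per unit area (assumed values, not from the review); it only demonstrates that Q* falls as the pool diameter grows.

```python
import math

def q_star(hrr_per_area_kw, diameter_m,
           rho=1.2, cp=1.0, t_amb=293.0, g=9.81):
    """Nondimensional heat release rate Q* for a circular pool.

    hrr_per_area_kw : heat release rate per unit area [kW/m^2] (illustrative)
    rho [kg/m^3], cp [kJ/(kg K)], t_amb [K], g [m/s^2] : ambient properties
    """
    area = math.pi * diameter_m ** 2 / 4.0
    q_total_kw = hrr_per_area_kw * area
    return q_total_kw / (rho * cp * t_amb
                         * math.sqrt(g * diameter_m) * diameter_m ** 2)

# At fixed heat release per unit area, Q* scales as D^(-1/2),
# so large pools burn in the low-Froude-number regime.
for d in (1.0, 5.0, 20.0):
    print(f"D = {d:5.1f} m  ->  Q* = {q_star(2000.0, d):.2f}")
```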

  13. Do You Kiss Your Mother with That Mouth? An Authentic Large-Scale Undergraduate Research Experience in Mapping the Human Oral Microbiome

    Directory of Open Access Journals (Sweden)

    Jack T.H. Wang

    2015-02-01

    Clinical microbiology testing is crucial for the diagnosis and treatment of community- and hospital-acquired infections. Laboratory scientists need to use technical and problem-solving skills to select from a wide array of microbial identification techniques. The inquiry-driven laboratory training required to prepare microbiology graduates for this professional environment can be difficult to replicate within undergraduate curricula, especially in courses that accommodate large student cohorts. We aimed to improve undergraduate scientific training by engaging hundreds of introductory microbiology students in an Authentic Large-Scale Undergraduate Research Experience (ALURE). The ALURE aimed to characterize the microorganisms that reside in the healthy human oral cavity (the oral microbiome) by analyzing hundreds of samples obtained from student volunteers within the course. Students were able to choose from selective and differential culture media, Gram staining, microscopy, as well as polymerase chain reaction (PCR) and 16S rRNA gene sequencing techniques, in order to collect, analyze, and interpret novel data to determine the collective oral microbiome of the student cohort. Pre- and post-survey analysis of student learning gains across two iterations of the course (2012–2013) revealed significantly higher student confidence in laboratory skills following the completion of the ALURE (p < 0.05, Mann-Whitney U-test). Learning objectives on effective scientific communication were also met through student performance in laboratory reports describing the research outcomes of the project. The integration of undergraduate research in clinical microbiology has the capacity to deliver authentic research experiences and improve scientific training for large cohorts of undergraduate students. Editor's Note: The ASM advocates that students must successfully demonstrate the ability to explain and practice safe laboratory techniques
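    The Mann-Whitney U statistic behind the pre/post comparison can be computed without any statistics package. A minimal sketch with tie-corrected average ranks follows; the survey scores are made-up Likert-style data, not the study's:

```python
def mann_whitney_u(a, b):
    """Return the Mann-Whitney U statistic (smaller of U_a, U_b),
    assigning average ranks to tied values."""
    combined = sorted((v, i) for i, v in enumerate(a + b))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):                       # walk each tie group
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg_rank = (i + j) / 2.0 + 1.0             # ranks are 1-based
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg_rank
        i = j + 1
    rank_sum_a = sum(ranks[:len(a)])
    u_a = rank_sum_a - len(a) * (len(a) + 1) / 2.0
    u_b = len(a) * len(b) - u_a
    return min(u_a, u_b)

pre = [2, 3, 3, 2, 4]    # made-up confidence scores before the course
post = [4, 5, 4, 5, 3]   # and after
print(mann_whitney_u(pre, post))
```

Significance is then read from tabulated critical U values, or from a normal approximation once the samples are large enough.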

  14. Reducing Plug and Process Loads for a Large Scale, Low Energy Office Building: NREL's Research Support Facility; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, C.; Pless, S.; Sheppy, M.; Torcellini, P.

    2011-02-01

    This paper documents the design and operational plug and process load energy efficiency measures needed to allow a large-scale office building to reach ultra-high-efficiency building goals. The appendices of this document contain a wealth of documentation pertaining to plug and process load design in the RSF, including a list of the equipment selected for use.

  15. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless function of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. It presents a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for analyzing the fault rate in LSI and VLSI circuits.
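    The article's specific model is not reproduced here, but the classic starting point for IC fault-rate analysis is the constant-failure-rate (exponential) model: a circuit of many elements in series survives only if every element does, so the rates add and R(t) = exp(-Σλᵢ·t). A hedged sketch with illustrative per-gate failure rates in FIT (failures per 10⁹ device-hours):

```python
import math

FIT = 1e-9  # 1 FIT = one failure per 10^9 device-hours

def series_reliability(failure_rates_per_hour, hours):
    """Reliability R(t) of a series system under the exponential model:
    the circuit works only if every element works, so rates sum."""
    total_rate = sum(failure_rates_per_hour)
    return math.exp(-total_rate * hours)

# Illustrative only: 50,000 gates at 0.01 FIT each (assumed numbers).
rates = [0.01 * FIT] * 50_000
one_year = 24 * 365.0
print(f"lambda_total = {sum(rates):.2e} /h")
print(f"R(1 year)    = {series_reliability(rates, one_year):.4f}")
print(f"MTBF         = {1.0 / sum(rates):.2e} h")
```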

  16. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale, 120 h⁻¹ Mpc, in the distribution of the visible matter of the universe is provided. The possibility of generating a periodic distribution with the characteristic scale 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  17. Test on large-scale seismic isolation elements, 2

    International Nuclear Information System (INIS)

    Mazda, T.; Moteki, M.; Ishida, K.; Shiojiri, H.; Fujita, T.

    1991-01-01

    The seismic isolation test program of the Central Research Institute of Electric Power Industry (CRIEPI), aimed at applying seismic isolation to Fast Breeder Reactor (FBR) plants, was started in 1987. In this test program, demonstration testing of seismic isolation elements was considered one of the most important research items. Facilities for testing seismic isolation elements were built at the Abiko Research Laboratory of CRIEPI. Various tests of large-scale seismic isolation elements have been conducted to date. Much important test data for developing design technical guidelines has been obtained. (author)

  18. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt, Patrick Hulin, Tim Leek, Fredrich Ulrich, Ryan Whelan (Authors listed...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag...low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  19. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public...arises, to reduce the spread in the LSGT 50% gap value.) The worst charges, such as those with the highest or lowest densities, the largest re-pressed...

  20. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  1. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  2. A framing approach to cross-disciplinary research collaboration: experiences from a large-scale research project on adaptive water management

    NARCIS (Netherlands)

    Dewulf, A.; Francois, G.; Pahl-Wostl, C.; Taillieu, T.

    2007-01-01

    Although cross-disciplinary research collaboration is necessary to achieve a better understanding of how human and natural systems are dynamically linked, it often turns out to be very difficult in practice. We outline a framing approach to cross-disciplinary research that focuses on the different

  3. Stereotype Threat, Inquiring about Test Takers' Race and Gender, and Performance on Low-Stakes Tests in a Large-Scale Assessment. Research Report. ETS RR-15-02

    Science.gov (United States)

    Stricker, Lawrence J.; Rock, Donald A.; Bridgeman, Brent

    2015-01-01

    This study explores stereotype threat on low-stakes tests used in a large-scale assessment, math and reading tests in the Education Longitudinal Study of 2002 (ELS). Issues identified in laboratory research (though not observed in studies of high-stakes tests) were assessed: whether inquiring about their race and gender is related to the…

  4. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the design stage. (orig.)

  5. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre-scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R&D.

  6. The innovative use of a large-scale industry biomedical consortium to research the genetic basis of drug induced serious adverse events.

    Science.gov (United States)

    Holden, Arthur L

    2007-01-01

    communities about issues related to severe adverse drug reactions and about issues related to the Consortium's research. The SAEC was launched in late September of 2007 with the scientific, technical and financial support of eight founding industrial research-funding members (i.e. Abbott, GSK, J & J, Novartis, Pfizer, Roche, Sanofi-Aventis and Wyeth). Additional members are being added as the consortium executes its phase one research program and develops its future plans. The Consortium will focus initially on two research projects: it will attempt to identify DNA variants associated with drug-induced liver disease and serious skin rashes [e.g. Stevens-Johnson syndrome ('SJS') and toxic epidermal necrolysis ('TEN')]. These two projects, while important in their own right, will also allow the SAEC to generate initial results in a reasonable time frame (owing to the availability of established case-control DNA sample collections) and build its core operations. Simultaneously with the Phase 1 research activities, the SAEC will plan follow-on, hypothesis-driven studies (post whole-genome association studies) for DILI and SJS and explore the feasibility of whole-genome research on additional SAEs. Our long-term goal is to discover and validate genetic markers predictive of the major drug-induced, rare SAEs and to make these available to all researchers and developers of clinical diagnostics at no cost, unencumbered by any intellectual property constraints. © 2007 Published by Elsevier Ltd.

  7. Research Program Overview

    Science.gov (United States)

    Pacific Earthquake Engineering Research Center (PEER) Research Program Overview: Tall Buildings Initiative, Transportation Research Program, Lifelines Program, Concrete Grand

  8. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  9. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  10. How large-scale technological development should be in the future. Survey and research on highly automated machines; Kongo no daikibo gijutsu kaihatsu no hoko ni tsuite. Kodo jidoka kikai ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1982-03-01

    A survey was conducted on highly automated machines such as industrial robots. The development task derived from a needs survey is the construction of a dangerous-work robot. It is pointed out that work in coal mines, tall buildings, industrial complexes, or nuclear power plants may involve large-scale accidents, and the task is how to perform such work in an automated way. The tasks selected for development after a seeds survey analysis are categorized into three groups of element technologies, namely, sensors and recognition functions; mechanisms and materials; and control and data processing. These element technologies are to be ultimately integrated into a robot for critical work, combining a highly intelligent robot main body with an integrated management system. Since humans will at times have to operate such a robot directly under delicate conditions and share the burden of judgement and thinking, it is also necessary to develop technologies to solve problems of man-robot engineering. It is proposed that a dangerous-work robot research and development program be established before development is started. (NEDO)

  11. Programmed Nanomaterial Assemblies in Large Scales: Applications of Synthetic and Genetically- Engineered Peptides to Bridge Nano-Assemblies and Macro-Assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Matsui, Hiroshi

    2014-09-09

    Work is reported in these areas: large-scale and reconfigurable 3D structures of precise nanoparticle assemblies in self-assembled collagen peptide grids; binary QD-Au NP 3D superlattices assembled with collagen-like peptides and energy transfer between QD and Au NP in 3D peptide frameworks; catalytic peptides discovered by a new hydrogel-based combinatorial phage display approach and their enzyme-mimicking 2D assembly; new autonomous motors of metal-organic frameworks (MOFs) powered by reorganization of self-assembled peptides at interfaces; biomimetic assembly of proteins into microcapsules on oil-in-water droplets with structural reinforcement via biomolecular recognition-based cross-linking of surface peptides; and biomimetic fabrication of strong freestanding genetically-engineered collagen peptide films reinforced by quantum dot joints. We gained broad knowledge about biomimetic material assembly from nanoscale to microscale ranges by co-assembling peptides and NPs via biomolecular recognition. We discovered that: genetically-engineered collagen-like peptides can be self-assembled with Au NPs to generate 3D superlattices in large volumes (> μm³); the assembly of the 3D peptide-Au NP superstructures is dynamic, and the interparticle distance changes with assembly time as the reconfiguration of the structure is triggered by pH change; QDs/NPs can be assembled with the peptide frameworks to generate 3D superlattices, and these QDs/NPs can be electronically coupled for efficient energy transfer; the controlled assembly of catalytic peptides mimicking the catalytic pocket of enzymes can catalyze chemical reactions with high selectivity; and, for bacteria-mimicking swimmer fabrication, peptide-MOF superlattices can power translational and propellant motions by the reconfiguration of the peptide assembly at the MOF-liquid interface.

  12. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

    To establish planar biomimetic membranes across large-scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro-structured 8 x 8 aperture partition arrays with average aperture diameters of 301 ± 5 μm. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnologically and physiologically relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further development of sensitive biosensor assays...

  13. Large scale nuclear structure studies

    International Nuclear Information System (INIS)

    Faessler, A.

    1985-01-01

    Results of large-scale nuclear structure studies are reported. The starting point is the Hartree-Fock-Bogoliubov solution with angular momentum and proton and neutron number projection after variation. For number- and spin-projected two-quasiparticle excitations with realistic forces, this model yields results in sd-shell nuclei as good as those of 'exact' shell-model calculations. Here the authors present results for the pf-shell nucleus ⁴⁶Ti and for the A=130 mass region, where they studied 58 different nuclei with the same single-particle energies and the same effective force derived from a meson exchange potential. They carried out a Hartree-Fock-Bogoliubov variation after mean-field projection in realistic model spaces. In this way, they determine for each yrast state the optimal mean Hartree-Fock-Bogoliubov field. They apply this method to ¹³⁰Ce and ¹²⁸Ba using the same effective nucleon-nucleon interaction. (Auth.)

  14. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  15. A large-scale mutant panel in wheat developed using heavy-ion beam mutagenesis and its application to genetic research

    Energy Technology Data Exchange (ETDEWEB)

    Murai, Koji, E-mail: murai@fpu.ac.jp [Department of Bioscience, Fukui Prefectural University, 4-1-1 Matsuoka-Kenjojima, Eiheiji-cho, Yoshida-gun, Fukui 910-1195 (Japan); Nishiura, Aiko [Department of Bioscience, Fukui Prefectural University, 4-1-1 Matsuoka-Kenjojima, Eiheiji-cho, Yoshida-gun, Fukui 910-1195 (Japan); Kazama, Yusuke [RIKEN, Innovation Center, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Abe, Tomoko [RIKEN, Innovation Center, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); RIKEN, Nishina Center, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan)

    2013-11-01

    Mutation analysis is a powerful tool for studying gene function. Heavy-ion beam mutagenesis is a comparatively new approach to inducing mutations in plants and is particularly efficient because of its high linear energy transfer (LET). High LET radiation induces a higher rate of DNA double-strand breaks than other mutagenic methods. Over the last 12 years, we have constructed a large-scale mutant panel in diploid einkorn wheat (Triticum monococcum) using heavy-ion beam mutagenesis. Einkorn wheat seeds were exposed to a heavy-ion beam and then sown in the field. Selfed seeds from each spike of M₁ plants were used to generate M₂ lines. Every year, we obtained approximately 1000 M₂ lines and eventually developed a mutant panel with 10,000 M₂ lines in total. This mutant panel is being systematically screened for mutations affecting reproductive growth, and especially for flowering-time mutants. To date, we have identified several flowering-time mutants of great interest: non-flowering mutants (mvp: maintained vegetative phase), late-flowering mutants, and early-flowering mutants. These novel mutations will be of value for investigations of the genetic mechanism of flowering in wheat.

  16. Earthquakes in Action: Incorporating Multimedia, Internet Resources, Large-scale Seismic Data, and 3-D Visualizations into Innovative Activities and Research Projects for Today's High School Students

    Science.gov (United States)

    Smith-Konter, B.; Jacobs, A.; Lawrence, K.; Kilb, D.

    2006-12-01

    The most effective means of communicating science to today's "high-tech" students is through the use of visually attractive and animated lessons, hands-on activities, and interactive Internet-based exercises. To address these needs, we have developed Earthquakes in Action, a summer high school enrichment course offered through the California State Summer School for Mathematics and Science (COSMOS) Program at the University of California, San Diego. The summer course consists of classroom lectures, lab experiments, and a final research project designed to foster geophysical innovations, technological inquiries, and effective scientific communication (http://topex.ucsd.edu/cosmos/earthquakes). Course content includes lessons on plate tectonics, seismic wave behavior, seismometer construction, fault characteristics, California seismicity, global seismic hazards, earthquake stress triggering, tsunami generation, and geodetic measurements of the Earth's crust. Students are introduced to these topics through lectures-made-fun using a range of multimedia, including computer animations, videos, and interactive 3-D visualizations. These lessons are further reinforced through both hands-on lab experiments and computer-based exercises. Lab experiments include building hand-held seismometers, simulating the frictional behavior of faults using bricks and sandpaper, simulating tsunami generation in a mini-wave pool, and using the Internet to collect global earthquake data on a daily basis and map earthquake locations using a large classroom map. Students also use Internet resources like Google Earth and UNAVCO/EarthScope's Jules Verne Voyager Jr. interactive mapping tool to study Earth Science on a global scale. All computer-based exercises and experiments developed for Earthquakes in Action have been distributed to teachers participating in the 2006 Earthquake Education Workshop, hosted by the Visualization Center at Scripps Institution of Oceanography (http

  17. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province, who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890-foot-long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a powerhouse with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion, of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned.
It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  18. The growing importance of staple foods and condiments used as ingredients in the food industry and implications for large-scale food fortification programs in Southeast Asia.

    Science.gov (United States)

    Spohrer, Rebecca; Larson, Melanie; Maurin, Clémence; Laillou, Arnaud; Capanzana, Mario; Garrett, Greg S

    2013-06-01

    Food fortification is a viable strategy to improve the nutritional status of populations. In Southeast Asia, recent growth and consolidation of the food industry provides an opportunity to explore whether certain widely consumed processed foods could contribute to micronutrient status if they are made with adequately fortified staples and condiments. To estimate the potential contribution certain processed foods can make to micronutrient intake in Southeast Asia if they are made with fortified staples and condiments; e.g., via the inclusion of iodized salt in various processed foods in the Philippines, fortified wheat flour in instant noodles in Indonesia, and fortified vegetable oil in biscuits in Vietnam. For Indonesia, the Philippines, and Vietnam, a review of consumption trends, relevant policies, and industry practices was conducted using publicly available sources, food industry market data and research reports, and oral communication. These informed the estimates of the proportion of the Recommended Nutrient Intake (RNI) that could be delivered via select processed foods. In the Philippines, Indonesia, and Vietnam, the processed food industry is not always required to use fortified staples and condiments. In the Philippines, dried salted fish with iodized salt would provide 64% to 85% of the iodine RNI for women of reproductive age and 107% to 141% of the iodine RNI for children 1 to 6 years of age. In Indonesia, a 75-g pack of instant noodles (a highly consumed product) with fortified wheat flour would provide 45% to 51% of the iron RNI for children 4 to 6 years of age and 10% to 11% of the iron RNI for women of reproductive age. In Vietnam, biscuits containing vegetable oil are increasingly popular. One 35-g biscuit serving with fortified vegetable oil would provide 13% to 18% of the vitamin A RNI for children 4 to 6 years of age and 12% to 17% of the vitamin A RNI for women of reproductive age.
Ensuring that fortified staples and condiments such as flour

  19. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters on the scale of thousands of processors, to be used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  20. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large-scale cross-hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones and to obtain hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous; this heterogeneity is not smoothed out even over scales of hundreds of metres. Results of the interpretation validate the hypothesis of the major fracture zones, A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie within a range of two to three orders of magnitude. Test design did not allow fracture zones to be tested individually; this could be improved by testing the high hydraulic conductivity regions specifically. The Piezomac and single-hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response, either because there is no hydraulic connection, or because a connection exists but no response is seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  1. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    Computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in the modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  2. Large-scale galaxy bias

    Science.gov (United States)

    Desjacques, Vincent; Jeong, Donghui; Schmidt, Fabian

    2018-02-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy statistics. We then review the excursion-set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.
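The perturbative bias expansion described in the abstract can be sketched at its lowest orders (a schematic form following common conventions; higher-order, higher-derivative, and stochastic terms are omitted):

```latex
\delta_g(\mathbf{x},\tau) \;=\; \sum_O b_O(\tau)\, O(\mathbf{x},\tau)
\;=\; b_1\,\delta \;+\; \tfrac{1}{2} b_2\,\delta^2 \;+\; b_{K^2}\,(K_{ij})^2 \;+\; \dots
```

Here $\delta$ is the matter overdensity, $K_{ij}$ the tidal field, and the coefficients $b_O$ are the bias parameters that absorb the complicated physics of galaxy formation.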

  3. Large-scale galaxy bias

    Science.gov (United States)

    Jeong, Donghui; Desjacques, Vincent; Schmidt, Fabian

    2018-01-01

    Here, we briefly introduce the key results of the recent review (arXiv:1611.09787), whose abstract is as follows. This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy (or halo) statistics. We then review the excursion set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.

  4. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  5. Reaching mothers and babies with early postnatal home visits: the implementation realities of achieving high coverage in large-scale programs.

    Directory of Open Access Journals (Sweden)

    Deborah Sitrin

    Full Text Available BACKGROUND: Nearly half of births in low-income countries occur without a skilled attendant, and even fewer mothers and babies have postnatal contact with providers who can deliver preventive or curative services that save lives. Community-based maternal and newborn care programs with postnatal home visits have been tested in Bangladesh, Malawi, and Nepal. This paper examines coverage and content of home visits in pilot areas and factors associated with receipt of postnatal visits. METHODS: Using data from cross-sectional surveys of women with live births (Bangladesh: 398, Malawi: 900, Nepal: 615), generalized linear models were used to assess the strength of association between three factors - receipt of home visits during pregnancy, birth place, birth notification - and receipt of home visits within three days after birth. Meta-analytic techniques were used to generate pooled relative risks for each factor adjusting for other independent variables, maternal age, and education. FINDINGS: The proportion of mothers and newborns receiving home visits within three days after birth was 57% in Bangladesh, 11% in Malawi, and 50% in Nepal. Mothers and newborns were more likely to receive a postnatal home visit within three days if the mother received at least one home visit during pregnancy (OR 2.18, CI 1.46-3.25), the birth occurred outside a facility (OR 1.48, CI 1.28-1.73), and the mother reported a CHW was notified of the birth (OR 2.66, CI 1.40-5.08). Checking the cord was the most frequently reported action; most mothers reported at least one action for newborns. CONCLUSIONS: Reaching mothers and babies with home visits during pregnancy and within three days after birth is achievable using existing community health systems if workers are available; linked to communities; and receive training, supplies, and supervision. In all settings, programs must evaluate what community delivery systems can handle and how to best utilize them to improve postnatal care
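The meta-analytic pooling mentioned above can be illustrated with a minimal fixed-effect, inverse-variance sketch. The country-level estimates below are hypothetical placeholders, not the study's data:

```python
import math

def pool_relative_risks(rrs, cis):
    """Fixed-effect inverse-variance pooling of relative risks.

    rrs: country-specific relative risk estimates
    cis: matching (lower, upper) 95% confidence intervals
    Returns the pooled RR and its 95% CI.
    """
    log_rrs, weights = [], []
    for rr, (lo, hi) in zip(rrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from CI width
        log_rrs.append(math.log(rr))
        weights.append(1.0 / se ** 2)                    # inverse-variance weight
    pooled = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled),
            (math.exp(pooled - 1.96 * se_pooled),
             math.exp(pooled + 1.96 * se_pooled)))

# Hypothetical estimates for three countries (illustration only)
rr, (lo, hi) = pool_relative_risks([2.0, 1.5, 2.5],
                                   [(1.2, 3.3), (0.9, 2.5), (1.4, 4.5)])
```

The pooled estimate always lies within the range of the inputs, and its confidence interval narrows as more studies contribute weight.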

  6. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  7. FY 1998 Report on development of large-scale wind power generation systems. Part 1. Operational research on large-scale wind power generation systems; 1998 nendo ogata furyoku hatsuden system kaihatsu seika hokokusho. 1. Ogata furyoku hatsuden system no unten kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The research and development project is implemented for large-scale wind power generation systems, and the FY 1998 results are reported. In FY 1998, a slip-property-variable generator is mounted on the wind power generator to conduct various types of demonstration tests. The reliability validation tests include microstructure examinations, fatigue tests and fatigue strength tests to predict residual strength in the blade. It is confirmed that the blade has a sufficient residual strength. The performance validation tests include continuous measurement of power outputs and wind velocities, and analysis of the output fluctuations. The power output performance during winter, when the west wind prevails, is higher than designed. In the tests for evaluating the characteristics of the system on which the slip-property-variable generator is mounted, the output smoothing effect is confirmed in a range beyond the rated output. The wind power generation system is continuously operated, to accumulate the operational data for, e.g., capacity factor, operating time rate, and system failure status. The FY 1998 results are 920,000 kWh as the output and 21% as the capacity factor. Other items investigated include aerodynamic noise reduction countermeasures, fatigue life of the wind turbine blades, economics of wind power generation, and dismantling and reuse of the wind turbines. (NEDO)
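The reported annual output and capacity factor are linked by the standard capacity-factor definition; the roughly 500 kW rated power below is a back-calculation from the abstract's figures, not a value stated in the report:

```python
HOURS_PER_YEAR = 8760

def capacity_factor(energy_kwh, rated_kw, hours=HOURS_PER_YEAR):
    """Fraction of the maximum possible annual output actually produced."""
    return energy_kwh / (rated_kw * hours)

# FY 1998: 920,000 kWh at a 21% capacity factor implies roughly a 500 kW machine
implied_rated_kw = 920_000 / (0.21 * HOURS_PER_YEAR)
```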

  8. Novel insights in the fecal egg count reduction test for monitoring drug efficacy against soil-transmitted helminths in large-scale treatment programs.

    Directory of Open Access Journals (Sweden)

    Bruno Levecke

    2011-12-01

    Full Text Available The fecal egg count reduction test (FECRT) is recommended to monitor drug efficacy against soil-transmitted helminths (STHs) in public health. However, the impact of factors inherent to study design (sample size and detection limit of the fecal egg count (FEC) method) and host-parasite interactions (mean baseline FEC and aggregation of FEC across the host population) on the reliability of FECRT is poorly understood. A simulation study was performed in which FECRT was assessed under varying conditions of the aforementioned factors. Classification trees were built to explore critical values for these factors required to obtain conclusive FECRT results. The outcome of this analysis was subsequently validated on five efficacy trials across Africa, Asia, and Latin America. Unsatisfactory (<85.0%) sensitivity and specificity results to detect reduced efficacy were found if sample sizes were small (<10) or if sample sizes were moderate (10-49) combined with highly aggregated FEC (k<0.25). FECRT remained inconclusive under any evaluated condition for drug efficacies ranging from 87.5% to 92.5% for a reduced-efficacy threshold of 90% and from 92.5% to 97.5% for a threshold of 95%. The most discriminatory study design required 200 subjects independent of STH status (including subjects who are not excreting eggs). For this sample size, the detection limit of the FEC method and the level of aggregation of the FEC did not affect the interpretation of the FECRT. Only for a threshold of 90% did mean baseline FEC <150 eggs per gram of stool lead to a reduced discriminatory power. This study confirms that the interpretation of FECRT is affected by a complex interplay of factors inherent to both study design and host-parasite interactions. The results also highlight that revision of the current World Health Organization guidelines to monitor drug efficacy is indicated. We, therefore, propose novel guidelines to support future monitoring programs.
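The quantities in this simulation study can be sketched as follows: the FECRT itself is a percentage reduction in mean egg counts, and aggregated counts across a host population are commonly modeled as a negative binomial (gamma-Poisson) mixture with aggregation parameter k. This is an illustrative sketch of those definitions, not the paper's simulation code; the detection limit of 24 eggs per gram is an assumed example value:

```python
import math
import random

def fecrt(pre_counts, post_counts):
    """Fecal egg count reduction: percentage drop in arithmetic mean FEC."""
    pre_mean = sum(pre_counts) / len(pre_counts)
    post_mean = sum(post_counts) / len(post_counts)
    return 100.0 * (1.0 - post_mean / pre_mean)

def sample_fec(mean, k, detection_limit=24, rng=random):
    """One aggregated FEC drawn from a gamma-Poisson (negative binomial) mixture.

    Small k means highly aggregated counts across the host population.
    Counts are truncated to multiples of the FEC method's detection limit.
    """
    lam = rng.gammavariate(k, mean / k)  # this host's mean egg burden
    # Poisson draw by Knuth's inversion method (adequate for moderate lam)
    limit, prod, n = math.exp(-lam), 1.0, -1
    while prod > limit:
        n += 1
        prod *= rng.random()
    return (n // detection_limit) * detection_limit
```

For example, pre-treatment counts averaging 200 eggs per gram falling to 20 after treatment give a FECRT of 90%, exactly the reduced-efficacy threshold discussed above.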

  9. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  10. Research on the negative externality of breeding wastewater in rural construction—Taking the large-scale breeding project of Tongnan in Chongqing as an example

    Science.gov (United States)

    Yin, Mingqiang; Hu, Wen; Wu, Haonan

    2018-03-01

    The country vigorously promotes the collective, large-scale operation of traditional livestock breeding, which brings substantial negative externalities in the form of environmental pollution. Environmental pollution causes a divergence between social cost and private cost: the cost of pollution is not borne by the private enterprises that cause it, but is imposed on others as an external cost. This paper takes a Chongqing pig farm as an example and selects the COD and TN indicators in the wastewater as the focus of the analysis. We explore the equilibrium point between the private cost to Party A and the public welfare cost brought by environmental pollution, and test the rationality and accuracy of the existing norms. Building on existing research, the paper concludes by exploring a better solution based on Pigouvian principles and the delimitation of property rights.

  11. Reducing Data Center Loads for a Large-Scale, Low-Energy Office Building: NREL's Research Support Facility (Book)

    Energy Technology Data Exchange (ETDEWEB)

    Sheppy, M.; Lobato, C.; Van Geet, O.; Pless, S.; Donovan, K.; Powers, C.

    2011-12-01

    This publication details the design, implementation strategies, and continuous performance monitoring of NREL's Research Support Facility data center. Data centers are energy-intensive spaces that facilitate the transmission, receipt, processing, and storage of digital data. These spaces require redundancies in power and storage, as well as infrastructure, to cool computing equipment and manage the resulting waste heat (Tschudi, Xu, Sartor, and Stein, 2003). Data center spaces can consume more than 100 times the energy of standard office spaces (VanGeet 2011). The U.S. Environmental Protection Agency (EPA) reported that data centers used 61 billion kilowatt-hours (kWh) in 2006, which was 1.5% of the total electricity consumption in the U.S. (U.S. EPA, 2007). Worldwide, data centers now consume more energy annually than Sweden (New York Times, 2009). Given their high energy consumption and conventional operation practices, there is potential for huge energy savings in data centers. The National Renewable Energy Laboratory (NREL) is world renowned for its commitment to green building construction. In June 2010, the laboratory finished construction of a 220,000-square-foot (ft²), LEED Platinum Research Support Facility (RSF), which included a 1,900-ft² data center. The RSF will expand to 360,000 ft² with the opening of an additional wing in December 2011. The project's request for proposals (RFP) set a whole-building demand-side energy use requirement of a nominal 35 kBtu/ft² per year. On-site renewable energy generation will offset the annual energy consumption. To support the RSF's energy goals, NREL's new data center was designed to minimize its energy footprint without compromising service quality. Several implementation challenges emerged during the design, construction, and first 11 months of operation of the RSF data center. This document highlights these challenges and describes in detail how NREL successfully
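As a back-of-envelope check, the whole-building target quoted above can be converted into an annual electricity budget. This is a unit conversion only; how the budget is split between the data center and office space is not specified in the abstract:

```python
KBTU_PER_KWH = 3.412  # 1 kWh = 3,412 Btu

def annual_budget_kwh(eui_kbtu_per_ft2, floor_area_ft2):
    """Convert an energy-use-intensity target into an annual electricity budget."""
    return eui_kbtu_per_ft2 * floor_area_ft2 / KBTU_PER_KWH

# Nominal 35 kBtu/ft2 per year over the expanded 360,000 ft2 building
budget = annual_budget_kwh(35, 360_000)  # roughly 3.7 million kWh per year
```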

  12. Coral larvae for restoration and research: a large-scale method for rearing Acropora millepora larvae, inducing settlement, and establishing symbiosis

    Directory of Open Access Journals (Sweden)

    F. Joseph Pollock

    2017-09-01

    Full Text Available Here we describe an efficient and effective technique for rearing sexually-derived coral propagules from spawning through larval settlement and symbiont uptake with minimal impact on natural coral populations. We sought to maximize larval survival while minimizing expense and daily husbandry maintenance by experimentally determining optimized conditions and protocols for gamete fertilization, larval cultivation, induction of larval settlement by crustose coralline algae, and inoculation of newly settled juveniles with their dinoflagellate symbiont Symbiodinium. Larval rearing densities at or below 0.2 larvae mL⁻¹ were found to maximize larval survival and settlement success in culture tanks while minimizing maintenance effort. Induction of larval settlement via the addition of a ground mixture of diverse crustose coralline algae (CCA) is recommended, given the challenging nature of in situ CCA identification and our finding that non-settlement-inducing CCA assemblages do not inhibit larval settlement if suitable assemblages are present. Although order of magnitude differences in infectivity were found between common Great Barrier Reef Symbiodinium clades C and D, no significant differences in Symbiodinium uptake were observed between laboratory-cultured and wild-harvested symbionts in each case. The technique presented here for Acropora millepora can be adapted for research and restoration efforts in a wide range of broadcast spawning coral species.
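The recommended stocking density translates directly into a maximum larval load per tank. The 500 L tank volume below is a hypothetical example, not a value from the study:

```python
def max_larvae(tank_volume_liters, density_per_ml=0.2):
    """Maximum larvae for a culture tank at the recommended density (larvae per mL)."""
    return tank_volume_liters * 1000 * density_per_ml  # 1 L = 1000 mL

# A hypothetical 500 L rearing tank holds at most 100,000 larvae at 0.2 larvae/mL
n_max = max_larvae(500)
```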

  13. Large scale gas chromatographic demonstration system for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Cheh, C.H.

    1988-01-01

    A large scale demonstration system was designed for a throughput of 3 mol/day of an equimolar mixture of H, D, and T. The demonstration system was assembled and an experimental program was carried out. This project was funded by Kernforschungszentrum Karlsruhe, Canadian Fusion Fuel Technology Projects and Ontario Hydro Research Division. Several major design innovations were successfully implemented in the demonstration system and are discussed in detail. Many experiments were carried out in the demonstration system to study its performance in separating hydrogen isotopes at high throughput. Various temperature programming schemes were tested, heart-cutting operation was evaluated, and very large (up to 138 NL/injection) samples were separated in the system. The results of the experiments showed that the specially designed column performed well as a chromatographic column, and good separation could be achieved even when a 138 NL sample was injected

  14. Wellbore Completion Systems Containment Breach Solution Experiments at a Large Scale Underground Research Laboratory : Sealant placement & scale-up from Lab to Field

    Science.gov (United States)

    Goodman, H.

    2017-12-01

    This investigation seeks to develop sealant technology that can restore containment to completed wells that suffer CO2 gas leakages currently untreatable using conventional technologies. Experimentation is performed at the Mont Terri Underground Research Laboratory (MT-URL) located in NW Switzerland. The laboratory affords investigators an intermediate-scale test site that bridges the gap between the laboratory bench and full field-scale conditions. The project focus is the development of CO2 leakage remediation capability using sealant technology. The experimental concept includes design and installation of a field-scale completion package designed to mimic well-system heating-cooling conditions that may result in the development of micro-annuli detachments between the casing-cement-formation boundaries (Figure 1). Of particular interest is the testing of novel sealants that can be injected into relatively narrow micro-annuli flow paths of less than 120 microns in aperture. Per a special report on CO2 storage submitted to the IPCC[1], active injection wells, along with inactive wells that have been abandoned, are identified as one of the most probable sources of leakage pathways for CO2 escape to the surface. Pressure leakage in injection-well and completion architectures often originates from tensile cracking under temperature cycles, micro-annulus formation by casing contraction (differential casing-to-cement-sheath movement), and cement-sheath channel development. This discussion summarizes the experimental capability and sealant testing results. The experiment concludes with overcoring of the entire mock-completion test site to assess sealant performance in 2018. [1] IPCC Special Report on Carbon Dioxide Capture and Storage (September 2005), section 5.7.2 Processes and pathways for release of CO2 from geological storage sites, page 244

  15. The NIHR Collaborations for Leadership in Applied Health Research and Care (CLAHRC) for Greater Manchester: combining empirical, theoretical and experiential evidence to design and evaluate a large-scale implementation strategy.

    Science.gov (United States)

    Harvey, Gill; Fitzgerald, Louise; Fielden, Sandra; McBride, Anne; Waterman, Heather; Bamford, David; Kislov, Roman; Boaden, Ruth

    2011-08-23

    In response to policy recommendations, nine National Institute for Health Research (NIHR) Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) were established in England in 2008, aiming to create closer working between the health service and higher education and narrow the gap between research and its implementation in practice. The Greater Manchester (GM) CLAHRC is a partnership between the University of Manchester and twenty National Health Service (NHS) trusts, with a five-year mission to improve healthcare and reduce health inequalities for people with cardiovascular conditions. This paper outlines the GM CLAHRC approach to designing and evaluating a large-scale, evidence- and theory-informed, context-sensitive implementation programme. The paper makes a case for embedding evaluation within the design of the implementation strategy. Empirical, theoretical, and experiential evidence relating to implementation science and methods has been synthesised to formulate eight core principles of the GM CLAHRC implementation strategy, recognising the multi-faceted nature of evidence, the complexity of the implementation process, and the corresponding need to apply approaches that are situationally relevant, responsive, flexible, and collaborative. In turn, these core principles inform the selection of four interrelated building blocks upon which the GM CLAHRC approach to implementation is founded. These determine the organizational processes, structures, and roles utilised by specific GM CLAHRC implementation projects, as well as the approach to researching implementation, and comprise: the Promoting Action on Research Implementation in Health Services (PARIHS) framework; a modified version of the Model for Improvement; multiprofessional teams with designated roles to lead, facilitate, and support the implementation process; and embedded evaluation and learning. 
Designing and evaluating a large-scale implementation strategy that can cope with and

  16. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have researched the development of the Internet of Things and networked physical systems. However, few have described the large-scale communication architecture of the IOT in detail. In fact, the non-uniformity of technologies between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  17. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term, large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed: which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view? Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  18. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity through a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  19. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity, and size of large-scale networks have increased. The best-known example of a large-scale network is the Internet; more recent examples are the data centers of cloud environments. Managing such networks involves several tasks, such as traffic monitoring, security, and performance optimization, which together constitute a major burden for the network administrator. This research report studies different protocols, i.e., conventional protocols like the Simple Network Management Protocol and the newer Gossip-bas...

  20. Large Scale Community Detection Using a Small World Model

    Directory of Open Access Journals (Sweden)

    Ranjan Kumar Behera

    2017-11-01

    Full Text Available In a social network, small or large communities within the network play a major role in deciding the functionalities of the network. Despite diverse definitions, communities in a network may be defined as groups of nodes that are more densely connected to each other than to nodes outside the group. Revealing such hidden communities is a challenging research problem. Real-world social networks follow the small-world phenomenon, which indicates that any two social entities are reachable from each other in a small number of steps. In this paper, nodes are mapped into communities based on random walks in the network. However, uncovering communities in large-scale networks is a challenging task due to the unprecedented growth in the size of social networks. A good number of community detection algorithms based on random walks exist in the literature, but when large-scale social networks are considered, these algorithms take considerably longer. In this work, with the objective of improving efficiency, a parallel programming framework, Map-Reduce, has been used to uncover the hidden communities in social networks. The proposed approach has been compared with several standard community detection algorithms on both synthetic and real-world datasets to examine its performance, and it is observed that the proposed algorithm is more efficient than the existing ones.
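
    The core intuition above, that short random walks tend to stay trapped inside a node's community, can be sketched in a few lines. The toy graph, walk length, and walk count below are illustrative assumptions, not the paper's parameters or datasets:

```python
import random

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by the single edge 2-3.
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
         3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}

def visit_counts(graph, start, walk_len=3, n_walks=500, seed=0):
    """Count how often short random walks from `start` visit each node."""
    rng = random.Random(seed)
    counts = {v: 0 for v in graph}
    for _ in range(n_walks):
        node = start
        for _ in range(walk_len):
            node = rng.choice(graph[node])
            counts[node] += 1
    return counts

counts = visit_counts(graph, start=0)
inside = sum(counts[v] for v in (0, 1, 2))   # node 0's own triangle
outside = sum(counts[v] for v in (3, 4, 5))  # the other triangle
print(inside > outside)  # walks concentrate inside node 0's community
```

    A Map-Reduce parallelization along the lines described in the paper would distribute the per-node walks across mappers and aggregate the visit counts in reducers.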

  1. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  2. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
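
    The decomposition described above, where an aggregator coordinates many small local subproblems, can be sketched with a price-based (dual) decomposition. The quadratic unit costs and the bisection coordination below are illustrative assumptions, not the paper's actual formulation:

```python
# Each unit i has an assumed local cost a_i * p_i**2; minimizing
# a_i*p_i**2 - lam*p_i gives the local response p_i = lam / (2*a_i).
a = [1.0, 2.0, 4.0]          # illustrative unit cost coefficients
demand = 7.0                 # total power the aggregator must balance

def local_response(lam):
    """Each unit solves its own small subproblem for a given price lam."""
    return [lam / (2 * ai) for ai in a]

# Aggregator: bisect on the price lam until total response meets demand.
lo, hi = 0.0, 100.0
for _ in range(60):
    lam = (lo + hi) / 2
    if sum(local_response(lam)) < demand:
        lo = lam
    else:
        hi = lam

p = local_response(lam)
print([round(x, 3) for x in p], round(sum(p), 3))
```

    In the distributed setting, only the scalar price signal travels between the aggregator and the units, so each unit's subproblem stays small and private.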

  3. A Novel Large-scale Mentoring Program for Medical Students based on a Quantitative and Qualitative Needs Analysis [Aufbau eines innovativen Mentorenprogramms für eine große Zahl Medizinstudierender nach quantitativer und qualitativer Bedarfsanalyse

    Directory of Open Access Journals (Sweden)

    von der Borch, Philip

    2011-05-01

    Full Text Available [english] Purpose: Mentoring plays an important role in students' performance and careers. The authors of this study assessed the need for mentoring among medical students and established a novel large-scale mentoring program at Ludwig-Maximilians-University (LMU) Munich School of Medicine. Methods: Needs assessment was conducted using a survey distributed to all students at the medical school (n=578 of 4,109 students, return rate 14.1%). In addition, the authors held focus groups with selected medical students (n=24) and faculty physicians (n=22). All students signing up for the individual mentoring completed a survey addressing their expectations (n=534). Results: Needs assessment revealed that 83% of medical students expressed overall satisfaction with the teaching at LMU. In contrast, only 36.5% were satisfied with how the faculty supports their individual professional development, and 86% of students voiced a desire for more personal and professional support. When asked to define the role of a mentor, 55.6% "very much" wanted their mentors to act as counselors, arrange contacts for them (36.4%), and provide ideas for professional development (28.1%). Topics that future mentees "very much" wished to discuss included research (56.6%), final year electives (55.8%), and experiences abroad (45.5%). Conclusions: Based on the strong desire for mentoring among medical students, the authors developed a novel two-tiered system that introduces one-to-one mentoring for students in their clinical years and offers society-based peer mentoring for pre-clinical students. One year after launching the program, more than 300 clinical students had experienced one-to-one mentoring, and 1,503 students and physicians were involved in peer mentoring societies. [german, translated] Background: Mentoring is an important support in students' careers. In the present study, we document the mentoring needs of medical students at the Faculty of Medicine of the

  4. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations.

    Science.gov (United States)

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts.

  5. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations

    Science.gov (United States)

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts. PMID:25993414

  6. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size, such as in the number of degrees of freedom, the advent of computer graphics has made possible very realistic representation of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  7. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; the research and development of the new electronic custody transfer method

  8. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  9. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    Science.gov (United States)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending the same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB differs substantially from SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether the ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  10. Energy research program 83

    International Nuclear Information System (INIS)

    1983-01-01

    The energy research program 83 (EFP-83) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and the Ministry of Energy's programs EFP-80, EFP-81 and EFP-82. The new program is a continuation of the activities in the period 1983-85 with a total budget of 111 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (ln)

  11. Energy research program 85

    International Nuclear Information System (INIS)

    1985-01-01

    The energy research program 85 (EFP-85) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and Ministry of Energy's programs EFP-80, EFP-81, EFP-82, EFP-83, and EFP-84. The new program is a continuation of the activities in the period 1985-87 with a total budget of 110 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (ln)

  12. Energy research program 82

    International Nuclear Information System (INIS)

    1982-01-01

    The energy research program 82 (EFP-82) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and the Ministry of Energy's programs EFP-80 and EFP-81. The new program is a continuation of the activities in the period 1982-84 with a total budget of 100 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (BP)

  13. Energy research program 86

    International Nuclear Information System (INIS)

    1986-01-01

    The energy research program 86 (EFP-86) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and the Ministry of Energy's programs EFP-80, EFP-81, EFP-82, EFP-83, EFP-84, and EFP-85. The new program is a continuation of the activities in the period 1986-88 with a total budget of 116 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (ln)

  14. Energy research program 84

    International Nuclear Information System (INIS)

    1984-01-01

    The energy research program 84 (EFP-84) is prepared by the Danish Ministry of Energy in order to continue the extension of the Danish energy research and development started through the former Trade Ministry's programs EM-1 (1976) and EM-2 (1978), and the Ministry of Energy's programs EFP-80, EFP-81, EFP-82 and EFP-83. The new program is a continuation of the activities in the period 1984-86 with a total budget of 112 mio. DKK. The program gives a brief description of background, principles, organization and financing, and a detailed description of each research area. (ln)

  15. IKONOS imagery for the Large Scale Biosphere–Atmosphere Experiment in Amazonia (LBA).

    Science.gov (United States)

    George Hurtt; Xiangming Xiao; Michael Keller; Michael Palace; Gregory P. Asner; Rob Braswell; Eduardo S. Brondízio; Manoel Cardoso; Claudio J.R. Carvalho; Matthew G. Fearon; Liane Guild; Steve Hagen; Scott Hetrick; Berrien Moore III; Carlos Nobre; Jane M. Read; Tatiana Sá; Annette Schloss; George Vourlitis; Albertus J. Wickel

    2003-01-01

    The LBA-ECO program is one of several international research components under the Brazilian-led Large Scale Biosphere–Atmosphere Experiment in Amazonia (LBA). The field-oriented research activities of this study are organized along transects and include a set of primary field sites, where the major objective is to study land-use change and ecosystem dynamics, and a...

  16. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    Recent trends in large-scale simulations of fusion and processing plasmas are briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  17. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  18. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso, a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, and materials science, and describes its usage in large-scale parallel computing.

  19. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  20. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  1. Epidemiology & Genomics Research Program

    Science.gov (United States)

    The Epidemiology and Genomics Research Program, in the National Cancer Institute's Division of Cancer Control and Population Sciences, funds research in human populations to understand the determinants of cancer occurrence and outcomes.

  2. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate, and stabilize a large-scale system may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale system integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  3. Energy research program 80

    International Nuclear Information System (INIS)

    1980-01-01

    The energy research program 80 covers an extension of the activities for the period 1980-82 within a budget of 100 mio. kr., which is part of the government's employment plan for 1980. The research program is based on a number of project proposals that were collected, analysed, and supplemented in October-November 1979. This report consists of two parts. Part 1: a survey of the program, with a brief description of the background, principles, organization and financing. Part 2: detailed descriptions of the different research programs. (LN)

  4. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and now happen to reside within a ~5 Mpc sphere. Second, galaxies in the strong tidal fields induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  5. Research and development of system to utilize photovoltaic energy. Study on large-scale PV power supply system; Taiyoko hatsuden riyo system no kenkyu kaihatsu. Taiyo energy kyokyu system no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Tatsuta, M [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1994-12-01

    This paper reports the study results on large-scale PV power supply systems in fiscal 1994. (1) On optimization of large-scale systems, the conceptual design of a model system was carried out, assuming a large-scale integrated PV power generation system in a desert area. As a result, a 250 kW generation system was designed as the minimum constituent power unit. Its frame and construction method were designed considering weather conditions in inland China. (2) On optimization of large-scale transmission systems, the following options for transmitting large-scale PV power were studied: AC aerial transmission, DC aerial transmission, superconducting transmission, hydrogen gas pipelines, and LH2 tanker transport. (3) On the influence of large-scale systems, it was estimated that substituting PV power generation for coal-fired power generation would reduce emissions, that no negative influence on the natural environment is expected, and that a favorable economic effect on the social environment is expected. 4 tabs.

  6. Programs | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Our development programs support innovative solutions that improve global ... Chestnut farm worker carries basket of harvest chestnuts on shoulders in China ... Invest in knowledge and innovation for large-scale positive change; Build the ...

  7. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  8. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Experience in operating and developing a large-scale computerized system shows that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified; then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases.

  9. Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition (Dagstuhl Seminar 17281)

    OpenAIRE

    Zennou, Sarah; Debray, Saumya K.; Dullien, Thomas; Lakhothia, Arun

    2018-01-01

    This report summarizes the program and the outcomes of Dagstuhl Seminar 17281, entitled "Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition". The seminar brought together practitioners and researchers from industry and academia to discuss the state of the art in the analysis of malware from both a big-data perspective and a fine-grained analysis perspective. Obfuscation was also considered. The meeting created new links within this very diverse community.

  10. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...
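
    The proportion statistic in question is simply width divided by height per item. A minimal sketch of the tally, using made-up dimensions rather than the study's 11 databases:

```python
# Hypothetical (width, height) pairs; the study's ~200,000 items
# across 11 databases are not reproduced here.
items = [(800, 600), (600, 800), (500, 500), (1200, 800), (640, 480)]

proportions = [w / h for w, h in items]
# Count items whose width/height proportion is (near-)one, i.e. square.
squares = sum(1 for p in proportions if abs(p - 1.0) < 0.01)
print(squares, "of", len(items), "items are square")
```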

  11. Piping research program plan

    International Nuclear Information System (INIS)

    1988-09-01

    This document presents the piping research program plan for the Structural and Seismic Engineering Branch and the Materials Engineering Branch of the Division of Engineering, Office of Nuclear Regulatory Research. The plan describes the research to be performed in the areas of piping design criteria, environmentally assisted cracking, pipe fracture, and leak detection and leak rate estimation. The piping research program addresses the regulatory issues regarding piping design and piping integrity facing the NRC today and in the foreseeable future. The plan discusses the regulatory issues and needs for the research; the objectives, key aspects, and schedule for each research project, or group of projects focusing on a specific topic; and, finally, the integration of the research areas into the regulatory process. The plan presents a snapshot of the piping research program as it exists today. However, the program plan will change as the regulatory issues and needs change. Consequently, this document will be revised on a bi-annual basis to reflect changes in the piping research program. (author)

  12. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. A finite element model of the top flange connection system is established with the finite element analysis software MSC.Marc/Mentat, and its fatigue strain is analyzed. Load simulation of the flange fatigue working condition is implemented with the Bladed software, the flange fatigue load spectrum is acquired with the rain-flow counting method, and, finally, fatigue analysis of the top flange is realized with the fatigue analysis software MSC.Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide new thinking for flange fatigue analysis of large-scale wind turbine generators and possess practical engineering value.
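
    The Palmgren-Miner rule referenced above sums partial damage ratios n_i/N_i over the counted stress-range bins; a sum reaching 1 predicts fatigue failure. A minimal sketch with an illustrative Basquin-type S-N curve and made-up cycle counts (not the paper's flange load spectrum):

```python
# Assumed Basquin-type S-N curve: N(S) = C * S**(-m); constants are illustrative.
C, m = 1e12, 3.0

def cycles_to_failure(stress_range):
    """Allowable cycles N_i at a given stress range (MPa)."""
    return C * stress_range ** (-m)

# (stress range in MPa, counted cycles n_i), e.g. from rain-flow counting.
spectrum = [(100.0, 200_000), (150.0, 50_000), (200.0, 10_000)]

# Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i).
damage = sum(n / cycles_to_failure(s) for s, n in spectrum)
print(round(damage, 3))  # D >= 1.0 would predict fatigue failure
```

    In practice the load spectrum would come from rain-flow counting of the simulated flange load history, as the abstract describes.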

  13. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  14. Depression among family caregivers of community-dwelling older people who used services under the Long Term Care Insurance program: a large-scale population-based study in Japan.

    Science.gov (United States)

    Arai, Yumiko; Kumamoto, Keigo; Mizuno, Yoko; Washio, Masakazu

    2014-01-01

    To identify predictors for depression among family caregivers of community-dwelling older people under the Long Term Care Insurance (LTCI) program in Japan through a large-scale population-based survey. All 5938 older people with disabilities, using domiciliary services under the LTCI in the city of Toyama, and their family caregivers participated in this study. Caregiver depression was defined as scores of ≥16 on the Center for Epidemiological Studies Depression Scale (CES-D). Other caregiver measures included age, sex, hours spent caregiving, relationship to the care recipient, income adequacy, living arrangement, self-rated health, and work status. Care recipient measures included age, sex, level of functional disability, and severity of dementia. The data from 4128 pairs of the care recipients and their family caregivers were eligible for further analyses. A multiple logistic regression analysis was used to examine the predictors associated with being at risk of clinical depression (CES-D of ≥16). Overall, 34.2% of caregivers scored ≥16 on the CES-D. The independent predictors for depression by logistic regression analysis were six caregiver characteristics (female, income inadequacy, longer hours spent caregiving, worse subjective health, and co-residence with the care recipient) and one care-recipient characteristic (moderate dementia). This is one of the first population-based examinations of caregivers of older people who are enrolled in a national service system that provides affordable access to services. The results highlighted the importance of monitoring caregivers who manifest the identified predictors to attenuate caregiver depression at the population level under the LTCI.
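The study above uses multiple logistic regression to estimate adjusted odds of scoring CES-D ≥16. A hypothetical two-by-two sketch of the basic quantity such a regression estimates, the odds ratio, is given below; all counts are invented and are not the study's data.

```python
# Hypothetical illustration of an (unadjusted) odds ratio, the quantity a
# logistic regression coefficient expresses on the log scale. In the study's
# multiple regression, the adjusted odds ratio plays the same role while
# controlling for the other covariates.
import math

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio from a 2x2 table of exposure vs. outcome counts."""
    return (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)

# e.g. caregivers reporting income inadequacy vs. not, by CES-D >= 16 status
# (invented counts):
or_ = odds_ratio(120, 180, 200, 600)
print(round(or_, 2), round(math.log(or_), 2))  # odds ratio and its log (coefficient scale)
```

An odds ratio above 1 here would mean the predictor is associated with higher odds of being at risk of clinical depression, as with the caregiver characteristics the study identifies.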

  15. Survey and research for the enhancement of large-scale technology development 3. Patent researches on new tasks for development under large-scale project; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 3. Ogata project shinki kaihatsu tema ni kansuru tokkyo chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    Regarding 'high-speed computation systems for technological use' and 'manganese nodule mining systems,' research is conducted into technological trends from the viewpoint of patents. As for applications for patents involving the Josephson effect device, there are 79 patents disclosed in Japan, with applications from Japan peaking in 1977-1978 and those from overseas in 1974-1975. Among the important applicants, IBM distinguishes itself overseas while, in Japan, the Nippon Telegraph and Telephone Public Corporation accounts for 47% and Fujitsu, Ltd. for 34%. In the case of GaAs-based transistors, businesses in Japan account for as much as 90% of the applications, overwhelming overseas businesses, which account for less than 10%. As for the patents on manganese nodule mining systems, 183 Japanese patents are pending, with 88 already granted in America. While the main concern in Japan has shifted from the continuous elevator bucket system of 1971-1974 to the fluid dredge system, the fluid dredge system has consistently held the overwhelming majority in America. (NEDO)

  16. Large Scale Simulations of the Euler Equations on GPU Clusters

    KAUST Repository

    Liebmann, Manfred; Douglas, Craig C.; Haase, Gundolf; Horvá th, Zoltá n

    2010-01-01

    The paper investigates the scalability of a parallel Euler solver, using the Vijayasundaram method, on a GPU cluster with 32 Nvidia Geforce GTX 295 boards. The aim of this research is to enable large scale fluid dynamics simulations with up to one

  17. Reconsidering Replication: New Perspectives on Large-Scale School Improvement

    Science.gov (United States)

    Peurach, Donald J.; Glazer, Joshua L.

    2012-01-01

    The purpose of this analysis is to reconsider organizational replication as a strategy for large-scale school improvement: a strategy that features a "hub" organization collaborating with "outlet" schools to enact school-wide designs for improvement. To do so, we synthesize a leading line of research on commercial replication to construct a…

  18. Large-scale silviculture experiments of western Oregon and Washington.

    Science.gov (United States)

    Nathan J. Poage; Paul D. Anderson

    2007-01-01

    We review 12 large-scale silviculture experiments (LSSEs) in western Washington and Oregon with which the Pacific Northwest Research Station of the USDA Forest Service is substantially involved. We compiled and arrayed information about the LSSEs as a series of matrices in a relational database, which is included on the compact disc published with this report and...

  19. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects, such as the construction of roads, railways and other civil engineering (water)works, are tendered differently today than a decade ago. The traditional workflow requested quotes from construction companies for construction works where the works to be

  20. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165 Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöhe Solar Observatory of the University of Graz, A-9521 Treffen, Austria. e-mail: schroll@solobskh.ac.at.

  1. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
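As a concrete illustration of what a sensitivity derivative is, a generic central-difference sketch follows. This is not the efficient reanalysis procedure the record describes (which exploits structural problem structure); the cantilever tip-deflection formula is a textbook stand-in chosen for this example.

```python
# Generic sensitivity-derivative sketch by central differences, NOT the
# report's method. Structural response: cantilever tip deflection
# delta = P*L**3 / (3*E*I), a standard textbook formula.

def tip_deflection(E, I=1.0e-6, L=2.0, P=1.0e3):
    """Tip deflection of a cantilever under an end load P (SI units)."""
    return P * L**3 / (3.0 * E * I)

def sensitivity(f, x, h=1.0e-3):
    """Approximate df/dx at x by central differences with relative step h."""
    return (f(x * (1.0 + h)) - f(x * (1.0 - h))) / (2.0 * x * h)

E = 2.1e11  # Young's modulus of steel, Pa (illustrative)
numeric = sensitivity(tip_deflection, E)
analytic = -1.0e3 * 2.0**3 / (3.0 * E**2 * 1.0e-6)  # d(delta)/dE in closed form
print(numeric, analytic)
```

Finite differencing needs one or two reanalyses per design variable, which is exactly why efficient reanalysis and analytic sensitivity techniques matter for large-scale problems.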

  2. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  3. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biassed galaxy formation, are all discussed. (UK)

  4. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  5. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  6. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  7. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    \\catcode`\\@=11 \\ialign{m @th#1hfil ##hfil \\crcr#2\\crcr\\sim\\crcr}}} \\catcode`\\@=12 Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~{{1} /{4}} of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  8. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs
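The cone-valued Lyapunov machinery of the record above does not fit in a short sketch, but the flavour of an interconnected-stability test can be shown with a classical aggregate-matrix criterion: for an aggregate matrix with subsystem decay rates on the diagonal and nonpositive off-diagonal interconnection bounds, positivity of all leading principal minors (the Sevastyanov-Kotelyanskii conditions) makes it an M-matrix and implies stability of the interconnection. The 3-subsystem matrix below is invented for illustration; this is a stand-in, not the paper's method.

```python
# Aggregate-matrix (M-matrix) stability sketch for an interconnected system.
# S: diagonal = subsystem decay rates, off-diagonal = negated interconnection
# strength bounds. All leading principal minors positive => stable aggregate.

def det(A):
    """Determinant by cofactor expansion along the first row (small matrices)."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(n))

def leading_minors(M):
    """Determinants of the leading principal submatrices of M."""
    return [det([row[:k] for row in M[:k]]) for k in range(1, len(M) + 1)]

# Invented 3-subsystem aggregate matrix:
S = [[2.0, -1.0, 0.0],
     [-1.0, 3.0, -1.0],
     [0.0, -1.0, 2.0]]
minors = leading_minors(S)
stable = all(m > 0 for m in minors)
print(minors, stable)
```

The appeal of such tests, like the cone-valued Lyapunov approach, is that stability of the whole is inferred from low-dimensional subsystem data plus bounds on the interconnections.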

  9. Sensitivity technologies for large scale simulation

    International Nuclear Information System (INIS)

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity type analysis into existing code and, equally important, the work was focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first

  10. The state-led large scale public private partnership 'Chiranjeevi Program' to increase access to institutional delivery among poor women in Gujarat, India: How has it done? What can we learn?

    Science.gov (United States)

    De Costa, Ayesha; Vora, Kranti S; Ryan, Kayleigh; Sankara Raman, Parvathy; Santacatterina, Michele; Mavalankar, Dileep

    2014-01-01

    Many low-middle income countries have focused on improving access to and quality of obstetric care, as part of promoting a facility based intra-partum care strategy to reduce maternal mortality. The state of Gujarat in India, implements a facility based intra-partum care program through its large for-profit private obstetric sector, under a state-led public-private-partnership, the Chiranjeevi Yojana (CY), under which the state pays accredited private obstetricians to perform deliveries for poor/tribal women. We examine CY performance, its contribution to overall trends in institutional deliveries in Gujarat over the last decade and its effect on private and public sector deliveries there. District level institutional delivery data (public, private, CY), national surveys, poverty estimates, census data were used. Institutional delivery trends in Gujarat 2000-2010 are presented; including contributions of different sectors and CY. Piece-wise regression was used to study the influence of the CY program on public and private sector institutional delivery. Institutional delivery rose from 40.7% (2001) to 89.3% (2010), driven by sharp increases in private sector deliveries. Public sector and CY contributed 25-29% and 13-16% respectively of all deliveries each year. In 2007, 860 of 2000 private obstetricians participated in CY. Since 2007, >600,000 CY deliveries occurred i.e. one-third of births in the target population. Caesareans under CY were 6%, higher than the 2% reported among poor women by the DLHS survey just before CY. CY did not influence the already rising proportion of private sector deliveries in Gujarat. This paper reports a state-led, fully state-funded, large-scale public-private partnership to improve poor women's access to institutional delivery - there have been >600,000 beneficiaries. While caesarean proportions are higher under CY than before, it is uncertain if all beneficiaries who require sections receive these. Other issues to explore include

  11. The state-led large scale public private partnership 'Chiranjeevi Program' to increase access to institutional delivery among poor women in Gujarat, India: How has it done? What can we learn?

    Directory of Open Access Journals (Sweden)

    Ayesha De Costa

    Full Text Available BACKGROUND: Many low-middle income countries have focused on improving access to and quality of obstetric care, as part of promoting a facility based intra-partum care strategy to reduce maternal mortality. The state of Gujarat in India implements a facility based intra-partum care program through its large for-profit private obstetric sector, under a state-led public-private-partnership, the Chiranjeevi Yojana (CY), under which the state pays accredited private obstetricians to perform deliveries for poor/tribal women. We examine CY performance, its contribution to overall trends in institutional deliveries in Gujarat over the last decade and its effect on private and public sector deliveries there. METHODS: District level institutional delivery data (public, private, CY), national surveys, poverty estimates, and census data were used. Institutional delivery trends in Gujarat 2000-2010 are presented, including contributions of different sectors and CY. Piece-wise regression was used to study the influence of the CY program on public and private sector institutional delivery. RESULTS: Institutional delivery rose from 40.7% (2001) to 89.3% (2010), driven by sharp increases in private sector deliveries. Public sector and CY contributed 25-29% and 13-16% respectively of all deliveries each year. In 2007, 860 of 2000 private obstetricians participated in CY. Since 2007, >600,000 CY deliveries occurred, i.e. one-third of births in the target population. Caesareans under CY were 6%, higher than the 2% reported among poor women by the DLHS survey just before CY. CY did not influence the already rising proportion of private sector deliveries in Gujarat. CONCLUSION: This paper reports a state-led, fully state-funded, large-scale public-private partnership to improve poor women's access to institutional delivery - there have been >600,000 beneficiaries.
While caesarean proportions are higher under CY than before, it is uncertain if all beneficiaries who require

  12. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R and S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules on the basis of multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) program of the Dutch ministry of Economic Affairs and the Dutch ministry of Education, Culture and Sciences. The aim of the EET project is to reduce the cost of a solar module by 50%, both by increasing the conversion efficiency and by developing cheap processes for large-scale production
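The cost-reduction logic in the record above can be made concrete with a little arithmetic: module cost per watt-peak scales inversely with conversion efficiency, so higher efficiency and a cheaper process both lower cost per watt. The numbers below are invented for illustration and are not the project's figures.

```python
# Illustrative cost-per-watt-peak arithmetic (all values hypothetical).
STC_IRRADIANCE = 1000.0  # W/m^2, standard test conditions

def cost_per_wp(area_cost_per_m2, efficiency):
    """Module cost per watt-peak: area cost divided by power output per m^2."""
    watts_per_m2 = STC_IRRADIANCE * efficiency
    return area_cost_per_m2 / watts_per_m2

baseline = cost_per_wp(300.0, 0.13)   # hypothetical baseline process and efficiency
improved = cost_per_wp(255.0, 0.17)   # cheaper process plus higher efficiency
print(f"{baseline:.2f} -> {improved:.2f} per Wp ({1 - improved / baseline:.0%} reduction)")
```

The two levers are multiplicative, which is why the project pursues efficiency gains and cheap large-scale processes together rather than either one alone.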

  13. Fermilab Research Program Workbook

    International Nuclear Information System (INIS)

    Rubinstein, R.

    1984-05-01

    The Fermilab Research Program Workbook has been published annually for the past several years to assist the Physics Advisory Committee in the yearly program review conducted during its summer meeting. While this is still a major aim, it is hoped that the Workbook will also prove useful to others seeking information on the current status of Fermilab experiments and the properties of beams at the Laboratory. In addition, short summaries of approved experiments are also included

  14. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered on a popular level. Described are the cell structure of the galaxy distribution in the Universe and the principles of mathematical modelling of the galaxy distribution. Images of cell structures obtained after reprocessing with the computer are given. Three hypotheses are discussed - vortical, entropic and adiabatic - suggesting various processes of galaxy and galaxy cluster origin. A considerable advantage of the adiabatic hypothesis is recognized. The relict radiation, as a method of directly studying the processes taking place in the Universe, is considered. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to the hot gas contained in galaxy clusters and the interactions within galaxy clusters and with the inter-galaxy medium is recognized to be a notable contribution to the development of theoretical and observational cosmology

  15. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  16. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  17. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  18. Methods for Large-Scale Nonlinear Optimization.

    Science.gov (United States)

    1980-05-01

    STANFORD, CALIFORNIA 94305. METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright...a typical iteration can be partitioned so that where B is an m x m basis matrix. This partition effectively divides the variables into three classes... attention is given to the standard of the coding or the documentation. A much better way of obtaining mathematical software is from a software library

  19. Large scale inhomogeneities and the cosmological principle

    International Nuclear Information System (INIS)

    Lukacs, B.; Meszaros, A.

    1984-12-01

    The compatibility of cosmologic principles and possible large scale inhomogeneities of the Universe is discussed. It seems that the strongest symmetry principle which is still compatible with reasonable inhomogeneities, is a full conformal symmetry in the 3-space defined by the cosmological velocity field, but even in such a case, the standard model is isolated from the inhomogeneous ones when the whole evolution is considered. (author)

  20. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylemethacrylate; (2) gas dynamic and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamic and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10μ) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  1. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  2. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized for capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  3. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    Report date: March 2011. Title: Assessments of Selected Large-Scale Projects...Volatile EvolutioN; MEP, Mars Exploration Program; MIB, Mishap Investigation Board; MMRTG, Multi Mission Radioisotope Thermoelectric Generator; MMS, Magnetospheric...probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the

  4. Large scale 2D spectral compressed sensing in continuous domain

    KAUST Repository

    Cai, Jian-Feng

    2017-06-20

    We consider the problem of spectral compressed sensing in continuous domain, which aims to recover a 2-dimensional spectrally sparse signal from partially observed time samples. The signal is assumed to be a superposition of s complex sinusoids. We propose a semidefinite program for the 2D signal recovery problem. Our model is able to handle large scale 2D signals of size 500 × 500, whereas traditional approaches only handle signals of size around 20 × 20.
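The problem setup described above, a 2D signal that is a superposition of s complex sinusoids observed on a subset of time samples, is easy to write down directly. The sketch below only generates such a signal and a random observation mask; it does not implement the paper's semidefinite-program recovery, and the frequencies and amplitudes are invented.

```python
# Problem-setup sketch for 2D spectral compressed sensing (generation only,
# not the recovery algorithm). A spectrally sparse signal is a superposition
# of s complex sinusoids; we observe it on a random subset of time samples.
import cmath
import random

def sparse_2d_signal(n, modes):
    """modes: list of (amplitude, f1, f2) with frequencies f1, f2 in [0, 1)."""
    return [[sum(a * cmath.exp(2j * cmath.pi * (f1 * t1 + f2 * t2))
                 for a, f1, f2 in modes)
             for t2 in range(n)] for t1 in range(n)]

n = 8
modes = [(1.0, 0.125, 0.25), (0.5, 0.5, 0.75)]  # s = 2 sinusoids (invented)
x = sparse_2d_signal(n, modes)

# Partial observations: keep each sample independently with probability 1/2.
random.seed(0)
observed = {(i, j) for i in range(n) for j in range(n) if random.random() < 0.5}
```

The recovery task is then to infer the continuous-valued frequencies from only the samples in `observed`, which is what the semidefinite program in the paper addresses at scales up to 500 × 500.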

  5. Large scale 2D spectral compressed sensing in continuous domain

    KAUST Repository

    Cai, Jian-Feng; Xu, Weiyu; Yang, Yang

    2017-01-01

    We consider the problem of spectral compressed sensing in continuous domain, which aims to recover a 2-dimensional spectrally sparse signal from partially observed time samples. The signal is assumed to be a superposition of s complex sinusoids. We propose a semidefinite program for the 2D signal recovery problem. Our model is able to handle large scale 2D signals of size 500 × 500, whereas traditional approaches only handle signals of size around 20 × 20.

  6. Marine biosurfaces research program

    Science.gov (United States)

    The Office of Naval Research (ONR) of the U.S. Navy is starting a basic research program to address the initial events that control colonization of surfaces by organisms in marine environments. The program “arises from the Navy's need to understand and ultimately control biofouling and biocorrosion in marine environments,” according to a Navy announcement. The program, “Biological Processes Controlling Surface Modification in the Marine Environment,” will emphasize the application of in situ techniques and modern molecular biological, biochemical, and biophysical approaches; it will also encourage the development of interdisciplinary projects. Specific areas of interest include sensing of and response to environmental surfaces (physiology/physical chemistry), factors controlling movement to and retention at surfaces (behavior/hydrodynamics), genetic regulation of attachment (molecular genetics), and mechanisms of attachment (biochemistry/surface chemistry).

  7. Acquisition Research Program Homepage

    OpenAIRE

    2015-01-01

    Includes an image of the main page on this date and compressed file containing additional web pages. Established in 2003, Naval Postgraduate School’s (NPS) Acquisition Research Program provides leadership in innovation, creative problem solving and an ongoing dialogue, contributing to the evolution of Department of Defense acquisition strategies.

  8. Controlled thermonuclear research program

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    The Plasma Physics and Controlled-Fusion Research Program at the Lawrence Berkeley Laboratory is divided into five projects: Plasma Production and Heating Experiments, Plasma Theory, Atomic Physics Studies, the Tormac Project, and Neutral-Beam Development and Technology, listed in order of increasing magnitude of manpower and budget. Some cross sections and yields from the atomic physics studies are shown

  9. Large-scale parallel genome assembler over cloud computing environment.

    Science.gov (United States)

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

    The size of high throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model and understanding the hardware environment that these applications require for good performance both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability with competitive assembly quality compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over a traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure over a traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of a traditional HPC cluster.
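GiGA itself distributes the assembly over Hadoop and Giraph, but the data structure at its core, the de Bruijn graph, can be sketched on a single machine. The sketch below is illustrative only: the function names and the greedy contig walk are not from the paper. Nodes are (k-1)-mers, each k-mer contributes one edge, and contigs follow unambiguous paths:

```python
from collections import defaultdict

def build_debruijn(reads, k):
    """de Bruijn graph: nodes are (k-1)-mers, one edge per k-mer."""
    graph = defaultdict(set)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])   # prefix -> suffix edge
    return graph

def walk_contig(graph, start):
    """Greedily extend a contig while the path is unambiguous."""
    contig, node = start, start
    while len(graph[node]) == 1:             # out-degree 1: no branch
        node = next(iter(graph[node]))
        contig += node[-1]
        if len(contig) > 1000:               # crude cycle guard
            break
    return contig

reads = ["ACGTAC", "GTACGT", "TACGTT"]
g = build_debruijn(reads, k=4)
print(walk_contig(g, "ACG"))                 # stops at the first branch
```

A real assembler additionally handles reverse complements, sequencing errors, and tips/bubbles; the distributed versions partition this graph across workers, which is where Giraph's vertex-centric model comes in.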

  10. Advanced maintenance research programs

    International Nuclear Information System (INIS)

    Marston, T.U.; Gelhaus, F.; Burke, R.

    1985-01-01

    The purpose of this paper is to provide the reader with an idea of the advanced maintenance research program at the Electric Power Research Institute (EPRI). A brief description of the maintenance-related activities is provided as a foundation for the advanced maintenance research projects. The projects can be divided into maintenance planning, preventive maintenance program development and implementation, predictive (or conditional) maintenance, and innovative maintenance techniques. The projects include hardware and software development, human factors considerations, and technology promotion and implementation. The advanced concepts include: the incorporation of artificial intelligence into outage planning; turbine and pump maintenance; rotating equipment monitoring and diagnostics with the aid of expert systems; and the development of mobile robots for nuclear power plant maintenance

  11. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent from shortages of precipitation is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems was built in Wielkopolska. At the end of the 1970’s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province. The average size of a sprinkler system reached 95 ha. In 1989 there were 98 systems, covering more than 10 130 ha. The study was conducted in 1986÷1998 on 7 large sprinkler systems with areas ranging from 230 to 520 hectares. After the introduction of the market economy in the early 90’s and ownership changes in agriculture, the large-scale sprinkler systems underwent significant or total devastation. Land of the former State Farms was leased or sold by the State Agricultural Property Agency, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs. There was also a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers and limitations: system constraints, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. A field survey of the local area was carried out to show the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  12. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  13. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives are overlapped within a limited region. To overcome this difficulty, our technique selects an appropriate graph style according to the nodal density in each area. (author)
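The core idea, selecting a graph style from the nodal density of each display area, can be sketched as a simple decision rule. The style names and density thresholds below are hypothetical, not values from the paper:

```python
def pick_style(node_count, area_px, dense=0.002, sparse=0.0005):
    """Choose a rendering style from nodal density (nodes per pixel).
    Thresholds are illustrative, not taken from the paper."""
    density = node_count / area_px
    if density > dense:
        return "aggregated"   # collapse subtrees into single glyphs
    if density > sparse:
        return "compact"      # small unlabeled nodes
    return "node-link"        # full nodes with labels

print(pick_style(50, 100_000))    # sparse region: full node-link view
print(pick_style(300, 100_000))   # dense region: aggregated glyphs
```

In an actual implementation the rule would be evaluated per screen tile as the user pans and zooms, so dense subtrees collapse while sparse ones stay fully drawn.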

  14. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos

  15. Puzzles of large scale structure and gravitation

    International Nuclear Information System (INIS)

    Sidharth, B.G.

    2006-01-01

    We consider the puzzle of cosmic voids bounded by two-dimensional structures of galactic clusters, as also a puzzle pointed out by Weinberg: How can the mass of a typical elementary particle depend on a cosmic parameter like the Hubble constant? An answer to the first puzzle is proposed in terms of 'Scaled' Quantum Mechanical like behaviour which appears at large scales. The second puzzle can be answered by showing that the gravitational mass of an elementary particle has a Machian character (see Ahmed N. Cantorian small world, Mach's principle and the universal mass network. Chaos, Solitons and Fractals 2004;21(4))

  16. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  17. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a breakeven price level. Norway possesses vast energy resources and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  18. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  19. Large scale phononic metamaterials for seismic isolation

    International Nuclear Information System (INIS)

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-01-01

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, thus raising the belief that they could be serious candidates for seismic isolation structures. Different and easy to fabricate structures were examined made from construction materials such as concrete and steel. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials

  20. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses on avoiding redundancy for users working on the same task. While this improves the effectiveness of the user work process, the underlying query processing engine is typically considered a "black box" and left unchanged. Research in multiple query processing, on the other hand, ignores the application... to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems.
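One elementary form of the multiple-query sharing alluded to above is caching a shared subquery so that later collaborators reuse it instead of rescanning the data. A minimal sketch; the toy "engine" and query shapes are invented for illustration:

```python
from functools import lru_cache

# Toy "engine": several users interactively filter the same dataset.
DATA = list(range(1_000_000))

@lru_cache(maxsize=None)
def filtered(divisor):
    """Shared subquery: scanned once, reused by later collaborators."""
    return tuple(x for x in DATA if x % divisor == 0)

def user_query(divisor, limit):
    return filtered(divisor)[:limit]

print(user_query(99_991, 3))   # first user pays for the full scan
print(user_query(99_991, 2))   # second user reuses the cached result
```

Real multi-query optimizers go further: they detect overlap between queries that are not identical and share partial scans, joins, or aggregates, which is the "underlying engine" work the abstract argues is usually left untouched.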

  1. Large-scale assembly of colloidal particles

    Science.gov (United States)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes an invention in large-area, low-cost color reflective displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample changes back to its brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. 
Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  2. Sandia Combustion Research Program

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, S.C.; Palmer, R.E.; Montana, C.A. (eds.)

    1988-01-01

    During the late 1970s, in response to a national energy crisis, Sandia proposed to the US Department of Energy (DOE) a new, ambitious program in combustion research. Shortly thereafter, the Combustion Research Facility (CRF) was established at Sandia's Livermore location. Designated a "user facility," the charter of the CRF was to develop and maintain special-purpose resources to support a nationwide initiative, involving US universities, industry, and national laboratories, to improve our understanding and control of combustion. This report includes descriptions of several research projects which have been stimulated by working groups and involve the on-site participation of industry scientists. DOE's Industry Technology Fellowship program, supported through the Office of Energy Research, has been instrumental in the success of some of these joint efforts. The remainder of this report presents results of calendar year 1988, separated thematically into eleven categories. Refereed journal articles appearing in print during 1988 and selected other publications are included at the end of Section 11. Our "traditional" research activities in combustion chemistry, reacting flows, diagnostics, and engine and coal combustion have been supplemented by a new effort aimed at understanding combustion-related issues in the management of toxic and hazardous materials.

  3. Large-Scale Demonstration of Liquid Hydrogen Storage with Zero Boiloff for In-Space Applications

    Science.gov (United States)

    Hastings, L. J.; Bryant, C. B.; Flachbart, R. H.; Holt, K. A.; Johnson, E.; Hedayat, A.; Hipp, B.; Plachta, D. W.

    2010-01-01

    Cryocooler and passive insulation technology advances have substantially improved prospects for zero-boiloff cryogenic storage. Therefore, a cooperative effort by NASA's Ames Research Center, Glenn Research Center, and Marshall Space Flight Center (MSFC) was implemented to develop zero-boiloff concepts for in-space cryogenic storage. Described herein is one program element - a large-scale, zero-boiloff demonstration using the MSFC multipurpose hydrogen test bed (MHTB). A commercial cryocooler was interfaced with an existing MHTB spray bar mixer and insulation system in a manner that enabled a balance between incoming and extracted thermal energy.

  4. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). This observational evidence has led us to a great deal of consensus on the cosmological model, the so-called LambdaCDM, and tight constraints on the cosmological parameters constituting the model. On the other hand, the advancement in cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, which is quantified by its direction and amplitude. We measure a large dipolar modulation in the CMB, which mainly originates from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a number of well-identified objects. In this thesis, we explore measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE matched with 2MASS sources. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for detection of kinematic dipole in future surveys.
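The dipole-in-number-counts estimator can be illustrated in a lower-dimensional toy: for sources on a circle with density proportional to 1 + A cos θ, the expectation of cos θ is A/2, so twice the vector mean of the source directions recovers both the amplitude A and the dipole direction. This is a didactic sketch under those assumptions, not the thesis's actual analysis on the sphere:

```python
import math
import random

def dipole_estimate(angles):
    """Vector-mean dipole estimator on a circle: for density
    proportional to 1 + A*cos(theta), E[cos(theta)] = A/2, so twice
    the mean direction vector recovers the amplitude and direction."""
    n = len(angles)
    x = sum(math.cos(a) for a in angles) / n
    y = sum(math.sin(a) for a in angles) / n
    return 2.0 * math.hypot(x, y), math.atan2(y, x)

# Toy catalog with an injected dipole of amplitude 0.1 toward theta = 0
random.seed(1)
sample = []
while len(sample) < 200_000:
    a = random.uniform(-math.pi, math.pi)
    if random.random() < 0.5 * (1 + 0.1 * math.cos(a)):  # rejection sampling
        sample.append(a)

amp, direction = dipole_estimate(sample)
print(f"A = {amp:.3f}, direction = {direction:.3f} rad")
```

On the real sky the same logic is applied to the ℓ = 1 spherical-harmonic coefficients of the count map, and survey masks and flux limits complicate the estimator considerably.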

  5. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

    Highlights: ► We implement the photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building user’s electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the potential of photovoltaic energy and to evaluate the local own consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter’s total electricity consumption and half of this generated electricity is directly used within the buildings.
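The own-consumption comparison described above reduces, hour by hour, to taking the smaller of PV production and building demand. A minimal sketch with invented hourly profiles (not the measured Scharnhauser Park data):

```python
def own_consumption_share(production, consumption):
    """Fraction of PV output used directly in the building: in each
    hour only min(production, demand) can be self-consumed."""
    used = sum(min(p, c) for p, c in zip(production, consumption))
    return used / sum(production)

# Illustrative hourly profiles in kWh for one day (invented numbers)
prod = [0, 0, 1, 3, 5, 6, 5, 3, 1, 0, 0, 0]   # PV peaks at midday
cons = [2] * 12                                # flat 2 kWh demand
print(own_consumption_share(prod, cons))       # → 0.5
```

The midday PV surplus above the flat demand cannot be self-consumed, which is why only half of the generated energy is used directly, mirroring the roughly 50% direct use reported for the case study.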

  6. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  7. Large-scale Intelligent Transporation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  8. NRL HIFAR research program

    International Nuclear Information System (INIS)

    1989-01-01

    The use of a beam of heavy ions to ignite a thermonuclear pellet places severe constraints on beam emittance throughout the accelerator system. Nonlinearities which occur during beam transport, acceleration, and focusing, can cause emittance growth which limits spot intensity. Because of the high beam intensities required to achieve ignition, details of the self-consistent evolution of nonlinear space charge forces are generally important in this process. Computer simulations have, in turn, become an important tool in examining beam dynamics in this nonlinear regime. The Naval Research Laboratory HIFAR research program has been a major contributor to the successful use of numerical simulation to understand the detailed mechanisms by which space charge nonlinearities can contribute to emittance growth and the dilution of beam intensity. This program has been conducted in close cooperation with LLNL and LBL personnel to maximize support for those programs. Codes developed at NRL have been extensively shared and models developed at the other laboratories have been incorporated in the NRL codes. Because of the collaborative nature of much of the work over the past year, which has emphasized the development of numerical tools and techniques for general use, progress has generally resulted from shared efforts. The work, as reported here, emphasizes those contributions which can be attributed primarily to the NRL effort

  9. FY 1998 Report on development of large-scale wind power generation systems. Part 2. Operational research on large-scale wind power generation systems; 1998 nendo ogata furyoku hatsuden system kaihatsu seika hokokusho. 2. Ogata furyoku hatsuden system no unten kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    On-the-spot surveys were conducted and related information collected on the current status of wind power generators connected to power grid systems, and on simulation techniques therefor, in the USA and European countries. In Denmark, the grid system to which wind power generators are connected is a 10 kV radial system, through which the generators are connected to general consumers. Power quality is investigated using programs developed by DEFU (Danske Elvarkers Forening Udredning). Germany's Norderland Wind Park has the largest capacity in Europe, with 35 units of 1.5 MW generators. They are connected to a 110 kV grid system via ISOREE to control disturbances to the commercial grid system. The USA, which used to have the world's largest wind power generation capacity, now plays second fiddle to Germany, whose capacity has exceeded 2,000 MW. The country is now seeing a second rush of wind power generator construction, planning to add a new capacity of 570 MW in 1998. Information is also collected from other countries or organizations, including the Netherlands, WREC, Italy and Spain. (NEDO)

  11. Fermilab research program workbook

    International Nuclear Information System (INIS)

    Rubinstein, R.

    1983-05-01

    The Fermilab Research Program Workbook has been produced annually for the past several years, with the original motivation of assisting the Physics Advisory Committee in its yearly program review conducted during its summer meeting. While this is still the primary goal, the Workbook is increasingly used by others needing information on the current status of Fermilab experiments, properties of beams, and short summaries of approved experiments. At the present time, considerable changes are taking place in the facilities at Fermilab. We have come to the end of the physics program using the 400 GeV Main Ring, which is now relegated to being just an injector for the soon-to-be commissioned Tevatron. In addition, the experimental areas are in the midst of a several-year program of upgrading to 1000 GeV capability. Several new beam lines will be built in the next few years; some indications can be given of their properties, although with the caveat that designs for some are by no means final. Already there is considerable activity leading to experiments studying antiproton-proton collisions at √s = 2000 GeV

  12. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    Linton, M.; Khalatbari, A.

    2011-01-01

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large scale epidemiological study has been launched by the Fukushima medical university. It concerns the two million inhabitants of the Fukushima Prefecture. On the national level and with the support of public funds, medical care and follow-up, as well as systematic controls are foreseen, notably to check the thyroids of 360,000 young people less than 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low readings, and because they know that some parts of the area are at least as contaminated as was the case around Chernobyl, some people are reluctant to go back home

  13. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  14. Grid sensitivity capability for large scale structures

    Science.gov (United States)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  15. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design covers the optical axis, the drive, the fixture device, and the wheels. The control system comprises hardware and software: the hardware is based on a single-chip microcomputer, and the software manages the photoelectric autocollimator and the automatic data acquisition process. The device can acquire verticality measurement data automatically. Its reliability is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  16. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    Full Text Available In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority, considering the minimum area condition in cartographic generalization in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization using LIDAR DEM.
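    The majority-based generalization step can be sketched as a focal (moving-window) majority filter over a landform class raster. This is a generic illustration with SciPy, not the Focal Statistics tool itself; the toy class raster and window size are invented for the example.

```python
import numpy as np
from scipy.ndimage import generic_filter

def majority(values):
    # Most frequent class code within the moving window
    return np.bincount(values.astype(int)).argmax()

# Toy landform class raster (codes 0-2); a real input would be the primary landform map
classes = np.array([
    [1, 1, 1, 2],
    [1, 0, 1, 2],
    [1, 1, 1, 2],
    [2, 2, 2, 2],
])

# 3x3 majority filter; "nearest" handles the raster edges
smoothed = generic_filter(classes, majority, size=3, mode="nearest")
```

    The isolated class-0 cell is absorbed by its surrounding class, which is exactly the small-area removal effect used in the cartographic generalization described above.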

  17. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach is the IBA Emscher Park in the Ruhr area in Germany. Over a 10-year period (1988-1998), more than 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines... for setting the design brief in a large scale urban landscape in Norway, the Jaeren region around the city of Stavanger. In this paper, we first outline the methodological challenges and then present and discuss the proposed method based on our teaching experiences. On this basis, we discuss aspects...

  18. Large scale study of tooth enamel

    International Nuclear Information System (INIS)

    Bodart, F.; Deconninck, G.; Martin, M.T.

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces in a large scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analyzed using PIXE, backscattering and nuclear reaction techniques. The results were analyzed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed, and cluster analysis is in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population. (author)

  19. Testing Einstein's Gravity on Large Scales

    Science.gov (United States)

    Prescod-Weinstein, Chandra

    2011-01-01

    A little over a decade has passed since two teams studying high redshift Type Ia supernovae announced the discovery that the expansion of the universe was accelerating. After all this time, we're still not sure how cosmic acceleration fits into the theory that tells us about the large-scale universe: General Relativity (GR). As part of our search for answers, we have been forced to question GR itself. But how will we test our ideas? We are fortunate enough to be entering the era of precision cosmology, where the standard model of gravity can be subjected to more rigorous testing. Various techniques will be employed over the next decade or two in the effort to better understand cosmic acceleration and the theory behind it. In this talk, I will describe cosmic acceleration, current proposals to explain it, and weak gravitational lensing, an observational effect that allows us to do the necessary precision cosmology.

  20. HTGR safety research program

    International Nuclear Information System (INIS)

    Barsell, A.W.; Olsen, B.E.; Silady, F.A.

    1981-01-01

    An HTGR safety research program is being performed in support of the AIPA Probabilistic Risk Study, with priorities guided by that study. Analytical and experimental studies have been conducted in four general areas where modeling or data assumptions contribute to large uncertainties in the consequence assessments and thus in the risk assessment for key core heat-up accident scenarios. Experimental data have been obtained on the time-dependent release of fission products from the fuel particles and the plateout characteristics of condensible fission products in the primary circuit. Potential failure modes of PCRV components, primarily in the top head, as well as concrete degradation processes have been analyzed using a series of newly developed models and interlinked computer programs. Containment phenomena, including fission product deposition and the potential flammability of liberated combustible gases, have been studied analytically. Lastly, the behaviour of boron control material in the core and reactor subcriticality during core heat-up have been examined analytically. Research in these areas has formed the basis for consequence updates in GA-A15000. Systematic derivation of future safety research priorities is also discussed. (author)

  1. Base Research Program

    Energy Technology Data Exchange (ETDEWEB)

    Everett Sondreal; John Hendrikson

    2009-03-31

    In June 2009, the Energy & Environmental Research Center (EERC) completed 11 years of research under the U.S. Department of Energy (DOE) Base Cooperative Agreement No. DE-FC26-98FT40320 funded through the Office of Fossil Energy (OFE) and administered at the National Energy Technology Laboratory (NETL). A wide range of diverse research activities were performed under annual program plans approved by NETL in seven major task areas: (1) resource characterization and waste management, (2) air quality assessment and control, (3) advanced power systems, (4) advanced fuel forms, (5) value-added coproducts, (6) advanced materials, and (7) strategic studies. This report summarizes results of the 67 research subtasks and an additional 50 strategic studies. Selected highlights in the executive summary illustrate the contribution of the research to the energy industry in areas not adequately addressed by the private sector alone. During the period of performance of the agreement, concerns have mounted over the impact of carbon emissions on climate change, and new programs have been initiated by DOE to ensure that fossil fuel resources along with renewable resources can continue to supply the nation's transportation fuel and electric power. The agreement has addressed DOE goals for reductions in CO{sub 2} emissions through efficiency, capture, and sequestration while expanding the supply and use of domestic energy resources for energy security. It has further contributed to goals for near-zero emissions from highly efficient coal-fired power plants; environmental control capabilities for SO{sub 2}, NO{sub x}, fine respirable particulate (PM{sub 2.5}), and mercury; alternative transportation fuels including liquid synfuels and hydrogen; and synergistic integration of fossil and renewable resources (e.g., wind-, biomass-, and coal-based electrical generation).

  2. Achievement report on contract research. Large-scale project - Results of 1st-phase research and development of MHD power generation system; Plant system no hyoka. Ogata project dai 1 ki MHD hatsuden system kenkyu kaihatsu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1976-03-01

    The results of component development and evaluation, and the problems identified, are described in detail for the power generation channel, the superconducting magnets with a helium refrigeration and liquefaction unit, the seed collector, the heat exchanger, the combustor, etc. The results and effects of the power generation system research and development, and the problems identified, are described for the Mark V and Mark VI operation tests. Thermal performance calculations, economic evaluations, and environmental conservation for an MHD (magnetohydrodynamic) power plant are described, covering the combustion of heavy oil, the combustion of natural gas, a plant with a 1,000 MW generator as its base load, and the control of NOx and sulfur in MHD power generation. For next-stage planning, the configuration of a 10 MW MHD power generation plant, its equipment, construction cost, and preliminary element research are described. Furthermore, proposals are presented concerning future plans, the prospects of commercial MHD power generation, the technological ripple effects of MHD power generation research and development, and future research and development. (NEDO)

  3. Jointly Sponsored Research Program

    Energy Technology Data Exchange (ETDEWEB)

    Everett A. Sondreal; John G. Hendrikson; Thomas A. Erickson

    2009-03-31

    U.S. Department of Energy (DOE) Cooperative Agreement DE-FC26-98FT40321 funded through the Office of Fossil Energy and administered at the National Energy Technology Laboratory (NETL) supported the performance of a Jointly Sponsored Research Program (JSRP) at the Energy & Environmental Research Center (EERC) with a minimum 50% nonfederal cost share to assist industry in commercializing and effectively applying highly efficient, nonpolluting energy systems that meet the nation's requirements for clean fuels, chemicals, and electricity in the 21st century. The EERC in partnership with its nonfederal partners jointly performed 131 JSRP projects for which the total DOE cost share was $22,716,634 (38%) and the nonfederal share was $36,776,573 (62%). Summaries of these projects are presented in this report for six program areas: (1) resource characterization and waste management, (2) air quality assessment and control, (3) advanced power systems, (4) advanced fuel forms, (5) value-added coproducts, and (6) advanced materials. The work performed under this agreement addressed DOE goals for reductions in CO{sub 2} emissions through efficiency, capture, and sequestration; near-zero emissions from highly efficient coal-fired power plants; environmental control capabilities for SO{sub 2}, NO{sub x}, fine respirable particulate (PM{sub 2.5}), and mercury; alternative transportation fuels including liquid synfuels and hydrogen; and synergistic integration of fossil and renewable resources.

  4. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving the large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparing gridded precipitation products against ground observations provides another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin-plate spline smoothing algorithm (ANUSPLIN), and the Canadian Precipitation Analysis (CaPA). Based on verification criteria at various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
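    Gauge-based verification of a gridded product typically reduces to a few summary scores computed at each aggregation scale. The sketch below uses invented daily values at one matched grid cell/gauge pair; the score definitions (bias, RMSE, correlation) are standard, but the data and the 10-day aggregation choice are purely illustrative.

```python
import numpy as np

def scores(product, gauges):
    """Bias, RMSE, and Pearson correlation of a gridded product against gauges."""
    p = np.asarray(product, dtype=float)
    o = np.asarray(gauges, dtype=float)
    err = p - o
    return err.mean(), np.sqrt((err ** 2).mean()), np.corrcoef(p, o)[0, 1]

# Invented 30-day precipitation series (mm/day) at one gauge location
rng = np.random.default_rng(1)
gauge = rng.gamma(shape=0.8, scale=4.0, size=30)
product = gauge + rng.normal(0.5, 1.5, size=30)   # product with a wet bias plus noise

daily = scores(product, gauge)
# Re-scoring on 10-day accumulations shows how skill depends on the temporal scale
agg = scores(product.reshape(3, 10).sum(axis=1), gauge.reshape(3, 10).sum(axis=1))
```

    Note that the mean bias simply scales with the accumulation length, while RMSE and correlation can change non-trivially, which is why products are compared at multiple temporal and spatial scales.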

  5. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutron sensor in three different operating modes: pulse, fluctuation and current. The study described in this note comprises three parts: - A theoretical study of the large scale channel and a brief description of it are given, together with the results obtained so far in this domain. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale switching is described and the test results are given. In this large scale channel, the data processing method is analog. - To become independent of the problems raised by analog processing of the fluctuation signal, a digital data processing method is tested and its validity demonstrated. The results obtained on a test system built according to this method are given and a preliminary plan for further research is defined [fr

  6. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtimes are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. A shared-memory architecture is used to construct the similarity matrix, and a distributed system is used for the affinity propagation algorithm because of its large memory size and great computing capacity. An appropriate scheme of data partitioning and reduction is designed in our method in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
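    A compact serial reference version of the message-passing updates clarifies what the parallel architectures above accelerate. This is a generic affinity propagation sketch on invented 2-D points, not the authors' parallel implementation; setting the preference to the median similarity is a common default, not a choice from the paper.

```python
import numpy as np

def affinity_propagation(S, damping=0.5, iters=200):
    """Serial affinity propagation (responsibility/availability message passing)."""
    n = S.shape[0]
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    for _ in range(iters):
        # Responsibilities: r(i,k) = s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
        AS = A + S
        idx = AS.argmax(axis=1)
        first = AS[np.arange(n), idx]
        AS[np.arange(n), idx] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[np.arange(n), idx] = S[np.arange(n), idx] - second
        R = damping * R + (1 - damping) * Rnew
        # Availabilities: a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())
        Anew = Rp.sum(axis=0)[None, :] - Rp
        diag = Anew.diagonal().copy()
        Anew = np.minimum(Anew, 0)
        np.fill_diagonal(Anew, diag)
        A = damping * A + (1 - damping) * Anew
    # Each point's exemplar is argmax_k [a(i,k) + r(i,k)]
    assign = (A + R).argmax(axis=1)
    exemplars = np.unique(assign)
    labels = np.searchsorted(exemplars, assign)
    return exemplars, labels

# Two invented, well-separated clusters
X = np.array([[0, 0], [0.1, 0], [0, 0.1], [5, 5], [5.1, 5], [5, 5.1]])
S = -((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # negative squared Euclidean
off = S[~np.eye(len(X), dtype=bool)]
np.fill_diagonal(S, np.median(off))                    # preference = median similarity
exemplars, labels = affinity_propagation(S)
```

    The similarity matrix construction (the line building `S`) is the quadratic-cost step the paper assigns to the shared-memory architecture, while the iterative updates are what the distributed system parallelizes.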

  7. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
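    Broyden's secant update described above admits a compact dense sketch. The 2x2 test system, the starting point, and the one-time finite-difference seeding of the Jacobian are invented for illustration; the report's limited-memory variant would store update vectors rather than a full matrix.

```python
import numpy as np

def fd_jacobian(F, x, h=1e-7):
    """One-time forward-difference Jacobian to seed the Broyden approximation."""
    f0 = F(x)
    J = np.empty((f0.size, x.size))
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        J[:, i] = (F(x + e) - f0) / h
    return J

def broyden_solve(F, x0, tol=1e-10, max_iter=50):
    """Broyden's 'good' method: rank-one secant updates of an approximate Jacobian."""
    x = np.asarray(x0, dtype=float)
    B = fd_jacobian(F, x)            # no further Jacobian evaluations after this
    f = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(f) < tol:
            break
        s = np.linalg.solve(B, -f)   # quasi-Newton step
        x_new = x + s
        f_new = F(x_new)
        # Secant update: B <- B + (y - B s) s^T / (s^T s), with y = f_new - f
        y = f_new - f
        B += np.outer(y - B @ s, s) / (s @ s)
        x, f = x_new, f_new
    return x

# Invented 2x2 nonlinear test system: x^2 + y^2 = 4, x*y = 1
F = lambda v: np.array([v[0] ** 2 + v[1] ** 2 - 4.0, v[0] * v[1] - 1.0])
root = broyden_solve(F, np.array([2.0, 0.5]))
```

    After the initial seeding, each iteration costs only one residual evaluation, which is the property that lets such methods converge in codes that cannot evaluate a Jacobian.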

  8. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT]; Marzouk, Youssef [MIT]

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
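    The "reduce then sample" idea can be sketched in one dimension: tabulate an expensive log-posterior once, interpolate it as a cheap surrogate, and run Metropolis sampling on the surrogate. Everything here (the toy Gaussian posterior, grid, and proposal scale) is invented for illustration; the project itself concerns PDE-scale forward models and far more sophisticated reduced-order models.

```python
import numpy as np

def expensive_log_post(theta):
    """Stand-in for a costly forward-model evaluation: Gaussian posterior N(2, 0.5^2)."""
    return -0.5 * ((theta - 2.0) / 0.5) ** 2

# "Reduce": evaluate the log-posterior once on a coarse grid, then interpolate
grid = np.linspace(-3.0, 7.0, 201)
table = expensive_log_post(grid)
surrogate = lambda t: np.interp(t, grid, table)

# "Sample": random-walk Metropolis on the cheap surrogate only
rng = np.random.default_rng(0)
theta, samples = 0.0, []
logp = surrogate(theta)
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.5)
    logp_prop = surrogate(prop)
    if np.log(rng.random()) < logp_prop - logp:   # Metropolis accept/reject
        theta, logp = prop, logp_prop
    samples.append(theta)
post = np.array(samples[2000:])                   # discard burn-in
```

    The chain recovers the posterior mean and spread while touching the expensive model only 201 times, which is the computational saving the reduced-order approach aims for at scale.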

  9. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  10. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
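    Giraph programs follow the Pregel "think like a vertex" model the two records above describe: in each superstep a vertex processes its incoming messages, updates its value, and sends messages along its edges, until no messages remain. The toy single-source shortest-paths loop below mimics that model in plain Python; it illustrates the abstraction only and is not Giraph's Java API.

```python
from collections import defaultdict

def pregel_sssp(edges, source):
    """Toy vertex-centric single-source shortest paths in Pregel-style supersteps."""
    graph = defaultdict(list)
    vertices = set()
    for u, v, w in edges:
        graph[u].append((v, w))
        vertices.update((u, v))
    value = {v: float("inf") for v in vertices}   # per-vertex state
    messages = {source: [0.0]}                    # initial message to the source
    while messages:                               # one superstep per loop iteration
        outbox = defaultdict(list)
        for v, inbox in messages.items():
            best = min(inbox)
            if best < value[v]:                   # vertex improves its value...
                value[v] = best
                for nbr, w in graph[v]:           # ...and notifies its neighbors
                    outbox[nbr].append(best + w)
        messages = outbox                         # computation halts when no messages remain
    return value

value = pregel_sssp([("a", "b", 1), ("b", "c", 2), ("a", "c", 5), ("c", "d", 1)], "a")
```

    In Giraph the same logic would live in a `compute()` method executed in parallel across workers, with the framework handling message delivery between supersteps.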

  11. Comprehensive research concerning the development of effective utilizing techniques of biological resources (large scale research out of the framework). Seibutsu shigen no koritsuteki riyo gijutsu no kaihatsu ni kansuru sogo kenkyu (ogata betsuwaku kenkyu)

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-25

    This report describes research on the cultivated production of forest resources, their conversion to useful substances, and the development of systems rooted in the regions. Distribution maps of biological resources were prepared for each woodland system: nationwide maps of the resource amounts, available amounts, and tree species of broadleaf trees in private and national forests in Japan. Toward establishing cultivation techniques for super-short-rotation forestry, i.e., pursuing wood growth to its maximum limit, 26 superior clone lines were selected from the willow group through tree cultivation research, and the highest-yielding clone reached 24 t/ha per year. For material preparation using microbial enzymes, i.e., the creation and breeding of fungi with high lignin-decomposing power, a fungal strain with high lignin-decomposing power and decomposition selectivity was created by cell fusion and ultraviolet (UV) irradiation. Regarding the use of effective components of wood resources, many useful characteristics were found by applying boiling, steam-explosion, and ozone treatments. For mushroom cultivation using unused tree species, a new edible mushroom was selected and the possibility of fruit-body formation was clarified. The development of a new material from conifers is promising. 1 tab.

  12. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) = v^2/2 - M cos x - P cos k(x-t), which describes the motion of one particle in two electrostatic waves. A renormalization transformation Tsub(r) is described which acts as a microscope that focuses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, Tsub(r) yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance, the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of Cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation describing the stability of a discrete family of cycles is derived. When combined with Tsub(r), it makes it possible to prove the link between KAM tori and nearby cycles conjectured by J. Greene and, in particular, to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)
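    The onset of large scale stochasticity in Chirikov's standard mapping, mentioned above, can be probed numerically with a two-orbit Lyapunov-exponent estimate. The map parameters, initial condition, and thresholds below are illustrative choices, not values from the paper.

```python
import numpy as np

def lyapunov_standard_map(K, n=2000, d0=1e-8, x=1.0, p=0.5):
    """Largest Lyapunov exponent of the standard map p' = p + K sin x, x' = x + p'."""
    x2, p2 = x + d0, p                  # shadow orbit displaced by d0
    total = 0.0
    for _ in range(n):
        p += K * np.sin(x)
        x = (x + p) % (2.0 * np.pi)
        p2 += K * np.sin(x2)
        x2 = (x2 + p2) % (2.0 * np.pi)
        dx = (x2 - x + np.pi) % (2.0 * np.pi) - np.pi   # wrapped angle difference
        dp = p2 - p
        d = np.hypot(dx, dp)
        total += np.log(d / d0)
        # Renormalize the shadow orbit back to separation d0 along the current direction
        x2 = x + dx * (d0 / d)
        p2 = p + dp * (d0 / d)
    return total / n

chaotic = lyapunov_standard_map(5.0)    # well above the stochasticity threshold
regular = lyapunov_standard_map(0.1)    # far below it: KAM tori survive
```

    A clearly positive exponent signals the stochastic regime, while the near-zero value at small K reflects the surviving KAM tori that the renormalization analysis characterizes.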

  13. Large scale molecular simulations of nanotoxicity.

    Science.gov (United States)

    Jimenez-Cruz, Camilo A; Kang, Seung-gu; Zhou, Ruhong

    2014-01-01

    The widespread use of nanomaterials in biomedical applications has been accompanied by an increasing interest in understanding their interactions with tissues, cells, and biomolecules, and in particular, in how they might affect the integrity of cell membranes and proteins. In this mini-review, we present a summary of some of the recent studies on this important subject, especially from the point of view of large scale molecular simulations. The carbon-based nanomaterials and noble metal nanoparticles are the main focus, with additional discussions on quantum dots and other nanoparticles as well. The driving forces for adsorption of fullerenes, carbon nanotubes, and graphene nanosheets onto proteins or cell membranes are found to be mainly hydrophobic interactions and the so-called π-π stacking (between aromatic rings), while for the noble metal nanoparticles the long-range electrostatic interactions play a bigger role. More interestingly, there is also growing evidence that nanotoxicity can have implications in de novo design of nanomedicine. For example, the endohedral metallofullerenol Gd@C₈₂(OH)₂₂ is shown to inhibit tumor growth and metastasis by inhibiting the enzyme MMP-9, and graphene is illustrated to disrupt bacteria cell membranes by insertion/cutting as well as destructive extraction of lipid molecules. These recent findings have provided a better understanding of nanotoxicity at the molecular level and also suggested therapeutic potential by using the cytotoxicity of nanoparticles against cancer or bacteria cells. © 2014 Wiley Periodicals, Inc.

  14. Large-scale tides in general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Ip, Hiu Yan; Schmidt, Fabian, E-mail: iphys@mpa-garching.mpg.de, E-mail: fabians@mpa-garching.mpg.de [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the 'separate universe' paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  15. Food appropriation through large scale land acquisitions

    International Nuclear Information System (INIS)

    Cristina Rulli, Maria; D’Odorico, Paolo

    2014-01-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300–550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190–370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security for the local populations. (letter)

  16. Large Scale EOF Analysis of Climate Data

    Science.gov (United States)

    Prabhat, M.; Gittens, A.; Kashinath, K.; Cavanaugh, N. R.; Mahoney, M.

    2016-12-01

    We present a distributed approach to extracting EOFs from 3D climate data. We implement the method in Apache Spark and process multi-TB-sized datasets on O(1000-10,000) cores. We apply this method to latitude-weighted ocean temperature data from CFSR, a 2.2-terabyte data set comprising ocean and subsurface reanalysis measurements collected at 41 levels in the ocean, at 6-hour intervals over 31 years. We extract the first 100 EOFs of this full data set and compare them to the EOFs computed only on the surface temperature field. Our analyses provide evidence of Kelvin and Rossby waves and components of large-scale modes of oscillation, including the ENSO and PDO, that are not visible in the usual SST EOFs. Further, they provide information on the most influential parts of the ocean, such as the thermocline, that exist below the surface. Work is ongoing to understand the factors determining the depth-varying spatial patterns observed in the EOFs. We will experiment with weighting schemes to appropriately account for the differing depths of the observations. We also plan to apply the same distributed approach to the analysis of 3D atmospheric climate data sets, including multiple variables. Because the atmosphere changes on a quicker time scale than the ocean, we expect the results to demonstrate an even greater advantage to computing 3D EOFs in lieu of 2D EOFs.
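    The EOF computation itself reduces to an SVD of the mean-removed (time, space) data matrix, which is what a distributed implementation like the Spark one above parallelizes. The synthetic space-time field below is invented; for real ocean data one would also apply the latitude/area weights mentioned in the abstract.

```python
import numpy as np

def eofs(field, k=3):
    """Leading EOFs of a (time, space) data matrix via SVD of the anomaly matrix."""
    anom = field - field.mean(axis=0)           # remove the time mean at each grid point
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    patterns = Vt[:k]                           # spatial EOF patterns
    pcs = U[:, :k] * s[:k]                      # principal-component time series
    variance_frac = s[:k] ** 2 / (s ** 2).sum() # fraction of variance explained
    return patterns, pcs, variance_frac

# Synthetic field: one dominant standing oscillation plus weak noise
rng = np.random.default_rng(0)
t = np.linspace(0, 20, 500)[:, None]            # 500 time steps
space = np.linspace(0, 1, 80)[None, :]          # 80 grid points
field = np.sin(2 * np.pi * 0.3 * t) * np.sin(np.pi * space) \
        + 0.05 * rng.standard_normal((500, 80))
patterns, pcs, var = eofs(field, k=2)
```

    The first EOF recovers the planted standing pattern and captures nearly all of the variance, while the higher modes pick up only noise.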

  17. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally in the nonlinear regime because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter

  18. Studies of land-cover, land-use, and biophysical properties of vegetation in the Large Scale Biosphere Atmosphere experiment in Amazonia.

    Science.gov (United States)

    Dar A. Roberts; Michael Keller; Joao Vianei Soares

    2003-01-01

    We summarize early research on land-cover, land-use, and biophysical properties of vegetation from the Large Scale Biosphere Atmosphere (LBA) experiment in Amazônia. LBA is an international research program developed to evaluate regional function and to determine how land-use and climate modify biological, chemical and physical processes there. Remote sensing has...

  19. Component fragility research program

    International Nuclear Information System (INIS)

    Tsai, N.C.; Mochizuki, G.L.; Holman, G.S.

    1989-11-01

    To demonstrate how "high-level" qualification test data can be used to estimate the ultimate seismic capacity of nuclear power plant equipment, we assessed in detail various electrical components tested by the Pacific Gas & Electric Company for its Diablo Canyon plant. As part of our Phase I Component Fragility Research Program, we evaluated seismic fragility for five Diablo Canyon components: medium-voltage (4kV) switchgear; safeguard relay board; emergency light battery pack; potential transformer; and station battery and racks. This report discusses our Phase II fragility evaluation of a single Westinghouse Type W motor control center column, a fan cooler motor controller, and three local starters at the Diablo Canyon nuclear power plant. These components were seismically qualified by means of biaxial random motion tests on a shaker table, and the test response spectra formed the basis for the estimate of the seismic capacity of the components. The seismic capacity of each component is referenced to the zero period acceleration (ZPA) and, in our Phase II study only, to the average spectral acceleration (ASA) of the motion at its base. For the motor control center, the seismic capacity was compared to the capacity of a Westinghouse Five-Star MCC subjected to actual fragility tests by LLNL during the Phase I Component Fragility Research Program, and to generic capacities developed by the Brookhaven National Laboratory for motor control centers. Except for the medium-voltage switchgear, all of the components considered in both our Phase I and Phase II evaluations were qualified in their standard commercial configurations or with only relatively minor modifications such as top bracing of cabinets. 8 refs., 67 figs., 7 tabs

  20. Achievement report for fiscal 1976 on research in materials for electrodes and insulation walls. Large-scale technology development (Research and development of magnetohydrodynamic power generation); 1976 nendo denkyoku oyobi zetsuenheki zairyo ni kansuru kenkyu seika

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1977-06-01

    This report covers the achievements attained in fiscal 1976 by the materials working group engaged in the study of materials for electrodes and insulation walls. Fabricated and tested in the study relative to the experimental fabrication of materials for magnetohydrodynamic (MHD) power generation are MgO-Si{sub 3}N{sub 4} based insulation materials, MgO-BN based insulation materials, tin oxide based electrode materials, ZrO{sub 2}-CeO{sub 2} based cold pressed electrode materials, cermet based electrode materials, etc. In the research on basic characteristics and measurement, various electrode materials and insulation wall materials are subjected to a 300-hour K{sub 2}SO{sub 4} corrosion test at 1,300 degrees C. In the simulation of MHD power generation, correlations are investigated between materials, cooling structures, and dynamic characteristics, and data are collected to enable the prediction of performance and consumption of the materials during power generation. A data processing system is developed for the said simulation, and this enhances experimenting efficiency. In the study of insulation wall structures and electrode phenomena, studies are conducted about the thermal stress in power generation duct wall materials, localized anomalous heating due to arc spots, and the transfer of heat between the power generation duct wall materials and the cooling material. (NEDO)

  1. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  2. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous
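    The display-aware principle described in this course abstract, making work proportional to what is visible on screen, typically comes down to choosing, per visible block, the coarsest resolution level that still supplies roughly one texel per screen pixel. A small illustrative sketch of that level selection (a generic virtual-texturing idea, not code from any particular system; the function name is ours):

```python
import math

def required_level(texel_extent, screen_pixels, num_levels):
    """Pick the coarsest resolution level that still gives about one
    texel per screen pixel for a block spanning `texel_extent` texels at
    full resolution and covering `screen_pixels` pixels on screen.
    Level 0 is full resolution; each level halves the resolution."""
    if screen_pixels <= 0:
        return num_levels - 1  # block not visible: coarsest level suffices
    # Downsampling by 2 per level halves the block's footprint in texels.
    level = math.log2(max(texel_extent / screen_pixels, 1.0))
    return min(int(level), num_levels - 1)
```

    A block that shrinks on screen is automatically served from a coarser level, which is what keeps the GPU working set bounded by screen resolution rather than by data size.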

  3. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  4. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology. Enables practitioners to study distributed large scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  5. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  6. Development of Large-Scale Spacecraft Fire Safety Experiments

    DEFF Research Database (Denmark)

    Ruff, Gary A.; Urban, David L.; Fernandez-Pello, A. Carlos

    2013-01-01

    exploration missions outside of low-earth orbit and accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low...... of the spacecraft fire safety risk. The activity of this project is supported by an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The large-scale space flight experiment will be conducted in an Orbital Sciences...

  7. Large Scale Simulations of the Euler Equations on GPU Clusters

    KAUST Repository

    Liebmann, Manfred

    2010-08-01

    The paper investigates the scalability of a parallel Euler solver, using the Vijayasundaram method, on a GPU cluster with 32 Nvidia Geforce GTX 295 boards. The aim of this research is to enable large scale fluid dynamics simulations with up to one billion elements. We investigate communication protocols for the GPU cluster to compensate for the slow Gigabit Ethernet network between the GPU compute nodes and to maintain overall efficiency. A diesel engine intake-port and a nozzle, meshed in different resolutions, give good real world examples for the scalability tests on the GPU cluster. © 2010 IEEE.

  8. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both volume and complexity of biomedical data collected from various sources. The planet-size data brings serious challenges to storage and computing technologies. Cloud computing is an attractive alternative because it addresses both storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications would facilitate biomedical research to make the vast amount of diverse data meaningful and usable.

  9. Properties of large-scale methane/hydrogen jet fires

    Energy Technology Data Exchange (ETDEWEB)

    Studer, E. [CEA Saclay, DEN, LTMF Heat Transfer and Fluid Mech Lab, 91 - Gif-sur-Yvette (France); Jamois, D.; Leroy, G.; Hebrard, J. [INERIS, F-60150 Verneuil En Halatte (France); Jallais, S. [Air Liquide, F-78350 Jouy En Josas (France); Blanchetiere, V. [GDF SUEZ, 93 - La Plaine St Denis (France)

    2009-12-15

    A future economy based on reduced use of carbon-based fuels for power generation and transportation may consider hydrogen as a possible energy carrier. Extensive and widespread use of hydrogen might require a pipeline network. The alternatives might be the use of the existing natural gas network or the design of a dedicated network. Whatever the solution, mixing hydrogen with natural gas will substantially modify the consequences of accidents. The French National Research Agency (ANR) funded project HYDROMEL focuses on these critical questions. Within this project, large-scale jet fires have been studied experimentally and numerically. The main characteristics of these flames, including visible length, radiation fluxes and blowout, have been assessed. (authors)

  10. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
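    Strong-scaling results like the 40K-processor runs mentioned above are conventionally reported as parallel efficiency: the speedup over a reference run divided by the growth in processor count. A minimal sketch of the metric with illustrative numbers (not measurements from the TeraShake runs):

```python
def strong_scaling_efficiency(t_ref, p_ref, t_p, p):
    """Parallel efficiency of a strong-scaling run (fixed problem size):
    speedup over the reference run divided by the processor-count ratio.
    1.0 means perfect scaling; values drop as communication dominates."""
    speedup = t_ref / t_p
    return speedup / (p / p_ref)
```

    For example, halving the runtime while quadrupling the processor count gives an efficiency of 0.5, which is the kind of loss the communication-hiding optimizations described in the chapter are meant to avoid.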

  11. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  12. Research report on the effect of the large-scale industrial technology development system and on how it should be in the future; Ogata kogyo gijutsu kaihatsu seido no seika oyobi kongo no arikata ni kansuru chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1986-03-01

    A survey was done about projects implemented under the above-named development system inaugurated in fiscal 1966, and studies are made as to how large projects should be in the future. The survey covered the subjects which had been completed by fiscal 1985, that is, the remotely controlled submarine drilling device for oil, seawater desalination and by-product utilization, electric vehicle, technology of comprehensive control of automobiles, pattern information processing system, direct iron making by use of high-temperature reduced gas, manufacture of olefines from heavy oil, aviation jet engine, resources recycling/reuse system, superhigh-performance laser-aided combined manufacturing system, submarine oil production system, and the optics-aided measurement/control system. Answers were heard from corporations concerned. The answers contained some complaints, concerning the shortage of experience on the part of participating corporations, degradation in planning functions, increase in the burden of leading companies, shortage of study or conference about an optimum promotion system, problems in accounting and auditing systems, etc., and suggestions were presented for improvement on large-scale projects. (NEDO)

  13. Large scale dynamics of protoplanetary discs

    Science.gov (United States)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other side, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the disk and the neutral gas is done via electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences on planetary formation. As a second step, I study the launching of disk winds via a global model of stratified disk embedded in a warm atmosphere. 
This model is the first to compute non-ideal effects from

  14. Large-scale fuel cycle centres

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The US Nuclear Regulatory Commission (NRC) has considered the nuclear energy centre concept for fuel cycle plants in the Nuclear Energy Centre Site Survey 1975 (NECSS-75) Rep. No. NUREG-0001, an important study mandated by the US Congress in the Energy Reorganization Act of 1974 which created the NRC. For this study, the NRC defined fuel cycle centres as consisting of fuel reprocessing and mixed-oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle centre sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000MW(e). The types of fuel cycle facilities located at the fuel cycle centre permit the assessment of the role of fuel cycle centres in enhancing the safeguard of strategic special nuclear materials - plutonium and mixed oxides. Siting fuel cycle centres presents a smaller problem than siting reactors. A single reprocessing plant of the scale projected for use in the USA (1500-2000t/a) can reprocess fuel from reactors producing 50,000-65,000MW(e). Only two or three fuel cycle centres of the upper limit size considered in the NECSS-75 would be required in the USA by the year 2000. The NECSS-75 fuel cycle centre evaluation showed that large-scale fuel cycle centres present no real technical siting difficulties from a radiological effluent and safety standpoint. Some construction economies may be achievable with fuel cycle centres, which offer opportunities to improve waste-management systems. Combined centres consisting of reactors and fuel reprocessing and mixed-oxide fuel fabrication plants were also studied in the NECSS. Such centres can eliminate shipment not only of Pu but also mixed-oxide fuel. Increased fuel cycle costs result from implementation of combined centres unless the fuel reprocessing plants are commercial-sized. Development of Pu-burning reactors could reduce any economic penalties of combined centres. 
The need for effective fissile

  15. Matrix Sampling of Items in Large-Scale Assessments

    Directory of Open Access Journals (Sweden)

    Ruth A. Childs

    2003-07-01

    Matrix sampling of items, that is, division of a set of items into different versions of a test form, is used by several large-scale testing programs. Like other test designs, matrixed designs have both advantages and disadvantages. For example, testing time per student is less than if each student received all the items, but the comparability of student scores may decrease. Also, curriculum coverage is maintained, but reporting of scores becomes more complex. In this paper, matrixed designs are compared with more traditional designs in nine categories of costs: development costs, materials costs, administration costs, educational costs, scoring costs, reliability costs, comparability costs, validity costs, and reporting costs. In choosing among test designs, a testing program should examine the costs in light of its mandate(s), the content of the tests, and the financial resources available, among other considerations.
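    The core trade-off can be made concrete with a minimal matrix-sampling sketch: spiral the item pool across forms so every item is administered to someone, while each student answers only a fraction of the pool. This is a hypothetical helper for illustration, not any testing program's actual design:

```python
def matrix_sample(items, n_forms):
    """Divide an item pool across n_forms test forms by spiraling:
    item i goes to form i mod n_forms. Every item appears on exactly one
    form (curriculum coverage is preserved), but each student sees only
    about len(items) / n_forms items (shorter testing time), at the cost
    of forms that are not directly comparable item-for-item."""
    forms = [[] for _ in range(n_forms)]
    for i, item in enumerate(items):
        forms[i % n_forms].append(item)
    return forms
```

    With 12 items and 3 forms, each student answers 4 items, yet all 12 items yield data, which is exactly the coverage-versus-comparability trade described in the paper.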

  16. Large scale injection test (LASGIT) modelling

    International Nuclear Information System (INIS)

    Arnedo, D.; Olivella, S.; Alonso, E.E.

    2010-01-01

    Document available in extended abstract form only. With the objective of understanding the gas flow processes through clay barriers in schemes of radioactive waste disposal, the Lasgit in situ experiment was planned and is currently in progress. Modelling the experiment will permit a better understanding of the responses, confirmation of hypotheses about mechanisms and processes, and lessons for the design of future experiments. The experiment and modelling activities are included in the project FORGE (FP7). The in situ large scale injection test Lasgit is currently being performed at the Aespoe Hard Rock Laboratory by SKB and BGS. A schematic layout of the test is shown. The deposition hole follows the KBS3 scheme. A copper canister is installed along the axis of the deposition hole, surrounded by blocks of highly compacted MX-80 bentonite. A concrete plug is placed at the top of the buffer. A metallic lid anchored to the surrounding host rock is included in order to prevent vertical movements of the whole system during gas injection stages (high gas injection pressures are expected to be reached). Hydration of the buffer material is achieved by injecting water through filter mats, two placed at the rock walls and two at the interfaces between bentonite blocks. Water is also injected through the 12 canister filters. Gas injection stages are performed injecting gas to some of the canister injection filters. Since the water pressure and the stresses (swelling pressure development) will be high during gas injection, it is necessary to inject at high gas pressures. This implies mechanical couplings as gas penetrates after the gas entry pressure is achieved and may produce deformations which in turn lead to permeability increases. A 3D hydro-mechanical numerical model of the test using CODE-BRIGHT is presented. The domain considered for the modelling is shown. 
The materials considered in the simulation are the MX-80 bentonite blocks (cylinders and rings), the concrete plug

  17. Large-scale fuel cycle centers

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The United States Nuclear Regulatory Commission (NRC) has considered the nuclear energy center concept for fuel cycle plants in the Nuclear Energy Center Site Survey - 1975 (NECSS-75) -- an important study mandated by the U.S. Congress in the Energy Reorganization Act of 1974 which created the NRC. For the study, NRC defined fuel cycle centers to consist of fuel reprocessing and mixed oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle center sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000 - 300,000 MWe. The types of fuel cycle facilities located at the fuel cycle center permit the assessment of the role of fuel cycle centers in enhancing safeguarding of strategic special nuclear materials -- plutonium and mixed oxides. Siting of fuel cycle centers presents a considerably smaller problem than the siting of reactors. A single reprocessing plant of the scale projected for use in the United States (1500-2000 MT/yr) can reprocess the fuel from reactors producing 50,000-65,000 MWe. Only two or three fuel cycle centers of the upper limit size considered in the NECSS-75 would be required in the United States by the year 2000. The NECSS-75 fuel cycle center evaluations showed that large scale fuel cycle centers present no real technical difficulties in siting from a radiological effluent and safety standpoint. Some construction economies may be attainable with fuel cycle centers; such centers offer opportunities for improved waste management systems. Combined centers consisting of reactors and fuel reprocessing and mixed oxide fuel fabrication plants were also studied in the NECSS. Such centers can eliminate not only shipment of plutonium, but also mixed oxide fuel. Increased fuel cycle costs result from implementation of combined centers unless the fuel reprocessing plants are commercial-sized. Development of plutonium-burning reactors could reduce any

  18. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall...... objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore...

  19. Research Programs & Initiatives

    Science.gov (United States)

    CGH develops international initiatives and collaborates with other NCI divisions, NCI-designated Cancer Centers, and other countries to support cancer control planning, encourage capacity building, and support cancer research and research networks.

  20. Large-scale analysis of malware downloaders

    NARCIS (Netherlands)

    Rossow, Christian; Dietrich, Christian; Bos, Herbert

    2013-01-01

    Downloaders are malicious programs with the goal to subversively download and install malware (eggs) on a victim's machine. In this paper, we analyze and characterize 23 Windows-based malware downloaders. We first show a high diversity in downloaders' communication architectures (e.g., P2P), carrier

  1. Software for large scale tracking studies

    International Nuclear Information System (INIS)

    Niederer, J.

    1984-05-01

    Over the past few years, Brookhaven accelerator physicists have been adapting particle tracking programs in planning local storage rings, and lately for SSC reference designs. In addition, the Laboratory is actively considering upgrades to its AGS capabilities aimed at higher proton intensity, polarized proton beams, and heavy ion acceleration. Further activity concerns heavy ion transfer, a proposed booster, and most recently design studies for a heavy ion collider to join to this complex. Circumstances have thus encouraged a search for common features among design and modeling programs and their data, and the corresponding controls efforts among present and tentative machines. Using a version of PATRICIA with nonlinear forces as a vehicle, we have experimented with formal ways to describe accelerator lattice problems to computers as well as to speed up the calculations for large storage ring models. Code treated by straightforward reorganization has served for SSC explorations. The representation work has led to a relational data base centered program, LILA, which has desirable properties for dealing with the many thousands of rapidly changing variables in tracking and other model programs. 13 references

  2. 7. Framework Research Program

    International Nuclear Information System (INIS)

    Donghi, C.; Pieri, Alberto; Manzini, G.

    2006-01-01

    The EU intends to address the problem of insufficient investment in the research and development field. In particular, research policies pursue three main goals: strengthening scientific excellence in Europe; increasing total investment in research; and realizing the European Research Area. [it]

  3. Equipment qualification research program: program plan

    International Nuclear Information System (INIS)

    Dong, R.G.; Smith, P.D.

    1982-01-01

    The Lawrence Livermore National Laboratory (LLNL) under the sponsorship of the US Nuclear Regulatory Commission (NRC) has developed this program plan for research in equipment qualification (EQA). In this report the research program which will be executed in accordance with this plan will be referred to as the Equipment Qualification Research Program (EQRP). Covered are electrical and mechanical equipment under the conditions described in the OBJECTIVE section of this report. The EQRP has two phases; Phase I is primarily to produce early results and to develop information for Phase II. Phase I will last 18 months and consists of six projects. The first project is program management. The second project is responsible for in-depth evaluation and review of EQ issues and EQ processes. The third project is responsible for detailed planning to initiate Phase II. The remaining three projects address specific equipment; i.e., valves, electrical equipment, and a pump

  4. GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics1

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Wickramaarachchi, Charith; Nagarkar, Soonil; Ravi, Santosh; Raghavendra, Cauligi; Prasanna, Viktor

    2014-08-25

    Large scale graph processing is a major research area for Big Data exploration. Vertex centric programming models like Pregel are gaining traction due to their simple abstraction that naturally allows for scalable execution on distributed systems. However, this approach has limitations that cause vertex centric algorithms to under-perform due to a poor compute-to-communication ratio and slow convergence across iterative supersteps. In this paper we introduce GoFFish, a scalable sub-graph centric framework co-designed with a distributed persistent graph storage for large scale graph analytics on commodity clusters. We introduce a sub-graph centric programming abstraction that combines the scalability of a vertex centric approach with the flexibility of shared memory sub-graph computation. We map Connected Components, SSSP and PageRank algorithms to this model to illustrate its flexibility. Further, we empirically analyze GoFFish using several real world graphs and demonstrate its significant performance improvement, orders of magnitude in some cases, compared to Apache Giraph, the leading open source vertex centric implementation.
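    The sub-graph centric idea can be sketched in a few lines: each partition first resolves connectivity locally in shared memory, and only component labels cross partition boundaries. Below is a minimal single-process illustration of that two-phase structure for Connected Components; all names are ours for illustration, not GoFFish's actual API.

    ```python
    # Illustrative sketch of sub-graph centric Connected Components:
    # phase 1 works entirely inside each partition (shared memory);
    # phase 2 only exchanges component labels across cut edges.
    def local_components(vertices, edges):
        """Union-find over one partition; returns vertex -> minimum label.

        Vertex ids must be orderable (ints here)."""
        parent = {v: v for v in vertices}

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]  # path halving
                v = parent[v]
            return v

        for u, v in edges:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[max(ru, rv)] = min(ru, rv)
        return {v: find(v) for v in vertices}

    def connected_components(partitions, cut_edges):
        """partitions: list of (vertices, edges); cut_edges cross partitions."""
        # Phase 1: each partition resolves its own connectivity locally.
        label = {}
        for vs, es in partitions:
            label.update(local_components(vs, es))
        # Phase 2: treat each local component as a super-vertex; cut edges
        # become edges between super-vertices, a much smaller global problem.
        comp_ids = set(label.values())
        comp_edges = {(label[u], label[v]) for u, v in cut_edges}
        comp_label = local_components(comp_ids, comp_edges)
        return {v: comp_label[c] for v, c in label.items()}
    ```

    The communication-bound work (phase 2) scales with the number of local components and cut edges rather than with the number of vertices, which is the essence of the claimed advantage over per-vertex supersteps.
    
    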

  5. Dose monitoring in large-scale flowing aqueous media

    International Nuclear Information System (INIS)

    Kuruca, C.N.

    1995-01-01

    The Miami Electron Beam Research Facility (EBRF) has been in operation for six years. The EBRF houses a 1.5 MV, 75 kW DC scanned electron beam. Experiments have been conducted to evaluate the effectiveness of high-energy electron irradiation in the removal of toxic organic chemicals from contaminated water and the disinfection of various wastewater streams. The large-scale plant operates at approximately 450 L/min (120 gal/min). The radiation dose absorbed by the flowing aqueous streams is estimated by measuring the difference in water temperature before and after it passes in front of the beam. Temperature measurements are made using resistance temperature devices (RTDs) and recorded by computer along with other operating parameters. Estimated dose is obtained from the measured temperature differences using the specific heat of water. This presentation will discuss experience with this measurement system, its application to different water presentation devices, sources of error, and the advantages and disadvantages of its use in large-scale process applications
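    The calorimetric estimate described above follows directly from the definition of absorbed dose (1 Gy = 1 J/kg) and the specific heat of water; a minimal sketch, assuming all deposited beam energy appears as heat and neglecting losses:

    ```python
    # Hedged sketch: estimating absorbed dose in flowing water from the
    # temperature rise across the beam, assuming all energy appears as heat.
    CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

    def estimated_dose_gray(t_in_c, t_out_c):
        """Absorbed dose in Gy (J/kg) from inlet/outlet temperatures in deg C."""
        delta_t = t_out_c - t_in_c
        return CP_WATER * delta_t  # 1 Gy = 1 J/kg

    # A temperature rise of about 1.2 K corresponds to roughly 5 kGy.
    print(round(estimated_dose_gray(20.0, 21.2) / 1000.0, 2))  # dose in kGy
    ```

    This is why the method needs accurate RTDs: at these flow rates, each kilogray of dose shows up as only about a quarter of a degree of warming.
    
    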

  6. Thermal power generation projects ``Large Scale Solar Heating``; EU-Thermie-Projekte ``Large Scale Solar Heating``

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the ``Large Scale Solar Heating`` programme for Europe-wide further development of this technology. The demonstration programme derived from it was judged favourably by the experts but was not accepted for funding at the first attempt (1996). In November 1997 the EU Commission provided 1.5 million ECU at short notice, allowing an updated project proposal to be realised. A smaller project had already been approved in mid-1997; it was applied for under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves technology transfer. (orig.) [Deutsch] Ziel dieses Vorhabens ist die Vorbereitung eines Schwerpunktprogramms `Large Scale Solar Heating`, mit dem die Technologie europaweit weiterentwickelt werden sollte. Das daraus entwickelte Demonstrationsprogramm wurde von den Gutachtern positiv bewertet, konnte jedoch nicht auf Anhieb (1996) in die Foerderung aufgenommen werden. Im November 1997 wurden von der EU-Kommission dann kurzfristig noch 1,5 Mio ECU an Foerderung bewilligt, mit denen ein aktualisierter Projektvorschlag realisiert werden kann. Bereits Mitte 1997 wurde ein kleineres Vorhaben bewilligt, das unter Federfuehrung von Chalmers Industriteknik (CIT) in Schweden beantragt worden war und das vor allem dem Technologietransfer dient. (orig.)

  7. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    Science.gov (United States)

    2012-01-01

    The disadvantages of ML-Chord are its fixed size (two layers) and limited scalability for large-scale systems. RC-Chord extends ML-Chord (D. Karrels et al.) ... configurable before runtime. This can be improved by incorporating a distributed learning algorithm to tune the number and range of the DLoE tracking

  8. Aquatic Plant Control Research Program. Large-Scale Operations Management Test of Use of the White Amur for Control of Problem Aquatic Plants. Report 5. Synthesis Report.

    Science.gov (United States)

    1984-06-01

    in 1977. Precipitation in the form of snowflakes, snow pellets, or sleet is rare, although hail is fairly common during storms (National Oceanic and...were the green tree frog, Florida cricket frog, peninsula cooter, and southern leopard frog, which comprised 22.0, 11.4, 7.5, and 4.1 percent of the...Figure 24. Mean monthly abundances of benthic organisms, prestocking period. During the second poststocking year this genus was

  9. Research, monitoring and evaluation of fish and wildlife restoration projects in the Columbia River Basin: Lessons learned and suggestions for large-scale monitoring programs.

    Science.gov (United States)

    Lyman L. McDonald; Robert Bilby; Peter A. Bisson; Charles C. Coutant; John M. Epifanio; Daniel Goodman; Susan Hanna; Nancy Huntly; Erik Merrill; Brian Riddell; William Liss; Eric J. Loudenslager; David P. Philipp; William Smoker; Richard R. Whitney; Richard N. Williams

    2007-01-01

    The year 2006 marked two milestones in the Columbia River Basin and the Pacific Northwest region's efforts to rebuild its once great salmon and steelhead runs: the 25th anniversary of the creation of the Northwest Power and Conservation Council and the 10th anniversary of an amendment to the Northwest Power Act that formalized scientific peer review of the council...

  10. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom; Femiani, John; Wonka, Peter; Mitra, Niloy J.

    2017-01-01

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data-sources are typically noisy, and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.
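    As a toy illustration of globally balancing sources of error with a binary integer program (this is our construction, not BigSUR's actual formulation), the sketch below trades per-element data penalties against a pairwise smoothness term and finds the global optimum by brute force; real instances of this scale require a proper ILP solver.

    ```python
    # Toy binary integer program: pick a 0/1 label per element so that unary
    # penalties from noisy data sources plus a coupling cost that charges 1.0
    # whenever adjacent elements disagree are globally minimized.
    from itertools import product

    def solve_binary_program(penalties):
        """penalties: list of (cost_if_0, cost_if_1) per variable."""
        n = len(penalties)
        best_cost, best_x = float("inf"), None
        for x in product((0, 1), repeat=n):  # exhaustive search, toy size only
            cost = sum(p[v] for p, v in zip(penalties, x))
            cost += sum(1.0 for i in range(n - 1) if x[i] != x[i + 1])
            if cost < best_cost:
                best_cost, best_x = cost, x
        return best_x, best_cost

    # The middle element locally prefers label 1 (0.2 vs 1.5), but the global
    # optimum keeps all labels at 0 because flipping it costs 2.0 in coupling.
    x, cost = solve_binary_program([(0.0, 2.0), (1.5, 0.2), (0.0, 3.0)])
    ```

    The point of the example is the "globally balances" phrase: the best joint labeling can contradict the locally best choice for an individual element.
    
    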

  11. Large Scale GW Calculations on the Cori System

    Science.gov (United States)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  12. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom

    2017-11-22

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data-sources are typically noisy, and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.

  13. Automatic Installation and Configuration for Large Scale Farms

    CERN Document Server

    Novák, J

    2005-01-01

    Since the early appearance of commodity hardware, the utilization of computers rose rapidly, and they became essential in all areas of life. Soon it was realized that nodes are able to work cooperatively in order to solve new, more complex tasks. This concept materialized in coherent aggregations of computers called farms and clusters. Collective application of nodes, being efficient and economical, was soon adopted in education, research and industry. But maintenance, especially at large scale, emerged as a problem to be resolved. New challenges needed new methods and tools, and development work was started to build farm management applications and frameworks. In the first part of the thesis, these systems are introduced. After a general description of the matter, a comparative analysis of different approaches and tools illustrates the practical aspects of the theoretical discussion. CERN, the European Organization for Nuclear Research, is the largest particle physics laboratory in the world....

  14. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    survival and recruitment estimates from the French CES scheme to assess the relative contributions of survival and recruitment to overall population changes. He develops a novel approach to modelling survival rates from such multi-site data by using within-year recaptures to provide a covariate of between-year recapture rates. This provided parsimonious models of variation in recapture probabilities between sites and years. The approach provides promising results for the four species investigated and can potentially be extended to similar data from other CES/MAPS schemes. The final paper by Blandine Doligez, David Thomson and Arie van Noordwijk (Doligez et al., 2004) illustrates how large-scale studies of population dynamics can be important for evaluating the effects of conservation measures. Their study is concerned with the reintroduction of White Stork populations to the Netherlands, where a re-introduction programme started in 1969 had resulted in a breeding population of 396 pairs by 2000. They demonstrate the need to consider a wide range of models in order to account for potential age, time, cohort and “trap-happiness” effects. As the data are based on resightings, such trap-happiness must reflect some form of heterogeneity in resighting probabilities. Perhaps surprisingly, the provision of supplementary food did not influence survival, but it may have had an indirect effect via the alteration of migratory behaviour. Spatially explicit modelling of data gathered at many sites inevitably results in starting models with very large numbers of parameters. The problem is often complicated further by having relatively sparse data at each site, even where the total amount of data gathered is very large. Both Julliard (2004) and Doligez et al. (2004) give explicit examples of problems caused by needing to handle very large numbers of parameters and show how they overcame them for their particular data sets. Such problems involve both the choice of appropriate

  15. Ecological Research Division, Marine Research Program

    Energy Technology Data Exchange (ETDEWEB)

    1980-05-01

    This report presents program summaries of the various projects sponsored during 1979 by the Marine Research Program of the Ecological Research Division. Program areas include the effects of petroleum hydrocarbons on the marine environment; a study of the baseline ecology of a proposed OTEC site near Puerto Rico; the environmental impact of offshore geothermal energy development; the movement of radionuclides through the marine environment; the environmental aspects of power plant cooling systems; and studies of the physical and biological oceanography of the continental shelves bordering the United States.

  16. Ecological Research Division, Marine Research Program

    International Nuclear Information System (INIS)

    1980-05-01

    This report presents program summaries of the various projects sponsored during 1979 by the Marine Research Program of the Ecological Research Division. Program areas include the effects of petroleum hydrocarbons on the marine environment; a study of the baseline ecology of a proposed OTEC site near Puerto Rico; the environmental impact of offshore geothermal energy development; the movement of radionuclides through the marine environment; the environmental aspects of power plant cooling systems; and studies of the physical and biological oceanography of the continental shelves bordering the United States

  17. Nuclear wastes: research programs

    International Nuclear Information System (INIS)

    Anon.

    2003-01-01

    The management of long-living and high level radioactive wastes in France belongs to the framework of the December 30, 1991 law which defines three ways of research: the separation and transmutation of radionuclides, their reversible storage or disposal in deep geologic formations, and their processing and surface storage during long duration. Research works are done in partnership between public research and industrial organizations in many French and foreign laboratories. Twelve years after its enforcement, the impact of this law has overstepped the simple research framework and has led to a deep reflection of the society about the use of nuclear energy. This short paper presents the main results obtained so far in the three research ways, the general energy policy of the French government, the industrial progresses made in the framework of the 1991 law and the international context of the management of nuclear wastes. (J.S.)

  18. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage. The manager carries a heavy workload, and much time must be spent on the management and maintenance of a large-scale cluster system. The nodes in such a system easily fall into disorder: with thousands of nodes placed in large machine rooms, managers can easily confuse one machine with another. How can accurate management be carried out effectively on a large-scale cluster system? The article introduces ELFms in the large-scale cluster system, and furthermore proposes how to realize automatic management of the large-scale cluster system. (authors)

  19. NCI: DCTD: Biometric Research Program

    Science.gov (United States)

    The Biometric Research Program (BRP) is the statistical and biomathematical component of the Division of Cancer Treatment, Diagnosis and Centers (DCTDC). Its members provide statistical leadership for the national and international research programs of the division in developmental therapeutics, developmental diagnostics, diagnostic imaging and clinical trials.

  20. Linking Large-Scale Reading Assessments: Comment

    Science.gov (United States)

    Hanushek, Eric A.

    2016-01-01

    E. A. Hanushek points out in this commentary that applied researchers in education have only recently begun to appreciate the value of international assessments, even though there are now 50 years of experience with these. Until recently, these assessments have been stand-alone surveys that have not been linked, and analysis has largely focused on…

  1. Human Research Program

    Data.gov (United States)

    National Aeronautics and Space Administration — Strategically, the HRP conducts research and technology development that: 1) enables the development or modification of Agency-level human health and performance...

  2. Large scale propagation intermittency in the atmosphere

    Science.gov (United States)

    Mehrabi, Ali

    2000-11-01

    Long-term (several minutes to hours) amplitude variations observed in outdoor sound propagation experiments at Disneyland, California, in February 1998 are explained in terms of a time varying index of refraction. The experimentally propagated acoustic signals were received and recorded at several locations ranging from 300 meters to 2,800 meters. Meteorological data was taken as a function of altitude simultaneously with the received signal levels. There were many barriers along the path of acoustic propagation that affected the received signal levels, especially at short ranges. In a downward refraction situation, there could be a random change of amplitude in the predicted signals. A computer model based on the Fast Field Program (FFP) was used to compute the signal loss at the different receiving locations and to verify that the variations in the received signal levels can be predicted numerically. The calculations agree with experimental data with the same trend variations in average amplitude.

  3. Performance regression manager for large scale systems

    Science.gov (United States)

    Faraj, Daniel A.

    2017-08-01

    System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
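    The comparison step in the abstract above can be illustrated with a small sketch; the dict-based metric format and the higher-is-better assumption are ours for illustration, not the patent's:

    ```python
    # Illustrative sketch: after each run's output is normalized into a common
    # format (here a dict of metric name -> value), flag metrics that regressed
    # relative to the baseline by more than a relative tolerance.
    def compare_metrics(baseline, current, rel_tol=0.05):
        """Return {metric: (old, new)} for metrics regressed by > rel_tol.

        Assumes higher values are better (e.g. throughput)."""
        regressions = {}
        for name, old in baseline.items():
            new = current.get(name)
            if new is not None and old > 0 and (old - new) / old > rel_tol:
                regressions[name] = (old, new)
        return regressions

    base = {"bandwidth_mbps": 1200.0, "msgs_per_sec": 50000.0}
    curr = {"bandwidth_mbps": 1190.0, "msgs_per_sec": 41000.0}
    # Only msgs_per_sec drops by more than 5% and is reported as a regression.
    print(compare_metrics(base, curr))
    ```

    Normalizing both outputs into one predefined format before comparing, as the abstract describes, is what makes this diff robust across command versions and runs.
    
    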

  4. Enablers and Barriers to Large-Scale Uptake of Improved Solid Fuel Stoves: A Systematic Review

    Science.gov (United States)

    Puzzolo, Elisa; Stanistreet, Debbi; Pope, Daniel; Bruce, Nigel G.

    2013-01-01

    Background: Globally, 2.8 billion people rely on household solid fuels. Reducing the resulting adverse health, environmental, and development consequences will involve transitioning through a mix of clean fuels and improved solid fuel stoves (IS) of demonstrable effectiveness. To date, achieving uptake of IS has presented significant challenges. Objectives: We performed a systematic review of factors that enable or limit large-scale uptake of IS in low- and middle-income countries. Methods: We conducted systematic searches through multidisciplinary databases, specialist websites, and consulting experts. The review drew on qualitative, quantitative, and case studies and used standardized methods for screening, data extraction, critical appraisal, and synthesis. We summarized our findings as “factors” relating to one of seven domains—fuel and technology characteristics; household and setting characteristics; knowledge and perceptions; finance, tax, and subsidy aspects; market development; regulation, legislation, and standards; programmatic and policy mechanisms—and also recorded issues that impacted equity. Results: We identified 31 factors influencing uptake from 57 studies conducted in Asia, Africa, and Latin America. All domains matter. Although factors such as offering technologies that meet household needs and save fuel, user training and support, effective financing, and facilitative government action appear to be critical, none guarantee success: All factors can be influential, depending on context. The nature of available evidence did not permit further prioritization. Conclusions: Achieving adoption and sustained use of IS at a large scale requires that all factors, spanning household/community and program/societal levels, be assessed and supported by policy. We propose a planning tool that would aid this process and suggest further research to incorporate an evaluation of effectiveness. Citation: Rehfuess EA, Puzzolo E, Stanistreet D, Pope D, Bruce

  5. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  6. Large-scale cryopumping for controlled fusion

    International Nuclear Information System (INIS)

    Pittenger, L.C.

    1977-01-01

    Vacuum pumping by freezing out or otherwise immobilizing the pumped gas is an old concept. In several plasma physics experiments for controlled fusion research, cryopumping has been used to provide clean, ultrahigh vacua. Present day fusion research devices, which rely almost universally upon neutral beams for heating, are high gas throughput systems, the pumping of which is best accomplished by cryopumping in the high mass-flow, moderate-to-high vacuum regime. Cryopumping systems have been developed for neutral beam injection systems on several fusion experiments (HVTS, TFTR) and are being developed for the overall pumping of a large, high-throughput mirror containment experiment (MFTF). In operation, these large cryopumps will require periodic defrosting, some schemes for which are discussed, along with other operational considerations. The development of cryopumps for fusion reactors is begun with the TFTR and MFTF systems. Likely paths for necessary further development for power-producing reactors are also discussed

  7. Large-scale cryopumping for controlled fusion

    Energy Technology Data Exchange (ETDEWEB)

    Pittenger, L.C.

    1977-07-25

    Vacuum pumping by freezing out or otherwise immobilizing the pumped gas is an old concept. In several plasma physics experiments for controlled fusion research, cryopumping has been used to provide clean, ultrahigh vacua. Present day fusion research devices, which rely almost universally upon neutral beams for heating, are high gas throughput systems, the pumping of which is best accomplished by cryopumping in the high mass-flow, moderate-to-high vacuum regime. Cryopumping systems have been developed for neutral beam injection systems on several fusion experiments (HVTS, TFTR) and are being developed for the overall pumping of a large, high-throughput mirror containment experiment (MFTF). In operation, these large cryopumps will require periodic defrosting, some schemes for which are discussed, along with other operational considerations. The development of cryopumps for fusion reactors is begun with the TFTR and MFTF systems. Likely paths for necessary further development for power-producing reactors are also discussed.

  8. The WIPP research and development test program

    International Nuclear Information System (INIS)

    Tyler, L.D.

    1985-01-01

    The WIPP (Waste Isolation Pilot Plant) is a DOE R&D facility for the purpose of developing the technology needed for the safe disposal of the United States' defense-related radioactive waste. The in-situ test program is defined for thermal-structural interactions, plugging and sealing, and waste package interactions in a salt environment. An integrated series of large-scale underground tests addresses the issues of both systems performance and long-term isolation performance of a repository

  9. Radon Research Program, FY-1990

    International Nuclear Information System (INIS)

    1991-03-01

    The Department of Energy (DOE) Office of Health and Environmental Research (OHER) has established a Radon Research Program with the primary objectives of acquiring knowledge necessary to improve estimates of health risks associated with radon exposure and also to improve radon control. Through the Radon Research Program, OHER supports and coordinates the research activities of investigators at facilities all across the nation. From this research, significant advances are being made in our understanding of the health effects of radon. OHER publishes this annual report to provide information to interested researchers and the public about its research activities. This edition of the report summarizes the activities of program researchers during FY90. Chapter 2 of this report describes how risks associated with radon exposure are estimated, what assumptions are made in estimating radon risks for the general public, and how the uncertainties in these assumptions affect the risk estimates. Chapter 3 examines how OHER, through the Radon Research Program, is working to gather information for reducing the uncertainties and improving the risk estimates. Chapter 4 highlights some of the major findings of investigators participating in the Radon Research Program in the past year. And, finally, Chapter 5 discusses the direction in which the program is headed in the future. 20 figs

  10. Large-scale applications of superconductivity in the United States: an overview. Metallurgy, fabrication, and applications

    International Nuclear Information System (INIS)

    Hein, R.A.; Gubser, D.U.

    1981-01-01

    This report presents an overview of ongoing development efforts in the USA concerned with large-scale applications of superconductivity. These applications are grouped according to magnetic field regime, as low field regime, intermediate field regime, and high field regime. In the low field regime two diverse areas of large application are identified, superconducting power transmission lines for electric utilities, and RF cavities for particle accelerators for high energy physics research. Activity in the intermediate regime has been significantly increased due to Fermilab's energy doubler or Tevatron project, and BNL's ISABELLE project. Rotating electrical machines, such as DC acyclic (homopolar) motors, generators, and energy storage magnets are also studied. In the high field regime magnetohydrodynamics (MHD) and magnetically confined fusion in tokamaks are examined. In each regime all current work is summarized according to key person, research topic, type of program, funding, status, and future outlook

  11. Transmutation Research Program

    Energy Technology Data Exchange (ETDEWEB)

    Seidler, Paul

    2011-07-31

    Six years of research was conducted for the United States Department of Energy, Office of Nuclear Energy, from 2006 through 2011 at the University of Nevada, Las Vegas (UNLV). The results of this research are detailed in the narratives for tasks 1-45. The work performed spanned the range of experimental and modeling efforts. Radiochemistry (separations, waste separation, nuclear fuel, remote sensing, and waste forms), material fabrication, material characterization, corrosion studies, nuclear criticality, sensors, and modeling comprise the major topics of study during these six years.

  12. Oligopolistic competition in wholesale electricity markets: Large-scale simulation and policy analysis using complementarity models

    Science.gov (United States)

    Helman, E. Udi

    This dissertation conducts research into the large-scale simulation of oligopolistic competition in wholesale electricity markets. The dissertation has two parts. Part I is an examination of the structure and properties of several spatial, or network, equilibrium models of oligopolistic electricity markets formulated as mixed linear complementarity problems (LCP). Part II is a large-scale application of such models to the electricity system that encompasses most of the United States east of the Rocky Mountains, the Eastern Interconnection. Part I consists of Chapters 1 to 6. The models developed in this part continue research into mixed LCP models of oligopolistic electricity markets initiated by Hobbs [67] and subsequently developed by Metzler [87] and Metzler, Hobbs and Pang [88]. Hobbs' central contribution is a network market model with Cournot competition in generation and a price-taking spatial arbitrage firm that eliminates spatial price discrimination by the Cournot firms. In one variant, the solution to this model is shown to be equivalent to the "no arbitrage" condition in a "pool" market, in which a Regional Transmission Operator optimizes spot sales such that the congestion price between two locations is exactly equivalent to the difference in the energy prices at those locations (commonly known as locational marginal pricing). Extensions to this model are presented in Chapters 5 and 6. One of these is a market model with a profit-maximizing arbitrage firm. This model is structured as a mathematical program with equilibrium constraints (MPEC), but due to the linearity of its constraints, can be solved as a mixed LCP. Part II consists of Chapters 7 to 12. The core of these chapters is a large-scale simulation of the U.S. Eastern Interconnection applying one of the Cournot competition with arbitrage models. This is the first oligopolistic equilibrium market model to encompass the full Eastern Interconnection with a realistic network representation (using
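    The Cournot equilibrium at the core of these models has a closed form in the simplest single-node, two-firm case with linear inverse demand P = a - b(q1 + q2) and constant marginal costs; a minimal sketch of that textbook case (the dissertation's models add the transmission network and arbitrage on top of it):

    ```python
    # Two-firm Cournot-Nash equilibrium with linear inverse demand
    # P = a - b*(q1 + q2) and constant marginal costs c1, c2.
    def cournot_two_firm(a, b, c1, c2):
        """Closed-form equilibrium quantities and market price."""
        q1 = (a - 2 * c1 + c2) / (3 * b)
        q2 = (a - 2 * c2 + c1) / (3 * b)
        price = a - b * (q1 + q2)
        return q1, q2, price

    # q1=40.0, q2=10.0, p=50.0: the lower-cost firm produces more, and the
    # price exceeds both marginal costs (the markup the LCP models capture).
    q1, q2, p = cournot_two_firm(a=100.0, b=1.0, c1=10.0, c2=40.0)
    ```

    Each quantity is the firm's best response to the other's output; the network versions in the dissertation stack such optimality conditions, plus transmission constraints, into one mixed LCP.
    
    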

  13. Large-Scale Pattern Discovery in Music

    Science.gov (United States)

    Bertin-Mahieux, Thierry

    This work focuses on extracting patterns in musical data from very large collections. The problem is split in two parts. First, we build such a large collection, the Million Song Dataset, to provide researchers access to commercial-size datasets. Second, we use this collection to study cover song recognition which involves finding harmonic patterns from audio features. Regarding the Million Song Dataset, we detail how we built the original collection from an online API, and how we encouraged other organizations to participate in the project. The result is the largest research dataset with heterogeneous sources of data available to music technology researchers. We demonstrate some of its potential and discuss the impact it already has on the field. On cover song recognition, we must revisit the existing literature since there are no publicly available results on a dataset of more than a few thousand entries. We present two solutions to tackle the problem, one using a hashing method, and one using a higher-level feature computed from the chromagram (dubbed the 2DFTM). We further investigate the 2DFTM since it has potential to be a relevant representation for any task involving audio harmonic content. Finally, we discuss the future of the dataset and the hope of seeing more work making use of the different sources of data that are linked in the Million Song Dataset. Regarding cover songs, we explain how this might be a first step towards defining a harmonic manifold of music, a space where harmonic similarities between songs would be more apparent.
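The appeal of a 2D Fourier transform magnitude feature can be shown in a few lines. The sketch below is a minimal NumPy illustration, not the pipeline from the work itself; the patch shape and variable names are assumptions. Taking the magnitude discards phase, so the feature is invariant to circular shifts of the chromagram in both time and pitch, which is exactly the kind of invariance cover song recognition needs.

```python
import numpy as np

def two_dftm(chroma_patch):
    """Return the 2D Fourier transform magnitude of a chromagram patch.

    Dropping the phase makes the feature invariant to circular shifts in
    both time (alignment) and pitch (transposition).
    chroma_patch: array of shape (12, n_frames), one row per pitch class.
    """
    return np.abs(np.fft.fft2(chroma_patch))

# Toy check: a transposed (pitch-rotated) or time-shifted chromagram
# yields the same 2DFTM.
rng = np.random.default_rng(0)
patch = rng.random((12, 75))
transposed = np.roll(patch, 3, axis=0)   # shift all pitches up 3 semitones
shifted = np.roll(patch, 10, axis=1)     # circular shift in time
assert np.allclose(two_dftm(patch), two_dftm(transposed))
assert np.allclose(two_dftm(patch), two_dftm(shifted))
```

Beat-synchronizing the chromagram before cutting it into fixed-size patches is a common preprocessing step for this kind of feature.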

  14. Superconductivity for Large Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    R. Fair; W. Stautner; M. Douglass; R. Rajput-Ghoshal; M. Moscinski; P. Riley; D. Wagner; J. Kim; S. Hou; F. Lopez; K. Haran; J. Bray; T. Laskaris; J. Rochford; R. Duckworth

    2012-10-12

    A conceptual design has been completed for a 10MW superconducting direct drive wind turbine generator employing low temperature superconductors for the field winding. Key technology building blocks from the GE Wind and GE Healthcare businesses have been transferred across to the design of this concept machine. Wherever possible, conventional technology and production techniques have been used in order to support the case for commercialization of such a machine. Appendices A and B provide further details of the layout of the machine and the complete specification table for the concept design. Phase 1 of the program has allowed us to understand the trade-offs between the various sub-systems of such a generator and its integration with a wind turbine. A Failure Modes and Effects Analysis (FMEA) and a Technology Readiness Level (TRL) analysis have been completed resulting in the identification of high risk components within the design. The design has been analyzed from a commercial and economic point of view and Cost of Energy (COE) calculations have been carried out with the potential to reduce COE by up to 18% when compared with a permanent magnet direct drive 5MW baseline machine, resulting in a potential COE of 0.075 $/kWh. Finally, a top-level commercialization plan has been proposed to enable this technology to be transitioned to full volume production. The main body of this report will present the design processes employed and the main findings and conclusions.

  15. Research on Automatic Programming

    Science.gov (United States)

    1975-12-31

Sequential processes, deadlocks, and semaphore primitives, Ph.D. Thesis, Harvard University, November 1974; Center for Research in Computing...verified. 13 Code generated to effect the synchronization makes use of the ECL control extension facility (Prenner’s CI, see [Prenner]). The...semaphore operations [Dijkstra] is being developed. Initial results for this code generator are very encouraging; in many cases generated code is

  16. Large-scale additive manufacturing with bioinspired cellulosic materials.

    Science.gov (United States)

    Sanandiya, Naresh D; Vijay, Yadunund; Dimopoulou, Marina; Dritsas, Stylianos; Fernandez, Javier G

    2018-06-05

Cellulose is the most abundant and broadly distributed organic compound and industrial by-product on Earth. However, despite decades of extensive research, the bottom-up use of cellulose to fabricate 3D objects is still plagued with problems that restrict its practical applications: derivatives with vast polluting effects, use in combination with plastics, lack of scalability and high production cost. Here we demonstrate the general use of cellulose to manufacture large 3D objects. Our approach diverges from the common association of cellulose with green plants and is instead inspired by the wall of the fungus-like oomycetes, which we reproduce by introducing small amounts of chitin between cellulose fibers. The resulting fungal-like adhesive materials (FLAM) are strong, lightweight and inexpensive, and can be molded or processed using woodworking techniques. We believe this first large-scale additive manufacture with ubiquitous biological polymers will be the catalyst for the transition to environmentally benign and circular manufacturing models.

  17. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results: (1) confirmed, in a general way, the procedures for application to pulsed burning, (2) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur, and (3) indicated that steam can terminate continuous burning. Future actions recommended include: (1) modification of the code to perform continuous-burn analyses, which is demonstrated, (2) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (3) changes to the models for estimating burn parameters

  18. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results (a) confirmed, in a general way, the procedures for application to pulsed burning, (b) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur and (c) indicated that steam can terminate continuous burning. Future actions recommended include (a) modification of the code to perform continuous-burn analyses, which is demonstrated, (b) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (c) changes to the models for estimating burn parameters

  19. Measuring large-scale social networks with high resolution.

    Directory of Open Access Journals (Sweden)

    Arkadiusz Stopczynski

Full Text Available This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years: the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data types measured and the technical infrastructure, in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper concludes with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection.

  20. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have reached a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
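The three measures lend themselves to simple image-processing sketches. The toy versions below are illustrative proxies only; the paper's exact definitions of color variety and brightness roughness are not reproduced here, and all names and parameters are assumptions.

```python
import numpy as np

def color_variety(image, bins=16):
    """Fraction of occupied cells in a quantized RGB histogram,
    a toy proxy for a 'variety of colors' measure."""
    q = (image // (256 // bins)).reshape(-1, 3)
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    return len(np.unique(codes)) / float(bins ** 3)

def brightness_roughness(image):
    """Root-mean-square of local brightness differences, a simple
    stand-in for a roughness measure of the brightness field."""
    gray = image.mean(axis=2)
    dx = np.diff(gray, axis=1)
    dy = np.diff(gray, axis=0)
    return np.sqrt((dx ** 2).mean() + (dy ** 2).mean())

# A noisy image has more color variety and a rougher brightness field
# than a flat one.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64, 3))
flat = np.full((64, 64, 3), 128)
assert color_variety(img) > color_variety(flat)
assert brightness_roughness(img) > brightness_roughness(flat)
```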

  1. Using Large Scale Test Results for Pedagogical Purposes

    DEFF Research Database (Denmark)

    Dolin, Jens

    2012-01-01

The use and influence of large scale tests (LST), both national and international, has increased dramatically within the last decade. This process has revealed a tension between the legitimate need for information about the performance of the educational system and teachers to inform policy......, and the teachers’ and students’ use of this information for pedagogical purposes in the classroom. We know well how the policy makers interpret and use the outcomes of such tests, but we know less about how teachers make use of LSTs to inform their pedagogical practice. An important question is whether there is a contradiction between the political system’s use of LST and teachers’ (possible) pedagogical use of LST. And if yes: what is the contradiction based on? This presentation will give some results from a systematic review on how tests have influenced pedagogical practice. The research revealed many of the fatal...

  2. SIMON: Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Sugawara, Akihiro; Kishimoto, Yasuaki

    2003-01-01

Development of the SIMON (SImulation MONitoring) system is described. SIMON aims to investigate many physical phenomena of tokamak-type nuclear fusion plasma by simulation, and to exchange information and carry out joint research with scientists around the world using the internet. The characteristics of SIMON are as follows: 1) reduced simulation load through a trigger sending method, 2) visualization of simulation results and a hierarchical structure of analysis, 3) fewer licenses needed by using the command line when software is used, 4) improved support for networked use of simulation data output through HTML (Hyper Text Markup Language), 5) avoidance of complex built-in work in the client part, and 6) small-sized and portable software. The visualization method for large scale simulation, the remote collaboration system based on HTML, the trigger sending method, the hierarchical analytical method, introduction into a three-dimensional electromagnetic transportation code, and the technologies of the SIMON system are explained. (S.Y.)

  3. Large Scale Experiments on Spacecraft Fire Safety

    DEFF Research Database (Denmark)

    Urban, David L.; Ruff, Gary A.; Minster, Olivier

    2012-01-01

Full scale fire testing complemented by computer modelling has provided significant knowhow about the risk, prevention and suppression of fire in terrestrial systems (cars, ships, planes, buildings, mines, and tunnels). In comparison, no such testing has been carried out for manned spacecraft due to the complexity, cost and risk associated with operating a long duration fire safety experiment of a relevant size in microgravity. Therefore, there is currently a gap in knowledge of fire behaviour in spacecraft. The entire body of low-gravity fire research has either been conducted in short duration ground-based microgravity facilities or has been limited to very small fuel samples. Still, the work conducted to date has shown that fire behaviour in low-gravity is very different from that in normal-gravity, with differences observed for flammability limits, ignition delay, flame spread behaviour, flame colour and flame...

  4. EPFM verification by a large scale test

    International Nuclear Information System (INIS)

    Okamura, H.; Yagawa, G.; Hidaka, T.; Sato, M.; Urabe, Y.; Iida, M.

    1993-01-01

Step B test was carried out as part of the elastic plastic fracture mechanics (EPFM) study in the Japanese PTS integrity research project. In the step B test, bending load was applied to a large flat specimen under thermal shock, while tensile load was kept constant during the test. In the previous analysis, the estimated stable crack growth at the deepest point of the crack was 3 times larger than the experimental value. In order to diminish the difference between them from the point of FEM modeling, a more precise FEM mesh was introduced. According to the new analysis, the difference decreased considerably; that is, stable crack growth evaluation was improved by adopting a precise FEM model near the crack tip, and the difference was of almost the same order as that in the NKS4-1 test analysis by MPA. 8 refs., 17 figs., 5 tabs

  5. Biological Defense Research Program

    Science.gov (United States)

    1989-04-01

difference between life and death. Some recent examples are: BDRP developed VEE vaccine used in Central America, Mexico, and Texas (1969-1971) and Rift...Complex, is an area owned by the Bureau of Land Management, which is available for grazing, and with specific permission, for use by DPG. 2.3...2.01 A Large European Laboratory, 1944-1950 50.00 Tuberculosis Laboratory 4 Technicians, Canada, 1947-1954 19.00 Research Institutes, 1930-1950 4.10

  6. Military Vision Research Program

    Science.gov (United States)

    2011-07-01

Bietti Eye Foundation, IRCCS Rome, Italy. Word count: 2879 Corresponding author: Reza Dana, M.D., M.P.H., M.Sc. Schepens Eye Research...Massachusetts Eye and Ear Infirmary, Harvard Medical School, Boston, MA 02114 3 Bietti Eye Foundation, IRCCS Rome, Italy...with differentiated properties. Exp Eye Res. 62, 155-169. 18. Marneros A.G., Fan J., Yokoyama Y., Gerber H.P., Ferrara N., Crouch R.K., Olsen B.R

  7. Effects of a large-scale micronutrient powder and young child feeding education program on the micronutrient status of children 6-24 months of age in the Kyrgyz Republic.

    Science.gov (United States)

    Serdula, M K; Lundeen, E; Nichols, E K; Imanalieva, C; Minbaev, M; Mamyrbaeva, T; Timmer, A; Aburto, N J

    2013-07-01

To combat iron and other micronutrient deficiencies, the Ministry of Health of the Kyrgyz Republic launched a regional Infant and Young Child Nutrition (IYCN) program in 2009, which included promotion of home fortification with micronutrient powder (MNP) containing iron (12.5 mg elemental iron), vitamin A (300 μg) and other micronutrients. Every 2 months children aged 6-24 months were provided 30 sachets to be taken on a flexible schedule. The objective was to assess biochemical indicators of iron and vitamin A status among children aged 6-24 months at the baseline and follow-up surveys. Cross-sectional representative cluster surveys were conducted in 2008 (n=571 children) and 2010 (n=541). Data collected included measurement of hemoglobin, serum ferritin, soluble transferrin receptor (sTfR), retinol-binding protein, C-reactive protein (CRP) and α1-acid glycoprotein (AGP). Among all children, declines were observed in the prevalence of: anemia, 50.6% versus 43.8% (P=0.05); total iron deficiency (either low ferritin or high sTfR), 77.3% versus 63.7% (P<0.01); and iron deficiency anemia, 45.5% versus 33.4% (P<0.01). Among children without inflammation as measured by CRP and AGP, similar declines were observed, but only the declines in total iron deficiency and iron deficiency anemia reached statistical significance. Among all children and those without inflammation, the prevalence of vitamin A deficiency remained the same. One year after the introduction of home fortification with MNP, within a larger IYCN program, the prevalence of anemia, iron deficiency and iron deficiency anemia declined, but vitamin A deficiency remained unchanged.

  8. Experiences from Participants in Large-Scale Group Practice of the Maharishi Transcendental Meditation and TM-Sidhi Programs and Parallel Principles of Quantum Theory, Astrophysics, Quantum Cosmology, and String Theory: Interdisciplinary Qualitative Correspondences

    Science.gov (United States)

    Svenson, Eric Johan

    Participants on the Invincible America Assembly in Fairfield, Iowa, and neighboring Maharishi Vedic City, Iowa, practicing Maharishi Transcendental Meditation(TM) (TM) and the TM-Sidhi(TM) programs in large groups, submitted written experiences that they had had during, and in some cases shortly after, their daily practice of the TM and TM-Sidhi programs. Participants were instructed to include in their written experiences only what they observed and to leave out interpretation and analysis. These experiences were then read by the author and compared with principles and phenomena of modern physics, particularly with quantum theory, astrophysics, quantum cosmology, and string theory as well as defining characteristics of higher states of consciousness as described by Maharishi Vedic Science. In all cases, particular principles or phenomena of physics and qualities of higher states of consciousness appeared qualitatively quite similar to the content of the given experience. These experiences are presented in an Appendix, in which the corresponding principles and phenomena of physics are also presented. These physics "commentaries" on the experiences were written largely in layman's terms, without equations, and, in nearly every case, with clear reference to the corresponding sections of the experiences to which a given principle appears to relate. An abundance of similarities were apparent between the subjective experiences during meditation and principles of modern physics. A theoretic framework for understanding these rich similarities may begin with Maharishi's theory of higher states of consciousness provided herein. We conclude that the consistency and richness of detail found in these abundant similarities warrants the further pursuit and development of such a framework.

  9. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed by incorporating parallel iterative solution algorithms with associated preconditioners into parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand challenge class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods and algorithms developed will, however, be of wider interest.
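The flavor of the iterative schemes described, a Krylov solver combined with a preconditioner, can be sketched in serial form. The sketch below is an illustrative Jacobi-preconditioned conjugate gradient, not code from the program itself; in a distributed implementation, each matrix-vector product and vector update in the loop is what gets partitioned across processors.

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=500):
    """Conjugate gradient with a Jacobi (diagonal) preconditioner.
    M_inv_diag holds the reciprocal of diag(A)."""
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    z = M_inv_diag * r            # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p                # the dominant, parallelizable kernel
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Symmetric positive definite test system (1D Poisson-like stencil).
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = pcg(A, b, 1.0 / np.diag(A))
assert np.allclose(A @ x, b, atol=1e-6)
```

Domain decomposition and element-by-element variants change how `A @ p` is assembled and communicated, not the outer iteration.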

  10. Large-scale Exploration of Neuronal Morphologies Using Deep Learning and Augmented Reality.

    Science.gov (United States)

    Li, Zhongyu; Butler, Erik; Li, Kang; Lu, Aidong; Ji, Shuiwang; Zhang, Shaoting

    2018-02-12

Recently released large-scale neuron morphological data has greatly facilitated research in neuroinformatics. However, the sheer volume and complexity of these data pose significant challenges for efficient and accurate neuron exploration. In this paper, we propose an effective retrieval framework to address these problems, based on frontier techniques of deep learning and binary coding. For the first time, we develop a deep learning based feature representation method for neuron morphological data, in which the 3D neurons are first projected into binary images and features are then learned using an unsupervised deep neural network, i.e., stacked convolutional autoencoders (SCAEs). The deep features are subsequently fused with hand-crafted features for a more accurate representation. Considering that exhaustive search is usually very time-consuming in large-scale databases, we employ a novel binary coding method to compress feature vectors into short binary codes. Our framework is validated on a public data set of 58,000 neurons, showing promising retrieval precision and efficiency compared with state-of-the-art methods. In addition, we develop a novel neuron visualization program based on augmented reality (AR) techniques, which helps users explore neuron morphologies in an interactive and immersive manner.
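The retrieval step can be illustrated with a generic binary coding scheme. The sketch below uses random hyperplane projections and Hamming-distance ranking as a stand-in for the paper's binary coding method, which is not specified here; all names and sizes are assumptions.

```python
import numpy as np

def binary_codes(features, n_bits=64, seed=0):
    """Compress real-valued feature vectors into short binary codes via
    random hyperplane projections (illustrative, not the paper's method)."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((features.shape[1], n_bits))
    return (features @ planes > 0).astype(np.uint8)

def hamming_search(query_code, codes, k=5):
    """Indices of the k codes closest to query_code in Hamming distance.
    At scale one would pack bits and use XOR + popcount instead."""
    dists = (codes != query_code).sum(axis=1)
    return np.argsort(dists)[:k]

# A slightly perturbed copy of item 0 should hash near item 0.
rng = np.random.default_rng(42)
feats = rng.standard_normal((1000, 128))
codes = binary_codes(feats)
q = binary_codes(feats[:1] + 0.05 * rng.standard_normal((1, 128)))
neighbors = hamming_search(q[0], codes)
assert 0 in neighbors
```

The point of the compression is that comparing 64-bit codes is orders of magnitude cheaper than exhaustive distance computation on the full fused feature vectors.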

  11. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

In a future power system with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  12. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)
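In modern terms, the package's core tasks map onto standard dense linear algebra calls. The following NumPy sketch stands in for the original Fortran subroutines, whose actual interfaces are not given here, and shows the two central operations: solving a linear simultaneous equation system and an eigenvalue problem.

```python
import numpy as np

n = 200
rng = np.random.default_rng(0)

# Linear simultaneous equations A x = b (well-conditioned test matrix).
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal(n)
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)

# Symmetric eigenvalue problem S v = w v, the kind of spectral
# computation that arises in plasma-fluid stability analysis.
S = A + A.T
w, V = np.linalg.eigh(S)
assert np.allclose(S @ V[:, 0], w[0] * V[:, 0])
```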

  13. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  14. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords. Sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction. Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  15. Radon Research Program, FY 1991

    International Nuclear Information System (INIS)

    1992-03-01

The scientific information being sought in this program encompasses research designed to determine radon availability and transport outdoors, modeling of transport into and within buildings, the physics and chemistry of radon and radon progeny, dose-response relationships, lung cancer risk, and mechanisms of radon carcinogenesis. The main goal of the DOE/OHER Radon Research Program is to develop information to reduce these uncertainties and thereby provide an improved health risk estimate of exposure to radon and its progeny, as well as information useful in radon control strategies. Results generated under the Program were highlighted in a National Research Council report on radon dosimetry. The study concluded that the risk of radon exposure is 30% less in homes than in mines. This program summary book describes the OHER FY-1991 Radon Research Program. It is the fifth in an annual series of program books designed to provide scientific and research information to the public and to other government agencies on the DOE Radon Research Program

  16. Large Scale Experiments on Spacecraft Fire Safety

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Minster, Olivier; Fernandez-Pello, A. Carlos; Tien, James S.; Torero, Jose L.; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; hide

    2012-01-01

    Full scale fire testing complemented by computer modelling has provided significant knowhow about the risk, prevention and suppression of fire in terrestrial systems (cars, ships, planes, buildings, mines, and tunnels). In comparison, no such testing has been carried out for manned spacecraft due to the complexity, cost and risk associated with operating a long duration fire safety experiment of a relevant size in microgravity. Therefore, there is currently a gap in knowledge of fire behaviour in spacecraft. The entire body of low-gravity fire research has either been conducted in short duration ground-based microgravity facilities or has been limited to very small fuel samples. Still, the work conducted to date has shown that fire behaviour in low-gravity is very different from that in normal gravity, with differences observed for flammability limits, ignition delay, flame spread behaviour, flame colour and flame structure. As a result, the prediction of the behaviour of fires in reduced gravity is at present not validated. To address this gap in knowledge, a collaborative international project, Spacecraft Fire Safety, has been established with its cornerstone being the development of an experiment (Fire Safety 1) to be conducted on an ISS resupply vehicle, such as the Automated Transfer Vehicle (ATV) or Orbital Cygnus after it leaves the ISS and before it enters the atmosphere. A computer modelling effort will complement the experimental effort. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. This will facilitate the possibility of examining fire behaviour on a scale that is relevant to spacecraft fire safety and will provide unique data for fire model validation. This unprecedented opportunity will expand the understanding of the fundamentals of fire behaviour in spacecraft. The experiment is being

  17. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

profit for investors for renting their transmission capacity, and cheaper electricity for end users. We propose a hybrid method, combining heuristic and deterministic approaches, to determine new transmission line additions and increase transmission capacity. Renewable energy resources (RES) have zero operating cost, which makes them very attractive for generation companies and market participants. In addition, RES have zero carbon emissions, which helps relieve concerns about the environmental impact of electric generation resources' carbon emissions. RES include wind, solar, hydro, biomass, and geothermal. By 2030, the expectation is that more than 30% of electricity in the U.S. will come from RES. One major contributor to RES generation will be wind energy resources (WES). Furthermore, WES will be an important component of the future generation portfolio. However, the nature of WES is that they experience high intermittency and volatility. Because of the great expectation of high WES penetration and the nature of such resources, researchers focus on studying the effects of such resources on electric grid operation and its adequacy from different aspects. Additionally, current market operations of electric grids add another complication to consider while integrating RES (specifically WES). Mandates by market rules and long-term analysis of renewable penetration in large-scale electric grids are also the focus of researchers in recent years. We advocate a method for studying the effects of high wind-resource penetration on large-scale electric grid operations. A phasor measurement unit (PMU) is a global positioning system (GPS) based device that provides immediate and precise measurements of voltage angle in a high-voltage transmission system. PMUs can update the status of a transmission line and related measurements (e.g., voltage magnitude and voltage phase angle) more frequently.
Every second, a PMU can provide 30 samples of measurements compared to traditional systems (e.g., supervisory control and
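The basic PMU computation mentioned above, estimating a voltage phasor's magnitude and phase angle from time-stamped samples, can be sketched as a single-cycle DFT estimator. This is illustrative only; real PMUs follow the IEEE C37.118 filtering and reporting requirements, and the sampling rate and signal values below are assumptions.

```python
import numpy as np

def phasor_estimate(samples, fs, f0=60.0):
    """Estimate magnitude and phase angle of the fundamental from one
    cycle of voltage samples (textbook single-cycle DFT estimator)."""
    n = len(samples)
    t = np.arange(n) / fs
    ref = np.exp(-2j * np.pi * f0 * t)        # reference rotating phasor
    phasor = 2.0 / n * np.sum(samples * ref)  # complex phasor estimate
    return np.abs(phasor), np.angle(phasor)

fs = 1920.0                      # 32 samples per 60 Hz cycle (assumed)
t = np.arange(32) / fs
v = 170 * np.cos(2 * np.pi * 60 * t + np.radians(-12.0))
mag, ang = phasor_estimate(v, fs)
assert abs(mag - 170) < 1e-6
assert abs(np.degrees(ang) - (-12.0)) < 1e-6
```

GPS time-stamping is what lets angles estimated this way at different substations be compared directly, which is the basis for the wide-area visibility the dissertation relies on.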

  18. FY 1992 research and development project for large-scale industrial technologies. Report on results of R and D of superhigh technological machining systems; 1992 nendo chosentan kako system no kenkyu kaihatsu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-03-01

Described herein are the FY 1992 results of the R and D project aimed at establishing technologies for developing, e.g., machine and electronic device members of superhigh precision and high functionality by excited-beam-aided processing and superhigh-precision machining. The elementary researches on superhigh-precision machining achieved the given targets for precision stability of the feed positioning device. The trial researches on development of high-precision rotating devices are directed at improving the rotational precision of pneumatic static pressure bearings and magnetism correction/controlling circuits, increasing the speed and precision of 3-point type rotational precision measurement methods, and developing rotation-driving motors, achieving a rotational precision of 0.015 μm at 2000 rpm. The researches on ion-beam-aided surface modification technologies involve experiments for producing crystalline Si films, and thin-film transistors from those films, on the surface-modified portion of a large-size glass substrate. The researches on superhigh-technological machining standard measurement involve development of length-measuring systems aided by a dye laser, achieving a precision of ±10 nm or less in a 100 mm measurement range. (NEDO)

  19. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as Newtonian fluids. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL's test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches, where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used
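Aerosol droplet size data of this kind are often summarized by fitting a lognormal distribution, characterized by a median diameter and a geometric standard deviation (GSD). The sketch below illustrates that common representation only; the 50 µm median and GSD of 2 are invented for illustration and are not values from the PNNL report.

```python
import math
import random

def sample_lognormal_diameters(n, median_um, gsd, seed=0):
    """Draw n droplet diameters (micrometres) from a lognormal distribution
    with the given median diameter and geometric standard deviation (GSD)."""
    rng = random.Random(seed)
    mu, sigma = math.log(median_um), math.log(gsd)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

def geometric_stats(diameters):
    """Recover the geometric mean diameter and GSD from a sample."""
    logs = [math.log(d) for d in diameters]
    mean = sum(logs) / len(logs)
    var = sum((x - mean) ** 2 for x in logs) / (len(logs) - 1)
    return math.exp(mean), math.exp(math.sqrt(var))

# Hypothetical spray: 50 um median diameter, GSD of 2
d = sample_lognormal_diameters(10_000, 50.0, 2.0)
gmd, gsd = geometric_stats(d)
# For a large sample, gmd is close to 50 and gsd close to 2
```

For respirable-range questions, the same fitted parameters give the fraction of droplets below a cutoff diameter via the lognormal CDF.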

  20. Estimating the Effectiveness of Special Education Using Large-Scale Assessment Data

    Science.gov (United States)

    Ewing, Katherine Anne

    2009-01-01

    The inclusion of students with disabilities in large scale assessment and accountability programs has provided new opportunities to examine the impact of special education services on student achievement. Hanushek, Kain, and Rivkin (1998, 2002) evaluated the effectiveness of special education programs by examining students' gains on a large-scale…

  1. Fusion research program in Korea

    International Nuclear Information System (INIS)

    Hwang, Y.S.

    1996-01-01

Fusion research in Korea is still immature, but it is a fast-growing program. Groups in several universities and research institutes have been working either on small experiments or in theoretical areas. Recently, a couple of institutes with small fusion-related experiments proposed medium-size tokamak programs in order to enter fusion research at a level of international recognition. Last year, the Korean government finally approved the construction of a 'Superconducting Tokamak' as a national fusion program, and industries such as Korea Electric Power Corp. (KEPCO) and Samsung joined to support the program. The Korea Basic Science Institute (KBSI) has organized national project teams including universities, research institutes and companies. The national project teams have been performing design work since March. (author)

  2. International Research and Studies Program

    Science.gov (United States)

    Office of Postsecondary Education, US Department of Education, 2012

    2012-01-01

    The International Research and Studies Program supports surveys, studies, and instructional materials development to improve and strengthen instruction in modern foreign languages, area studies, and other international fields. The purpose of the program is to improve and strengthen instruction in modern foreign languages, area studies and other…

  3. Results of the research on electrode and insulation wall material in fiscal 1977. Large scale technological development 'R and D on magneto hydrodynamic generation'; 1977 nendo denkyoku oyobi zetsuenheki zairyo ni kansuru kenkyu seika

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1978-06-01

Results of research in fiscal 1977 were compiled concerning electrode and insulation wall materials, the research conducted by the materials working group of the magneto hydrodynamic (MHD) generation R and D liaison conference. Research on trial manufacturing of duct materials for MHD generation was conducted for Si₃N₄-MgO, Si₃N₄-spinel, spinel and sialon based insulation wall materials, MgO-BN based insulation wall materials, tin oxide based electrode materials, cold-pressed ZrO₂-CeO₂ and ZrO₂-Y₂O₃ based electrode materials, hot-pressed and hot hydrostatic pressed ZrO₂-CeO₂ based electrode materials, cermet based electrode materials, etc. In the investigation and measurement of basic characteristics, these materials were put through various tests, such as a 1,300°C, 300-hour K₂SO₄ immersion test, thermal shock resistance, thermal expansibility, and oxidation resistance of the oxide/nitride based materials. In addition, selection of materials for MHD generation, as well as examination and degradation analysis of dynamic characteristics, was carried out by simulation of MHD generation, which provided data on the various electrodes such as consumption, electrical characteristics (electrode voltage drop, critical current, etc.) and thermal characteristics (surface temperature, heat flux, etc.). (NEDO)

  4. FY 1991 Research and development project for large-scale industrial technologies. Report on results of R and D of superhigh technological machining systems; 1991 nendo chosentan kako system no kenkyu kaihatsu seika hokokusho. Chosentan kako system no kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-03-01

Described herein are the FY 1991 results of the R and D project aimed at establishing superprecision machining technologies, for developing machining technologies and nano-technologies aided by excited beams. The research on superprecision machining technologies involved design and trial development of a totally static pressure type positioning device, for which automatically controlled drawing is adopted to improve its rigidity. The research on surface modification technologies aided by ion beams involved scanning the ion beams onto a metallic plate provided around the glass substrate; the results indicate that the secondary electrons generated can be used to control charge-up. In addition, part of a 30 cm-square glass substrate was modified by implantation of high-current-density spot-type ions, and the modified portion was used to produce a thin-film silicon transistor. The research on superhigh-technological machining standard measurement involved improving the precision of the system aided by a dye laser, which attains a precision of 0 to 30 nm over a 0.1 m measurement range. (NEDO)

  5. Environmental and stewardship implications for the large scale conversion of municipal and agricultural organic waste to energy in Canada[Manure, biosolids, and organic industrial/commercial residuals in land applications programs : improving beneficial reuse and protection of water quality]

    Energy Technology Data Exchange (ETDEWEB)

    Falletta, P.; Zhu, H. [Environment Canada, Burlington, ON (Canada). Wastewater Technology Centre; Oleszkiewicz, J. [Manitoba Univ., Winnipeg, MB (Canada). Dept. of Civil Engineering

    2007-07-01

The move towards environmental sustainability in the Canadian industrial, agricultural and municipal sectors, coupled with the requirement for Canada to meet its Kyoto obligations for the reduction of greenhouse gas (GHG) emissions, has led to the need to examine the feasibility of harvesting the energy contained in waste biomass. This paper discussed the current and projected Canadian inventories of municipal biosolids, municipal solid waste, food industry wastes and animal manure; anaerobic digestion; considerations and challenges in the management of waste biomass; and current technologies available for energy recovery for each of these waste streams. The paper also discussed the environmental, technical, economic, societal and regulatory issues likely to be triggered as alternatives to traditional disposal practices are adopted. The research and action needed to bring Canada to the forefront of environmental sustainability in waste biomass management were also discussed. The paper made several recommendations in terms of regulations, demonstration projects and public education. It was concluded that the biggest factor in the adoption of technologies for waste management is cost, and that there is no one perfect solution to the management of organic wastes in Canada. A detailed analysis that takes into consideration all of the technical, societal, environmental, economic, and regulatory issues must be performed to determine the right choice of technology. 4 tabs.

  6. An establishment on the hazard mitigation system of large scale landslides for Zengwen reservoir watershed management in Taiwan

    Science.gov (United States)

    Tsai, Kuang-Jung; Lee, Ming-Hsi; Chen, Yie-Ruey; Huang, Meng-Hsuan; Yu, Chia-Ching

    2016-04-01

Extremely heavy rainfall, with an accumulated amount of more than 2,900 mm over a continuous 3-day event, struck southern Taiwan during Typhoon Morakot in August 2009 and has been recognized as a serious natural hazard. Very destructive large scale landslides and debris flows were induced by this heavy rainfall event. According to the satellite image processing and monitoring project conducted by the Soil & Water Conservation Bureau after Typhoon Morakot, more than 10,904 landslide sites with a total sliding area of 18,113 ha were identified. Field investigations of all landslide areas were also executed in this research on the basis of disaster type, scale and location in relation to topographic conditions, colluvium soil characteristics, bedrock formation and geological structure after the Morakot hazard. The mechanism, characteristics and behavior of these large scale landslides combined with debris flow disasters are analyzed and investigated to sort out the interaction of the factors concerned and to identify the extent of rainfall-induced landslide disasters during the period of this study. In order to reduce the disaster risk of large scale landslides and debris flows, an adaptation strategy for a hazard mitigation system should be established as soon as possible, taking into consideration slope land conservation, landslide control countermeasure planning, disaster database establishment, environmental impact analysis and disaster risk assessment. As a result, this 3-year research has focused on field investigation using GPS/GIS/RS integration, studies of the mechanism and behavior of rainfall-induced landslide occurrence, and establishment of a disaster database and hazard mitigation system. This project has become an important issue of serious concern to the government and the people of Taiwan. Hopefully, all results from this research can be used as guidance for disaster prevention and

  7. Research program on regulatory safety research

    International Nuclear Information System (INIS)

    Mailaender, R.

    2010-02-01

This paper, elaborated for the Swiss Federal Office of Energy (SFOE), presents the 2009 synthesis report made by the SFOE's programme leader on the research programme concerning regulatory nuclear safety research, as co-ordinated by the Swiss Nuclear Safety Inspectorate ENSI. Work carried out in various areas is reviewed, including that done on reactor safety, radiation protection and waste disposal, as well as human aspects, organisation and safety culture. Work done concerning materials, pressure vessel integrity, transient analysis, the analysis of serious accidents in light-water reactors, fuel and material behaviour, and melt cooling and concrete interaction is presented. OECD data bank topics are discussed. Transport and waste disposal research at the Mont Terri rock laboratory is looked at. Requirements placed on personnel employed in nuclear power stations are examined, and national and international co-operation is reviewed.

  8. Economic and agricultural transformation through large-scale farming : impacts of large-scale farming on local economic development, household food security and the environment in Ethiopia

    NARCIS (Netherlands)

    Bekele, M.S.

    2016-01-01

    This study examined impacts of large-scale farming in Ethiopia on local economic development, household food security, incomes, employment, and the environment. The study adopted a mixed research approach in which both qualitative and quantitative data were generated from secondary and primary

  9. Jointly Sponsored Research Program Energy Related Research

    Energy Technology Data Exchange (ETDEWEB)

    Western Research Institute

    2009-03-31

Cooperative Agreement DE-FC26-98FT40323, the Jointly Sponsored Research (JSR) Program at Western Research Institute (WRI), began in 1998. Over the course of the Program, seventy-seven tasks were proposed, utilizing a total of $23,202,579 in USDOE funds. Against this funding, cosponsors committed $26,557,649 in private funds to produce a program valued at $49,760,228. The goal of the Jointly Sponsored Research Program was to develop or assist in the development of innovative technology solutions that will: (1) Increase the production of United States energy resources - coal, natural gas, oil, and renewable energy resources; (2) Enhance the competitiveness of United States energy technologies in international markets and assist in technology transfer; (3) Reduce the nation's dependence on foreign energy supplies and strengthen both the United States and regional economies; and (4) Minimize environmental impacts of energy production and utilization. Under the JSR Program, energy-related tasks emphasized enhanced oil recovery, heavy oil upgrading and characterization, coal beneficiation and upgrading, coal combustion systems development including oxy-combustion, emissions monitoring and abatement, coal gasification technologies including gas clean-up and conditioning, hydrogen and liquid fuels production, coal-bed methane recovery, and the development of technologies for the utilization of renewable energy resources. Environmental-related activities emphasized cleaning contaminated soils and waters, processing of oily wastes, mitigating acid mine drainage, and demonstrating uses for solid waste from clean coal technologies and other advanced coal-based systems. Technology enhancement activities included resource characterization studies and development of improved methods, monitors and sensors. In general, the goals of the tasks proposed were to enhance the competitiveness of U.S. technology, increase production of domestic resources, and reduce environmental

  10. Side effects of problem-solving strategies in large-scale nutrition science: towards a diversification of health.

    Science.gov (United States)

    Penders, Bart; Vos, Rein; Horstman, Klasien

    2009-11-01

    Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.

  11. FY 1998 Report on development of large-scale wind power generation systems. Research on the future prospects of wind power generation systems; 1998 nendo ogata furyoku hatsuden system kaihatsu. Furyoku hatsuden system no shorai tenbo ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    Current status of wind power generation in Japan and situations in foreign countries ahead of Japan are surveyed, in order to clarify the prospects for the future diffusion and expansion of wind power generation systems in Japan. The surveyed trends of wind power generation in Japan include those related to mandatory laws and regulations, e.g., the Electricity Enterprises Act, introductory and operation situations in local autonomies and electric power companies, and R and D efforts by academic and research organizations. The surveyed wind power generation situations in foreign countries include trends of international standardization for wind power generation, and global situations of introducing these systems. The on-the-spot oversea surveys include location/wind conditions in Greece's islands, cyclone-caused damages in India, World Renewable Energy Congress in Perth and advanced technologies in Europe for wind power generation systems, and the survey results are reported in detail. The surveyed R and D projects in Japan include the basic technological R and D plans (draft) for, e.g., wind power generation systems for isolated islands. (NEDO)

  12. FY 1998 Report on development of large-scale wind power generation systems. Research of wind turbines for storm worthy and easy construction; 1998 nendo ogata furyoku hatsuden system kaihatsu. Taikyofu kensetsu yoigata fusha ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

The research and development statuses in various countries are surveyed, to obtain information useful for setting future R and D directions for wind turbines that are storm-resistant and easy to construct. Greece has sites suitable for wind power generation in mountainous districts, and is developing systems that take its characteristic weather conditions into consideration; the country provides information regarding aerodynamic/structural design methods for wind turbine blades applicable to turbulent wind generated by complex terrain, and wind assessment and analyses in complex terrain. In India, on-the-spot surveys are made at cyclone-damaged wind farms. One of the areas on which the USA is putting emphasis is development of small-size wind turbines and wind-diesel hybrid systems for developing countries and for independent grid systems in remote areas. Australia is constructing wind-diesel hybrid systems to be connected to a number of independent grid systems in its western area. In Europe, information is collected on advanced aerodynamic analysis, construction of offshore wind turbines, and production engineering and facilities for blades and other components from Vestas and N.E.G. Micon, the leading wind turbine makers. (NEDO)

  13. Biotechnological lignite conversion - a large-scale concept

    Energy Technology Data Exchange (ETDEWEB)

    Reich-Walber, M.; Meyrahn, H.; Felgener, G.W. [Rheinbraun AG, Koeln (Germany). Fuel Technology and Lab. Dept.

    1997-12-31

Concerning research on biotechnological lignite upgrading, Rheinbraun's overall objective is the large-scale production of liquid and gaseous products for the energy and chemical/refinery sectors. The presentation outlines Rheinbraun's technical concept for electricity production on the basis of biotechnologically solubilized lignite. A first rough cost estimate, based on the assumptions described in detail in the paper and compared with the latest power plant generation, shows the general cost efficiency of this technology despite the additional costs of coal solubilization. The main reasons are low-cost process techniques for coal conversion on the one hand, and cost reductions mainly in power plant technology (more efficient combustion processes and simplified gas clean-up) but also in coal transport (easy fuel handling) on the other hand. Moreover, it is hoped that an extended range of products will make it possible to widen the fields of lignite application. The presentation also points out that there is still a huge gap between this scenario and reality, owing to limited microbiological knowledge. To close this gap, Rheinbraun started a research project supported by the North-Rhine Westphalian government in 1995. Several leading biotechnological companies and institutes in Germany and the United States are involved in the project. The latest results of the current project will be presented in the paper. This includes fundamental research activities in the field of microbial coal conversion as well as investigations into bioreactor design and product treatment (dewatering, deashing and desulphurization). (orig.)

  14. Program of Research in Aeronautics

    Science.gov (United States)

    1981-01-01

A prospectus of the educational and research opportunities available at the Joint Institute for Advancement of Flight Sciences, operated at NASA Langley Research Center in conjunction with George Washington University's School of Engineering and Applied Sciences, is presented. Requirements for admission to the various degree programs are given, as well as the course offerings in the areas of acoustics, aeronautics, environmental modelling, materials science, and structures and dynamics. Research facilities for each field of study are described. Presentations and publications (including dissertations and theses) generated by each program are listed, as are faculty members, visiting scientists, and engineers.

  15. Fusion program research materials inventory

    International Nuclear Information System (INIS)

    Roche, T.K.; Wiffen, F.W.; Davis, J.W.; Lechtenberg, T.A.

    1984-01-01

    Oak Ridge National Laboratory maintains a central inventory of research materials to provide a common supply of materials for the Fusion Reactor Materials Program. This will minimize unintended material variations and provide for economy in procurement and for centralized record keeping. Initially this inventory is to focus on materials related to first-wall and structural applications and related research, but various special purpose materials may be added in the future. The use of materials from this inventory for research that is coordinated with or otherwise related technically to the Fusion Reactor Materials Program of DOE is encouraged

  16. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small and large scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping, scale effects...
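Small-scale overtopping results like these are normally converted to prototype values under Froude similitude, in which lengths (such as the wave height Hs) scale linearly with the geometric scale and discharge per unit width scales with that scale to the power 1.5; the scale effects discussed above are deviations from this law caused by viscosity and surface tension, which are not scaled. A minimal sketch of the conversion, with an invented 1:20 scale and invented model measurements (not data from the paper):

```python
def froude_scale_overtopping(q_model, length_scale):
    """Convert a model overtopping discharge per unit width (m^2/s) to
    prototype scale under Froude similitude: q scales with L^1.5."""
    return q_model * length_scale ** 1.5

def froude_scale_height(h_model, length_scale):
    """Lengths (e.g., significant wave height Hs) scale linearly with L."""
    return h_model * length_scale

# Hypothetical 1:20 model measurements
scale = 20.0
q_proto = froude_scale_overtopping(1.0e-4, scale)   # ~8.94e-3 m^2/s at prototype
hs_proto = froude_scale_height(0.05, scale)         # 1.0 m at prototype
```

The breakdown of this conversion at very low discharges (where water sinks into the armour voids) is exactly why large-scale validation tests such as those in Hannover are needed.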

  17. Research program plan: steam generators

    International Nuclear Information System (INIS)

    Muscara, J.; Serpan, C.Z. Jr.

    1985-07-01

This document presents a plan for research in steam generators to be performed by the Materials Engineering Branch (MEBR), Division of Engineering Technology (EDET), Office of Nuclear Regulatory Research. It is one of four plans describing the ongoing research in the corresponding areas of MEBR activity. In order to answer the questions posed, the Steam Generator Program has been organized around three elements: non-destructive examination; mechanical integrity testing; and corrosion, cleaning and decontamination.

  18. Quality Control Charts in Large-Scale Assessment Programs

    Science.gov (United States)

    Schafer, William D.; Coverdale, Bradley J.; Luxenberg, Harlan; Jin, Ying

    2011-01-01

There are relatively few examples of quantitative approaches to quality control in educational assessment and accountability contexts. Among the several techniques that are used in other fields, Shewhart charts have been found in a few instances to be applicable in educational settings. This paper describes Shewhart charts and gives examples of how…
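A Shewhart chart tracks a process statistic against control limits placed at the baseline mean plus or minus three standard deviations; points outside the limits signal special-cause variation worth investigating. A minimal sketch of the computation (the weekly scoring-error rates are invented for illustration, not drawn from the paper):

```python
import statistics

def shewhart_limits(baseline, k=3.0):
    """Compute lower limit, center line, and upper limit from baseline data."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean - k * sd, mean, mean + k * sd

def out_of_control(values, lcl, ucl):
    """Return indices of observations falling outside the control limits."""
    return [i for i, v in enumerate(values) if not lcl <= v <= ucl]

# Hypothetical weekly scoring-error rates (%) from a stable baseline period
baseline = [1.1, 0.9, 1.0, 1.2, 0.8, 1.0, 1.1, 0.9]
lcl, center, ucl = shewhart_limits(baseline)

# New monitoring data; the 4.0 point signals a special cause
new_rates = [1.0, 1.2, 4.0, 0.9]
print(out_of_control(new_rates, lcl, ucl))  # → [2]
```

In an assessment program, the monitored statistic might be a scoring-error rate, a mean scale score, or an item-response time per administration window.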

  19. Program planning for large-scale control system upgrades

    International Nuclear Information System (INIS)

    Madani, M.; Giajnorio, J.; Richard, T.; Ho, D.; Volk, W.; Ertel, A.

    2011-01-01

Bruce Power has been planning to replace the Bruce A Fuel Handling (FH) computer systems, including the Controller and Protective computers, for many years. This is a complex project requiring an extended FH outage. To minimize operational disruption and fully identify associated project risks, Bruce Power is executing the project in phases, starting with the Protective Computer replacement. GEH-C is collaborating with Bruce Power in a Preliminary Engineering (PE) phase to generate a project plan, including specifications, budgetary cost, schedule, and risks, for the Protective Computer replacement project. To assist Bruce Power in its evaluation, GEH-C is using 6-Sigma methodologies to identify and rank Critical to Quality (CTQ) requirements in collaboration with the Bruce Power Maintenance, Operations, Plant Design and FH Engineering teams. The PE phase established the project scope, hardware and software specifications, and material requirements, and concluded with a recommended hardware platform and an approved controls architecture.

  20. Aspects of FORTRAN in large-scale programming

    International Nuclear Information System (INIS)

    Metcalf, M.

    1983-01-01

In these two lectures I examine the following three questions: i) Why did high-energy physicists begin to use FORTRAN? ii) Why do high-energy physicists continue to use FORTRAN? iii) Will high-energy physicists always use FORTRAN? In order to find answers to these questions, it is necessary to look at the history of the language, its present position, and its likely future, and also to consider its manner of use, the topic of portability, and the competition from other languages. Here we think especially of early competition from ALGOL, the more recent spread in the use of PASCAL, and the appearance of a completely new and ambitious language, ADA. (orig.)

  1. Aspects of FORTRAN in large-scale programming

    CERN Document Server

    Metcalf, M

    1983-01-01

    In these two lectures I shall try to examine the following three questions: i) Why did high-energy physicists begin to use FORTRAN? ii) Why do high-energy physicists continue to use FORTRAN? iii) Will high-energy physicists always use FORTRAN? In order to find answers to these questions, it is necessary to look at the history of the language, its present position, and its likely future, and also to consider its manner of use, the topic of portability, and the competition from other languages. Here we think especially of early competition from ALGOL, the more recent spread in the use of PASCAL, and the appearance of a completely new and ambitious language, ADA.

  2. GRI's Devonian Shales Research Program

    International Nuclear Information System (INIS)

    Guidry, F.K.

    1991-01-01

This paper presents a summary of the key observations and conclusions from the Gas Research Institute's (GRI's) Comprehensive Study Well (CSW) research program conducted in the Devonian Shales of the Appalachian Basin. Initiated in 1987, the CSW program was a series of highly instrumented study wells drilled in cooperation with industry partners. Seven wells were drilled as part of the program. Extensive data sets were collected and special experiments were run on the CSWs in addition to the operator's normal operations, with the objectives of identifying geologic production controls, refining formation evaluation tools, and improving reservoir description and stimulation practices in the Devonian Shales. This paper highlights the key results from the research conducted in the CSW program in the areas of geologic production controls, formation evaluation, stimulation and reservoir engineering, and field operations. The development of geologic, log analysis, and reservoir models for the Shales from the data gathered and analyzed during the research is discussed. In addition, on the basis of what was learned in the CSW program, GRI's plans for new research in the Devonian Shales are described

  3. Optimization of large-scale industrial systems : an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

This paper reviewed optimization methods of large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology was based on a combined local optimality search with global optimality determination, and advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E³-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; advanced constraint handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.
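Multi-objective EAs of the kind described search for the Pareto front: the set of solutions not dominated by any other candidate. A minimal non-dominated filter, assuming all objectives are minimized (the (energy, cost, emissions) vectors are invented for illustration and are not from the paper):

```python
def dominates(a, b):
    """True if objective vector a dominates b (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical (energy use, cost, emissions) vectors for four designs
pts = [(1.0, 5.0, 3.0), (2.0, 4.0, 2.0), (1.5, 6.0, 4.0), (3.0, 7.0, 5.0)]
print(pareto_front(pts))  # → [(1.0, 5.0, 3.0), (2.0, 4.0, 2.0)]
```

In a full EA, a filter like this ranks each generation's population; the decomposition and constraint-handling techniques the paper describes then shape how new candidates are generated.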

  4. Large-scale land transformations in Indonesia: The role of ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... enable timely responses to the impacts of large-scale land transformations in Central Kalimantan ... In partnership with UNESCO's Organization for Women in Science for the ... New funding opportunity for gender equality and climate change.

  5. Resolute large scale mining company contribution to health services of

    African Journals Online (AJOL)

Resolute large scale mining company contribution to health services of Lusu ... in terms of socio economic, health, education, employment, safe drinking water, ... The data were analyzed using the Statistical Package for the Social Sciences (SPSS).

  6. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  7. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    fuel/energy, climate, and finance has occurred and one of the most ... this wave of large-scale land acquisitions. In fact, esti- ... Environmental Rights Action/Friends of the Earth,. Nigeria ... map the differentiated impacts (gender, ethnicity,.

  8. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... Security can help increase accountability for large-scale land acquisitions in ... to build decent economic livelihoods and participate meaningfully in decisions ... its 2017 call for proposals to establish Cyber Policy Centres in the Global South.

  9. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  10. No Large Scale Curvature Perturbations during Waterfall of Hybrid Inflation

    OpenAIRE

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2010-01-01

    In this paper the possibility of generating large scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical back-reactions to terminate inflation. If one considers only the clas...

  11. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
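The overfitting problem this record addresses is commonly mitigated by shrinking the sample covariance toward a structured target. The sketch below shows that generic shrinkage idea as a plain convex combination with a scaled-identity target; it is not the paper's hierarchical Bayes model, and the data are invented:

```python
def sample_covariance(data):
    """Unbiased sample covariance; rows are observations, columns variables."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    cov = [[0.0] * p for _ in range(p)]
    for row in data:
        for i in range(p):
            for j in range(p):
                cov[i][j] += (row[i] - means[i]) * (row[j] - means[j]) / (n - 1)
    return cov

def shrink(cov, alpha):
    """Convex combination of the sample covariance with a scaled-identity
    target (average variance on the diagonal). alpha=0 keeps the raw
    estimate; alpha=1 returns the target."""
    p = len(cov)
    mu = sum(cov[i][i] for i in range(p)) / p
    return [[(1.0 - alpha) * cov[i][j] + alpha * (mu if i == j else 0.0)
             for j in range(p)] for i in range(p)]

# Invented data: 3 observations of 2 variables.
cov = sample_covariance([[1.0, 2.0], [3.0, 3.0], [5.0, 4.0]])
print(cov)                 # [[4.0, 2.0], [2.0, 1.0]]
print(shrink(cov, 0.5))    # [[3.25, 1.0], [1.0, 1.75]]
```

When the number of variables rivals or exceeds the number of observations, the raw estimate is ill-conditioned; pulling its eigenvalues toward the average variance trades a little bias for a large variance reduction, which is the same trade-off the hierarchical prior in the paper formalizes.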

  12. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  13. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    ERDC/CHL CHETN-I-88, April 2016. This technical note describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase... A detailed discussion of the original LSTF features and capabilities can be...

  14. Large-scale fortification of condiments and seasonings as a public health strategy: equity considerations for implementation.

    Science.gov (United States)

    Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia

    2016-09-01

    Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are considered. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.

  15. Large-Scale Laboratory Facility For Sediment Transport Research

    Data.gov (United States)

    Federal Laboratory Consortium — Effective design and maintenance of inlet navigation and shore protection projects require accurate estimates of the quantity of sand that moves along the beach. The...

  16. The development of a capability for aerodynamic testing of large-scale wing sections in a simulated natural rain environment

    Science.gov (United States)

    Bezos, Gaudy M.; Cambell, Bryan A.; Melson, W. Edward

    1989-01-01

    A research technique to obtain large-scale aerodynamic data in a simulated natural rain environment has been developed. A 10-ft chord NACA 64-210 wing section equipped with leading-edge and trailing-edge high-lift devices was tested as part of a program to determine the effect of highly-concentrated, short-duration rainfall on airplane performance. Preliminary dry aerodynamic data are presented for the high-lift configuration at a velocity of 100 knots and an angle of attack of 18 deg. Also, data are presented on rainfield uniformity and rainfall concentration intensity levels obtained during the calibration of the rain simulation system.

  17. Containment integrity research program plan

    International Nuclear Information System (INIS)

    1987-08-01

    This report presents a plan for research on the question of containment performance in postulated severe accident scenarios. It focuses on the research being performed by the Structural and Seismic Engineering Branch, Division of Engineering, Office of Nuclear Regulatory Research. Summaries of the plans for this work have previously been published in the ''Nuclear Power Plant Severe Accident Research Plan'' (NUREG-0900). This report provides an update to reflect current status. This plan provides a summary of results to date as well as an outline of planned activities and milestones to the contemplated completion of the program in FY 1989

  18. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing

    International Nuclear Information System (INIS)

    1993-10-01

    temperatures. The meeting in Oak Ridge was designed to allow leading specialists to share and review recent large - scale fracture experiments and to discuss them relative to verification of fracture mechanics methods. The objective was to assess the ability of analytical methods that may currently be used to model the fracture behavior of nuclear reactor components and structures. The meeting was organized into six technical sessions: Session I: CSNI Project FALSIRE - Current Results; Session II: Large-Scale Experiments and Applications; Session III: Assessments of Fracture Mechanics Analysis Methods; Session IV: Large-Scale Plate Experiments and Analyses; Session V: Fracture Modeling and Transferability; Session VI: Large-Scale Piping Experiments and Analyses. This report records all the papers presented at this meeting along with two others whose authors could not be present. While the report does not include session dividers, the table of contents shows the grouping of papers by session. The final chapter of this report provides summaries that rapporteurs prepared on the day the papers were presented. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The international nature of the meeting is illustrated by the fact that papers were presented by researchers from CSFR, Finland, France, Germany, Japan, Russia, U.S.A., and the U.K. 
There were experts present from several other countries who participated in discussing the results.

  19. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    Science.gov (United States)

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Alignment between galaxies and large-scale structure

    International Nuclear Information System (INIS)

    Faltenbacher, A.; Li Cheng; White, Simon D. M.; Jing, Yi-Peng; Mao Shude; Wang Jie

    2009-01-01

    Based on the Sloan Digital Sky Survey DR6 (SDSS) and the Millennium Simulation (MS), we investigate the alignment between galaxies and large-scale structure. For this purpose, we develop two new statistical tools, namely the alignment correlation function and the cos(2θ)-statistic. The former is a two-dimensional extension of the traditional two-point correlation function and the latter is related to the ellipticity correlation function used for cosmic shear measurements. Both are based on the cross correlation between a sample of galaxies with orientations and a reference sample which represents the large-scale structure. We apply the new statistics to the SDSS galaxy catalog. The alignment correlation function reveals an overabundance of reference galaxies along the major axes of red, luminous (L ∼ L*) galaxies out to projected separations of 60 h⁻¹ Mpc. The signal increases with central galaxy luminosity. No alignment signal is detected for blue galaxies. The cos(2θ)-statistic yields very similar results. Starting from a MS semi-analytic galaxy catalog, we assign an orientation to each red, luminous and central galaxy, based on that of the central region of the host halo (with size similar to that of the stellar galaxy). As an alternative, we use the orientation of the host halo itself. We find a mean projected misalignment between a halo and its central region of ∼ 25 deg. The misalignment decreases slightly with increasing luminosity of the central galaxy. Using the orientations and luminosities of the semi-analytic galaxies, we repeat our alignment analysis on mock surveys of the MS. Agreement with the SDSS results is good if the central orientations are used. Predictions using the halo orientations as proxies for central galaxy orientations overestimate the observed alignment by more than a factor of 2. Finally, the large volume of the MS allows us to generate a two-dimensional map of the alignment correlation function, which shows the reference
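The cos(2θ)-statistic named in this record averages the cosine of twice the angle between a galaxy's major-axis position angle and the direction toward the reference sample: +1 indicates perfect alignment along the separation direction, −1 perpendicular alignment, and values near 0 isotropy. A minimal sketch with hypothetical inputs (not the SDSS analysis pipeline):

```python
import math

def cos2theta_statistic(position_angles, separation_angles):
    """Mean of cos(2*theta), where theta is the angle between each galaxy's
    major-axis position angle and the direction toward the reference sample
    (all angles in radians)."""
    n = len(position_angles)
    return sum(math.cos(2.0 * (phi - sep))
               for phi, sep in zip(position_angles, separation_angles)) / n

# Hypothetical inputs: two perfectly aligned galaxies, then one perpendicular.
print(cos2theta_statistic([0.3, 1.0], [0.3, 1.0]))   # 1.0
print(cos2theta_statistic([0.0], [math.pi / 2]))     # -1.0
```

The factor 2 makes the statistic invariant under a 180° flip of the major axis, which is why it is the natural measure for orientations rather than directions.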

  1. Large scale solar district heating. Evaluation, modelling and designing

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.

    2000-07-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the tool for design studies and on a local energy planning case. The evaluation of the central solar heating technology is based on measurements on the case plant in Marstal, Denmark, and on published and unpublished data for other, mainly Danish, CSDHP plants. Evaluations of the thermal, economic and environmental performance are reported, based on the experiences from the last decade. The measurements from the Marstal case are analysed, experiences extracted and minor improvements to the plant design proposed. For the detailed design and energy planning of CSDHPs, a computer simulation model is developed and validated against the measurements from the Marstal case. The final model is then generalised to a 'generic' model for CSDHPs. The meteorological reference data set, the Danish Reference Year, is applied to find the mean performance for the plant designs. To find the expected variation in the thermal performance of such plants, a method is proposed in which data from a year with poor solar irradiation and a year with strong solar irradiation are applied. Equipped with the simulation tool, design studies are carried out, ranging from parameter analyses and energy planning for a new settlement to a proposal for combining flat-plate solar collectors with high-performance collectors, exemplified by a trough collector. Utilising computer simulation proved to be a cheap and relevant approach in the design of future solar heating plants. The thesis also exposed the need to develop computer models for the more advanced solar collector designs and especially for the control and operation of CSDHPs. In the final chapter the CSDHP technology is put into perspective with respect to other possible technologies to assess the relevance of the application.

  2. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  3. Experimental study on dynamic behavior of large scale foundation, 1

    International Nuclear Information System (INIS)

    Hanada, Kazufumi; Sawada, Yoshihiro; Esashi, Yasuyuki; Ueshima, Teruyuki; Nakamura, Hideharu

    1983-01-01

    The large, high-performance vibrating table in the Nuclear Power Engineering Test Center is installed on a large-scale concrete foundation of length 90.9 m, width 44.8 m and maximum thickness 21 m, weighing 150,000 tons. Through the experimental study of the behavior of this foundation, which rests on gravel ground, useful information should be obtained on the siting of a nuclear power plant on Quaternary stratum ground. The objective of the research is to grasp the vibration characteristics of the foundation during operation of the table, to evaluate the interaction between the foundation and the ground, and to evaluate an analytical method for numerically simulating the vibration behavior. In the present study, the vibration behavior of the foundation was clarified by measurement, and to predict the vibration behavior, the semi-infinite theory of elasticity was applied. The accuracy of this analytical method was demonstrated by comparison with the measured results. (Mori, K.)

  4. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  5. Episodic memory in aspects of large-scale brain networks

    Science.gov (United States)

    Jeong, Woorim; Chung, Chun Kee; Kim, June Sic

    2015-01-01

    Understanding human episodic memory in aspects of large-scale brain networks has become one of the central themes in neuroscience over the last decade. Traditionally, episodic memory was regarded as mostly relying on medial temporal lobe (MTL) structures. However, recent studies have suggested involvement of more widely distributed cortical network and the importance of its interactive roles in the memory process. Both direct and indirect neuro-modulations of the memory network have been tried in experimental treatments of memory disorders. In this review, we focus on the functional organization of the MTL and other neocortical areas in episodic memory. Task-related neuroimaging studies together with lesion studies suggested that specific sub-regions of the MTL are responsible for specific components of memory. However, recent studies have emphasized that connectivity within MTL structures and even their network dynamics with other cortical areas are essential in the memory process. Resting-state functional network studies also have revealed that memory function is subserved by not only the MTL system but also a distributed network, particularly the default-mode network (DMN). Furthermore, researchers have begun to investigate memory networks throughout the entire brain not restricted to the specific resting-state network (RSN). Altered patterns of functional connectivity (FC) among distributed brain regions were observed in patients with memory impairments. Recently, studies have shown that brain stimulation may impact memory through modulating functional networks, carrying future implications of a novel interventional therapy for memory impairment. PMID:26321939

  6. Large scale photovoltaic field trials. Second technical report: monitoring phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This report provides an update on the Large-Scale Building Integrated Photovoltaic Field Trials (LS-BIPV FT) programme commissioned by the Department of Trade and Industry (later the Department for Business, Enterprise and Regulatory Reform; BERR). It provides detailed profiles of the 12 projects making up this programme, which is part of the UK programme on photovoltaics and has run in parallel with the Domestic Field Trial. These field trials aim to record the experience and use the lessons learnt to raise awareness of, and confidence in, the technology and increase UK capabilities. The projects involved: the visitor centre at the Gaia Energy Centre in Cornwall; a community church hall in London; council offices in West Oxfordshire; a sports science centre at Gloucester University; the visitor centre at Cotswold Water Park; the headquarters of the Insolvency Service; a Welsh Development Agency building; an athletics centre in Birmingham; a research facility at the University of East Anglia; a primary school in Belfast; and Barnstaple civic centre in Devon. The report describes the aims of the field trials, monitoring issues, performance, observations and trends, lessons learnt and the results of occupancy surveys.

  7. Episodic memory in aspects of large-scale brain networks

    Directory of Open Access Journals (Sweden)

    Woorim eJeong

    2015-08-01

    Full Text Available Understanding human episodic memory in aspects of large-scale brain networks has become one of the central themes in neuroscience over the last decade. Traditionally, episodic memory was regarded as mostly relying on medial temporal lobe (MTL) structures. However, recent studies have suggested involvement of more widely distributed cortical network and the importance of its interactive roles in the memory process. Both direct and indirect neuro-modulations of the memory network have been tried in experimental treatments of memory disorders. In this review, we focus on the functional organization of the MTL and other neocortical areas in episodic memory. Task-related neuroimaging studies together with lesion studies suggested that specific sub-regions of the MTL are responsible for specific components of memory. However, recent studies have emphasized that connectivity within MTL structures and even their network dynamics with other cortical areas are essential in the memory process. Resting-state functional network studies also have revealed that memory function is subserved by not only the MTL system but also a distributed network, particularly the default-mode network. Furthermore, researchers have begun to investigate memory networks throughout the entire brain not restricted to the specific resting-state network. Altered patterns of functional connectivity among distributed brain regions were observed in patients with memory impairments. Recently, studies have shown that brain stimulation may impact memory through modulating functional networks, carrying future implications of a novel interventional therapy for memory impairment.

  8. The Australian synchrotron research program

    International Nuclear Information System (INIS)

    Garrett, R.F.

    1998-01-01

    Full text: The Australian Synchrotron Research Program (ASRP) was established in 1996 under a 5 year grant from the Australian Government, and is managed by ANSTO on behalf of a consortium of Australian universities and research organisations. It has taken over the operation of the Australian National Beamline Facility (ANBF) at the Photon Factory, and has joined two CATs at the Advanced Photon Source: the Synchrotron Radiation Instrumentation CAT (SRI-CAT) and the Consortium for Advanced Radiation Sources (CARS). The ASRP thus manages a comprehensive range of synchrotron radiation research facilities for Australian science. The ANBF is a general purpose hard X-ray beamline which has been in operation at the Photon Factory since 1993. It currently caters for about 35 Australian research teams per year. The facilities available at the ANBF will be presented and the research program will be summarised. The ASRP facilities at the APS comprise the 5 sectors operated by SRI-CAT, BioCARS and ChemMatCARS. A brief description will be given of the ASRP research programs at the APS, which will considerably broaden the scope of Australian synchrotron science.

  9. The Rights and Responsibility of Test Takers When Large-Scale Testing Is Used for Classroom Assessment

    Science.gov (United States)

    van Barneveld, Christina; Brinson, Karieann

    2017-01-01

    The purpose of this research was to identify conflicts in the rights and responsibility of Grade 9 test takers when some parts of a large-scale test are marked by teachers and used in the calculation of students' class marks. Data from teachers' questionnaires and students' questionnaires from a 2009-10 administration of a large-scale test of…

  10. Higher Education Teachers' Descriptions of Their Own Learning: A Large-Scale Study of Finnish Universities of Applied Sciences

    Science.gov (United States)

    Töytäri, Aija; Piirainen, Arja; Tynjälä, Päivi; Vanhanen-Nuutinen, Liisa; Mäki, Kimmo; Ilves, Vesa

    2016-01-01

    In this large-scale study, higher education teachers' descriptions of their own learning were examined with qualitative analysis involving application of principles of phenomenographic research. This study is unique: it is unusual to use large-scale data in qualitative studies. The data were collected through an e-mail survey sent to 5960 teachers…

  11. Nebraska Prostate Cancer Research Program

    Science.gov (United States)

    2015-10-01

    [Handbook table-of-contents residue] Student Engagement sections: Welcome, UNMC, Omaha, Arrival, Living, Events, Graduates. Channing Bunch, M.B.A., Director of Recruitment and Student Engagement (channing.bunch...). Program, Eppley Institute, Office of Research and Development, and Recruitment and Student Engagement responses to Nebraska Prostate...

  12. Optimiturve research program in 1991

    International Nuclear Information System (INIS)

    Leinonen, A.

    1992-01-01

    The target of the program is to develop a peat production method, based on solar energy, by which it is possible to double the present annual hectare yield. It has been estimated that if this target is fulfilled, production costs can be decreased by about 20 %. The target has been pursued by intensifying the utilization of solar radiation, improving the collection rate of dry peat, decreasing the effects of rain on production, lengthening the production season and decreasing storage losses. Three new peat production methods have so far been developed in the Optimiturve research program by which it is possible to achieve the targets of the program. These methods are the new sod peat production method, the ridge drying method and the Multi method.

  13. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
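This record builds on the scaling exponent of Detrended Fluctuation Analysis (DFA). A minimal, self-contained sketch of standard first-order DFA is given below; it is not the authors' code, and the window sizes and test data are illustrative:

```python
import math
import random

def _linfit(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def dfa_fluctuation(series, window):
    """RMS fluctuation F(s): integrate the mean-subtracted series, split the
    profile into non-overlapping windows of size s, remove a linear trend
    from each window, and take the root-mean-square residual."""
    mean = sum(series) / len(series)
    profile, running = [], 0.0
    for v in series:
        running += v - mean
        profile.append(running)
    nwin = len(profile) // window
    xs = list(range(window))
    total = 0.0
    for w in range(nwin):
        seg = profile[w * window:(w + 1) * window]
        slope, icept = _linfit(xs, seg)
        total += sum((y - (slope * x + icept)) ** 2 for x, y in zip(xs, seg))
    return math.sqrt(total / (nwin * window))

def dfa_exponent(series, windows=(4, 8, 16, 32)):
    """Scaling exponent alpha: slope of log F(s) versus log s."""
    logs = [math.log(s) for s in windows]
    logf = [math.log(dfa_fluctuation(series, s)) for s in windows]
    return _linfit(logs, logf)[0]

# Illustrative check (invented data, not from the paper): uncorrelated noise
# should give alpha near 0.5; its cumulative sum (a random walk) near 1.5.
random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(1024)]
print(round(dfa_exponent(noise), 2))
```

The paper's "local variations in the scaling exponent" would correspond to computing this exponent in sliding windows along a chromosome sequence (with nucleotides mapped to numbers, e.g. GC vs AT), so that segments with distinct compositional patchiness stand out.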

  14. Using Practitioner Inquiry within and against Large-Scale Educational Reform

    Science.gov (United States)

    Hines, Mary Beth; Conner-Zachocki, Jennifer

    2015-01-01

    This research study examines the impact of teacher research on participants in a large-scale educational reform initiative in the United States, No Child Left Behind, and its strand for reading teachers, Reading First. Reading First supported professional development for teachers in order to increase student scores on standardized tests. The…

  15. Hierarchical formation of large scale structures of the Universe: observations and models

    International Nuclear Information System (INIS)

    Maurogordato, Sophie

    2003-01-01

    In this report for an Accreditation to Supervise Research (HDR), the author gives an overview of her research work in cosmology. This work notably addressed the large-scale distribution of matter in the Universe (with constraints on the formation scenario, the bias relationship, and the structuring into clusters), the analysis of galaxy clusters during mergers, and the mass distribution within relaxed clusters.

  16. Survey of large-scale solar water heaters installed in Taiwan, China

    Energy Technology Data Exchange (ETDEWEB)

    Chang Keh-Chin; Lee Tsong-Sheng; Chung Kung-Ming [Cheng Kung Univ., Tainan (China); Lien Ya-Feng; Lee Chine-An [Cheng Kung Univ. Research and Development Foundation, Tainan (China)

    2008-07-01

    Almost all the solar collectors installed in Taiwan, China are used for the production of hot water for homeowners (residential systems), in which the area of solar collectors is less than 10 square meters. From 2001 to 2006, only 39 large-scale systems (defined as having a solar collector area over 100 m{sup 2}) were installed. They are used for rooming houses (dormitories), swimming pools, restaurants, and manufacturing processes. A comprehensive survey of these large-scale solar water heaters was conducted in 2006. The objectives of the survey were to assess the systems' performance and to obtain feedback from individual users. It was found that lack of experience in system design and maintenance is the key obstacle to reliable operation of a system. For further promotion of large-scale solar water heaters in Taiwan, a more comprehensive program on system design for manufacturing processes should be conducted. (orig.)

  17. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as it relates to estimating the size of the representative elementary volume, evaluating the appropriateness of continuum assumptions, and estimating the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low-permeability sedimentary rocks is presented. The approach involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimating the size of the representative elementary volume and the large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made.
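
    The representative-elementary-volume (REV) idea behind this test configuration can be sketched with a toy calculation: block-averaged permeability estimates scatter widely at small averaging volumes and stabilize as the volume grows. The lognormal field below is a stand-in for a heterogeneous fractured block, not the report's discrete fracture model; the grid size and variance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
# Toy heterogeneous permeability field (log10 of k in m^2); real REV
# analyses would use a discrete fracture network instead.
log_k = rng.normal(loc=-14.0, scale=1.0, size=(256, 256))

def block_means(field, w):
    """Mean of log-k (i.e. geometric-mean k) over non-overlapping w x w blocks."""
    n = field.shape[0] // w
    blocks = field[:n * w, :n * w].reshape(n, w, n, w).swapaxes(1, 2)
    return blocks.mean(axis=(2, 3))

# Spread of block-scale estimates shrinks as the averaging volume grows;
# the scale at which it becomes acceptably small marks the REV.
spreads = {w: block_means(log_k, w).std() for w in (4, 16, 64)}
```

    For this uncorrelated toy field the spread falls roughly as 1/w; in fractured rock, long-range fracture correlations can make the REV much larger or ill-defined, which is exactly what the field test is meant to probe.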

  18. Subsurface transport program: Research summary

    International Nuclear Information System (INIS)

    1987-01-01

    DOE's research program in subsurface transport is designed to provide a base of fundamental scientific information so that the geochemical, hydrological, and biological mechanisms that contribute to the transport and long-term fate of energy-related contaminants in subsurface ecosystems can be understood. Understanding the physical and chemical mechanisms that control the transport of single contaminants and co-contaminants is the underlying concern of the program. Particular attention is given to interdisciplinary research and to geosphere-biosphere interactions. The scientific results of the program will contribute to resolving Departmental questions related to the disposal of energy-producing and defense wastes. The background papers prepared in support of this document contain additional information on the long-term relevance of the research to energy-producing technologies. Detailed scientific plans and other research documents are available for high-priority research areas, for example, in the subsurface transport of organic chemicals and mixtures and in the microbiology of deep aquifers. 5 figs., 1 tab

  19. The relationship between large-scale and convective states in the tropics - Towards an improved representation of convection in large-scale models

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, Christian [Monash Univ., Melbourne, VIC (Australia)

    2015-02-26

    This report summarises an investigation into the relationship of tropical thunderstorms to the atmospheric conditions they are embedded in. The study is based on radar observations at the Atmospheric Radiation Measurement site in Darwin, run under the auspices of the DOE Atmospheric System Research program. Linking the larger scales of the atmosphere with the smaller scales of thunderstorms is crucial for developing the representation of thunderstorms in weather and climate models, a process termed parametrisation. Through the analysis of radar and wind profiler observations, the project made several fundamental discoveries about tropical storms and quantified the relationship of the occurrence and intensity of these storms to the large-scale atmosphere. We were able to show that the rainfall averaged over an area the size of a typical climate model grid-box is largely controlled by the number of storms in the area, and less so by the storm intensity. This allows us to completely rethink the way we represent such storms in climate models. We also found that storms occur in three distinct categories based on their depth, and that the transition between these categories is more strongly related to the larger-scale dynamical features of the atmosphere than to its thermodynamic state. Finally, we used our observational findings to test and refine a new approach to cumulus parametrisation which relies on stochastic modelling of the area covered by different convective cloud types.
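
    The finding that grid-box rainfall is controlled mainly by storm number can be illustrated with a compound-Poisson toy model: daily area rainfall is the sum of a Poisson-distributed number of storms, each contributing a lognormal rain amount. The distribution parameters below are illustrative assumptions, not values fitted to the Darwin observations.

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 5000

# Storms per grid box per day, and per-storm rain amounts.
counts = rng.poisson(lam=5.0, size=n_days)
grid_rain = np.empty(n_days)
mean_intensity = np.empty(n_days)
for d in range(n_days):
    amounts = rng.lognormal(mean=0.0, sigma=0.5, size=counts[d])
    grid_rain[d] = amounts.sum()
    mean_intensity[d] = amounts.mean() if counts[d] > 0 else 0.0

# Correlate area rainfall with storm count vs. mean storm intensity.
corr_count = np.corrcoef(counts, grid_rain)[0, 1]
corr_intensity = np.corrcoef(mean_intensity, grid_rain)[0, 1]
```

    In this toy setting the count correlation is much the stronger of the two, mirroring the report's conclusion that a parametrisation should predict how many storms occur, with intensity playing a secondary role.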

  20. Development of large scale wind energy conversion system; Ogata furyoku hatsuden system no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Takita, M [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1994-12-01

    Described herein are the results of the FY1994 research program for development of a large-scale wind energy conversion system. The study on technological development of key components evaluates the performance of, and confirms the reliability and applicability of, hydraulic systems, centered on those equipped with variable-pitch mechanisms and the electrohydraulic servo valves that control them. The study on blades conducts fatigue and crack-propagation tests, which show that the blades developed have high strength. The study on the speed-increasing gear conducts load tests, confirming the effects of reducing vibration and noise by modification of the gear teeth. The study on the nacelle cover conducts vibration tests to confirm its vibration characteristics, and analyzes three-dimensional vibration by the finite element method. Some components for a 500 kW commercial wind turbine are fabricated, including rotor heads, variable-pitch mechanisms, speed-increasing gears, yaw systems, and hydraulic control systems. Also fabricated are a remote supervisory control system for maintenance, a system to integrate the wind turbine into a power grid, and electrical control devices in which site conditions, such as atmospheric temperature and lightning, are taken into consideration.