WorldWideScience

Sample records for research program large-scale

  1. Possible research program on a large scale nuclear pressure vessel

    International Nuclear Information System (INIS)

    1983-01-01

    The structural integrity of nuclear pressure vessels is currently one of the main issues in the field of nuclear plant safety. An international study group was set up to investigate the feasibility of a ''possible research program'' on a full-scale (1:1) LWR pressure vessel. This report presents the study group's work. The different research programs carried out or under way in various countries of the European Community are presented (phase 1 of the study). The main characteristics of the vessels considered for the program are listed, together with an evaluation of the activities required to make them available. Research topic priorities from the different interested countries are summarized in tables (phase 2), and a critical review of the topics by the study group is presented. Proposals for possible experimental programs, and combinations of these programs, are then presented, purely as examples of possibly useful research activities. The documents pertaining to the results of the phase 1 inquiry performed by the study group are reported in the appendix.

  2. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between societal sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods with regard to profitability, nor can it hope for full recognition by the basic-research-oriented scientific community. Policy consulting neither possesses the political system's assigned competence to make decisions, nor can it successfully be judged by the critical standards of established social science, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: 1) external control, 2) the form of organization, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  3. Parallelizing Gene Expression Programming Algorithm in Enabling Large-Scale Classification

    Directory of Open Access Journals (Sweden)

    Lixiong Xu

    2017-01-01

    Full Text Available As one of the most effective function mining algorithms, the Gene Expression Programming (GEP) algorithm has been widely used in classification, pattern recognition, prediction, and other research fields. Through self-evolution, GEP is able to mine an optimal function for dealing with further complicated tasks. However, in big data research, GEP suffers from low efficiency due to its time-consuming mining process. To improve the efficiency of GEP in big data research, especially for processing large-scale classification tasks, this paper presents a parallelized GEP algorithm using the MapReduce computing model. The experimental results show that the presented algorithm is scalable and efficient for processing large-scale classification tasks.
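
    The abstract does not include code, but the map/reduce split it describes can be sketched in a few lines. The following is a minimal illustration in plain Python, assuming a hypothetical Chromosome.classify method and an already-partitioned dataset; the paper's actual implementation runs on a MapReduce cluster rather than a local process pool:

```python
from functools import reduce
from multiprocessing import Pool

# Minimal sketch of the MapReduce pattern for parallel GEP fitness
# evaluation: mappers score every chromosome on one data partition,
# the reducer sums the partial hit counts into global fitness values.
# `chrom.classify` and the partition layout are hypothetical names.

def map_partition(population, partition):
    """Mapper: per-chromosome count of correct classifications on one partition."""
    return [sum(1 for x, label in partition if chrom.classify(x) == label)
            for chrom in population]

def reduce_counts(a, b):
    """Reducer: combine partial hit counts from two mappers."""
    return [x + y for x, y in zip(a, b)]

def parallel_fitness(population, partitions):
    """Fitness (classification accuracy) of each chromosome over all partitions."""
    with Pool() as pool:
        partial = pool.starmap(map_partition,
                               [(population, p) for p in partitions])
    totals = reduce(reduce_counts, partial)
    n = sum(len(p) for p in partitions)
    return [hits / n for hits in totals]
```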

  4. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of large-scale systems in order to facilitate the development of a true large-systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large-scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large-scale systems research. He was also requested to convene a conference, with three experts from each area as panel members, to discuss the general area of large-scale systems research. The conference was held on March 26-27, 1984, in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  5. Ecological research in the large-scale biosphere-atmosphere experiment in Amazonia: early results

    NARCIS (Netherlands)

    Keller, M.; Alencar, A.; Asner, G.P.; Braswell, B.; Bustamante, M.; Davidson, E.; Feldpausch, T.; Fernandes, E.; Goulden, M.; Kabat, P.; Kruijt, B.; Luizão, F.; Miller, S.; Markewitz, D.; Nobre, A.D.; Nobre, C.A.; Priante Filho, N.; da Rocha, H.; Silva Dias, P.; von Randow, C.; Vourlitis, G.L.

    2004-01-01

    The Large-scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multinational, interdisciplinary research program led by Brazil. Ecological studies in LBA focus on how tropical forest conversion, regrowth, and selective logging influence carbon storage, nutrient dynamics, trace gas fluxes,

  6. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, undertaken with a view to ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, on entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents through joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for examining individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  7. Large-scale linear programs in planning and prediction.

    Science.gov (United States)

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...

  8. Large-Scale Seismic Test Program at Hualien, Taiwan

    International Nuclear Information System (INIS)

    Tang, H.T.; Graves, H.L.; Chen, P.C.

    1992-01-01

    The Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, is a follow-on to the soil-structure interaction (SSI) experiments at Lotung, Taiwan. The planned SSI studies will be performed at a stiff soil site in Hualien, Taiwan, that historically has had slightly more destructive earthquakes than Lotung. The LSST is a joint effort among many interested parties. Electric Power Research Institute (EPRI) and Taipower are the organizers of the program and have the lead in planning and managing it. Other organizations participating in the LSST program are the US Nuclear Regulatory Commission, the Central Research Institute of Electric Power Industry, the Tokyo Electric Power Company, the Commissariat A L'Energie Atomique, Electricite de France and Framatome. The LSST was initiated in January 1990 and is envisioned to be five years in duration. Based on the stiff-soil assumption, confirmed by soil boring and geophysical results, the test model was designed to provide data needed for SSI studies covering free-field input, nonlinear soil response, non-rigid-body SSI, torsional response, kinematic interaction, spatial incoherency and other effects. Taipower had the lead in the design of the test model and received significant input from other LSST members. Questions raised by LSST members concerned embedment effects, model stiffness, base shear, and openings for equipment. This paper describes progress in site preparation, design and construction of the model, and development of an instrumentation plan.

  9. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
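
    For context, the quadratic programming subproblem that an SQP method of this kind solves at each iterate can be stated in generic notation (a standard form, not quoted from the thesis):

    \[
    \min_{p}\; g_k^{\top} p + \tfrac{1}{2}\, p^{\top} H_k p
    \quad \text{subject to} \quad J_k p + c_k = 0,
    \]

    where \(g_k\) is the objective gradient, \(c_k\) the constraint values, \(J_k\) the constraint Jacobian, and \(H_k\) an approximation to the Hessian of the Lagrangian at iterate \(x_k\). Writing the step as \(p = Y_k p_Y + Z_k p_Z\), with \(Z_k\) the reduced-gradient null-space basis of feature 3, only the reduced Hessian \(Z_k^{\top} H_k Z_k\) of feature 1 needs to be approximated by the quasi-Newton update.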

  10. ECOLOGICAL RESEARCH IN THE LARGE-SCALE BIOSPHERE–ATMOSPHERE EXPERIMENT IN AMAZONIA: EARLY RESULTS.

    Science.gov (United States)

    M. Keller; A. Alencar; G. P. Asner; B. Braswell; M. Bustamante; E. Davidson; T. Feldpausch; E. Fernandes; M. Goulden; P. Kabat; B. Kruijt; F. Luizão; S. Miller; D. Markewitz; A. D. Nobre; C. A. Nobre; N. Priante Filho; H. da Rocha; P. Silva Dias; C. von Randow; G. L. Vourlitis

    2004-01-01

    The Large-scale Biosphere–Atmosphere Experiment in Amazonia (LBA) is a multinational, interdisciplinary research program led by Brazil. Ecological studies in LBA focus on how tropical forest conversion, regrowth, and selective logging influence carbon storage, nutrient dynamics, trace gas fluxes, and the prospect for sustainable land use in the Amazon region. Early...

  11. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng; Yuan, Ganzhao; Ghanem, Bernard

    2013-01-01

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large scale, scalability and computational efficiency are desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy for solving general semidefinite programs on large-scale datasets. Compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, and thus tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low-rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
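
    As a rough illustration of the update rule described in this abstract (not the authors' code), one BILGO-style iteration might look as follows in Python/NumPy. The grid search over the combination coefficients and the trace bound standing in for the SDP constraints are simplifications of this sketch; BILGO optimizes the coefficients exactly and stops on the KKT conditions:

```python
import numpy as np
from scipy.sparse.linalg import eigsh

# Schematic of one BILGO-style iteration: the iterate X is replaced by a
# bilateral combination alpha*X + beta*v v^T, where v is the leading
# eigenvector of the descent direction -grad_f(X).

def bilgo_step(X, f, grad_f, trace_bound, alphas, betas):
    G = grad_f(X)                          # gradient of the cost at X (symmetric)
    _, v = eigsh(-G, k=1, which='LA')      # leading eigenvector of -G
    R = v @ v.T                            # rank-1, PSD update candidate
    best_X, best_val = X, f(X)
    for a in alphas:                       # alpha, beta >= 0 preserve PSD-ness
        for b in betas:
            Xn = a * X + b * R
            if np.trace(Xn) <= trace_bound:
                val = f(Xn)
                if val < best_val:
                    best_X, best_val = Xn, val
    return best_X
```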

  12. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large scale, scalability and computational efficiency are desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy for solving general semidefinite programs on large-scale datasets. Compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, and thus tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low-rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.

  13. Quality assurance in a large research and development laboratory

    International Nuclear Information System (INIS)

    Neill, F.H.

    1980-01-01

    Developing a quality assurance program for a large research and development laboratory provided a unique opportunity for innovative planning. The quality assurance program that emerged has been tailored to meet the requirements of several sponsoring organizations and contains the flexibility for experimental programs ranging from large engineering-scale development projects to bench-scale basic research programs

  14. How Large-Scale Research Facilities Connect to Global Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn

    2013-01-01

    Policies for large-scale research facilities (LSRFs) often highlight their spillovers to industrial innovation and their contribution to the external connectivity of the regional innovation system hosting them. Arguably, the particular institutional features of LSRFs are conducive to collaborative... research. However, based on data on publications produced in 2006–2009 at the Neutron Science Directorate of Oak Ridge National Laboratory in Tennessee (United States), we find that internationalization of its collaborative research is restrained by coordination costs similar to those characterizing other...

  15. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large-scale research projects (LSRPs) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for recruiting the brightest minds has increased, effective internationalization measures have become hot topics for universities and LSRPs alike. Nevertheless, most projects and universities approach these measures with little experience of how to conduct them and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There is a variety of measures suited to supporting universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for science mobility, and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced effectively. On this poster we display a number of internationalization measures for various target groups and identify interfaces where project management, university administration, researchers and international partners can work together, exchange information and improve processes, in order to recruit, support and retain the brightest minds for a project.

  16. Mining the Mind Research Network: A Novel Framework for Exploring Large Scale, Heterogeneous Translational Neuroscience Research Data Sources

    Science.gov (United States)

    Bockholt, Henry J.; Scully, Mark; Courtney, William; Rachakonda, Srinivas; Scott, Adam; Caprihan, Arvind; Fries, Jill; Kalyanam, Ravi; Segall, Judith M.; de la Garza, Raul; Lane, Susan; Calhoun, Vince D.

    2009-01-01

    A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such an NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large-scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining. PMID:20461147

  17. Mining the mind research network: a novel framework for exploring large scale, heterogeneous translational neuroscience research data sources.

    Directory of Open Access Journals (Sweden)

    Henry Jeremy Bockholt

    2010-04-01

    Full Text Available A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such an NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large-scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining.

  18. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  19. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    Science.gov (United States)

    1985-10-07

    [OCR residue from a scanned DTIC report form; nothing of the abstract survives. Recoverable identifying details: MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA; 'Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism', G. Agha et al.]

  20. New Visions for Large Scale Networks: Research and Applications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This paper documents the findings of the March 12-14, 2001 Workshop on New Visions for Large-Scale Networks: Research and Applications. The workshop's objectives were...

  1. Results of research and development in large-scale research centers as an innovation source for firms

    International Nuclear Information System (INIS)

    Theenhaus, R.

    1978-01-01

    The twelve large-scale research centres of the Federal Republic of Germany, with their 16,000 employees, represent a considerable scientific and technical potential. Cooperation with industry on large-scale projects has already become very close, and the associated flow of know-how and contributions to innovation are largely established. The first successful steps toward utilizing the results of basic research, of spin-offs, and of research and development, as well as the provision of services, are encouraging. However, there are a number of detailed problems that can only be solved jointly by all parties concerned, in particular between industry and the large-scale research centres. (orig./RW) [de

  2. 5 years of experience with a large-scale mentoring program for medical students

    Directory of Open Access Journals (Sweden)

    Pinilla, Severin

    2015-02-01

    Full Text Available In this paper we present our five years of experience with a large-scale mentoring program for undergraduate medical students at the Ludwig-Maximilians-Universität Munich (LMU). We implemented a two-tiered program with a peer-mentoring concept for preclinical students and a 1:1 mentoring concept for clinical students, aided by a fully automated online matching algorithm. Approximately 20-30% of each student cohort participates in our voluntary mentoring program. Defining ideal program evaluation strategies, recruiting mentors from beyond the academic environment, and accounting for the reality of the mentoring network remain challenging. We conclude that a two-tiered program is well accepted by students and faculty. In addition, online matching seems to be effective for large-scale mentoring programs.

  3. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    multiple jurisdictions. The SaskRB has therefore been developed as a large-scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, along with a river and lake biogeochemical research facility focussed on the impacts of nutrients and exotic chemicals. Data are integrated at the SaskRB scale to support the development of improved large-scale climate and hydrological modelling products, the development of decision support systems (DSS) for local, provincial and basin-scale management, and related social science research that engages stakeholders and explores their values and priorities for water security. The observatory provides the multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale; b) new tools to support river basin management under uncertainty, including anthropogenic controls on land and water management; and c) the place-based focus for the development of new transdisciplinary science.

  4. A note on solving large-scale zero-one programming problems

    NARCIS (Netherlands)

    Adema, Jos J.

    1988-01-01

    A heuristic for solving large-scale zero-one programming problems is provided. The heuristic is based on the modifications made by H. Crowder et al. (1983) to the standard branch-and-bound strategy. First, the initialization is modified. The modification is only useful if the objective function

  5. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  6. Distributed system for large-scale remote research

    International Nuclear Information System (INIS)

    Ueshima, Yutaka

    2002-01-01

    In advanced photon research, large-scale simulations and high-resolution observations are powerful tools. In both numerical and real experiments, real-time visualization and steering is considered a promising method of data analysis, and this approach is suitable for one-off analyses or low-cost experiments and simulations. When researching an unknown problem, however, the output data must be analyzed many times, because a conclusive analysis is difficult to reach in a single pass. Consequently, output data should be stored so that they can be referenced and analyzed at any time. To support such research, automatic functions are needed for transporting data files from the data generator to data storage, analyzing the data, tracking the history of data handling, and so on. The supporting system will be a functionally distributed system. (author)

  7. Model design for Large-Scale Seismic Test Program at Hualien, Taiwan

    International Nuclear Information System (INIS)

    Tang, H.T.; Graves, H.L.; Chen, P.C.

    1991-01-01

    The Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, is a follow-on to the soil-structure interaction (SSI) experiments at Lotung, Taiwan. The planned SSI studies will be performed at a stiff soil site in Hualien, Taiwan, that historically has had slightly more destructive earthquakes than Lotung. The LSST is a joint effort among many interested parties. Electric Power Research Institute (EPRI) and Taipower are the organizers of the program and have the lead in planning and managing it. Other organizations participating in the LSST program are the US Nuclear Regulatory Commission (NRC), the Central Research Institute of Electric Power Industry (CRIEPI), the Tokyo Electric Power Company (TEPCO), the Commissariat A L'Energie Atomique (CEA), Electricite de France (EdF) and Framatome. The LSST was initiated in January 1990 and is envisioned to be five years in duration. Based on the stiff-soil assumption, confirmed by soil boring and geophysical results, the test model was designed to provide data needed for SSI studies covering free-field input, nonlinear soil response, non-rigid-body SSI, torsional response, kinematic interaction, spatial incoherency and other effects. Taipower had the lead in the design of the test model and received significant input from other LSST members. Questions raised by LSST members concerned embedment effects, model stiffness, base shear, and openings for equipment. This paper describes progress in site preparation, design and construction of the model, and development of an instrumentation plan.

  8. Planning under uncertainty solving large-scale stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G. [Stanford Univ., CA (United States). Dept. of Operations Research; Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft]

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but until recently seemed intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results for large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer, and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
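
    For reference, the two-stage stochastic linear program with recourse addressed here has the standard form (generic notation, not taken from the report):

    \[
    \min_{x \ge 0}\; c^{\top} x + \mathbb{E}_{\omega}\Big[ \min_{y \ge 0}\; q_{\omega}^{\top} y \;\; \text{s.t.} \;\; W y = h_{\omega} - T_{\omega} x \Big]
    \quad \text{s.t.} \quad A x = b.
    \]

    Benders-type decomposition approximates the expected recourse cost by cuts built from second-stage duals, while Monte Carlo importance sampling replaces the exact expectation over scenarios \(\omega\) with a variance-reduced sample average.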

  9. Research on the Construction Management and Sustainable Development of Large-Scale Scientific Facilities in China

    Science.gov (United States)

    Guiquan, Xi; Lin, Cong; Xuehui, Jin

    2018-05-01

    As important platforms for scientific and technological development, large-scale scientific facilities are cornerstones of technological innovation and guarantees of economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology and key national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.

  10. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the FY2003 collaboration activities on Large Scale Computer Simulation Research was held on January 15-16, 2004, at the Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  11. Large-Scale Seismic Test Program at Hualien, Taiwan

    International Nuclear Information System (INIS)

    Tang, H.T.; Graves, H.L.; Yeh, Y.S.

    1991-01-01

    The Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, is a follow-on to the soil-structure interaction (SSI) experiments at Lotung, Taiwan. The planned SSI studies will be performed at a stiff soil site in Hualien, Taiwan, that historically has had slightly more destructive earthquakes than Lotung. The objectives of the LSST project are as follows: to obtain earthquake-induced SSI data at a stiff soil site having soil conditions similar to those of a prototypical nuclear power plant; to confirm the findings and methodologies validated against the Lotung soft-soil SSI data for prototypical plant applications; to further validate the technical basis of realistic SSI analysis approaches; and to further support the resolution of the USI A-40 Seismic Design Criteria issue. These objectives will be accomplished through an integrated and carefully planned experimental program consisting of soil characterization, test model design and field construction, instrumentation layout and deployment, in-situ geophysical information collection, forced vibration tests, and synthesis of results and findings. The LSST is a joint effort among many interested parties. EPRI and Taipower are the organizers of the program and have the lead in planning and managing it.

  12. PANDA: a Large Scale Multi-Purpose Test Facility for LWR Safety Research

    Energy Technology Data Exchange (ETDEWEB)

    Dreier, Joerg; Paladino, Domenico; Huggenberger, Max; Andreani, Michele [Laboratory for Thermal-Hydraulics, Nuclear Energy and Safety Research Department, Paul Scherrer Institut (PSI), CH-5232 Villigen PSI (Switzerland); Yadigaroglu, George [ETH Zuerich, Technoparkstrasse 1, Einstein 22- CH-8005 Zuerich (Switzerland)

    2008-07-01

    PANDA is a large-scale, multi-purpose thermal-hydraulics test facility built and operated by PSI. Due to its modular structure, PANDA provides flexibility for a variety of applications, ranging from integral containment system investigations and primary system tests to component experiments and large-scale separate-effects tests. For many applications the experimental results are used directly, for example for concept demonstrations or for the characterisation of phenomena or components, but all the experimental data generated in the various test campaigns is unique and has been, or will be, widely used for the validation and improvement of a variety of computer codes for reactor safety analysis, including codes with 3D capabilities. The paper provides an overview of the completed and on-going research programs performed in the PANDA facility in the different areas of application, including the main results and conclusions of the investigations. In particular, the investigations of the advanced passive containment cooling system concepts of the SBWR, the ESBWR and the SWR1000 are presented in relation to various aspects, and the main findings are summarised. Finally, the goals, planned investigations and expected results of the on-going OECD project SETH-2 are presented. (authors)

  13. PANDA: a Large Scale Multi-Purpose Test Facility for LWR Safety Research

    International Nuclear Information System (INIS)

    Dreier, Joerg; Paladino, Domenico; Huggenberger, Max; Andreani, Michele; Yadigaroglu, George

    2008-01-01

    PANDA is a large-scale, multi-purpose thermal-hydraulics test facility built and operated by PSI. Due to its modular structure, PANDA provides flexibility for a variety of applications, ranging from integral containment system investigations and primary system tests to component experiments and large-scale separate-effects tests. For many applications the experimental results are used directly, for example for concept demonstrations or for the characterisation of phenomena or components, but all the experimental data generated in the various test campaigns is unique and has been, or will be, widely used for the validation and improvement of a variety of computer codes for reactor safety analysis, including codes with 3D capabilities. The paper provides an overview of the completed and on-going research programs performed in the PANDA facility in the different areas of application, including the main results and conclusions of the investigations. In particular, the investigations of the advanced passive containment cooling system concepts of the SBWR, the ESBWR and the SWR1000 are presented in relation to various aspects, and the main findings are summarised. Finally, the goals, planned investigations and expected results of the on-going OECD project SETH-2 are presented. (authors)

  14. Test on large-scale seismic isolation elements, 2

    International Nuclear Information System (INIS)

    Mazda, T.; Moteki, M.; Ishida, K.; Shiojiri, H.; Fujita, T.

    1991-01-01

    The seismic isolation test program of the Central Research Institute of Electric Power Industry (CRIEPI), aimed at applying seismic isolation to Fast Breeder Reactor (FBR) plants, was started in 1987. In this test program, the demonstration test of seismic isolation elements was considered one of the most important research items. Facilities for testing seismic isolation elements were built in the Abiko Research Laboratory of CRIEPI. Various tests of large-scale seismic isolation elements have been conducted to date, and much important test data for developing technical design guidelines was obtained. (author)

  15. Large-scale budget applications of mathematical programming in the Forest Service

    Science.gov (United States)

    Malcolm Kirby

    1978-01-01

    Mathematical programming applications in the Forest Service, U.S. Department of Agriculture, are growing. They are being used for widely varying problems: budgeting, land use planning, timber transport, road maintenance and timber harvest planning. Large-scale applications are being made in budgeting. The model that is described can be used by developing economies....

  16. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas; surveys varied subject areas and reports on individual results of research in the field; shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  17. Factors Influencing Uptake of a Large Scale Curriculum Innovation.

    Science.gov (United States)

    Adey, Philip S.

    Educational research has all too often failed to be implemented on a large-scale basis. This paper describes the multiplier effect of a professional development program for teachers and for trainers in the United Kingdom, and how that program was developed, monitored, and evaluated. Cognitive Acceleration through Science Education (CASE) is a…

  18. The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.

    Science.gov (United States)

    Pang, Haotian; Liu, Han; Vanderbei, Robert

    2014-02-01

    We develop an R package fastclime for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1 Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as a stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.
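
    For context, the CLIME estimator that the package implements solves (in the standard notation of the CLIME literature)

    \[
    \hat{\Omega} = \arg\min_{\Omega}\; \|\Omega\|_{1}
    \quad \text{subject to} \quad \|S_n \Omega - I\|_{\infty} \le \lambda,
    \]

    where \(S_n\) is the sample covariance matrix and \(\lambda\) the regularization parameter. The constraint decouples column-wise, so estimating \(\Omega\) reduces to one linear program per column; this is why a fast parametric-simplex LP solver can return the full piecewise-linear path in \(\lambda\).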

  19. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    Science.gov (United States)

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and, ultimately, routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity, and has recently been implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to the large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research are discussed, along with lived evaluation challenges, the responses taken to overcome them, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD against the described evaluation framework are presented to inform the future evaluation of upscaled programs. Changes to the evaluation in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly-controlled clinical research setting. Constructs

  20. Large-scale research in the Federal Republic of Germany. Pt. 4

    International Nuclear Information System (INIS)

    Mock, W.

    1986-01-01

    The name is misleading: in the biggest of the 13 large-scale research institutions, the KFA Nuclear Research Centre Juelich, nuclear research is now only one sphere of activity among many, alongside areas such as computer science, materials research, and environmental research. This shift in emphasis constitutes the successful attempt - or so it seems so far - of a 'research dinosaur' to respond to the demands of an altered 'research landscape'. (orig.) [de

  1. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  2. Bonus algorithm for large scale stochastic nonlinear programming problems

    CERN Document Server

    Diwekar, Urmila

    2015-01-01

    This book presents the details of the BONUS algorithm and its real-world applications in areas such as sensor placement in large-scale drinking water networks, sensor placement in advanced power systems, water management in power systems, and capacity expansion of energy systems. A generalized method for stochastic nonlinear programming, based on a sampling-based approach for uncertainty analysis and statistical reweighting to obtain probability information, is demonstrated in this book. Stochastic optimization problems are difficult to solve since they involve nested optimization and uncertainty loops. There are two fundamental approaches used to solve such problems: the first is decomposition techniques, and the second identifies problem-specific structures and transforms the problem into a deterministic nonlinear programming problem. These techniques have significant limitations on either the objective function type or the underlying distributions for the uncertain variables. Moreover, these ...
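
    The statistical-reweighting idea at the core of BONUS can be illustrated in a few lines. This is a hedged sketch, not the book's implementation: BONUS forms the weights from kernel density estimates, whereas the toy below uses analytically known densities, and the model function, names, and distributions are invented for illustration:

```python
import numpy as np
from scipy.stats import norm

# Sample the uncertain variable once from a base distribution, run the
# expensive model once per sample, then estimate the objective under any
# *new* (decision-dependent) distribution via likelihood-ratio weights,
# avoiding a fresh model run for every candidate decision.

rng = np.random.default_rng(0)
base = rng.uniform(0.0, 1.0, size=2000)        # one-time base sample, pdf = 1
model_out = np.sin(3.0 * base) + 0.5 * base    # stand-in for the costly model

def expected_objective(new_pdf):
    """E[f] under new_pdf via normalized weights w_i = p_new(x_i) / p_base(x_i)."""
    w = new_pdf(base)                          # p_base is 1 on [0, 1]
    w /= w.sum()
    return float(np.sum(w * model_out))

# A decision that narrows the uncertainty around 0.6:
print(expected_objective(lambda x: norm.pdf(x, loc=0.6, scale=0.1)))
```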

  3. Large-scale seismic test for soil-structure interaction research in Hualien, Taiwan

    International Nuclear Information System (INIS)

    Ueshima, T.; Kokusho, T.; Okamoto, T.

    1995-01-01

    It is important to evaluate dynamic soil-structure interaction more accurately in the aseismic design of important facilities such as nuclear power plants. A large-scale model structure, about 1/4th the scale of a commercial nuclear power plant, was constructed on gravelly layers in seismically active Hualien, Taiwan. This international joint project is called 'the Hualien LSST Project', where 'LSST' is short for Large-Scale Seismic Test. This paper describes the research tasks and responsibilities, the process of the construction work and the research tasks along the timeline, the main results obtained up to now, and so on. (J.P.N.)

  4. Large-scale Homogenization of Bulk Materials in Mammoth Silos

    NARCIS (Netherlands)

    Schott, D.L.

    2004-01-01

    This doctoral thesis concerns the large-scale homogenization of bulk materials in mammoth silos. The objective of this research was to determine the best stacking and reclaiming method for homogenization in mammoth silos. For this purpose a simulation program was developed to estimate the

  5. Icing Simulation Research Supporting the Ice-Accretion Testing of Large-Scale Swept-Wing Models

    Science.gov (United States)

    Yadlin, Yoram; Monnig, Jaime T.; Malone, Adam M.; Paul, Bernard P.

    2018-01-01

    The work summarized in this report is a continuation of NASA's Large-Scale, Swept-Wing Test Articles Fabrication; Research and Test Support for NASA IRT contract (NNC10BA05-NNC14TA36T) performed by Boeing under the NASA Research and Technology for Aerospace Propulsion Systems (RTAPS) contract. In the study conducted under RTAPS, a series of icing tests was conducted in the Icing Research Tunnel (IRT) to characterize ice formations on large-scale swept wings representative of modern commercial transport airplanes. The outcome of that campaign was a large database of ice-accretion geometries that can be used for subsequent aerodynamic evaluation in other experimental facilities and for validation of ice-accretion prediction codes.

  6. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless function of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. The article presents a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.

  7. Application of Logic Models in a Large Scientific Research Program

    Science.gov (United States)

    O'Keefe, Christine M.; Head, Richard J.

    2011-01-01

    It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts…

  8. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever-larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the parallel performance of the TS-AWP on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  9. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming.

    Science.gov (United States)

    Moreau, Thomas; Evans, Amanda L; Vasquez, Louella; Tijssen, Marloes R; Yan, Ying; Trotter, Matthew W; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M; Pask, Dean C; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H; Pedersen, Roger A; Ghevaert, Cedric

    2016-04-07

    The production of megakaryocytes (MKs)--the precursors of blood platelets--from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90%, reaching up to 2 × 10^5 mature MKs per input hPSC. Functional platelets are generated throughout the culture, allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable for both the in vitro production of platelets for transfusion and basic research in MK and platelet biology.

  10. Modular, object-oriented redesign of a large-scale Monte Carlo neutron transport program

    International Nuclear Information System (INIS)

    Moskowitz, B.S.

    2000-01-01

    This paper describes the modular, object-oriented redesign of a large-scale Monte Carlo neutron transport program. This effort represents a complete 'white sheet of paper' rewrite of the code. In this paper, the motivation driving this project, the design objectives for the new version of the program, and the design choices and their consequences will be discussed. The design itself will also be described, including the important subsystems as well as the key classes within those subsystems

  11. Prediction of monthly rainfall on homogeneous monsoon regions of India based on large scale circulation patterns using Genetic Programming

    Science.gov (United States)

    Kashid, Satishkumar S.; Maity, Rajib

    2012-08-01

    Summary: Prediction of Indian Summer Monsoon Rainfall (ISMR) is of vital importance for the Indian economy, and it has remained a great challenge for hydro-meteorologists due to inherent complexities in the climatic system. Large-scale atmospheric circulation patterns from the tropical Pacific Ocean (ENSO) and the tropical Indian Ocean (EQUINOO) are established to influence the Indian Summer Monsoon Rainfall. The information in these two large-scale atmospheric circulation patterns, expressed through their indices, is used to model the complex relationship between Indian Summer Monsoon Rainfall and the ENSO and EQUINOO indices. However, extracting the signal from such large-scale indices for modeling such complex systems is significantly difficult. Rainfall predictions have been made for 'All India' as one unit, as well as for five 'homogeneous monsoon regions of India' defined by the Indian Institute of Tropical Meteorology. The recent 'artificial intelligence' tool Genetic Programming (GP) has been employed for modeling this problem. The Genetic Programming approach is found to capture the complex relationship between monthly Indian Summer Monsoon Rainfall and the large-scale atmospheric circulation pattern indices ENSO and EQUINOO. The findings of this study indicate that GP-derived monthly rainfall forecasting models that use large-scale atmospheric circulation information are successful in predicting All India Summer Monsoon Rainfall, with a correlation coefficient as good as 0.866, which appears attractive for such a complex system. A separate analysis is carried out for All India Summer Monsoon Rainfall, for India as one unit and for the five homogeneous monsoon regions, based on the ENSO and EQUINOO indices of March, April and May only, i.e., a prediction performed at the end of May. In this case, All India Summer Monsoon Rainfall could be predicted with a correlation coefficient of 0.70, with somewhat lower Correlation Coefficient (C.C.) values for different
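
    As an illustration of the modeling setup described here (not the authors' code), a symbolic-regression GP of this kind can be configured in a few lines. The sketch below assumes the third-party gplearn package; X would hold the ENSO and EQUINOO indices, y the observed seasonal rainfall, both taken from the actual records:

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor  # assumed third-party GP library

# X: one row per year, columns = ENSO and EQUINOO indices (e.g. March-May);
# y: observed summer monsoon rainfall for the same years.

def fit_gp_rainfall_model(X, y):
    """Evolve a symbolic expression mapping circulation indices to rainfall."""
    gp = SymbolicRegressor(population_size=1000,
                           generations=30,
                           function_set=('add', 'sub', 'mul', 'div'),
                           random_state=0)
    gp.fit(X, y)
    return gp

def correlation_skill(gp, X, y):
    """Pearson correlation between predictions and observations, the skill
    measure quoted in the abstract (e.g. 0.866)."""
    return float(np.corrcoef(gp.predict(X), y)[0, 1])
```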

  12. Research status and needs for shear tests on large-scale reinforced concrete containment elements

    International Nuclear Information System (INIS)

    Oesterle, R.G.; Russell, H.G.

    1982-01-01

    Reinforced concrete containments at nuclear power plants are designed to resist forces caused by internal pressure, gravity, and severe earthquakes. The size, shape, and possible stress states of containments produce unique problems for design and construction. A lack of experimental data on the capacity of reinforced concrete to transfer shear stresses while subjected to biaxial tension has led to cumbersome, if not impractical, design criteria. Research programs recently conducted at the Construction Technology Laboratories and at Cornell University indicate that the design criteria for tangential, peripheral, and radial shear are conservative. This paper discusses results from recent research and presents tentative changes to the shear design provisions of the current United States code for containment structures. Areas where information is still lacking to fully verify the new design provisions are discussed. Needs for further experimental research on large-scale specimens to develop economical, practical, and reliable design criteria for resisting shear forces in containments are identified. (orig.)

  13. The Need for Large-Scale, Longitudinal Empirical Studies in Middle Level Education Research

    Science.gov (United States)

    Mertens, Steven B.; Caskey, Micki M.; Flowers, Nancy

    2016-01-01

    This essay describes and discusses the ongoing need for large-scale, longitudinal, empirical research studies focused on middle grades education. After a statement of the problem and concerns, the essay describes and critiques several prior middle grades efforts and research studies. Recommendations for future research efforts to inform policy…

  14. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  15. Research on large-scale wind farm modeling

    Science.gov (United States)

    Ma, Longfei; Zhang, Baoqun; Gong, Cheng; Jiao, Ran; Shi, Rui; Chi, Zhongjun; Ding, Yifeng

    2017-01-01

    Due to the intermittent and fluctuating nature of wind energy, a large-scale wind farm connected to the grid has a much greater impact on the power system than a traditional power plant. It is therefore necessary to establish an effective wind farm model to simulate and analyze the influence wind farms have on the grid, as well as the transient characteristics of the wind turbines when the grid is at fault. An effective wind turbine generator (WTG) model must be established first. As the doubly-fed VSCF wind turbine is currently the mainstream wind turbine type, this article first reviews the research progress on doubly-fed VSCF wind turbines and then describes the detailed model-building process. It then surveys common wind farm modeling methods and points out the problems encountered. Since WAMS is widely used in the power system, online parameter identification of wind farm models based on measured output characteristics of the wind farm becomes possible; the article focuses on interpreting this new idea of identification-based modeling of large wind farms, which can be realized by two concrete methods.

  16. Improved decomposition–coordination and discrete differential dynamic programming for optimization of large-scale hydropower system

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Ouyang, Shuo; Ding, Xiaoling; Chen, Lu

    2014-01-01

    Highlights: • Optimization of a large-scale hydropower system in the Yangtze River basin. • Improved decomposition–coordination and discrete differential dynamic programming. • Generating the initial solution randomly to reduce generation time. • Proposing a relative coefficient for more power generation. • Proposing an adaptive bias corridor technology to enhance convergence speed. - Abstract: With the construction of major hydro plants, more and more large-scale hydropower systems are gradually taking shape, which raises the challenge of optimizing these systems. Optimization of a large-scale hydropower system (OLHS), which is to determine the water discharges or water levels of all hydro plants so as to maximize total power generation subject to many constraints, is a high-dimensional, nonlinear and coupled complex problem. In order to solve the OLHS problem effectively, an improved decomposition–coordination and discrete differential dynamic programming (IDC–DDDP) method is proposed in this paper. A strategy in which the initial solution is generated randomly is adopted to reduce generation time. Meanwhile, a relative coefficient based on maximum output capacity is proposed for more power generation. Moreover, an adaptive bias corridor technology is proposed to enhance convergence speed. The proposed method is applied to long-term optimal dispatch of the large-scale hydropower system (LHS) in the Yangtze River basin. Compared to other methods, IDC–DDDP has competitive performance in both total power generation and convergence speed, which provides a new method to solve the OLHS problem.
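
    The DDDP component at the core of IDC–DDDP can be sketched for a single reservoir: start from a trial storage trajectory, search a narrow corridor of discrete storage states around it by dynamic programming, adopt the improved trajectory, and repeat until it stops changing. The following simplified single-reservoir sketch assumes linear water-balance dynamics and a stand-in benefit function; the paper's decomposition–coordination layer, relative coefficient and adaptive corridor refinements are not reproduced.

    ```python
    # Simplified discrete differential dynamic programming (DDDP) for one
    # reservoir: refine a trial storage trajectory by DP over a small corridor
    # of states around it. Dynamics, bounds and the benefit function are
    # illustrative assumptions.
    import numpy as np

    T, inflow = 12, np.full(12, 50.0)          # periods, inflow per period
    s_min, s_max, s0 = 100.0, 500.0, 300.0     # storage bounds; start/end storage

    def benefit(release, storage):
        # Stand-in for power generation as a function of release and head.
        return np.sqrt(max(release, 0.0)) * storage

    def dddp(traj, half_width=20.0, n_states=5, iters=50):
        offsets = np.linspace(-half_width, half_width, n_states)
        for _ in range(iters):
            states = [np.clip(traj[t] + offsets, s_min, s_max) for t in range(T + 1)]
            states[0] = states[T] = np.array([s0])      # boundary storages fixed
            value = {T: np.zeros(1)}
            policy = {}
            for t in range(T - 1, -1, -1):              # backward DP over the corridor
                value[t] = np.full(len(states[t]), -np.inf)
                policy[t] = np.zeros(len(states[t]), dtype=int)
                for i, s in enumerate(states[t]):
                    for j, s_next in enumerate(states[t + 1]):
                        r = s + inflow[t] - s_next      # water balance gives release
                        if r < 0:
                            continue
                        v = benefit(r, s) + value[t + 1][j]
                        if v > value[t][i]:
                            value[t][i], policy[t][i] = v, j
            new_traj, i = [s0], 0
            for t in range(T):                          # forward pass: best corridor path
                i = policy[t][i]
                new_traj.append(states[t + 1][i])
            if np.allclose(new_traj, traj):             # corridor no longer improves
                break
            traj = np.array(new_traj)
        return traj

    print(dddp(np.full(T + 1, s0)))
    ```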

  17. Linking the GLOBE Program With NASA and NSF Large-Scale Experiments

    Science.gov (United States)

    Filmer, P. E.

    2005-12-01

    NASA and the NSF, the sponsoring Federal agencies for the GLOBE Program, are seeking the participation of science teams who are working at the cutting edge of Earth systems science in large integrated Earth systems science programs. Connecting the GLOBE concept and structure with NASA and NSF's leading Earth systems science programs will give GLOBE schools and students access to top scientists, and expose them to programs that have been designated as scientific priorities. Students, teachers, parents, and their communities will be able to see how scientists of many disciplines work together to learn about the Earth system. The GLOBE solicitation released by the NSF targets partnerships between GLOBE and NSF/NASA-funded integrated Earth systems science programs. This presentation will focus on the goals and requirements of the NSF solicitation. Proponents will be expected to provide ways for the GLOBE community to interact with a group of scientists from their science programs as part of a wider joint Earth systems science educational strategy (the sponsoring agencies', GLOBE's, and the proposing programs'). Teams proposing to this solicitation must demonstrate: - A focus on direct connections with major NSF Geosciences and/or Polar Programs and/or NASA Earth-Sun research programs that are related to Earth systems science; - A demonstrable benefit to GLOBE and to NSF Geosciences and/or Polar Programs or NASA Earth-Sun education goals (providing access to program researchers and data, working with GLOBE in setting up campaigns where possible, using tested GLOBE or non-GLOBE protocols to the greatest extent possible, actively participating in the wider GLOBE community including schools, among other goals); - An international component; - How the existing educational efforts of the large science program will coordinate with GLOBE; - An Earth systems science education focus, rather than a GLOBE protocol-support focus; - A rigorous evaluation and assessment component

  18. Finite Mixture Multilevel Multidimensional Ordinal IRT Models for Large Scale Cross-Cultural Research

    Science.gov (United States)

    de Jong, Martijn G.; Steenkamp, Jan-Benedict E. M.

    2010-01-01

    We present a class of finite mixture multilevel multidimensional ordinal IRT models for large scale cross-cultural research. Our model is proposed for confirmatory research settings. Our prior for item parameters is a mixture distribution to accommodate situations where different groups of countries have different measurement operations, while…

  19. Understanding the Front-end of Large-scale Engineering Programs

    DEFF Research Database (Denmark)

    Lucae, Sebastian; Rebentisch, Eric; Oehmen, Josef

    2014-01-01

    Large engineering programs like sociotechnical infrastructure constructions of airports, plant constructions, or the development of radically innovative, high-tech industrial products such as electric vehicles or aircraft are affected by a number of serious risks, and subsequently commonly suffer...... from large cost overruns. Significant problems in program execution can be traced back to practices performed, or more frequently not performed, in the so-called “fuzzy front end” of the program. The lack of sufficient and effective efforts in the early stages of a program can result in unstable......, unclear and incomplete requirements, unclear roles and responsibilities within the program organization, insufficient planning, and unproductive tensions between program management and systems engineering. This study intends to clarify the importance of up-front planning to improve program performance...

  20. Large scale computing in the Energy Research Programs

    International Nuclear Information System (INIS)

    1991-05-01

    The Energy Research Supercomputer Users Group (ERSUG) comprises all investigators using resources of the Department of Energy Office of Energy Research supercomputers. At the December 1989 meeting held at Florida State University (FSU), the ERSUG executive committee determined that the continuing rapid advances in computational sciences and computer technology demanded a reassessment of the role computational science should play in meeting DOE's commitments. Initial studies were to be performed for four subdivisions: (1) Basic Energy Sciences (BES) and Applied Mathematical Sciences (AMS), (2) Fusion Energy, (3) High Energy and Nuclear Physics, and (4) Health and Environmental Research. The first two subgroups produced formal subreports that provided a basis for several sections of this report. Additional information provided in the AMS/BES subreport is included as Appendix C in an abridged form that eliminates most duplication. Additionally, each member of the executive committee was asked to contribute area-specific assessments; these assessments are included in the next section. In the following sections, brief assessments are given for specific areas; a conceptual model is proposed in which the entire computational effort for energy research is viewed as one giant nationwide computer; and specific recommendations are then made for the appropriate evolution of the system

  1. European research school on large scale solar thermal – SHINE

    DEFF Research Database (Denmark)

    Bales, Chris; Forteza, Pau Joan Cortés; Furbo, Simon

    2014-01-01

    The Solar Heat Integration NEtwork (SHINE) is a European research school in which 13 PhD students in solar thermal technologies are funded by the EU Marie-Curie program. It has five PhD course modules as well as workshops and seminars dedicated to PhD students both within the project as well...... as outside of it. The SHINE research activities focus on large solar heating systems and new applications: on district heating, industrial processes and new storage systems. The scope of this paper is on systems for district heating for which there are five PhD students, three at universities and two...

  2. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  3. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  4. Assessing Programming Costs of Explicit Memory Localization on a Large Scale Shared Memory Multiprocessor

    Directory of Open Access Journals (Sweden)

    Silvio Picano

    1992-01-01

    We present detailed experimental work involving a commercially available large-scale shared memory multiple instruction stream-multiple data stream (MIMD) parallel computer having a software-controlled cache coherence mechanism. To make effective use of such an architecture, the programmer is responsible for designing the program's structure to match the underlying multiprocessor's capabilities. We describe the techniques used to exploit our multiprocessor (the BBN TC2000) on a network simulation program, showing the resulting performance gains and the associated programming costs. We show that an efficient implementation relies heavily on the user's ability to explicitly manage the memory system.

  5. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows so far that approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  6. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    Science.gov (United States)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data is creating never-imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
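
    The core per-pixel computation of the SSEBop model is simple enough to sketch: an ET fraction is obtained by scaling land surface temperature between an estimated cold/wet boundary and a hot/dry boundary, and is then multiplied by a reference ET. The sketch below shows this arithmetic with NumPy under assumed boundary values and sample temperatures; the operational system applies the same formula to Landsat thermal imagery in Earth Engine.

    ```python
    # Per-pixel SSEBop sketch: ETf = (Th - Ts) / (Th - Tc), clamped to [0, 1],
    # then ETa = ETf * k * ETo. Boundary temperatures, the scaling factor k and
    # the sample values are illustrative assumptions.
    import numpy as np

    ts = np.array([[295.0, 305.0], [315.0, 325.0]])  # land surface temp (K), e.g. Landsat
    tc = 293.0       # cold/wet boundary temperature (K), from well-vegetated pixels
    th = tc + 30.0   # hot/dry boundary = cold boundary + predefined temperature difference
    eto = 6.0        # grass reference ET for the period (mm)
    k = 1.0          # coefficient relating ET fraction to reference ET

    etf = np.clip((th - ts) / (th - tc), 0.0, 1.0)   # ET fraction per pixel
    eta = etf * k * eto                              # actual ET (mm)
    print(eta)
    ```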

  7. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  8. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  9. NASA/MSFC FY91 Global Scale Atmospheric Processes Research Program Review

    Science.gov (United States)

    Leslie, Fred W. (Editor)

    1991-01-01

    The reports presented at the annual Marshall Research Review of Earth Science and Applications are compiled. The following subject areas are covered: understanding of atmospheric processes on a variety of spatial and temporal scales; measurements of geophysical parameters; measurements on a global scale from space; the Mission to Planet Earth Program (comprised of an Earth Observation System and the scientific strategy to analyze these data); and satellite data analysis and fundamental studies of atmospheric dynamics.

  10. Talking About The Smokes: a large-scale, community-based participatory research project.

    Science.gov (United States)

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. Processes describing consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  11. Programs on large scale applications of superconductivity in Japan

    International Nuclear Information System (INIS)

    Yasukochi, K.; Ogasawara, T.

    1974-01-01

    The history of large-scale applications of superconductivity in Japan is reported. Experimental work on superconducting magnet systems for high energy physics has just begun. The programs are described under five categories: 1) MHD power generation systems, 2) superconducting rotating machines, 3) cryogenic power transmission systems, 4) magnetically levitated transportation, and 5) applications to high energy physics experiments. The development of a large superconducting magnet for a 1,000 kW class generator was set as the target of the first seven-year plan, which came to an end in 1972; the work continues for three years with a budget of 900 million yen from 1973 on. In the second-phase plan, a prototype MHD generator is discussed. A plan is contemplated by Fuji Electric Co. to develop a synchronous generator with an inner rotating field. The total budget for the future plans for a superconducting power transmission system amounts to 20 billion yen for a first period of 8 to 9 years. In JNR's research and development efforts, several characteristic points are picked out: 1) linear motor drive with the active side on the ground, 2) loop track, 3) combined test runs of maglev and LSM. A field test at a speed of 500 km/hr on a 7 km track is scheduled to be performed in 1975. The target for operation is 1985. A 12 GeV proton synchrotron is now under construction for studies in high energy physics. A three-ring intersecting storage accelerator is discussed as a future plan. (Iwakiri, K.)

  12. A European collaboration research programme to study and test large scale base isolated structures

    International Nuclear Information System (INIS)

    Renda, V.; Verzeletti, G.; Papa, L.

    1995-01-01

    The improvement of the technology of innovative anti-seismic mechanisms, such as those for base isolation and energy dissipation, requires testing capability for large-scale models of structures integrated with these mechanisms. These kinds of experimental tests are of primary importance for the validation of design rules and the setting up of an advanced earthquake engineering practice for civil constructions of relevant interest. The Joint Research Centre of the European Commission offers the European Laboratory for Structural Assessment, located at Ispra, Italy, as a focal point for an international European collaboration research programme to test large-scale models of structures making use of innovative anti-seismic mechanisms. A collaboration contract, open to other future contributions, has been signed with the national Italian working group on seismic isolation (Gruppo di Lavoro sull'Isolamento Sismico, GLIS), which includes the national research centre ENEA, the national electricity board ENEL, the industrial research centre ISMES and the isolator producer ALGA. (author). 3 figs

  13. Investigating and stimulating primary teachers’ attitudes towards science: Summary of a large-scale research project

    NARCIS (Netherlands)

    Walma van der Molen, Julie Henriëtte; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical

  14. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for the application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as a control supporting tool for daily operation and performance prediction of central solar heating plants. Finally the CSHP technology is put into perspective with respect to alternatives, and a short discussion on the barriers and breakthrough of the technology is given....

  15. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large-scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large-scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  16. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  17. IKONOS imagery for the Large Scale Biosphere–Atmosphere Experiment in Amazonia (LBA).

    Science.gov (United States)

    George Hurtt; Xiangming Xiao; Michael Keller; Michael Palace; Gregory P. Asner; Rob Braswell; Eduardo S. Brondízio; Manoel Cardoso; Claudio J.R. Carvalho; Matthew G. Fearon; Liane Guild; Steve Hagen; Scott Hetrick; Berrien Moore III; Carlos Nobre; Jane M. Read; Tatiana Sá; Annette Schloss; George Vourlitis; Albertus J. Wickel

    2003-01-01

    The LBA-ECO program is one of several international research components under the Brazilian-led Large Scale Biosphere–Atmosphere Experiment in Amazonia (LBA). The field-oriented research activities of this study are organized along transects and include a set of primary field sites, where the major objective is to study land-use change and ecosystem dynamics, and a...

  18. Investigating and Stimulating Primary Teachers' Attitudes Towards Science: Summary of a Large-Scale Research Project

    Science.gov (United States)

    Walma van der Molen, Juliette; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical concept of attitude, methodological flaws in…

  19. Optimizing Implementation of Obesity Prevention Programs: A Qualitative Investigation Within a Large-Scale Randomized Controlled Trial.

    Science.gov (United States)

    Kozica, Samantha L; Teede, Helena J; Harrison, Cheryce L; Klein, Ruth; Lombard, Catherine B

    2016-01-01

    The prevalence of obesity in rural and remote areas is elevated in comparison to urban populations, highlighting the need for interventions targeting obesity prevention in these settings. Implementing evidence-based obesity prevention programs is challenging. This study aimed to investigate factors influencing the implementation of obesity prevention programs, including adoption, program delivery, community uptake, and continuation, specifically within rural settings. Nested within a large-scale randomized controlled trial, a qualitative exploratory approach was adopted, with purposive sampling techniques utilized, to recruit stakeholders from 41 small rural towns in Australia. In-depth semistructured interviews were conducted with clinical health professionals, health service managers, and local government employees. Open coding was completed independently by 2 investigators and thematic analysis undertaken. In-depth interviews revealed that obesity prevention programs were valued by the rural workforce. Program implementation is influenced by interrelated factors across: (1) contextual factors and (2) organizational capacity. Key recommendations to manage the challenges of implementing evidence-based programs focused on reducing program delivery costs, aided by the provision of a suite of implementation and evaluation resources. Informing the scale-up of future prevention programs, stakeholders highlighted the need to build local rural capacity through developing supportive university partnerships, generating local program ownership and promoting active feedback to all program partners. We demonstrate that the rural workforce places a high value on obesity prevention programs. Our results inform the future scale-up of obesity prevention programs, providing an improved understanding of strategies to optimize implementation of evidence-based prevention programs. © 2015 National Rural Health Association.

  20. Impact of Large Scale Energy Efficiency Programs On Consumer Tariffs and Utility Finances in India

    Energy Technology Data Exchange (ETDEWEB)

    Abhyankar, Nikit; Phadke, Amol

    2011-01-20

    Large-scale EE programs would modestly increase tariffs but reduce consumers' electricity bills significantly. However, the primary benefit of EE programs is a significant reduction in power shortages, which might make these programs politically acceptable even if tariffs increase. To increase political support, utilities could pursue programs that would result in minimal tariff increases. This can be achieved in four ways: (a) focus only on low-cost programs (such as replacing electric water heaters with gas water heaters); (b) sell power conserved through the EE program to the market at a price higher than the cost of peak power purchase; (c) focus on programs where a partial utility subsidy of incremental capital cost might work and (d) increase the number of participant consumers by offering a basket of EE programs to fit all consumer subcategories and tariff tiers. Large scale EE programs can result in consistently negative cash flows and significantly erode the utility's overall profitability. In case the utility is facing shortages, the cash flow is very sensitive to the marginal tariff of the unmet demand. This will have an important bearing on the choice of EE programs in Indian states where low-paying rural and agricultural consumers form the majority of the unmet demand. These findings clearly call for a flexible, sustainable solution to the cash-flow management issue. One option is to include a mechanism like FAC in the utility incentive mechanism. Another sustainable solution might be to have the net program cost and revenue loss built into utility's revenue requirement and thus into consumer tariffs up front. However, the latter approach requires institutionalization of EE as a resource. The utility incentive mechanisms would be able to address the utility disincentive of forgone long-run return but have a minor impact on consumer benefits. Fundamentally, providing incentives for EE programs to make them comparable to supply

  1. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abro...

  2. Research on unit commitment with large-scale wind power connected power system

    Science.gov (United States)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

    Large-scale integration of wind power generators into the power grid brings severe challenges to power system economic dispatch due to the stochastic volatility of wind. Unit commitment including wind farms is analyzed in two parts: modeling and solution methods. The structures and characteristics are summarized after classifying models according to their objective functions and constraints. Finally, the issues still to be solved and possible directions of future research and development are discussed, so as to adapt to the requirements of the electricity market, energy-saving generation dispatch and the smart grid, and to provide a reference for the research and practice of workers in this field.
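
    A minimal deterministic unit-commitment formulation with a fixed wind forecast illustrates the modeling part surveyed here. The sketch below uses the open-source PuLP library (an assumption; the article does not prescribe a solver) with illustrative generator data, and omits ramping limits, reserves and stochastic wind scenarios.

    ```python
    # Minimal unit commitment with wind: binary on/off plus dispatch variables,
    # with thermal generation covering load net of forecast wind. Generator
    # data, load and the wind forecast are illustrative assumptions.
    import pulp

    T = range(4)                                   # hours
    load = [400, 500, 650, 550]                    # MW
    wind = [120, 80, 60, 100]                      # forecast wind output, MW
    gens = {"g1": dict(pmin=100, pmax=400, cost=20, start=500),
            "g2": dict(pmin=50, pmax=200, cost=35, start=200)}

    prob = pulp.LpProblem("uc_with_wind", pulp.LpMinimize)
    u = pulp.LpVariable.dicts("on", (gens, T), cat="Binary")
    p = pulp.LpVariable.dicts("p", (gens, T), lowBound=0)
    v = pulp.LpVariable.dicts("startup", (gens, T), cat="Binary")

    # Objective: fuel cost plus start-up cost.
    prob += pulp.lpSum(gens[g]["cost"] * p[g][t] + gens[g]["start"] * v[g][t]
                       for g in gens for t in T)
    for t in T:
        prob += pulp.lpSum(p[g][t] for g in gens) == load[t] - wind[t]  # power balance
        for g in gens:
            prob += p[g][t] <= gens[g]["pmax"] * u[g][t]                # capacity if on
            prob += p[g][t] >= gens[g]["pmin"] * u[g][t]                # min output if on
            prev = u[g][t - 1] if t > 0 else 0                          # assume off before t=0
            prob += v[g][t] >= u[g][t] - prev                           # start-up indicator

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    for g in gens:
        print(g, [(pulp.value(u[g][t]), pulp.value(p[g][t])) for t in T])
    ```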

  3. First Joint Workshop on Energy Management for Large-Scale Research Infrastructures

    CERN Document Server

    2011-01-01

      CERN, ERF (European Association of National Research Facilities) and ESS (European Spallation Source) announce the first Joint Workshop on Energy Management for Large-Scale Research Infrastructures. The event will take place on 13-14 October 2011 at the ESS office in Sparta - Lund, Sweden.   The workshop will bring together international experts on energy and representatives from laboratories and future projects all over the world in order to identify the challenges and best practice in respect of energy efficiency and optimization, solutions and implementation as well as to review the challenges represented by potential future technical solutions and the tools for effective collaboration. Further information at: http://ess-scandinavia.eu/general-information

  4. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared. Then a state-of-the-art survey of currently available electrolysis modules was made. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers is discussed, and the influence of energy prices on the hydrogen production cost by large-scale electrolysis is evaluated. (authors)
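
    The sensitivity of hydrogen cost to electricity price follows from simple arithmetic: specific electricity consumption times electricity price dominates the variable production cost. A back-of-the-envelope sketch, assuming a typical practical consumption of roughly 50 kWh per kg of hydrogen (against a thermodynamic minimum near 39 kWh/kg on a higher-heating-value basis):

    ```python
    # Back-of-the-envelope hydrogen production cost from electrolysis.
    # The specific consumption and the price range are typical assumed values.
    SPECIFIC_KWH_PER_KG = 50.0   # practical large alkaline electrolyser, approx.

    for price_eur_per_kwh in (0.02, 0.05, 0.10):
        electricity_cost = SPECIFIC_KWH_PER_KG * price_eur_per_kwh  # EUR per kg H2
        print(f"{price_eur_per_kwh:.2f} EUR/kWh -> "
              f"{electricity_cost:.2f} EUR/kg H2 (electricity only)")
    ```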

  5. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations.

    Science.gov (United States)

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts.

  6. Institute for Fusion Research and Large Helical Device program

    International Nuclear Information System (INIS)

    Iiyoshi, Atsuo

    1989-01-01

    In research on nuclear fusion, the final objective is to realize nuclear fusion reactors; for this purpose, it is necessary to achieve nuclear burning by producing plasma hotter than 100 million degrees and confining it for a certain time. So far, research on diverse fusion processes has been advanced in various universities, but in February 1986 the Science Council issued the report 'Nuclear fusion research in universities hereafter'. As the next large-scale device, an external-conductor-system helical device was decided upon, and it is desirable to found an organization for joint utilization by national universities to promote the project. Research on the other processes is continued by utilizing the existing facilities. The reasons for selecting a helical device are that the database of the research carried out so far can be utilized fully, that it will still be sufficiently novel 10 years from now, and that many researchers can be brought together. The place of the research is Toki City, Gifu Prefecture, to which the Institute of Plasma Physics, Nagoya University, is to be moved. The basic concept of the superconducting helical device project, the trend of nuclear fusion development in the world, the physical research using a helical system and so on are reported. (Kako, I.)

  7. Report of the Workshop on Petascale Systems Integration for LargeScale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums, such as current research portfolios or vendor user groups. Unfortunately, issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate and stabilize a large-scale system may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  8. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  9. Ultra-Scale Visualization: Research and Education

    International Nuclear Information System (INIS)

    Ma, K-L; Ross, Robert; Huang Jian; Humphreys, Greg; Max, Nelson; Moreland, Kenneth; Owens, John D; Shen, H-W

    2007-01-01

    Understanding the science behind large-scale simulations and high-throughput experiments requires extracting meaning from data sets of hundreds of terabytes or more. Visualization is the most intuitive means for scientists to understand data at this scale, and the most effective way to communicate their findings with others. Even though visualization technology has matured over the past twenty years, it is still limited by the extent and scale of the data that it can be applied to, and also by the functionalities that were mostly designed for single-user, single-variable, and single-space investigation. The Institute for Ultra-Scale Visualization (IUSV), funded by the DOE SciDAC-2 program, has the mission to advance visualization technologies to enable knowledge discovery and dissemination for peta-scale applications. By working with the SciDAC application projects, Centers for Enabling Technology, and other Institutes, IUSV aims to lead the research innovation that can create new visualization capabilities needed for gleaning insights from data at petascale and beyond to solve forefront scientific problems. This paper outlines what we see as some of the biggest research challenges facing the visualization community, and how we can approach education and outreach to put successful research in the hands of scientists

  10. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size (e.g., in number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  11. Research and development of safeguards measures for the large scale reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Masahiro; Sato, Yuji; Yokota, Yasuhiro; Masuda, Shoichiro; Kobayashi, Isao; Uchikoshi, Seiji; Tsutaki, Yasuhiro; Nidaira, Kazuo [Nuclear Material Control Center, Tokyo (Japan)]

    1994-12-31

    The Government of Japan agreed on the safeguards concepts for a commercial-size reprocessing plant under the bilateral agreement for cooperation between Japan and the United States. In addition, LASCAR, the forum on large-scale reprocessing plant safeguards, obtained fruitful results in the spring of 1992. The research and development of safeguards measures for the Rokkasho Reprocessing Plant should progress with full regard to the concepts described in both documents. Basically, the material accountancy and monitoring system should be established based on NRTA and other measures in order to meet the timeliness goal for plutonium, and on the unattended-mode inspection approach based on an integrated containment/surveillance system coupled with radiation monitoring in order to reduce the inspection effort. NMCC has been studying the following measures for large-scale reprocessing plant safeguards: (1) a radiation gate monitor and integrated surveillance system; (2) near-real-time shipper/receiver difference monitoring; (3) a near-real-time material accountancy system operated for the bulk handling area; (4) a volume measurement technique in a large-scale input accountancy vessel; (5) an in-process inventory estimation technique applied to process equipment such as the pulse column and evaporator; (6) a solution transfer monitoring approach applied to buffer tanks in the chemical process; (7) a timely analysis technique such as a hybrid K-edge densitometer operated in the on-site laboratory. (J.P.N.)
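
    Near-real-time accountancy of the kind itemized above rests on the material balance MUF = beginning inventory + receipts - shipments - ending inventory, evaluated every balance period and tested against its measurement uncertainty, for instance with a cumulative sum. A minimal sketch on synthetic data (all quantities and error models are illustrative assumptions; correlations between successive inventory measurements are ignored):

    ```python
    # Near-real-time accountancy sketch: per-period material unaccounted for
    # (MUF) and a simple CUSUM test over periods. All quantities are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    periods = 10
    true_flow = 25.0                  # material transferred per period (illustrative)
    sigma = 0.15                      # measurement std dev per balance term

    begin_inv = 40.0 + rng.normal(0, sigma, periods)
    receipts = true_flow + rng.normal(0, sigma, periods)
    shipments = true_flow + rng.normal(0, sigma, periods)
    end_inv = 40.0 + rng.normal(0, sigma, periods)

    muf = begin_inv + receipts - shipments - end_inv   # per-period balance
    sigma_muf = np.sqrt(4.0) * sigma                   # 4 independent terms assumed
    cusum = np.cumsum(muf)
    # Flag periods where the cumulative MUF exceeds 3 sigma of its growth.
    alarm = np.abs(cusum) > 3 * sigma_muf * np.sqrt(np.arange(1, periods + 1))
    print(np.round(muf, 2), alarm)
    ```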

  12. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  13. Large scale indenter test program to measure sub gouge displacements

    Energy Technology Data Exchange (ETDEWEB)

    Been, Ken; Lopez, Juan [Golder Associates Inc, Houston, TX (United States); Sancio, Rodolfo [MMI Engineering Inc., Houston, TX (United States)

    2011-07-01

    Installing submarine pipelines in an offshore environment covered with ice is very challenging. Several precautions must be taken, such as burying the pipelines to protect them from the ice movement caused by gouging. The estimation of subgouge displacements is a key factor in pipeline design for ice-gouged environments. This paper investigated a method to measure subgouge displacements. An experimental program was implemented in an open field to produce large-scale idealized gouges on engineered soil beds (sand and clay). The horizontal force required to produce the gouge, the subgouge displacements in the soil, and the strain imposed by these displacements on a buried model pipeline were monitored. The results showed that, for a given keel, the gouge depth was inversely proportional to undrained shear strength in clay. The subgouge displacements measured did not show a relationship with the gouge depth, width or soil density in the sand and clay tests.

  14. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations

    Science.gov (United States)

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts. PMID:25993414

  15. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt, Patrick Hulin, Tim Leek, Fredrich Ulrich, Ryan Whelan (authors listed...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag...low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  16. The Hualien Large-Scale Seismic Test for soil-structure interaction research

    International Nuclear Information System (INIS)

    Tang, H.T.; Stepp, J.C.; Cheng, Y.H.

    1991-01-01

    A Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, has been initiated with the primary objective of obtaining earthquake-induced SSI data at a stiff soil site having soil conditions similar to those of a prototypical nuclear power plant. Preliminary soil boring, geophysical testing and monitoring of ambient and earthquake-induced ground motion have been conducted to understand the site conditions. More refined field and laboratory tests, such as the state-of-the-art freezing sampling technique and the large penetration test (LPT) method, will be conducted to characterize the soil constitutive behavior. The test model to be constructed will be similar to the Lotung model. The instrumentation layout will be designed to provide data for studies of SSI, spatial incoherence, soil stability, foundation uplifting, ground motion wave fields and structural response. A consortium consisting of EPRI, Taipower, CRIEPI, TEPCO, CEA, EdF and Framatome has been established to carry out the project. It is envisaged that the Hualien SSI array will be ready to record earthquakes by the middle of 1992. The recording is scheduled to last five years. (author)

  17. Managing sensitive phenotypic data and biomaterial in large-scale collaborative psychiatric genetic research projects: practical considerations.

    Science.gov (United States)

    Demiroglu, S Y; Skrowny, D; Quade, M; Schwanke, J; Budde, M; Gullatz, V; Reich-Erkelenz, D; Jakob, J J; Falkai, P; Rienhoff, O; Helbing, K; Heilbronner, U; Schulze, T G

    2012-12-01

    Large-scale collaborative research will be a hallmark of future psychiatric genetic research. Ideally, both academic and non-academic institutions should be able to participate in such collaborations to allow for the establishment of very large samples in a straightforward manner. Any such endeavor requires an easy-to-implement information technology (IT) framework. Here we present the requirements for a centralized framework and describe how they can be met through a modular IT toolbox.

  18. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane segmentation intelligent workshop is a new development, and there has been no prior research in related fields in China or abroad. The mode of production has to be transformed from the existing Industry 2.0, or partly Industry 3.0, pattern of "human analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great many tasks need to be settled in terms of both management and technology, such as the evolution of the workshop structure, the development of intelligent equipment and changes in the business model; with them comes the reformation of the whole workshop. The process simulation in this project verifies the general layout and process flow of the large-scale ship plane segmentation intelligent workshop and analyzes the workshop's working efficiency, which is significant for the next step of the transformation toward the plane segmentation intelligent workshop.

  19. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    Science.gov (United States)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB differs substantially from SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether the ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  20. A Heuristic Approach to Author Name Disambiguation in Bibliometrics Databases for Large-scale Research Assessments

    NARCIS (Netherlands)

    D'Angelo, C.A.; Giuffrida, C.; Abramo, G.

    2011-01-01

    National exercises for the evaluation of research activity by universities are becoming regular practice in ever more countries. These exercises have mainly been conducted through the application of peer-review methods. Bibliometrics has not been able to offer a valid large-scale alternative because

  1. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R and S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules based on multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) program of the Dutch Ministry of Economic Affairs and the Dutch Ministry of Education, Culture and Sciences. The aim of the EET project is to reduce the cost of a solar module by 50%, both by increasing the conversion efficiency and by developing cheap processes for large-scale production.

  2. National studies on recidivism: an inventory of large-scale recidivism research in 33 European countries

    NARCIS (Netherlands)

    Wartna, B.S.J.; Nijssen, L.T.J.

    2006-01-01

    Measuring recidivism is an established method for examining the effects of penal interventions. Over the last decades the automation of police and judiciary data has opened up opportunities to do large-scale recidivism research. The WODC has made an inventory of the studies that are carried out in

  3. News from heat-pump research - Large-scale heat pumps, components, heat pumps and solar heating; News aus der Waermepumpen-Forschung - Gross-Waermepumpen, Komponenten, Waermepumpe und Solar

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-06-15

    These proceedings summarise the presentations made at the 16th annual meeting held by the Swiss Federal Office of Energy's Heat Pump Research Program in Burgdorf, Switzerland. The proceedings include contributions on large-scale heat pumps, components and the activities of the heat pump promotion society. A summary of targets and trends in energy research in general is presented and an overview of the heat pump market in 2009 and future perspectives is given. International work within the framework of the International Energy Agency's heat pump group is reviewed, including solar - heat pump combinations. Field-monitoring and the analysis of large-scale heat pumps are discussed and the importance of the use of correct concepts in such installations is stressed. Large-scale heat pumps with carbon dioxide as working fluid are looked at, as are output-regulated air/water heat pumps. Efficient system solutions with heat pumps used both to heat and to cool are discussed. Deep geothermal probes and the potential offered by geothermal probes using carbon dioxide as a working fluid are discussed. The proceedings are rounded off with a list of useful addresses.

  4. Collaborative mining and interpretation of large-scale data for biomedical research insights.

    Directory of Open Access Journals (Sweden)

    Georgia Tsiliki

    Full Text Available Biomedical research is becoming increasingly interdisciplinary and collaborative in nature. Researchers need to collaborate efficiently and effectively and to make decisions by meaningfully assembling, mining and analyzing the available large-scale volumes of complex, multi-faceted data residing in different sources. In line with related research showing that, in spite of recent advances in data mining and computational analysis, humans can easily detect patterns that computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision-making processes. User experience shows that the platform enables more informed and quicker decisions by displaying aggregated information according to users' needs, while also exploiting the associated human intelligence.

  5. Cohort Profile of The GOALS Study: A Large-scale Research of Physical Activity in Dutch Students

    NARCIS (Netherlands)

    De Groot, Renate; Van Dijk, Martin; Kirschner, Paul A.

    2016-01-01

    The GOALS study (Grootschalig Onderzoek naar Activiteiten van Limburgse Scholieren [Large-scale Research of Activities in Dutch Students]) was set up to investigate possible associations between different forms of physical activity and inactivity with cognitive performance, academic achievement and

  6. Analogue scale modelling of extensional tectonic processes using a large state-of-the-art centrifuge

    Science.gov (United States)

    Park, Heon-Joon; Lee, Changyeol

    2017-04-01

    Analogue scale modelling of extensional tectonic processes such as rifting and basin opening has been conducted many times. Among the controlling factors, the gravitational acceleration (g) acting on the scale models has been treated as a constant (Earth's gravity) in most analogue model studies, and only a few studies have considered larger gravitational accelerations by using a centrifuge (an apparatus generating a large centrifugal force by rotating the model at high speed). Although analogue models using a centrifuge allow large scale-down and accelerated deformation driven by density differences, such as salt diapirs, the possible model size is mostly limited to about 10 cm. A state-of-the-art centrifuge installed at the KOCED Geotechnical Centrifuge Testing Center, Korea Advanced Institute of Science and Technology (KAIST), allows scale models with a surface area of up to 70 by 70 cm under a maximum capacity of 240 g-tons. Using this centrifuge, we will conduct analogue scale modelling of extensional tectonic processes such as the opening of back-arc basins. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (grant number 2014R1A6A3A04056405).

  7. Large scale gas chromatographic demonstration system for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Cheh, C.H.

    1988-01-01

    A large-scale demonstration system was designed for a throughput of 3 mol/day of an equimolar mixture of H, D, and T. The demonstration system was assembled and an experimental program was carried out. The project was funded by Kernforschungszentrum Karlsruhe, Canadian Fusion Fuel Technology Projects and Ontario Hydro Research Division. Several major design innovations were successfully implemented in the demonstration system and are discussed in detail. Many experiments were carried out in the demonstration system to study its performance in separating hydrogen isotopes at high throughput. Various temperature-programming schemes were tested, heart-cutting operation was evaluated, and very large samples (up to 138 NL per injection) were separated in the system. The results showed that the specially designed column performed well as a chromatographic column and that good separation could be achieved even when a 138 NL sample was injected.

  8. Quantifying expert consensus against the existence of a secret, large-scale atmospheric spraying program

    Science.gov (United States)

    Shearer, Christine; West, Mick; Caldeira, Ken; Davis, Steven J.

    2016-08-01

    Nearly 17% of people in an international survey said they believed the existence of a secret large-scale atmospheric program (SLAP) to be true or partly true. SLAP is commonly referred to as ‘chemtrails’ or ‘covert geoengineering’, and has led to a number of websites purporting to show evidence of widespread chemical spraying linked to negative impacts on human health and the environment. To address these claims, we surveyed two groups of experts—atmospheric chemists with expertise in condensation trails and geochemists working on atmospheric deposition of dust and pollution—to scientifically evaluate for the first time the claims of SLAP theorists. Results show that 76 of the 77 scientists (98.7%) that took part in this study said they had not encountered evidence of a SLAP, and that the data cited as evidence could be explained through other factors, including well-understood physics and chemistry associated with aircraft contrails and atmospheric aerosols. Our goal is not to sway those already convinced that there is a secret, large-scale spraying program—who often reject counter-evidence as further proof of their theories—but rather to establish a source of objective science that can inform public discourse.

  9. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32, T. P. Liddiard and D. Price, Research and Technology Department, March 1987; approved for public release. Surviving abstract fragments: the expanded test is intended, in part, to reduce the spread in the LSGT 50% gap value; the worst charges, such as those with the highest or lowest densities, the largest re-pressed …

  10. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challenges...

  11. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies...

  12. Large-scale structures in turbulent Couette flow

    Science.gov (United States)

    Kim, Jung Hoon; Lee, Jae Hwa

    2016-11-01

    Direct numerical simulation of fully developed turbulent Couette flow is performed with a large computational domain in the streamwise and spanwise directions (40 πh and 6 πh, where h is the channel half-height) to investigate the streamwise-scale growth mechanism of the streamwise velocity fluctuating structures in the core region. It is shown that long streamwise-scale structures (> 3 h) are highly energetic, contributing more than 80% of the turbulent kinetic energy and Reynolds shear stress, compared to previous studies in canonical Poiseuille flows. Instantaneous and statistical analyses show that negative-u' structures on the bottom wall of the Couette flow grow continuously in the streamwise direction due to mean shear, and they penetrate to the opposite moving wall. The geometric center of the log layer is observed at the centerline with a dominant outer peak in the streamwise spectrum, and the maximum streamwise extent of the structures is found at the centerline, similar to previous observations in turbulent Poiseuille flows at high Reynolds number. Further inspection of time-evolving instantaneous fields clearly shows that adjacent long structures combine to form a longer structure at the centerline. This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2014R1A1A2057031).

  13. BREEDER: a microcomputer program for financial analysis of a large-scale prototype breeder reactor

    International Nuclear Information System (INIS)

    Giese, R.F.

    1984-04-01

    This report describes a microcomputer-based, single-project financial analysis program: BREEDER. BREEDER is a user-friendly model designed to facilitate frequent and rapid analyses of the financial implications associated with alternative design and financing strategies for electric generating plants, and large-scale prototype breeder (LSPB) reactors in particular. The model has proved to be a useful tool in establishing cost goals for LSPB reactors. The program is available on floppy disks for use on an IBM personal computer (or IBM look-alike) running under PC-DOS or a Kaypro II transportable computer running under CP/M (and many other CP/M machines). The report documents version 1.5 of BREEDER and contains a user's guide. It also includes a general overview of BREEDER, a summary of hardware requirements, a definition of all required program inputs, a description of all algorithms used in performing the construction-period and operation-period analyses, and a summary of all available reports. The appendixes contain a complete source-code listing, a cross-reference table, a sample interactive session, several sample runs, and additional documentation of the net-equity program option.
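
    The record describes BREEDER's construction-period and operation-period algorithms only at a high level. As a rough illustration of one standard element of such models, the Python sketch below compounds construction-year outlays forward to commercial operation (interest during construction, mid-year convention). The rate, outlays, and function name are hypothetical, not BREEDER's actual algorithm.

        def construction_period_cost(annual_outlays, rate):
            """Compound each construction-year outlay forward to commercial
            operation, approximating interest during construction."""
            years = len(annual_outlays)
            total = 0.0
            for t, outlay in enumerate(annual_outlays):
                # mid-year convention: money spent in year t accrues for
                # roughly (years - t - 0.5) years before start-up
                total += outlay * (1.0 + rate) ** (years - t - 0.5)
            return total

        # Hypothetical inputs: five construction years, 10% cost of capital
        outlays = [100e6, 250e6, 400e6, 300e6, 150e6]
        print(f"Capitalized cost at start-up: ${construction_period_cost(outlays, 0.10) / 1e6:,.0f}M")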

  14. The relationship between large-scale and convective states in the tropics - Towards an improved representation of convection in large-scale models

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, Christian [Monash Univ., Melbourne, VIC (Australia)

    2015-02-26

    This report summarises an investigation into the relationship of tropical thunderstorms to the atmospheric conditions they are embedded in. The study is based on the use of radar observations at the Atmospheric Radiation Measurement site in Darwin run under the auspices of the DOE Atmospheric Systems Research program. Linking the larger scales of the atmosphere with the smaller scales of thunderstorms is crucial for the development of the representation of thunderstorms in weather and climate models, which is carried out by a process termed parametrisation. Through the analysis of radar and wind profiler observations the project made several fundamental discoveries about tropical storms and quantified the relationship of the occurrence and intensity of these storms to the large-scale atmosphere. We were able to show that the rainfall averaged over an area the size of a typical climate model grid-box is largely controlled by the number of storms in the area, and less so by the storm intensity. This allows us to completely rethink the way we represent such storms in climate models. We also found that storms occur in three distinct categories based on their depth and that the transition between these categories is strongly related to the larger scale dynamical features of the atmosphere more so than its thermodynamic state. Finally, we used our observational findings to test and refine a new approach to cumulus parametrisation which relies on the stochastic modelling of the area covered by different convective cloud types.

  15. Large scale seismic test research at Hualien site in Taiwan. Results of site investigation and characterization of the foundation ground

    International Nuclear Information System (INIS)

    Okamoto, Toshiro; Kokusho, Takeharu; Nishi, Koichi

    1998-01-01

    An international joint research program called "HLSST" is under way. A Large-Scale Seismic Test (LSST) is to be conducted to investigate soil-structure interaction (SSI) during large earthquakes in the field at Hualien, a highly seismic region of Taiwan. A 1/4-scale model building was constructed on the excavated gravelly ground, and backfill material of crushed stones was placed around the model building. The model building and the foundation ground were extensively instrumented to monitor structure and ground response. To accurately evaluate SSI during earthquakes, geotechnical investigations and forced vibration tests were performed during the construction process, namely before and after the base excavation, after the structure construction, and after the backfilling. The main results are as follows. (1) The distribution of the mechanical properties of the gravelly soil was measured by various techniques, including penetration tests and PS-logging; the shear wave velocities (Vs) were found to change clearly with the changing overburden pressures during the construction process. (2) Vs measured in the surrounding soils was found to be smaller than that at almost the same depth at locations farther away. The numerical soil model for SSI analysis is discussed further. (author)

  16. Estimating the Effectiveness of Special Education Using Large-Scale Assessment Data

    Science.gov (United States)

    Ewing, Katherine Anne

    2009-01-01

    The inclusion of students with disabilities in large scale assessment and accountability programs has provided new opportunities to examine the impact of special education services on student achievement. Hanushek, Kain, and Rivkin (1998, 2002) evaluated the effectiveness of special education programs by examining students' gains on a large-scale…

  17. Interior Point Methods for Large-Scale Nonlinear Programming

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2005-01-01

    Roč. 20, č. 4-5 (2005), s. 569-582 ISSN 1055-6788 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords : nonlinear programming * interior point methods * KKT systems * indefinite preconditioners * filter methods * algorithms Subject RIV: BA - General Mathematics Impact factor: 0.477, year: 2005

  18. Cohort Profile of the Goals Study: A Large-Scale Research of Physical Activity in Dutch Students

    Science.gov (United States)

    de Groot, Renate H. M.; van Dijk, Martin L.; Kirschner, Paul A.

    2015-01-01

    The GOALS study (Grootschalig Onderzoek naar Activiteiten van Limburgse Scholieren [Large-scale Research of Activities in Dutch Students]) was set up to investigate possible associations between different forms of physical activity and inactivity with cognitive performance, academic achievement and mental well-being. It was conducted at a…

  19. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
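
    Giraph itself is a Java system, and this record contains no code. As a language-neutral illustration of the vertex-centric ("think like a vertex") model the book covers, here is a toy single-machine Python sketch of Pregel-style supersteps computing single-source shortest paths; the names and in-memory message passing are simplifications of what Giraph distributes across a cluster.

        import math

        def pregel_sssp(graph, source):
            """Toy sketch of the Pregel/Giraph model: each superstep, active
            vertices combine incoming messages, update their value, and send
            messages along out-edges. graph: {vertex: [(neighbor, weight), ...]}"""
            value = {v: math.inf for v in graph}
            inbox = {v: [] for v in graph}
            inbox[source] = [0.0]
            while any(inbox.values()):                  # run until no messages remain
                outbox = {v: [] for v in graph}
                for v, msgs in inbox.items():
                    if not msgs:
                        continue                        # vertex stays halted
                    best = min(msgs)
                    if best < value[v]:
                        value[v] = best
                        for u, w in graph[v]:
                            outbox[u].append(best + w)  # message to neighbor
                inbox = outbox
            return value

        g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
        print(pregel_sssp(g, "a"))  # {'a': 0.0, 'b': 1.0, 'c': 3.0}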

  20. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  1. Development, implementation and evaluation of a clinical research engagement and leadership capacity building program in a large Australian health care service.

    Science.gov (United States)

    Misso, Marie L; Ilic, Dragan; Haines, Terry P; Hutchinson, Alison M; East, Christine E; Teede, Helena J

    2016-01-14

    Health professionals need to be integrated more effectively in clinical research to ensure that research addresses clinical needs and provides practical solutions at the coal face of care. In light of limited evidence on how best to achieve this, evaluation of strategies to introduce, adapt and sustain evidence-based practices across different populations and settings is required. This project aims to address this gap through the co-design, development, implementation, evaluation, refinement and ultimately scale-up of a clinical research engagement and leadership capacity building program in a clinical setting with little to no co-ordinated approach to clinical research engagement and education. The protocol is based on principles of research capacity building and on a six-step framework, which have previously led to successful implementation and long-term sustainability. A mixed-methods study design will be used. Methods will include: (1) a review of the literature about strategies that engage health professionals in research through capacity building and/or education in research methods; (2) a review of existing local research education and support elements; (3) a needs assessment in the local clinical setting, including an online cross-sectional survey and semi-structured interviews; (4) co-design and development of an educational and support program; (5) implementation of the program in the clinical environment; and (6) pre- and post-implementation evaluation and ultimately program scale-up. The evaluation focuses on research activity and on knowledge, attitudes and preferences about clinical research, evidence-based practice and leadership, and, post-implementation, on participants' satisfaction with the program. The investigators will evaluate the feasibility and effect of the program according to capacity building measures and will revise where appropriate prior to scale-up. It is anticipated that this clinical research engagement and leadership capacity building

  2. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improve system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  3. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the Task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large-scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large-scale solar purchasing amongst potential large-scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large-scale active solar heating purchasing activity within the UK. (author)

  4. Punctuated equilibrium in the large-scale evolution of programming languages†

    Science.gov (United States)

    Valverde, Sergi; Solé, Ricard V.

    2015-01-01

    The analogies and differences between biological and cultural evolution have been explored by evolutionary biologists, historians, engineers and linguists alike. Two well-known domains of cultural change are language and technology. Both share some traits with the evolution of species, but technological change is very difficult to study. A major challenge on our way towards a scientific theory of technological evolution is how to properly define evolutionary trees or clades and how to weight the role played by horizontal transfer of information. Here, we study the large-scale historical development of programming languages, which have deeply marked social and technological advances in the last half century. We analyse their historical connections using network theory and reconstructed phylogenetic networks. Using both data analysis and network modelling, it is shown that their evolution is highly uneven, marked by innovation events where new languages are created out of improved combinations of different structural components belonging to previous languages. These radiation events occur in a bursty pattern and are tied to novel technological and social niches. The method can be extrapolated to other systems and consistently captures the major classes of languages and the widespread horizontal design exchanges, revealing a punctuated evolutionary path. PMID:25994298

  5. Punctuated equilibrium in the large-scale evolution of programming languages.

    Science.gov (United States)

    Valverde, Sergi; Solé, Ricard V

    2015-06-06

    The analogies and differences between biological and cultural evolution have been explored by evolutionary biologists, historians, engineers and linguists alike. Two well-known domains of cultural change are language and technology. Both share some traits with the evolution of species, but technological change is very difficult to study. A major challenge on our way towards a scientific theory of technological evolution is how to properly define evolutionary trees or clades and how to weight the role played by horizontal transfer of information. Here, we study the large-scale historical development of programming languages, which have deeply marked social and technological advances in the last half century. We analyse their historical connections using network theory and reconstructed phylogenetic networks. Using both data analysis and network modelling, it is shown that their evolution is highly uneven, marked by innovation events where new languages are created out of improved combinations of different structural components belonging to previous languages. These radiation events occur in a bursty pattern and are tied to novel technological and social niches. The method can be extrapolated to other systems and consistently captures the major classes of languages and the widespread horizontal design exchanges, revealing a punctuated evolutionary path. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
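
    The paper's data and network reconstruction are not reproduced in this record. The Python sketch below, using networkx with an illustrative hand-made edge list rather than the paper's data, shows the general shape of such an analysis: represent design influences as a directed graph and look for languages that recombine several ancestors.

        import networkx as nx

        # Toy directed "influence" network: an edge A -> B means language B
        # lists A among its design influences (edges are illustrative only).
        edges = [
            ("Fortran", "ALGOL"), ("ALGOL", "Pascal"), ("ALGOL", "C"),
            ("C", "C++"), ("C", "Java"), ("C++", "Java"),
            ("Lisp", "Scheme"), ("Scheme", "JavaScript"), ("Java", "JavaScript"),
        ]
        G = nx.DiGraph(edges)

        # Languages created by recombining several ancestors show up as nodes
        # with high in-degree, echoing the "innovation events" the paper ties
        # to bursts of change.
        recombiners = sorted(G.in_degree, key=lambda kv: kv[1], reverse=True)
        print(recombiners[:3])  # e.g. JavaScript draws on both the C and Lisp lineages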

  6. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    Science.gov (United States)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent-Based Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for reducing capital expenses. The models in this paper use computational algorithms and procedures implemented in Matlab to simulate agent-based models, run on clusters that provide the high-performance computation needed to execute the program in parallel. In both cases, a model is defined as a compilation of structures and processes assumed to underlie the behavior of a network system.
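
    The paper's Matlab models are not shown in the record. As a minimal illustration of the agent-based pattern it describes (many agents with local state and simple per-step rules, here for an inventory network), consider the following Python sketch; all parameters and the reorder policy are hypothetical.

        import random

        class Node:
            """Toy network agent: holds inventory and serves random demand each step."""
            def __init__(self, stock, reorder_point, order_size):
                self.stock, self.reorder_point, self.order_size = stock, reorder_point, order_size
                self.stockouts = 0

            def step(self):
                demand = random.randint(0, 5)
                if demand > self.stock:
                    self.stockouts += 1             # demand exceeded inventory
                self.stock = max(self.stock - demand, 0)
                if self.stock <= self.reorder_point:
                    self.stock += self.order_size   # simple reorder policy

        def simulate(num_agents=1000, steps=365):
            agents = [Node(stock=20, reorder_point=5, order_size=15) for _ in range(num_agents)]
            for _ in range(steps):
                for agent in agents:
                    agent.step()
            return sum(a.stockouts for a in agents) / num_agents

        random.seed(1)
        print(f"Mean stockouts per node-year: {simulate():.2f}")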

  7. The assessment of the readiness of five countries to implement child maltreatment prevention programs on a large scale.

    Science.gov (United States)

    Mikton, Christopher; Power, Mick; Raleva, Marija; Makoae, Mokhantso; Al Eissa, Majid; Cheah, Irene; Cardia, Nancy; Choo, Claire; Almuneef, Maha

    2013-12-01

    This study aimed to systematically assess the readiness of five countries - Brazil, the Former Yugoslav Republic of Macedonia, Malaysia, Saudi Arabia, and South Africa - to implement evidence-based child maltreatment prevention programs on a large scale. To this end, it applied a recently developed method called Readiness Assessment for the Prevention of Child Maltreatment, based on two parallel 100-item instruments. The first measures the knowledge, attitudes, and beliefs concerning child maltreatment prevention of key informants; the second, completed by child maltreatment prevention experts using all available data in the country, produces a more objective assessment of readiness. The instruments cover all of the main aspects of readiness including, for instance, availability of scientific data on the problem, legislation and policies, will to address the problem, and material resources. Key informant scores ranged from 31.2 (Brazil) to 45.8/100 (the Former Yugoslav Republic of Macedonia) and expert scores from 35.2 (Brazil) to 56/100 (Malaysia). Major gaps identified in almost all countries included a lack of professionals with the skills, knowledge, and expertise to implement evidence-based child maltreatment programs and of institutions to train them; inadequate funding, infrastructure, and equipment; the extreme rarity of outcome evaluations of prevention programs; and the lack of national prevalence surveys of child maltreatment. In sum, the five countries are in a low to moderate state of readiness to implement evidence-based child maltreatment prevention programs on a large scale. Such an assessment of readiness - the first of its kind - allows gaps to be identified and then addressed to increase the likelihood of program success. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Increasing condom use and declining STI prevalence in high-risk MSM and TGs: evaluation of a large-scale prevention program in Tamil Nadu, India.

    Science.gov (United States)

    Subramanian, Thilakavathi; Ramakrishnan, Lakshmi; Aridoss, Santhakumar; Goswami, Prabuddhagopal; Kanguswami, Boopathi; Shajan, Mathew; Adhikary, Rajat; Purushothaman, Girish Kumar Chethrapilly; Ramamoorthy, Senthil Kumar; Chinnaswamy, Eswaramurthy; Veeramani, Ilaya Bharathy; Paranjape, Ramesh Shivram

    2013-09-17

    This paper presents an evaluation of Avahan, a large-scale HIV prevention program implemented using peer-mediated strategies, condom distribution and sexually transmitted infection (STI) clinical services among high-risk men who have sex with men (HR-MSM) and male-to-female transgender persons (TGs) in the high-prevalence state of Tamil Nadu in southern India, one of six states covered by the program. Two rounds of large-scale cross-sectional bio-behavioural surveys among HR-MSM and TGs and routine program monitoring data were used to assess changes in program coverage, condom use and prevalence of STIs (including HIV) and their association with program exposure. The Avahan program for HR-MSM and TGs in Tamil Nadu was significantly scaled up, and contacts by peer educators reached 77 percent of the estimated denominator by the end of the program's fourth year. Exposure to the program increased between the two rounds of surveys for both HR-MSM (from 66 percent to 90 percent; AOR = 4.6; p …) and TGs (…). The program in Tamil Nadu achieved high coverage, resulting in improved condom use by HR-MSM with their regular and commercial male partners. Declining STI prevalence and stable HIV prevalence reflect the positive effects of the prevention strategy. Outcomes from the program logic model indicate the effectiveness of the program for HR-MSM and TGs in Tamil Nadu.

  9. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    Science.gov (United States)

    2012-01-01

    Surviving abstract fragments: the disadvantages of ML-Chord are its fixed size (two layers) and limited scalability for large-scale systems; RC-Chord (D. Karrels et al.) extends ML-Chord … configurable before runtime; this can be improved by incorporating a distributed learning algorithm to tune the number and range of the DLoE tracking …

  10. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have described in detail the large-scale communication architecture of the IOT. In fact, the non-uniform technologies between IPv6 and access points have led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  11. Development of fine-resolution analyses and expanded large-scale forcing properties: 2. Scale awareness and application to single-column model experiments

    Science.gov (United States)

    Feng, Sha; Li, Zhijin; Liu, Yangang; Lin, Wuyin; Zhang, Minghua; Toto, Tami; Vogelmann, Andrew M.; Endo, Satoshi

    2015-01-01

    Fine-resolution three-dimensional fields have been produced using the Community Gridpoint Statistical Interpolation (GSI) data assimilation system for the U.S. Department of Energy's Atmospheric Radiation Measurement Program (ARM) Southern Great Plains region. The GSI system is implemented in a multiscale data assimilation framework using the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. From the fine-resolution three-dimensional fields, large-scale forcing is derived explicitly at grid-scale resolution; a subgrid-scale dynamic component is derived separately, representing subgrid-scale horizontal dynamic processes. Analyses show that the subgrid-scale dynamic component is often a major component of the large-scale forcing for grid scales larger than 200 km. The single-column model (SCM) of the Community Atmospheric Model version 5 is used to examine the impact of the grid-scale and subgrid-scale dynamic components on simulated precipitation and cloud fields associated with a mesoscale convective system. It is found that grid-scale size impacts simulated precipitation, resulting in an overestimation for grid scales of about 200 km but an underestimation for smaller grids. The subgrid-scale dynamic component has an appreciable impact on the simulations, suggesting that grid-scale and subgrid-scale dynamic components should be considered in the interpretation of SCM simulations.

  12. Effect Of A Large-Scale Social Franchising And Telemedicine Program On Childhood Diarrhea And Pneumonia Outcomes In India.

    Science.gov (United States)

    Mohanan, Manoj; Babiarz, Kimberly S; Goldhaber-Fiebert, Jeremy D; Miller, Grant; Vera-Hernández, Marcos

    2016-10-01

    Despite the rapid growth of social franchising, there is little evidence on its population impact in the health sector. Similar in many ways to private-sector commercial franchising, social franchising can be found in sectors with a social objective, such as health care. This article evaluates the World Health Partners (WHP) Sky program, a large-scale social franchising and telemedicine program in Bihar, India. We studied appropriate treatment for childhood diarrhea and pneumonia and associated health care outcomes. We used multivariate difference-in-differences models to analyze data on 67,950 children ages five and under in 2011 and 2014. We found that the WHP-Sky program did not improve rates of appropriate treatment or disease prevalence. Both provider participation and service use among target populations were low. Our results do not imply that social franchising cannot succeed; instead, they underscore the importance of understanding factors that explain variation in the performance of social franchises. Our findings also highlight, for donors and governments in particular, the importance of conducting rigorous impact evaluations of new and potentially innovative health care delivery programs before investing in scaling them up. Published by Project HOPE—The People-to-People Health Foundation, Inc.
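
    The study's difference-in-differences design can be illustrated with a minimal Python sketch using pandas and statsmodels. The column names (`treated`, `post`, `appropriate_tx`) and the tiny dataset are hypothetical; the actual analysis was multivariate, with covariates and appropriate clustering of standard errors.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical child-level rows: `treated` marks program districts,
        # `post` marks the 2014 survey round, `appropriate_tx` marks correct treatment.
        df = pd.DataFrame({
            "appropriate_tx": [0, 1, 0, 1, 1, 0, 1, 1],
            "treated":        [0, 0, 1, 1, 0, 0, 1, 1],
            "post":           [0, 0, 0, 0, 1, 1, 1, 1],
        })

        # The interaction term `treated:post` is the difference-in-differences
        # estimate of the program effect.
        model = smf.ols("appropriate_tx ~ treated * post", data=df).fit()
        print(model.params["treated:post"])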

  13. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    Science.gov (United States)

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
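
    SciLuigi builds on the Luigi workflow system, and the record includes no code. As a minimal sketch of the underlying Luigi idiom it extends (tasks declaring outputs and upstream dependencies; requires the luigi package), consider the following; SciLuigi's own API replaces requires() with separately wired, named input/output ports, and the task and file names here are hypothetical.

        import luigi

        class TrainModel(luigi.Task):
            """One cross-validation fold; the parameter makes the task reusable."""
            fold = luigi.IntParameter()

            def output(self):
                return luigi.LocalTarget(f"model_fold{self.fold}.txt")

            def run(self):
                with self.output().open("w") as f:
                    f.write(f"model for fold {self.fold}\n")

        class SelectBest(luigi.Task):
            """Depends on all folds; Luigi schedules the upstream tasks first."""
            def requires(self):
                return [TrainModel(fold=i) for i in range(5)]

            def output(self):
                return luigi.LocalTarget("best_model.txt")

            def run(self):
                models = [inp.open("r").read() for inp in self.input()]
                with self.output().open("w") as f:
                    f.write(models[0])  # placeholder "selection" step

        if __name__ == "__main__":
            luigi.build([SelectBest()], local_scheduler=True)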

  14. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of VSC-based HVDC technology for future large-scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large-scale offshore...

  15. Practical experience from the Office of Adolescent Health's large scale implementation of an evidence-based Teen Pregnancy Prevention Program.

    Science.gov (United States)

    Margolis, Amy Lynn; Roper, Allison Yvonne

    2014-03-01

    After 3 years of experience overseeing the implementation and evaluation of evidence-based teen pregnancy prevention programs in a diversity of populations and settings across the country, the Office of Adolescent Health (OAH) has learned numerous lessons through practical application and new experiences. These lessons and experiences are applicable to those working to implement evidence-based programs on a large scale. The lessons described in this paper focus on what it means for a program to be implementation ready, the role of the program developer in replicating evidence-based programs, the importance of a planning period to ensure quality implementation, the need to define and measure fidelity, and the conditions necessary to support rigorous grantee-level evaluation. Published by Elsevier Inc.

  16. Large Scale Computing and Storage Requirements for Basic Energy Sciences Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard; Wasserman, Harvey

    2011-03-31

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility supporting research within the Department of Energy's Office of Science. NERSC provides high-performance computing (HPC) resources to approximately 4,000 researchers working on about 400 projects. In addition to hosting large-scale computing facilities, NERSC provides the support and expertise scientists need to use HPC systems effectively and efficiently. In February 2010, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR) and DOE's Office of Basic Energy Sciences (BES) held a workshop to characterize HPC requirements for BES research through 2013. The workshop was part of NERSC's legacy of anticipating users' future needs and deploying the necessary resources to meet these demands. Workshop participants reached a consensus on several key findings, in addition to achieving the workshop's goal of collecting and characterizing computing requirements. The key requirements for scientists conducting research in BES are: (1) larger allocations of computational resources; (2) continued support for standard application software packages; (3) adequate job turnaround time and throughput; and (4) guidance and support for using future computer architectures. This report expands upon these key points and presents others. Several 'case studies' are included as significant representative samples of the needs of science teams within BES. Research teams' scientific goals, computational methods of solution, current and 2013 computing requirements, and special software and support needs are summarized in these case studies. Also included are researchers' strategies for computing in the highly parallel, 'multi-core' environment that is expected to dominate HPC architectures over the next few years. NERSC has strategic plans and initiatives already underway that address key workshop findings. This report includes a...

  17. Survey and research for the enhancement of large-scale technology development 2. How large-scale technology development should be in the future; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 2. Kongo no ogata gijutsu kaihatsu no arikata

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    A survey was conducted through interviews with people engaged in industrial technology development at the businesses entrusted under the large-scale industrial technology development system, and with people of experience or academic standing involved in the project enhancement effort. Needs for improvement are pointed out: that the competition principle, based for example on parallel development, be introduced; that research-on-research be practiced for effective task setting; that midway evaluation be substantiated, since prior evaluation is difficult; that efforts be made to organize new industries utilizing the fruits of large-scale industrial technology so as to create markets without inducing economic conflicts; and that the transfer of technologies from the private sector to the public sector be enhanced. Studies are made of the review of research conducting systems, the utilization of private-sector research and development capabilities, enlightenment regarding industrial property rights, and the diffusion of large-scale project systems. In this connection, problems are pointed out, requests are submitted, and remedial measures and suggestions are presented. (NEDO)

  18. Research highlights from a large scale residential monitoring study in a hot climate

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Danny S. [Florida Solar Energy Center, Cocoa, FL (United States)

    2003-10-01

    A utility load research project has monitored a large number of residences in Central Florida, collecting detailed end-use data. The monitoring was performed to better estimate the impact of a load control program, as well as to obtain improved appliance energy load profiles. The monitoring measured total electricity use as well as a number of electrical end-uses at 15-min intervals. The measured end-uses included space cooling, heating, water heating, range and cooking, clothes drying, and swimming pool electricity use and demand. The project identified a number of influences on electrical demand that are not commonly described. (Author)

  19. Large Scale Community Detection Using a Small World Model

    Directory of Open Access Journals (Sweden)

    Ranjan Kumar Behera

    2017-11-01

    Full Text Available In a social network, small or large communities within the network play a major role in deciding the functionalities of the network. Despite diverse definitions, communities in a network may be defined as groups of nodes that are more densely connected to each other than to nodes outside the group. Revealing such hidden communities is a challenging research problem. A real-world social network follows the small-world phenomenon, which indicates that any two social entities can be reached in a small number of steps. In this paper, nodes are mapped into communities based on random walks in the network. However, uncovering communities in large-scale networks is a challenging task due to the unprecedented growth in the size of social networks. A good number of community detection algorithms based on random walks exist in the literature, and when large-scale social networks are considered, these algorithms are observed to take considerably more time. In this work, with the objective of improving efficiency, a parallel programming framework like Map-Reduce has been used for uncovering the hidden communities in social networks. The proposed approach has been compared with some standard existing community detection algorithms for both synthetic and real-world datasets in order to examine its performance, and it is observed that the proposed algorithm is more efficient than the existing ones.
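
    As a single-machine illustration of the walk-based idea (the paper's actual contribution is a Map-Reduce parallelization), the Python sketch below assigns each node the most-visited anchor of its short random walks, so nodes whose walks stay in the same dense region receive the same label. It uses networkx and is a crude stand-in, not the paper's algorithm.

        import random
        from collections import Counter
        import networkx as nx

        def walk_labels(G, walk_len=10, walks_per_node=20, seed=0):
            """Label each node by the node its short random walks visit most often."""
            rng = random.Random(seed)
            label = {}
            for v in G:
                visits = Counter()
                for _ in range(walks_per_node):
                    node = v
                    for _ in range(walk_len):
                        node = rng.choice(list(G[node]))  # step to a random neighbor
                        visits[node] += 1
                label[v] = visits.most_common(1)[0][0]    # anchor node of v's region
            return label

        G = nx.karate_club_graph()
        labels = walk_labels(G)
        print(Counter(labels.values()).most_common(3))    # dominant anchors = communities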

  20. A radiation service centre for research and large-scale irradiation

    International Nuclear Information System (INIS)

    Offermann, B.P.; Hofmann, E.G.

    1978-01-01

    In the near future, radiation processing of food may change from the present laboratory scale to large industrial application. This step will require large irradiation facilities with high flexibility, a safe dose control system and simple food-handling systems. Some design parameters of such an irradiation facility have already been realized at the AEG-Telefunken Radiation Service Centre in Wedel, which came into operation in autumn 1976. It is equipped with one research-type high-power X-ray unit (200 kV/32 mA) and one industrial-type electron accelerator (1500 kV/37.5 kW). Handling systems are available for radiation crosslinking of wire and cable insulations and of plastic films, and for irradiation treatment of components and parts of different types and coatings, as well as of sewage sludge and waste water. Some of these handling systems can be used for food irradiation too; other handling systems will be added later. As an additional service, the company's existing material and environmental testing laboratory will be available. The centre is already being used by many interested companies to investigate the effects of radiation on a broad range of organic and inorganic materials, to develop special processing equipment, to process supplied products and to perform R and D work and contracts. The service centre fills an existing gap and will have an impact on the commercialization of radiation processing techniques in Europe. (author)

  1. Results and lessons learned from UMANG program: A large scale community-managed supplementary feeding program in India

    International Nuclear Information System (INIS)

    Chockalingham, David; Gnanaraj, Grana Pu Selvi; Indriani, Esther

    2014-01-01

    A community-managed supplementary feeding program called “UMANG” (Urgent Management & Action for Nutrition Growth) was developed and implemented across 84 ADPs. Through this program a malnourished child receives additional feeding (one full meal and a healthy snack), apart from what is provided at home and through the Government-run Anganwadi Centre (an Indian policy to provide a free mid-day meal to children, though recent reviews show varying degrees of quality and attendance). The UMANG menu meets one third of the daily requirement of children, using locally available low-cost nutritious food provided for a period of 90 days. Through UMANG, mothers were educated and trained in healthy cooking, feeding and caring practices. Between October 2012 and May 2013, as many as 24,154 children were enrolled in UMANG, and 44% had graduated to normal nutritional status at the end of the 90-day program. Review of the program revealed that UMANG increased mothers' knowledge of malnutrition, contributed to the formation of common interest groups and enhanced the co-ordination of the frontline workers in addressing malnutrition. The presentation will highlight lessons learned from the 90-day implementation of this large-scale community-managed supplementary feeding program. (author)

  2. Large-Scale Demonstration of Liquid Hydrogen Storage with Zero Boiloff for In-Space Applications

    Science.gov (United States)

    Hastings, L. J.; Bryant, C. B.; Flachbart, R. H.; Holt, K. A.; Johnson, E.; Hedayat, A.; Hipp, B.; Plachta, D. W.

    2010-01-01

    Cryocooler and passive insulation technology advances have substantially improved prospects for zero-boiloff cryogenic storage. Therefore, a cooperative effort by NASA's Ames Research Center, Glenn Research Center, and Marshall Space Flight Center (MSFC) was implemented to develop zero-boiloff concepts for in-space cryogenic storage. Described herein is one program element - a large-scale, zero-boiloff demonstration using the MSFC multipurpose hydrogen test bed (MHTB). A commercial cryocooler was interfaced with an existing MHTB spray bar mixer and insulation system in a manner that enabled a balance between incoming and extracted thermal energy.

  3. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  4. Can wide consultation help with setting priorities for large-scale biodiversity monitoring programs?

    Directory of Open Access Journals (Sweden)

    Frédéric Boivin

    Full Text Available Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs in respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals having expected interests and knowledge about biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted 1) a monitoring design covering the entire territory and focusing on natural habitats; 2) a focus on species related to ecosystem services, on threatened and on invasive species. The only demographic characteristic that was related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small.

  5. Can wide consultation help with setting priorities for large-scale biodiversity monitoring programs?

    Science.gov (United States)

    Boivin, Frédéric; Simard, Anouk; Peres-Neto, Pedro

    2014-01-01

    Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs in respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals having expected interests and knowledge about biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted 1) a monitoring design covering the entire territory and focusing on natural habitats; 2) a focus on species related to ecosystem services, on threatened and on invasive species. The only demographic characteristic that was related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small.

  6. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  7. Cooperative Severe Accident Research Program of the USNRC and its foreign partners: Program content and principal results

    International Nuclear Information System (INIS)

    Wright, R.W.; Eltawila, F.

    1993-01-01

    The U.S. Nuclear Regulatory Commission (NRC) and its associated foreign partners have been engaged in an extensive Cooperative Severe Accident Research Program. In addition to the NRC, the partners currently include Belgium, the Czech Republic, Canada, Finland, France, Germany, Hungary, Italy, Japan, Korea, the Netherlands, Russia, Spain, Sweden, Switzerland, Taiwan, the United Kingdom, and the Community of European Countries. The purpose of this research is to provide a technical basis for decisions on issues involving potential severe accidents in light water reactor (LWR) power plants. The research includes relatively large-scale integral tests and smaller-scale separate-effects experiments on the dominant phenomena governing severe accident behavior in LWR power plants, the development of phenomenological models of the key phenomena involved, and the development and validation of large computer codes for use in the analysis of core behavior and of LWR system behavior under severe accident conditions. The research results are also used in probabilistic risk assessment for LWRs

  8. Cost analysis of large-scale implementation of the 'Helping Babies Breathe' newborn resuscitation-training program in Tanzania.

    Science.gov (United States)

    Chaudhury, Sumona; Arlington, Lauren; Brenan, Shelby; Kairuki, Allan Kaijunga; Meda, Amunga Robson; Isangula, Kahabi G; Mponzi, Victor; Bishanga, Dunstan; Thomas, Erica; Msemo, Georgina; Azayo, Mary; Molinier, Alice; Nelson, Brett D

    2016-12-01

    Helping Babies Breathe (HBB) has become the gold standard globally for training birth attendants in neonatal resuscitation in low-resource settings, in an effort to reduce early newborn asphyxia and mortality. The purpose of this study was to conduct a first-ever activity-based cost analysis of at-scale HBB program implementation and initial follow-up in a large region of Tanzania, and to evaluate the costs of national scale-up, as one component of a multi-method external evaluation of the implementation of HBB at scale in Tanzania. We used activity-based costing to examine budget expense data during the two-month implementation and follow-up of HBB in one of the target regions. Activity-cost centers included administrative, initial training (including resuscitation equipment), and follow-up training expenses. Sensitivity analysis was used to project cost scenarios for countrywide expansion of the program across all mainland regions of Tanzania and to model the costs of program maintenance over one and five years following initiation. Total costs for the Mbeya Region were $202,240, with the highest proportion due to initial training and equipment (45.2%), followed by central program administration (37.2%) and follow-up visits (17.6%). Within Mbeya, 49 training sessions were undertaken, involving the training of 1,341 health providers from 336 health facilities in eight districts. To similarly expand the HBB program across the 25 regions of mainland Tanzania, the total economic cost is projected to be around $4,000,000 (around $600 per facility). Following sensitivity analyses, the estimated total for initial rollout across all of Tanzania lies between $2,934,793 and $4,309,595. To maintain the program nationally under the current model, it is estimated it would cost $2,019,115 for a further one year and $5,640,794 for a further five years of ongoing program support. HBB implementation is a relatively low-cost intervention with potential for high impact on perinatal
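
    The headline figures can be cross-checked with simple arithmetic; the sketch below (Python) reproduces the roughly $600 per-facility unit cost, with the national facility count inferred from the quoted totals rather than stated in the abstract.

```python
# Back-of-envelope check of the reported HBB cost figures. The totals
# come from the abstract; the per-facility arithmetic is our own.
total_mbeya = 202_240          # USD, Mbeya Region implementation
facilities_mbeya = 336

cost_per_facility = total_mbeya / facilities_mbeya
print(f"Cost per facility: ${cost_per_facility:,.0f}")          # ~$602

# A ~$4,000,000 national rollout at ~$600 per facility implies on the
# order of 6,600 facilities across the 25 mainland regions.
print(f"Implied facilities: {4_000_000 / cost_per_facility:,.0f}")
```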

  9. Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition (Dagstuhl Seminar 17281)

    OpenAIRE

    Zennou, Sarah; Debray, Saumya K.; Dullien, Thomas; Lakhothia, Arun

    2018-01-01

    This report summarizes the program and the outcomes of the Dagstuhl Seminar 17281, entitled "Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition". The seminar brought together practitioners and researchers from industry and academia to discuss the state-of-the-art in the analysis of malware from both a big data perspective and a fine-grained analysis. Obfuscation was also considered. The meeting created new links within this very diverse community.

  10. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  11. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  12. Neurite, a finite difference large scale parallel program for the simulation of electrical signal propagation in neurites under mechanical loading.

    Directory of Open Access Journals (Sweden)

    Julián A García-Grajales

    Full Text Available With the growing body of research on traumatic brain injury and spinal cord injury, computational neuroscience has recently focused its modeling efforts on neuronal functional deficits following mechanical loading. However, in most of these efforts, cell damage is generally only characterized by purely mechanistic criteria, functions of quantities such as stress, strain or their corresponding rates. The modeling of functional deficits in neurites as a consequence of macroscopic mechanical insults has been rarely explored. In particular, a quantitative mechanically based model of electrophysiological impairment in neuronal cells, Neurite, has only very recently been proposed. In this paper, we present the implementation details of this model: a finite difference parallel program for simulating electrical signal propagation along neurites under mechanical loading. Following the application of a macroscopic strain at a given strain rate produced by a mechanical insult, Neurite is able to simulate the resulting neuronal electrical signal propagation, and thus the corresponding functional deficits. The simulation of the coupled mechanical and electrophysiological behaviors requires computationally expensive calculations that increase in complexity as the network of the simulated cells grows. The solvers implemented in Neurite (explicit and implicit) were therefore parallelized using graphics processing units in order to reduce the simulation costs of large-scale scenarios. Cable Theory and Hodgkin-Huxley models were implemented to account for the electrophysiological passive and active regions of a neurite, respectively, whereas a coupled mechanical model accounting for the neurite mechanical behavior within its surrounding medium was adopted as a link between electrophysiology and mechanics. This paper provides the details of the parallel implementation of Neurite, along with three different application examples: a long myelinated axon
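
    To give a flavor of the finite-difference family Neurite belongs to, the sketch below advances the passive cable equation explicitly in Python. It is a toy illustration under simplified assumptions (no Hodgkin-Huxley channels, no mechanical coupling, arbitrary parameter values), not the program's actual solver.

```python
import numpy as np

# Explicit finite-difference step for the passive cable equation,
# tau * dV/dt = lambda^2 * d2V/dx2 - V.
def passive_cable_step(V, dx, dt, lam=0.1, tau=1.0):
    d2V = np.zeros_like(V)
    d2V[1:-1] = (V[2:] - 2.0 * V[1:-1] + V[:-2]) / dx**2
    return V + (dt / tau) * (lam**2 * d2V - V)

# The explicit scheme is only stable for roughly
# dt <= tau * dx^2 / (2 * lam^2), one motivation for also having an
# implicit solver when the spatial grid is fine.
V = np.zeros(200)
for _ in range(1000):
    V[0] = 100.0                      # voltage clamp at the left end
    V = passive_cable_step(V, dx=0.01, dt=2e-5)
```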

  13. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available The paper investigates top-flange fatigue damage in a large-scale wind turbine generator. A finite element model of the top-flange connection system is established with the finite element analysis software MSC.Marc/Mentat and its fatigue strain is analyzed; load simulation of the flange fatigue working condition is performed with the Bladed software; the flange fatigue load spectrum is acquired with the rain-flow counting method; finally, the fatigue analysis of the top flange is completed with the fatigue analysis software MSC.Fatigue and Palmgren-Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis for large-scale wind turbine generators and have practical engineering value.
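
    The damage-accumulation step named in the abstract (rain-flow counting followed by Palmgren-Miner summation) is easy to state concretely. The sketch below is a generic Python illustration; the S-N curve constants and cycle counts are placeholders, not data from the paper.

```python
# Palmgren-Miner linear cumulative damage over a fatigue load spectrum.
# In the workflow above, the (stress range, cycle count) pairs would
# come from rain-flow counting of the Bladed load time histories.
def miner_damage(spectrum, C=1e12, m=3.0):
    """spectrum: iterable of (stress_range_MPa, applied_cycles) pairs."""
    damage = 0.0
    for s, n in spectrum:
        N_allowable = C / s**m        # Basquin-type S-N curve: N = C * S^(-m)
        damage += n / N_allowable     # Miner's rule: D = sum(n_i / N_i)
    return damage                     # failure is predicted when D >= 1

print(miner_damage([(80.0, 2e5), (120.0, 5e4), (200.0, 1e3)]))  # ~0.20
```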

  14. Contribution of Large Animals to Translational Research on Prenatal Programming of Obesity and Associated Diseases.

    Science.gov (United States)

    Gonzalez-Bulnes, Antonio; Chavatte-Palmer, Pascale

    2017-01-01

    Awareness of the factors causing obesity and associated disorders has grown in recent years, shifting from the genome to a more complicated concept (developmental programming) in which prenatal and early-postnatal conditions markedly modify the phenotype and homeostasis of individuals and determine juvenile growth, lifetime fitness/obesity, and disease risks. Experimentation in human beings is impeded by ethical issues plus inherent high variability and confounding factors (genetic, lifestyle, and socioeconomic heterogeneity), and preclinical studies in adequate translational animal models are therefore decisive. Most studies have been performed in rodents, whilst the use of large animals is scarce. Considering body size, ease of handling, and cost-efficiency, the main large animal species for use in biomedical research are rabbits, sheep, and swine; the choice of model depends on the research objectives. The aim is to outline the main features of the use of rabbits, sheep and swine and their contributions as translational models in prenatal programming of obesity and associated disorders. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  15. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    March 2011. Assessments of Selected Large-Scale Projects: NASA projects ranging from probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the

  16. Research on precision grinding technology of large scale and ultra thin optics

    Science.gov (United States)

    Zhou, Lian; Wei, Qiancai; Li, Jie; Chen, Xianhua; Zhang, Qinghua

    2018-03-01

    The flatness and parallelism errors of large-scale and ultra-thin optics have an important influence on subsequent polishing efficiency and accuracy. In order to realize high-precision grinding of these ductile elements, a low-deformation vacuum chuck was designed first, which was used for clamping the optics with high supporting rigidity over the full aperture. The optics was then plane-grinded under vacuum adsorption. After machining, the vacuum system was turned off, and the form error of the optics was measured on-machine using a displacement sensor after elastic restitution. The flatness was converged to high accuracy by compensation machining, whose trajectories were generated from the measurement result. To obtain high parallelism, the optics was turned over and compensation-grinded using the form error of the vacuum chuck. Finally, a grinding experiment on a large-scale and ultra-thin fused silica optic with an aperture of 430 mm × 430 mm × 10 mm was performed. The best P-V flatness of the optic was below 3 μm, and the parallelism was below 3″. This machining technique has been applied in batch grinding of large-scale and ultra-thin optics.
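
    The measure-and-compensate loop described above reduces to a few lines of arithmetic; the Python sketch below uses a synthetic height map in place of the on-machine sensor data (grid size and units are illustrative).

```python
import numpy as np

# Synthetic on-machine form-error map (micrometres) standing in for
# the displacement-sensor measurement of the ground surface.
height = np.random.default_rng(0).normal(0.0, 1.0, (43, 43))

pv_flatness = height.max() - height.min()     # peak-to-valley flatness
# Compensation pass: remove more material where the surface is high,
# so the commanded depth per point is the error above the lowest point.
compensation_depth = height - height.min()
print(f"P-V flatness before compensation: {pv_flatness:.2f} um")
```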

  17. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large-scale measurement channel allows the processing of the signal coming from a single neutron sensor in three different operating modes: pulse, fluctuation, and current. The study described in this note includes three parts: - A theoretical study and brief description of the large-scale channel are given, together with the results obtained so far in that domain. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale switching is described and the test results are given. In this large-scale channel, the data processing is analog. - To avoid the problems generated by analog processing of the fluctuation signal, a digital data-processing method is tested and its validity demonstrated. The results obtained on a test system built according to this method are given and a preliminary plan for further research is defined [fr

  18. NASA Land Cover and Land Use Change (LCLUC): an interdisciplinary research program.

    Science.gov (United States)

    Justice, Chris; Gutman, Garik; Vadrevu, Krishna Prasad

    2015-01-15

    Understanding Land Cover/Land Use Change (LCLUC) in diverse regions of the world and at varied spatial scales is one of the important challenges in global change research. In this article, we provide a brief overview of the NASA LCLUC program, its focus areas, and the importance of satellite remote sensing observations in LCLUC research including future directions. The LCLUC Program was designed to be a cross-cutting theme within NASA's Earth Science program. The program aims to develop and use remote sensing technologies to improve understanding of human interactions with the environment. Since 1997, the NASA LCLUC program has supported nearly 280 research projects on diverse topics such as forest loss and carbon, urban expansion, land abandonment, wetland loss, agricultural land use change and land use change in mountain systems. The NASA LCLUC program emphasizes studies where land-use changes are rapid or where there are significant regional or global LCLUC implications. Over a period of years, the LCLUC program has contributed to large regional science programs such as the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA), the Northern Eurasia Earth Science Partnership Initiative (NEESPI), and the Monsoon Asia Integrated Regional Study (MAIRS). The primary emphasis of the program will remain on using remote sensing datasets for LCLUC research. The program will continue to emphasize integration of physical and social sciences to address regional to global scale issues of LCLUC for the benefit of society. Copyright © 2014. Published by Elsevier Ltd.

  19. Studies of land-cover, land-use, and biophysical properties of vegetation in the Large Scale Biosphere Atmosphere experiment in Amazonia.

    Science.gov (United States)

    Dar A. Roberts; Michael Keller; Joao Vianei Soares

    2003-01-01

    We summarize early research on land-cover, land-use, and biophysical properties of vegetation from the Large Scale Biosphere Atmosphere (LBA) experiment in Amazônia. LBA is an international research program developed to evaluate regional function and to determine how land-use and climate modify biological, chemical and physical processes there. Remote sensing has...

  20. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  1. Research into condensed matter using large-scale apparatus. Physics, chemistry, biology. Progress report 1992-1995. Summarizing reports

    International Nuclear Information System (INIS)

    1996-01-01

    Activities for research into condensed matter were supported by the German BMBF with approximately 102 million Deutschmarks in the years 1992 through 1995. These funds were distributed among 314 research projects in physics, chemistry, biology, materials science, and other fields, all of which rely on the intensive utilization of photon and particle beams generated in large-scale apparatus of institutions for basic research. The volume at hand first gives general information and statistical data on the distribution of funds among a number of priority research projects. The project reports summarize the progress achieved in the various projects. (CB) [de

  2. Programmed Control of Optical Grating Scales for Visual Research.

    Science.gov (United States)

    1980-12-01

    Air Force Institute of Technology, Wright-Patterson AFB, OH. Programmed Control of Optical Grating Scales for Visual Research ... a custom system for AMRL. The cost in memory parts alone was $40,000, a good indication that the market is not over-priced. Case Western Reserve

  3. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; the research and development of the new electronic custody transfer method

  4. Large Scale GW Calculations on the Cori System

    Science.gov (United States)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  5. Galaxies distribution in the universe: large-scale statistics and structures

    International Nuclear Information System (INIS)

    Maurogordato, Sophie

    1988-01-01

    This research thesis addresses the distribution of galaxies in the Universe, and more particularly large-scale statistics and structures. Based on an assessment of the main statistical techniques in use, the author outlines the need to develop tools complementary to correlation functions in order to characterise the distribution. She introduces a new indicator: the probability that a volume randomly placed in the distribution is empty. This allows a characterisation of void properties at the scales studied (up to 10 h⁻¹ Mpc) in the Harvard-Smithsonian Center for Astrophysics Redshift Survey, or CfA catalog. A systematic analysis of the statistical properties of different sub-samples has then been performed with respect to size and location, luminosity class, and morphological type. This analysis is then extended to different scenarios of structure formation. A program of radial velocity measurements allows the determination of possible relationships between apparent structures. The author also presents results of the search for southern extensions of the Perseus supercluster [fr
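
    For reference, the indicator introduced in the thesis is what is now usually called the void probability function; in standard notation (ours, not necessarily the thesis's):

```latex
P_0(V) = \mathrm{Prob}\left\{ N(V) = 0 \right\},
\qquad
P_0^{\mathrm{Poisson}}(V) = e^{-\bar{n} V},
```

    where N(V) is the galaxy count in a randomly placed volume V and n̄ is the mean number density; departures of P₀ from the Poisson expectation capture clustering information beyond the two-point correlation function.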

  6. An inertia-free filter line-search algorithm for large-scale nonlinear programming

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Nai-Yuan; Zavala, Victor M.

    2016-02-15

    We present a filter line-search algorithm that does not require inertia information of the linear system. This feature enables the use of a wide range of linear algebra strategies and libraries, which is essential to tackle large-scale problems on modern computing architectures. The proposed approach performs curvature tests along the search step to detect negative curvature and to trigger convexification. We prove that the approach is globally convergent and we implement the approach within a parallel interior-point framework to solve large-scale and highly nonlinear problems. Our numerical tests demonstrate that the inertia-free approach is as efficient as inertia detection via symmetric indefinite factorizations. We also demonstrate that the inertia-free approach can lead to reductions in solution time because it reduces the amount of convexification needed.
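
    In schematic form (our notation; the paper's constants and safeguards differ in detail), the curvature test accepts the computed search step d_k only if

```latex
d_k^{\top} W_k \, d_k \;\ge\; \alpha \, \lVert d_k \rVert^2 , \qquad \alpha > 0,
```

    where W_k is the Hessian of the Lagrangian; otherwise the algorithm convexifies, W_k ← W_k + δI with δ increased until the test passes, so that no inertia computation of the KKT matrix is required.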

  7. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-Class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ∼10⁶ cores and sustained performance over ∼2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios. (paper)

  8. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  9. Superconducting magnet and conductor research activities in the US fusion program

    International Nuclear Information System (INIS)

    Michael, P.C.; Schultz, J.H.; Antaya, T.A.; Ballinger, R.; Chiesa, L.; Feng, J.; Gung, C.-Y.; Harris, D.; Kim, J.-H.; Lee, P.; Martovetsky, N.; Minervini, J.V.; Radovinsky, A.; Salvetti, M.; Takayasu, M.; Titus, P.

    2006-01-01

    Fusion research in the United States is sponsored by the Department of Energy's Office of Fusion Energy Sciences (OFES). The OFES sponsors a wide range of programs to advance fusion science, fusion technology, and basic plasma science. Most experimental devices in the US fusion program are constructed using conventional technologies; however, a small portion of the fusion research program is directed towards large scale commercial power generation, which typically relies on superconductor technology to facilitate steady-state operation with high fusion power gain, Q. The superconductor portion of the US fusion research program is limited to a small number of laboratories including the Plasma Science and Fusion Center at MIT, Lawrence Livermore National Laboratory (LLNL), and the Applied Superconductivity Center at University of Wisconsin, Madison. Although Brookhaven National Laboratory (BNL) and Lawrence Berkeley National Laboratory (LBNL) are primarily sponsored by the US's High Energy Physics program, both have made significant contributions to advance the superconductor technology needed for the US fusion program. This paper summarizes recent superconductor activities in the US fusion program

  10. On the scaling features of high-latitude geomagnetic field fluctuations during a large geomagnetic storm

    Science.gov (United States)

    De Michelis, Paola; Federica Marcucci, Maria; Consolini, Giuseppe

    2015-04-01

    Recently we have investigated the spatial distribution of the scaling features of short-time scale magnetic field fluctuations using measurements from several ground-based geomagnetic observatories distributed in the northern hemisphere. We have found that the scaling features of fluctuations of the horizontal magnetic field component at time scales below 100 minutes are correlated with the geomagnetic activity level and with changes in the currents flowing in the ionosphere. Here, we present a detailed analysis of the dynamical changes of the magnetic field scaling features as a function of the geomagnetic activity level during the well-known large geomagnetic storm that occurred on July 15, 2000 (the Bastille event). The observed dynamical changes are discussed in relationship with the changes of the overall ionospheric polar convection and potential structure as reconstructed using SuperDARN data. This work is supported by the Italian National Program for Antarctic Research (PNRA) - Research Project 2013/AC3.08 and by the European Community's Seventh Framework Programme ([FP7/2007-2013]) under Grant no. 313038/STORM.

  11. Research Guidelines in the Era of Large-scale Collaborations: An Analysis of Genome-wide Association Study Consortia

    Science.gov (United States)

    Austin, Melissa A.; Hair, Marilyn S.; Fullerton, Stephanie M.

    2012-01-01

    Scientific research has shifted from studies conducted by single investigators to the creation of large consortia. Genetic epidemiologists, for example, now collaborate extensively for genome-wide association studies (GWAS). The effect has been a stream of confirmed disease-gene associations. However, effects on human subjects oversight, data-sharing, publication and authorship practices, research organization and productivity, and intellectual property remain to be examined. The aim of this analysis was to identify all research consortia that had published the results of a GWAS analysis since 2005, characterize them, determine which have publicly accessible guidelines for research practices, and summarize the policies in these guidelines. A review of the National Human Genome Research Institute’s Catalog of Published Genome-Wide Association Studies identified 55 GWAS consortia as of April 1, 2011. These consortia were comprised of individual investigators, research centers, studies, or other consortia and studied 48 different diseases or traits. Only 14 (25%) were found to have publicly accessible research guidelines on consortia websites. The available guidelines provide information on organization, governance, and research protocols; half address institutional review board approval. Details of publication, authorship, data-sharing, and intellectual property vary considerably. Wider access to consortia guidelines is needed to establish appropriate research standards with broad applicability to emerging forms of large-scale collaboration. PMID:22491085

  12. Evaluating the potential for large-scale fracturing at a disposal vault: an example using the underground research laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Martin, C D; Chandler, N A; Brown, Anton

    1994-09-01

    The potential for large-scale fracturing (> 10 m²) around a nuclear fuel waste disposal vault is investigated in this report. The disposal vault is assumed to be located at a depth of 500 m in the plutonic rocks of the Canadian Shield. The rock mass surrounding the disposal vault is considered to have similar mechanical properties and in situ stress conditions to that found at a depth of 420 m at the Underground Research Laboratory. Theoretical, experimental and field evidence shows that Mode I fractures propagate in a plane perpendicular to σ₃ and only if the tensile stress at the tip of the advancing crack is sufficient to overcome the tensile strength of the rock. Because the stress state at a depth of 500 m or more is compressive, and will very probably stay so during the 10,000 year life of the disposal vault, there does not appear to be any mechanism which could propagate large-scale Mode I fracturing in the rock mass surrounding the vault. In addition because σ₃ is near vertical any Mode I fracture propagation that might occur would be in a horizontal plane. The development of either Mode I or large-scale shear fractures would require a drastic change in the compressive in situ stress state at the depth of the disposal vault. The stresses developed as a result of both thermal and glacial loading do not appear sufficient to cause new fracturing. Glacial loading would reduce the shear stresses in the rock mass and hence improve the stability of the rock mass surrounding the vault. Thus, it is not feasible that large-scale fracturing would occur over the 10,000 year life of a disposal vault in the Canadian Shield, at depths of 500 m or greater, where the compressive stress state is similar to that found at the Underground Research Laboratory. 107 refs., 44 figs.

  13. Evaluating the potential for large-scale fracturing at a disposal vault: an example using the underground research laboratory

    International Nuclear Information System (INIS)

    Martin, C.D.; Chandler, N.A.; Brown, Anton.

    1994-09-01

    The potential for large-scale fracturing (> 10 m²) around a nuclear fuel waste disposal vault is investigated in this report. The disposal vault is assumed to be located at a depth of 500 m in the plutonic rocks of the Canadian Shield. The rock mass surrounding the disposal vault is considered to have similar mechanical properties and in situ stress conditions to that found at a depth of 420 m at the Underground Research Laboratory. Theoretical, experimental and field evidence shows that Mode I fractures propagate in a plane perpendicular to σ₃ and only if the tensile stress at the tip of the advancing crack is sufficient to overcome the tensile strength of the rock. Because the stress state at a depth of 500 m or more is compressive, and will very probably stay so during the 10,000 year life of the disposal vault, there does not appear to be any mechanism which could propagate large-scale Mode I fracturing in the rock mass surrounding the vault. In addition because σ₃ is near vertical any Mode I fracture propagation that might occur would be in a horizontal plane. The development of either Mode I or large-scale shear fractures would require a drastic change in the compressive in situ stress state at the depth of the disposal vault. The stresses developed as a result of both thermal and glacial loading do not appear sufficient to cause new fracturing. Glacial loading would reduce the shear stresses in the rock mass and hence improve the stability of the rock mass surrounding the vault. Thus, it is not feasible that large-scale fracturing would occur over the 10,000 year life of a disposal vault in the Canadian Shield, at depths of 500 m or greater, where the compressive stress state is similar to that found at the Underground Research Laboratory. 107 refs., 44 figs
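
    The Mode I condition invoked in both versions of the record corresponds to the textbook fracture criterion (standard notation, not quoted from the report):

```latex
K_I = Y \, \sigma \, \sqrt{\pi a} \;\ge\; K_{Ic},
```

    where a is the crack length, Y a geometry factor, σ the tension normal to the crack plane, and K_Ic the fracture toughness; because the in situ stress state at 500 m depth is compressive, the crack-tip tension needed to satisfy the criterion perpendicular to σ₃ is unavailable, which is the report's argument against large-scale Mode I fracturing.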

  14. Pressure Scalings and Influence Region Research

    Science.gov (United States)

    2018-01-01

    AFRL-RQ-WP-TP-2018-0015. Pressure Scalings and Influence Region Research. James H. Miller, High Speed Systems Division, Air Force Research Laboratory, Aerospace Systems Directorate, Wright-Patterson Air Force Base, OH. In-house research; program element 62201F.

  15. Side effects of problem-solving strategies in large-scale nutrition science: towards a diversification of health.

    Science.gov (United States)

    Penders, Bart; Vos, Rein; Horstman, Klasien

    2009-11-01

    Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.

  16. Factors Affecting the Rate of Penetration of Large-Scale Electricity Technologies: The Case of Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    James R. McFarland; Howard J. Herzog

    2007-05-14

    This project falls under the Technology Innovation and Diffusion topic of the Integrated Assessment of Climate Change Research Program. The objective was to better understand the critical variables that affect the rate of penetration of large-scale electricity technologies in order to improve their representation in integrated assessment models. We conducted this research in six integrated tasks. In our first two tasks, we identified potential factors that affect penetration rates through discussions with modeling groups and through case studies of historical precedent. In the next three tasks, we investigated in detail three potential sets of critical factors: industrial conditions, resource conditions, and regulatory/environmental considerations. Research to assess the significance and relative importance of these factors involved the development of a microeconomic, system dynamics model of the US electric power sector. Finally, we implemented the penetration rate models in an integrated assessment model. While the focus of this effort is on carbon capture and sequestration technologies, much of the work will be applicable to other large-scale energy conversion technologies.
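
    Integrated assessment models commonly represent such penetration with a logistic (S-curve) market-share law; the Python sketch below illustrates that family in a generic way (parameter values are arbitrary, and the project's system-dynamics model is far richer).

```python
import math

# Logistic technology-diffusion curve: share(t) = 1 / (1 + exp(-r(t - t0))),
# where r controls how fast the technology penetrates and t0 is the
# mid-adoption year. Both values here are purely illustrative.
def market_share(t, r=0.3, t0=2035):
    return 1.0 / (1.0 + math.exp(-r * (t - t0)))

for year in (2025, 2035, 2050):
    print(year, f"{market_share(year):.2f}")   # ~0.05, 0.50, 0.99
```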

  17. The relevance of large scale environmental research infrastructures from the point of view of Ethics: the case of EMSO

    Science.gov (United States)

    Favali, Paolo; Beranzoli, Laura; Best, Mairi; Franceschini, PierLuigi; Materia, Paola; Peppoloni, Silvia; Picard, John

    2014-05-01

    EMSO (European Multidisciplinary Seafloor and Water Column Observatory) is a large-scale European Research Infrastructure (RI). It is a geographically distributed infrastructure composed of several deep-seafloor and water-column observatories, which will be deployed at key sites in European waters, spanning from the Arctic, through the Atlantic and Mediterranean, to the Black Sea, with the basic scientific objective of real-time, long-term monitoring of environmental processes related to the interaction between the geosphere, biosphere and hydrosphere. EMSO is one of the environmental RIs on the ESFRI roadmap. The ESFRI Roadmap identifies new RIs of pan-European importance that correspond to the long-term needs of European research communities. EMSO will be the sub-sea segment of the EU's large-scale Earth Observation program, Copernicus (previously known as GMES - Global Monitoring for Environment and Security) and will significantly enhance the observational capabilities of European member states. An open data policy compliant with the recommendations being developed within the GEOSS initiative (Global Earth Observation System of Systems) will allow for shared use of the infrastructure and the exchange of scientific information and knowledge. The processes that occur in the oceans have a direct impact on human societies, therefore it is crucial to improve our understanding of how they operate and interact. To encompass the breadth of these major processes, sustained and integrated observations are required that appreciate the interconnectedness of atmospheric, surface ocean, biological pump, deep-sea, and solid-Earth dynamics and that can address: • natural and anthropogenic change; • interactions between ecosystem services, biodiversity, biogeochemistry, physics, and climate; • impacts of exploration and extraction of energy, minerals, and living resources; • geo-hazard early warning capability for earthquakes, tsunamis, gas-hydrate release, and slope

  18. Obtaining large-scale funding for empowerment-oriented qualitative research: a report from personal experience.

    Science.gov (United States)

    Padgett, Deborah K; Henwood, Benjamin F

    2009-06-01

    Obtaining funding for qualitative research remains a challenge despite greater openness to methodological pluralism. Such hurdles are presumably compounded when the proposed study employs empowerment theory, rendering it susceptible to charges of elevating ideology over rigor. This article draws on the authors' experience in securing large-scale funding for an empowerment-oriented qualitative study of homeless mentally ill adults. Lessons learned include the importance of weaving empowerment theory into the proposal's "argument," and infusing empowerment values into study protocols while simultaneously paying close attention to rigorous and transparent methods. Additional benefits accrue from having prior relationships with study sites and being willing to revise and resubmit proposals whenever possible. Though representing a fraction of all externally funded projects in the United States, qualitative research has tremendous untapped potential for success in this competitive arena: success that need not entail surrendering a commitment to empowerment values.

  19. Programs | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Our development programs support innovative solutions that improve global ... Invest in knowledge and innovation for large-scale positive change; Build the ...

  20. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, large-scale networks have increased in number, complexity, and size. The best-known example of a large-scale network is the Internet; more recent examples are data centers in cloud environments. In this context, management tasks such as traffic monitoring, security, and performance optimization place a heavy burden on the network administrator. This research studies different protocols, i.e., conventional protocols like the Simple Network Management Protocol and newer Gossip-bas...

  1. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  2. A large scale test of the gaming-enhancement hypothesis

    Directory of Open Access Journals (Sweden)

    Andrew K. Przybylski

    2016-11-01

    Full Text Available A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.

  3. A large scale test of the gaming-enhancement hypothesis.

    Science.gov (United States)

    Przybylski, Andrew K; Wang, John C

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
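
    A common, simple way to obtain the Bayes factors used in such comparisons, not necessarily the authors' exact procedure, is the BIC approximation (Wagenmakers, 2007), sketched below in Python with hypothetical model fits.

```python
import math

# BIC approximation to the Bayes factor: BF01 ~ exp((BIC_alt - BIC_null) / 2).
# Values greater than 1 favor the null hypothesis.
def bf01_from_bic(bic_null: float, bic_alt: float) -> float:
    return math.exp((bic_alt - bic_null) / 2.0)

# Hypothetical BICs for reasoning-score regressions with and without a
# gaming-experience predictor (illustrative numbers only).
print(f"BF01 = {bf01_from_bic(bic_null=10421.3, bic_alt=10427.8):.1f}")  # ~25.8
```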

  4. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  5. Accomplishments of LOCA/ECCS experimental research at Japan Atomic Energy Research Institute

    International Nuclear Information System (INIS)

    Tasaka, Kanji; Murao, Yoshio; Koizumi, Yasuo

    1984-01-01

    The Japan Atomic Energy Research Institute has investigated loss-of-coolant accident (LOCA)/emergency core cooling system (ECCS) behavior since 1970. Major results of the LOCA/ECCS research are summarized in this report. The ROSA-II program was LOCA/ECCS research for a pressurized water reactor (PWR) and the ROSA-III program was for a boiling water reactor (BWR). Both test facilities were scaled at approximately 1/400 of the respective reference PWR and BWR. The large-scale reflood test addresses reflood phenomena during a large-break LOCA of a PWR. Its test facility is scaled at approximately 1/20 of the reference PWR and the research is still being continued. (author)

  6. Bringing the Incredible Years® Programs to Scale

    Science.gov (United States)

    Webster-Stratton, Carolyn; McCoy, Kathleen P.

    2015-01-01

    The Incredible Years® (IY) program series is a set of interlocking and comprehensive training programs for parents, teachers, and children. This article briefly reviews the theoretical foundations, goals, and research underlying these programs. The main purpose of the paper is to describe how the IY programs have been scaled up slowly and…

  7. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    The balancing problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
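
    A minimal sketch of the decomposition pattern described above is price-based coordination: under assumed quadratic unit costs (our simplification, not the paper's formulation), each unit solves its own subproblem for a posted price and the aggregator adjusts the price until the units' responses balance demand.

```python
# Dual/price coordination toy example. Quadratic local costs
# c_i(p) = 0.5 * a_i * p^2 give each unit's best response p_i = lam / a_i.
def decentralized_balance(a, demand, step=0.5, iters=200):
    lam = 0.0
    for _ in range(iters):
        p = [lam / ai for ai in a]        # each unit solves its subproblem
        lam -= step * (sum(p) - demand)   # aggregator updates the price
    return p, lam

p, lam = decentralized_balance(a=[1.0, 2.0, 4.0], demand=7.0)
print([round(x, 2) for x in p], round(lam, 2))  # [4.0, 2.0, 1.0] at price 4.0
```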

  8. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  9. Development of a pharmacy student research program at a large academic medical center.

    Science.gov (United States)

    McLaughlin, Milena M; Skoglund, Erik; Bergman, Scott; Scheetz, Marc H

    2015-11-01

    A program to promote research by pharmacy students created through the collaboration of an academic medical center and a college of pharmacy is described. In 2009, Midwestern University Chicago College of Pharmacy and Northwestern Memorial Hospital (NMH) expanded their existing partnership by establishing a program to increase opportunities for pharmacy students to conduct clinical-translational research. All professional year 1, 2, or 3 students at the college, as well as professional year 4 students on rotation at NMH, can participate in the program. Central to the program's infrastructure is the mentorship of student leads by faculty- and hospital-based pharmacists. The mentors oversee the student research projects and guide development of poster presentations; student leads mentor junior students and assist with orientation and training activities. Publication of research findings in the peer-reviewed literature is a key program goal. In the first four years after program implementation, participation in a summer research program grew nearly 10-fold (mainly among incoming professional year 2 or 3 students), and student poster presentations at national pharmacy meetings increased nearly 20-fold; the number of published research articles involving student authors increased from zero in 2009 to three in 2012 and two in 2013. A collaborative program between an academic medical center and a college of pharmacy has enabled pharmacy students to conduct research at the medical center and has been associated with increases in the numbers of poster presentations and publications involving students. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  10. Scaling Research Results: Design and Evaluation | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Design and evaluation The project will provide helpful guidance to IDRC management and ... scaling and programming for scalable research Offer the monograph in multiple forms, ...

  11. GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Wickramaarachchi, Charith; Nagarkar, Soonil; Ravi, Santosh; Raghavendra, Cauligi; Prasanna, Viktor

    2014-08-25

    Large scale graph processing is a major research area for Big Data exploration. Vertex centric programming models like Pregel are gaining traction due to their simple abstraction that naturally allows for scalable execution on distributed systems. However, there are limitations to this approach which cause vertex centric algorithms to under-perform due to a poor compute-to-communication ratio and slow convergence over iterative supersteps. In this paper we introduce GoFFish, a scalable sub-graph centric framework co-designed with a distributed persistent graph storage for large scale graph analytics on commodity clusters. We introduce a sub-graph centric programming abstraction that combines the scalability of a vertex centric approach with the flexibility of shared memory sub-graph computation. We map Connected Components, SSSP and PageRank algorithms to this model to illustrate its flexibility. Further, we empirically analyze GoFFish using several real world graphs and demonstrate its significant performance improvement, orders of magnitude in some cases, compared to Apache Giraph, the leading open source vertex centric implementation.
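
    The advantage of the sub-graph centric abstraction can be sketched for connected components: each partition floods minimum labels to local convergence inside a single superstep, so only boundary vertices communicate between supersteps. The data layout and function names below are assumptions for illustration and do not reflect the actual GoFFish API.

```python
# Minimal sketch (assumed semantics, not the GoFFish API) of sub-graph
# centric connected components: each partition runs local min-label
# propagation to convergence per superstep; only cut edges exchange
# labels between supersteps, so far fewer supersteps are needed than
# in a purely vertex-centric execution.
def subgraph_centric_cc(partitions, cut_edges):
    """partitions: list of adjacency dicts {u: neighbors in the same
    partition}; cut_edges: (u, v) pairs crossing partitions."""
    labels = {u: u for part in partitions for u in part}
    changed = True
    while changed:                          # one outer pass ~ one superstep
        changed = False
        for part in partitions:             # local compute, per sub-graph
            stable = False
            while not stable:               # flood minimum label locally
                stable = True
                for u, nbrs in part.items():
                    m = min([labels[u]] + [labels[v] for v in nbrs])
                    if m < labels[u]:
                        labels[u], stable, changed = m, False, True
        for u, v in cut_edges:              # message exchange across partitions
            m = min(labels[u], labels[v])
            if labels[u] != m or labels[v] != m:
                labels[u] = labels[v] = m
                changed = True
    return labels

# e.g. two partitions {0-1-2} and {3-4}, joined by the cut edge (2, 3)
parts = [{0: [1], 1: [0, 2], 2: [1]}, {3: [4], 4: [3]}]
print(subgraph_centric_cc(parts, [(2, 3)]))  # every vertex gets label 0
```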

  12. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
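
    As a concrete illustration of the secant idea evaluated in the report, a minimal dense Broyden's method fits in a few lines; the identity initialization and the test system are illustrative choices, and a production solver of the kind described would add globalization safeguards and limited-memory updates.

```python
import numpy as np

# Minimal Broyden's ("good") method: the Jacobian is replaced by a
# secant approximation B updated from (step, residual-change) pairs,
# so F never needs an analytic or finite-difference Jacobian.
def broyden(F, x0, tol=1e-10, max_iter=100):
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                  # illustrative identity start
    Fx = F(x)
    for _ in range(max_iter):
        s = np.linalg.solve(B, -Fx)     # quasi-Newton step: B s = -F(x)
        x = x + s
        F_new = F(x)
        y = F_new - Fx
        # rank-one secant update: B += (y - B s) s^T / (s^T s)
        B += np.outer(y - B @ s, s) / (s @ s)
        Fx = F_new
        if np.linalg.norm(Fx) < tol:
            break
    return x

# mildly nonlinear test system: x0 = cos(x1), x1 = 0.5*cos(x0)
sol = broyden(lambda x: np.array([x[0] - np.cos(x[1]),
                                  x[1] - 0.5 * np.cos(x[0])]), [0.0, 0.0])
```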

  13. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components, in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped in different categories including (i) CAD-to-PART software, (ii) selection of energy source, (iii

  14. Resolving the Circumstellar Environment of the Galactic B[e] Supergiant Star MWC 137 from Large to Small Scales

    Science.gov (United States)

    Kraus, Michaela; Liimets, Tiina; Cappa, Cristina E.; Cidale, Lydia S.; Nickeler, Dieter H.; Duronea, Nicolas U.; Arias, Maria L.; Gunawan, Diah S.; Oksala, Mary E.; Borges Fernandes, Marcelo; Maravelias, Grigoris; Curé, Michel; Santander-García, Miguel

    2017-11-01

    The Galactic object MWC 137 has been suggested to belong to the group of B[e] supergiants. However, with its large-scale optical bipolar ring nebula and high-velocity jet and knots, it is a rather atypical representative of this class. We performed multiwavelength observations spanning from the optical to the radio regime. Based on optical imaging and long-slit spectroscopic data, we found that the northern parts of the large-scale nebula are predominantly blueshifted, while the southern regions appear mostly redshifted. We developed a geometrical model consisting of two double cones. Although various observational features can be approximated with such a scenario, the observed velocity pattern is more complex. Using near-infrared integral-field unit spectroscopy, we studied the hot molecular gas in the vicinity of the star. The emission from the hot CO gas arises in a small-scale disk revolving around the star on Keplerian orbits. Although the disk itself cannot be spatially resolved, its emission is reflected by the dust arranged in arc-like structures and the clumps surrounding MWC 137 on small scales. In the radio regime, we mapped the cold molecular gas in the outskirts of the optical nebula. We found that large amounts of cool molecular gas and warm dust embrace the optical nebula in the east, south, and west. No cold gas or dust was detected in the north and northwestern regions. Despite the new insights into the nebula kinematics gained from our studies, the real formation scenario of the large-scale nebula remains an open issue. Based on observations collected with (1) the ESO VLT Paranal Observatory under programs 094.D-0637(B) and 097.D-0033(A), (2) the MPG 2.2 m telescope at La Silla Observatory, Chile, under programs 096.A-9030(A) and 096.A-9039(A), (3) the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the NSF on behalf of the Gemini partnership: the

  15. A Novel Large-scale Mentoring Program for Medical Students based on a Quantitative and Qualitative Needs Analysis

    Science.gov (United States)

    von der Borch, Philip; Dimitriadis, Konstantinos; Störmann, Sylvère; Meinel, Felix G.; Moder, Stefan; Reincke, Martin; Tekian, Ara; Fischer, Martin R.

    2011-01-01

    Purpose: Mentoring plays an important role in students' performance and career. The authors of this study assessed the need for mentoring among medical students and established a novel large-scale mentoring program at Ludwig-Maximilians-University (LMU) Munich School of Medicine. Methods: Needs assessment was conducted using a survey distributed to all students at the medical school (n=578 of 4,109 students, return rate 14.1%). In addition, the authors held focus groups with selected medical students (n=24) and faculty physicians (n=22). All students signing up for the individual mentoring completed a survey addressing their expectations (n=534). Results: Needs assessment revealed that 83% of medical students expressed overall satisfaction with the teaching at LMU. In contrast, only 36.5% were satisfied with how the faculty supports their individual professional development and 86% of students voiced a desire for more personal and professional support. When asked to define the role of a mentor, 55.6% "very much" wanted their mentors to act as counselors, arrange contacts for them (36.4%), and provide ideas for professional development (28.1%). Topics that future mentees "very much" wished to discuss included research (56.6%), final year electives (55.8%) and experiences abroad (45.5%). Conclusions: Based on the strong desire for mentoring among medical students, the authors developed a novel two-tiered system that introduces one-to-one mentoring for students in their clinical years and offers society-based peer mentoring for pre-clinical students. One year after launching the program, more than 300 clinical students had experienced one-to-one mentoring and 1,503 students and physicians were involved in peer mentoring societies. PMID:21818236

  16. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    Science.gov (United States)

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Technical instrumentation R&D for ILD SiW ECAL large scale device

    Science.gov (United States)

    Balagura, V.

    2018-03-01

    Calorimeters with silicon detectors have many unique features and are proposed for several world-leading experiments. We describe the R&D program of the large scale detector element with up to 12 000 readout channels for the International Large Detector (ILD) at the future e+e− ILC collider. The program is focused on the readout front-end electronics embedded inside the calorimeter. The first part, with 2 000 channels and two small silicon sensors, has already been constructed; the full prototype is planned for the beginning of 2018.

  18. Technical instrumentation R&D for ILD SiW ECAL large scale device

    CERN Document Server

    Balagura, V. (on behalf of SIW ECAL ILD collaboration)

    2018-01-01

    Calorimeters with silicon detectors have many unique features and are proposed for several world-leading experiments. We describe the R&D program of the large scale detector element with up to 12 000 readout channels for the International Large Detector (ILD) at the future e+e− ILC collider. The program is focused on the readout front-end electronics embedded inside the calorimeter. The first part, with 2 000 channels and two small silicon sensors, has already been constructed; the full prototype is planned for the beginning of 2018.

  19. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e., mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
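
    For orientation, a bare-bones DFA computation might look as follows; the random input standing in for a GC-content profile and the chosen window sizes are assumptions, and the study's actual analysis tracks local variations of the exponent along the sequence rather than a single global value.

```python
import numpy as np

# Minimal Detrended Fluctuation Analysis (DFA): integrate the
# mean-subtracted sequence, detrend it linearly in windows of size n,
# and read the scaling exponent alpha off the log-log slope of F(n).
def dfa(x, window_sizes):
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for n in window_sizes:
        mse = []
        for w in range(len(y) // n):
            seg = y[w * n:(w + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)   # linear detrend per window
            mse.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(mse)))    # RMS fluctuation at scale n
    alpha = np.polyfit(np.log(window_sizes), np.log(F), 1)[0]
    return alpha, np.array(F)

gc = np.random.rand(20000)                 # white-noise stand-in: alpha ~ 0.5
alpha, _ = dfa(gc, window_sizes=[16, 32, 64, 128, 256, 512])
```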

  20. A Qualitative Program Evaluation of a Structured Leadership Mentoring Program at a Large Aerospace Corporation

    Science.gov (United States)

    Teller, Romney P.

    2011-01-01

    The researcher utilized a qualitative approach to conduct a program evaluation of the organization where he is employed. The study intended to serve as a program evaluation for the structured in-house mentoring program at a large aerospace corporation (A-Corp). This program evaluation clarified areas in which the current mentoring program is…

  1. From efficacy research to large-scale impact on undernutrition: the role of organizational cultures.

    Science.gov (United States)

    Pelletier, David; Pelto, Gretel

    2013-11-01

    Undernutrition in low-income countries is receiving unprecedented attention at global and national levels due to the convergence of many forces, including strong evidence concerning its magnitude, consequences, and potential solutions and effective advocacy by many organizations. The translation of this attention into large-scale reductions in undernutrition at the country level requires the alignment and support of many organizations in the development and implementation of a coherent policy agenda for nutrition, including the strengthening of operational and strategic capacities and a supportive research agenda. However, many countries experience difficulties achieving such alignment. This article uses the concept of organizational culture to better understand some of the reasons for these difficulties. This concept is applied to the constellation of organizations that make up the "National Nutrition Network" in a given country and some of the individual organizations within that network, including academic institutions that conduct research on undernutrition. We illustrate this concept through a case study involving a middle-income country. We conclude that efforts to align organizations in support of coherent nutrition agendas should do the following: 1) make intentional and sustained efforts to foster common understanding, shared learning, and socialization of new members and other elements of a shared culture among partners; 2) seek a way to frame problems and solutions in a fashion that enables individual organizations to secure some of their particular interests by joining the effort; and 3) not only advocate on the importance of nutrition but also insist that high-level officials hold organizations accountable for aligning in support of common-interest solutions (through some elements of a common culture) that can be effective and appropriate in the national context. We further conclude that a culture change is needed within academic departments if the

  2. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Based on experience in operating and developing a large scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases

  3. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  4. Self-Report Measures of the Home Learning Environment in Large Scale Research: Measurement Properties and Associations with Key Developmental Outcomes

    Science.gov (United States)

    Niklas, Frank; Nguyen, Cuc; Cloney, Daniel S.; Tayler, Collette; Adams, Raymond

    2016-01-01

    Favourable home learning environments (HLEs) support children's literacy, numeracy and social development. In large-scale research, HLE is typically measured by self-report survey, but there is little consistency between studies and many different items and latent constructs are observed. Little is known about the stability of these items and…

  5. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses on avoiding redundancy for users working on the same task. While this improves the effectiveness of the user work process, the underlying query processing engine is typically considered a "black box" and left unchanged. Research in multiple query processing, on the other hand, ignores the application ... We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems, to bring about more effective and more efficient retrieval systems that support the users' decision making process.

  6. Medical Research Volunteer Program (MRVP): innovative program promoting undergraduate research in the medical field.

    Science.gov (United States)

    Dagher, Michael M; Atieh, Jessica A; Soubra, Marwa K; Khoury, Samia J; Tamim, Hani; Kaafarani, Bilal R

    2016-06-06

    Most educational institutions lack a structured system that provides undergraduate students with research exposure in the medical field. The objective of this paper is to describe the structure of the Medical Research Volunteer Program (MRVP) which was established at the American University of Beirut, Lebanon, as well as to assess the success of the program. The MRVP is a program that targets undergraduate students interested in becoming involved in the medical research field early on in their academic career. It provides students with an active experience and the opportunity to learn from and support physicians, clinical researchers, basic science researchers and other health professionals. Through this program, students are assigned to researchers and become part of a research team where they observe and aid on a volunteer basis. This paper presents the MRVP's four major pillars: the students, the faculty members, the MRVP committee, and the online portal. Moreover, details of the MRVP process are provided. The success of the program was assessed by carrying out analyses using information gathered from the MRVP participants (both students and faculty). Satisfaction with the program was assessed using a set of questions rated on a Likert scale, ranging from 1 (lowest satisfaction) to 5 (highest satisfaction). A total of 211 students applied to the program with a total of 164 matches being completed. Since the beginning of the program, three students have each co-authored a publication in peer-reviewed journals with their respective faculty members. The majority of the students rated the program positively. Of the total number of students who completed the program period, 35.1% rated the effectiveness of the program with a 5, 54.8% rated 4, and 8.6% rated 3. A small number of students gave lower ratings of 2 and 1 (1.1% and 0.4%, respectively). The MRVP is a program that provides undergraduate students with the opportunity to learn about research firsthand.

  7. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between the large scale structure formation and the baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale of 120h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  8. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators face heavy workloads, and much time must be spent on the management and maintenance of such systems. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes placed in big machine rooms, managers can easily confuse machines. How can accurate management be carried out effectively in a large-scale cluster system? The article introduces ELFms for large-scale cluster systems and proposes an approach to realize automatic management of such systems. (authors)

  9. Survey of large-scale solar water heaters installed in Taiwan, China

    Energy Technology Data Exchange (ETDEWEB)

    Chang Keh-Chin; Lee Tsong-Sheng; Chung Kung-Ming [Cheng Kung Univ., Tainan (China); Lien Ya-Feng; Lee Chine-An [Cheng Kung Univ. Research and Development Foundation, Tainan (China)

    2008-07-01

    Almost all the solar collectors installed in Taiwan, China were used for the production of hot water for homeowners (residential systems), in which the area of solar collectors is less than 10 square meters. From 2001 to 2006, only 39 large-scale systems (defined as systems with a solar collector area of over 100 m²) were installed. Their utilization purposes are rooming houses (dormitories), swimming pools, restaurants, and manufacturing processes. A comprehensive survey of those large-scale solar water heaters was conducted in 2006. The objectives of the survey were to assess the systems' performance and to gather feedback from the individual users. It is found that lack of experience in system design and maintenance is the key factor limiting reliable operation of a system. For further promotion of large-scale solar water heaters in Taiwan, a more comprehensive program on system design for manufacturing processes should be conducted. (orig.)

  10. Large-scale fortification of condiments and seasonings as a public health strategy: equity considerations for implementation.

    Science.gov (United States)

    Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia

    2016-09-01

    Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are considered. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.

  11. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  12. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also communication between the different processes which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.

  13. The Breathmobile Program: structure, implementation, and evolution of a large-scale, urban, pediatric asthma disease management program.

    Science.gov (United States)

    Jones, Craig A; Clement, Loran T; Hanley-Lopez, Jean; Morphew, Tricia; Kwong, Kenny Yat Choi; Lifson, Francene; Opas, Lawrence; Guterman, Jeffrey J

    2005-08-01

    Despite more than a decade of education and research-oriented intervention programs, inner city children with asthma continue to engage in episodic "rescue" patterns of healthcare and experience a disproportionate level of morbidity. The aim of this study was to establish and evaluate a sustainable community-wide pediatric asthma disease management program designed to shift inner city children in Los Angeles from acute episodic care to regular preventive care in accordance with national standards. In 1995 the Southern California Chapter of the Asthma and Allergy Foundation of America (AAFA), the Los Angeles County Department of Health Services (LAC DHS), and the Los Angeles Unified School District (LAUSD) established an agreement to initiate and sustain the Breathmobile Program. This program includes automated case identification, mobile school-based clinics, and highly structured clinical encounters supported by an advanced information technology solution. Interdisciplinary teams of asthma care specialists provide regular and ongoing care to children at school and county clinic sites over a wide geographic area of urban Los Angeles. Each team operates in a specially equipped mobile clinic (Breathmobile), efficiently moving a structured healthcare process to school and county clinic sites with large numbers of children. Demographic, clinical, and participation data is tracked carefully in an electronic medical record system. Program operations, clinical oversight, and patient tracking are centralized at a care coordination center. Clinical operations and methods have been replicated in fixed specialty clinic sites at the Los Angeles County + University of Southern California Medical Center. Clinical and process measures are regularly evaluated to assure quality, plan iterative improvement, and support evidence-based care. Four Breathmobiles deliver ongoing care at more than 90 school sites. The program has engaged over five thousand patients and their families in a

  14. Using Practitioner Inquiry within and against Large-Scale Educational Reform

    Science.gov (United States)

    Hines, Mary Beth; Conner-Zachocki, Jennifer

    2015-01-01

    This research study examines the impact of teacher research on participants in a large-scale educational reform initiative in the United States, No Child Left Behind, and its strand for reading teachers, Reading First. Reading First supported professional development for teachers in order to increase student scores on standardized tests. The…

  15. Comparisons of benthic filter feeder communities before and after a large-scale capital dredging program.

    Science.gov (United States)

    Abdul Wahab, Muhammad Azmi; Fromont, Jane; Gomez, Oliver; Fisher, Rebecca; Jones, Ross

    2017-09-15

    Changes in turbidity, sedimentation and light over a two-year large-scale capital dredging program at Onslow, northwestern Australia, were quantified to assess their effects on filter feeder communities, in particular sponges. Community functional morphological composition was quantified using towed video surveys, while dive surveys allowed for assessments of species composition and chlorophyll content. Onslow is relatively diverse, recording 150 sponge species. The area was naturally turbid (mean P80 of 1.1 NTU), with inshore sites recording 6.5× higher turbidity than offshore localities, likely influenced by the Ashburton River discharge. Turbidity and sedimentation increased by up to 146% and 240% through dredging, respectively, with corresponding decreases in light levels. The effects of dredging were variable, and despite existing caveats (i.e., a bleaching event and the passing of a cyclone), the persistence of sponges and the absence of a pronounced response post-dredging suggest environmental filtering or passive adaptation acquired pre-dredging may have benefited these communities. Copyright © 2017. Published by Elsevier Ltd.

  16. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  17. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    \\catcode`\\@=11 \\ialign{m @th#1hfil ##hfil \\crcr#2\\crcr\\sim\\crcr}}} \\catcode`\\@=12 Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~{{1} /{4}} of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  18. Large-scale applications of superconductivity in the United States: an overview. Metallurgy, fabrication, and applications

    International Nuclear Information System (INIS)

    Hein, R.A.; Gubser, D.U.

    1981-01-01

    This report presents an overview of ongoing development efforts in the USA concerned with large-scale applications of superconductivity. These applications are grouped according to magnetic field regime: low field, intermediate field, and high field. In the low field regime, two diverse areas of large-scale application are identified: superconducting power transmission lines for electric utilities, and RF cavities for particle accelerators for high energy physics research. Activity in the intermediate field regime has increased significantly due to Fermilab's energy doubler or Tevatron project, and BNL's ISABELLE project. Rotating electrical machines, such as DC acyclic (homopolar) motors and generators, and energy storage magnets are also studied. In the high field regime, magnetohydrodynamics (MHD) and magnetically confined fusion in tokamaks are examined. In each regime all current work is summarized according to key person, research topic, type of program, funding, status, and future outlook

  19. Investigating the dependence of SCM simulated precipitation and clouds on the spatial scale of large-scale forcing at SGP

    Science.gov (United States)

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2017-08-01

    Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors of these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM simulated precipitation and clouds. A gridded large-scale forcing data during the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site is used for analysis and to drive the single-column version of the Community Atmospheric Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem has been largely reduced by using the gridded forcing data, which allows running SCAM5 in each subcolumn and then averaging the results within the domain. This is because the subcolumns have a better chance to capture the timing of the frontal propagation and the small-scale systems. Other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.
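
    The methodological point, that a nonlinear model driven by the domain-mean forcing is not equivalent to averaging runs driven by each subcolumn's forcing, can be stated compactly; run_scm below is a hypothetical stand-in for a single-column model such as SCAM5.

```python
import numpy as np

# Hypothetical sketch: run_scm stands in for a single-column model.
# Because precipitation responds nonlinearly to forcing, running once
# on the domain-mean forcing is not equivalent to running per subcolumn
# on the gridded forcing and averaging the outputs afterwards.
def domain_mean_run(run_scm, forcing_grid):
    return run_scm(forcing_grid.mean(axis=0))       # one run, mean forcing

def gridded_run(run_scm, forcing_grid):
    return np.mean([run_scm(f) for f in forcing_grid], axis=0)
```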

  20. Comparison of particle swarm optimization and dynamic programming for large scale hydro unit load dispatch

    International Nuclear Information System (INIS)

    Cheng Chuntian; Liao Shengli; Tang Zitian; Zhao Mingyan

    2009-01-01

    Dynamic programming (DP) is one of the classic and sophisticated optimization methods that have been successfully applied to the problem of hydro unit load dispatch (HULD). However, DP faces the curse of dimensionality as the number of units and the installed generating capacity of a hydropower station increase. With the appearance of huge hydropower stations similar to the Three Gorges, with 26 generators of 700 MW each, it is hard to apply DP to large-scale HULD problems, and it is crucial to seek other optimization techniques in order to improve operation quality and efficiency. Unlike most of the literature on power generation scheduling, which focuses on comparing novel PSO algorithms with other techniques, this paper emphasizes a comparative study of PSO and DP based on a case hydropower station. The objective of the study is to find an effective and feasible method for the large-scale hydropower stations of the present and future in China. The paper first compares the performance of PSO and DP using a sample load curve of the Wujiangdu hydropower plant, located on the upper stream of the Yangtze River in China, which contains five units with an installed capacity of 1250 MW. Next, the effect of different load intervals and unit numbers on the optimal results and efficiency of the two methods is examined. The comparison results show that PSO is feasible for HULD. Furthermore, we simulated the effect of the magnitude of unit number and load capacity on the optimal results and computation time. The simulation comparisons show that PSO has a great advantage over DP in efficiency and will be an effective method for the HULD problem of huge hydropower stations.
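
    To make the dimensionality argument concrete, a textbook DP for allocating a discretized load among units is sketched below; the cost arrays and function names are hypothetical stand-ins for unit consumption curves, and the O(units × load steps × per-unit range) work is exactly what explodes as unit number and capacity grow.

```python
# Textbook DP sketch for unit load dispatch on a discretized load grid.
# cost[i][p] = hypothetical cost of unit i producing p load steps.
def dp_dispatch(cost, total):
    n, INF = len(cost), float("inf")
    best = [[INF] * (total + 1) for _ in range(n + 1)]
    pick = [[0] * (total + 1) for _ in range(n)]
    best[0][0] = 0.0
    for i in range(n):                       # one DP stage per unit
        for load in range(total + 1):
            if best[i][load] == INF:
                continue
            for p in range(min(len(cost[i]), total - load + 1)):
                c = best[i][load] + cost[i][p]
                if c < best[i + 1][load + p]:
                    best[i + 1][load + p] = c
                    pick[i][load + p] = p    # remember the winning choice
    alloc, load = [0] * n, total
    for i in range(n - 1, -1, -1):           # backtrack the optimal path
        alloc[i] = pick[i][load]
        load -= alloc[i]
    return alloc, best[n][total]

cost = [[0.0, 1.0, 2.5, 4.5, 7.0],           # two toy units, 0..4 load steps
        [0.0, 1.5, 2.0, 3.5, 6.0]]
alloc, c = dp_dispatch(cost, total=5)        # -> [2, 3] with total cost 6.0
```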

  1. Comparison of particle swarm optimization and dynamic programming for large scale hydro unit load dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Cheng Chuntian, E-mail: ctcheng@dlut.edu.c [Department of Civil and Hydraulic Engineering, Dalian University of Technology, 116024 Dalian (China); Liao Shengli; Tang Zitian [Department of Civil and Hydraulic Engineering, Dalian University of Technology, 116024 Dalian (China); Zhao Mingyan [Department of Environmental Science and Engineering, Tsinghua University, 100084 Beijing (China)

    2009-12-15

    Dynamic programming (DP) is one of the classic and sophisticated optimization methods that have been successfully applied to the problem of hydro unit load dispatch (HULD). However, DP faces the curse of dimensionality as the number of units and the installed generating capacity of a hydropower station increase. With the appearance of huge hydropower stations similar to the Three Gorges, with 26 generators of 700 MW each, it is hard to apply DP to large-scale HULD problems, and it is crucial to seek other optimization techniques in order to improve operation quality and efficiency. Unlike most of the literature on power generation scheduling, which focuses on comparing novel PSO algorithms with other techniques, this paper emphasizes a comparative study of PSO and DP based on a case hydropower station. The objective of the study is to find an effective and feasible method for the large-scale hydropower stations of the present and future in China. The paper first compares the performance of PSO and DP using a sample load curve of the Wujiangdu hydropower plant, located on the upper stream of the Yangtze River in China, which contains five units with an installed capacity of 1250 MW. Next, the effect of different load intervals and unit numbers on the optimal results and efficiency of the two methods is examined. The comparison results show that PSO is feasible for HULD. Furthermore, we simulated the effect of the magnitude of unit number and load capacity on the optimal results and computation time. The simulation comparisons show that PSO has a great advantage over DP in efficiency and will be an effective method for the HULD problem of huge hydropower stations. (author)

  2. Indigenous Engagement in Tropical River Research in Australia: The TRaCK Program

    Directory of Open Access Journals (Sweden)

    Sue E. Jackson

    2015-05-01

    The literature on scientific-Indigenous ecological knowledge collaborations rarely analyses programmatic efforts undertaken by multi-disciplinary research groups over very large geographic scales. The TRaCK (Tropical Rivers and Coastal Knowledge) research program was established to provide the science and knowledge needed by governments, industries, and communities to sustainably manage northern Australia's rivers and estuaries. A number of policies and procedures were developed to ensure that the needs of Indigenous people of the multi-jurisdictional region were addressed and to enhance the benefits they might derive from participating in the research. An overarching Indigenous Engagement Strategy undergirded the program's engagement activities, providing guidance on matters relating to the protection of intellectual property, negotiation of research agreements, remuneration for Indigenous expertise, and communications standards. This article reviews the achievements and shortcomings of the TRaCK experience of Indigenous engagement and highlights lessons for researchers and research organisations contemplating applied environmental science initiatives of this scale and scope.

  3. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  4. Analysis for Large Scale Integration of Electric Vehicles into Power Grids

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Wang, Xiaoru

    2011-01-01

    Electric Vehicles (EVs) provide a significant opportunity for reducing the consumption of fossil energies and the emission of carbon dioxide. With more and more electric vehicles integrated into the power systems, it becomes important to study the effects of EV integration on the power systems, especially the low and middle voltage level networks. In the paper, the basic structure and characteristics of electric vehicles are introduced. The possible impacts of large-scale integration of electric vehicles on the power systems, especially the advantage for the integration of renewable energies, are discussed. Finally, the research projects related to the large-scale integration of electric vehicles into the power systems are introduced; this will provide a reference for the large-scale integration of electric vehicles into power grids.

  5. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    Science.gov (United States)

    He, Qing; Li, Hong

    The belt conveyor is one of the most important devices for transporting bulk-solid material over long distances. Dynamic analysis is the key to deciding whether a design is technically rational, safe and reliable in operation, and economically feasible. It is very important to study dynamic properties in order to improve efficiency and productivity and to guarantee safe, reliable and stable conveyor operation. The dynamic researches and applications of large scale belt conveyors are discussed. The main research topics and the state of the art of dynamic research on belt conveyors are analyzed. The main future work focuses on dynamic analysis, modeling and simulation of the main components and the whole system, and on nonlinear modeling, simulation and vibration analysis of large scale conveyor systems.

  6. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge to traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtimes are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. The shared-memory architecture is used to construct the similarity matrix, and the distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
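
    For orientation, the message-passing core that the paper parallelizes is compact in serial, vectorized form: damped alternation of responsibility and availability updates over a precomputed similarity matrix. The sketch below follows the standard published update rules; the paper's data partitioning and communication scheme is not reproduced here.

```python
import numpy as np

# Serial core of affinity propagation on a precomputed similarity
# matrix S (diagonal = preferences), with damped updates.
def affinity_propagation(S, damping=0.9, iters=200):
    n = S.shape[0]
    R = np.zeros((n, n))                     # responsibilities r(i,k)
    A = np.zeros((n, n))                     # availabilities a(i,k)
    rows = np.arange(n)
    for _ in range(iters):
        AS = A + S                           # r(i,k) = s(i,k) - max_{k'!=k}(a+s)
        idx = AS.argmax(axis=1)
        first = AS[rows, idx]
        AS[rows, idx] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[rows, idx] = S[rows, idx] - second
        R = damping * R + (1 - damping) * Rnew
        Rp = np.maximum(R, 0)                # a(i,k) sums positive responsibilities
        np.fill_diagonal(Rp, R.diagonal())
        Anew = Rp.sum(axis=0)[None, :] - Rp
        dA = Anew.diagonal().copy()
        Anew = np.minimum(Anew, 0)
        np.fill_diagonal(Anew, dA)
        A = damping * A + (1 - damping) * Anew
    return (A + R).argmax(axis=1)            # exemplar index for each point

X = np.random.rand(50, 2)                    # toy data
S = -((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(S, np.median(S))            # preferences on the diagonal
exemplars = affinity_propagation(S)
```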

  7. Data management strategies for multinational large-scale systems biology projects.

    Science.gov (United States)

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and will eventually become mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. Through the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects.

  8. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, have further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  10. A research on the excavation, support, and environment control of large scale underground space

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Pil Chong; Kwon, Kwang Soo; Jeong, So Keul [Korea Institute of Geology Mining and Materials, Taejon (Korea, Republic of)]

    1995-12-01

    With the growing necessity of underground space due to the deficiency of above-ground space, the size and shape of underground structures tend to be complex and diverse. This complexity and variety force the development of new techniques for rock mass classification, excavation and support of underground space, and monitoring and control of the underground environment. All these techniques should be applied together to make the underground space comfortable. To achieve this, efforts have been made in 5 different areas: research on underground space design and stability analysis, research on techniques for rock excavation by controlled blasting, research on the development of a monitoring system to forecast the rock behaviour of underground space, research on an environment inspection system for closed spaces, and research on dynamic analysis of airflow and environmental control in large geo-spaces. The 5 main achievements are improvement of the existing structure analysis program (EXCRACK) to consider the deformation and failure characteristics of rock joints, development of a new blasting design (SK-cut), prediction of ground vibration through a newly proposed wave propagation equation, development and in-situ application of a rock mass deformation monitoring system and data acquisition software, and trial manufacture of the environment inspection system for closed spaces. Should these techniques be applied to the development of underground space, they would bring about prevention of industrial disasters, reduced construction costs, domestic production of monitoring systems, improved tunnel stability, reduced royalty payments, and upgraded domestic technologies. (Abstract Truncated)

  11. Research on Operational Aspects of Large Autonomous Underwater Glider Fleets

    National Research Council Canada - National Science Library

    Fratantoni, David M

    2007-01-01

    This program supported research on the operational and management issues stemming from the application of large fleets of autonomous underwater gliders to oceanographic research and rapid environmental...

  12. Large-scale Exploration of Neuronal Morphologies Using Deep Learning and Augmented Reality.

    Science.gov (United States)

    Li, Zhongyu; Butler, Erik; Li, Kang; Lu, Aidong; Ji, Shuiwang; Zhang, Shaoting

    2018-02-12

    Recently released large-scale neuron morphological data have greatly facilitated research in neuroinformatics. However, the sheer volume and complexity of these data pose significant challenges for efficient and accurate neuron exploration. In this paper, we propose an effective retrieval framework to address these problems, based on frontier techniques in deep learning and binary coding. For the first time, we develop a deep-learning-based feature representation method for neuron morphological data: the 3D neurons are first projected into binary images, and features are then learned using an unsupervised deep neural network, i.e., stacked convolutional autoencoders (SCAEs). The deep features are subsequently fused with hand-crafted features for a more accurate representation. Because exhaustive search is usually very time-consuming in large-scale databases, we employ a novel binary coding method to compress feature vectors into short binary codes. Our framework is validated on a public data set of 58,000 neurons, showing promising retrieval precision and efficiency compared with state-of-the-art methods. In addition, we develop a novel neuron visualization program based on augmented reality (AR) techniques, which helps users explore neuron morphologies in an interactive and immersive manner.
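
    The paper's binary coding scheme is not reproduced here, but the general idea of compressing real-valued features into short binary codes and ranking by Hamming distance can be sketched with random-hyperplane hashing (an assumed stand-in for the authors' coder; all sizes are hypothetical).

        import numpy as np

        rng = np.random.default_rng(0)
        feats = rng.normal(size=(58000, 256))    # hypothetical fused feature vectors
        query = rng.normal(size=256)

        planes = rng.normal(size=(256, 64))      # each random hyperplane yields one bit

        def encode(v):
            # Keep only the signs of the projections as a 64-bit binary code.
            return v @ planes > 0

        codes = encode(feats)                    # (58000, 64) boolean array
        q = encode(query)
        hamming = np.count_nonzero(codes != q, axis=1)
        top10 = np.argsort(hamming)[:10]         # nearest neighbours in Hamming space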

  13. A Review of Research on Large Scale Modern Vertical Axis Wind Turbines at Uppsala University

    Directory of Open Access Journals (Sweden)

    Senad Apelfröjd

    2016-07-01

    This paper presents a review of over a decade of research on Vertical Axis Wind Turbines (VAWTs) conducted at Uppsala University. The paper presents, among other things, an overview of the 200 kW VAWT located in Falkenberg, Sweden, as well as a description of the work done on the 12 kW prototype VAWT in Marsta, Sweden. Several key aspects have been tested and successfully demonstrated at our two experimental research sites. The VAWT research effort has been aimed at developing a robust large-scale VAWT technology based on an electrical control system with a direct-driven energy converter. This approach allows for a simplification where most or all of the control of the turbines can be managed by the electrical converter system, reducing investment cost and the need for maintenance. The concept features an H-rotor that is omnidirectional with regard to wind direction, meaning that it can extract energy from all wind directions without the need for a yaw system. The turbine is connected to a direct-driven permanent magnet synchronous generator (PMSG), located at ground level, that is specifically developed to control and extract power from the turbine. The research is ongoing and aims for a multi-megawatt VAWT in the near future.

  14. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of ..., via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small-scale) and from low-pass filtered (large-scale) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.

  15. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such a system can improve the network's emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are the main topics discussed. Plane visualization of the large-scale network system is realized based on a divide-and-conquer approach. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, a subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into an overall topology by an automatic force-based layout algorithm. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale visualization and placement problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
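
    The MLkP/CR partitioner itself is not publicly packaged, but the divide-and-conquer pipeline can be approximated with off-the-shelf tools: partition the graph, lay out each part independently, then place the parts on a coarse grid. A sketch using networkx, with modularity communities standing in for MLkP/CR:

        import math
        import networkx as nx
        from networkx.algorithms import community

        G = nx.barabasi_albert_graph(3000, 2, seed=1)        # stand-in large topology
        parts = list(community.greedy_modularity_communities(G))

        pos, cols = {}, math.ceil(math.sqrt(len(parts)))
        for i, nodes in enumerate(parts):
            local = nx.spring_layout(G.subgraph(nodes), seed=1)  # per-part force layout
            row, col = divmod(i, cols)                           # coarse grid placement
            for node, (x, y) in local.items():
                pos[node] = (x + 3.0 * col, y + 3.0 * row)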

  16. DupTree: a program for large-scale phylogenetic analyses using gene tree parsimony.

    Science.gov (United States)

    Wehe, André; Bansal, Mukul S; Burleigh, J Gordon; Eulenstein, Oliver

    2008-07-01

    DupTree is a new software program for inferring rooted species trees from collections of gene trees using the gene tree parsimony approach. The program implements a novel algorithm that significantly improves upon the run time of standard search heuristics for gene tree parsimony, and it enables the first truly genome-scale phylogenetic analyses. In addition, DupTree allows users to examine alternate rootings and to weight the reconciliation costs for gene trees. DupTree is an open source project written in C++. DupTree for Mac OS X, Windows, and Linux, along with a sample dataset and an online manual, is available at http://genome.cs.iastate.edu/CBL/DupTree

  17. Third annual Walker Branch Watershed research symposium. Program and abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    The methods and concepts of watershed research, originally applied in an experimental or monitoring mode to relatively small catchments, are increasingly being used at larger scales and for specific applied problems. Research at Oak Ridge National Laboratory, the Tennessee Valley Authority, the US Forest Service, and other agencies and institutions participating in this symposium reflects research over a broad range of spatial scales that is being integrated through large-scale experiments along with computer modeling and graphical interfaces. These research projects address the basic atmospheric, geophysical, biogeochemical, and biological processes that regulate the responses of forested ecosystems to natural environmental variation and anthropogenic stresses. Regional and global issues addressed by presentations include emissions of carbon dioxide, methane, and other hydrocarbons; deposition of sulfate, nitrate, and mercury; land-use changes; biological diversity; droughts; and water quality. The reports presented in this symposium illustrate a wide range of methods and approaches and focus more on concepts and techniques than on a specific physical site. Sites and projects that have contributed research results to this symposium include Walker Branch Watershed (DOE), the Coweeta Hydrologic Laboratory and LTER site (USFS and NSF), Great Smoky Mountains National Park (research funded by NPS, TVA, and EPRI), Imnavait Creek, Alaska (DOE), the TVA-Norris Whole-tree Facility (TVA and EPRI), and DOE's Biomass Program.

  18. Commercial applications of large-scale Research and Development computer simulation technologies

    International Nuclear Information System (INIS)

    Kuok Mee Ling; Pascal Chen; Wen Ho Lee

    1998-01-01

    The potential commercial applications of two large-scale R and D computer simulation technologies are presented. One such technology is based on the numerical solution of the hydrodynamics equations, and is embodied in the two-dimensional Eulerian code EULE2D, which solves the hydrodynamic equations with various models for the equation of state (EOS), constitutive relations and fracture mechanics. EULE2D is an R and D code originally developed to design and analyze conventional munitions for anti-armor penetration, such as shaped charges, explosively formed projectiles, and kinetic energy rods. Simulated results agree very well with actual experiments. A commercial application presented here is the design and simulation of shaped charges for oil and gas well bore perforation. The other R and D simulation technology is based on the numerical solution of Maxwell's partial differential equations of electromagnetics in space and time, and is implemented in the three-dimensional code FDTD-SPICE, which solves Maxwell's equations in the time domain with finite differences in the three spatial dimensions and calls SPICE for information when nonlinear active devices are involved. The FDTD method has been used in the radar cross-section modeling of military aircraft and many other electromagnetic phenomena. The coupling of the FDTD method with SPICE, a popular circuit and device simulation program, provides a powerful tool for the simulation and design of microwave and millimeter-wave circuits containing nonlinear active semiconductor devices. A commercial application of FDTD-SPICE presented here is the simulation of a two-element active antenna system. The simulation results and the experimental measurements are in excellent agreement. (Author)

  19. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    The subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines: basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)

  20. NASA/MSFC FY90 Global Scale Atmospheric Processes Research Program Review

    Science.gov (United States)

    Leslie, Fred W. (Editor)

    1990-01-01

    Research supported by the Global Atmospheric Research Program at the Marshall Space Flight Center on atmospheric remote sensing, meteorology, numerical weather forecasting, satellite data analysis, cloud precipitation, atmospheric circulation, atmospheric models and related topics is discussed.

  1. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics]

    1998-12-31

    In this project a large domestic solar heating system was built, and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of large-scale solar heating systems. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  2. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products, along with ground observations, provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN), and the Canadian Precipitation Analysis (CaPA). Based on verification criteria at various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
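
    For intuition, verifying a gridded product against gauges typically reduces to a handful of statistics computed after matching each gauge to its nearest grid cell; a minimal sketch (function and variable names are hypothetical, not from the study):

        import numpy as np

        def verify(product, gauges):
            # Bias, RMSE and correlation between matched product/gauge series.
            p, g = np.asarray(product), np.asarray(gauges)
            bias = np.mean(p - g)
            rmse = np.sqrt(np.mean((p - g) ** 2))
            corr = np.corrcoef(p, g)[0, 1]
            return bias, rmse, corr

        # Aggregating daily series to monthly or annual means before calling
        # verify() compares the products at coarser temporal scales.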

  3. Status of the INL high-temperature electrolysis research program –experimental and modeling

    Energy Technology Data Exchange (ETDEWEB)

    J. E. O'Brien; C. M. Stoots; M. G. McKellar; E. A. Harvego; K. G. Condie; G. K. Housley; J. S. Herring; J. J. Hartvigsen

    2009-04-01

    This paper provides a status update on the high-temperature electrolysis (HTE) research and development program at the Idaho National Laboratory (INL), with an overview of recent large-scale system modeling results and the status of the experimental program. System analysis results have been obtained using the commercial code UniSim, augmented with a custom high-temperature electrolyzer module. The process flow diagrams for the system simulations include an advanced nuclear reactor as a source of high-temperature process heat, a power cycle and a coupled steam electrolysis loop. Several reactor types and power cycles have been considered, over a range of reactor coolant outlet temperatures. In terms of experimental research, the INL has recently completed an Integrated Laboratory Scale (ILS) HTE test at the 15 kW level. The initial hydrogen production rate for the ILS test was in excess of 5000 liters per hour. Details of the ILS design and operation will be presented. Current small-scale experimental research is focused on improving the degradation characteristics of the electrolysis cells and stacks. Small-scale testing ranges from single cells to multiple-cell stacks. The INL is currently in the process of testing several state-of-the-art anode-supported cells and is working to broaden its relationship with industry in order to improve the long-term performance of the cells.

  4. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom

    2017-11-22

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data sources are typically noisy and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.
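
    The paper's global balancing step is a binary integer program over candidate building models; its actual objective is richer, but the flavor of such a selection program (pick exactly one candidate hypothesis per building, minimizing total fitting error) can be sketched with SciPy's MILP interface (all costs hypothetical):

        import numpy as np
        from scipy.optimize import Bounds, LinearConstraint, milp

        # Hypothetical: 3 buildings x 4 candidate hypotheses, flattened to 12 binaries.
        cost = np.array([2.0, 1.1, 3.5, 0.9,    # building 0
                         1.7, 0.4, 2.2, 5.0,    # building 1
                         0.8, 0.8, 1.9, 2.6])   # building 2
        A = np.kron(np.eye(3), np.ones(4))      # each row sums one building's variables
        pick_one = LinearConstraint(A, lb=1, ub=1)
        res = milp(c=cost, constraints=pick_one,
                   integrality=np.ones(12), bounds=Bounds(0, 1))
        chosen = res.x.reshape(3, 4).argmax(axis=1)   # hypothesis chosen per building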

  5. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom; Femiani, John; Wonka, Peter; Mitra, Niloy J.

    2017-01-01

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data sources are typically noisy and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.

  6. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  7. Evaluation of the clinical implementation of a large-scale online e-learning program on venous blood specimen collection guideline practices.

    Science.gov (United States)

    Willman, Britta; Grankvist, Kjell; Bölenius, Karin

    2018-05-11

    When performed erroneously, the venous blood specimen collection (VBSC) practice steps of patient identification, test request management and test tube labeling carry a high risk of jeopardizing patient safety. VBSC educational programs intended to minimize the risk of harm to patients are therefore needed. In this study, we evaluate the efficiency of a large-scale online e-learning program with respect to personnel's adherence to VBSC practices and their experience of the e-learning program. An interprofessional team transformed a previously implemented traditional VBSC education program into an online e-learning program developed to stimulate reflection, with a focus on the high-risk practice steps. We used questionnaires to evaluate the effect of the e-learning program on personnel's self-reported adherence to VBSC practices, compared to questionnaire surveys before and after introduction of the traditional education program. We used content analysis to evaluate the participants' free-text experience of the VBSC e-learning program. Adherence to the VBSC guideline high-risk practice steps generally increased following the implementation of a traditional educational program followed by an e-learning program. We did, however, find a negative trend over the years regarding participation rates and the practice of always sending/signing the request form following the introduction of an electronic request system. The participants were in general content with the VBSC e-learning program. Properly designed e-learning programs on VBSC practices supersede traditional educational programs in usefulness and functionality. Inclusion of questionnaires in the e-learning program is necessary for follow-up of participants' VBSC practices and of the educational program's efficiency.

  8. Methods for large-scale international studies on ICT in education

    NARCIS (Netherlands)

    Pelgrum, W.J.; Plomp, T.; Voogt, Joke; Knezek, G.A.

    2008-01-01

    International comparative assessment is a research method applied for describing and analyzing educational processes and outcomes. Such assessments are used to 'describe the status quo' in educational systems from an international comparative perspective. This chapter reviews methods for different large-scale international assessments.

  9. A Cooperative Industry - Government Woodland Caribou Research Program in Northeastern Alberta

    Directory of Open Access Journals (Sweden)

    Blair Rippin

    1996-01-01

    Rapid development of large-scale logging and increasingly intensive petroleum exploration and development in northeastern Alberta prompted the establishment of a cooperative research program to investigate various aspects of woodland caribou (Rangifer tarandus caribou) biology. The ultimate goal of the program is to develop an effective plan that will ensure the long-term survival of caribou while allowing for renewable and non-renewable resource development. There are three parts to the program. Part I began early in 1991 and makes use of conventional radio telemetry as a means of recording various parameters of general caribou biology. The study area encompasses approximately 4000 km2 of low-relief, boreal mixedwood forest. Preliminary results from 2500 radio locations (involving 50 individuals) indicate that woodland caribou inhabiting the study area are non-migratory and are strongly associated with some of the scarcer peatland forest types present in the area. Investigations to document the basic biology and ecology will continue for another two years. Part II began in early 1993 as part of a two-year investigation into the disturbance effects of petroleum exploration and development on caribou movements and behaviour. One objective of this study is to develop a predictive model useful in determining the cumulative effects of varying intensities of disturbance on caribou. Part III began in early 1994 with a proposed three-year investigation to determine the mechanism of spatial and temporal separation of caribou and moose in the study area. These relationships may indicate the means by which caribou minimize the impact of wolf predation on their populations in northeastern Alberta. Results will be applied to industrial land use and specifically to large-scale forest harvesting planned for the area. The research program is supported through cooperative funding contributed by 24 petroleum companies, 1 forest company, 2 peat companies and

  10. PERSEUS-HUB: Interactive and Collective Exploration of Large-Scale Graphs

    Directory of Open Access Journals (Sweden)

    Di Jin

    2017-07-01

    Graphs emerge naturally in many domains, such as social science, neuroscience, and transportation engineering. In many cases, such graphs have millions or billions of nodes and edges, and their sizes increase daily at a fast pace. How can researchers from various domains explore large graphs interactively and efficiently to find out what is 'important'? How can multiple researchers explore a new graph dataset collectively and 'help' each other with their findings? In this article, we present Perseus-Hub, a large-scale graph mining tool that computes a set of graph properties in a distributed manner, performs ensemble, multi-view anomaly detection to highlight regions that are worth investigating, and provides users with uncluttered visualization and easy interaction with complex graph statistics. Perseus-Hub uses a Spark cluster to calculate various statistics of large-scale graphs efficiently and aggregates the results in a summary on the master node to support interactive user exploration. In Perseus-Hub, the visualized distributions of graph statistics provide a preliminary analysis for understanding a graph. To perform a deeper analysis, users with little prior knowledge can leverage patterns (e.g., spikes in the power-law degree distribution) marked by other users or experts. Moreover, Perseus-Hub guides users to regions of interest by highlighting anomalous nodes and helps users establish a more comprehensive understanding of the graph at hand. We demonstrate our system through a case study on real, large-scale networks.
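
    As a flavor of the distributed statistics pass, computing a degree distribution over an edge list with PySpark takes only a few lines (the file path and "src dst" line format are hypothetical):

        from operator import add
        from pyspark import SparkContext

        sc = SparkContext(appName="degree-stats")
        edges = (sc.textFile("hdfs:///graphs/edges.txt")
                   .map(str.split)
                   .filter(lambda p: len(p) == 2))
        degrees = (edges.flatMap(lambda e: [(e[0], 1), (e[1], 1)])
                        .reduceByKey(add))
        # Histogram pairs (degree, number of nodes with that degree); only this
        # small summary is collected back to the master node for visualization.
        dist = (degrees.map(lambda kv: (kv[1], 1))
                       .reduceByKey(add)
                       .collect())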

  11. Reconsidering Replication: New Perspectives on Large-Scale School Improvement

    Science.gov (United States)

    Peurach, Donald J.; Glazer, Joshua L.

    2012-01-01

    The purpose of this analysis is to reconsider organizational replication as a strategy for large-scale school improvement: a strategy that features a "hub" organization collaborating with "outlet" schools to enact school-wide designs for improvement. To do so, we synthesize a leading line of research on commercial replication to construct a…

  12. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  13. Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs

    Science.gov (United States)

    Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.

    2016-07-01

    Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time- and labour-intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced, etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. The amount of sampling required, in terms of both the number of images and
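
    The trade-off studied here is easy to reproduce in miniature: simulate patchy cover, then compare the spread of estimates as effort is shifted between images and points per image (all parameter values are hypothetical):

        import numpy as np

        rng = np.random.default_rng(0)
        true_cover, n_trials = 0.15, 2000

        def estimate(n_images, n_points):
            # Cover varies between images (patchiness); points are scored per image.
            img_cover = np.clip(rng.normal(true_cover, 0.10, n_images), 0, 1)
            hits = rng.binomial(n_points, img_cover)
            return hits.sum() / (n_images * n_points)

        for n_img, n_pts in [(10, 100), (100, 10)]:   # same total effort
            spread = np.std([estimate(n_img, n_pts) for _ in range(n_trials)])
            print(n_img, "images x", n_pts, "points:", spread)
        # With between-image variability, more images beats more points per image.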

  14. An establishment on the hazard mitigation system of large scale landslides for Zengwen reservoir watershed management in Taiwan

    Science.gov (United States)

    Tsai, Kuang-Jung; Lee, Ming-Hsi; Chen, Yie-Ruey; Huang, Meng-Hsuan; Yu, Chia-Ching

    2016-04-01

    Extremely heavy rainfall, with an accumulated amount of more than 2900 mm within a continuous 3-day event, occurred in southern Taiwan and has been recognized as a serious natural hazard caused by Typhoon Morakot in August 2009. Very destructive large-scale landslides and debris flows were induced by this heavy rainfall event. A satellite image processing and monitoring project was conducted by the Soil & Water Conservation Bureau after Typhoon Morakot, and more than 10,904 landslide sites with a total sliding area of 18,113 ha were identified by this project. Field investigations of all landslide areas were also executed in this research on the basis of disaster type, scale, and location in relation to the topographic conditions, colluvium soil characteristics, bedrock formation, and geological structure after the Morakot hazard. The mechanism, characteristics, and behavior of these large-scale landslides combined with debris-flow disasters are analyzed and investigated to sort out the interaction of the factors concerned above and to identify the disaster extent of rainfall-induced landslides during the period of this study. In order to reduce the disaster risk of large-scale landslides and debris flows, an adaptation strategy for the hazard mitigation system should be set up as soon as possible, taking into consideration slope land conservation, landslide control countermeasure planning, disaster database establishment, environmental impact analysis, and disaster risk assessment. As a result, this 3-year research has been focused on field investigation using GPS/GIS/RS integration, study of the mechanism and behavior of rainfall-induced landslide occurrence, and the establishment of a disaster database and hazard mitigation system. In fact, this project has become an issue of serious concern to the government and the people of Taiwan. Hopefully, all results from this research can be used as guidance for disaster prevention and

  15. Programs of the Office of Energy Research

    International Nuclear Information System (INIS)

    1985-07-01

    The purpose of this research has been to support the energy technology development programs by providing insight into fundamental science and associated phenomena and developing new or advanced concepts and techniques. Today, this responsibility rests with the Office of Energy Research (ER), DOE, whose present programs have their origins in pioneering energy-related research which was initiated nearly 40 years ago. The Director, Office of Energy Research, also acts as the chief scientist and scientific advisor to the Secretary of Energy for the entire spectrum of energy research and development (R and D) programs of the Department. ER programs include several thousand individual projects and hundreds of laboratories, universities, and other research facilities throughout the United States. The current organization of ER is shown. The budgets for the various ER programs for the last two fiscal years are shown. In the following pages, each of these programs and activities are described briefly for the information of the scientific community and the public at large

  16. Large Scale Simulations of the Euler Equations on GPU Clusters

    KAUST Repository

    Liebmann, Manfred; Douglas, Craig C.; Haase, Gundolf; Horváth, Zoltán

    2010-01-01

    The paper investigates the scalability of a parallel Euler solver, using the Vijayasundaram method, on a GPU cluster with 32 Nvidia GeForce GTX 295 boards. The aim of this research is to enable large-scale fluid dynamics simulations with up to one

  17. Large-scale parallel genome assembler over cloud computing environment.

    Science.gov (United States)

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

    The size of high-throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications have started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay-as-you-go) resources at a lower cost. However, the locality-based programming model (e.g., MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model, and understanding the hardware environment that these applications require for good performance, both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability, with competitive assembly quality, compared to contemporary parallel assemblers (e.g., ABySS and Contrail) over a traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure instead of a traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of the traditional HPC cluster.
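
    GiGA itself runs on Hadoop and Giraph, but the underlying de Bruijn construction it parallelizes (nodes are (k-1)-mers; each read contributes an edge per k-mer) is compact to sketch in plain Python:

        from collections import defaultdict

        def de_bruijn(reads, k):
            # Map each (k-1)-mer to the set of (k-1)-mers that follow it in a read.
            graph = defaultdict(set)
            for read in reads:
                for i in range(len(read) - k + 1):
                    kmer = read[i:i + k]
                    graph[kmer[:-1]].add(kmer[1:])
            return graph

        for node, succs in de_bruijn(["ACGTAC", "CGTACG"], 4).items():  # toy reads
            print(node, "->", sorted(succs))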

  18. Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows

    Science.gov (United States)

    Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel

    2017-11-01

    We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
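
    In broad form, the class of models described combines a dissipative eddy-viscosity term with a nondissipative nonlinear term. A generic way of writing such a model for the subgrid-scale stress anisotropy (the coefficients and the exact tensor form here are illustrative assumptions, not the paper's final model) is

        \tau_{ij} - \tfrac{1}{3}\delta_{ij}\,\tau_{kk}
          = -2\,\nu_e\,\bar{S}_{ij}
            + \mu_e\,\bigl(\bar{S}_{ik}\bar{\Omega}_{kj} - \bar{\Omega}_{ik}\bar{S}_{kj}\bigr),

    where \bar{S}_{ij} and \bar{\Omega}_{ij} are the resolved rate-of-strain and rate-of-rotation tensors. The first term drains energy; the second has zero contraction with \bar{S}_{ij} and therefore redistributes energy without adding dissipation, which is what allows it to represent transport.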

  19. Research on fatigue behavior and residual stress of large-scale cruciform welding joint with groove

    International Nuclear Information System (INIS)

    Zhao, Xiaohui; Liu, Yu; Liu, Yong; Gao, Yuan

    2014-01-01

    Highlights: • The fatigue behavior of a large-scale cruciform welded joint with groove was studied. • The longitudinal residual stress of the large-scale cruciform welded joint was measured by the contour method. • The fatigue fracture mechanism of the large-scale cruciform welded joint with groove was analyzed. - Abstract: The fatigue fracture behavior of a 30 mm thick Q460C-Z steel cruciform welded joint with groove was investigated. The fatigue test results indicated that the fatigue strength of the 30 mm thick Q460C-Z steel cruciform welded joint with groove can reach fatigue class FAT80 (80 MPa). The fatigue crack source of the failed specimen initiated from the weld toe. Meanwhile, microcracks were also found in the fusion zones of the fatigue failure specimen, caused by weld quality and weld metal integrity issues resulting from the multi-pass welds. A two-dimensional map of the longitudinal residual stress of the 30 mm thick Q460C-Z steel cruciform welded joint with groove was obtained using the contour method. The stress map indicated that the longitudinal residual stress is largest at the weld center.

  20. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results of testing the material's resistance to non-ductile fracture are described. The testing included base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During the cyclic tests, the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  1. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has become established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skill than small growing operations. Second, financial assets are needed to start a large 'grow-site': housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black-market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skill and knowledge than small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production implies having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face, and their lack of interest and motivation for going large-scale, suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  2. Economic and agricultural transformation through large-scale farming : impacts of large-scale farming on local economic development, household food security and the environment in Ethiopia

    NARCIS (Netherlands)

    Bekele, M.S.

    2016-01-01

    This study examined the impacts of large-scale farming in Ethiopia on local economic development, household food security, incomes, employment, and the environment. The study adopted a mixed research approach in which both qualitative and quantitative data were generated from secondary and primary sources.

  3. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  4. European Facility for Antiproton and Ion Research (FAIR): the new international center for fundamental physics and its research program

    International Nuclear Information System (INIS)

    Fortov, Vladimir E; Sharkov, Boris Yu; Stöker, H

    2012-01-01

    The Facility for Antiproton and Ion Research (FAIR) accelerator center at Darmstadt, Germany, will provide the international scientific community with unique experimental opportunities of a scope and scale out of reach for any other large-scale facility in the world. With its staff of over 2500, it is expected to fundamentally expand our knowledge of hadron, nuclear, and atomic physics and their application to cosmology, astrophysics, and technology. In this review, the design details of the accelerator complex are discussed and the experimental research program for FAIR is presented. Particular attention is paid to experiments on the extreme state of matter arising from the isochoric heating of a material by heavy-ion beams. One of the largest facilities of its kind in Europe, FAIR is a part of the strategic development roadmap for the European Strategic Forum on Research Infrastructures (ESFRI). (physics of our days)

  5. Matrix Sampling of Items in Large-Scale Assessments

    Directory of Open Access Journals (Sweden)

    Ruth A. Childs

    2003-07-01

    Matrix sampling of items, that is, division of a set of items into different versions of a test form, is used by several large-scale testing programs. Like other test designs, matrixed designs have both advantages and disadvantages. For example, testing time per student is less than if each student received all the items, but the comparability of student scores may decrease. Also, curriculum coverage is maintained, but reporting of scores becomes more complex. In this paper, matrixed designs are compared with more traditional designs in nine categories of costs: development costs, materials costs, administration costs, educational costs, scoring costs, reliability costs, comparability costs, validity costs, and reporting costs. In choosing among test designs, a testing program should examine the costs in light of its mandate(s), the content of the tests, and the financial resources available, among other considerations.
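
    The core bookkeeping of a matrixed design, spiraling an item pool across several shorter forms, is worth seeing in miniature (pool size and form count are hypothetical):

        n_forms = 4
        pool = [f"item{i:03d}" for i in range(60)]   # hypothetical 60-item pool

        # Spiral assignment: item i goes to form i mod n_forms, so each student
        # sees only 15 of the 60 items while the program still covers the pool.
        forms = {f: pool[f::n_forms] for f in range(n_forms)}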

  6. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    From physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  7. Large-scale silviculture experiments of western Oregon and Washington.

    Science.gov (United States)

    Nathan J. Poage; Paul D. Anderson

    2007-01-01

    We review 12 large-scale silviculture experiments (LSSEs) in western Washington and Oregon with which the Pacific Northwest Research Station of the USDA Forest Service is substantially involved. We compiled and arrayed information about the LSSEs as a series of matrices in a relational database, which is included on the compact disc published with this report and...

  8. Large-scale computation in solid state physics - Recent developments and prospects

    International Nuclear Information System (INIS)

    DeVreese, J.T.

    1985-01-01

    During the past few years, an increasing interest in large-scale computation has been developing. Several initiatives have been taken to evaluate and exploit the potential of ''supercomputers'' like the CRAY-1 (or XMP) or the CYBER-205. In the U.S.A., the Lax report appeared first, in 1982, and subsequently (1984) the National Science Foundation announced a program to promote large-scale computation at the universities. Also, in Europe several CRAY and CYBER-205 systems have been installed. Although the presently available mainframes are the result of continuous growth in speed and memory, they might have induced a discontinuous transition in the evolution of the scientific method: between theory and experiment, a third methodology, ''computational science'', has become or is becoming operational.

  9. Environmental research program. 1995 Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, N.J.

    1996-06-01

    The objective of the Environmental Research Program is to enhance the understanding of, and mitigate the effects of pollutants on health, ecological systems, global and regional climate, and air quality. The program is multidisciplinary and includes fundamental research and development in efficient and environmentally benign combustion, pollutant abatement and destruction, and novel methods of detection and analysis of criteria and noncriteria pollutants. This diverse group conducts investigations in combustion, atmospheric and marine processes, flue-gas chemistry, and ecological systems. Combustion chemistry research emphasizes modeling at microscopic and macroscopic scales. At the microscopic scale, functional sensitivity analysis is used to explore the nature of the potential-to-dynamics relationships for reacting systems. Rate coefficients are estimated using quantum dynamics and path integral approaches. At the macroscopic level, combustion processes are modelled using chemical mechanisms at the appropriate level of detail dictated by the requirements of predicting particular aspects of combustion behavior. Parallel computing has facilitated the efforts to use detailed chemistry in models of turbulent reacting flow to predict minor species concentrations.

  10. Dose monitoring in large-scale flowing aqueous media

    International Nuclear Information System (INIS)

    Kuruca, C.N.

    1995-01-01

    The Miami Electron Beam Research Facility (EBRF) has been in operation for six years. The EBRF houses a 1.5 MV, 75 KW DC scanned electron beam. Experiments have been conducted to evaluate the effectiveness of high-energy electron irradiation in the removal of toxic organic chemicals from contaminated water and the disinfection of various wastewater streams. The large-scale plant operates at approximately 450 L/min (120 gal/min). The radiation dose absorbed by the flowing aqueous streams is estimated by measuring the difference in water temperature before and after it passes in front of the beam. Temperature measurements are made using resistance temperature devices (RTDs) and recorded by computer along with other operating parameters. Estimated dose is obtained from the measured temperature differences using the specific heat of water. This presentation will discuss experience with this measurement system, its application to different water presentation devices, sources of error, and the advantages and disadvantages of its use in large-scale process applications
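
    The calorimetric estimate rests on the definition of absorbed dose (1 Gy = 1 J/kg): assuming negligible heat loss between the two temperature measurements, the dose follows directly from the specific heat of water,

        D \;=\; c_p\,\Delta T \;\approx\; 4186\ \mathrm{J\,kg^{-1}\,K^{-1}} \times \Delta T,

    so a temperature rise of 1 °C corresponds to roughly 4.2 kGy, or about 0.24 °C per kGy.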

  11. The Ownership Structure Dilemma and its Implications on the Transition from Small-Scale to Large-Scale Electric Road Systems

    OpenAIRE

    BEDNARCIK ABDULHADI, EMMA; VITEZ, MARINA

    2016-01-01

    This master's thesis was written on behalf of KTH Royal Institute of Technology and the Swedish National Road and Transport Research Institute (VTI). The study investigates how infrastructure ownership could affect the transition from small-scale to large-scale electric road systems (ERS) and how infrastructure ownership affects the foreseen future roles of the ERS stakeholders. The authors have used a qualitative research method, including a literature study within the areas of infrastructure t...

  12. New Approaches for Very Large-Scale Integer Programming

    Science.gov (United States)

    2016-06-24

    The focus of this project is new computational ... heuristics for integer programs in order to rapidly improve dual bounds. 2. Choosing good branching variables in branch-and-bound algorithms for MIP. 3. ... Subject terms: integer programming, algorithms, parallel processing, machine learning, heuristics.
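
    One of the listed themes, choosing branching variables, is easy to illustrate: the classic most-fractional rule branches on the integer variable whose LP-relaxation value is furthest from an integer (a textbook baseline, not the project's strategy):

        import numpy as np

        def most_fractional(x_relaxed, integer_vars):
            # Return the index of the integer variable to branch on.
            vals = x_relaxed[integer_vars]
            frac = np.abs(vals - np.round(vals))
            return int(integer_vars[np.argmax(frac)])

        x = np.array([0.0, 0.5, 0.9, 2.2])
        print(most_fractional(x, np.array([1, 2, 3])))   # -> 1 (value 0.5)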

  13. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    This paper discusses topics related to automating the parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability, and this structural observability analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems, some of which may be observable (with respect to parameters, disturbances, and states) while others may not. The state estimation process is carried out for the observable subsystems, and the optimal number of additional measurements is prescribed for the unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators in Python to estimate parameters and disturbances, in addition to states, based on the available measurements.
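
    For the linearized case, the test behind such an observability analysis is the familiar Kalman rank condition; a minimal NumPy version is below (the paper additionally treats structural observability via Modelica models, which this sketch does not cover):

        import numpy as np

        def is_observable(A, C):
            # Kalman rank test: stack C, CA, ..., CA^(n-1) and check full rank.
            n = A.shape[0]
            blocks = [C]
            for _ in range(n - 1):
                blocks.append(blocks[-1] @ A)
            return np.linalg.matrix_rank(np.vstack(blocks)) == n

        A = np.array([[1.0, 0.1],
                      [0.0, 1.0]])
        C = np.array([[1.0, 0.0]])
        print(is_observable(A, C))   # True: position measurements reveal velocity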

  14. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    Conjugated polymers' potential to both absorb light and transport current, together with the prospect of low-cost, large-scale production, has made these kinds of materials attractive in solar cell research. The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and stability, together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed. Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. Synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically

  15. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large-scale networks is approached from the perspective of the network planner. An analysis of the long-term planning problems is presented, with the main focus on the changing requirements for large-scale networks and the potential problems in meeting these requirements. The problems ... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large-scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its...

  16. Benchmarking processes for managing large international space programs

    Science.gov (United States)

    Mandell, Humboldt C., Jr.; Duke, Michael B.

    1993-01-01

    The relationship between management style and program costs is analyzed to determine the feasibility of financing large international space missions. The incorporation of management systems is considered essential to realizing low-cost spacecraft and planetary surface systems. Several companies were studied, ranging from the large Lockheed 'Skunk Works' to small companies including Space Industries, Inc., Rocket Research Corp., and Orbital Sciences Corp. It is concluded that to lower prices, the ways in which spacecraft and hardware are developed must be changed. Benchmarking of successful low-cost space programs has revealed a number of prescriptive rules for low-cost management, including major changes in the relationships between the public and private sectors.

  17. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel, high-quality research approaches that relate the quality of software architecture to system requirements, system architecture and enterprise architecture, or software testing. Modern software

  18. Large scale 2D spectral compressed sensing in continuous domain

    KAUST Repository

    Cai, Jian-Feng

    2017-06-20

    We consider the problem of spectral compressed sensing in continuous domain, which aims to recover a 2-dimensional spectrally sparse signal from partially observed time samples. The signal is assumed to be a superposition of s complex sinusoids. We propose a semidefinite program for the 2D signal recovery problem. Our model is able to handle large scale 2D signals of size 500 × 500, whereas traditional approaches only handle signals of size around 20 × 20.
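
    As a concrete, scaled-down illustration of such a semidefinite program, the sketch below solves the standard 1D atomic-norm formulation with cvxpy (an assumed dependency); the paper's 2D program and its ability to reach 500 × 500 signals are not reproduced here, and all sizes are toy.

```python
import numpy as np
import cvxpy as cp

n, s = 16, 2
rng = np.random.default_rng(0)
freqs = rng.uniform(0, 1, s)
amps = rng.standard_normal(s) + 1j * rng.standard_normal(s)
x_true = sum(a * np.exp(2j * np.pi * f * np.arange(n))
             for a, f in zip(amps, freqs))
obs = np.sort(rng.choice(n, 12, replace=False))  # observed sample indices

# Atomic-norm minimization as an SDP: the bordered matrix [[T, x], [x*, t]]
# must be PSD with T Hermitian Toeplitz; the signal x sits in the border.
M = cp.Variable((n + 1, n + 1), hermitian=True)
cons = [M >> 0]
for k in range(1, n):                    # equal entries along each diagonal
    for i in range(1, n - k):
        cons.append(M[i, i + k] == M[0, k])
for i in range(1, n):
    cons.append(M[i, i] == M[0, 0])
x = M[:n, n]
cons.append(x[obs] == x_true[obs])       # match the observed time samples

objective = cp.Minimize(cp.real(cp.trace(M[:n, :n])) / (2 * n)
                        + cp.real(M[n, n]) / 2)
cp.Problem(objective, cons).solve()
print("max recovery error:", np.abs(x.value - x_true).max())
```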

  19. Large scale 2D spectral compressed sensing in continuous domain

    KAUST Repository

    Cai, Jian-Feng; Xu, Weiyu; Yang, Yang

    2017-01-01

    We consider the problem of spectral compressed sensing in continuous domain, which aims to recover a 2-dimensional spectrally sparse signal from partially observed time samples. The signal is assumed to be a superposition of s complex sinusoids. We propose a semidefinite program for the 2D signal recovery problem. Our model is able to handle large scale 2D signals of size 500 × 500, whereas traditional approaches only handle signals of size around 20 × 20.

  20. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT]; Marzouk, Youssef [MIT]

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  1. Jointly Sponsored Research Program on Energy Related Research

    Energy Technology Data Exchange (ETDEWEB)

    No, author

    2013-12-31

    Cooperative Agreement DE-FC26-08NT43293, the DOE-WRI Cooperative Research and Development Program for Fossil Energy-Related Resources, began in June 2009. The goal of the Program was to develop, commercialize, and deploy technologies of value to the nation's fossil and renewable energy industries. To ensure relevancy and early commercialization, the involvement of an industrial partner was encouraged. In that regard, the Program stipulated that a minimum of 20% cost share be achieved in a fiscal year. This allowed WRI to carry a diverse portfolio of technologies and projects at various technology readiness levels. Depending upon the maturity of the research concept and technology, cost share for a given task ranged from none to as high as 67% (two-thirds). Over the course of the Program, a total of twenty-six tasks were proposed for DOE approval. Over the period of performance of the Cooperative Agreement, WRI put in place projects utilizing a total of $7,089,581 in USDOE funds. Against this funding, cosponsors committed $7,398,476 in private funds to produce a program valued at $14,488,057. Tables 1 and 2, presented at the end of this section, compile the funding for all the tasks conducted under the program. The goal of the Cooperative Research and Development Program for Fossil Energy-Related Resources was, through collaborative research with industry, to develop or assist in the development of innovative technology solutions that will: • Increase the production of United States energy resources – coal, natural gas, oil, and renewable energy resources; • Enhance the competitiveness of United States energy technologies in international markets and assist in technology transfer; • Reduce the nation's dependence on foreign energy supplies and strengthen both the United States and regional economies; and • Minimize environmental impacts of energy production and utilization. Success of the Program can be measured by

  2. Performance of lap splices in large-scale column specimens affected by ASR and/or DEF-extension phase.

    Science.gov (United States)

    2015-03-01

    A large experimental program, consisting of the design, construction, curing, exposure, and structural load testing of 16 large-scale column specimens with a critical lap splice region that were influenced by varying stages of alkali-silica react...

  3. Time Discounting and Credit Market Access in a Large-Scale Cash Transfer Programme

    Science.gov (United States)

    Handa, Sudhanshu; Martorano, Bruno; Halpern, Carolyn; Pettifor, Audrey; Thirumurthy, Harsha

    2017-01-01

    Time discounting is thought to influence decision-making in almost every sphere of life, including personal finances, diet, exercise, and sexual behavior. In this article we provide evidence on whether a national poverty alleviation program in Kenya can affect inter-temporal decisions. We administered a preferences module as part of a large-scale impact evaluation of the Kenyan Government's Cash Transfer for Orphans and Vulnerable Children. Four years into the program, we find that individuals in the treatment group are only marginally more likely to wait for future money, due in part to the erosion of the value of the transfer by inflation. However, among the poorest households, for whom the value of the transfer is still relatively large, we find significant program effects on the propensity to wait. We also find strong program effects among those who have access to credit markets, though the program itself does not improve access to credit. PMID:28260842

  4. Time Discounting and Credit Market Access in a Large-Scale Cash Transfer Programme.

    Science.gov (United States)

    Handa, Sudhanshu; Martorano, Bruno; Halpern, Carolyn; Pettifor, Audrey; Thirumurthy, Harsha

    2016-06-01

    Time discounting is thought to influence decision-making in almost every sphere of life, including personal finances, diet, exercise, and sexual behavior. In this article we provide evidence on whether a national poverty alleviation program in Kenya can affect inter-temporal decisions. We administered a preferences module as part of a large-scale impact evaluation of the Kenyan Government's Cash Transfer for Orphans and Vulnerable Children. Four years into the program, we find that individuals in the treatment group are only marginally more likely to wait for future money, due in part to the erosion of the value of the transfer by inflation. However, among the poorest households, for whom the value of the transfer is still relatively large, we find significant program effects on the propensity to wait. We also find strong program effects among those who have access to credit markets, though the program itself does not improve access to credit.

  5. Novel material and structural design for large-scale marine protective devices

    International Nuclear Information System (INIS)

    Qiu, Ang; Lin, Wei; Ma, Yong; Zhao, Chengbi; Tang, Youhong

    2015-01-01

    Highlights: • Large-scale protective devices with different structural designs have been optimized. • Large-scale protective devices with novel material designs have been optimized. • Protective devices constructed of sandwich panels have the best anti-collision performance. • Protective devices with novel material design can reduce weight and construction cost. - Abstract: Large-scale protective devices must endure the impact of severe forces, large structural deformation, increased stress and strain-rate effects, and multiple coupling effects. In evaluating the safety of a conceptual design through simulation, the key parameters considered in this research are the maximum impact force, the energy dissipated by the impactor (e.g. a ship), the energy absorbed by the device, and the impactor stroke. During impact, the main function of the ring beam structure is to resist and buffer the impact force between the ship and the bridge pile caps, guaranteeing that the magnitude of the impact force meets the corresponding requirements. Anti-collision performance can be improved by increasing the strength of the beam section or by replacing the steel with novel fiber-reinforced polymer laminates. The main function of the buoyancy tank is to absorb and transfer the ship's kinetic energy through large plastic deformation, damage, or friction occurring within itself. The energy absorption effect can be improved by structural optimization or by the use of new sandwich panels. Structural and material optimization schemes are proposed on the basis of the conceptual design in this research, and protective devices constructed of sandwich panels prove to have the best anti-collision performance

  6. Coordinating a Large, Amalgamated REU Program with Multiple Funding Sources

    Science.gov (United States)

    Fiorini, Eugene; Myers, Kellen; Naqvi, Yusra

    2017-01-01

    In this paper, we discuss the challenges of organizing a large REU program amalgamated from multiple funding sources, including diverse participants, mentors, and research projects. We detail the program's structure, activities, and recruitment, and we hope to demonstrate that the organization of this REU is not only beneficial to its…

  7. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”), and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints, and their potential to explain the large scale cosmic anomalies.

  8. Sustainability of donor programs: evaluating and informing the transition of a large HIV prevention program in India to local ownership

    Directory of Open Access Journals (Sweden)

    Sara Bennett

    2011-12-01

    Full Text Available Sustainability is the holy grail of many development projects, yet there is limited evidence about strategies that effectively support the transition of programs from donor funding to national governments. The first phase of Avahan, the India AIDS Initiative supported by the Bill and Melinda Gates Foundation (2003–2009), aimed to demonstrate an HIV/AIDS prevention program at scale, primarily targeted at high-risk groups. During the second phase (2009–2013), this large-scale program will be transitioned to its natural owners: the Government of India and local communities. This paper describes the evaluation design for the Avahan transition strategy. A detailed logic model for the transition was developed. The Avahan transition strategy focuses on three activities: (1) enhancing capacities among communities, non-governmental organizations (NGOs), and government entities, in line with India's national AIDS control strategy; (2) aligning technical and managerial aspects of Avahan programs with government norms and standards; and (3) promoting and sustaining commitment to services for most-at-risk populations. It is anticipated that programs will then transfer smoothly to government and community ownership, become institutionalized within the government system, and support a sustained HIV/AIDS response. The research design evaluates the implementation and effectiveness of (1) activities undertaken by the program; (2) intermediate effects, including the process of institutionalization and the extent to which key Avahan organizational procedures and behaviors are integrated into government systems; and (3) overarching effects, namely the impact of the transition process on the sustained delivery of HIV/AIDS prevention services to high-risk groups. Both qualitative and quantitative research approaches are employed so that the evaluation will both assess outcomes and explain why they have occurred. It is unusual for donor-supported projects in low- and middle

  9. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied

  10. Large-scale hydropower system optimization using dynamic programming and object-oriented programming: the case of the Northeast China Power Grid.

    Science.gov (United States)

    Li, Ji-Qing; Zhang, Yu-Shan; Ji, Chang-Ming; Wang, Ai-Jing; Lund, Jay R

    2013-01-01

    This paper examines long-term optimal operation using dynamic programming for a large hydropower system of 10 reservoirs in Northeast China. Besides considering flow and hydraulic head, the optimization explicitly includes time-varying electricity market prices to maximize benefit. Two techniques are used to reduce the 'curse of dimensionality' of dynamic programming with many reservoirs. Discrete differential dynamic programming (DDDP) reduces the search space and the computer memory needed. Object-oriented programming (OOP) and the ability to dynamically allocate and release memory with the C++ language greatly reduce the cumulative demand on computer memory in solving multi-dimensional dynamic programming models. The case study shows that the model can reduce the 'curse of dimensionality' and achieve satisfactory results.
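
    The backward recursion at the heart of such a model fits in a few lines. Below is a toy single-reservoir sketch in Python with invented inflows, prices, and release bounds; the paper's implementation is in C++ with OOP across ten reservoirs, and DDDP would additionally restrict the storage grid to a corridor around an incumbent trajectory at each iteration.

```python
import numpy as np

T, S = 12, 21                          # monthly stages, discrete storage levels
rng = np.random.default_rng(1)
inflow = rng.uniform(10, 30, T)        # hypothetical monthly inflows
price = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(T) / 12)  # seasonal price
levels = np.linspace(0, 100, S)        # storage grid

value = np.zeros(S)                    # terminal value function
policy = np.zeros((T, S), dtype=int)   # best next storage level per state
for t in reversed(range(T)):
    new_value = np.full(S, -np.inf)
    for i, s in enumerate(levels):
        for j, s_next in enumerate(levels):
            release = s + inflow[t] - s_next
            if 0.0 <= release <= 40.0:             # release/turbine bounds
                v = price[t] * release + value[j]  # head effects ignored here
                if v > new_value[i]:
                    new_value[i], policy[t, i] = v, j
    value = new_value

print("optimal revenue starting from a full reservoir:", round(value[-1], 2))
```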

  11. Fast and accurate solution for the SCUC problem in large-scale power systems using adapted binary programming and enhanced dual neural network

    International Nuclear Information System (INIS)

    Shafie-khah, M.; Moghaddam, M.P.; Sheikh-El-Eslami, M.K.; Catalão, J.P.S.

    2014-01-01

    Highlights: • A novel hybrid method based on decomposition of SCUC into QP and BP problems is proposed. • An adapted binary programming and an enhanced dual neural network model are applied. • The proposed EDNN is exactly convergent to the global optimal solution of QP. • An AC power flow procedure is developed for including contingency/security issues. • It is suited for large-scale systems, providing both accurate and fast solutions. - Abstract: This paper presents a novel hybrid method for solving the security constrained unit commitment (SCUC) problem. The proposed formulation requires much less computation time in comparison with other methods while assuring the accuracy of the results. Furthermore, the framework provided here allows including an accurate description of warmth-dependent startup costs, valve point effects, multiple fuel costs, forbidden zones of operation, and AC load flow bounds. To solve the nonconvex problem, an adapted binary programming method and an enhanced dual neural network model are utilized as optimization tools, and a procedure for AC power flow modeling is developed for including contingency/security issues, as new contributions to earlier studies. Unlike classical SCUC methods, the proposed method allows the unit commitment problem to be solved while simultaneously complying with the network limits. In addition to conventional test systems, a real-world large-scale power system with 493 units has been used to fully validate the effectiveness of the novel hybrid method proposed
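
    For orientation, the sketch below shows the commitment-plus-dispatch structure of the problem at toy scale, solved by brute force over the binary on/off schedules with a merit-order dispatch; the paper's contribution is precisely to replace this enumeration (adapted binary programming) and the inner dispatch (enhanced dual neural network) with scalable counterparts. All numbers are invented and no network constraints are modeled.

```python
import itertools
import numpy as np

pmax  = np.array([100.0, 80.0, 50.0])    # unit capacities (MW)
mcost = np.array([20.0, 25.0, 40.0])     # marginal costs ($/MWh)
fixed = np.array([500.0, 300.0, 100.0])  # no-load cost per hour when on
load  = np.array([120.0, 180.0, 210.0, 150.0])  # hourly demand (MW)

def hour_cost(on, demand):
    """Merit-order dispatch of the committed units; inf if infeasible."""
    total, remaining = 0.0, demand
    for u in np.argsort(mcost):
        if on[u]:
            g = min(pmax[u], remaining)
            total += mcost[u] * g + fixed[u]
            remaining -= g
    return total if remaining <= 1e-9 else np.inf

best, best_schedule = np.inf, None
for bits in itertools.product([0, 1], repeat=3 * len(load)):
    s = np.array(bits).reshape(len(load), 3)
    c = sum(hour_cost(s[t], load[t]) for t in range(len(load)))
    if c < best:
        best, best_schedule = c, s

print("minimum cost:", best)
print(best_schedule)
```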

  12. Maps4Science - National Roadmap for Large-Scale Research Facilities 2011 (NWO Application form)

    NARCIS (Netherlands)

    Van Oosterom, P.J.M.; Van der Wal, T.; De By, R.A.

    2011-01-01

    The Netherlands is historically known as one of the world's best-measured countries. It is continuing this tradition today with unequalled new datasets, such as the nationwide large-scale topographic map and our unique digital height map (nationwide coverage; ten very accurate 3D points for every Dutch m²)

  13. Implementing Projects in Calculus on a Large Scale at the University of South Florida

    Science.gov (United States)

    Fox, Gordon A.; Campbell, Scott; Grinshpan, Arcadii; Xu, Xiaoying; Holcomb, John; Bénéteau, Catherine; Lewis, Jennifer E.; Ramachandran, Kandethody

    2017-01-01

    This paper describes the development of a program of project-based learning in Calculus courses at a large urban research university. In this program, students developed research projects in consultation with a faculty advisor in their major, and supervised by their calculus instructors. Students wrote up their projects in a prescribed format…

  14. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) A large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully manufactured using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of existing mother tubes manufactured with small-scale cans, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, manufacturing mother tubes using large-scale hollow capsules is promising. (author)

  15. Research waste management program - An action proposal

    International Nuclear Information System (INIS)

    Costa Ramos, A.; Esposito, I.

    1997-01-01

    The Brazilian Nuclear Energy Commission planned, prepared, and established a Research Waste Management Program, started in 1996, in order to map, analyze, and solve common problems in the research field. The specific study included a large number of academic institutions. This article discusses the procedures, results, and operational methodology used by the team linked to the Program in one of the research institutions studied, where corrective actions were implemented to avoid unnecessary doses to the public. (author)

  16. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  17. Scale up of nutrition and health programs in Ethiopia and their overlap with reductions in child stunting.

    Science.gov (United States)

    Wirth, James P; Matji, Joan; Woodruff, Bradley A; Chamois, Sylvie; Getahun, Zewditu; White, Jessica M; Rohner, Fabian

    2017-04-01

    The prevalence of stunting in Sub-Saharan Africa has changed little since 2000, and the number of stunted children has increased. In contrast, Ethiopia is an example where the national stunting prevalence and the number of stunted children have decreased consistently. We compare regional differences and temporal patterns in stunting with large-scale program coverage to identify where and when programs may have led to reductions in stunting. Data from three national demographic and health surveys and population statistics illustrate, at the regional level, where and when the prevalence and number of stunted children changed since 2000. Reports from large-scale nutrition and health programs were used to identify ecologic associations between geographic program coverage and reductions in stunting. From 2000 to 2005, the decline in the national stunting prevalence was mainly a result of reductions in Oromiya, SNNP, and Tigray. Few nutrition programs had high coverage during this time, and economic growth may have contributed to stunting reduction by increasing household wealth and investments in sanitation. From 2005 to 2011, declines in stunting prevalence in Amhara, SNNP, Somali, and Oromiya were largely responsible for national reductions. Numerous programs were implemented at scale and could plausibly have improved stunting. While ecologic relationships suggest that economic growth and large-scale programs may have contributed to the reduction in stunting in Ethiopia, stunting did not decrease in all regions despite increased program coverage and expansion of the health system. Additional impact evaluations are needed to identify the most effective programs to accelerate the reduction in the prevalence and number of stunted children. © 2016 John Wiley & Sons Ltd.

  18. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    Science.gov (United States)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole-system aircraft simulation and whole-system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  19. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, or provide future scenarios of water resources. With the aim of better understanding hydrological changes, it is of crucial importance to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling, and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach
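
    The scale-wise regression idea can be sketched compactly. The code below, assuming PyWavelets and synthetic stand-ins for the SLP index and streamflow, builds a wavelet multiresolution decomposition of both series and fits one linear model per scale; the paper's actual ESD model, predictor domain, and validation are much richer.

```python
import numpy as np
import pywt

# Synthetic stand-ins for a monthly large-scale predictor (SLP index)
# and a local predictand (streamflow); both share a slow oscillation.
rng = np.random.default_rng(2)
n = 512
t = np.arange(n)
slp_index  = np.sin(2 * np.pi * t / 64) + 0.5 * rng.standard_normal(n)
streamflow = 2.0 * np.sin(2 * np.pi * t / 64) + rng.standard_normal(n)

def mra(x, wavelet="db4", level=4):
    """Additive multiresolution components (approximation + details)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    comps = []
    for k in range(len(coeffs)):
        keep = [c if i == k else np.zeros_like(c) for i, c in enumerate(coeffs)]
        comps.append(pywt.waverec(keep, wavelet)[: len(x)])
    return comps

# One linear model per scale, then sum the scale-wise predictions.
downscaled = np.zeros(n)
for p, q in zip(mra(slp_index), mra(streamflow)):
    slope, intercept = np.polyfit(p, q, 1)
    downscaled += slope * p + intercept

print("correlation with observed flow:",
      round(np.corrcoef(downscaled, streamflow)[0, 1], 2))
```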

  20. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  1. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
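
    The decomposition just described reduces to a short recipe: low-pass filter the signal, take the Hilbert envelope of the small-scale remainder, filter the envelope back to the large scales, and correlate. The sketch below, assuming numpy/scipy and a synthetic modulated signal, computes such an amplitude-modulation coefficient; the cutoff and noise levels are invented, not those of the hot-wire data.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(3)
fs, n = 1000.0, 2 ** 14
t = np.arange(n) / fs
large = np.sin(2 * np.pi * 2.0 * t)                       # large-scale motion
small = 0.2 * (1 + 0.5 * large) * rng.standard_normal(n)  # modulated small scales
u = large + small

# Scale decomposition with an (arbitrary) 10 Hz low-pass cutoff.
b, a = butter(4, 10.0 / (fs / 2), btype="low")
u_large = filtfilt(b, a, u)
u_small = u - u_large

envelope = np.abs(hilbert(u_small))       # small-scale amplitude envelope
envelope_large = filtfilt(b, a, envelope)
R = np.corrcoef(u_large, envelope_large)[0, 1]
print(f"amplitude-modulation coefficient R = {R:.2f}")
```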

  2. Building Participation in Large-scale Conservation: Lessons from Belize and Panama

    Directory of Open Access Journals (Sweden)

    Jesse Guite Hastings

    2015-01-01

    Full Text Available Motivated by biogeography and a desire for alignment with the funding priorities of donors, the twenty-first century has seen big international NGOs shifting towards a large-scale conservation approach. This shift has meant that even before stakeholders at the national and local scale are involved, conservation programmes often have their objectives defined and funding allocated. This paper uses the experiences of Conservation International's Marine Management Area Science (MMAS) programme in Belize and Panama to explore how to build participation at the national and local scale while working within the bounds of the current conservation paradigm. Qualitative data about MMAS was gathered through a multi-sited ethnographic research process, utilising document review, direct observation, and semi-structured interviews with 82 informants in Belize, Panama, and the United States of America. Results indicate that while a large-scale approach to conservation disadvantages early national and local stakeholder participation, this effect can be mediated through focusing engagement efforts, paying attention to context, building horizontal and vertical partnerships, and using deliberative processes that promote learning. While explicit consideration of geopolitics and local complexity alongside biogeography in the planning phase of a large-scale conservation programme is ideal, actions taken by programme managers during implementation can still have a substantial impact on conservation outcomes.

  3. An industrial perspective on bioreactor scale-down: what we can learn from combined large-scale bioprocess and model fluid studies.

    Science.gov (United States)

    Noorman, Henk

    2011-08-01

    For industrial bioreactor design, operation, control, and optimization, the scale-down approach is often advocated to efficiently generate data on a small scale and effectively apply suggested improvements to the industrial scale. In all cases it is important to ensure that the scale-down conditions are representative of the real large-scale bioprocess. Progress is hampered by limited detailed and local information from large-scale bioprocesses. Complementary to real fermentation studies, physical aspects of model fluids such as air-water in large bioreactors provide useful information with limited effort and cost. Still, in industrial practice, investments of time, capital, and resources often prohibit systematic work, although, in the end, savings obtained in this way are trivial compared to the expenses that result from real process disturbances, batch failures, and non-flyers with loss of business opportunity. Here we try to highlight what can be learned from real large-scale bioprocesses in combination with model fluid studies, and to provide suitable computation tools to overcome data restrictions. Focus is on a specific well-documented case for a 30-m³ bioreactor. Areas for further research from an industrial perspective are also indicated. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Undergraduate Research Program Between SCU and SOFIA

    Science.gov (United States)

    Kulas, Kristin Rose; Andersson, B.-G.

    2018-06-01

    We present results on an undergraduate research program run in collaboration between Santa Clara University (SCU), a predominantly undergraduate liberal arts college, and the SOFIA Science Center/USRA. We have started a synergistic program between SCU and SOFIA (located at NASA Ames) where students are able to be fully immersed in astronomical research: from helping to write telescope observing proposals, to observing at a world-class telescope, to reducing and analyzing the data they acquired, and ultimately to presenting and publishing their findings. A recently awarded NSF collaborative grant will allow us to execute and expand this program over the next several years. In this poster we present some of our students' research and their successes after the program. In addition, we discuss how a small university can actively collaborate with a large government-funded program like SOFIA, funded by NASA.

  5. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination, and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
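
    As a point of reference for the algorithms surveyed, here is a minimal constant-velocity Kalman filter of the kind that sits inside most blob trackers, assuming numpy; a multi-hypothesis tracker wraps many such filters in a data-association layer. The trajectory and noise levels are invented.

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)    # state: [x, y, vx, vy]
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)    # measure position only
Q = 0.01 * np.eye(4)                   # process noise
R = 0.5 * np.eye(2)                    # measurement noise

x, P = np.zeros(4), np.eye(4)
rng = np.random.default_rng(4)
for k in range(50):
    truth = np.array([k * 1.0, k * 0.5])       # straight-line target
    z = truth + rng.normal(0, 0.7, 2)          # noisy detection
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P

print("final state estimate [x, y, vx, vy]:", x.round(2))
```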

  6. Research data management support for large-scale, long-term, interdisciplinary collaborative research centers with a focus on environmental sciences

    Science.gov (United States)

    Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.

    2017-12-01

    Science conducted in collaborative, cross-institutional research projects requires active sharing of research ideas, data, documents, and further information in a well-managed, controlled, and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange, and re-use of data as results of scientific output. Additionally, publications, conference contributions, reports, pictures, etc. should also be managed. Both knowledge and data sharing are essential to create synergies. Within the coordinated programme 'Collaborative Research Center' (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects). These CRCs address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, a RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) 'Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation' (www.tr32.de). The experiences gained have been used to arrange RDM services for the CRC1211 'Earth - Evolution at the Dry Limit' (www.crc1211.de), funded since 2016. In both projects scientists from various disciplines collect heterogeneous data at field campaigns or by modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) has been designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211 needs (www.crc1211db.uni-koeln.de) in 2016. Both

  7. Child Development Program Evaluation Scale.

    Science.gov (United States)

    Fiene, Richard J.

    The Child Development Program Evaluation Scale (CDPES) is actually two scales in one: a licensing scale and a quality scale. Licensing predictor items have been found to predict the overall compliance of child day care centers with state regulations in four states. Quality scale items have been found to predict the overall quality of child day care…

  8. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure facilitates the trust issues that arise in a large-scale healthcare network, including multi-domain PKI infrastructures.
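
    The building blocks of such an infrastructure can be sketched with the Python cryptography package (an assumed dependency): a regional CA issues a certificate to a healthcare unit, and the chain is verified against the CA key. The names and validity periods are placeholders, and real deployments add revocation, path building, and cross-domain trust.

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

def name(cn):
    return x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, cn)])

now = datetime.datetime.now(datetime.timezone.utc)

# Regional CA; a multi-domain deployment would have one per domain.
ca_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
ca_cert = (
    x509.CertificateBuilder()
    .subject_name(name("Regional Health CA")).issuer_name(name("Regional Health CA"))
    .public_key(ca_key.public_key()).serial_number(x509.random_serial_number())
    .not_valid_before(now).not_valid_after(now + datetime.timedelta(days=3650))
    .add_extension(x509.BasicConstraints(ca=True, path_length=0), critical=True)
    .sign(ca_key, hashes.SHA256())
)

# End-entity certificate for one healthcare unit, signed by the CA.
unit_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
unit_cert = (
    x509.CertificateBuilder()
    .subject_name(name("clinic-17.health.example")).issuer_name(ca_cert.subject)
    .public_key(unit_key.public_key()).serial_number(x509.random_serial_number())
    .not_valid_before(now).not_valid_after(now + datetime.timedelta(days=365))
    .add_extension(x509.BasicConstraints(ca=False, path_length=None), critical=True)
    .sign(ca_key, hashes.SHA256())
)

# Chain check: the unit certificate must verify under the CA public key.
ca_cert.public_key().verify(
    unit_cert.signature, unit_cert.tbs_certificate_bytes,
    padding.PKCS1v15(), unit_cert.signature_hash_algorithm,
)
print("certificate chain verified")
```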

  9. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

    The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand challenge class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods developed and algorithms will, however, be of wider interest.
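
    One of the simplest members of the family mentioned above, a Jacobi (diagonal) preconditioner inside conjugate gradients, is sketched below with numpy on a 1D Poisson matrix standing in for a finite-difference system; it shows the algorithmic pattern only, not the distributed-memory implementation the program targets.

```python
import numpy as np

def jacobi_pcg(A, b, tol=1e-8, max_iter=500):
    """Conjugate gradients with a Jacobi (diagonal) preconditioner."""
    M_inv = 1.0 / np.diag(A)          # preconditioner: inverse diagonal
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# 1D Poisson test matrix (SPD, structured like small FD/FE systems).
n = 200
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = jacobi_pcg(A, b)
print("residual norm:", np.linalg.norm(b - A @ x))
```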

  10. Safe Patient Handling and Mobility: Development and Implementation of a Large-Scale Education Program.

    Science.gov (United States)

    Lee, Corinne; Knight, Suzanne W; Smith, Sharon L; Nagle, Dorothy J; DeVries, Lori

    This article addresses the development, implementation, and evaluation of an education program for safe patient handling and mobility at a large academic medical center. The ultimate goal of the program was to increase safety during patient mobility/transfer and reduce nursing staff injury from lifting/pulling. This comprehensive program was designed on the basis of the principles of prework, application, and support at the point of care. A combination of online learning, demonstration, skill evaluation, and coaching at the point of care was used to achieve the goal. Specific roles and responsibilities were developed to facilitate implementation. It took 17 master trainers, 88 certified trainers, 176 unit-based trainers, and 98 coaches to put 3706 nurses and nursing assistants through the program. Evaluations indicated both an increase in knowledge about safe patient handling and an increased ability to safely mobilize patients. The challenge now is sustainability of safe patient-handling practices and the growth and development of trainers and coaches.

  11. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  12. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  13. Transitioning a Large Scale HIV/AIDS Prevention Program to Local Stakeholders: Findings from the Avahan Transition Evaluation.

    Directory of Open Access Journals (Sweden)

    Sara Bennett

    Full Text Available Between 2009 and 2013 the Bill and Melinda Gates Foundation transitioned its HIV/AIDS prevention initiative in India from being a stand-alone program outside of government to being fully government funded and implemented. We present an independent prospective evaluation of the transition. The evaluation drew upon (1) a structured survey of transition readiness in a sample of 80 targeted HIV prevention programs prior to transition; (2) a structured survey assessing institutionalization of program features in a sample of 70 targeted intervention (TI) programs, one year post-transition; and (3) case studies of 15 TI programs. Transition was conducted in 3 rounds. While the 2009 transition round was problematic, subsequent rounds were implemented more smoothly. In the 2011 and 2012 transition rounds, Avahan programs were well prepared for transition, with the large majority of TI program staff trained for transition, high alignment with government clinical, financial, and managerial norms, and strong government commitment to the program. One year post-transition there were significant program changes, but these were largely perceived positively. Notable negative changes were: limited flexibility in program management, delays in funding, commodity stock-outs, and community member perceptions of a narrowing in program focus. Service coverage outcomes were sustained at least six months post-transition. The study suggests that significant investments in transition preparation contributed to a smooth transition and sustained service coverage. Notwithstanding, there were substantive program changes post-transition. Five key lessons for transition design and implementation are identified.

  14. Evaluation of Large-Scale Public-Sector Reforms: A Comparative Analysis

    Science.gov (United States)

    Breidahl, Karen N.; Gjelstrup, Gunnar; Hansen, Hanne Foss; Hansen, Morten Balle

    2017-01-01

    Research on the evaluation of large-scale public-sector reforms is rare. This article sets out to fill that gap in the evaluation literature and argues that it is of vital importance since the impact of such reforms is considerable and they change the context in which evaluations of other and more delimited policy areas take place. In our…

  15. Vercors: a 1/3 scaled mockup and an ambitious research program to better understand the different mechanisms of leakage and aging

    International Nuclear Information System (INIS)

    Masson, B.; Galenne, E.; Oukhemanou, E.; Aubry, C.; Laou-Sio-Hoi, G.

    2015-01-01

    This paper presents the new EDF research program on an NPP containment building. It is based on a very large mock-up, at scale 1:3, of a PWR reactor containment. The design choices were guided by the mock-up's objectives: representativeness (the mock-up has to be as close as possible to the real containment building in order to facilitate the transfer of results from the test case to the NPP cases) and accelerated aging (the proposed solution is to construct the mock-up at a reduced scale (1:3): the drying of the structure will then be faster, as the wall thickness is reduced; this will result in faster drying creep, which is supposed to be the main phenomenon explaining the leak rate evolution). In addition, in order to enable visual inspections and monitoring, a metallic structure supporting various annular floors is erected inside the containment, and another one is anchored in the external wall. Materials of the model have been selected to be as similar as possible to the ones used for the construction of full-scale containments, in terms of both mechanical and thermal behavior. The concrete class is 34/37 MPa. Nevertheless, the concrete mix microstructure cannot be perfectly scaled due to aggregate size. The prestressing tendon layout is exactly scaled, including any deviations around penetrations: tendon spacing is divided by 3, and the duct diameter is scaled as much as reasonably possible for the contractor (50 mm). The mock-up will be finely instrumented so that its behavior is monitored from the beginning of construction. More than 500 sensors and 2 km of fiber optic cables are to be positioned in the concrete, both on the rebars and on the prestressing cables. Many results can be of interest to the civil engineering community: scale effects on shrinkage, drying, and creep; the appearance and evolution of cracks; the influence of temperature on creep velocity; and on-field validation of new inspection techniques

  16. Research Program Overview

    Science.gov (United States)

    Overview of the research program of the Pacific Earthquake Engineering Research Center (PEER), including the Tall Buildings Initiative, the Transportation Research Program, the Lifelines Program, and the Concrete Grand...

  17. SEARCH: Study of Environmental Arctic Change—A System-scale, Cross-disciplinary Arctic Research Program

    Science.gov (United States)

    Wiggins, H. V.; Eicken, H.; Fox, S. E.

    2012-12-01

    SEARCH is an interdisciplinary and interagency program that works with academic and government agency scientists to plan, conduct, and synthesize studies of arctic change. The vision of SEARCH is to provide scientific understanding of arctic environmental change to help society understand and respond to a rapidly changing Arctic. Towards this end, SEARCH: 1. Generates and synthesizes research findings and promotes arctic science and scientific discovery across disciplines and among agencies. 2. Identifies emerging issues in arctic environmental change. 3. Provides information resources to arctic stakeholders, policy-makers, and the public to help them respond to arctic environmental change. 4. Coordinates with national arctic science programs integral to SEARCH goals. 5. Facilitates research activities across local-to-global scales with stakeholder concerns incorporated from the start of the planning process. 6. Represents the U.S. arctic environmental change science community in international and global change research initiatives. Specific current activities include: Arctic Observing Network (AON) - coordinating a system of atmospheric, land- and ocean-based environmental monitoring capabilities that will significantly advance our observations of arctic environmental conditions. Arctic Sea Ice Outlook - an international effort that provides monthly summer reports synthesizing community estimates of the expected sea ice minimum. Sea Ice for Walrus Outlook - a resource for Alaska Native subsistence hunters, coastal communities, and others that provides weekly reports with information on sea ice conditions relevant to walrus in Alaska waters. In April, the SEARCH Science Steering Committee (SSC) released a set of draft 5-year goals and objectives for review by the broader arctic science community. The goals and objectives will direct the SEARCH program in the next five years. The draft SEARCH goals focus on four areas: ice-diminished Arctic Ocean, warming

  18. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era

  19. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  20. Oligopolistic competition in wholesale electricity markets: Large-scale simulation and policy analysis using complementarity models

    Science.gov (United States)

    Helman, E. Udi

    This dissertation conducts research into the large-scale simulation of oligopolistic competition in wholesale electricity markets. The dissertation has two parts. Part I is an examination of the structure and properties of several spatial, or network, equilibrium models of oligopolistic electricity markets formulated as mixed linear complementarity problems (LCP). Part II is a large-scale application of such models to the electricity system that encompasses most of the United States east of the Rocky Mountains, the Eastern Interconnection. Part I consists of Chapters 1 to 6. The models developed in this part continue research into mixed LCP models of oligopolistic electricity markets initiated by Hobbs [67] and subsequently developed by Metzler [87] and Metzler, Hobbs and Pang [88]. Hobbs' central contribution is a network market model with Cournot competition in generation and a price-taking spatial arbitrage firm that eliminates spatial price discrimination by the Cournot firms. In one variant, the solution to this model is shown to be equivalent to the "no arbitrage" condition in a "pool" market, in which a Regional Transmission Operator optimizes spot sales such that the congestion price between two locations is exactly equivalent to the difference in the energy prices at those locations (commonly known as locational marginal pricing). Extensions to this model are presented in Chapters 5 and 6. One of these is a market model with a profit-maximizing arbitrage firm. This model is structured as a mathematical program with equilibrium constraints (MPEC), but due to the linearity of its constraints, can be solved as a mixed LCP. Part II consists of Chapters 7 to 12. The core of these chapters is a large-scale simulation of the U.S. Eastern Interconnection applying one of the Cournot competition with arbitrage models. This is the first oligopolistic equilibrium market model to encompass the full Eastern Interconnection with a realistic network representation (using
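
    The equilibrium concept underlying these models can be illustrated at toy scale. The sketch below, assuming numpy, computes a two-firm Cournot equilibrium with linear inverse demand by best-response iteration and checks it against the closed-form answer; the dissertation's mixed LCP models add the network, transmission pricing, and arbitrage that this sketch omits.

```python
import numpy as np

# Inverse demand P(Q) = a - b*Q and constant marginal costs.
a, b = 100.0, 1.0
c = np.array([10.0, 20.0])

def best_response(q_other, ci):
    """Maximize (a - b*(qi + q_other))*qi - ci*qi over qi >= 0."""
    return max(0.0, (a - ci - b * q_other) / (2 * b))

q = np.array([0.0, 0.0])
for _ in range(100):  # simultaneous best-response iteration
    q = np.array([best_response(q[1], c[0]),
                  best_response(q[0], c[1])])

# Closed-form Cournot-Nash quantities for comparison.
q_star = (a - 2 * c + c[::-1]) / (3 * b)
print("iterated:", q.round(3), "analytic:", q_star.round(3))
```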

  1. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made
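
    The representative-elementary-volume idea admits a quick numerical illustration. The sketch below, assuming numpy and a synthetic lognormal permeability field, shows the spread of block-averaged effective permeabilities shrinking as the averaging window grows; the window at which the spread becomes acceptably small is one crude REV estimate. Geometric-mean upscaling is a simplification, not the discrete-fracture analysis of the report.

```python
import numpy as np

rng = np.random.default_rng(5)
field = np.exp(rng.normal(0.0, 1.0, (512, 512)))  # synthetic permeability field
logk = np.log(field)

for w in (4, 16, 64, 128):
    # Effective permeability of every non-overlapping w x w block.
    block_means = logk.reshape(512 // w, w, 512 // w, w).mean(axis=(1, 3))
    k_eff = np.exp(block_means)                    # geometric-mean upscaling
    print(f"window {w:3d}: spread of k_eff = {k_eff.std():.3f}")
```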

  2. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high-technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  3. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  4. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  5. Integrating scientific knowledge into large-scale restoration programs: the CALFED Bay-Delta Program experience

    Science.gov (United States)

    Taylor, K.A.; Short, A.

    2009-01-01

    Integrating science into resource management activities is a goal of the CALFED Bay-Delta Program, a multi-agency effort to address water supply reliability, ecological condition, drinking water quality, and levees in the Sacramento-San Joaquin Delta of northern California. Under CALFED, many different strategies were used to integrate science, including interaction between the research and management communities, public dialogues about scientific work, and peer review. This paper explores ways science was (and was not) integrated into CALFED's management actions and decision systems through three narratives describing different patterns of scientific integration and application in CALFED. Though a collaborative process and certain organizational conditions may be necessary for developing new understandings of the system of interest, we find that those factors are not sufficient for translating that knowledge into management actions and decision systems. We suggest that the application of knowledge may be facilitated or hindered by (1) differences in the objectives, approaches, and cultures of scientists operating in the research community and those operating in the management community and (2) other factors external to the collaborative process and organization.

  6. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of the scale of 1000s of processors, to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes), and by implication to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  7. Feasibility analysis of large length-scale thermocapillary flow experiment for the International Space Station

    Science.gov (United States)

    Alberts, Samantha J.

    The investigation of microgravity fluid dynamics emerged out of necessity with the advent of space exploration. In particular, capillary research took a leap forward in the 1960s with regards to liquid settling and interfacial dynamics. Due to inherent temperature variations in large spacecraft liquid systems, such as fuel tanks, forces develop on gas-liquid interfaces which induce thermocapillary flows. To date, thermocapillary flows have been studied in small, idealized research geometries, usually under terrestrial conditions. The 1 to 3 m lengths in current and future large tanks and hardware are designed based on hardware rather than research, which leaves spaceflight systems designers without the technological tools to effectively create safe and efficient designs. This thesis focused on the design and feasibility of a large length-scale thermocapillary flow experiment, which utilizes temperature variations to drive a flow. The design of a helical channel geometry ranging from 1 to 2.5 m in length permits a large length-scale thermocapillary flow experiment to fit in a seemingly small International Space Station (ISS) facility such as the Fluids Integrated Rack (FIR). An initial investigation determined that the proposed experiment produced measurable data while adhering to the FIR facility limitations. The computational portion of this thesis focused on the investigation of functional geometries of fuel tanks and depots using Surface Evolver. This work outlines the design of a large length-scale thermocapillary flow experiment for the ISS FIR. The results from this work improve the understanding of thermocapillary flows and thus improve the technological tools for predicting heat and mass transfer in large length-scale thermocapillary flows. Without the tools to understand the thermocapillary flows in these systems, engineers are forced to design larger, heavier vehicles to assure safety and mission success.

  8. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    With the coming deluge of genome data, the needs to store and process large-scale genome data, to provide easy access to biomedical analysis tools, and to support efficient data sharing and retrieval present significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Safety and protection for large scale superconducting magnets. FY 1984 report

    International Nuclear Information System (INIS)

    Thome, R.J.; Pillsbury, R.D. Jr.; Minervini, J.V.

    1984-11-01

    The Fusion Program is moving rapidly into design and construction of systems using magnets with stored energies in the range of hundreds of megajoules to gigajoules. For example, the toroidal field coil system alone for TFCX would store about 4 GJ and the mirror system MFTF-B would store about 1.6 GJ. Safety and protection analyses of the magnet subsystems become progressively more important as the size and complexity of the installations increase. MIT has been carrying out a program for INEL oriented toward safety and protection in large scale superconducting magnet systems. The program involves collection and analysis of information on actual magnet failures, analyses of general problems associated with safety and protection, and performance of safety oriented experiments. This report summarizes work performed in FY 1984

  10. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    The impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  11. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  12. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  13. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs. large scale).
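
    As a rough illustration of the dimensionality-reduction-and-clustering step described above, the sketch below embeds synthetic moisture-flux fields with an unsupervised kernel PCA (a simplified stand-in for the supervised variant used in the study) and clusters the embedding; all data and parameters are hypothetical.

        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.cluster import KMeans

        # Stand-in data: 200 "days", each a flattened 20x30 field of
        # vertically integrated moisture flux divergence.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 20 * 30))

        # Embed the fields in a low-dimensional space...
        Z = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3).fit_transform(X)

        # ...then cluster to group days by dominant moisture-transport pattern.
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
        print(np.bincount(labels))   # number of days in each cluster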

  14. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  15. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamic and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamic and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  16. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    Science.gov (United States)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission is gradually becoming the main way to deliver wind power and to improve wind power availability and grid stability. However, the integration of wind farms changes the sub-synchronous oscillation (SSO) damping characteristics of the synchronous generator system. Regarding the SSO problems caused by integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and aims to summarize the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). Then, SSO modelling and analysis methods are categorized and compared by their applicable areas. Furthermore, this paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, the research prospects in this field are explored.
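
    To illustrate the kind of small-signal screening used in SSO studies (not the paper's models), the sketch below computes the eigenvalues of a two-mass turbine-generator shaft: imaginary parts give torsional mode frequencies, real parts their damping. All parameters are hypothetical.

        import numpy as np

        J1, J2 = 4.0, 2.0      # inertias (hypothetical, per-unit-like)
        K = 60.0               # shaft stiffness
        D1, D2 = 0.1, 0.1      # damping coefficients

        # State x = [theta1, theta2, omega1, omega2].
        A = np.array([
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0],
            [-K / J1,  K / J1, -D1 / J1, 0.0],
            [ K / J2, -K / J2, 0.0, -D2 / J2],
        ])
        for lam in np.linalg.eigvals(A):
            f = abs(lam.imag) / (2 * np.pi)   # modal frequency in Hz
            print(f"mode: sigma = {lam.real:+.3f} 1/s, f = {f:.2f} Hz")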

  17. The ENIGMA Consortium : large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    Thompson, Paul M.; Stein, Jason L.; Medland, Sarah E.; Hibar, Derrek P.; Vasquez, Alejandro Arias; Renteria, Miguel E.; Toro, Roberto; Jahanshad, Neda; Schumann, Gunter; Franke, Barbara; Wright, Margaret J.; Martin, Nicholas G.; Agartz, Ingrid; Alda, Martin; Alhusaini, Saud; Almasy, Laura; Almeida, Jorge; Alpert, Kathryn; Andreasen, Nancy C.; Andreassen, Ole A.; Apostolova, Liana G.; Appel, Katja; Armstrong, Nicola J.; Aribisala, Benjamin; Bastin, Mark E.; Bauer, Michael; Bearden, Carrie E.; Bergmann, Orjan; Binder, Elisabeth B.; Blangero, John; Bockholt, Henry J.; Boen, Erlend; Bois, Catherine; Boomsma, Dorret I.; Booth, Tom; Bowman, Ian J.; Bralten, Janita; Brouwer, Rachel M.; Brunner, Han G.; Brohawn, David G.; Buckner, Randy L.; Buitelaar, Jan; Bulayeva, Kazima; Bustillo, Juan R.; Calhoun, Vince D.; Hartman, Catharina A.; Hoekstra, Pieter J.; Penninx, Brenda W.; Schmaal, Lianne; van Tol, Marie-Jose

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience,

  18. The ENIGMA Consortium: large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    Thompson, Paul M.; Stein, Jason L.; Medland, Sarah E.; Hibar, Derrek P.; Vasquez, Alejandro Arias; Renteria, Miguel E.; Toro, Roberto; Jahanshad, Neda; Schumann, Gunter; Franke, Barbara; Wright, Margaret J.; Martin, Nicholas G.; Agartz, Ingrid; Alda, Martin; Alhusaini, Saud; Almasy, Laura; Almeida, Jorge; Alpert, Kathryn; Andreasen, Nancy C.; Andreassen, Ole A.; Apostolova, Liana G.; Appel, Katja; Armstrong, Nicola J.; Aribisala, Benjamin; Bastin, Mark E.; Bauer, Michael; Bearden, Carrie E.; Bergmann, Orjan; Binder, Elisabeth B.; Blangero, John; Bockholt, Henry J.; Bøen, Erlend; Bois, Catherine; Boomsma, Dorret I.; Booth, Tom; Bowman, Ian J.; Bralten, Janita; Brouwer, Rachel M.; Brunner, Han G.; Brohawn, David G.; Buckner, Randy L.; Buitelaar, Jan; Bulayeva, Kazima; Bustillo, Juan R.; Calhoun, Vince D.; Cannon, Dara M.; Cantor, Rita M.; Carless, Melanie A.; Caseras, Xavier; Cavalleri, Gianpiero L.; Chakravarty, M. Mallar; Chang, Kiki D.; Ching, Christopher R. K.; Christoforou, Andrea; Cichon, Sven; Clark, Vincent P.; Conrod, Patricia; Coppola, Giovanni; Crespo-Facorro, Benedicto; Curran, Joanne E.; Czisch, Michael; Deary, Ian J.; de Geus, Eco J. C.; den Braber, Anouk; Delvecchio, Giuseppe; Depondt, Chantal; de Haan, Lieuwe; de Zubicaray, Greig I.; Dima, Danai; Dimitrova, Rali; Djurovic, Srdjan; Dong, Hongwei; Donohoe, Gary; Duggirala, Ravindranath; Dyer, Thomas D.; Ehrlich, Stefan; Ekman, Carl Johan; Elvsåshagen, Torbjørn; Emsell, Louise; Erk, Susanne; Espeseth, Thomas; Fagerness, Jesen; Fears, Scott; Fedko, Iryna; Fernández, Guillén; Fisher, Simon E.; Foroud, Tatiana; Fox, Peter T.; Francks, Clyde; Frangou, Sophia; Frey, Eva Maria; Frodl, Thomas; Frouin, Vincent; Garavan, Hugh; Giddaluru, Sudheer; Glahn, David C.; Godlewska, Beata; Goldstein, Rita Z.; Gollub, Randy L.; Grabe, Hans J.; Grimm, Oliver; Gruber, Oliver; Guadalupe, Tulio; Gur, Raquel E.; Gur, Ruben C.; Göring, Harald H. H.; Hagenaars, Saskia; Hajek, Tomas; Hall, Geoffrey B.; Hall, Jeremy; Hardy, John; Hartman, Catharina A.; Hass, Johanna; Hatton, Sean N.; Haukvik, Unn K.; Hegenscheid, Katrin; Heinz, Andreas; Hickie, Ian B.; Ho, Beng-Choon; Hoehn, David; Hoekstra, Pieter J.; Hollinshead, Marisa; Holmes, Avram J.; Homuth, Georg; Hoogman, Martine; Hong, L. Elliot; Hosten, Norbert; Hottenga, Jouke-Jan; Hulshoff Pol, Hilleke E.; Hwang, Kristy S.; Jack, Clifford R.; Jenkinson, Mark; Johnston, Caroline; Jönsson, Erik G.; Kahn, René S.; Kasperaviciute, Dalia; Kelly, Sinead; Kim, Sungeun; Kochunov, Peter; Koenders, Laura; Krämer, Bernd; Kwok, John B. J.; Lagopoulos, Jim; Laje, Gonzalo; Landen, Mikael; Landman, Bennett A.; Lauriello, John; Lawrie, Stephen M.; Lee, Phil H.; Le Hellard, Stephanie; Lemaître, Herve; Leonardo, Cassandra D.; Li, Chiang-Shan; Liberg, Benny; Liewald, David C.; Liu, Xinmin; Lopez, Lorna M.; Loth, Eva; Lourdusamy, Anbarasu; Luciano, Michelle; Macciardi, Fabio; Machielsen, Marise W. 
J.; Macqueen, Glenda M.; Malt, Ulrik F.; Mandl, René; Manoach, Dara S.; Martinot, Jean-Luc; Matarin, Mar; Mather, Karen A.; Mattheisen, Manuel; Mattingsdal, Morten; Meyer-Lindenberg, Andreas; McDonald, Colm; McIntosh, Andrew M.; McMahon, Francis J.; McMahon, Katie L.; Meisenzahl, Eva; Melle, Ingrid; Milaneschi, Yuri; Mohnke, Sebastian; Montgomery, Grant W.; Morris, Derek W.; Moses, Eric K.; Mueller, Bryon A.; Muñoz Maniega, Susana; Mühleisen, Thomas W.; Müller-Myhsok, Bertram; Mwangi, Benson; Nauck, Matthias; Nho, Kwangsik; Nichols, Thomas E.; Nilsson, Lars-Göran; Nugent, Allison C.; Nyberg, Lars; Olvera, Rene L.; Oosterlaan, Jaap; Ophoff, Roel A.; Pandolfo, Massimo; Papalampropoulou-Tsiridou, Melina; Papmeyer, Martina; Paus, Tomas; Pausova, Zdenka; Pearlson, Godfrey D.; Penninx, Brenda W.; Peterson, Charles P.; Pfennig, Andrea; Phillips, Mary; Pike, G. Bruce; Poline, Jean-Baptiste; Potkin, Steven G.; Pütz, Benno; Ramasamy, Adaikalavan; Rasmussen, Jerod; Rietschel, Marcella; Rijpkema, Mark; Risacher, Shannon L.; Roffman, Joshua L.; Roiz-Santiañez, Roberto; Romanczuk-Seiferth, Nina; Rose, Emma J.; Royle, Natalie A.; Rujescu, Dan; Ryten, Mina; Sachdev, Perminder S.; Salami, Alireza; Satterthwaite, Theodore D.; Savitz, Jonathan; Saykin, Andrew J.; Scanlon, Cathy; Schmaal, Lianne; Schnack, Hugo G.; Schork, Andrew J.; Schulz, S. Charles; Schür, Remmelt; Seidman, Larry; Shen, Li; Shoemaker, Jody M.; Simmons, Andrew; Sisodiya, Sanjay M.; Smith, Colin; Smoller, Jordan W.; Soares, Jair C.; Sponheim, Scott R.; Sprooten, Emma; Starr, John M.; Steen, Vidar M.; Strakowski, Stephen; Strike, Lachlan; Sussmann, Jessika; Sämann, Philipp G.; Teumer, Alexander; Toga, Arthur W.; Tordesillas-Gutierrez, Diana; Trabzuni, Daniah; Trost, Sarah; Turner, Jessica; van den Heuvel, Martijn; van der Wee, Nic J.; van Eijk, Kristel; van Erp, Theo G. M.; van Haren, Neeltje E. M.; van 't Ent, Dennis; van Tol, Marie-Jose; Valdés Hernández, Maria C.; Veltman, Dick J.; Versace, Amelia; Völzke, Henry; Walker, Robert; Walter, Henrik; Wang, Lei; Wardlaw, Joanna M.; Weale, Michael E.; Weiner, Michael W.; Wen, Wei; Westlye, Lars T.; Whalley, Heather C.; Whelan, Christopher D.; White, Tonya; Winkler, Anderson M.; Wittfeld, Katharina; Woldehawariat, Girma; Wolf, Christiane; Zilles, David; Zwiers, Marcel P.; Thalamuthu, Anbupalam; Schofield, Peter R.; Freimer, Nelson B.; Lawrence, Natalia S.; Drevets, Wayne

    2014-01-01

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience,

  19. The ENIGMA Consortium: Large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    P.M. Thompson (Paul); J.L. Stein; S.E. Medland (Sarah Elizabeth); D.P. Hibar (Derrek); A.A. Vásquez (Arias); M.E. Rentería (Miguel); R. Toro (Roberto); N. Jahanshad (Neda); G. Schumann (Gunter); B. Franke (Barbara); M.J. Wright (Margaret); N.G. Martin (Nicholas); I. Agartz (Ingrid); M. Alda (Martin); S. Alhusaini (Saud); L. Almasy (Laura); K. Alpert (Kathryn); N.C. Andreasen; O.A. Andreassen (Ole); L.G. Apostolova (Liana); K. Appel (Katja); N.J. Armstrong (Nicola); B. Aribisala (Benjamin); M.E. Bastin (Mark); M. Bauer (Michael); C.E. Bearden (Carrie); Ø. Bergmann (Ørjan); E.B. Binder (Elisabeth); J. Blangero (John); H.J. Bockholt; E. Bøen (Erlend); M. Bois (Monique); D.I. Boomsma (Dorret); T. Booth (Tom); I.J. Bowman (Ian); L.B.C. Bralten (Linda); R.M. Brouwer (Rachel); H.G. Brunner; D.G. Brohawn (David); M. Buckner; J.K. Buitelaar (Jan); K. Bulayeva (Kazima); J. Bustillo; V.D. Calhoun (Vince); D.M. Cannon (Dara); R.M. Cantor; M.A. Carless (Melanie); X. Caseras (Xavier); G. Cavalleri (Gianpiero); M.M. Chakravarty (M. Mallar); K.D. Chang (Kiki); C.R.K. Ching (Christopher); A. Christoforou (Andrea); S. Cichon (Sven); V.P. Clark; P. Conrod (Patricia); D. Coppola (Domenico); B. Crespo-Facorro (Benedicto); J.E. Curran (Joanne); M. Czisch (Michael); I.J. Deary (Ian); E.J.C. de Geus (Eco); A. den Braber (Anouk); G. Delvecchio (Giuseppe); C. Depondt (Chantal); L. de Haan (Lieuwe); G.I. de Zubicaray (Greig); D. Dima (Danai); R. Dimitrova (Rali); S. Djurovic (Srdjan); H. Dong (Hongwei); D.J. Donohoe (Dennis); A. Duggirala (Aparna); M.D. Dyer (Matthew); S.M. Ehrlich (Stefan); C.J. Ekman (Carl Johan); T. Elvsåshagen (Torbjørn); L. Emsell (Louise); S. Erk; T. Espeseth (Thomas); J. Fagerness (Jesen); S. Fears (Scott); I. Fedko (Iryna); G. Fernandez (Guillén); S.E. Fisher (Simon); T. Foroud (Tatiana); P.T. Fox (Peter); C. Francks (Clyde); S. Frangou (Sophia); E.M. Frey (Eva Maria); T. Frodl (Thomas); V. Frouin (Vincent); H. Garavan (Hugh); S. Giddaluru (Sudheer); D.C. Glahn (David); B. Godlewska (Beata); R.Z. Goldstein (Rita); R.L. Gollub (Randy); H.J. Grabe (Hans Jörgen); O. Grimm (Oliver); O. Gruber (Oliver); T. Guadalupe (Tulio); R.E. Gur (Raquel); R.C. Gur (Ruben); H.H.H. Göring (Harald); S. Hagenaars (Saskia); T. Hajek (Tomas); G.B. Hall (Garry); J. Hall (Jeremy); J. Hardy (John); C.A. Hartman (Catharina); J. Hass (Johanna); W. Hatton; U.K. Haukvik (Unn); K. Hegenscheid (Katrin); J. Heinz (Judith); I.B. Hickie (Ian); B.C. Ho (Beng ); D. Hoehn (David); P.J. Hoekstra (Pieter); M. Hollinshead (Marisa); A.J. Holmes (Avram); G. Homuth (Georg); M. Hoogman (Martine); L.E. Hong (L.Elliot); N. Hosten (Norbert); J.J. Hottenga (Jouke Jan); H.E. Hulshoff Pol (Hilleke); K.S. Hwang (Kristy); C.R. Jack Jr. (Clifford); S. Jenkinson (Sarah); C. Johnston; E.G. Jönsson (Erik); R.S. Kahn (René); D. Kasperaviciute (Dalia); S. Kelly (Steve); S. Kim (Shinseog); P. Kochunov (Peter); L. Koenders (Laura); B. Krämer (Bernd); J.B.J. Kwok (John); J. Lagopoulos (Jim); G. Laje (Gonzalo); M. Landén (Mikael); B.A. Landman (Bennett); J. Lauriello; S. Lawrie (Stephen); P.H. Lee (Phil); S. Le Hellard (Stephanie); H. Lemaître (Herve); C.D. Leonardo (Cassandra); C.-S. Li (Chiang-shan); B. Liberg (Benny); D.C. Liewald (David C.); X. Liu (Xinmin); L.M. Lopez (Lorna); E. Loth (Eva); A. Lourdusamy (Anbarasu); M. Luciano (Michelle); F. MacCiardi (Fabio); M.W.J. Machielsen (Marise); G.M. MacQueen (Glenda); U.F. Malt (Ulrik); R. Mandl (René); D.S. Manoach (Dara); J.-L. Martinot (Jean-Luc); M. Matarin (Mar); R. Mather; M. 
Mattheisen (Manuel); M. Mattingsdal (Morten); A. Meyer-Lindenberg; C. McDonald (Colm); A.M. McIntosh (Andrew); F.J. Mcmahon (Francis J); K.L. Mcmahon (Katie); E. Meisenzahl (Eva); I. Melle (Ingrid); Y. Milaneschi (Yuri); S. Mohnke (Sebastian); G.W. Montgomery (Grant); D.W. Morris (Derek W); E.K. Moses (Eric); B.A. Mueller (Bryon ); S. Muñoz Maniega (Susana); T.W. Mühleisen (Thomas); B. Müller-Myhsok (Bertram); B. Mwangi (Benson); M. Nauck (Matthias); K. Nho (Kwangsik); T.E. Nichols (Thomas); L.G. Nilsson; A.C. Nugent (Allison); L. Nyberg (Lisa); R.L. Olvera (Rene); J. Oosterlaan (Jaap); R.A. Ophoff (Roel); M. Pandolfo (Massimo); M. Papalampropoulou-Tsiridou (Melina); M. Papmeyer (Martina); T. Paus (Tomas); Z. Pausova (Zdenka); G. Pearlson (Godfrey); B.W.J.H. Penninx (Brenda); C.P. Peterson (Charles); A. Pfennig (Andrea); M. Phillips (Mary); G.B. Pike (G Bruce); J.B. Poline (Jean Baptiste); S.G. Potkin (Steven); B. Pütz (Benno); A. Ramasamy (Adaikalavan); J. Rasmussen (Jerod); M. Rietschel (Marcella); M. Rijpkema (Mark); S.L. Risacher (Shannon); J.L. Roffman (Joshua); R. Roiz-Santiañez (Roberto); N. Romanczuk-Seiferth (Nina); E.J. Rose (Emma); N.A. Royle (Natalie); D. Rujescu (Dan); M. Ryten (Mina); P.S. Sachdev (Perminder); A. Salami (Alireza); T.D. Satterthwaite (Theodore); J. Savitz (Jonathan); A.J. Saykin (Andrew); C. Scanlon (Cathy); L. Schmaal (Lianne); H. Schnack (Hugo); N.J. Schork (Nicholas); S.C. Schulz (S.Charles); R. Schür (Remmelt); L.J. Seidman (Larry); L. Shen (Li); L. Shoemaker (Lawrence); A. Simmons (Andrew); S.M. Sisodiya (Sanjay); C. Smith (Colin); J.W. Smoller; J.C. Soares (Jair); S.R. Sponheim (Scott); R. Sprooten (Roy); J.M. Starr (John); V.M. Steen (Vidar); S. Strakowski (Stephen); L.T. Strike (Lachlan); J. Sussmann (Jessika); P.G. Sämann (Philipp); A. Teumer (Alexander); A.W. Toga (Arthur); D. Tordesillas-Gutierrez (Diana); D. Trabzuni (Danyah); S. Trost (Sarah); J. Turner (Jessica); M. van den Heuvel (Martijn); N.J. van der Wee (Nic); K.R. van Eijk (Kristel); T.G.M. van Erp (Theo G.); N.E.M. van Haren (Neeltje E.); D. van 't Ent (Dennis); M.J.D. van Tol (Marie-José); M.C. Valdés Hernández (Maria); D.J. Veltman (Dick); A. Versace (Amelia); H. Völzke (Henry); R. Walker (Robert); H.J. Walter (Henrik); L. Wang (Lei); J.M. Wardlaw (J.); M.E. Weale (Michael); M.W. Weiner (Michael); W. Wen (Wei); L.T. Westlye (Lars); H.C. Whalley (Heather); C.D. Whelan (Christopher); T.J.H. White (Tonya); A.M. Winkler (Anderson); K. Wittfeld (Katharina); G. Woldehawariat (Girma); A. Björnsson (Asgeir); D. Zilles (David); M.P. Zwiers (Marcel); A. Thalamuthu (Anbupalam); J.R. Almeida (Jorge); C.J. Schofield (Christopher); N.B. Freimer (Nelson); N.S. Lawrence (Natalia); D.A. Drevets (Douglas)

    2014-01-01

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in

  20. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  1. Hierarchical formation of large scale structures of the Universe: observations and models

    International Nuclear Information System (INIS)

    Maurogordato, Sophie

    2003-01-01

    In this report for an Accreditation to Supervise Research (HDR), the author proposes an overview of her research works in cosmology. These works notably addressed the large-scale distribution of the Universe (with constraints on the scenario of formation, on the bias relationship, and on the structuring of clusters), the analysis of galaxy clusters during coalescence, and the mass distribution within relaxed clusters. [fr]

  2. HTGR safety research program

    International Nuclear Information System (INIS)

    Barsell, A.W.; Olsen, B.E.; Silady, F.A.

    1981-01-01

    An HTGR safety research program is being performed in support of, and with priorities guided by, the AIPA Probabilistic Risk Study. Analytical and experimental studies have been conducted in four general areas where modeling or data assumptions contribute to large uncertainties in the consequence assessments and thus in the risk assessment for key core heat-up accident scenarios. Experimental data have been obtained on the time-dependent release of fission products from the fuel particles and the plateout characteristics of condensible fission products in the primary circuit. Potential failure modes of primarily top head PCRV components as well as concrete degradation processes have been analyzed using a series of newly developed models and interlinked computer programs. Containment phenomena, including fission product deposition and the potential flammability of liberated combustible gases, have been studied analytically. Lastly, the behaviour of boron control material in the core and reactor subcriticality during core heatup have been examined analytically. Research in these areas has formed the basis for consequence updates in GA-A15000. Systematic derivation of future safety research priorities is also discussed. (author)

  3. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    Today, there is a lot of focus on the aesthetic potential of concrete surfaces, both globally and locally. World famous architects such as Herzog De Meuron, Zaha Hadid, Richard Meyer and David Chippenfield challenge the exposure of concrete in their architecture. At home, this trend can be seen in the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel and Zaha Hadid’s Ordrupgård’s black curved smooth concrete surfaces. Furthermore, one can point to initiatives such as “Synlig beton” (visible concrete), which can be seen on the website www.synligbeton.dk, and spæncom’s aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research-development project “Lasting large scale glazed concrete formwork,” which I am working on at DTU’s department of Architectural Engineering, will be able to complement these. It is a project where I...

  4. Small watershed-scale research and the challenges ahead

    Science.gov (United States)

    Larsen, M. C.; Glynn, P. D.

    2008-12-01

    For the past century, Federal mission science agencies (e.g., USFS, NRCS, ARS, USGS) have had the long-term agency goals, infrastructure, and research staff to conduct research and data collection in small watersheds, as well as to support these activities for non-Federal partners. The National Science Foundation has been a strong partner of the Federal mission science agencies through the LTER network, which is dependent on Federally supported research sites, and more recently with the emerging CUAHSI, WATERS, CZEN, and NEON initiatives. Much of the NSF-supported research builds on the foundations provided by their Federally supported partners, who sustain the long-term, extensive monitoring activity and research sites, including making long-term data available to all users via public interfaces. The future of these programs, and their enhancement/expansion to face the intensifying concurrent challenges of population growth, land-use change, and climate change, is dependent on a well-funded national commitment to basic science. Such a commitment will allow the scientific community to advance our understanding of these scientific challenges and to synthesize our understanding among research sites and at the national scale. Small watersheds serve as essential platforms where hypotheses can be tested, as sentinels for climate change, and as a basis for comparing and scaling up local information and syntheses to regional and continental scales. The science guides resource management and mitigation decisions and is fundamental to the development of predictive models. Furthermore, small-watershed research and monitoring programs are generally undervalued because many research questions that can be addressed now or in the future were not anticipated when the sites were initiated. Some examples include: 1) the quantification, characterization, and understanding of how emerging contaminants, personal care products, and endocrine disruptors affect organisms - substances that

  5. The Rights and Responsibility of Test Takers When Large-Scale Testing Is Used for Classroom Assessment

    Science.gov (United States)

    van Barneveld, Christina; Brinson, Karieann

    2017-01-01

    The purpose of this research was to identify conflicts in the rights and responsibility of Grade 9 test takers when some parts of a large-scale test are marked by teachers and used in the calculation of students' class marks. Data from teachers' questionnaires and students' questionnaires from a 2009-10 administration of a large-scale test of…

  6. Copy of Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet.

    Energy Technology Data Exchange (ETDEWEB)

    Adalsteinsson, Helgi; Armstrong, Robert C.; Chiang, Ken; Gentile, Ann C.; Lloyd, Levi; Minnich, Ronald G.; Vanderveen, Keith; Van Randwyk, Jamie A; Rudish, Don W.

    2008-10-01

    We report on the work done in the late-start LDRD "Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet." We describe the creation of a research platform that emulates many thousands of machines to be used for the study of large-scale internet behavior. We describe a proof-of-concept simple attack we performed in this environment. We describe the successful capture of a Storm bot and, from the study of the bot and further literature search, establish large-scale aspects we seek to understand via emulation of Storm on our research platform in possible follow-on work. Finally, we discuss possible future work.

  7. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In future power systems with additional wind power capacity there will be an increased need for large-scale power management as well as reliable balancing and reserve capabilities. Different technologies for large-scale electricity storage provide solutions to the different challenges arising w...

  8. Evaluating Educational Programs. ETS R&D Scientific and Policy Contributions Series. ETS SPC-11-01. ETS Research Report No. RR-11-15

    Science.gov (United States)

    Ball, Samuel

    2011-01-01

    Since its founding in 1947, ETS has conducted a significant and wide-ranging research program that has focused on, among other things, psychometric and statistical methodology; educational evaluation; performance assessment and scoring; large-scale assessment and evaluation; cognitive, developmental, personality, and social psychology; and…

  9. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Directory of Open Access Journals (Sweden)

    Xiangyun Xiao

    Full Text Available The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.

  10. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Science.gov (United States)

    Xiao, Xiangyun; Zhang, Wei; Zou, Xiufen

    2015-01-01

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.
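
    A minimal sketch of the split-and-fit idea under strong simplifying assumptions: the module split is taken as given, each module's linear ODE parameters are fitted independently in a process pool, and the asynchronous communication between subnetworks is ignored. The data, module sizes, and one-step Euler residual are hypothetical stand-ins, not the authors' algorithm.

        import numpy as np
        from multiprocessing import Pool
        from scipy.optimize import least_squares

        T, N = 20, 12
        rng = np.random.default_rng(1)
        data = rng.normal(size=(T, N))   # expression time series, T x genes
        dt = 1.0
        modules = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]

        def fit_module(genes):
            X = data[:, genes]           # trajectories of this module only
            m = len(genes)
            def residual(a):
                A = a.reshape(m, m)
                pred = X[:-1] + dt * X[:-1] @ A.T   # Euler step of dx/dt = A x
                return (pred - X[1:]).ravel()
            sol = least_squares(residual, np.zeros(m * m))
            return genes, sol.x.reshape(m, m)

        if __name__ == "__main__":
            with Pool() as pool:         # modules are fitted in parallel
                for genes, A in pool.map(fit_module, modules):
                    print(genes, "fitted, max |a_ij| =", np.abs(A).max())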

  11. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has still hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before reaching a stable state where Heaps' law still holds with the disappearance of strict Zipf's law. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating the pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early time of a pandemic disease.
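
    To make the two scalings concrete, the toy sketch below estimates a Zipf exponent from a rank-frequency plot and a Heaps exponent from distinct-count growth on a synthetic heavy-tailed event stream; it illustrates the laws themselves, not the paper's metapopulation model.

        import numpy as np

        rng = np.random.default_rng(0)
        stream = rng.zipf(a=2.0, size=50_000)   # synthetic event stream

        # Zipf: log-log slope of frequency versus rank.
        _, counts = np.unique(stream, return_counts=True)
        freq = np.sort(counts)[::-1]
        ranks = np.arange(1, len(freq) + 1)
        zipf_slope = np.polyfit(np.log(ranks), np.log(freq), 1)[0]

        # Heaps: the number of distinct items N(t) grows as t**lambda.
        seen, growth = set(), []
        for x in stream:
            seen.add(x)
            growth.append(len(seen))
        t = np.arange(1, len(stream) + 1)
        heaps_lambda = np.polyfit(np.log(t), np.log(growth), 1)[0]

        print(f"Zipf slope ~ {zipf_slope:.2f}, Heaps exponent ~ {heaps_lambda:.2f}")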

  12. Proceedings of joint meeting of the 6th simulation science symposium and the NIFS collaboration research 'large scale computer simulation'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-03-01

    The joint meeting of the 6th Simulation Science Symposium and the NIFS Collaboration Research 'Large Scale Computer Simulation' was held on December 12-13, 2002 at the National Institute for Fusion Science, with the aim of promoting interdisciplinary collaborations in various fields of computer simulation. The meeting, attended by more than 40 people, consisted of 11 invited and 22 contributed papers, whose topics extended not only to fusion science but also to related fields such as astrophysics, earth science, fluid dynamics, molecular dynamics, and computer science. (author)

  13. An efficient and novel computation method for simulating diffraction patterns from large-scale coded apertures on large-scale focal plane arrays

    Science.gov (United States)

    Shrekenhamer, Abraham; Gottesman, Stephen R.

    2012-10-01

    A novel and memory-efficient method for computing diffraction patterns produced on large-scale focal planes by large-scale Coded Apertures, at wavelengths where diffraction effects are significant, has been developed and tested. The scheme, readily implementable on portable computers, overcomes the memory limitations of present state-of-the-art simulation codes such as Zemax. The method consists of first calculating a set of reference complex field (amplitude and phase) patterns on the focal plane produced by a single (reference) central hole, extending to twice the focal plane array size, with one such pattern for each Line-of-Sight (LOS) direction and wavelength in the scene, and with the pattern amplitude corresponding to the square-root of the spectral irradiance from each such LOS direction in the scene at selected wavelengths. Next, the set of reference patterns is transformed to generate pattern sets for other holes. The transformation consists of a translational pattern shift corresponding to each hole's position offset and an electrical phase shift corresponding to each hole's position offset and the incoming radiance's direction and wavelength. The set of complex patterns for each direction and wavelength is then summed coherently and squared for each detector to yield a set of power patterns unique for each direction and wavelength. Finally, the set of power patterns is summed to produce the full waveband diffraction pattern from the scene. With this tool researchers can now efficiently simulate diffraction patterns produced from scenes by large-scale Coded Apertures onto large-scale focal plane arrays to support the development and optimization of coded aperture masks and image reconstruction algorithms.
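
    A one-dimensional toy version of the computation described above, with hypothetical sizes, units, and hole positions: one reference complex field for the central hole, a translational shift plus an electrical phase shift per hole for a given line of sight, a coherent sum, and a final squaring to obtain the power pattern.

        import numpy as np

        n = 1024                                       # focal-plane samples
        x = np.arange(n) - n // 2
        wavelength, pitch, focal = 10.0, 4.0, 2000.0   # arbitrary consistent units

        # Reference field of the central hole (sinc-like single-slit pattern).
        ref = np.sinc(x * pitch / (wavelength * focal)).astype(complex)

        holes = [-60, -12, 0, 25, 71]                  # hole offsets, in samples
        theta = 0.002                                  # one LOS direction (radians)
        field = np.zeros(n, dtype=complex)
        for d in holes:
            shifted = np.roll(ref, d)                  # translational pattern shift
            phase = 2 * np.pi * d * pitch * np.sin(theta) / wavelength
            field += shifted * np.exp(1j * phase)      # phase for offset and LOS

        power = np.abs(field) ** 2                     # power pattern for this LOS
        print(power.max())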

  14. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.
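
    Hyperuniformity is commonly diagnosed through the small-wavevector behavior of the structure factor S(k). The generic sketch below (not the paper's analysis) estimates S(k) for a Poisson point pattern as a non-hyperuniform baseline: S stays near 1 at small k, whereas a hyperuniform configuration would show S(k) -> 0.

        import numpy as np

        rng = np.random.default_rng(0)
        L, N = 100.0, 10_000
        r = rng.uniform(0, L, size=(N, 2))        # Poisson points in a box

        ks = 2 * np.pi * np.arange(1, 11) / L     # smallest allowed wavevectors
        S = []
        for k in ks:
            kvec = np.array([k, 0.0])
            amp = np.exp(-1j * (r @ kvec)).sum()  # collective density mode
            S.append(abs(amp) ** 2 / N)           # S(k) = |rho_k|^2 / N
        print(np.round(S, 2))                     # ~1 throughout: not hyperuniform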

  15. A continental-scale hydrology and water quality model for Europe: Calibration and uncertainty of a high-resolution large-scale SWAT model

    Science.gov (United States)

    Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.

    2015-05-01

    A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe groundwater quantity, and in particular quality, have come under severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economy of the eastern European block of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability, calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.
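
    In the same spirit as the calibration and uncertainty procedures discussed above, the sketch below shows a generic behavioral-sampling loop scored with Nash-Sutcliffe efficiency (NSE). The toy model, parameter ranges, and threshold are hypothetical; this is not the SWAT/SUFI-2 tooling itself.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0, 6, 120)
        obs = np.sin(t) + 1.5                     # stand-in discharge series

        def toy_model(a, b):                      # stand-in for one model run
            return a * np.sin(t) + b

        def nse(sim, obs):
            return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        # Sample parameter sets, run, score, keep the "behavioral" ones.
        samples = rng.uniform([0.5, 0.5], [1.5, 2.5], size=(500, 2))
        scores = np.array([nse(toy_model(a, b), obs) for a, b in samples])
        behavioral = samples[scores > 0.7]        # uncertainty ensemble
        print(len(behavioral), "behavioral parameter sets")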

  16. Numerical and experimental simulation of accident processes using KMS large-scale test facility under the program of training university students for nuclear power industry

    International Nuclear Information System (INIS)

    Aniskevich, Yu.N.

    2005-01-01

    The KMS large-scale test facility is being constructed at the NITI site and is designed to model accident processes in VVER reactor plants and provide experimental data for safety analysis of both existing and future NPPs. KMS phase I is at the completion stage: a containment model of 2000 m3 volume intended for experimentally simulating heat and mass transfer of steam-gas mixtures and aerosols inside containment. KMS phase II will incorporate a reactor model (1:27 scale) and be used for analysing a number of events including primary and secondary LOCA. The KMS program for background training of university students in the nuclear field will include: participation in the development and application of experiment procedures and in the preparation and conduct of experiments; analysis of experimental data; carrying out pretest and post-test calculations with different computer codes; on-the-job training as operators of experiment scenarios; and training of specialists in measurement and information acquisition technologies. (author)

  17. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

    Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars in the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today's challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today's and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for systems development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  18. Programs of the Office of Energy Research: Revision

    International Nuclear Information System (INIS)

    1987-06-01

    In establishing each of the Federal Agencies that have been successively responsible for energy technologies and their development - the Atomic Energy Commission, the Energy Research and Development Administration, and, currently, the US Department of Energy (DOE) - Congress made specific provisions for the conduct of advanced and fundamental research. The purpose of this research has been to support the energy technology development programs by providing insight into fundamental science and associated phenomena and developing new or advanced concepts and techniques. Today, this responsibility rests with the Office of Energy Research (ER), DOE, whose present programs have their origins in pioneering energy-related research of this nature, which was initiated nearly 40 years ago. The Director, Office of Energy Research, also acts as the chief scientist and scientific advisor to the Secretary of Energy for the entire spectrum of energy research and development (R and D) programs of the Department. ER programs include several thousand individual projects and hundreds of laboratories, universities, and other research facilities throughout the United States. In the following pages, each of these programs and activities is described briefly for the information of the scientific community and the public at large. 5 figs., 6 tabs

  19. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    Science.gov (United States)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional- to local-scales. While global climate models are generally capable of simulating mean climate at global-to-regional scales with reasonable skill, resiliency and adaptation decisions are made at local-scales where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local-scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.

  20. Development of large scale production of Nd-doped phosphate glasses for megajoule-scale laser systems

    International Nuclear Information System (INIS)

    Ficini, G.; Campbell, J.H.

    1996-01-01

    Nd-doped phosphate glasses are the preferred gain medium for high-peak-power lasers used for Inertial Confinement Fusion research because they have excellent energy storage and extraction characteristics. In addition, these glasses can be manufactured defect-free in large sizes and at relatively low cost. To meet the requirements of the future mega-joule size lasers, advanced laser glass manufacturing methods are being developed that would enable laser glass to be continuously produced at the rate of several thousand large (790 × 440 × 44 mm³) plates of glass per year. This represents a 10- to 100-fold improvement over the scale of present manufacturing technology.

  1. A generic library for large scale solution of PDEs on modern heterogeneous architectures

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter

    2012-01-01

    Adapting to new programming models for modern multi- and many-core architectures requires code-rewriting and changing algorithms and data structures, in order to achieve good efficiency and scalability. We present a generic library for solving large scale partial differential equations (PDEs), capable of utilizing heterogeneous CPU/GPU environments. The library can be used for fast prototyping of PDE solvers, based on finite difference approximations of spatial derivatives in one, two, or three dimensions. In order to efficiently solve large scale problems, we keep memory consumption and memory access low, using a low-storage implementation of flexible-order finite difference operators. We illustrate the use of library components by assembling such matrix-free operators to be used with one of the supported iterative solvers, such as GMRES, CG, Multigrid or Defect Correction.
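
    The low-storage, matrix-free idea at the heart of this abstract can be sketched in a few lines. The following is a minimal NumPy/SciPy illustration, not the library's actual API (which targets heterogeneous CPU/GPU back-ends): a finite-difference Laplacian is handed to the CG solver, one of the iterative methods listed above, purely through its action on a vector.

```python
# Minimal sketch (not the library's API) of a matrix-free finite-difference
# operator used with an iterative Krylov solver: only the stencil application
# is stored, so memory stays O(n) regardless of problem size.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 1000                 # interior grid points on the unit interval
h = 1.0 / (n + 1)        # grid spacing; u(0) = u(1) = 0 (Dirichlet)

def apply_laplacian(u):
    """Apply the second-order finite-difference operator for -u'' without a matrix."""
    au = np.empty_like(u)
    au[0] = (2 * u[0] - u[1]) / h**2
    au[-1] = (2 * u[-1] - u[-2]) / h**2
    au[1:-1] = (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return au

A = LinearOperator((n, n), matvec=apply_laplacian)   # defined by its action only
x = np.linspace(h, 1 - h, n)
f = np.pi**2 * np.sin(np.pi * x)                     # -u'' = f has solution sin(pi*x)

u, info = cg(A, f, atol=1e-10)                       # conjugate gradients, matrix-free
assert info == 0
print(np.max(np.abs(u - np.sin(np.pi * x))))         # O(h^2) discretization error
```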

  2. Laboratory astrophysics. Model experiments of astrophysics with large-scale lasers

    International Nuclear Information System (INIS)

    Takabe, Hideaki

    2012-01-01

    I would like to review the model experiments of astrophysics performed with the high-power, large-scale lasers constructed mainly for laser nuclear fusion research. The four research directions of this new field, named 'Laser Astrophysics', are described with four examples mainly promoted in our institute. The description is in magazine style so as to be easily understood by non-specialists. A new theory and its model experiment on the collisionless shock and particle acceleration observed in supernova remnants (SNRs) are explained in detail, and the results and coming research directions are clarified. In addition, the vacuum breakdown experiment to be realized with the near-future ultra-intense laser is also introduced. (author)

  3. Biomass Gasification - A synthesis of technical barriers and current research issues for deployment at large scale

    Energy Technology Data Exchange (ETDEWEB)

    Heyne, Stefan [Chalmers Univ. of Technology, Gothenburg (Sweden); Liliedahl, Truls [KTH, Royal Inst. of Technology, Stockholm (Sweden); Marklund, Magnus [Energy Technology Centre, Piteaa (Sweden)

    2013-09-01

    Thermal gasification at large scale for cogeneration of power and heat and/or production of fuels and materials is a main pathway for a sustainable deployment of biomass resources. However, so far no such full scale production exists and biomass gasification projects remain at the pilot or demonstration scale. This report focuses on the key critical technology challenges for the large-scale deployment of the following biomass-based gasification concepts: Direct Fluidized Bed Gasification (FBG), Entrained Flow Gasification (EFG) and indirect Dual Fluidized Bed Gasification (DFBG). The main content in this report is based on responses from a number of experts in biomass gasification obtained from a questionnaire. The survey was composed of a number of more or less specific questions on technical barriers regarding the three gasification concepts considered. For formalising the questionnaire, the concept of Technology Readiness Level (TRL 1-9) was used for grading the level of technical maturity of the different sub-processes within the three generic biomass gasification technologies. For direct fluidized bed gasification (FBG) it is mentioned that the technology is already available at commercial scale as air-blown technology, and thus air-blown FBG gasification may be reckoned a mature technology. The remaining technical challenge is the conversion to operation on oxygen with the final goal of producing chemicals or transport fuels. Tar reduction, in particular, and gas cleaning and upgrading in general are by far the most frequently named technical issues considered problematic. Other important aspects are problems that may occur when operating on low-grade fuels - i.e. low-cost fuels. These problems include bed agglomeration/ash sintering as well as alkali fouling. Even the preparation and feeding of these low-grade fuels tend to be problematic and require further development to be used on a commercial scale. Furthermore, efficient char conversion is mentioned by several of the experts as a remaining challenge.

  4. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large density fluctuations on small scales and small fluctuations on large scales. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  5. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and to the requirements for a good transfer of the results to an actual vessel. At the same time, the possibilities of small-scale model experiments are analysed, mostly in connection with the transfer of results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  6. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    The chapter discusses which kind of attitude is appropriate when dealing with large-scale changes from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach.

  7. Large scale FCI experiments in subassembly geometry. Test facility and model experiments

    International Nuclear Information System (INIS)

    Beutel, H.; Gast, K.

    A program is outlined for the study of fuel/coolant interaction under SNR conditions. The program consists of a) underwater explosion experiments with full-size models of the SNR core, in which the fuel/coolant system is simulated by a pyrotechnic mixture, and b) large-scale fuel/coolant interaction experiments with up to 5 kg of molten UO2 interacting with liquid sodium at 300 deg C to 600 deg C in a highly instrumented test facility simulating an SNR subassembly. The experimental results will be compared to theoretical models under development at Karlsruhe. Commencement of the experiments is expected for the beginning of 1975.

  8. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    Science.gov (United States)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10-year objective of establishing the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5-year program were derived from a set of 5-year objectives deduced from the 10-year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 × 10⁸ peak W/year of $0.50 cells was projected for the year 1985. The development of other photovoltaic conversion systems was assigned to longer-range development roles. The status of the technology developments and the applicability of solar arrays to particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5-year phase of the program is $268.5M.

  9. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical dependencies between the activities of neuronal populations.

  10. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small and large scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for wave heights below Hs = 0.5 m as the water sank into the voids between the stones on the crest. For low overtopping, scale effects thus appear to be present.

  11. WVU cooperative agreement, decontamination systems information and research program, deployment support leading to implementation

    International Nuclear Information System (INIS)

    Cook, E.E.

    1996-01-01

    This program at West Virginia University is a Cooperative Agreement that focuses on R&D associated with hazardous waste remediation problems existing at DOE, Corps of Engineers, and private sector sites. The Agreement builds on a unique combination of resources coupling university researchers with DOE sponsored small businesses, leading toward field tests and large scale technology demonstrations of environmental technologies. Most of the Agreement's projects are categorized in the Technology Maturity Levels under Gates 3-Advanced Development, Gate 4-Engineering Development, and Gate 5-Demonstration. The program includes a diversity of projects: subsurface contaminants; mixed wastes; mixed wastes/efficient separations; mixed wastes/characterization, monitoring, and sensor technologies; and decontamination and decommissioning/efficient separations

  12. Large-scale immigration and political response: popular reaction in California.

    Science.gov (United States)

    Clark, W A

    1998-03-01

    Over the past 3 years, the level of political debate has grown over the nature and extent of the recent large-scale immigration to the US in general, and to California in particular. California's Proposition 187 to deny welfare benefits to illegal immigrants brought national attention to the immigration debate, and no doubt influenced recent decisions to significantly change the US's welfare program. The author studied the vote on Proposition 187 in the November 1994 California election to better understand the nature of reaction to large-scale immigration and recent arguments about anti-immigrant sentiment and nativism. The only counties which voted against the proposition were Sonoma, Marin, San Mateo, Santa Cruz, Yolo, Alameda, and Santa Clara, as well as the population of San Francisco. The vote generated political responses from across the border as well as within California. Statements from Mexican and other Central American governments reflected their concern over the possibility of returning populations, for whom there are neither jobs nor public services in their countries of origin. Findings are presented from a spatial analysis of the vote by census tracts in Los Angeles County.

  13. Large-Scale Spacecraft Fire Safety Experiments in ISS Resupply Vehicles

    Science.gov (United States)

    Ruff, Gary A.; Urban, David

    2013-01-01

    Our understanding of the fire safety risk in manned spacecraft has been limited by the small scale of the testing we have been able to conduct in low-gravity. Fire growth and spread cannot be expected to scale linearly with sample size so we cannot make accurate predictions of the behavior of realistic scale fires in spacecraft based on the limited low-g testing to date. As a result, spacecraft fire safety protocols are necessarily very conservative and costly. Future crewed missions are expected to be longer in duration than previous exploration missions outside of low-earth orbit and accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low-gravity, the need for realistic scale testing at reduced gravity has been demonstrated. To address this concern, a spacecraft fire safety research project is underway to reduce the uncertainty and risk in the design of spacecraft fire safety systems by testing at nearly full scale in low-gravity. This project is supported by the NASA Advanced Exploration Systems Program Office in the Human Exploration and Operations Mission Directorate. The activity of this project is supported by an international topical team of fire experts from other space agencies to maximize the utility of the data and to ensure the widest possible scrutiny of the concept. The large-scale space flight experiment will be conducted on three missions; each in an Orbital Sciences Corporation Cygnus vehicle after it has deberthed from the ISS. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew allows the fire products to be released into the cabin. The tests will be fully automated with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the atmosphere.

  14. Large-scale computer networks and the future of legal knowledge-based systems

    NARCIS (Netherlands)

    Leenes, R.E.; Svensson, Jorgen S.; Hage, J.C.; Bench-Capon, T.J.M.; Cohen, M.J.; van den Herik, H.J.

    1995-01-01

    In this paper we investigate the relation between legal knowledge-based systems and large-scale computer networks such as the Internet. On the one hand, researchers of legal knowledge-based systems have claimed huge possibilities, but despite the efforts over the last twenty years, the number of systems in actual use has remained small.

  15. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Full Text Available Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and the complexity of biomedical data collected from various sources. These planet-size data bring serious challenges to storage and computing technologies. Cloud computing is an alternative suited to crack this nut because it gives concurrent consideration to storage and high-performance computing on large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should facilitate biomedical research by making the vast amount of diverse data meaningful and usable.

  16. Development of Large-Scale Spacecraft Fire Safety Experiments

    DEFF Research Database (Denmark)

    Ruff, Gary A.; Urban, David L.; Fernandez-Pello, A. Carlos

    2013-01-01

    Future crewed missions are expected to be longer in duration than previous exploration missions outside of low-earth orbit and, accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low-gravity, a spacecraft fire safety research project is underway to reduce the uncertainty and risk in the design of spacecraft fire safety systems and to improve our understanding of the spacecraft fire safety risk. The activity of this project is supported by an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The large-scale space flight experiment will be conducted in an Orbital Sciences Corporation Cygnus vehicle after it has deberthed from the ISS.

  17. Lichen elemental content bioindicators for air quality in upper Midwest, USA: A model for large-scale monitoring

    Science.gov (United States)

    Susan Will-Wolf; Sarah Jovan; Michael C. Amacher

    2017-01-01

    Our development of lichen elemental content bioindicators for a United States of America (USA) national monitoring program is a useful model for other large-scale programs. Concentrations of 20 elements were measured, validated, and analyzed for 203 samples of five common lichen species. Collections were made by trained non-specialists near 75 permanent plots and by an expert.

  18. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    International Nuclear Information System (INIS)

    Youmei, Han; Bogang, Yang

    2014-01-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are mostly based on manual surveying and photogrammetry. Manual mapping is inefficient and limited by the environment, while photogrammetric methods (such as low-altitude aerial mapping) are an economical and effective way to map wide, regular areas at large scale but do not work well in small areas due to the high cost in manpower and resources. In recent years, vehicle-borne LIDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of vehicle-borne LIDAR technology for fast mapping of large-scale topographic maps, based on the new Chinese vehicle-borne LIDAR system. It studied how to use this new measurement technology to produce large-scale topographic maps. After the field data capture, the maps can be produced in the office from the LIDAR data (point cloud) by software we programmed ourselves. In addition, the detailed process and an accuracy analysis are presented through an actual case. The results show that this new technology provides a fast method to generate large-scale topographic maps, with high efficiency and accuracy compared to traditional methods.

  19. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. The cell structure of the galaxy distribution in the Universe and the principles of mathematical modelling of the galaxy distribution are described. Images of the cell structure, obtained after computer processing, are given. Three hypotheses - vortical, entropic, and adiabatic - suggesting different processes for the origin of galaxies and galaxy clusters are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a method of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to the study of the hot gas contained in galaxy clusters, and of the interactions within galaxy clusters and with the inter-galaxy medium, is recognized to be a notable contribution to the development of theoretical and observational cosmology

  20. Large-Scale Submerged Aquatic Vegetation Restoration in Chesapeake Bay: Status Report, 2003-2006

    National Research Council Canada - National Science Library

    Shafer, Deborah J; Bergstrom, Peter

    2008-01-01

    New techniques and equipment developed as part of this research have introduced the capability to collect and disperse millions of eelgrass seeds. These results demonstrate the programs' success in developing the tools and techniques necessary to plant SAV at scales unattainable with technologies existing only a few years ago.

  1. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-digital converter recording to a laptop. The results of recording surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases where permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.

  2. Modeling and control of a large nuclear reactor. A three-time-scale approach

    Energy Technology Data Exchange (ETDEWEB)

    Shimjith, S.R. [Indian Institute of Technology Bombay, Mumbai (India); Bhabha Atomic Research Centre, Mumbai (India); Tiwari, A.P. [Bhabha Atomic Research Centre, Mumbai (India); Bandyopadhyay, B. [Indian Institute of Technology Bombay, Mumbai (India). IDP in Systems and Control Engineering

    2013-07-01

    This monograph presents recent research on the modeling and control of a large nuclear reactor using a three-time-scale approach, written by leading experts in the field. Control analysis and design of large nuclear reactors requires a suitable mathematical model representing the steady state and dynamic behavior of the reactor with reasonable accuracy. This task is, however, quite challenging because of several complex dynamic phenomena existing in a reactor. Quite often, the models developed are of prohibitively large order, non-linear, and of complex structure not readily amenable to control studies. Moreover, the existence of simultaneously occurring dynamic variations at different speeds makes the mathematical model susceptible to numerical ill-conditioning, inhibiting direct application of standard control techniques. This monograph introduces a technique for the mathematical modeling of large nuclear reactors in the framework of multi-point kinetics, to obtain a comparatively smaller order model in standard state space form, thus overcoming these difficulties. It further brings in innovative methods of controller design for systems exhibiting the multi-time-scale property, with emphasis on three-time-scale systems.

  3. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
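
    The per-pixel linked list itself is simple to sketch. The thesis constructs it on the GPU; the CPU-side Python below is an illustrative assumption about the layout (a head index per pixel plus flat payload/next arrays), not the author's implementation:

```python
# Illustrative CPU sketch of a per-pixel linked list: one head index per pixel,
# plus flat 'payload' and 'next' arrays forming a node pool. Each inserted
# pathline segment is prepended to the list of the pixel it covers.
import numpy as np

W, H = 64, 64
head = np.full(W * H, -1, dtype=np.int64)   # -1 marks an empty list
payload, next_idx = [], []                  # grow-only node pool

def insert(px, py, segment_attrs):
    """Prepend a segment record to the linked list of pixel (px, py)."""
    node = len(payload)
    payload.append(segment_attrs)           # e.g. (pathline id, depth, scalar value)
    next_idx.append(head[py * W + px])      # link new node to the previous head
    head[py * W + px] = node

def pixel_segments(px, py):
    """Walk the list of one pixel, yielding its segment records."""
    node = head[py * W + px]
    while node != -1:
        yield payload[node]
        node = next_idx[node]

# Usage: insert two overlapping segments, then filter one pixel by pathline id,
# mimicking the filter/color-code operations described in the abstract.
insert(3, 5, (0, 0.2, 1.7))
insert(3, 5, (1, 0.6, 0.4))
print([s for s in pixel_segments(3, 5) if s[0] == 1])
```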

  4. Research on a Small Signal Stability Region Boundary Model of the Interconnected Power System with Large-Scale Wind Power

    Directory of Open Access Journals (Sweden)

    Wenying Liu

    2015-03-01

    Full Text Available For the interconnected power system with large-scale wind power, the problem of small signal stability has become the bottleneck restricting the sending-out of wind power as well as the security and stability of the whole power system. Around this issue, this paper establishes a small signal stability region boundary model of the interconnected power system with large-scale wind power based on catastrophe theory, providing a new method for analyzing small signal stability. Firstly, we analyzed the typical characteristics and the mathematical model of the interconnected power system with wind power and pointed out that conventional methods cannot directly identify the topological properties of small signal stability region boundaries. Secondly, to address this problem, we adopted catastrophe theory to establish a small signal stability region boundary model of the interconnected power system with large-scale wind power in two-dimensional power injection space, and extended it to multiple dimensions to obtain the boundary model in multidimensional power injection space. Thirdly, we analyzed qualitatively the changes in the topological properties of the small signal stability region boundary caused by large-scale wind power integration. Finally, we built simulation models with the DIgSILENT/PowerFactory software, and the simulation results verified the correctness and effectiveness of the proposed model.

  5. Exploratory Technology Research Program for electrochemical energy storage

    Science.gov (United States)

    Kinoshita, Kim

    1994-09-01

    The U.S. Department of Energy's Office of Propulsion Systems provides support for an Electrochemical Energy Storage Program that includes research and development (R&D) on advanced rechargeable batteries and fuel cells. A major goal of this program is to develop electrochemical power sources suitable for application in electric vehicles (EVs). The program centers on advanced systems that offer the potential for high performance and low life-cycle costs, both of which are necessary to permit significant penetration into commercial markets. The DOE Electrochemical Energy Storage Program is divided into two projects: the Electric Vehicle Advanced Battery Systems (EVABS) Development Program and the Exploratory Technology Research (ETR) Program. The EVABS Program management responsibility has been assigned to Sandia National Laboratories (SNL); Lawrence Berkeley Laboratory (LBL) is responsible for management of the ETR Program. The EVABS and ETR Programs include an integrated matrix of R&D efforts designed to advance progress on selected candidate electrochemical systems. The United States Advanced Battery Consortium (USABC), a tripartite undertaking between DOE, the U.S. automobile manufacturers and the Electric Power Research Institute (EPRI), was formed in 1991 to accelerate the development of advanced batteries for consumer EVs. The role of the ETR Program is to perform supporting research on the advanced battery systems under development by the USABC and EVABS Program, and to evaluate new systems with potentially superior performance, durability and/or cost characteristics. The specific goal of the ETR Program is to identify the most promising electrochemical technologies and transfer them to the USABC, the battery industry and/or the EVABS Program for further development and scale-up. This report summarizes the research, financial and management activities relevant to the ETR Program in CY 1993.

  6. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
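
    To make the distinction concrete, a minimal sketch of the two operators and the homogenized limit follows (our notation; the harmonic-mean form of the effective coefficient follows the derivation in Garlick et al. 2011, with corrector terms omitted):

```latex
% Fickian vs. ecological diffusion, and the leading-order homogenized limit.
% Sketch only: regularity assumptions and corrector terms are omitted.
\begin{align}
  \text{Fickian:}    \quad & \frac{\partial u}{\partial t}
      = \nabla \cdot \bigl( \mu(x)\, \nabla u \bigr), \\
  \text{Ecological:} \quad & \frac{\partial u}{\partial t}
      = \nabla^{2} \bigl( \mu(x)\, u \bigr).
\end{align}
% Writing w = \mu u and averaging over the fast scale gives, at leading order,
\begin{equation}
  \frac{\partial w}{\partial t} \approx \bar{\mu}\, \nabla^{2} w,
  \qquad
  \bar{\mu} = \left( \frac{1}{|\Omega|} \int_{\Omega} \frac{dx}{\mu(x)} \right)^{-1},
  \qquad
  u = \frac{w}{\mu(x)}.
\end{equation}
```

    The large-scale problem is thus solved once with a single effective coefficient (a harmonic mean, so low-motility habitat dominates), while small-scale habitat structure re-enters through the division by μ(x).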

  7. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  8. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  9. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5KW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  10. Lessons from a large-scale assessment: Results from conceptual inventories

    Directory of Open Access Journals (Sweden)

    Beth Thacker

    2014-07-01

    Full Text Available We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER) into a department where most instruction has previously been traditional and a significant number of faculty are hesitant, ambivalent, or even resistant to the introduction of such reforms. Data were collected in all of the sections of both the large algebra- and calculus-based introductory courses for a number of years employing commonly used conceptual inventories. Results from a small PER-informed, inquiry-based, laboratory-based class are also reported. Results suggest that when PER-informed materials are introduced in the labs and recitations, independent of the lecture style, there is an increase in students' conceptual inventory gains. There is also an increase in the results on conceptual inventories if PER-informed instruction is used in the lecture. The highest conceptual inventory gains were achieved by the combination of PER-informed lectures and laboratories in large class settings and by the hands-on, laboratory-based, inquiry-based course taught in a small class setting.

  11. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) in large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy by evaluating each category's discrimination. Finally, we propose the online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
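
    The first step of the pipeline, latent semantic analysis over visual-word statistics, can be illustrated with a small sketch. This is not the paper's full LCL method (which adds semantic object representation, a discrimination-based category selection, and an online variant); the data here are hypothetical region-by-visual-word counts:

```python
# Minimal sketch of the latent-category idea (not the paper's full LCL pipeline):
# latent semantic analysis via truncated SVD of a region-by-visual-word count
# matrix, so that co-occurring visual words group into latent "categories"
# (objects, object parts, or backgrounds such as sky).
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 200 image regions described by counts over 50 visual words.
X = rng.poisson(1.0, size=(200, 50)).astype(float)

# LSA: rank-k SVD of the (mean-centered) count matrix.
k = 5
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
region_topics = U[:, :k] * s[:k]   # each region projected onto k latent categories
topic_words = Vt[:k]               # each latent category as a direction over visual words

# Assign every region to its dominant latent category; a discrimination score
# over these assignments would then pick the category containing the target object.
labels = np.argmax(np.abs(region_topics), axis=1)
print(np.bincount(labels, minlength=k))
```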

  12. Adaptive Virtual Tow Bar, research results 2016

    NARCIS (Netherlands)

    Willemsen, D.M.C.; Hueting, T.F.; Joosten, B.; Uittenbogaard, J.; Martens, M.H.

    2017-01-01

    This document reports the advances made in 2016 for the Early Research Program (ERP) Human Enhancement: Adaptive Automation, sub-project Adaptive Virtual Tow Bar. The ambition of the large-scale TNO Early Research Program (ERP) Human Enhancement is to develop a transparent (human-in-the-loop) form of adaptive automation.

  13. Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor

    Directory of Open Access Journals (Sweden)

    Jonathan Sheu

    Full Text Available Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research have demanded their production on a much larger scale, a task that can be difficult to manage with the numbers of producer cell culture flasks required for large volumes of vector. To generate a large scale, partially closed system method for the manufacturing of clinical grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm2 flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open system manipulation.

  14. [Results and evaluation of 3 years of a large scale mammography program in the Ariana area of Tunisia].

    Science.gov (United States)

    Bouchlaka, A; Ben Abdallah, M; Ben Aissa, R; Zaanouni, E; Kribi, L; Smida, S; M'barek, F; Ben Hamida, A; Boussen, H; Gueddana, N

    2009-07-01

    To assess and analyse the results of 3 years of large-scale mammography screening for breast cancer in the Ariana area of Tunisia. This program, managed by the National Office of Family and Population, was addressed to women aged 49 to 69 years residing in an area with a population of 459,700 inhabitants, including 52,729 women in the target age range. The screening included a clinical breast examination and a mammography with two views, frontal and external profile. The women were invited at their residence, or were informed in reproductive health centers, primary health care centers, or by a close relation who had heard of the program. Informed consent was obtained from the women who wished to benefit from the screening. In three years, 9,093 mammographies were carried out, of which 8,244 were retained in the analyses, giving a participation rate of 9.6%. The rate of women recalled for a suspect test was 18.1%, and 13.1% among women over 50 years. The rate of surgical biopsies performed was 0.5% and the positive predictive value was 45.5%. The average time between the screening and its result was 9.7 days, longer for tests requiring a complementary assessment (61.7 days). In all, 40 cancers were detected by the program, a crude detection rate of 4.9 per thousand, in conformity with the recommendations. A substantial proportion of the detected cancers were invasive or subclinical.

  15. A Dynamic Optimization Strategy for the Operation of Large Scale Seawater Reverses Osmosis System

    Directory of Open Access Journals (Sweden)

    Aipeng Jiang

    2014-01-01

    Full Text Available In this work, an efficient strategy is proposed for the solution of the dynamic model of an SWRO system. Since the dynamic model is formulated as a set of differential-algebraic equations, simultaneous strategies based on collocation on finite elements were used to transform the dynamic optimization problem (DAOP) into a large-scale nonlinear programming problem named Opt2. Then, simulation of the RO process and storage tanks was carried out element by element and step by step with fixed control variables, and the obtained values of these variables were used as the initial point for the optimal solution of the SWRO system. Finally, in order to accelerate computation while keeping enough accuracy in the solution of Opt2, a simple but efficient finite element refinement rule was used to reduce the scale of Opt2. The proposed strategy was applied to a large-scale SWRO system with 8 RO plants and 4 storage tanks as a case study. Computing results show that the proposed strategy is quite effective for optimal operation of the large-scale SWRO system; the optimization problem can be successfully solved within tens of iterations and several minutes, even when load and other operating parameters fluctuate.
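
    The simultaneous (collocation-based) transcription described above can be illustrated on a toy problem. The sketch below uses a single backward-Euler collocation point per finite element for brevity and SciPy's SLSQP in place of a large-scale NLP solver; the plant x' = -x + u is a hypothetical stand-in for the SWRO and tank dynamics, and the constant initial guess plays the role of the simulation-based initialization described in the abstract.

```python
# Hedged sketch of the simultaneous approach: discretize the dynamics on finite
# elements (one backward-Euler collocation point per element here) and hand the
# whole problem, states and controls together, to an NLP solver.
import numpy as np
from scipy.optimize import minimize

N, T = 20, 2.0
h = T / N

def unpack(z):
    return z[:N + 1], z[N + 1:]          # states x_0..x_N, controls u_1..u_N

def objective(z):
    x, u = unpack(z)
    return h * np.sum(u**2) + 50.0 * (x[-1] - 0.5) ** 2   # control energy + terminal target

def collocation_residuals(z):
    x, u = unpack(z)
    # Backward-Euler collocation on each element: x_k - x_{k-1} - h*(-x_k + u_k) = 0
    return x[1:] - x[:-1] - h * (-x[1:] + u)

cons = [{"type": "eq", "fun": collocation_residuals},
        {"type": "eq", "fun": lambda z: unpack(z)[0][0] - 1.0}]   # x(0) = 1

z0 = np.concatenate([np.ones(N + 1), np.zeros(N)])   # initial guess for the NLP
sol = minimize(objective, z0, constraints=cons, method="SLSQP")
x_opt, u_opt = unpack(sol.x)
print(sol.success, x_opt[-1])
```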

  16. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
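
    In the same spirit (though not the authors' exact definitions), the three measures can be estimated from a digitized image as follows; the palette size, the lag values, and the increment-based roughness estimator are illustrative assumptions:

```python
# Hedged sketch of the three image measures: color usage as a histogram over a
# quantized palette, color variety as the number of distinct palette entries,
# and a roughness exponent fitted from how brightness increments grow with lag.
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((256, 256, 3))              # stand-in for a digitized painting

# Color usage / variety over a quantized palette (8 levels per channel).
q = (img * 7).astype(int)
codes = q[..., 0] * 64 + q[..., 1] * 8 + q[..., 2]
usage = np.bincount(codes.ravel(), minlength=512)
variety = np.count_nonzero(usage)

# Roughness of the brightness: fit sigma(d) ~ d^H over horizontal lags d.
bright = img.mean(axis=2)
lags = np.array([1, 2, 4, 8, 16, 32])
sig = [np.std(bright[:, d:] - bright[:, :-d]) for d in lags]
H = np.polyfit(np.log(lags), np.log(sig), 1)[0]   # roughness exponent estimate

print(variety, round(H, 3))
```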

  17. Ready-to-use foods for management of moderate acute malnutrition: considerations for scaling up production and use in programs.

    Science.gov (United States)

    Osendarp, Saskia; Rogers, Beatrice; Ryan, Kelsey; Manary, Mark; Akomo, Peter; Bahwere, Paluku; Belete, Hilina; Zeilani, Mamane; Islam, Munirul; Dibari, Filippo; De Pee, Saskia

    2015-03-01

    Ready-to-use foods are one of the available strategies for the treatment of moderate acute malnutrition (MAM), but challenges remain in the use of these products in programs at scale. This paper focuses on two challenges: the need for cheaper formulations using locally available ingredients that are processed in a safe, reliable, and financially sustainable local production facility; and the effective use of these products in large-scale community-based programs. Linear programming tools can be used successfully to design local compositions that are in line with international guidelines, low in cost, and acceptable, and the efficacy of these local formulations in the treatment of MAM was recently demonstrated in Malawi. The production of local formulations for programs at scale relies on the existence of a reliable and efficient local production facility. Technical assistance may be required in the development of sustainable business models at an early stage in the process, taking into account the stringent product quality and safety criteria and the required investments. The use of ready-to-use products, as of any food supplement, in programs at scale will be affected by the practice of household sharing and diversion of these products for other uses. Additional measures can be considered to account for sharing. These products designed for the treatment and prevention of MAM are to be used in community-based programs and should therefore be used in conjunction with other interventions and designed so that they do not replace the intake of other foods and breastmilk. Remaining challenges and implications for the (operations) research agenda are discussed.
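
    As an illustration of the linear programming step mentioned above, the classic diet-style formulation minimizes ingredient cost subject to nutrient floors and a mass balance. All ingredient names, prices, and nutrient values below are hypothetical placeholders, not a validated ready-to-use food recipe; real programs add many more nutrients plus palatability and guideline bounds.

```python
# Minimal sketch of a least-cost formulation LP (hypothetical numbers throughout).
import numpy as np
from scipy.optimize import linprog

ingredients = ["maize flour", "soy flour", "oil", "sugar", "milk powder"]
cost = np.array([0.4, 0.9, 1.5, 0.6, 3.0])        # $/100 g, hypothetical

# Nutrients per 100 g of each ingredient (hypothetical): energy (kcal), protein (g), fat (g)
nutrients = np.array([
    [360, 340, 880, 400, 500],   # energy
    [  9,  35,   0,   0,  26],   # protein
    [  4,  19, 100,   0,  26],   # fat
])
minimums = np.array([500, 15, 25])                # required per 100 g of product

res = linprog(
    c=cost,                                       # minimize cost
    A_ub=-nutrients, b_ub=-minimums,              # nutrients @ x >= minimums
    A_eq=np.ones((1, len(cost))), b_eq=[1.0],     # ingredient fractions sum to one
    bounds=[(0, 1)] * len(cost),
)
print(res.status, dict(zip(ingredients, np.round(res.x, 3))))
```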

  18. Extreme Scale Computing for First-Principles Plasma Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Choogn-Seock [Princeton University

    2011-10-12

    World superpowers are in the middle of the “Computnik” race. The US Department of Energy (and National Nuclear Security Administration) wishes to launch exascale computer systems into the scientific (and national security) world by 2018. The objective is to solve important scientific problems and to predict the outcomes using the most fundamental scientific laws, which would not be possible otherwise. Being chosen into the next “frontier” group can be of great benefit to a scientific discipline. An extreme scale computer system requires different types of algorithms and programming philosophy from those we have been accustomed to. Only a handful of scientific codes are blessed to be capable of scalable usage of today’s largest computers in operation at petascale (using more than 100,000 cores concurrently). Fortunately, a few magnetic fusion codes are competing well in this race using the “first principles” gyrokinetic equations. These codes are beginning to study the fusion plasma dynamics in full-scale realistic diverted device geometry in its natural nonlinear multiscale setting, including the large-scale neoclassical and small-scale turbulence physics, but excluding some ultrafast dynamics. In this talk, most of the above-mentioned topics will be introduced at an executive level. Representative properties of extreme scale computers, modern programming exercises to take advantage of them, and different philosophies in the data flows and analyses will be presented. Examples of the multi-scale multi-physics scientific discoveries made possible by solving the gyrokinetic equations on extreme scale computers will be described. Future directions toward “virtual tokamak experiments” will also be discussed.

  19. Do you kiss your mother with that mouth? An authentic large-scale undergraduate research experience in mapping the human oral microbiome.

    Science.gov (United States)

    Wang, Jack T H; Daly, Joshua N; Willner, Dana L; Patil, Jayee; Hall, Roy A; Schembri, Mark A; Tyson, Gene W; Hugenholtz, Philip

    2015-05-01

    Clinical microbiology testing is crucial for the diagnosis and treatment of community and hospital-acquired infections. Laboratory scientists need to utilize technical and problem-solving skills to select from a wide array of microbial identification techniques. The inquiry-driven laboratory training required to prepare microbiology graduates for this professional environment can be difficult to replicate within undergraduate curricula, especially in courses that accommodate large student cohorts. We aimed to improve undergraduate scientific training by engaging hundreds of introductory microbiology students in an Authentic Large-Scale Undergraduate Research Experience (ALURE). The ALURE aimed to characterize the microorganisms that reside in the healthy human oral cavity-the oral microbiome-by analyzing hundreds of samples obtained from student volunteers within the course. Students were able to choose from selective and differential culture media, Gram-staining, microscopy, as well as polymerase chain reaction (PCR) and 16S rRNA gene sequencing techniques, in order to collect, analyze, and interpret novel data to determine the collective oral microbiome of the student cohort. Pre- and postsurvey analysis of student learning gains across two iterations of the course (2012-2013) revealed significantly higher student confidence in laboratory skills following the completion of the ALURE (p < 0.05 using the Mann-Whitney U-test). Learning objectives on effective scientific communication were also met through effective student performance in laboratory reports describing the research outcomes of the project. The integration of undergraduate research in clinical microbiology has the capacity to deliver authentic research experiences and improve scientific training for large cohorts of undergraduate students.

  20. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed, and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise beyond their own team's boundaries.

  1. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  2. Large-scale transportation network congestion evolution prediction using deep learning theory.

    Science.gov (United States)

    Ma, Xiaolei; Yu, Haiyang; Wang, Yunpeng; Wang, Yinhai

    2015-01-01

    Understanding how congestion at one location can cause ripples throughout a large-scale transportation network is vital for transportation researchers and practitioners seeking to pinpoint traffic bottlenecks for congestion mitigation. Traditional studies rely on either mathematical equations or simulation techniques to model traffic congestion dynamics. However, most of these approaches have limitations, largely due to unrealistic assumptions and cumbersome parameter calibration processes. With the development of Intelligent Transportation Systems (ITS) and the Internet of Things (IoT), transportation data have become more and more ubiquitous. This has triggered a series of data-driven studies of transportation phenomena. Among the candidate techniques, deep learning is considered one of the most promising for handling massive high-dimensional data. This study attempts to extend deep learning theory to large-scale transportation network analysis. A deep Restricted Boltzmann Machine and Recurrent Neural Network architecture is utilized to model and predict traffic congestion evolution based on Global Positioning System (GPS) data from taxis. A numerical study in Ningbo, China is conducted to validate the effectiveness and efficiency of the proposed method. Results show that the prediction accuracy can reach as high as 88% in less than 6 minutes when the model is implemented in a Graphics Processing Unit (GPU)-based parallel computing environment. The predicted congestion evolution patterns can be visualized temporally and spatially through a map-based platform to identify vulnerable links for proactive congestion mitigation.
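    As a rough illustration of the Restricted Boltzmann Machine building block this record describes, the following is a minimal NumPy sketch of one contrastive-divergence (CD-1) training step over binary congestion indicators. The layer sizes, learning rate, and data are hypothetical, and the recurrent layer and GPU parallelization of the actual study are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n_links, n_hidden = 20, 8          # hypothetical: 20 road links, 8 hidden features
W = 0.01 * rng.standard_normal((n_links, n_hidden))
b_v, b_h = np.zeros(n_links), np.zeros(n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, lr=0.05):
    """One CD-1 update on a batch of binary congestion vectors, shape (batch, n_links)."""
    global W, b_v, b_h
    p_h0 = sigmoid(v0 @ W + b_h)                    # positive phase
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    p_v1 = sigmoid(h0 @ W.T + b_v)                  # reconstruction
    p_h1 = sigmoid(p_v1 @ W + b_h)                  # negative phase
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(v0)
    b_v += lr * (v0 - p_v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# Toy data: each row is a snapshot of congested (1) / free-flow (0) links.
batch = (rng.random((64, n_links)) < 0.3).astype(float)
for _ in range(200):
    cd1_step(batch)
```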

  3. How Can the Evidence from Global Large-scale Clinical Trials for Cardiovascular Diseases be Improved?

    Science.gov (United States)

    Sawata, Hiroshi; Tsutani, Kiichiro

    2011-06-29

    Clinical investigations are important for obtaining evidence to improve medical treatment. Large-scale clinical trials with thousands of participants are particularly important for this purpose in cardiovascular diseases. Conducting large-scale clinical trials entails high research costs. This study sought to investigate global trends in large-scale clinical trials in cardiovascular diseases. We searched for trials using clinicaltrials.gov (URL: http://www.clinicaltrials.gov/) using the key words 'cardio' and 'event' in all fields on 10 April, 2010. We then selected trials with 300 or more participants examining cardiovascular diseases. The search revealed 344 trials that met our criteria. Of the 344 trials, 71% were randomized controlled trials (RCTs), 15% involved more than 10,000 participants, and 59% were funded by industry. In RCTs whose results were disclosed, 55% of industry-funded trials and 25% of non-industry funded trials reported statistically significant superiority over control (p = 0.012, 2-sided Fisher's exact test). Our findings highlight concerns regarding potential bias related to funding sources and underline the importance of trial information disclosure and the management of conflicts of interest. Continued attention to management and training regarding information disclosure and conflicts of interest for researchers could lead to better clinical evidence and further improvements in the development of medical treatment worldwide.
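    To reproduce the style of comparison reported above (55% vs. 25% of trials reporting significant superiority, assessed with a two-sided Fisher's exact test), a minimal SciPy sketch follows. The cell counts are hypothetical, since the abstract reports only percentages, so the printed p-value will not match the published one exactly.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = industry vs. non-industry funded RCTs,
# columns = significant superiority reported vs. not reported.
table = [[55, 45],
         [25, 75]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.4g}")
```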

  4. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This technical note (ERDC/CHL CHETN-I-88, April 2016; approved for public release, distribution unlimited) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to its measurement systems, including pump flow meters, sediment trap weigh tanks, and beach-profiling lidar. The purpose of these upgrades was to increase the capability and accuracy of the facility's measurements, and a detailed discussion of the original LSTF features and capabilities is also provided.

  5. A decomposition heuristics based on multi-bottleneck machines for large-scale job shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Yingni Zhai

    2014-10-01

    Full Text Available Purpose: A decomposition heuristic based on multi-bottleneck machines for large-scale job shop scheduling problems (JSP) is proposed. Design/methodology/approach: In the algorithm, a number of sub-problems are constructed by iteratively decomposing the large-scale JSP according to the process route of each job. The solution of the large-scale JSP can then be obtained by iteratively solving the sub-problems. In order to improve the sub-problems' solving efficiency and the solution quality, a detection method for multi-bottleneck machines based on the critical path is proposed, by which the unscheduled operations can be divided into bottleneck operations and non-bottleneck operations. Following the principle of "the bottleneck leads the performance of the whole manufacturing system" in the Theory of Constraints (TOC), the bottleneck operations are scheduled by a genetic algorithm for high solution quality, and the non-bottleneck operations are scheduled by dispatching rules to improve solving efficiency (a toy sketch of such a rule follows this record). Findings: In the process of sub-problem construction, some operations in the previously scheduled sub-problem are moved into the successive sub-problem for re-optimization; this strategy improves the solution quality of the algorithm. In solving the sub-problems, evaluating a chromosome's fitness by predicting the global scheduling objective value also improves the solution quality. Research limitations/implications: This research makes some assumptions that reduce the complexity of the large-scale scheduling problem: the processing route of each job is predetermined, the processing time of each operation is fixed, there are no machine breakdowns, and no preemption of operations is allowed. These assumptions should be reconsidered if the algorithm is used in an actual job shop. Originality/value: The research provides an efficient scheduling method for the
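    As a toy illustration of the dispatching-rule component mentioned above (the genetic-algorithm scheduling of bottleneck operations is omitted), the sketch below applies a greedy shortest-processing-time (SPT) rule to a tiny job shop. The jobs, machines, and times are invented for the example.

```python
# Hypothetical job shop: each job is an ordered list of (machine, time) operations.
jobs = [[(0, 3), (1, 2)],
        [(1, 4), (0, 1)],
        [(0, 2), (1, 3)]]

def spt_schedule(jobs):
    """Greedy shortest-processing-time (SPT) dispatching."""
    machine_free = {}                  # machine id -> time it becomes free
    job_ready = [0.0] * len(jobs)      # job id -> earliest start of its next op
    next_op = [0] * len(jobs)
    schedule = []
    while any(next_op[j] < len(jobs[j]) for j in range(len(jobs))):
        # Among jobs with work left, dispatch the shortest next operation.
        j = min((j for j in range(len(jobs)) if next_op[j] < len(jobs[j])),
                key=lambda j: jobs[j][next_op[j]][1])
        machine, p = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_free.get(machine, 0.0))
        machine_free[machine] = job_ready[j] = start + p
        schedule.append((j, next_op[j], machine, start, start + p))
        next_op[j] += 1
    return schedule

for job, op, machine, start, end in spt_schedule(jobs):
    print(f"job {job} op {op} on machine {machine}: {start}-{end}")
```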

  6. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied, owing to their application potential in human behavior prediction and recommendation and in the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, covering both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model with two essential ingredients, i.e., preferential return and exploration, plus a Gaussian distribution assumption on the exploration tendency parameter, is proposed (a minimal sketch of the mechanism follows this record); it outperforms existing human mobility models in scenarios on large geographical scales.
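    A minimal sketch of the exploration-and-preferential-return mechanism named in this record is given below. The parameter values are hypothetical; in the model described above, the exploration-tendency parameter would additionally be drawn from a Gaussian across individuals, which a caller could do before invoking the function.

```python
import numpy as np

rng = np.random.default_rng(1)

def epr_trajectory(steps=1000, rho=0.6, gamma=0.21):
    """Exploration and preferential return (EPR) walk.
    With probability rho * S**(-gamma), where S is the number of distinct
    locations seen so far, the walker explores a new location; otherwise it
    returns to a known location with probability proportional to its past
    visit count. rho and gamma here are illustrative values."""
    visits = {0: 1}            # location id -> visit count
    next_id, traj = 1, [0]
    for _ in range(steps):
        S = len(visits)
        if rng.random() < rho * S ** (-gamma):
            loc = next_id      # explore a previously unseen location
            next_id += 1
            visits[loc] = 1
        else:                  # preferential return to a known location
            locs = list(visits)
            counts = np.array([visits[l] for l in locs], dtype=float)
            loc = int(rng.choice(locs, p=counts / counts.sum()))
            visits[loc] += 1
        traj.append(loc)
    return traj

print("distinct locations visited:", len(set(epr_trajectory())))
```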

  7. Properties of large-scale methane/hydrogen jet fires

    Energy Technology Data Exchange (ETDEWEB)

    Studer, E. [CEA Saclay, DEN, LTMF Heat Transfer and Fluid Mech Lab, 91 - Gif-sur-Yvette (France); Jamois, D.; Leroy, G.; Hebrard, J. [INERIS, F-60150 Verneuil En Halatte (France); Jallais, S. [Air Liquide, F-78350 Jouy En Josas (France); Blanchetiere, V. [GDF SUEZ, 93 - La Plaine St Denis (France)

    2009-12-15

    A future economy based on the reduction of carbon-based fuels for power generation and transportation may consider hydrogen as a possible energy carrier. Extensive and widespread use of hydrogen might require a pipeline network. The alternatives might be the use of the existing natural gas network or the design of a dedicated network. Whatever the solution, mixing hydrogen with natural gas will substantially modify the consequences of accidents. The French National Research Agency (ANR) funded project called HYDROMEL focuses on these critical questions. Within this project, large-scale jet fires have been studied experimentally and numerically. The main characteristics of these flames, including visible length, radiation fluxes and blowout, have been assessed. (authors)

  8. Alignment between galaxies and large-scale structure

    International Nuclear Information System (INIS)

    Faltenbacher, A.; Li Cheng; White, Simon D. M.; Jing, Yi-Peng; Mao Shude; Wang Jie

    2009-01-01

    Based on the Sloan Digital Sky Survey DR6 (SDSS) and the Millennium Simulation (MS), we investigate the alignment between galaxies and large-scale structure. For this purpose, we develop two new statistical tools, namely the alignment correlation function and the cos(2θ)-statistic. The former is a two-dimensional extension of the traditional two-point correlation function and the latter is related to the ellipticity correlation function used for cosmic shear measurements. Both are based on the cross correlation between a sample of galaxies with orientations and a reference sample which represents the large-scale structure. We apply the new statistics to the SDSS galaxy catalog. The alignment correlation function reveals an overabundance of reference galaxies along the major axes of red, luminous (L ∼ L*) galaxies out to projected separations of 60 h⁻¹ Mpc. The signal increases with central galaxy luminosity. No alignment signal is detected for blue galaxies. The cos(2θ)-statistic yields very similar results. Starting from a MS semi-analytic galaxy catalog, we assign an orientation to each red, luminous and central galaxy, based on that of the central region of the host halo (with size similar to that of the stellar galaxy). As an alternative, we use the orientation of the host halo itself. We find a mean projected misalignment between a halo and its central region of ∼ 25 deg. The misalignment decreases slightly with increasing luminosity of the central galaxy. Using the orientations and luminosities of the semi-analytic galaxies, we repeat our alignment analysis on mock surveys of the MS. Agreement with the SDSS results is good if the central orientations are used. Predictions using the halo orientations as proxies for central galaxy orientations overestimate the observed alignment by more than a factor of 2. Finally, the large volume of the MS allows us to generate a two-dimensional map of the alignment correlation function, which shows the reference

  9. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they concern species limitations (market, biological and technological), site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sectors.

  10. Higher Education Teachers' Descriptions of Their Own Learning: A Large-Scale Study of Finnish Universities of Applied Sciences

    Science.gov (United States)

    Töytäri, Aija; Piirainen, Arja; Tynjälä, Päivi; Vanhanen-Nuutinen, Liisa; Mäki, Kimmo; Ilves, Vesa

    2016-01-01

    In this large-scale study, higher education teachers' descriptions of their own learning were examined with qualitative analysis involving application of principles of phenomenographic research. This study is unique: it is unusual to use large-scale data in qualitative studies. The data were collected through an e-mail survey sent to 5960 teachers…

  11. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  12. Initial condition effects on large scale structure in numerical simulations of plane mixing layers

    Science.gov (United States)

    McMullan, W. A.; Garrett, S. J.

    2016-01-01

    In this paper, Large Eddy Simulations are performed on the spatially developing plane turbulent mixing layer. The simulated mixing layers originate from initially laminar conditions. The focus of this research is on the effect of the nature of the imposed fluctuations on the large-scale spanwise and streamwise structures in the flow. Two simulations are performed; one with low-level three-dimensional inflow fluctuations obtained from pseudo-random numbers, the other with physically correlated fluctuations of the same magnitude obtained from an inflow generation technique. Where white-noise fluctuations provide the inflow disturbances, no spatially stationary streamwise vortex structure is observed, and the large-scale spanwise turbulent vortical structures grow continuously and linearly. These structures are observed to have a three-dimensional internal geometry with branches and dislocations. Where physically correlated fluctuations provide the inflow disturbances, a "streaky" streamwise structure that is spatially stationary is observed, with the large-scale turbulent vortical structures growing with the square root of time. These large-scale structures are quasi-two-dimensional, on top of which the secondary structure rides. The simulation results are discussed in the context of the varying interpretations of mixing layer growth that have been postulated. Recommendations are made concerning the data required from experiments in order to produce accurate numerical simulation recreations of real flows.

  13. Support Vector Machines Trained with Evolutionary Algorithms Employing Kernel Adatron for Large Scale Classification of Protein Structures.

    Science.gov (United States)

    Arana-Daniel, Nancy; Gallegos, Alberto A; López-Franco, Carlos; Alanís, Alma Y; Morales, Jacob; López-Franco, Adriana

    2016-01-01

    With the increasing power of computers, the amount of data that can be processed in small periods of time has grown exponentially, as has the importance of classifying large-scale data efficiently. Support vector machines have shown good results classifying large amounts of high-dimensional data, such as data generated by protein structure prediction, spam recognition, medical diagnosis, optical character recognition, text classification, etc. Most state-of-the-art approaches for large-scale learning use traditional optimization methods, such as quadratic programming or gradient descent, which makes the use of evolutionary algorithms for training support vector machines an area to be explored. The present paper proposes a simple-to-implement approach based on evolutionary algorithms and the Kernel-Adatron for solving large-scale classification problems, focusing on protein structure prediction. The functional properties of proteins depend upon their three-dimensional structures, and knowing the structures of proteins is crucial for biology and can lead to improvements in areas such as medicine, agriculture and biofuels.
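    The Kernel-Adatron core referenced in this record can be sketched in a few lines. The hard-margin version below runs on toy data with a fixed RBF kernel; the evolutionary-algorithm component of the paper (e.g., evolving kernel or learning parameters) is omitted, and all parameter values are assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_adatron(K, y, eta=0.1, epochs=500):
    """Hard-margin Kernel-Adatron: additive updates on the dual variables,
    clipped at zero."""
    alpha = np.zeros(len(y))
    for _ in range(epochs):
        g = y * (K @ (alpha * y))      # functional margins y_i * f(x_i)
        alpha = np.maximum(0.0, alpha + eta * (1.0 - g))
    return alpha

# Toy two-class data (stand-in for protein feature vectors).
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
K = rbf_kernel(X)
alpha = kernel_adatron(K, y)
pred = np.sign(K @ (alpha * y))
print("training accuracy:", (pred == y).mean())
```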

  14. The application of two-step linear temperature program to thermal analysis for monitoring the lipid induction of Nostoc sp. KNUA003 in large scale cultivation.

    Science.gov (United States)

    Kang, Bongmun; Yoon, Ho-Sung

    2015-02-01

    Recently, microalgae have been considered as a renewable energy source for fuel production because their production is nonseasonal and may take place on nonarable land. Despite all of these advantages, microalgal oil production is significantly affected by environmental factors. Furthermore, large variability remains an important problem in the measurement of algal productivity and in compositional analysis, especially of the total lipid content. Thus, there is considerable interest in the accurate determination of total lipid content during the biotechnological process. For these reasons, various high-throughput technologies have been suggested for the accurate measurement of total lipids contained in microorganisms, especially oleaginous microalgae. In addition, more advanced technologies have been employed to quantify the total lipids of microalgae without pretreatment. However, these methods have difficulty measuring the total lipid content of wet-form microalgae obtained from large-scale production. In the present study, thermal analysis performed with a two-step linear temperature program was applied to measure the heat evolved in the temperature range from 310 to 351 °C for Nostoc sp. KNUA003 obtained from large-scale cultivation. We then examined the relationship between the heat evolved in 310-351 °C (HE) and the total lipid content of wet Nostoc cells cultivated in a raceway. As a result, a linear relationship was determined between the HE value and the total lipid content of Nostoc sp. KNUA003; in particular, the linear correlation between the HE value and the total lipid content of the tested microorganism was 98%. Based on this relationship, the total lipid content converted from the heat evolved of wet Nostoc sp. KNUA003 can be used for monitoring lipid induction in large-scale cultivation. Copyright © 2014 Elsevier Inc. All rights reserved.
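    The reported 98% linear relationship between the heat evolved (HE) in 310-351 °C and total lipid content implies a simple calibration workflow: fit a line on reference samples, then convert HE readings from wet biomass into lipid estimates. The sketch below does exactly that with invented numbers, since the actual calibration data are not given in the record.

```python
import numpy as np

# Hypothetical calibration pairs: heat evolved in 310-351 C (J/g)
# versus measured total lipid content (% of dry weight).
he = np.array([12.1, 15.4, 18.9, 22.3, 25.8])
lipid = np.array([8.0, 10.5, 13.1, 15.6, 18.2])

slope, intercept = np.polyfit(he, lipid, 1)
r = np.corrcoef(he, lipid)[0, 1]
print(f"lipid ~= {slope:.3f} * HE + {intercept:.3f}  (R^2 = {r**2:.3f})")

# Convert an HE reading from a new wet-biomass sample into a lipid estimate.
print("predicted lipid content:", slope * 20.0 + intercept)
```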

  15. The WIPP research and development test program

    International Nuclear Information System (INIS)

    Tyler, L.D.

    1985-01-01

    The WIPP (Waste Isolation Pilot Plant) is a DOE R&D facility for the purpose of developing the technology needed for the safe disposal of the United States' defense-related radioactive waste. The in-situ test program is defined for thermal-structural interactions, plugging and sealing, and waste package interactions in a salt environment. An integrated series of large-scale underground tests addresses the issues of both system performance and long-term isolation performance of a repository.

  16. Optimization of large-scale industrial systems : an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

    This paper reviewed optimization methods of large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology was based on a combined local optimality search with global optimality determination, and advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E³-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; advanced constraint handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.

  17. Reflections on the political economy of large-scale technology using the example of German fast-breeder development

    International Nuclear Information System (INIS)

    Keck, O.

    1981-01-01

    Proceeding from Anglo-Saxon views which, from a liberal standpoint, criticize the German practice of research policy - state centres for large-scale research and state subventions for research and development in industry - as inefficient, the author empirically tested these claims using the German fast breeder project as an example. If the case of the German fast breeder can be generalized, this has consequences for research policy practice and for other technologies. Supporters as well as opponents of large-scale technology today proceed from the assumption that almost every technology can be made commercially viable if sufficient money and personnel are applied. This is a myth which owes its existence to the technical success of great projects in non-commercial fields. The German fast breeder project confirms the view that the recipes for success of these non-commercial projects cannot be transferred to the field of commercial technology. The results of this study suggest that the practice and theory of technology policy can be misdirected if they are uncritically oriented towards the forms of state intervention used so far in large-scale technology. (orig./HSCH) [de

  18. Structural design of superconducting magnets for the large coil program

    International Nuclear Information System (INIS)

    Gray, W.H.; Long, C.J.; Stoddart, W.C.T.

    1979-09-01

    The Large Coil Program (LCP) is a research, development, and demonstration effort specifically for the advancement of the technologies involved in the production of large superconducting magnets. This paper presents a review of the status of the structural designs, analysis methods, and verification tests being performed by the participating LCP design teams in the USA, Switzerland, Japan, and the Federal Republic of Germany. The significant structural mechanics concerns that are being investigated with the LCP are presented

  19. Thermal anchoring of wires in large scale superconducting coil test experiment

    International Nuclear Information System (INIS)

    Patel, Dipak; Sharma, A.N.; Prasad, Upendra; Khristi, Yohan; Varmora, Pankaj; Doshi, Kalpesh; Pradhan, S.

    2013-01-01

    Highlights: • We address how thermal anchoring in a large-scale coil test differs from that in small cryogenic apparatus. • We present a precise estimation of the thermal anchoring lengths at the 77 K and 4.2 K heat sinks in a large-scale superconducting coil test experiment. • We address the quality of anchoring achieved without covering the entire wires with Kapton/Teflon tape. • We obtained excellent results in temperature measurement without using GE varnish by doubling the estimated anchoring length. -- Abstract: Effective and precise thermal anchoring of wires in a cryogenic experiment is mandatory to measure temperature with millikelvin accuracy and to avoid unnecessary cooling power due to additional heat conduction from room temperature (RT) to operating temperature (OT) through potential, field, displacement and stress measurement instrumentation wires. Instrumentation wires used in large-scale superconducting coil test experiments differ from those in small cryogenic apparatus in their construction and overall diameter/area; shielded wires are often used for error-free measurement in large time-varying magnetic fields. Hence, along with other variables, the anchoring techniques and the required thermal anchoring lengths in such experiments are entirely different from those in small cryogenic apparatus. In the present paper, the estimation of the thermal anchoring lengths of five different types of instrumentation wires used in the coil test campaign at the Institute for Plasma Research (IPR), India, is discussed, and some temperature measurement results of the coil test campaign are presented.

  20. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background: Large-scale molecular evolutionary analyses of protein-coding sequences require a number of inter-related preparatory steps, from finding gene families, to generating alignments and phylogenetic trees, to assessing selective pressure variation. Each phase of these analyses can present significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML, in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  1. Large-scale simulations of plastic neural networks on neuromorphic hardware

    Directory of Open Access Journals (Sweden)

    James Courtney Knight

    2016-04-01

    Full Text Available SpiNNaker is a digital, neuromorphic architecture designed for simulating large-scale spiking neural networks at speeds close to biological real-time. Rather than using bespoke analog or digital hardware, the basic computational unit of a SpiNNaker system is a general-purpose ARM processor, allowing it to be programmed to simulate a wide variety of neuron and synapse models. This flexibility is particularly valuable in the study of biological plasticity phenomena. A recently proposed learning rule based on the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm offers a generic framework for modeling the interaction of different plasticity mechanisms using spiking neurons. However, it can be computationally expensive to simulate large networks with BCPNN learning since it requires multiple state variables for each synapse, each of which needs to be updated every simulation time-step. We discuss the trade-offs in efficiency and accuracy involved in developing an event-based BCPNN implementation for SpiNNaker based on an analytical solution to the BCPNN equations, and detail the steps taken to fit this within the limited computational and memory resources of the SpiNNaker architecture. We demonstrate this learning rule by learning temporal sequences of neural activity within a recurrent attractor network which we simulate at scales of up to 20,000 neurons and 51,200,000 plastic synapses: the largest plastic neural network ever to be simulated on neuromorphic hardware. We also run a comparable simulation on a Cray XC-30 supercomputer system and find that, to match the run-time of our SpiNNaker simulation, the supercomputer system uses considerably more power. This suggests that cheaper, more power-efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models.

  2. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province, with an average system size of 95 ha. In 1989 there were 98 systems, and the area equipped with them was more than 10,130 ha. The study was conducted on 7 large systems with areas ranging from 230 to 520 hectares over the period 1986-1998. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinkler systems suffered significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This was accompanied by changes in crop structure and demand structure and by an increase in operating costs; there was also a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers, limitations of system solutions, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. A survey of the local area shows the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  3. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    …structure and a chemical purity of 99.1% (by inductively coupled plasma optical emission spectroscopy) on a large scale. Keywords: sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method.

  4. III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.

    Science.gov (United States)

    Davis-Kean, Pamela E; Jager, Justin

    2017-06-01

    For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited both in the power to detect differences and in the demographic diversity needed to generalize clearly and broadly. Thus, in this chapter we discuss the value of using existing large-scale data sets to test the complex questions of child development, and how to develop future large-scale data sets that are both representative and able to answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.

  5. Scaling up HIV viral load - lessons from the large-scale implementation of HIV early infant diagnosis and CD4 testing.

    Science.gov (United States)

    Peter, Trevor; Zeh, Clement; Katz, Zachary; Elbireer, Ali; Alemayehu, Bereket; Vojnov, Lara; Costa, Alex; Doi, Naoko; Jani, Ilesh

    2017-11-01

    The scale-up of effective HIV viral load (VL) testing is an urgent public health priority. Implementation of testing is supported by the availability of accurate, nucleic acid based laboratory and point-of-care (POC) VL technologies and strong WHO guidance recommending routine testing to identify treatment failure. However, test implementation faces challenges related to the developing health systems in many low-resource countries. The purpose of this commentary is to review the challenges and solutions from the large-scale implementation of other diagnostic tests, namely nucleic-acid based early infant HIV diagnosis (EID) and CD4 testing, and to identify key lessons to inform the scale-up of VL. Experience with EID and CD4 testing provides many key lessons to inform VL implementation and may enable more effective and rapid scale-up. The primary lessons from earlier implementation efforts are to strengthen linkage to clinical care after testing, and to improve the efficiency of testing. Opportunities to improve linkage include data systems to support the follow-up of patients through the cascade of care and test delivery, rapid sample referral networks, and POC tests. Opportunities to increase testing efficiency include improvements to procurement and supply chain practices, well-connected tiered laboratory networks with rational deployment of test capacity across different levels of health services, routine resource mapping and mobilization to ensure adequate resources for testing programs, and improved operational and quality management of testing services. If applied to VL testing programs, these approaches could help improve the impact of VL on ART failure management and patient outcomes, reduce overall costs, help ensure sustainable access to reduced pricing for test commodities, and improve supportive health systems through more efficient and rigorous quality assurance. These lessons draw from traditional laboratory practices as well as fields

  6. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable visitors to volunteer their computing resources to run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation across thousands of nodes in small spatial and computational chunks. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.

  7. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential risk to the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and thereby the physics and hazards of large LNG spills and fires.

  8. The development of a capability for aerodynamic testing of large-scale wing sections in a simulated natural rain environment

    Science.gov (United States)

    Bezos, Gaudy M.; Cambell, Bryan A.; Melson, W. Edward

    1989-01-01

    A research technique to obtain large-scale aerodynamic data in a simulated natural rain environment has been developed. A 10-ft chord NACA 64-210 wing section equipped with leading-edge and trailing-edge high-lift devices was tested as part of a program to determine the effect of highly concentrated, short-duration rainfall on airplane performance. Preliminary dry aerodynamic data are presented for the high-lift configuration at a velocity of 100 knots and an angle of attack of 18 deg. Also, data are presented on rainfield uniformity and rainfall concentration intensity levels obtained during the calibration of the rain simulation system.

  9. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    Science.gov (United States)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

    …research. Hopefully, all results developed from this research can be used in a warning system for predicting large-scale landslides in southern Taiwan. Keywords: heavy rainfall, large scale, landslides, critical rainfall value

  10. Thermal System Analysis and Optimization of Large-Scale Compressed Air Energy Storage (CAES)

    Directory of Open Access Journals (Sweden)

    Zhongguang Fu

    2015-08-01

    Full Text Available As an important solution to issues regarding peak load and renewable energy resources on grids, large-scale compressed air energy storage (CAES) power generation technology has recently become a popular research topic in the area of large-scale industrial energy storage. At present, the combination of high-expansion-ratio turbines with advanced gas turbine technology is an important breakthrough in energy storage technology. In this study, a new gas turbine power generation system is coupled with current CAES technology, and the thermodynamic cycle is optimized by calculating the parameters of the thermodynamic system. Results show that the thermal efficiency of the new system increases by at least 5% over that of the existing system.
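    As a back-of-the-envelope illustration of the kind of thermodynamic parameter calculation this record refers to, the sketch below estimates the ideal specific work of a single high-expansion-ratio turbine stage. All values are assumptions for illustration, not figures from the paper.

```python
# Ideal-gas, isentropic estimate of specific expansion work for one
# turbine stage of a CAES plant (all values hypothetical).
cp = 1005.0    # specific heat of air at constant pressure, J/(kg K)
k = 1.4        # heat-capacity ratio of air
T_in = 1100.0  # turbine inlet temperature after reheat/combustion, K
pr = 10.0      # stage expansion pressure ratio

w = cp * T_in * (1.0 - pr ** ((1.0 - k) / k))
print(f"ideal specific expansion work ~ {w / 1000:.0f} kJ/kg")
```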

  11. Synthesis and sintering Ni-Zn ferrite obtained for combustion reaction in large scale

    International Nuclear Information System (INIS)

    Vieira, D.A.; Diniz, V.C.S.; Costa, A.C.F.M.; Cornejo, D.R.; Kiminami, R.H.G.A.

    2014-01-01

    This research evaluates the magnetic properties of Ni-Zn ferrite synthesized by combustion reaction on a large scale and sintered at 1250 °C in a resistive furnace. The sample was characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), and magnetic measurements. The results show that the product synthesized on a large scale is a soft magnetic material with a saturation magnetization of 40 emu·g⁻¹ and a coercivity of 0.080 kOe; after sintering, the magnetization increased to 68 emu·g⁻¹ and the coercivity decreased to 0.016 kOe, indicating that the obtained material has promising characteristics for applications in electro-electronic devices. (author)

  12. Studies on combined model based on functional objectives of large scale complex engineering

    Science.gov (United States)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    As large-scale complex engineering includes various functions, and each function is realized through the completion of one or more projects, the combined projects affecting each function must be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio techniques based on the functional objectives of projects were introduced, and the principles of such portfolio techniques were studied and proposed. In addition, the processes for combining projects were constructed. With the help of portfolio techniques based on the functional objectives of projects, our research findings lay a good foundation for the portfolio management of large-scale complex engineering.

  13. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
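    One simple way to realize the user-driven, multi-criteria optimization the tool describes is a weighted score over normalized criteria for each candidate grid cell. The criteria, weights, and values below are hypothetical and stand in for whatever layers a stakeholder would load into such a GIS tool.

```python
import numpy as np

# Hypothetical candidate grid cells (rows) and criteria (columns):
# solar resource (kWh/m2/day), distance to transmission (km),
# slope (%), environmental sensitivity (0-1, higher is worse).
cells = np.array([[6.5, 10.0, 2.0, 0.2],
                  [7.1, 35.0, 1.0, 0.6],
                  [5.9,  5.0, 4.0, 0.1]])

# User-defined weights; negative weights penalize a criterion.
weights = np.array([0.5, -0.2, -0.1, -0.2])

# Min-max normalize each criterion so the weights act on comparable scales.
norm = (cells - cells.min(axis=0)) / np.ptp(cells, axis=0)
scores = norm @ weights
print("cells ranked best to worst:", np.argsort(scores)[::-1])
```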

  14. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries - Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  15. General aviation crash safety program at Langley Research Center

    Science.gov (United States)

    Thomson, R. G.

    1976-01-01

    The purpose of the crash safety program is to support development of the technology to define and demonstrate new structural concepts for improved crash safety and occupant survivability in general aviation aircraft. The program involves three basic areas of research: full-scale crash simulation testing, nonlinear structural analyses necessary to predict failure modes and collapse mechanisms of the vehicle, and evaluation of energy absorption concepts for specific component design. Both analytical and experimental methods are being used to develop expertise in these areas. Analyses include both simplified procedures for estimating energy absorption capabilities and more complex computer programs for analysis of general airframe response. Full-scale tests of typical structures as well as tests on structural components are being used to verify the analyses and to demonstrate improved design concepts.

  16. The Contribution of International Large-Scale Assessments to Educational Research: Combining Individual and Institutional Data Sources

    Science.gov (United States)

    Strietholt, Rolf; Scherer, Ronny

    2018-01-01

    The present paper aims to discuss how data from international large-scale assessments (ILSAs) can be utilized and combined, even with other existing data sources, in order to monitor educational outcomes and study the effectiveness of educational systems. We consider different purposes of linking data, namely, extending outcome measures,…

  17. Applications of Data Assimilation to Analysis of the Ocean on Large Scales

    Science.gov (United States)

    Miller, Robert N.; Busalacchi, Antonio J.; Hackert, Eric C.

    1997-01-01

    It is commonplace to begin talks on this topic by noting that oceanographic data are too scarce and sparse to provide complete initial and boundary conditions for large-scale ocean models. Even considering the availability of remotely-sensed data such as radar altimetry from the TOPEX and ERS-1 satellites, a glance at a map of available subsurface data should convince most observers that this is still the case. Data are still too sparse for comprehensive treatment of interannual to interdecadal climate change through the use of models, since the new data sets have not been around for very long. In view of the dearth of data, we must note that the overall picture is changing rapidly. Recently, there have been a number of large scale ocean analysis and prediction efforts, some of which now run on an operational or at least quasi-operational basis, most notably the model based analyses of the tropical oceans. These programs are modeled on numerical weather prediction. Aside from the success of the global tide models, assimilation of data in the tropics, in support of prediction and analysis of seasonal to interannual climate change, is probably the area of large scale ocean modeling and data assimilation in which the most progress has been made. Climate change is a problem which is particularly suited to advanced data assimilation methods. Linear models are useful, and the linear theory can be exploited. For the most part, the data are sufficiently sparse that implementation of advanced methods is worthwhile. As an example of a large scale data assimilation experiment with a recent extensive data set, we present results of a tropical ocean experiment in which the Kalman filter was used to assimilate three years of altimetric data from Geosat into a coarsely resolved linearized long wave shallow water model. Since nonlinear processes dominate the local dynamic signal outside the tropics, subsurface dynamical quantities cannot be reliably inferred from surface height
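    The tropical assimilation experiment described above follows the standard Kalman filter forecast/analysis cycle. Below is a minimal, generic linear Kalman filter step; the matrices here are placeholders, whereas the study's actual state evolves under a linearized long-wave shallow-water model and the observations are gridded Geosat altimetric heights.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One forecast/analysis cycle of a linear Kalman filter.
    x, P: state mean and covariance;  z: observation vector (e.g.,
    altimetric sea-surface heights);  F, Q: linear dynamics and model-error
    covariance;  H, R: observation operator and observation-error covariance."""
    # Forecast with the dynamical model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Analysis: update with the observations.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy example: 4-state random-walk model, 2 observations per cycle.
n, m = 4, 2
F, Q = np.eye(n), 0.01 * np.eye(n)
H, R = np.eye(m, n), 0.1 * np.eye(m)   # observe the first two state entries
x, P = np.zeros(n), np.eye(n)
x, P = kalman_step(x, P, np.array([0.5, -0.3]), F, Q, H, R)
print(x)
```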

  18. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  19. Large-scale additive manufacturing with bioinspired cellulosic materials.

    Science.gov (United States)

    Sanandiya, Naresh D; Vijay, Yadunund; Dimopoulou, Marina; Dritsas, Stylianos; Fernandez, Javier G

    2018-06-05

    Cellulose is the most abundant and broadly distributed organic compound and industrial by-product on Earth. However, despite decades of extensive research, the bottom-up use of cellulose to fabricate 3D objects is still plagued with problems that restrict its practical applications: derivatives with vast polluting effects, use in combination with plastics, lack of scalability and high production cost. Here we demonstrate the general use of cellulose to manufacture large 3D objects. Our approach diverges from the common association of cellulose with green plants and it is inspired by the wall of the fungus-like oomycetes, which is reproduced introducing small amounts of chitin between cellulose fibers. The resulting fungal-like adhesive material(s) (FLAM) are strong, lightweight and inexpensive, and can be molded or processed using woodworking techniques. We believe this first large-scale additive manufacture with ubiquitous biological polymers will be the catalyst for the transition to environmentally benign and circular manufacturing models.

  20. Large-scale continuous process to vitrify nuclear defense waste: operating experience with nonradioactive waste

    International Nuclear Information System (INIS)

    Cosper, M.B.; Randall, C.T.; Traverso, G.M.

    1982-01-01

    The developmental program underway at SRL has demonstrated the vitrification process proposed for the sludge processing facility of the DWPF on a large scale. DWPF design criteria for production rate, equipment lifetime, and operability have all been met. The expected authorization and construction of the DWPF will result in the safe and permanent immobilization of a major quantity of existing high level waste. 11 figures, 4 tables

  1. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    Science.gov (United States)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning, as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater, and the evaporative flux, and hence the feedback from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify, since they are affected by huge variability of the governing properties at different space-time scales and by the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theoretical process can correctly be described, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the objective of sustainable soil and water management. Moditoring science, i.e., the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies and techniques will be given and illustrated for solving large-scale soil and water management problems. This will also allow research needs to be identified in the interdisciplinary domain of modelling and monitoring, and the integration of unsaturated zone science in solving soil and water management issues to be improved. A focus will be placed on examples of large-scale soil and water management problems in Europe.

  2. Fourth annual Walker Branch Watershed research symposium: Program and abstracts

    International Nuclear Information System (INIS)

    1993-03-01

    The methods and concepts of watershed research, originally applied in an experimental or monitoring mode to relatively small catchments, are increasingly being used at larger scales and for specific applied problems. Research at Oak Ridge National Laboratory, the Tennessee Valley Authority, the US Forest Service, and other agencies and institutions participating in this symposium reflects research over a broad range of spatial scales that is being integrated through large-scale experiments along with computer modeling and graphical interfaces. These research projects address the basic atmospheric, geophysical, biogeochemical, and biological processes that regulate the responses of forested ecosystems to natural environmental variation and anthropogenic stresses. This symposium highlights the use of large-scale ecosystem experiments to address environmental issues of global concern. These experiments provide the only effective way to test models of ecosystem response that are based on the current state of knowledge of hydrology, biogeochemistry, plant physiology, and other ecosystem processes. Major environmental problems that are being addressed include acidic deposition and nitrogen loading (Bear Brook Watershed, Maine; and the Girdsjoen Covered Catchment, Sweden); climate warming (Soil Warming Experiment, Maine); and altered rainfall amounts (Savannah River Loblolly Pine Soil Water Manipulation and the Walker Branch Watershed Throughfall Displacement Experiment)

  3. Qualitative Research in an International Research Program: Maintaining Momentum while Building Capacity in Nurses

    Directory of Open Access Journals (Sweden)

    Judy Mill RN, PhD

    2014-02-01

    Nurses are knowledgeable about issues that affect quality and equity of care and are well qualified to inform policy, yet their expertise is seldom acknowledged and their input infrequently invited. In 2007, a large multidisciplinary team of researchers and decision-makers from Canada and five low- and middle-income countries (Barbados, Jamaica, Uganda, Kenya, and South Africa) received funding to implement a participatory action research (PAR) program entitled “Strengthening Nurses' Capacity for HIV Policy Development in sub-Saharan Africa and the Caribbean.” The goal of the research program was to explore and promote nurses' involvement in HIV policy development and to improve nursing practice in countries with a high HIV disease burden. A core element of the PAR program was the enhancement of the research capacity, and particularly qualitative capacity, of nurses through the use of mentorship, role-modeling, and the enhancement of institutional support. In this article we: (a) describe the PAR program and research team; (b) situate the research program by discussing attitudes to qualitative research in the study countries; (c) highlight the incremental formal and informal qualitative research capacity building initiatives undertaken as part of this PAR program; (d) describe the approaches used to maintain rigor while implementing a complex research program; and (e) identify strategies to ensure that capacity building was locally owned. We conclude with a discussion of challenges and opportunities and provide an informal analysis of the research capacity that was developed within our international team using a PAR approach.

  4. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (α_eo = T_e⊥/T_e∥) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (α_io = T_i⊥/T_i∥). Electron anisotropy effects are known to be helpless in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for a large-scale reconnection to be achieved. We then investigate whether introduction of electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each with a lateral length 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island that would enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and causes even quicker triggering when the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  5. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.
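
    As a concrete illustration of the tooling such dataset-building efforts rely on, the sketch below assembles a train/validation manifest from a folder tree of labeled image patches. This is a minimal sketch, not NASA's actual pipeline; the directory layout (one subfolder per class label), the PNG file type, and the 80/20 split are all illustrative assumptions.

        # Minimal sketch: build a labeled-patch manifest from a folder tree
        # where each subdirectory name is a class label (layout is assumed).
        import csv
        import random
        from pathlib import Path

        def build_manifest(root_dir, out_csv, val_fraction=0.2, seed=42):
            rows = [(str(img), label_dir.name)
                    for label_dir in sorted(Path(root_dir).iterdir())
                    if label_dir.is_dir()
                    for img in sorted(label_dir.glob("*.png"))]
            random.Random(seed).shuffle(rows)
            n_val = int(len(rows) * val_fraction)
            with open(out_csv, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["path", "label", "split"])
                for i, (path, label) in enumerate(rows):
                    writer.writerow([path, label, "val" if i < n_val else "train"])

        # example usage on a hypothetical patch directory:
        # build_manifest("patches/", "manifest.csv")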

  6. Exploratory Technology Research Program for Electrochemical Energy Storage. Annual report, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Kinoshita, K. [ed.]

    1993-10-01

    This report summarizes the progress made by the Exploratory Technology Research (ETR) Program for Electrochemical Energy Storage during calendar year 1992. The primary objective of the ETR Program, which is sponsored by the US Department of Energy (DOE) and managed by Lawrence Berkeley Laboratory (LBL), is to identify electrochemical technologies that can satisfy stringent performance, durability and economic requirements for electric vehicles (EVs). The ultimate goal is to transfer the most promising electrochemical technologies to the private sector or to another DOE program (e.g., SNL's Electric Vehicle Advanced Battery Systems Development Program, EVABS) for further development and scale-up. Besides LBL, which has overall responsibility for the ETR Program, LANL and BNL have participated in the ETR Program by providing key research support in several of the program elements. The ETR Program consists of three major elements: Exploratory Research; Applied Science Research; and Air Systems Research. The objectives and the specific battery and electrochemical systems addressed by each program element are discussed in the following sections, which also include technical summaries that relate to the individual programs. Financial information that relates to the various programs and a description of the management activities for the ETR Program are described in the Executive Summary.

  7. Development of large-scale wind energy conversion system. Operational studies on a large-scale wind energy conversion system; Ogata furyoku hatsuden system no kaihatsu. Ogata furyoku hatsuden system no unten kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Takita, M [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1994-12-01

    Described herein are the results of the FY1994 research program for operational studies on a large-scale wind energy conversion system. A total of 8 domestic and foreign cases are studied for wind energy conversion cost, to clarify the causes of the higher cost of the Japanese system. The wind power systems studied include Japanese (5 units at Tappi Wind Park, the same type supplied by company M), US (California Wind Farm, 300 units) and UK (Wales Wind Farm, 103 units) systems. The investment costs are 639, 285 and 189 thousand yen/kW for the Japanese, US and UK systems, respectively. It is also revealed that the power plant itself and assembly costs account for a majority (70 to 88%) of the total investment cost. The higher cost of the Japanese system results from the smaller number of units installed, and the power plant cost can be drastically reduced by mass production. Increasing size also reduces cost greatly.

  8. Preparing laboratory and real-world EEG data for large-scale analysis: A containerized approach

    Directory of Open Access Journals (Sweden)

    Nima eBigdely-Shamlo

    2016-03-01

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain-computer interface (BCI) models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a containerized approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data Levels, each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. The ESS schema and tools are freely available at eegstudy.org, and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org).
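
    To make the Level 1 → Level 2 idea concrete, here is a minimal sketch of a PREP-style preprocessing pass: walk a study folder of raw recordings and write filtered copies alongside. The .npy storage format, folder names, sampling rate, and the choice of a 1 Hz zero-phase high-pass filter are assumptions for illustration; the actual ESS containers and PREP pipeline are considerably richer.

        # Hypothetical Level 1 -> Level 2 pass over .npy recordings
        # (channels x samples arrays; not the real ESS container format).
        import numpy as np
        from pathlib import Path
        from scipy.signal import butter, filtfilt

        def highpass(data, fs=256.0, cutoff=1.0, order=4):
            # zero-phase 1 Hz high-pass applied per channel
            b, a = butter(order, cutoff / (fs / 2.0), btype="highpass")
            return filtfilt(b, a, data, axis=1)

        def preprocess_study(level1_dir, level2_dir):
            Path(level2_dir).mkdir(parents=True, exist_ok=True)
            for raw in Path(level1_dir).glob("*.npy"):
                eeg = np.load(raw)                 # assumed channels x samples
                np.save(Path(level2_dir) / raw.name, highpass(eeg))

        # example usage on a hypothetical study folder:
        # preprocess_study("study/level1", "study/level2")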

  9. Position Paper on Jatropha curcas. State of the Art Small and Large Scale Project Development

    Energy Technology Data Exchange (ETDEWEB)

    Daey Ouwens, K.; Franken, Y.J.; Rijssenbeek, W. [Fuels from Agriculture in Communal Technology FACT, Eindhoven (Netherlands); Francis, G. [University of Hohenheim, Hohenheim (Germany); Riedacker, A. [French National Institute for Agricultural Research INRA, Paris (France); Foidl, N.; Jongschaap, R.; Bindraban, P. [Plant Research International PRI, Wageningen (Netherlands)

    2007-06-15

    Much information was collected during the Seminar on Jatropha held in Wageningen, Netherlands, in March 2007, and is summarized in this paper. Much research is still necessary to improve yield, to allow the use of biological products such as oil cake as animal fodder, etc. Well-documented yield data are still scarce. Cooperation with research institutions is therefore recommended. At this stage it is still particularly important to distinguish between reality, promises, and dangerous extrapolations. To avoid spectacular and regrettable failures, wasted money for investors, and great disappointment among local populations, promoters of large-scale plantations are invited to adopt stepwise approaches: large-scale plantations should only be considered after some 4 to 5 years of obtaining experimental data (annual seed and oil yields, economic viability, etc.) from a sufficient number of small-scale experimental plots (about 1 ha) corresponding to the whole range of soil and climatic conditions of such projects.

  10. Fundamentals of applied multidimensional scaling for educational and psychological research

    CERN Document Server

    Ding, Cody S

    2018-01-01

    This book explores the fundamentals of multidimensional scaling (MDS) and how this analytic method can be used in applied settings for educational and psychological research. The book aims to make MDS more accessible to a wider audience, using language and examples that are more relevant to educational and psychological research and a less technical presentation so that readers are not overwhelmed by equations. The goal is for readers to learn the methods described in this book and immediately start using MDS via available software programs. The book also examines new applications that have previously not been discussed in the MDS literature. It should be an ideal book for graduate students and researchers to better understand MDS. Fundamentals of Applied Multidimensional Scaling for Educational and Psychological Research is divided into three parts. Part I covers the basic and fundamental features of MDS models pertaining to applied research applications. Chapters in this section cover the essential features of da...

  11. Large-scale transportation network congestion evolution prediction using deep learning theory.

    Directory of Open Access Journals (Sweden)

    Xiaolei Ma

    Understanding how congestion at one location can cause ripples throughout a large-scale transportation network is vital for transportation researchers and practitioners seeking to pinpoint traffic bottlenecks for congestion mitigation. Traditional studies rely on either mathematical equations or simulation techniques to model traffic congestion dynamics. However, most of these approaches have limitations, largely due to unrealistic assumptions and cumbersome parameter calibration processes. With the development of Intelligent Transportation Systems (ITS) and the Internet of Things (IoT), transportation data are becoming increasingly ubiquitous. This has triggered a series of data-driven studies investigating transportation phenomena. Among them, deep learning theory is considered one of the most promising techniques for tackling tremendous high-dimensional data. This study attempts to extend deep learning theory to large-scale transportation network analysis. A deep Restricted Boltzmann Machine and Recurrent Neural Network architecture is utilized to model and predict traffic congestion evolution based on Global Positioning System (GPS) data from taxis. A numerical study in Ningbo, China is conducted to validate the effectiveness and efficiency of the proposed method. Results show that the prediction accuracy can reach as high as 88% within less than 6 minutes when the model is implemented in a Graphics Processing Unit (GPU)-based parallel computing environment. The predicted congestion evolution patterns can be visualized temporally and spatially through a map-based platform to identify vulnerable links for proactive congestion mitigation.
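
    The restricted Boltzmann machine at the core of such an architecture can be sketched compactly. The following toy implementation trains a Bernoulli RBM with one-step contrastive divergence on binary congested/free-flow snapshots; layer sizes, the learning rate, and the synthetic data are placeholders, and the recurrent (RNN) half of the paper's architecture is omitted.

        # Toy Bernoulli RBM trained with one-step contrastive divergence (CD-1).
        import numpy as np

        rng = np.random.default_rng(0)

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        class RBM:
            def __init__(self, n_vis, n_hid, lr=0.05):
                self.W = 0.01 * rng.standard_normal((n_vis, n_hid))
                self.b_vis = np.zeros(n_vis)
                self.b_hid = np.zeros(n_hid)
                self.lr = lr

            def cd1(self, v0):
                ph0 = sigmoid(v0 @ self.W + self.b_hid)           # positive phase
                h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden units
                pv1 = sigmoid(h0 @ self.W.T + self.b_vis)         # reconstruct visibles
                ph1 = sigmoid(pv1 @ self.W + self.b_hid)          # negative phase
                n = len(v0)
                self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
                self.b_vis += self.lr * (v0 - pv1).mean(axis=0)
                self.b_hid += self.lr * (ph0 - ph1).mean(axis=0)

        # synthetic data: 500 snapshots of 100 road links, 1 = congested
        data = (rng.random((500, 100)) < 0.3).astype(float)
        rbm = RBM(n_vis=100, n_hid=32)
        for epoch in range(10):
            rbm.cd1(data)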

  12. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  13. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water situations, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
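
    A hedged one-dimensional analogue of such a scheme illustrates the mechanics: first-order finite volumes for the shallow-water equations with a Rusanov (local Lax-Friedrichs) flux and a crude wet/dry depth threshold. The paper's unstructured 2D Godunov-type mesh and adaptive time stepping are not reproduced, and all parameters below are illustrative.

        # 1D shallow-water finite volumes with Rusanov flux and a dry threshold.
        import numpy as np

        g, DRY = 9.81, 1e-6

        def flux(h, hu):
            u = np.where(h > DRY, hu / np.maximum(h, DRY), 0.0)
            return np.array([hu, hu * u + 0.5 * g * h * h])

        def step(h, hu, dx, dt):
            U = np.array([h, hu])
            FL = flux(h[:-1], hu[:-1])
            FR = flux(h[1:], hu[1:])
            c = np.maximum(                      # local max wave speed |u| + sqrt(gh)
                np.abs(hu[:-1]) / np.maximum(h[:-1], DRY) + np.sqrt(g * h[:-1]),
                np.abs(hu[1:]) / np.maximum(h[1:], DRY) + np.sqrt(g * h[1:]))
            F = 0.5 * (FL + FR) - 0.5 * c * (U[:, 1:] - U[:, :-1])
            h[1:-1] -= dt / dx * (F[0, 1:] - F[0, :-1])
            hu[1:-1] -= dt / dx * (F[1, 1:] - F[1, :-1])
            hu[h <= DRY] = 0.0                   # crude wet/dry front treatment
            return h, hu

        # dam-break test on a 1 km reach: 2 m of water over a dry bed
        h = np.where(np.arange(200) < 100, 2.0, 0.0)
        hu = np.zeros(200)
        for _ in range(300):
            h, hu = step(h, hu, dx=5.0, dt=0.05)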

  14. Research on the impacts of large-scale electric vehicles integration into power grid

    Science.gov (United States)

    Su, Chuankun; Zhang, Jian

    2018-06-01

    Because of their distinctive energy supply mode, electric vehicles can improve the efficiency of energy utilization and reduce pollution of the environment, and are therefore receiving more and more attention. However, the charging behavior of electric vehicles is random and intermittent. If electric vehicles charge in an uncoordinated manner on a large scale, they place great pressure on the structure and operation of the power grid and affect its safe and economic operation. With the development of V2G technology for electric vehicles, the study of the charging and discharging characteristics of electric vehicles is of great significance for improving the safe operation of the power grid and the efficiency of energy utilization.

  15. How Can the Evidence from Global Large-scale Clinical Trials for Cardiovascular Diseases be Improved?

    Directory of Open Access Journals (Sweden)

    Tsutani Kiichiro

    2011-06-01

    Background: Clinical investigations are important for obtaining evidence to improve medical treatment. Large-scale clinical trials with thousands of participants are particularly important for this purpose in cardiovascular diseases. Conducting large-scale clinical trials entails high research costs. This study sought to investigate global trends in large-scale clinical trials in cardiovascular diseases. Findings: We searched for trials using clinicaltrials.gov (URL: http://www.clinicaltrials.gov/) using the key words 'cardio' and 'event' in all fields on 10 April 2010. We then selected trials with 300 or more participants examining cardiovascular diseases. The search revealed 344 trials that met our criteria. Of the 344 trials, 71% were randomized controlled trials, 15% involved more than 10,000 participants, and 59% were funded by industry. In RCTs whose results were disclosed, 55% of industry-funded trials and 25% of non-industry-funded trials reported statistically significant superiority over control (p = 0.012, 2-sided Fisher's exact test). Conclusions: Our findings highlight concerns regarding potential bias related to funding sources; researchers should be aware of the importance of trial information disclosure and conflicts of interest. Management of, and training in, information disclosure and conflicts of interest for researchers deserve continued attention. This could lead to better clinical evidence and further improvements in the development of medical treatment worldwide.
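
    The 55% versus 25% comparison above can be checked with a two-sided Fisher's exact test, as sketched below. The cell counts are hypothetical, chosen only to match the reported percentages; the abstract does not give the raw counts, so the computed p-value will differ from the reported 0.012.

        # Fisher's exact test on a hypothetical 2x2 table matching the
        # reported proportions (actual study counts are not disclosed here).
        from scipy.stats import fisher_exact

        # rows: industry-funded vs non-industry-funded RCTs with disclosed results
        # columns: reported superiority over control vs did not
        table = [[55, 45],    # 55 of 100 industry trials -> 55%
                 [15, 45]]    # 15 of  60 other trials    -> 25%
        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.5f}")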

  16. Stereotype Threat, Inquiring about Test Takers' Race and Gender, and Performance on Low-Stakes Tests in a Large-Scale Assessment. Research Report. ETS RR-15-02

    Science.gov (United States)

    Stricker, Lawrence J.; Rock, Donald A.; Bridgeman, Brent

    2015-01-01

    This study explores stereotype threat on low-stakes tests used in a large-scale assessment, math and reading tests in the Education Longitudinal Study of 2002 (ELS). Issues identified in laboratory research (though not observed in studies of high-stakes tests) were assessed: whether inquiring about their race and gender is related to the…

  17. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    Recent trends in large-scale simulations of fusion and processing plasmas are briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  18. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind, and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small length scales), this system of equations converges to the usual incompressible equations, and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale with Mach number linearly, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number. Inhomogeneous nearly

  19. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  20. Proceedings of joint meeting of the 6th simulation science symposium and the NIFS collaboration research 'large scale computer simulation'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-03-01

    The joint meeting of the 6th Simulation Science Symposium and the NIFS Collaboration Research 'Large Scale Computer Simulation' was held on December 12-13, 2002 at the National Institute for Fusion Science, with the aim of promoting interdisciplinary collaborations in various fields of computer simulation. The meeting, attended by more than 40 people, consisted of 11 invited and 22 contributed papers, whose topics extended not only to fusion science but also to related fields such as astrophysics, earth science, fluid dynamics, molecular dynamics, and computer science. (author)

  1. Large-scale overview of the summer monsoon over West Africa during the AMMA field experiment in 2006

    Directory of Open Access Journals (Sweden)

    S. Janicot

    2008-09-01

    The AMMA (African Monsoon Multidisciplinary Analysis) program is dedicated to providing a better understanding of the West African monsoon and its influence on the physical, chemical and biological environment regionally and globally, as well as to relating variability of this monsoon system to issues of health, water resources, food security and demography for West African nations. Within this framework, an intensive field campaign took place during the summer of 2006 to better document specific processes and weather systems at various key stages of the monsoon season. This campaign was embedded within a longer observation period that documented the annual cycle of surface and atmospheric conditions between 2005 and 2007. The present paper provides a large- and regional-scale overview of the 2006 summer monsoon season, including the convective activity, mean atmospheric circulation and synoptic/intraseasonal weather systems, oceanic and land surface conditions, continental hydrology, dust concentration and ozone distribution. The 2006 African summer monsoon was a near-normal rainy season except for a large-scale rainfall excess north of 15° N. This monsoon season was also characterized by an onset delayed by about 10 days compared to climatology, with convection becoming well developed only after 10 July. This onset delay impacted the continental hydrology, soil moisture and vegetation dynamics as well as dust emission. More details of some less-well-known atmospheric features of the African monsoon at intraseasonal and synoptic scales are provided in order to promote future research in these areas.

  2. Parallel Computational Fluid Dynamics 2007: Implementations and Experiences on Large Scale and Grid Computing

    CERN Document Server

    2009-01-01

    At the 19th Annual Conference on Parallel Computational Fluid Dynamics held in Antalya, Turkey, in May 2007, the most recent developments and implementations of large-scale and grid computing were presented. This book, comprising the invited and selected papers of this conference, details those advances, which are of particular interest to CFD and CFD-related communities. It also offers results related to applications of various scientific and engineering problems involving flows and flow-related topics. Intended for CFD researchers and graduate students, this book is a state-of-the-art presentation of the relevant methodology and implementation techniques of large-scale computing.

  3. Large-scale experiments on aerosol behavior in light water reactor containments

    International Nuclear Information System (INIS)

    Schock, W.; Bunz, H.; Adams, R.E.; Tobias, M.L.; Rahn, F.J.

    1988-01-01

    Recently, three large-scale experimental programs were carried out dealing with the behavior of aerosols during core-melt accidents in light water reactors (LWRs). In the Nuclear Safety Pilot Plant (NSPP) program, the principal behaviors of different insoluble aerosols and of mixed aerosols were measured in dry air atmospheres and in condensing steam-air atmospheres contained in a 38-m³ steel vessel. The Demonstration of Nuclear Aerosol Behavior (DEMONA) program used a 640-m³ concrete containment model to simulate typical accident sequence conditions, and measured the behavior of different insoluble aerosols and mixed aerosols in condensing and transient atmospheric conditions. Part of the LWR Aerosol Containment Experiments (LACE) program was also devoted to aerosol behavior in containment; an 852-m³ steel vessel was used, and the aerosols were composed of mixtures of insoluble and soluble species. The results of these experiments provide a suitable data base for validation of aerosol behavior codes. Fundamental insight into details of aerosol behavior in condensing environments has been gained through the results of the NSPP tests. Code comparisons have been and are being performed in the DEMONA and LACE experiments.

  4. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  5. Large-Scale Academic Achievement Testing of Deaf and Hard-of-Hearing Students: Past, Present, and Future

    Science.gov (United States)

    Qi, Sen; Mitchell, Ross E.

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…

  6. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves, and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^{-3} scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^{-3} power law, and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^{-11/5} form at large scales to a steeper approximate k^{-3} scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^{-1.64} scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^{-2.5} scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^{-1.6} power law. For all stratification strengths, the total energy always flows from large to small scales, and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.
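
    Spectral indices such as the k^{-3} and k^{-11/5} ranges above are typically estimated by a least-squares fit of log E(k) against log k over a chosen wavenumber range. The sketch below does this for a synthetic spectrum standing in for simulation output; the fit range and noise level are arbitrary assumptions.

        # Estimate a spectral slope from a (synthetic) energy spectrum.
        import numpy as np

        k = np.arange(1.0, 513.0)
        rng = np.random.default_rng(1)
        E = k ** -3.0 * (1.0 + 0.05 * rng.standard_normal(k.size))  # stand-in spectrum

        fit = (k >= 10) & (k <= 100)                    # chosen "inertial" range
        slope, _ = np.polyfit(np.log(k[fit]), np.log(E[fit]), 1)
        print(f"estimated spectral index: {slope:.2f}")  # should be close to -3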

  8. Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network

    Science.gov (United States)

    Jones, A. S.; Horsburgh, J. S.

    2014-12-01

    Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban
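
    Two checks of the kind such quality-control post-processing applies are easy to sketch: out-of-range flagging and flat-line (stuck-sensor) detection. The thresholds, window length, and flag codes below are illustrative assumptions, not iUTAH's actual rules.

        # Illustrative sensor QC: range check plus flat-line detection.
        import numpy as np

        def qc_flags(values, lo, hi, flat_window=6, flat_tol=1e-3):
            values = np.asarray(values, dtype=float)
            flags = np.zeros(values.size, dtype=int)   # 0 = pass
            flags[(values < lo) | (values > hi)] = 1   # 1 = out of range
            for i in range(flat_window, values.size):
                if np.ptp(values[i - flat_window:i + 1]) < flat_tol:
                    flags[i] = 2                       # 2 = flat-lined (stuck sensor)
            return flags

        temps = [12.1, 55.0, 12.2, 12.2, 12.2, 12.2, 12.2, 12.2, 12.2, 12.2]
        print(qc_flags(temps, lo=-5.0, hi=40.0))       # -> [0 1 0 0 0 0 0 0 2 2]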

  9. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 · 10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = -0.0909 to Rot = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, "over-damped" LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential of using over-damped LES for fast explorations of the parameter space where large-scale structures are found.

  10. Mathematical programming methods for large-scale topology optimization problems

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana

    for mechanical problems, but has rapidly extended to many other disciplines, such as fluid dynamics and biomechanical problems. However, the novelty and improvement of optimization methods has been very limited. It is, indeed, necessary to develop new optimization methods to improve the final designs..., and at the same time, reduce the number of function evaluations. Nonlinear optimization methods, such as sequential quadratic programming and interior point solvers, have hardly been embraced by the topology optimization community. Thus, this work focuses on the introduction of this kind of second... for the classical minimum compliance problem. Two state-of-the-art optimization algorithms are investigated and implemented for this structural topology optimization problem. A Sequential Quadratic Programming method (TopSQP) and an interior point method (TopIP) are developed exploiting the specific mathematical

  11. Automatic Installation and Configuration for Large Scale Farms

    CERN Document Server

    Novák, J

    2005-01-01

    Since the early appearance of commodity hardware, the utilization of computers has risen rapidly, and they have become essential in all areas of life. Soon it was realized that nodes are able to work cooperatively, in order to solve new, more complex tasks. This conception took concrete shape in coherent aggregations of computers called farms and clusters. Collective application of nodes, being efficient and economical, was adopted in education, research and industry before long. But maintenance, especially at large scale, appeared as a problem to be resolved. New challenges needed new methods and tools. Development work has been started to build farm management applications and frameworks. In the first part of the thesis, these systems are introduced. After a general description of the matter, a comparative analysis of different approaches and tools illustrates the practical aspects of the theoretical discussion. CERN, the European Organization for Nuclear Research, is the largest particle physics laboratory in the world....

  12. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g^{-1} after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C; they exhibit electrochemical performance superior to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote graphitization. ► The HGCNSs exhibit electrochemical performance superior to graphite.

  13. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    A new package for accelerating large-scale phase-field simulations was developed by using GPU based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneity, including long-range elastic, magnetostatic, and electrostatic interactions. Through the use of specific algorithms in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were each solved with the GPU-based algorithm to test the performance of the package. Comparison of the calculation results between the solver executed on a single CPU and the one on GPU shows that the GPU version runs roughly 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
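
    The semi-implicit Fourier method named above treats the stiff gradient term implicitly in spectral space and the nonlinear term explicitly. A CPU-only numpy sketch for the 2D Allen-Cahn equation follows; the CUDA implementation, long-range interaction terms, and the package's other solvers are not reproduced, and all parameters are illustrative.

        # Semi-implicit Fourier (spectral) time stepping for 2D Allen-Cahn:
        # d(phi)/dt = -M * (phi^3 - phi - kappa * Laplacian(phi))
        import numpy as np

        N, dt, M, kappa = 128, 0.1, 1.0, 1.0
        k = 2.0 * np.pi * np.fft.fftfreq(N)
        k2 = k[:, None] ** 2 + k[None, :] ** 2       # symbol of -Laplacian

        rng = np.random.default_rng(0)
        phi = 0.1 * rng.standard_normal((N, N))      # random initial order parameter

        for _ in range(500):
            nonlinear = phi ** 3 - phi               # f'(phi) for f = (phi^2 - 1)^2 / 4
            phi_hat = (np.fft.fft2(phi) - dt * M * np.fft.fft2(nonlinear)) \
                      / (1.0 + dt * M * kappa * k2)  # gradient term handled implicitly
            phi = np.real(np.fft.ifft2(phi_hat))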

  14. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
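
    The scale of the random-access bottleneck can be illustrated with a simple model: if each of M contending devices picks uniformly among N preambles in the same random-access opportunity, the probability that a given device collides is 1 - (1 - 1/N)^(M-1). The sketch below evaluates this for an assumed 54 contention preambles (a typical LTE figure, used here purely for illustration).

        # Uniform-selection collision model for cellular random access.
        def collision_probability(n_preambles, n_devices):
            # chance that at least one other device picks the same preamble
            return 1.0 - (1.0 - 1.0 / n_preambles) ** (n_devices - 1)

        for m in (10, 100, 1000):
            p = collision_probability(n_preambles=54, n_devices=m)
            print(f"{m:5d} contending devices -> collision probability {p:.3f}")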

  15. WImpiBLAST: web interface for mpiBLAST to help biologists perform large-scale annotation using high performance computing.

    Directory of Open Access Journals (Sweden)

    Parichit Sharma

    The function of a newly sequenced gene can be discovered by determining its sequence homology with known proteins. BLAST is the most extensively used sequence analysis program for sequence similarity search in large databases of sequences. With the advent of next generation sequencing technologies it has now become possible to study genes and their expression at a genome-wide scale through RNA-seq and metagenome sequencing experiments. Functional annotation of all the genes is done by sequence similarity search against multiple protein databases. This annotation task is computationally very intensive and can take days to obtain complete results. The program mpiBLAST, an open-source parallelization of BLAST that achieves superlinear speedup, can be used to accelerate large-scale annotation by using supercomputers and high performance computing (HPC) clusters. Although many parallel bioinformatics applications using the Message Passing Interface (MPI) are available in the public domain, researchers are reluctant to use them due to lack of expertise in the Linux command line and relevant programming experience. With these limitations, it becomes difficult for biologists to use mpiBLAST for accelerating annotation. No web interface is available in the open-source domain for mpiBLAST. We have developed WImpiBLAST, a user-friendly open-source web interface for parallel BLAST searches. It is implemented in Struts 1.3 using a Java backbone and runs atop the open-source Apache Tomcat Server. WImpiBLAST supports script creation and job submission features and also provides a robust job management interface for system administrators. It combines script creation and modification features with job monitoring and management through the Torque resource manager on a Linux-based HPC cluster. Use case information highlights the acceleration of annotation analysis achieved by using WImpiBLAST. Here, we describe the WImpiBLAST web interface features and architecture.
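
    The kind of Torque/PBS job script that such an interface generates and submits can be sketched as follows; the resource requests, node geometry, and mpiBLAST invocation are placeholders rather than WImpiBLAST's actual output.

        # Generate a hypothetical Torque/PBS script for an mpiBLAST run.
        def make_pbs_script(query, db, n_procs, out="results.txt"):
            lines = [
                "#!/bin/bash",
                "#PBS -N mpiblast_job",
                f"#PBS -l nodes={max(1, n_procs // 8)}:ppn=8",  # assumed 8 cores/node
                "#PBS -l walltime=24:00:00",
                "cd $PBS_O_WORKDIR",
                # mpiBLAST mirrors legacy blastall-style flags (illustrative call)
                f"mpirun -np {n_procs} mpiblast -p blastp -d {db} -i {query} -o {out}",
            ]
            return "\n".join(lines) + "\n"

        with open("job.pbs", "w") as f:
            f.write(make_pbs_script("proteins.fasta", "nr", n_procs=64))
        # submit with: qsub job.pbs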

  17. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results: (1) confirmed, in a general way, the procedures for application to pulsed burning, (2) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur, and (3) indicated that steam can terminate continuous burning. Future actions recommended include: (1) modification of the code to perform continuous-burn analyses, which is demonstrated, (2) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (3) changes to the models for estimating burn parameters

  19. Large Scale Hierarchical K-Means Based Image Retrieval With MapReduce

    Science.gov (United States)

    2014-03-27

    flat vocabulary on MapReduce. In 2013, Moise and Shestakov [32, 40] researched large-scale indexing and search with MapReduce. They... time will be greatly reduced; however, image retrieval performance will almost certainly suffer. Moise and Shestakov ran tests with 100M images on 108...
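
    The hierarchical (vocabulary-tree style) k-means idea is that descriptors are partitioned recursively with a small branching factor, so quantizing a query costs on the order of branch × depth distance computations instead of a flat scan over all leaves. A single-machine sketch follows (the MapReduce distribution of this training step is not shown); the branching factor, depth, and data are illustrative.

        # Hierarchical k-means "vocabulary tree" sketch using scipy's kmeans2.
        import numpy as np
        from scipy.cluster.vq import kmeans2

        def build_tree(points, branch=4, depth=3):
            if depth == 0 or len(points) < branch:
                return None                      # leaf
            centroids, labels = kmeans2(points, branch, minit="++", seed=0)
            children = [build_tree(points[labels == b], branch, depth - 1)
                        for b in range(branch)]
            return {"centroids": centroids, "children": children}

        def quantize(tree, vec, path=()):
            if tree is None:
                return path                      # leaf path acts as the visual word id
            b = int(np.argmin(np.linalg.norm(tree["centroids"] - vec, axis=1)))
            return quantize(tree["children"][b], vec, path + (b,))

        descs = np.random.default_rng(0).standard_normal((5000, 64))
        tree = build_tree(descs)
        print(quantize(tree, descs[0]))          # e.g. (2, 0, 3)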

  20. Thermal power generation projects "Large Scale Solar Heating"; EU-Thermie-Projekte "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for a Europe-wide development of the subject technology. The demonstration programme developed from it was judged favourably by the experts but was not immediately (1996) accepted for financial subsidies. In November 1997 the EU Commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid-1997 a smaller project had already been approved, which had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves the transfer of technology. (orig.)

  1. Development of large-scale wind energy conversion system; Ogata furyoku hatsuden system no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Takita, M [New Energy and Industrial Technology Development Organization, Tokyo (Japan)

    1994-12-01

    Described herein are the results of the FY1994 research program for development of a large-scale wind energy conversion system. The study on technological development of key components evaluates the performance of, and confirms the reliability and applicability of, hydraulic systems, centered on those equipped with variable-pitch mechanisms and the electrohydraulic servo valves that control them. The study on blades conducts fatigue and crack-propagation tests, which show that the blades developed have high strength. The study on the speed-increasing gear conducts load tests, confirming the effects of reducing vibration and noise by modification of the gear teeth. The study on the nacelle cover conducts vibration tests to confirm its vibration characteristics, and analyzes three-dimensional vibration by the finite element method. Some components for a 500-kW commercial wind turbine are fabricated, including rotor heads, variable-pitch mechanisms, speed-increasing gears, yaw systems, and hydraulic control systems. The others fabricated include a remote supervisory control system for maintenance, a system to integrate the wind turbine into a power system, and electrical control devices in which site conditions, such as atmospheric temperature and lightning, are taken into consideration.

  2. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Valuation framework for large scale electricity storage in a case with wind curtailment

    International Nuclear Information System (INIS)

    Loisel, Rodica; Mercier, Arnaud; Gatzen, Christoph; Elms, Nick; Petric, Hrvoje

    2010-01-01

    This paper investigates the value of large-scale applications of electricity storage in selected European power systems in the context of wind generation confronted with a grid bottleneck. It analyzes the market value to 2030 of two storage technologies, assuming the market situations projected for Germany and France. The analysis assesses the evolution of storage economics based on the net present value of cash flows. Sensitivities to market and regulatory drivers of value are assessed, e.g. electricity price spreads, ancillary services revenues, wind curtailment and the level of carbon prices. The paper concludes by suggesting possible ways to improve the competitiveness of electricity storage, such as research and development and deployment programmes, and changes to the design of power markets and regulatory arrangements to enable storage owners to better capture the benefits of storage. Such changes would allow electricity storage, where economically viable, to play a critical role in establishing a future sustainable European power system. - Research highlights: → CAES and PHS are not cost-effective under the current market design in France and Germany. → Market reforms are simulated to reward bottleneck avoidance and ancillary reserves. → Storage is profitable when all potential socio-economic benefits are aggregated. → R&D and demonstration programs for storage improvement are economically and socially justified.
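
    The net-present-value logic behind such a valuation can be sketched in a few lines: an annual arbitrage cash flow (buy at off-peak prices, sell at peak prices, net of round-trip losses and fixed O&M) discounted over the plant's life. Every number below is illustrative and not taken from the paper.

        # Toy NPV of price-arbitrage revenue for a storage plant.
        def storage_npv(capex, power_mw, hours_per_day, peak, offpeak,
                        efficiency=0.75, rate=0.08, years=30, opex_frac=0.02):
            energy_mwh = power_mw * hours_per_day * 365          # discharged per year
            margin = energy_mwh * (peak - offpeak / efficiency)  # arbitrage spread
            annual_cash = margin - opex_frac * capex             # net of fixed O&M
            pv = sum(annual_cash / (1.0 + rate) ** t for t in range(1, years + 1))
            return pv - capex

        npv = storage_npv(capex=200e6, power_mw=300, hours_per_day=6,
                          peak=80.0, offpeak=30.0)               # prices in EUR/MWh
        print(f"NPV: {npv / 1e6:.1f} million EUR")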

  4. Distribution of ground rigidity and ground model for seismic response analysis in Hualian project of large scale seismic test

    International Nuclear Information System (INIS)

    Kokusho, T.; Nishi, K.; Okamoto, T.; Tanaka, Y.; Ueshima, T.; Kudo, K.; Kataoka, T.; Ikemi, M.; Kawai, T.; Sawada, Y.; Suzuki, K.; Yajima, K.; Higashi, S.

    1997-01-01

    An international joint research program called HLSST is in progress. HLSST is a large-scale seismic test (LSST) to investigate soil-structure interaction (SSI) during large earthquakes in the field at Hualien, a highly seismic region of Taiwan. A 1/4-scale model building was constructed on the gravelly soil at this site, and backfill material of crushed stone was placed around the model plant after excavation for the construction. The model building and the foundation ground were extensively instrumented to monitor structure and ground response. To accurately evaluate SSI during earthquakes, geotechnical investigations and forced vibration tests were performed during the construction process, namely before/after base excavation, after structure construction and after backfilling. The distribution of the mechanical properties of the gravelly soil and the backfill was measured after the completion of the construction by penetration tests, PS-logging, etc. This paper describes the distribution and the change of the shear wave velocity (V_s) measured by the field tests. Discussion is made of the effect of overburden pressure during the construction process on V_s in the neighbouring soil and, further, of the numerical soil model for SSI analysis. (orig.)

  5. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulation of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of large-scale environments is therefore imperative for the success of such applications, since it increases the immersive experience of users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains time-consuming and largely manual. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures which, unlike existing techniques, can recover missing or occluded texture information by integrating multiple images captured by different optical sensors (ground, aerial, and satellite).
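    As a toy illustration of what a "parameterized geometric primitive" means here, the sketch below fits an axis-aligned flat-roof box (center, footprint extent, roof height) to synthetic roof points. The authors' primitives are considerably richer; the point data and the fitting rule are invented.

```python
# Toy fit of a parameterized primitive (axis-aligned flat-roof box with
# parameters: center, footprint extents, roof height) to roof points.
# A simplified stand-in for the paper's extendible primitives.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic roof points of a 20 m x 12 m building, roof at z ~= 9 m.
pts = np.column_stack([
    rng.uniform(0, 20, 500),
    rng.uniform(0, 12, 500),
    rng.normal(9.0, 0.1, 500),
])

center = pts[:, :2].mean(axis=0)                      # footprint center
extent = pts[:, :2].max(axis=0) - pts[:, :2].min(axis=0)  # footprint size
height = np.median(pts[:, 2])                         # robust roof height
print(f"center={center.round(2)}, extent={extent.round(2)}, "
      f"height={height:.2f}")
```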

  6. Piping research program plan

    International Nuclear Information System (INIS)

    1988-09-01

    This document presents the piping research program plan for the Structural and Seismic Engineering Branch and the Materials Engineering Branch of the Division of Engineering, Office of Nuclear Regulatory Research. The plan describes the research to be performed in the areas of piping design criteria, environmentally assisted cracking, pipe fracture, and leak detection and leak rate estimation. The piping research program addresses the regulatory issues regarding piping design and piping integrity facing the NRC today and in the foreseeable future. The plan discusses the regulatory issues and needs motivating the research; the objectives, key aspects, and schedule for each research project, or group of projects focusing on a specific topic; and, finally, how the research areas are integrated into the regulatory process. The plan presents a snapshot of the piping research program as it exists today. However, the program plan will change as the regulatory issues and needs change. Consequently, this document will be revised on a bi-annual basis to reflect changes in the piping research program. (author)

  7. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
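    The prototype idea is in the same family as Nyström-style low-rank kernel approximation. The sketch below illustrates that flavor with an RBF kernel and randomly chosen prototypes; it is not the authors' exact prototype-selection scheme, and all data are synthetic.

```python
# Sketch of prototype-based low-rank kernel approximation in the
# spirit of the PVM abstract (Nystroem-style); prototypes here are
# chosen at random, unlike the paper's criteria-driven selection.
import numpy as np

def rbf(X, Y, gamma=0.5):
    """RBF kernel matrix between row-sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))                   # labeled + unlabeled data
P = X[rng.choice(len(X), 50, replace=False)]      # 50 prototype vectors

K_nm = rbf(X, P)                                  # n x m cross-kernel
K_mm = rbf(P, P)                                  # m x m prototype kernel
# The full n x n kernel is approximated as K ~= K_nm inv(K_mm) K_nm^T,
# so graph regularizers can be handled in O(n m^2) instead of O(n^3).
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T
print("rank of approximation:", np.linalg.matrix_rank(K_approx))  # <= 50
```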

  8. Extraction of relations between genes and diseases from text and large-scale data analysis: implications for translational research.

    Science.gov (United States)

    Bravo, Àlex; Piñero, Janet; Queralt-Rosinach, Núria; Rautschka, Michael; Furlong, Laura I

    2015-02-21

    Current biomedical research needs to leverage and exploit the large amount of information reported in scientific publications. Automated text mining approaches, in particular those aimed at finding relationships between entities, are key for the identification of actionable knowledge from free text repositories. We present the BeFree system, aimed at identifying relationships between biomedical entities with a special focus on genes and their associated diseases. By exploiting morpho-syntactic information of the text, BeFree is able to identify gene-disease, drug-disease and drug-target associations with state-of-the-art performance. The application of BeFree to real-case scenarios shows its effectiveness in extracting information relevant for translational research. We show the value of the gene-disease associations extracted by BeFree through a number of analyses and integration with other data sources. BeFree succeeds in identifying genes associated with a major cause of morbidity worldwide, depression, which are not present in other public resources. Moreover, large-scale extraction and analysis of gene-disease associations, and integration with current biomedical knowledge, provided interesting insights on the kind of information that can be found in the literature, and raised challenges regarding data prioritization and curation. We found that only a small proportion of the gene-disease associations discovered using BeFree is collected in expert-curated databases. Thus, there is a pressing need to find alternative strategies to manual curation, in order to review, prioritize and curate text-mining data and incorporate it into domain-specific databases. We present our strategy for data prioritization and discuss its implications for supporting biomedical research and applications. BeFree is a novel text mining system that performs competitively for the identification of gene-disease, drug-disease and drug-target associations. Our analyses show that mining only a
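    BeFree itself relies on morpho-syntactic features; purely to make the task concrete, here is a far simpler co-occurrence baseline that flags sentences mentioning both a gene and a disease. The dictionaries and sentences are invented, and this is not the BeFree algorithm.

```python
# Deliberately simple co-occurrence baseline for gene-disease relation
# extraction. BeFree uses richer morpho-syntactic analysis; this sketch
# only illustrates the task. Dictionaries and sentences are invented.
GENES = {"BDNF", "SLC6A4", "TP53"}
DISEASES = {"depression", "cancer"}

sentences = [
    "BDNF polymorphisms have been associated with depression.",
    "The weather was pleasant throughout the trial.",
    "TP53 mutations are frequent in many forms of cancer.",
]

for s in sentences:
    tokens = {t.strip(".,;").lower() for t in s.split()}
    genes = {g for g in GENES if g.lower() in tokens}
    diseases = {d for d in DISEASES if d in tokens}
    for g in genes:
        for d in diseases:
            print(f"candidate association: {g} - {d}  [{s}]")
```

    Co-occurrence alone produces many false positives, which is why systems like BeFree add syntactic evidence before asserting a relation.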

  9. Special education and large-scale assessment in the municipality of Sobral (CE)

    Directory of Open Access Journals (Sweden)

    Ana Paula Lima Barbosa Cardoso

    2012-11-01

    Full Text Available This article aims to discuss and analyze the participation of students with disabilities in the public schools of the city of Sobral-CE in the large-scale assessments developed in that context. It is a case study with a qualitative approach, conducted within the Department of Education and two municipal schools: those with the highest and lowest IDEB results (2009). Data were collected through document analysis, interviews and observation, and analyzed using content analysis. The theoretical framework discusses large-scale evaluation in the Brazilian context in conjunction with the literature on the assessment of teaching for students with disabilities. We describe the educational landscape of Sobral in general, as well as data on special education. The research results discuss the two large-scale evaluations carried out in that municipality: the municipal evaluation and Prova Brasil. Regarding the first, the subjects affirm that students with disabilities participate through a mechanism that prevents their results from affecting those of the other students; such students are called "children of the shore." For Prova Brasil, the subjects again reported the participation of these students in the national testing, while criticizing the appropriateness of that instrument for assessing this particular student body and suggesting the need to develop more "relevant" ones. Finally, it appears that large-scale evaluation calls into question the process of schooling experienced by pupils with disabilities in Sobral-CE, showing the challenges and difficulties of the school inclusion proposals in that context.

  10. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    Science.gov (United States)

    Qi, Sen; Mitchell, Ross E

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using the Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in the educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrant a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data of the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  11. French high level nuclear waste program: key research areas

    International Nuclear Information System (INIS)

    Sombret, G.

    1985-09-01

    The most important aspects of this research program concern disposal safety: the long-term behavior and sensitivity of the materials to the variability inherent in industrial processes, and the characterization of the final product. This research requires different investigations involving various scientific fields, and makes use of radioactive and non-radioactive glass samples as well as industrial-scale glass blocks. Certain studies have now been completed; others are still in progress

  12. Scaling up high throughput field phenotyping of corn and soy research plots using ground rovers

    Science.gov (United States)

    Peshlov, Boyan; Nakarmi, Akash; Baldwin, Steven; Essner, Scott; French, Jasenka

    2017-05-01

    Crop improvement programs require large and meticulous selection processes that collect and analyze data effectively and accurately, in order to generate quality plant products as efficiently as possible and to develop superior cropping and/or crop improvement methods. Typically, data collection for such testing is performed by field teams using hand-held instruments or manually controlled devices. Although steps are taken to reduce error, data collected in this manner can be unreliable due to human error and fatigue, which reduces the ability to make accurate selection decisions. Monsanto engineering teams have developed a high-clearance mobile platform (Rover) as a step towards high-throughput, high-accuracy phenotyping at an industrial scale. The rovers are equipped with GPS navigation, multiple cameras and sensors, and on-board computers to acquire data and compute plant vigor metrics per plot. The supporting IT systems enable automatic path planning, plot identification, image and point-cloud data QA/QC, and near-real-time analysis, with results streamed to enterprise databases for additional statistical analysis and product advancement decisions. Since the rover program was launched in North America in 2013, the number of research plots that can be analyzed in a growing season has expanded dramatically. This work describes some of the successes and challenges in scaling up the rover platform for automated phenotyping to enable science at scale.
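    The abstract does not spell out the rover's vigor metrics. As one plausible example from the field-phenotyping literature, per-plot canopy cover can be estimated from an excess-green (ExG) index on RGB imagery; the random image and threshold below are placeholders.

```python
# Hedged sketch of one common plant-vigor proxy in field phenotyping:
# per-plot canopy cover from an excess-green (ExG) index on RGB
# imagery. The rover's actual metrics are not specified in the
# abstract; the random image stands in for a plot photo.
import numpy as np

rng = np.random.default_rng(0)
plot_rgb = rng.random((480, 640, 3))       # placeholder plot image, [0, 1]

r, g, b = plot_rgb[..., 0], plot_rgb[..., 1], plot_rgb[..., 2]
exg = 2 * g - r - b                        # excess-green index per pixel
canopy_cover = float((exg > 0.1).mean())   # fraction of "green" pixels
print(f"estimated canopy cover: {canopy_cover:.1%}")
```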

  13. Impact of climate change on large scale coastal currents of South Africa

    CSIR Research Space (South Africa)

    Meyer, A

    2010-09-01

    Full Text Available

  14. Energy research program 82

    International Nuclear Information System (INIS)

    1982-01-01

    The energy research program 82 (EFP-82) was prepared by the Danish Ministry of Energy in order to continue the expansion of Danish energy research and development begun under the former Ministry of Trade's programs EM-1 (1976) and EM-2 (1978) and the Ministry of Energy's programs EFP-80 and EFP-81. The new program continues these activities in the period 1982-84 with a total budget of DKK 100 million. The program gives a brief description of the background, principles, organization and financing, and a detailed description of each research area. (BP)

  15. Use of a large-scale rainfall simulator reveals novel insights into stemflow generation

    Science.gov (United States)

    Levia, D. F., Jr.; Iida, S. I.; Nanko, K.; Sun, X.; Shinohara, Y.; Sakai, N.

    2017-12-01

    Detailed knowledge of stemflow generation and its effects on both hydrological and biogeochemical cycling is important for achieving a holistic understanding of forest ecosystems. Field studies and a smaller set of experiments performed under laboratory conditions have increased our process-based knowledge of stemflow production. Building upon these earlier works, a large-scale rainfall simulator was employed to deepen our understanding of stemflow generation processes. The large-scale rainfall simulator provides a unique opportunity to examine a range of rainfall intensities under constant conditions, which is difficult in the field owing to the variable nature of natural rainfall. Stemflow generation was examined for three species, Cryptomeria japonica D. Don (Japanese cedar), Chamaecyparis obtusa (Siebold & Zucc.) Endl. (Japanese cypress), and Zelkova serrata Thunb. (Japanese zelkova), under both leafed and leafless conditions at several rainfall intensities (15, 20, 30, 40, 50, and 100 mm h-1) using the large-scale rainfall simulator at the National Research Institute for Earth Science and Disaster Resilience (Tsukuba, Japan). Stemflow production rates and funneling ratios were examined in relation to both rainfall intensity and canopy structure. Preliminary results indicate a dynamic and complex response of the funneling ratios of individual trees to different rainfall intensities among the species examined. This is partly the result of differing canopy structures, the hydrophobicity of vegetative surfaces, and differential wet-up processes across species and rainfall intensities. This presentation delves into these differences and attempts to distill them into generalizable patterns, which can advance our theories of stemflow generation processes and ultimately permit better stewardship of forest resources. ________________ Funding note: This research was supported by JSPS Invitation Fellowship for Research in
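    The funneling ratio mentioned above is conventionally computed (after Herwitz) as stemflow volume divided by the product of trunk basal area and rainfall depth; a value above 1 means the crown delivers more water to the trunk base than would fall on the basal area alone. A worked toy example with invented values:

```python
# Funneling ratio F = V / (B * P), with V the stemflow volume (L),
# B the trunk basal area (m^2) and P the rainfall depth (mm).
# Since 1 mm of rain over 1 m^2 equals 1 litre, the units cancel.
# All numbers below are illustrative, not the experiment's data.
import math

dbh_cm = 25.0                                  # trunk diameter, cm
basal_area_m2 = math.pi * (dbh_cm / 200.0) ** 2
rain_mm = 30.0                                 # simulated rainfall depth
stemflow_l = 40.0                              # collected stemflow volume

funneling_ratio = stemflow_l / (basal_area_m2 * rain_mm)
print(f"funneling ratio: {funneling_ratio:.1f}")   # ~27: strong funneling
```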

  16. Large-scale, multi-compartment tests in PANDA for LWR-containment analysis and code validation

    International Nuclear Information System (INIS)

    Paladino, Domenico; Auban, Olivier; Zboray, Robert

    2006-01-01

    The large-scale thermal-hydraulic PANDA facility has been used in recent years for investigating passive decay heat removal systems and related containment phenomena relevant to next-generation and current light water reactors. As part of the 5th EURATOM framework programme project TEMPEST, a series of tests was performed in PANDA to experimentally investigate the distribution of hydrogen inside the containment and its effect on the performance of the Passive Containment Cooling System (PCCS) designed for the Economic Simplified Boiling Water Reactor (ESBWR). In a postulated severe accident, a large amount of hydrogen could be released in the Reactor Pressure Vessel (RPV) as a consequence of the cladding Metal-Water (M-W) reaction and discharged together with steam to the Drywell (DW) compartment. In the PANDA tests, hydrogen was simulated using helium. This paper illustrates the results of one TEMPEST test performed in PANDA, named Test T1.2. In Test T1.2, the gas (steam-helium) stratification patterns forming in the large-scale multi-compartment PANDA DW, and the effect of non-condensable gas (helium) on the overall behaviour of the PCCS, were identified. Gas mixing and stratification in a large-scale multi-compartment system are currently being further investigated in PANDA in the frame of the OECD project SETH. The testing philosophy in this new PANDA program is to produce data for code validation in relation to specific phenomena, such as gas stratification in the containment, gas transport between containment compartments, and wall condensation. These types of phenomena are driven by buoyant high-momentum injections (jets) and/or low-momentum injections (plumes), depending on the transient scenario. In this context, the new SETH tests in PANDA are particularly valuable for producing an experimental database for code assessment. This paper also presents an overview of the PANDA SETH tests and the major improvements in instrumentation carried out in the PANDA facility.

  17. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Full Text Available Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework which performs well on classification and regression tasks over small-scale datasets. However, RVM also has certain drawbacks that restrict its practical application: (1) a slow training process, and (2) poor performance when training on large-scale datasets. To solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well on large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we exploit the abundant training samples of large-scale datasets and propose all-features boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques combined with RVM, demonstrating the performance of the proposed approaches on Spark. As a result of this paper, two proposed approaches on Spark for different types of large-scale datasets are made available.
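    The Discrete AdaBoost machinery that DAB-RVM builds on can be sketched generically. Decision stumps stand in below for the RVM base learners, so this shows only the boosting loop (fit on weighted data, weight the learner by its error, reweight the samples), not the paper's implementation or its Spark parallelization.

```python
# Generic Discrete AdaBoost loop, with decision stumps standing in for
# the RVM weak learners used by DAB-RVM (a sketch, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = np.sign(X[:, 0] + 0.5 * rng.normal(size=500))   # labels in {-1, +1}

w = np.full(len(y), 1.0 / len(y))                   # sample weights
ensemble = []                                       # (alpha, feat, thr, sign)

for _ in range(20):
    best = None
    for f in range(X.shape[1]):                     # fit a weighted stump
        for thr in np.quantile(X[:, f], [0.25, 0.5, 0.75]):
            for s in (1.0, -1.0):
                pred = s * np.sign(X[:, f] - thr)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, f, thr, s)
    err, f, thr, s = best
    err = min(max(err, 1e-10), 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)           # learner weight
    pred = s * np.sign(X[:, f] - thr)
    w *= np.exp(-alpha * y * pred)                  # upweight mistakes
    w /= w.sum()
    ensemble.append((alpha, f, thr, s))

# Final prediction: sign of the alpha-weighted vote of all weak learners.
F = sum(a * s * np.sign(X[:, f] - thr) for a, f, thr, s in ensemble)
print(f"training accuracy: {(np.sign(F) == y).mean():.3f}")
```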

  18. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. Traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
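    To make the benefit of regularized covariance estimation concrete, the sketch below uses simple linear shrinkage toward a diagonal target. This is a stand-in illustrating the general idea, not the paper's Bayesian hierarchical estimator; the data and shrinkage intensity are invented.

```python
# Linear shrinkage of a noisy sample covariance toward a diagonal
# target, illustrating why regularization helps when variables far
# outnumber samples (a stand-in for the paper's hierarchical model).
import numpy as np

rng = np.random.default_rng(0)
p, n = 200, 50                       # many variables, few samples
X = rng.normal(size=(n, p))

S = np.cov(X, rowvar=False)          # sample covariance: noisy, singular
target = np.diag(np.diag(S))         # structured target: diagonal
lam = 0.7                            # shrinkage intensity (assumed)
S_shrunk = lam * target + (1 - lam) * S

print("rank of S:       ", np.linalg.matrix_rank(S))         # < p
print("rank of shrunk S:", np.linalg.matrix_rank(S_shrunk))  # = p
```

    The unregularized estimate is singular (rank at most n-1 = 49), while the shrunken estimate is full-rank and invertible, which is what downstream analyses typically require.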

  19. The U.S. Minerals Management Service - oil spill response research program

    International Nuclear Information System (INIS)

    Mullin, J.V.

    1998-01-01

    The Minerals Management Service (MMS) is the principal U.S. Government agency funding offshore oil spill response research. The MMS, a bureau of the Department of the Interior, maintains a comprehensive Oil Spill Response Research program in support of oil spill prevention and response. Through funding provided by MMS, scientists and engineers from the public and private sectors worldwide are working to address outstanding gaps in information and technology concerning the cleanup of oil spills. A large portion of the program is executed through cooperation with major research centers to leverage funds and maximize the sharing of research results. This paper outlines the program, its goals, results from recently funded projects, and future research directions. (author)

  20. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high-precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300 TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70 TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters, and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper describes the design of the database, the changes we needed to make to the AMS for scalability reasons, and how the lessons we learned are applicable to virtually any kind of database server seeking to operate in the petabyte region.