WorldWideScience

Sample records for large-scale test facility

  1. Development of a Large Scale, High Speed Wheel Test Facility

    Science.gov (United States)

    Kondoleon, Anthony; Seltzer, Donald; Thornton, Richard; Thompson, Marc

    1996-01-01

Draper Laboratory, with its internal research and development budget, has for the past two years been funding a joint effort with the Massachusetts Institute of Technology (MIT) for the development of a large scale, high speed wheel test facility. This facility was developed to perform experiments and carry out evaluations on levitation and propulsion designs for MagLev systems currently under consideration. The facility was developed to rotate a large (2 meter) wheel which could operate at peripheral speeds greater than 100 meters/second. The rim of the wheel was constructed of a non-magnetic, non-conductive composite material to avoid the generation of errors from spurious forces. A sensor package containing a multi-axis force and torque sensor, mounted to the base of the station, provides a signal of the lift and drag forces on the package being tested. Position tables mounted on the station allow for the introduction of errors in real time. A computer-controlled data acquisition system was developed around a Macintosh IIfx to record the test data and control the speed of the wheel. This paper describes the development of this test facility. A detailed description of the major components is presented. Recently completed tests, carried out on a novel electrodynamic (EDS) suspension system developed by MIT as part of this joint effort, are described and presented. Adaptation of this facility for linear motor and other propulsion and levitation testing is described.

  2. A review of large-scale testing facilities in geotechnical earthquake engineering

    OpenAIRE

Elgamal, A.; Pitilakis, K.; Raptakis, D.; Garnier, J.; Gopal Madabhushi, S.P.; Pinto, A.; Steidl, J.; Stewart, H.E.; Stokoe, K.H.; Taucer, F.; Tokimatsu, K.; Wallace, J.W.

    2007-01-01

In this new century, new large-scale testing facilities are being developed worldwide for earthquake engineering research. Concurrently, the advances in Information Technology (IT) are increasingly allowing unprecedented opportunities for: remote access and tele-presence during extended remote off-site experimentation; hybrid simulation of entire structural systems through a multi-site experimentation and computational overall model; and near-real time data archival, processing and sha...

  3. Preliminary Design of Large Scale Sodium Thermal-Hydraulic Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Tae Ho; Kim, Tae Joon; Eoh, Jae Hyuk; Lee, Hyeong Yeon; Lee, Jae Han; Jeong, Ji Young; Park, Su Ki; Han, Ji Woong; Yoo, Yong Hwan; Lee, Yong Bum [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2009-10-15

A large scale sodium thermal-hydraulic test facility is being designed for verification of the advanced design concept of the passive decay heat removal circuit (PDRC) in a medium- or large-sized pool-type SFR. In the test, its cooling capability during the long- and short-term periods after a reactor trip will be evaluated, and the experimental data produced will be used for the assessment and verification of the safety and performance analysis codes. Starting with the preliminary design of the test facility this year, using KALIMER-600 as a reference reactor, the basic and detailed designs will be made through 2011-2012 based on the demonstration reactor which is intended to be constructed by 2028 according to a long-term national SFR development plan. The installation is scheduled to be completed by the end of 2013, and the main experiments will commence in 2015 after the startup test in 2014. This paper briefly introduces the preliminary design features which were produced as a first step to assess the appropriateness of the facility design methodology.

  4. Development of explosive event scale model testing capability at Sandia's large scale centrifuge facility

    Energy Technology Data Exchange (ETDEWEB)

Blanchat, T.K.; Davie, N.T.; Calderone, J.J. [and others]

    1998-02-01

Geotechnical structures such as underground bunkers, tunnels, and building foundations are subjected to stress fields produced by the gravity load on the structure and/or any overlying strata. These stress fields may be reproduced on a scaled model of the structure by proportionally increasing the gravity field through the use of a centrifuge. This technology can then be used to assess the vulnerability of various geotechnical structures to explosive loading. Applications of this technology include assessing the effectiveness of earth penetrating weapons, evaluating the vulnerability of various structures, counter-terrorism, and model validation. This document describes the development of expertise in scale model explosive testing on geotechnical structures using Sandia's large scale centrifuge facility. This study focused on buried structures such as hardened storage bunkers or tunnels. Data from this study were used to evaluate the predictive capabilities of existing hydrocodes and structural dynamics codes developed at Sandia National Laboratories (such as Pronto/SPH, Pronto/CTH, and ALEGRA). 7 refs., 50 figs., 8 tabs.
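
    The scaling argument in this abstract (reproducing prototype stress fields by spinning a reduced-scale model) can be made concrete with a short back-of-the-envelope check. The sketch below is not from the report; the scale factor, soil density and depth are illustrative assumptions, and it simply verifies the standard N-g similarity relation.

```python
# Minimal sketch of classic N-g centrifuge scaling (illustrative values, not from the report):
# a 1/N geometric model spun at N times Earth gravity reproduces prototype overburden stress.
RHO = 1800.0      # soil bulk density, kg/m^3 (assumed)
G = 9.81          # Earth gravity, m/s^2
N = 50            # geometric scale factor (assumed): model is 1/50 of the prototype

depth_prototype = 10.0               # m, depth of interest in the prototype structure
depth_model = depth_prototype / N    # homologous depth in the scaled model
g_model = N * G                      # centrifuge acceleration applied to the model

sigma_prototype = RHO * G * depth_prototype   # vertical stress in the prototype, Pa
sigma_model = RHO * g_model * depth_model     # vertical stress in the model, Pa

print(f"prototype stress: {sigma_prototype / 1e3:.1f} kPa")
print(f"model stress:     {sigma_model / 1e3:.1f} kPa (identical by construction)")
```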

  5. PTF, a new facility for pulse field testing of large scale superconducting cables and joints

    NARCIS (Netherlands)

    Smith, Bradford A.; Hale, J. Richard; Zhukovsky, Alex; Michael, Philip C.; Minervini, Joseph V.; Olmstead, Michael M.; Dekow, Gary L.; Rosati, James; Camille, Richard J.; Gung, Chen-yu; Gwinn, David; Silva, Frank; Fairfax, Stephen A.; Shen, Stewart; Knoopers, H.G.; Wessel, S.; Krooshoop, H.J.G.; Shevchenko, O.A.; Godeke, A.; Kate, ten H.H.J.

    1997-01-01

    A magnetic Pulse Test Facility (PTF), in which samples of CICC electrical joints from each ITER home team will be tested, has been fabricated at the MIT Plasma Fusion Center under an ITER task agreement. Construction of this facility has recently been completed, and an initial test phase on the firs

  6. PTF, a new facility for pulse field testing of large scale superconducting cables and joints

    NARCIS (Netherlands)

    Smith, Bradford A.; Hale, J. Richard; Zhukovsky, Alex; Michael, Philip C.; Minervini, Joseph V.; Olmstead, Michael M.; Dekow, Gary L.; Rosati, James; Camille, Richard J.; Gung, Chen-yu; Gwinn, David; Silva, Frank; Fairfax, Stephen A.; Shen, Stewart; Knoopers, H.G.; Wessel, Wilhelm A.J.; Krooshoop, Hendrikus J.G.; Chevtchenko, O.A.; Godeke, A.; ten Kate, Herman H.J.

    1997-01-01

    A magnetic Pulse Test Facility (PTF), in which samples of CICC electrical joints from each ITER home team will be tested, has been fabricated at the MIT Plasma Fusion Center under an ITER task agreement. Construction of this facility has recently been completed, and an initial test phase on the

  7. Functional and large-scale testing of the ATLAS distributed analysis facilities with Ganga

    CERN Document Server

    Vanderster, D C; Biglietti, M; Galeazzi, F; Serfon, C; Slater, M

    2010-01-01

    Effective distributed user analysis requires a system which meets the demands of running arbitrary user applications on sites with varied configurations and availabilities. The challenge of tracking such a system requires a tool to monitor not only the functional statuses of each grid site, but also to perform large-scale analysis challenges on the ATLAS grids. This work presents one such tool, the ATLAS GangaRobot, and the results of its use in tests and challenges. For functional testing, the GangaRobot performs daily tests of all sites; specifically, a set of exemplary applications are submitted to all sites and then monitored for success and failure conditions. These results are fed back into Ganga to improve job placements by avoiding currently problematic sites. For analysis challenges, a cloud is first prepared by replicating a number of desired DQ2 datasets across all the sites. Next, the GangaRobot is used to submit and manage a large number of jobs targeting these datasets. The high-loads resulting ...

  8. Functional and large-scale testing of the ATLAS distributed analysis facilities with Ganga

    Energy Technology Data Exchange (ETDEWEB)

Vanderster, D C [European Organization for Nuclear Research, CERN CH-1211, Geneve 23 (Switzerland); Elmsheuser, J; Serfon, C [Ludwig-Maximilians-Universitaet Muenchen, Geschwister-Scholl-Platz 1, 80539 Munich (Germany); Biglietti, M [University of Birmingham, Edgbaston, Birmingham, B15 2TT (United Kingdom); Galeazzi, F [Istituto Nazionale di Fisica Nucleare, Sezione di Napoli (Italy); Slater, M, E-mail: daniel.colin.vanderster@cern.ch [Istituto Nazionale di Fisica Nucleare, Sezione di Roma Tre (Italy)]

    2010-04-01

Effective distributed user analysis requires a system which meets the demands of running arbitrary user applications on sites with varied configurations and availabilities. The challenge of tracking such a system requires a tool to monitor not only the functional statuses of each grid site, but also to perform large-scale analysis challenges on the ATLAS grids. This work presents one such tool, the ATLAS GangaRobot, and the results of its use in tests and challenges. For functional testing, the GangaRobot performs daily tests of all sites; specifically, a set of exemplary applications are submitted to all sites and then monitored for success and failure conditions. These results are fed back into Ganga to improve job placements by avoiding currently problematic sites. For analysis challenges, a cloud is first prepared by replicating a number of desired DQ2 datasets across all the sites. Next, the GangaRobot is used to submit and manage a large number of jobs targeting these datasets. The high loads resulting from multiple parallel instances of the GangaRobot expose shortcomings in storage and network configurations. The results from a series of cloud-by-cloud analysis challenges starting in fall 2008 are presented.
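
    The daily functional-testing workflow described here (probe jobs submitted to every site, success/failure bookkeeping, problematic sites avoided for user jobs) can be illustrated with a schematic loop. Everything below is a hypothetical sketch, not the Ganga or GangaRobot API; the site names and submit_probe_job stand-in are assumptions used only to show the feedback structure.

```python
# Hypothetical sketch of a site functional-testing loop in the spirit of the workflow
# described above; not the actual Ganga/GangaRobot implementation.
import random
from collections import defaultdict

SITES = ["SITE_A", "SITE_B", "SITE_C"]          # placeholder site names (assumed)
FAILURE_THRESHOLD = 0.5                          # avoid a site if more than half its probes fail

def submit_probe_job(site: str) -> bool:
    """Stand-in for submitting an exemplary analysis job and polling its final status."""
    return random.random() > 0.3                 # pretend roughly 70% of probes succeed

def daily_functional_test(history: dict) -> set:
    """Run one probe per site, update the history, and return the current blacklist."""
    for site in SITES:
        history[site].append(submit_probe_job(site))
    blacklist = set()
    for site, results in history.items():
        failure_rate = 1.0 - sum(results) / len(results)
        if failure_rate > FAILURE_THRESHOLD:
            blacklist.add(site)                  # feedback: steer user jobs away from this site
    return blacklist

history = defaultdict(list)
for day in range(7):
    avoided = daily_functional_test(history)
    print(f"day {day}: avoiding {sorted(avoided) or 'no sites'}")
```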

  9. Modelling and operation strategies of DLR's large scale thermocline test facility (TESIS)

    Science.gov (United States)

    Odenthal, Christian; Breidenbach, Nils; Bauer, Thomas

    2017-06-01

In this work an overview of the TESIS:store thermocline test facility and its current construction status will be given. Based on this, the TESIS:store facility using sensible solid filler material is modelled with a fully transient model, implemented in MATLAB®. Results in terms of the impact of filler size and operation strategies will be presented. While low porosity and small particle diameters for the filler material are beneficial, the operation strategy is one key element with potential for optimization. It is shown that plant operators have to weigh utilization against exergetic efficiency. Different durations of the charging and discharging periods offer further potential for optimization.
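
    The record does not give the model equations; a minimal one-dimensional two-temperature (Schumann-type) packed-bed sketch, with assumed property values, conveys the kind of fully transient thermocline model referred to. The DLR model is implemented in MATLAB and is certainly more detailed; everything below (geometry, properties, operating temperatures) is an illustrative assumption.

```python
# Minimal 1-D two-temperature (Schumann-type) packed-bed thermocline sketch.
# Illustrative assumed properties only; not the TESIS:store model.
import numpy as np

L, N = 10.0, 200                     # bed height (m) and number of grid cells (assumed)
dx = L / N
eps = 0.4                            # porosity (assumed)
u = 0.01                             # fluid interstitial velocity, m/s (assumed)
rho_f, c_f = 1800.0, 1500.0          # molten-salt-like fluid density / specific heat (assumed)
rho_s, c_s = 2600.0, 900.0           # solid filler density / specific heat (assumed)
h_av = 2000.0                        # volumetric heat transfer coefficient, W/(m^3 K) (assumed)
T_in, T_init = 560.0, 290.0          # charging inlet and initial bed temperature, deg C

Tf = np.full(N, T_init)              # fluid temperature profile
Ts = np.full(N, T_init)              # solid temperature profile
dt = 0.2 * dx / u                    # explicit upwind time step with a stability margin

for step in range(int(1800 / dt)):   # simulate half an hour of charging
    exch = h_av * (Ts - Tf)                          # interphase heat exchange, W/m^3
    Tf_up = np.concatenate(([T_in], Tf[:-1]))        # upwind neighbour (inlet boundary)
    Tf = Tf + dt * (-u * (Tf - Tf_up) / dx + exch / (eps * rho_f * c_f))
    Ts = Ts - dt * exch / ((1.0 - eps) * rho_s * c_s)

front = dx * np.argmin(np.abs(Ts - 0.5 * (T_in + T_init)))
print(f"thermocline position after 30 min of charging ~ {front:.1f} m")
```

    The point the sketch makes is the one in the abstract: the front moves more slowly for denser, less porous filler, so the filler properties and the charge/discharge schedule together set how much of the bed is actually utilized.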

  10. A Large-scale Test Facility for Heat Load Measurements down to 1.9 K

    CERN Document Server

    Dufay, L; Rieubland, Jean Michel; Vandoni, Giovanna

    2002-01-01

Laboratory-scale tests aimed at minimizing the thermal loads of the LHC magnet cryostat have accompanied the development of the various mechanical components. For final validation of the industrial design with respect to heat inleaks between large surfaces at different temperatures, a full-scale test cryostat has been constructed. The facility reproduces the same pattern of temperature levels as the LHC dipole cryostat, avoiding the heat inleaks from local components like supports and feedthroughs and carefully minimizing fringe effects due to the truncated geometry of the facility with respect to the LHC cryostats' serial layout. Thermal loads to the actively cooled radiation screen, operated between 50 K and 65 K, are measured by enthalpy difference along its length. At 1.9 K, the loads are obtained from the temperature difference across a superfluid helium exchanger. On the beam screen, the electrical power needed to stabilize the temperature at 20 K yields a direct reading of the heat losses. Precise i...
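
    The enthalpy-difference principle mentioned for the 50-65 K screen amounts to Q = m_dot * delta_h, approximately m_dot * c_p * delta_T for the helium coolant. The numbers below are assumptions chosen only to show the arithmetic, not CERN measurement data.

```python
# Illustrative enthalpy-difference heat-load estimate (assumed numbers, not CERN data).
CP_HELIUM = 5193.0        # J/(kg K), roughly constant for gaseous helium
m_dot = 0.005             # kg/s coolant mass flow (assumed)
T_in, T_out = 50.0, 65.0  # K, screen inlet and outlet temperatures (assumed)

heat_load = m_dot * CP_HELIUM * (T_out - T_in)   # W absorbed along the screen
print(f"screen heat load ~ {heat_load:.1f} W")
```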

  11. Testing gravity on Large Scales

    OpenAIRE

    Raccanelli Alvise

    2013-01-01

    We show how it is possible to test general relativity and different models of gravity via Redshift-Space Distortions using forthcoming cosmological galaxy surveys. However, the theoretical models currently used to interpret the data often rely on simplifications that make them not accurate enough for precise measurements. We will discuss improvements to the theoretical modeling at very large scales, including wide-angle and general relativistic corrections; we then show that for wide and deep...

  12. ROSA-V large scale test facility (LSTF) system description for the third and fourth simulated fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

Suzuki, Mitsuhiro; Nakamura, Hideo; Ohtsu, Iwao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment] [and others]

    2003-03-01

The Large Scale Test Facility (LSTF) is a full-height and 1/48 volumetrically scaled test facility of the Japan Atomic Energy Research Institute (JAERI) for system integral experiments simulating the thermal-hydraulic responses at full-pressure conditions of a 1100 MWe-class pressurized water reactor (PWR) during small-break loss-of-coolant accidents (SBLOCAs) and other transients. The LSTF can also simulate a next-generation-type PWR such as the AP600 reactor. In the fifth phase of the Rig-of-Safety Assessment (ROSA-V) Program, eighty-nine experiments were conducted at the LSTF with the third simulated fuel assembly until June 2001, and five experiments were conducted with the newly installed fourth simulated fuel assembly until December 2002. In the ROSA-V program, various system integral experiments have been conducted to certify the effectiveness of both accident management (AM) measures in beyond-design-basis accidents (BDBAs) and improved safety systems in next-generation reactors. In addition, various separate-effect tests have been conducted to verify and develop computer codes and analytical models to predict non-homogeneous and multi-dimensional phenomena such as heat transfer across the steam generator U-tubes in the presence of non-condensable gases in both current and next-generation reactors. This report presents detailed information on the LSTF system with the third and fourth simulated fuel assemblies to aid experiment planning and the analysis of experiment results. (author)

  13. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large-scale scientific infrastructures, held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large-scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  14. Testing gravity on Large Scales

    Directory of Open Access Journals (Sweden)

    Raccanelli Alvise

    2013-09-01

We show how it is possible to test general relativity and different models of gravity via Redshift-Space Distortions using forthcoming cosmological galaxy surveys. However, the theoretical models currently used to interpret the data often rely on simplifications that make them insufficiently accurate for precise measurements. We discuss improvements to the theoretical modeling at very large scales, including wide-angle and general relativistic corrections; we then show that for wide and deep surveys those corrections need to be taken into account if we want to measure the growth of structures at the few-percent level, and so perform tests of gravity, without introducing systematic errors. Finally, we report the results of some recent cosmological model tests carried out using those precise models.
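
    For orientation (textbook material, not quoted from the abstract), the redshift-space distortion test is built on the linear Kaiser relation between the observed galaxy power spectrum and the growth rate of structure; the wide-angle and relativistic corrections discussed above refine this plane-parallel expression on very large scales.

```latex
% Linear (Kaiser) redshift-space galaxy power spectrum and the growth-rate test.
P_g^{s}(k,\mu) = \left(b + f\mu^{2}\right)^{2} P_m(k),
\qquad
f(z) \equiv \frac{\mathrm{d}\ln D}{\mathrm{d}\ln a} \simeq \Omega_m(z)^{\gamma},
\quad \gamma \approx 0.55 \ \text{in general relativity.}
```

    Here $b$ is the linear galaxy bias, $\mu$ the cosine of the angle to the line of sight and $D$ the linear growth factor; a measured growth rate inconsistent with $\gamma \approx 0.55$ would point to a departure from general relativity.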

  15. Condition Monitoring of Large-Scale Facilities

    Science.gov (United States)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  16. Scientific design of a large-scale sodium thermal–hydraulic test facility for KALIMER—Part I: Scientific facility design

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Soon-Joon; Lee, Doo-Yong [FNC Tech. Co. Ltd., SNU 135-308, Kwanak-Ro 1, Kwanak-Gu, Seoul 151-742 (Korea, Republic of); Eoh, Jae-Hyuk, E-mail: jheoh@kaeri.re.kr [Korea Atomic Energy Research Institute, Fast Reactor Technology Development Division, 1045 Daedeok-daero, Yuseong, Daejeon 305-353 (Korea, Republic of); Lee, Tae-Ho; Lee, Yong-Bum [Korea Atomic Energy Research Institute, Fast Reactor Technology Development Division, 1045 Daedeok-daero, Yuseong, Daejeon 305-353 (Korea, Republic of)

    2013-12-15

Highlights: • A 1/5-scale integral test facility design for a pool-type sodium fast reactor. • Similarity assessment of heat transfer between the solid and the fluid. • Reasonable scale-down of the prototype sodium fast reactor. • A 1-dimensional verification calculation with the scaled-down model. • The scale-down approach showed good agreement of overall behaviors. - Abstract: A large-scale test facility design simulating a pool-type sodium fast reactor was carried out on the basis of a rigorous scaling approach, with a plan for its installation by 2016. In particular, the similarity in heat transfer between solid and fluid was intensively discussed together with the distortion, and viable design parameters were derived. A 1-dimensional verification calculation was also conducted and showed good agreement for the important parameters. The scaling approach in this study is expected to be used for the exact reproduction of heat transfer between fluid and solid in a single-phase fluid system.

  17. Large-Scale Damage Control Facility

    Data.gov (United States)

Federal Laboratory Consortium — FUNCTION: Performs large-scale fire protection experiments that simulate actual Navy platform conditions. Remote control firefighting systems are also tested....

  18. Large Scale Flame Spread Environmental Characterization Testing

    Science.gov (United States)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in a chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid acceleratory turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in gaseous number of moles. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat due to the combustion event. This proved an effective means for chamber overpressure mitigation for those tests producing the most total heat release and thusly was determined to be a feasible mitigation
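
    The observation that the chamber pressure rise is dominated by heat generation, with a much smaller contribution from the change in the number of gas moles, can be illustrated with a closed-vessel ideal-gas estimate. The values below are assumptions for illustration; real tests also lose heat to the chamber walls, which this neglects.

```python
# Closed-vessel ideal-gas estimate of pressure rise from a burn (illustrative assumptions only).
R = 8.314           # J/(mol K)
V = 0.4             # chamber free volume, m^3 (assumed)
gamma = 1.4         # ratio of specific heats for air
T = 300.0           # K, representative gas temperature for the mole-change term
Q = 50e3            # J of heat released by the burned sample (assumed)
dn = 0.05           # net increase in gas moles from combustion products (assumed)

dP_heat = (gamma - 1.0) * Q / V          # constant-volume heating of the gas
dP_moles = dn * R * T / V                # additional partial pressure of newly formed gas
print(f"pressure rise: {dP_heat/1e3:.1f} kPa from heat, {dP_moles/1e3:.2f} kPa from added moles")
```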

  19. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  20. How Large-Scale Research Facilities Connect to Global Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn

    2013-01-01

Policies for large-scale research facilities (LSRFs) often highlight their spillovers to industrial innovation and their contribution to the external connectivity of the regional innovation system hosting them. Arguably, the particular institutional features of LSRFs are conducive for collaborative research. However, based on data on publications produced in 2006–2009 at the Neutron Science Directorate of Oak Ridge National Laboratory in Tennessee (United States), we find that internationalization of its collaborative research is restrained by coordination costs similar to those characterizing other institutional settings. Policies mandating LSRFs should consider that research prioritized on the basis of technological relevance limits the international reach of collaborations. Additionally, the propensity for international collaboration is lower for resident scientists than for those affiliated...

  1. Sequential Tests for Large Scale Learning

    NARCIS (Netherlands)

    Korattikara, A.; Chen, Y.; Welling, M.

    2016-01-01

    We argue that when faced with big data sets, learning and inference algorithms should compute updates using only subsets of data items. We introduce algorithms that use sequential hypothesis tests to adaptively select such a subset of data points. The statistical properties of this subsampling proce
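
    The abstract is cut off, but the core idea, growing a random subset of the data until a hypothesis test is confident enough to make the decision the full data set would make, can be sketched generically. The snippet below is a simplified illustration of that principle, not the authors' algorithm; the threshold, batch size and simulated data are assumptions.

```python
# Generic sketch of subset-based sequential testing: decide whether the mean of a large
# data set exceeds a threshold mu0 while looking at as few items as possible.
# Simplified illustration of the principle, not the authors' algorithm.
import numpy as np
from scipy import stats

def sequential_mean_test(data, mu0, eps=0.01, batch=100, seed=0):
    perm = np.random.default_rng(seed).permutation(len(data))
    n = 0
    while True:
        n = min(n + batch, len(data))
        sample = data[perm[:n]]
        res = stats.ttest_1samp(sample, mu0)       # can we tell the mean apart from mu0 yet?
        if res.pvalue < eps or n == len(data):     # confident enough, or out of data
            return sample.mean() > mu0, n

data = np.random.default_rng(1).normal(loc=0.3, scale=1.0, size=100_000)
decision, used = sequential_mean_test(data, mu0=0.0)
print(f"mean > mu0? {decision}; decided after looking at {used} of {len(data)} items")
```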

  2. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  3. GAS MIXING ANALYSIS IN A LARGE-SCALED SALTSTONE FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S

    2008-05-28

    Computational fluid dynamics (CFD) methods have been used to estimate the flow patterns mainly driven by temperature gradients inside vapor space in a large-scaled Saltstone vault facility at Savannah River site (SRS). The purpose of this work is to examine the gas motions inside the vapor space under the current vault configurations by taking a three-dimensional transient momentum-energy coupled approach for the vapor space domain of the vault. The modeling calculations were based on prototypic vault geometry and expected normal operating conditions as defined by Waste Solidification Engineering. The modeling analysis was focused on the air flow patterns near the ventilated corner zones of the vapor space inside the Saltstone vault. The turbulence behavior and natural convection mechanism used in the present model were benchmarked against the literature information and theoretical results. The verified model was applied to the Saltstone vault geometry for the transient assessment of the air flow patterns inside the vapor space of the vault region using the potential operating conditions. The baseline model considered two cases for the estimations of the flow patterns within the vapor space. One is the reference nominal case. The other is for the negative temperature gradient between the roof inner and top grout surface temperatures intended for the potential bounding condition. The flow patterns of the vapor space calculated by the CFD model demonstrate that the ambient air comes into the vapor space of the vault through the lower-end ventilation hole, and it gets heated up by the Benard-cell type circulation before leaving the vault via the higher-end ventilation hole. The calculated results are consistent with the literature information. Detailed results and the cases considered in the calculations will be discussed here.

  4. GroFi: Large-scale fiber placement research facility

    Directory of Open Access Journals (Sweden)

    Christian Krombholz

    2016-03-01

    and processes for large-scale composite components. Due to the use of coordinated and simultaneously working layup units a high exibility of the research platform is achieved. This allows the investigation of new materials, technologies and processes on both, small coupons, but also large components such as wing covers or fuselage skins.

  5. Large-Scale Spray Releases: Initial Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.

    2012-12-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and

  6. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated...
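
    The abstract fragment describes characterizing the segment of interest as the outliers of an estimated background distribution. A compact stand-in for that idea is sketched below, using a robust Gaussian background model and Benjamini-Hochberg false-discovery-rate control as the threshold-selection rule; the authors' own threshold-selection method is not reproduced here, and the image is synthetic.

```python
# Segmentation as outlier detection: model the background, test every pixel, and keep the
# pixels whose p-values survive Benjamini-Hochberg FDR control.
# Stand-in illustration; not the authors' exact threshold-selection scheme.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
background = rng.normal(100.0, 5.0, size=(64, 64))          # synthetic background image
image = background.copy()
image[20:30, 20:30] += 25.0                                  # bright segment of interest

mu = np.median(image)                                        # robust background estimates
sigma = stats.median_abs_deviation(image, axis=None, scale="normal")
p_values = stats.norm.sf(image, loc=mu, scale=sigma).ravel() # one-sided test per pixel

# Benjamini-Hochberg: largest k with p_(k) <= (k/m) * q
q = 0.01
order = np.argsort(p_values)
m = p_values.size
passed = p_values[order] <= q * (np.arange(1, m + 1) / m)
threshold = p_values[order][passed].max() if passed.any() else 0.0

mask = (p_values <= threshold).reshape(image.shape)
print("segmented pixels:", int(mask.sum()), "(true segment has 100)")
```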

  7. PROPERTIES IMPORTANT TO MIXING FOR WTP LARGE SCALE INTEGRATED TESTING

    Energy Technology Data Exchange (ETDEWEB)

    Koopman, D.; Martino, C.; Poirier, M.

    2012-04-26

    Large Scale Integrated Testing (LSIT) is being planned by Bechtel National, Inc. to address uncertainties in the full scale mixing performance of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. External review boards have raised questions regarding the overall representativeness of simulants used in previous mixing tests. Accordingly, WTP requested the Savannah River National Laboratory (SRNL) to assist with development of simulants for use in LSIT. Among the first tasks assigned to SRNL was to develop a list of waste properties that matter to pulse-jet mixer (PJM) mixing of WTP tanks. This report satisfies Commitment 5.2.3.1 of the Department of Energy Implementation Plan for Defense Nuclear Facilities Safety Board Recommendation 2010-2: physical properties important to mixing and scaling. In support of waste simulant development, the following two objectives are the focus of this report: (1) Assess physical and chemical properties important to the testing and development of mixing scaling relationships; (2) Identify the governing properties and associated ranges for LSIT to achieve the Newtonian and non-Newtonian test objectives. This includes the properties to support testing of sampling and heel management systems. The test objectives for LSIT relate to transfer and pump out of solid particles, prototypic integrated operations, sparger operation, PJM controllability, vessel level/density measurement accuracy, sampling, heel management, PJM restart, design and safety margin, Computational Fluid Dynamics (CFD) Verification and Validation (V and V) and comparison, performance testing and scaling, and high temperature operation. The slurry properties that are most important to Performance Testing and Scaling depend on the test objective and rheological classification of the slurry (i

  8. Large-scale direct shear testing of geocell reinforced soil

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

The tests on the shear properties of geocell reinforced soils were carried out using large-scale direct shear equipment with shear-box dimensions of 500 mm×500 mm×400 mm (length×width×height). Three types of specimens, silty gravel soil, geocell reinforced silty gravel soil and geocell reinforced cement-stabilized silty gravel soil, were used to investigate the shear stress-displacement behavior, the shear strength and the strengthening mechanism of geocell reinforced soils. Comparisons of the large-scale shear test with the triaxial compression test for the same type of soil were also conducted to evaluate the influence of the testing method on the shear strength. The test results show that the unreinforced soil and the geocell reinforced soil exhibit similar nonlinear features in the shear stress-displacement behavior. The geocell reinforced cement-stabilized soil has a quasi-elastic characteristic in the case of normal stress coming up to 1.0 GPa. The tests with geocell reinforcement result in an increase of 244% in cohesion, and the tests with geocell and cement stabilization result in an increase of 10 times in cohesion compared with the unreinforced soil. The friction angle does not change markedly. The geocell reinforcement thus contributes a large amount of cohesion to the shear strength of soils.
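
    The cohesion and friction-angle comparisons reported above come from fitting the Mohr-Coulomb line tau = c + sigma*tan(phi) to the peak shear strengths measured at several normal stresses. The short fit below uses made-up data points purely to illustrate the reduction; they are not values from the paper.

```python
# Mohr-Coulomb fit tau = c + sigma * tan(phi) to direct-shear results.
# The data points are illustrative assumptions, not values from the paper.
import numpy as np

normal_stress = np.array([50.0, 100.0, 200.0, 300.0])     # kPa (assumed)
peak_shear    = np.array([55.0, 95.0, 170.0, 248.0])      # kPa (assumed)

slope, intercept = np.polyfit(normal_stress, peak_shear, 1)
cohesion = intercept                        # kPa, intercept of the failure envelope
friction_angle = np.degrees(np.arctan(slope))

print(f"cohesion c ~ {cohesion:.1f} kPa, friction angle phi ~ {friction_angle:.1f} deg")
```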

  9. Cleanliness improvements of NIF (National Ignition Facility) amplifiers as compared to previous large-scale lasers

    Energy Technology Data Exchange (ETDEWEB)

    Honig, J

    2004-06-09

    Prior to the recent commissioning of the first NIF (National Ignition Facility) beamline, full-scale laser-amplifier-glass cleanliness experiments were performed. Aerosol measurements and obscuration data acquired using a modified flatbed scanner compare favorably to historical large-scale lasers and indicate that NIF is the cleanest large-scale laser built to date.

  10. Large Scale and Performance tests of the ATLAS Online Software

    Institute of Scientific and Technical Information of China (English)

Alexandrov; H. Wolters; et al.

    2001-01-01

One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and returned into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs, a configuration which is getting closer to the final size. Large scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the aspects of the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems has been emulated. This paper presents a brief overview of the online system structure, its components and the large scale integration tests and their results.

  11. Using Large Scale Test Results for Pedagogical Purposes

    DEFF Research Database (Denmark)

    Dolin, Jens

    2012-01-01

The use and influence of large scale tests (LST), both national and international, has increased dramatically within the last decade. This process has revealed a tension between the legitimate need for information about the performance of the educational system and teachers to inform policy...... washback effects known from other research but additionally gave some insight into teachers’ attitudes towards LSTs. To account for these findings, results from another research project - the Validation of PISA - will be included. This project analyzed how PISA has influenced the Danish educational...

  12. Analysis of system thermal hydraulic responses for passive safety injection experiment at ROSA-IV Large Scale Test Facility. Using JAERI modified version of RELAP5/MOD2 code

    Energy Technology Data Exchange (ETDEWEB)

    Asaka, Hideaki; Yonomoto, Taisuke; Kukita, Yutaka (Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment)

    1994-12-01

An experiment was conducted at the ROSA-IV/Large Scale Test Facility (LSTF) on the performance of a gravity-driven emergency core coolant (ECC) injection system attached to a pressurized water reactor (PWR). Such a gravity-driven injection system, though not used in the current-generation PWRs, is proposed for future reactor designs. The experiment was performed to identify key phenomena peculiar to the operation of a gravity injection system and to provide a database for code assessment against such phenomena. The simulated injection system consisted of a tank which was initially filled with cold water at the same pressure as the primary system. The tank was connected at its top and bottom, respectively, to the cold leg and the vessel downcomer. The injection into the downcomer was driven primarily by the static head difference between the cold water in the tank and the hot water in the pressure balance line (PBL) connecting the cold leg to the tank top. The injection flow was oscillatory after the flow through the PBL became two-phase flow. The experiment was analyzed post-test using a JAERI-modified version of the RELAP5/MOD2 code. The code calculation simulated reasonably well the system responses observed in the experiment, and suggested that the oscillations in the injection flow were caused by oscillatory liquid holdup in the PBL connecting the cold leg to the tank top. (author).
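
    The driving force described above is the hydrostatic imbalance between the cold water column in the tank and the hot water in the pressure balance line. A one-line estimate with assumed densities and elevation shows its typical magnitude; the numbers are illustrative, not LSTF data.

```python
# Hydrostatic driving head of a gravity-driven injection line (illustrative assumed numbers).
G = 9.81
H = 5.0                  # m, elevation spanned by the competing water columns (assumed)
rho_cold = 990.0         # kg/m^3, cold tank water (assumed)
rho_hot = 750.0          # kg/m^3, hot water / two-phase mixture in the PBL (assumed)

dp = G * H * (rho_cold - rho_hot)     # Pa of net driving pressure
print(f"driving pressure ~ {dp / 1e3:.1f} kPa")
```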

  13. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

over a 1 hr period. Long-term (72 hr) drift tests revealed water elevation changes of ±0.25 mm, which were correlated with temperature changes...and replace. The ultrasonic sensors clamp onto the pipe and are fully functional on PVC, steel, stainless steel, or iron pipes. Figure 5 shows a

  14. Large-Scale Multiobjective Static Test Generation for Web-Based Testing with Integer Programming

    Science.gov (United States)

    Nguyen, M. L.; Hui, Siu Cheung; Fong, A. C. M.

    2013-01-01

    Web-based testing has become a ubiquitous self-assessment method for online learning. One useful feature that is missing from today's web-based testing systems is the reliable capability to fulfill different assessment requirements of students based on a large-scale question data set. A promising approach for supporting large-scale web-based…

  15. The Netherlands Roadmap for Large-scale Research Facilities; Nederlandse Roadmap Grootschalige Onderzoeksfaciliteiten

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-10-15

Large-scale research facilities are of inestimable strategic value for science and research and, hence, for the Dutch knowledge economy. In July 2007, the Dutch Minister of Education, Culture and Science set up the National Roadmap Committee for Large-Scale Research Facilities, whose main task was to advise him as to which large-scale research facilities the Netherlands should construct or participate in within an international context. In the present advisory report, the Committee presents 25 large-scale research facilities whose construction or operation the Committee believes is important for the robustness and innovativeness of the Dutch science system.

  16. Using Large Scale Structure to test Multifield Inflation

    CERN Document Server

    Ferraro, Simone

    2014-01-01

Primordial non-Gaussianity of local type is known to produce a scale-dependent contribution to the galaxy bias. Several classes of multi-field inflationary models predict non-Gaussian bias which is stochastic, in the sense that dark matter and halos don't trace each other perfectly on large scales. In this work, we forecast the ability of next-generation Large Scale Structure surveys to constrain common types of primordial non-Gaussianity like $f_{NL}$, $g_{NL}$ and $\tau_{NL}$ using halo bias, including stochastic contributions. We provide fitting functions for statistical errors on these parameters which can be used for rapid forecasting or survey optimization. A next-generation survey with volume $V = 25 h^{-3}$ Mpc$^3$, median redshift $z = 0.7$ and mean bias $b_g = 2.5$, can achieve $\sigma(f_{NL}) = 6$, $\sigma(g_{NL}) = 10^5$ and $\sigma(\tau_{NL}) = 10^3$ if no mass information is available. If halo masses are available, we show that optimally weighting the halo field in order to reduce sample variance...

  17. Goethite Bench-scale and Large-scale Preparation Tests

    Energy Technology Data Exchange (ETDEWEB)

    Josephson, Gary B.; Westsik, Joseph H.

    2011-10-23

The Hanford Waste Treatment and Immobilization Plant (WTP) is the keystone for cleanup of high-level radioactive waste from our nation's nuclear defense program. The WTP will process high-level waste from the Hanford tanks and produce immobilized high-level waste glass for disposal at a national repository, low activity waste (LAW) glass, and liquid effluent from the vitrification off-gas scrubbers. The liquid effluent will be stabilized into a secondary waste form (e.g. grout-like material) and disposed on the Hanford site in the Integrated Disposal Facility (IDF) along with the low-activity waste glass. The major long-term environmental impact at Hanford results from technetium that volatilizes from the WTP melters and finally resides in the secondary waste. Laboratory studies have indicated that pertechnetate ({sup 99}TcO{sub 4}{sup -}) can be reduced and captured into a solid solution of {alpha}-FeOOH, goethite (Um 2010). Goethite is a stable mineral and can significantly retard the release of technetium to the environment from the IDF. The laboratory studies were conducted using reaction times of many days, which is typical of environmental subsurface reactions that were the genesis of this new process. This study was the first step in considering adaptation of the slow laboratory steps to a larger-scale and faster process that could be conducted either within the WTP or within the effluent treatment facility (ETF). Two levels of scale-up tests were conducted (25x and 400x). The largest scale-up produced slurries of Fe-rich precipitates that contained rhenium as a nonradioactive surrogate for {sup 99}Tc. The slurries were used in melter tests at Vitreous State Laboratory (VSL) to determine whether captured rhenium was less volatile in the vitrification process than rhenium in an unmodified feed. A critical step in the technetium immobilization process is to chemically reduce Tc(VII) in the pertechnetate (TcO{sub 4}{sup -}) to Tc(IV) by reaction with the

  18. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez, Marcello [Univ. of Toronto, ON (Canada); Baldauf, T. [Inst. of Advanced Studies, Princeton, NJ (United States); Bond, J. Richard [Univ. of Toronto, ON (Canada); Canadian Inst. for Advanced Research, Toronto, ON (Canada); Dalal, N. [Univ. of Illinois, Urbana-Champaign, IL (United States); Putter, R. D. [Jet Propulsion Lab., Pasadena, CA (United States); California Inst. of Technology (CalTech), Pasadena, CA (United States); Dore, O. [Jet Propulsion Lab., Pasadena, CA (United States); California Inst. of Technology (CalTech), Pasadena, CA (United States); Green, Daniel [Univ. of Toronto, ON (Canada); Canadian Inst. for Advanced Research, Toronto, ON (Canada); Hirata, Chris [The Ohio State Univ., Columbus, OH (United States); Huang, Zhiqi [Univ. of Toronto, ON (Canada); Huterer, Dragan [Univ. of Michigan, Ann Arbor, MI (United States); Jeong, Donghui [Pennsylvania State Univ., University Park, PA (United States); Johnson, Matthew C. [York Univ., Toronto, ON (Canada); Perimeter Inst., Waterloo, ON (Canada); Krause, Elisabeth [Stanford Univ., CA (United States); Loverde, Marilena [Univ. of Chicago, IL (United States); Meyers, Joel [Univ. of Toronto, ON (Canada); Meeburg, Daniel [Univ. of Toronto, ON (Canada); Senatore, Leonardo [Stanford Univ., CA (United States); Shandera, Sarah [Pennsylvania State Univ., University Park, PA (United States); Silverstein, Eva [Stanford Univ., CA (United States); Slosar, Anze [Brookhaven National Lab. (BNL), Upton, NY (United States); Smith, Kendrick [Perimeter Inst., Waterloo, Toronto, ON (Canada); Zaldarriaga, Matias [Univ. of Toronto, ON (Canada); Assassi, Valentin [Cambridge Univ. (United Kingdom); Braden, Jonathan [Univ. of Toronto, ON (Canada); Hajian, Amir [Univ. of Toronto, ON (Canada); Kobayashi, Takeshi [Perimeter Inst., Waterloo, Toronto, ON (Canada); Univ. of Toronto, ON (Canada); Stein, George [Univ. of Toronto, ON (Canada); Engelen, Alexander van [Univ. of Toronto, ON (Canada)

    2014-12-15

The statistics of primordial curvature fluctuations are our window into the period of inflation, where these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Large-scale structure is, however, from where drastic improvements should originate. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude $f_{\rm NL}^{\rm loc}$ ($f_{\rm NL}^{\rm eq}$), natural target levels of sensitivity are $\Delta f_{\rm NL}^{\rm loc,\,eq} \simeq 1$. We highlight that such levels are within reach of future surveys by measuring 2-, 3- and 4-point statistics of the galaxy spatial distribution. This paper summarizes a workshop held at CITA (University of Toronto) on October 23-24, 2014.

  19. Large scale facilities for synchrotron radiation and neutrons. New possibilities for Denmark

    Energy Technology Data Exchange (ETDEWEB)

Feidenhans'l, R.

    2003-02-01

    New large-scale facilities for investigating the structure and dynamics of matter and biological systems are currently under construction or are being planned at many locations around Europe. These facilities are likely to have a large impact on the science landscape in Europe, and more locally, as some will be located in the immediate neighbourhood of Denmark. The facilities will enable new analytical capabilities of matter, which will push the frontiers of science and technology in many areas of research of importance for Denmark. This report provides an overview of the new facilities (including very rough funding estimates) of importance for Danish science, describes possible ways of engaging in the various projects and identifies potential user groups. The report also includes a summary of the status of the current use of existing facilities as a benchmark. This is done by showing different cross sections through this multiple parameter space of activities. The intention is that the report should serve as guideline for making a long-term national strategy for the exploitation of large-scale facilities in order to help to optimise their impact on science, education and industry within Denmark, and to safeguard the prominent internationally leading role that Denmark has in this area. (LN)

  20. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    CERN Document Server

    Alvarez, Marcelo; Bond, J Richard; Dalal, Neal; de Putter, Roland; Doré, Olivier; Green, Daniel; Hirata, Chris; Huang, Zhiqi; Huterer, Dragan; Jeong, Donghui; Johnson, Matthew C; Krause, Elisabeth; Loverde, Marilena; Meyers, Joel; Meerburg, P Daniel; Senatore, Leonardo; Shandera, Sarah; Silverstein, Eva; Slosar, Anže; Smith, Kendrick; Zaldarriaga, Matias; Assassi, Valentin; Braden, Jonathan; Hajian, Amir; Kobayashi, Takeshi; Stein, George; van Engelen, Alexander

    2014-01-01

The statistics of primordial curvature fluctuations are our window into the period of inflation, where these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Large-scale structure is, however, from where drastic improvements should originate. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude $f_{\rm NL}^{\rm loc}$ ($f_{\rm NL}^{\rm eq}$), natural target levels of sensitivity are $\Delta f_{\rm NL}^{\rm loc, eq.} \simeq 1$. We highlight that such levels are within reach of future surveys by measuring 2-, 3- and 4-point statistics of the galaxy spatial distribution. This paper summarizes a workshop held at CITA (University of Toronto) on October 23-24, 2014...

  1. A NESTED PARTITIONS FRAMEWORK FOR SOLVING LARGE-SCALE MULTICOMMODITY FACILITY LOCATION PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    Leyuan SHI; Robert R.MEYER; Mehmet BOZBAY; Andrew J.MILLER

    2004-01-01

Large-scale multicommodity facility location problems are generally intractable with respect to standard mixed-integer programming (MIP) tools such as the direct application of general-purpose Branch & Cut (BC) commercial solvers (e.g., CPLEX). In this paper, the authors investigate a nested partitions (NP) framework that combines meta-heuristics with MIP tools (including branch-and-cut). We also consider a variety of alternative formulations and decomposition methods for this problem class. Our results show that our NP framework is capable of efficiently producing very high quality solutions to multicommodity facility location problems. For large-scale problems in this class, this approach is significantly faster and generates better feasible solutions than either CPLEX (applied directly to the given MIP) or the iterative Lagrangian-based methods that have generally been regarded as the most effective structure-based techniques for optimization of these problems. We also briefly discuss some other large-scale MIP problem classes for which this approach is expected to be very effective.
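
    To make the nested partitions idea concrete: the most promising region is defined by the decisions fixed so far, it is split into subregions, random samples are drawn from each subregion and from the surrounding region, and the search moves into the winning subregion or backtracks. The sketch below is a bare-bones generic NP skeleton on a toy binary open/close problem; the paper's framework additionally uses MIP/branch-and-cut and Lagrangian components that are not shown, and all problem data here are made up.

```python
# Bare-bones sketch of the nested partitions (NP) metaheuristic skeleton on a toy binary
# open/close problem. Generic illustration only; not the authors' full framework.
import random

random.seed(0)
N = 12                                        # number of binary facility decisions (toy size)
WEIGHTS = [random.uniform(-5, 5) for _ in range(N)]

def cost(x):
    """Toy objective: linear open/close costs plus a mild cardinality penalty."""
    return sum(w * xi for w, xi in zip(WEIGHTS, x)) + 0.5 * abs(sum(x) - N // 3)

def sample_best(fixed, k=20):
    """Randomly complete the partially fixed solution k times; return the best (cost, x)."""
    best = None
    for _ in range(k):
        x = [fixed.get(i, random.randint(0, 1)) for i in range(N)]
        c = cost(x)
        if best is None or c < best[0]:
            best = (c, x)
    return best

fixed = {}                                    # most promising region = decisions fixed so far
incumbent = sample_best(fixed)
while len(fixed) < N:
    d = len(fixed)                            # next decision to branch on
    candidates = {0: sample_best({**fixed, d: 0}),   # subregion with decision d closed
                  1: sample_best({**fixed, d: 1})}   # subregion with decision d open
    if fixed:                                 # below the root there is a surrounding region
        parent = dict(list(fixed.items())[:-1])
        candidates["surround"] = sample_best(parent)
    best_key = min(candidates, key=lambda key: candidates[key][0])
    if candidates[best_key][0] < incumbent[0]:
        incumbent = candidates[best_key]
    if best_key == "surround":
        fixed = parent                        # backtrack one level and re-partition
    else:
        fixed[d] = best_key                   # move into the winning subregion

print("best cost found:", round(incumbent[0], 3))
```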

  2. Plasma separation process facility for large-scale stable isotope production

    Energy Technology Data Exchange (ETDEWEB)

    Bigelow, T.S.; Collins, E.D.; Tracy, J.G. [Oak Ridge National Lab., TN (United States)

    1997-12-01

    A facility for large-scale separation of stable isotopes using the plasma separation process (PSP) is under development at the Oak Ridge National Laboratory. The PSP is capable of separating isotopes at a large throughput rate with medium purity product and at relatively low cost. The PSP has a number of convenient features that make it an attractive technology for general isotope separation purposes. Several isotopes for medical and industrial applications, including {sup 102}Pd, {sup 98}Mo, {sup 203}Tl, {sup 184}W, and others, are expected to be processed in this facility. The large throughput and low processing cost of the PSP will likely lead to new applications for stable isotopes. A description of this facility and its typical throughput capability is presented here.

  3. Experimental and theoretical study of large scale debris bed reflood in the PEARL facility

    Energy Technology Data Exchange (ETDEWEB)

    Chikhi, Nourdine, E-mail: nourdine.chikhi@irsn.fr; Fichot, F.

    2017-02-15

Highlights: • Five reflooding tests have been carried out with an experimental bed, 500 mm in height and 500 mm in diameter, made of 4 mm stainless steel balls. • For the first time, such a large bed was heated practically homogeneously. • The quench front velocity was determined from thermocouple measurements inside the bed. • An analytical model, assuming a quasi-steady progression of the quench front, allows the conversion ratio to be predicted in most cases. • It appears that the efficiency of cooling can be increased only up to a certain limit when increasing the inlet water flow rate. - Abstract: During a severe accident in a nuclear power plant, the degradation of fuel rods and melting of materials lead to the accumulation of core materials, which are commonly called “debris beds”. To stop core degradation and avoid reactor vessel rupture, the main accident management procedure consists in injecting water. In the case of a debris bed, the reflooding models used for loss-of-coolant accidents are not applicable. The IRSN has launched an experimental program on debris bed reflooding to develop new models and to validate severe accident codes. The PEARL facility has been designed to perform, for the first time, the reflooding of a large-scale debris bed (Ø540 mm, h = 500 mm and 500 kg of steel debris) in a pressurized containment. The bed is heated by means of an induction system. A specific instrumentation has been developed to measure the debris bed temperature, the pressure drop inside the bed and the steam flow rate during the reflooding. In this paper, the results of the first integral reflooding tests performed in the PEARL facility at atmospheric pressure up to 700 °C are presented. Focus is made on the quench front propagation and on the steam flow rate during reflooding. The effects of water injection flow rate, initial debris temperature and residual power are also discussed. Finally, an analytical model providing the steam flow rate and
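
    As a rough companion to the quasi-steady quench-front model mentioned in the highlights, the steam production rate can be estimated from an energy balance over the bed swept by the quench front. The numbers below are assumptions in the spirit of the PEARL geometry, not measured values, and subcooling of the injected water and residual power are neglected, so this is only an order-of-magnitude sketch.

```python
# Quasi-steady energy-balance estimate of steam production during debris-bed reflood.
# Illustrative assumed values; water subcooling and residual power are neglected.
import math

D = 0.5                        # m, bed diameter
eps = 0.4                      # bed porosity (assumed)
rho_s, c_s = 7900.0, 500.0     # stainless steel debris density / specific heat
T_init, T_sat = 700.0, 100.0   # deg C, initial debris and saturation temperature
h_fg = 2.26e6                  # J/kg, latent heat of vaporization at about 1 bar
v_qf = 1.0e-3                  # m/s, quench front velocity (assumed, mm/s order)

area = math.pi * D**2 / 4
stored_per_m = area * (1 - eps) * rho_s * c_s * (T_init - T_sat)   # J per metre of bed quenched
steam_rate = v_qf * stored_per_m / h_fg                             # kg/s of steam produced

print(f"steam production ~ {steam_rate * 1000:.0f} g/s")
```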

  4. A large scale cryopanel test arrangement for tritium pumping

    Energy Technology Data Exchange (ETDEWEB)

    Day, Chr. E-mail: christian.day@itp.fzk.de; Brennan, D.; Jensen, H.S.; Mack, A

    2003-09-01

    A cryosorption panel test arrangement will be installed in the cryogenic forevacuum system of the Active Gas Handling System (AGHS) at Joint European Torus (JET). The panel is of International Thermonuclear Experimental Reactor (ITER) relevant design in terms of geometry and dimension, coating and sorbent material. The central objective of this task is to study, for the first time in such an in-depth and parametric way, the interaction of tritium and tritiated gas mixtures with the panel, with respect to pumping performance, desorption characteristics and structural influences. This paper describes the motivation for this task and outlines the experimental aims and how they are planned to be achieved. It presents the actual status and gives a description of the test arrangement design. The paper demonstrates how the AGHS is used as a unique benchmark test bed for an ITER component to qualify ITER tritium technology.

  5. Testing gravity at large scales with H I intensity mapping

    Science.gov (United States)

    Pourtsidou, Alkistis

    2016-09-01

    We investigate the possibility of testing Einstein's general theory of relativity (GR) and the standard cosmological model via the EG statistic using neutral hydrogen (H I) intensity mapping. We generalize the Fourier space estimator for EG to include H I as a biased tracer of matter and forecast statistical errors using H I clustering and lensing surveys that can be performed in the near future, in combination with ongoing and forthcoming optical galaxy and cosmic microwave background (CMB) surveys. We find that fractional errors <1 per cent in the EG measurement can be achieved in a number of cases and compare the ability of various survey combinations to differentiate between GR and specific modified gravity models. Measuring EG with intensity mapping and the Square Kilometre Array can provide exquisite tests of gravity at cosmological scales.
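
    For orientation, the sketch below evaluates the standard GR/LCDM expectation for the statistic, E_G(z) = Omega_m,0 / f(z) with the growth rate approximated by f(z) = Omega_m(z)^0.55; it is not the paper's forecasting pipeline, and the parameter values are illustrative assumptions.

```python
# Minimal sketch (not the paper's forecast code): GR/LCDM prediction for E_G(z)
# in a flat background. Parameter values are illustrative assumptions.
import numpy as np

def E_G_gr(z, omega_m0=0.31, gamma=0.55):
    """GR/LCDM expectation E_G(z) = Omega_m,0 / f(z), with f = Omega_m(z)^gamma."""
    ez2 = omega_m0 * (1 + z) ** 3 + (1 - omega_m0)   # H^2/H0^2, flat LCDM
    omega_m_z = omega_m0 * (1 + z) ** 3 / ez2        # matter fraction at redshift z
    f = omega_m_z ** gamma                           # linear growth rate
    return omega_m0 / f

for z in (0.5, 1.0, 1.5):
    print(f"z = {z}: E_G ~ {E_G_gr(z):.3f}")
```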

  6. A large scale test of the gaming-enhancement hypothesis

    Directory of Open Access Journals (Sweden)

    Andrew K. Przybylski

    2016-11-01

    Full Text Available A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over the gaming-enhancement hypothesis. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
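
    As a toy illustration of the comparison logic (not the paper's actual analysis), the sketch below uses a BIC approximation to the Bayes factor to weigh a null model against a model with a linear gaming effect; the data are simulated under the null, and all variable names and values are assumptions.

```python
# Toy sketch (not the paper's analysis): BIC approximation to the Bayes factor
# comparing "no gaming effect" against "linear effect of weekly play" on
# simulated data with no true effect. All values are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 1847
hours_play = rng.gamma(2.0, 3.0, n)            # weekly gaming hours (simulated)
reasoning = rng.normal(100.0, 15.0, n)         # reasoning score, no true gaming effect

def bic_ols(y, X):
    """BIC of an ordinary least-squares fit (Gaussian likelihood, MLE variance)."""
    m = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ beta) ** 2)
    loglik = -0.5 * m * (np.log(2 * np.pi * sigma2) + 1.0)
    return X.shape[1] * np.log(m) - 2.0 * loglik

X_null = np.ones((n, 1))                               # intercept only
X_gaming = np.column_stack([np.ones(n), hours_play])   # intercept + gaming hours
bf01 = np.exp(0.5 * (bic_ols(reasoning, X_gaming) - bic_ols(reasoning, X_null)))
print(f"approximate Bayes factor in favour of the null: {bf01:.1f}")
```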

  7. Testing coupled dark energy with large scale structure observation

    CERN Document Server

    yang, Weiqiang

    2014-01-01

    The coupling between the dark sectors provides a new approach to mitigating the coincidence problem of the standard cosmological model. In this paper, dark energy is treated as a fluid with a constant equation of state, whose coupling with dark matter is proportional to the Hubble parameter and the dark energy density, that is, $Q=3H\xi_x\rho_x$. By combining the background energy transfer with a vanishing momentum transfer potential in the frame of either dark matter or dark energy, we derive the evolution equations for the density and velocity perturbations. Using joint data sets which include the cosmic microwave background radiation, baryon acoustic oscillations, type Ia supernovae, and redshift-space distortions, we perform a full Markov chain Monte Carlo likelihood analysis for the coupled model. The results show that the information provided by the $f\sigma_8(z)$ test significantly enhances the precision of the constraints on the cosmological parameters compared to the case where only geometric measurements are adopted. In part...
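
    The background side of such a coupled model can be sketched by integrating the two continuity equations with the interaction term Q = 3Hξ_xρ_x; the sign convention, parameter values and initial densities below are illustrative assumptions, not the paper's best-fit results.

```python
# Minimal sketch (not the paper's perturbation analysis): background continuity
# equations for the coupled model Q = 3 H xi_x rho_x, with energy flowing from
# dark energy to dark matter (a sign convention chosen here for illustration).
# Densities are in units of today's critical density; values are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

w_x, xi_x = -0.98, 0.03          # DE equation of state and coupling strength
rho_c0, rho_x0 = 0.27, 0.68      # present-day dark matter / dark energy fractions

def rhs(N, rho):                 # N = ln a, rho = [rho_c, rho_x]
    rho_c, rho_x = rho
    Q_over_H = 3.0 * xi_x * rho_x                    # Q/H = 3 xi_x rho_x
    drho_c = -3.0 * rho_c + Q_over_H                 # d rho_c / dN
    drho_x = -3.0 * (1.0 + w_x) * rho_x - Q_over_H   # d rho_x / dN
    return [drho_c, drho_x]

# Integrate backwards from today (N = 0) to z = 9 (N = -ln 10).
sol = solve_ivp(rhs, (0.0, -np.log(10.0)), [rho_c0, rho_x0], dense_output=True)
for z in (0, 1, 3, 9):
    rho_c, rho_x = sol.sol(-np.log(1.0 + z))
    print(f"z = {z}: rho_c = {rho_c:.3f}, rho_x = {rho_x:.3f}")
```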

  8. Large-Scale Tests of the DGP Model

    CERN Document Server

    Song, Y S; Hu, W; Song, Yong-Seon; Sawicki, Ignacy; Hu, Wayne

    2006-01-01

    The self-accelerating braneworld model (DGP) can be tested from measurements of the expansion history of the universe and the formation of structure. Current constraints on the expansion history from supernova luminosity distances, the CMB, and the Hubble constant exclude the simplest flat DGP model at about 3σ. The best-fit open DGP model is, however, only a marginally poorer fit to the data than flat LCDM. Its substantially different expansion history raises structure formation challenges for the model. A dark-energy model with the same expansion history would predict a highly significant discrepancy with the baryon oscillation measurement, due to the high Hubble constant required, and a large enhancement of CMB anisotropies at the lowest multipoles due to the ISW effect. For the DGP model to satisfy these constraints, new gravitational phenomena would have to appear at the non-linear and cross-over scales, respectively. A prediction of the DGP expansion history in a region where the phenomenology is well unde...
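
    For reference, the sketch below compares the expansion history E(z) = H(z)/H_0 of the flat, self-accelerating DGP branch with that of flat LCDM, using the crossover term fixed by flatness; the matter density is an assumed value, and this is not the authors' analysis code.

```python
# Minimal sketch (not the authors' analysis): flat self-accelerating DGP
# expansion history versus flat LCDM. Omega_m is an illustrative assumption.
import numpy as np

def E_dgp(z, omega_m=0.26):
    """E(z) = sqrt(Omega_rc) + sqrt(Omega_rc + Omega_m (1+z)^3), flat DGP branch."""
    omega_rc = ((1.0 - omega_m) / 2.0) ** 2      # crossover-scale term from flatness
    return np.sqrt(omega_rc) + np.sqrt(omega_rc + omega_m * (1.0 + z) ** 3)

def E_lcdm(z, omega_m=0.26):
    return np.sqrt(omega_m * (1.0 + z) ** 3 + 1.0 - omega_m)

for z in (0.0, 0.5, 1.0, 2.0):
    print(f"z = {z}: E_DGP = {E_dgp(z):.3f}, E_LCDM = {E_lcdm(z):.3f}")
```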

  9. Test of the CLAS12 RICH large scale prototype in the direct proximity focusing configuration

    CERN Document Server

    Anefalos Pereira, S; Barion, L.; Benmokhtar, F; Brooks, W; Cisbani, E; Contalbrigo, M; El Alaoui, A; Hafidi, K; Hoek, M; Kubarovsky, V; Lagamba, L; Lucherini, V; Malaguti, R; Mirazita, M; Montgomery, R A; Movsisyan, A; Musico, P; Orlandi, A; Orecchini, D; Pappalardo, L L; Perrino, R; Phillips, J; Pisano, S; Rossi, P; Squerzanti, S; Tomassini, S; Turisini, M; Viticchiè, A

    2016-01-01

    A large-area ring-imaging Cherenkov detector has been designed to provide clean hadron identification capability in the momentum range from 3 GeV/c up to 8 GeV/c for the CLAS12 experiments at the upgraded 12 GeV continuous electron beam accelerator facility of Jefferson Laboratory. The adopted solution foresees a novel hybrid optics design based on aerogel radiator, composite mirrors and highly packed and highly segmented photon detectors. Cherenkov light will either be imaged directly (forward tracks) or after two mirror reflections (large angle tracks). We report here the results of the tests of a large-scale prototype of the RICH detector performed with the hadron beam of the CERN T9 experimental hall for the direct detection configuration. The tests demonstrated that the proposed design provides the required pion-to-kaon rejection factor of 1:500 in the whole momentum range.
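
    To give a sense of the separation the detector must resolve, the sketch below evaluates the Cherenkov angle cos θ_c = 1/(nβ) for pions and kaons across 3-8 GeV/c in an aerogel radiator; the refractive index n = 1.05 is an assumed value, and this is only an illustration, not the prototype analysis.

```python
# Illustrative sketch (not the prototype analysis): Cherenkov angles for pions
# and kaons in an aerogel radiator with an assumed refractive index n = 1.05.
import numpy as np

M_PION, M_KAON = 0.1396, 0.4937          # GeV/c^2
N_AEROGEL = 1.05                         # assumed refractive index

def cherenkov_angle_mrad(p, mass, n=N_AEROGEL):
    """Cherenkov angle in mrad, or NaN below threshold."""
    beta = p / np.sqrt(p ** 2 + mass ** 2)
    cos_t = 1.0 / (n * beta)
    return np.nan if cos_t > 1.0 else 1e3 * np.arccos(cos_t)

for p in (3.0, 5.0, 8.0):
    t_pi = cherenkov_angle_mrad(p, M_PION)
    t_k = cherenkov_angle_mrad(p, M_KAON)
    print(f"p = {p} GeV/c: theta_pi = {t_pi:.1f} mrad, theta_K = {t_k:.1f} mrad")
```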

  10. Test of the CLAS12 RICH large-scale prototype in the direct proximity focusing configuration

    Energy Technology Data Exchange (ETDEWEB)

    Anefalos Pereira, S.; Lucherini, V.; Mirazita, M.; Orlandi, A.; Orecchini, D.; Pisano, S.; Tomassini, S.; Viticchie, A. [Laboratori Nazionali di Frascati, INFN, Frascati (Italy); Baltzell, N.; El Alaoui, A.; Hafidi, K. [Physics Division, Argonne National Laboratory, Argonne, IL (United States); Barion, L.; Contalbrigo, M.; Malaguti, R.; Movsisyan, A.; Pappalardo, L.L.; Squerzanti, S. [INFN, Ferrara (Italy); Benmokhtar, F. [Department of Physics, Duquesne University, Pittsburgh, PA (United States); Brooks, W. [Universidad Tecnica Federico Santa Maria, Valparaiso (Chile); Cisbani, E. [Gruppo Sanita and Istituto Superiore di Sanita, INFN, Rome (Italy); Hoek, M.; Phillips, J. [School of Physics and Astronomy, Kelvin Building, University of Glasgow, Scotland (United Kingdom); Kubarovsky, V. [Thomas Jefferson National Accelerator Facility, Jefferson Laboratory, Newport News, VA (United States); Lagamba, L.; Perrino, R. [INFN, Bari (Italy); Montgomery, R.A. [Laboratori Nazionali di Frascati, INFN, Frascati (Italy); School of Physics and Astronomy, Kelvin Building, University of Glasgow, Scotland (United Kingdom); Musico, P. [INFN, Genova (Italy); Rossi, P. [Laboratori Nazionali di Frascati, INFN, Frascati (Italy); Thomas Jefferson National Accelerator Facility, Jefferson Laboratory, Newport News, VA (United States); Turisini, M. [INFN, Ferrara (Italy); Universidad Tecnica Federico Santa Maria, Valparaiso (Chile)

    2016-02-15

    A large-area ring-imaging Cherenkov detector has been designed to provide clean hadron identification capability in the momentum range from 3 GeV/c up to 8 GeV/c for the CLAS12 experiment at the upgraded 12 GeV continuous electron beam accelerator facility of Jefferson Laboratory. The adopted solution foresees a novel hybrid optics design based on aerogel radiator, composite mirrors and highly packed and highly segmented photon detectors. Cherenkov light will either be imaged directly (forward tracks) or after two mirror reflections (large-angle tracks). We report here the results of the tests of a large-scale prototype of the RICH detector performed with the hadron beam of the CERN T9 experimental hall for the direct detection configuration. The tests demonstrated that the proposed design provides the required pion-to-kaon rejection factor of 1:500 in the whole momentum range. (orig.)

  11. Test of the CLAS12 Rich Large Scale Prototype in the Direct Proximity Focusing Configuration

    Energy Technology Data Exchange (ETDEWEB)

    Anefalos Pereira, S.; Baltzell, N.; Barion, L.; Benmokhtar, F.; Brooks, W. K.; Cisbani, E.; Contalbrigo, M.; El Alaoui, A.; Hafidi, K.; Hoek, M.

    2016-02-11

    A large-area ring-imaging Cherenkov detector has been designed to provide clean hadron identification capability in the momentum range from 3 GeV/c up to 8 GeV/c for the CLAS12 experiments at the upgraded 12 GeV continuous electron beam accelerator facility of Jefferson Laboratory. The adopted solution foresees a novel hybrid optics design based on aerogel radiator, composite mirrors and highly packed and highly segmented photon detectors. Cherenkov light will either be imaged directly (forward tracks) or after two mirror reflections (large angle tracks). We report here the results of the tests of a large-scale prototype of the RICH detector performed with the hadron beam of the CERN T9 experimental hall for the direct detection configuration. The tests demonstrated that the proposed design provides the required pion-to-kaon rejection factor of 1:500 in the whole momentum range.

  12. High temperature thermal behaviour modeling of large-scale fused silica optics for laser facility

    Institute of Scientific and Technical Information of China (English)

    Yu Jing-Xia; He Shao-Bo; Xiang Xia; Yuan Xiao-Dong; Zheng Wan-Guo; Lü Hai-Bing; Zu Xiao-Tao

    2012-01-01

    High temperature annealing is often used for the stress control of optical materials. However, weight and viscosity at high temperature may destroy the surface morphology, especially for the large-scale, thin and heavy optics used for large laser facilities. It is necessary to understand the thermal behaviour and design proper support systems for large-scale optics at high temperature. In this work, three support systems for fused silica optics are designed and simulated with the finite element method. After the analysis of the thermal behaviours of the different support systems, their advantages and disadvantages can be revealed. The results show that the support with the optical surface vertical is optimal, because both pollution and deformation of the optics could be well controlled during annealing at high temperature. The annealing process of the optics irradiated by a CO2 laser is also simulated. It can be concluded that high temperature annealing can effectively reduce the residual stress. However, the effects of annealing on the surface morphology of the optics are complex. Annealing creep is closely related to the residual stress and strain distribution. In regions with large residual stress, the creep is large and probably increases the deformation gradient, which may affect the laser beam propagation.

  13. Application of Tomo-PIV in a large-scale supersonic jet flow facility

    Science.gov (United States)

    Wernet, Mark P.

    2016-09-01

    Particle image velocimetry (PIV) has been used extensively at NASA GRC over the last 15 years to build a benchmark data set of hot and cold jet flow measurements in an effort to understand acoustic noise sources in high-speed jets. Identifying the noise sources in high-speed jets is critical for ultimately modifying the nozzle hardware design/operation and therefore reducing the jet noise. Tomographic PIV (Tomo-PIV) is an innovative approach for acquiring and extracting velocity information across extended volumes of a flow field, enabling the computation of additional fluid mechanical properties not typically available using traditional PIV techniques. The objective of this work was to develop and implement the Tomo-PIV measurement capability and apply it in a large-scale outdoor test facility, where seeding multiple flow streams and operating in the presence of daylight present formidable challenges. The newly developed Tomo-PIV measurement capability was applied in both a subsonic M 0.9 flow and an under-expanded M 1.4 heated jet flow field. Measurements were also obtained using traditional two-component (2C) PIV and stereo PIV in the M 0.9 flow field for comparison and validation of the Tomo-PIV results. In the case of the M 1.4 flow, only 2C PIV was applied to allow a comparison with the Tomo-PIV measurement. The Tomo-PIV fields of view covered 180 × 180 × 10 mm, and the reconstruction domains were 3500 × 3500 × 200 voxels. These Tomo-PIV measurements yielded all three components of vorticity across entire planes for the first time in heated supersonic jet flows and provided the first full 3D reconstruction of the Mach disk and oblique shock intersections inside the barrel shocks. Measuring all three components of vorticity across multiple planes in the flow potentially reduces the number of measurement configurations (streamwise and cross-stream PIV) required to fully characterize the mixing-enhanced nozzle flows routinely studied in
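
    As a small illustration of what volumetric data enable, the sketch below computes all three vorticity components from a gridded (u, v, w) field by finite differences; the synthetic solid-body-like flow and grid dimensions are assumptions used only to exercise the calculation, not NASA's processing code.

```python
# Minimal sketch (not NASA's processing code): vorticity from a volumetric
# velocity field, as Tomo-PIV provides. The flow field here is synthetic.
import numpy as np

x = y = z = np.linspace(-0.09, 0.09, 45)                # m, ~180 mm extents
dx = x[1] - x[0]
X, Y, Z = np.meshgrid(x, y, z, indexing="ij")

omega0 = 50.0                                           # 1/s, assumed rotation rate
u, v, w = -omega0 * Y, omega0 * X, np.zeros_like(X)     # solid-body-like vortex

du = np.gradient(u, dx, edge_order=2)
dv = np.gradient(v, dx, edge_order=2)
dw = np.gradient(w, dx, edge_order=2)
# np.gradient returns derivatives along axes (x, y, z) in that order.
omega_x = dw[1] - dv[2]        # dw/dy - dv/dz
omega_y = du[2] - dw[0]        # du/dz - dw/dx
omega_z = dv[0] - du[1]        # dv/dx - du/dy

print("mean omega_z =", omega_z.mean(), "(expected 2*omega0 =", 2 * omega0, ")")
```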

  14. A study of residence time distribution using radiotracer technique in the large scale plant facility

    Science.gov (United States)

    Wetchagarun, S.; Tippayakul, C.; Petchrak, A.; Sukrod, K.; Khoonkamjorn, P.

    2017-06-01

    As the demand for troubleshooting of large industrial plants increases, radiotracer techniques, which can provide fast, online and effective detection of plant problems, have been continually developed. One promising application of radiotracers for troubleshooting in a process plant is the analysis of the Residence Time Distribution (RTD). In this paper, a study of RTD in a large-scale plant facility using a radiotracer technique is presented. The objective of this work is to gain experience in RTD analysis using the radiotracer technique in a “larger than laboratory” scale plant setup, which is comparable to a real industrial application. The experiment was carried out at the sedimentation tank in the water treatment facility of the Thailand Institute of Nuclear Technology (Public Organization). Br-82 was selected for this work due to its chemical properties, its suitable half-life and its on-site availability. NH4Br in the form of an aqueous solution was injected into the system as the radiotracer. Six NaI detectors were placed along the pipelines and at the tank in order to determine the RTD of the system. The RTD and the Mean Residence Time (MRT) of the tank were analysed and calculated from the measured data. The experience and knowledge attained from this study are important for extending this technique to industrial facilities in the future.
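
    The data reduction behind an RTD measurement can be sketched in a few lines: normalize the detector response C(t) to obtain E(t) and take its first moment for the MRT; the synthetic response curve below is an assumption standing in for the NaI count-rate data.

```python
# Minimal sketch (not the facility's analysis code): RTD E(t) and mean
# residence time from a pulse-injection response. C(t) here is synthetic.
import numpy as np

t = np.linspace(0.0, 600.0, 601)                 # time after injection, s
tau = 150.0                                      # assumed tank time constant, s
c = (t / tau ** 2) * np.exp(-t / tau)            # synthetic tracer response

e = c / np.trapz(c, t)                           # RTD: E(t) = C(t) / integral of C dt
mrt = np.trapz(t * e, t)                         # first moment of E(t)
var = np.trapz((t - mrt) ** 2 * e, t)            # spread of the distribution

print(f"MRT = {mrt:.1f} s")
print(f"Variance = {var:.1f} s^2")
```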

  15. Feasibility Assessment of Using Power Plant Waste Heat in Large Scale Horticulture Facility Energy Supply Systems

    Directory of Open Access Journals (Sweden)

    Min Gyung Yu

    2016-02-01

    Full Text Available Recently, the Korean government has been carrying out projects to construct several large-scale horticulture facilities. However, it is difficult for an energy supply to operate stably and economically with only a conventional fossil fuel boiler system. For this reason, several unused energy sources have become attractive, and it was found that power plant waste heat has the greatest potential for application in this scenario. In this study, we performed a feasibility assessment of power plant waste heat as an energy source for horticulture facilities. As a result, it was confirmed that there is sufficient energy potential for waste heat to supply energy to the assumed areas. In Dangjin, a horticultural area of 500 ha could be constructed by utilizing 20% of the energy reserves. In Hadong, a horticulture facility of 260 ha could be set up using 7.4% of the energy reserves. In Youngdong, an assumed area of 65 ha could be built utilizing about 19% of the energy reserves. Furthermore, the payback period was calculated in order to evaluate the economic feasibility compared with a conventional system. The initial investment costs can be recovered through the approximately 83% reduction in the annual operating costs.
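
    The simple payback logic implied here can be sketched as follows; all monetary figures are illustrative assumptions, and only the roughly 83% operating-cost reduction is taken from the abstract.

```python
# Toy sketch of a simple payback calculation (not the paper's economic model).
# Extra capital cost of the waste-heat system is recovered through reduced
# annual operating cost versus a fossil-fuel boiler. Figures are assumptions.
def simple_payback(extra_capital_cost, annual_cost_conventional, saving_fraction):
    annual_saving = annual_cost_conventional * saving_fraction
    return extra_capital_cost / annual_saving

# e.g. an assumed 5.0e9 KRW extra investment, an assumed 1.2e9 KRW/yr
# conventional fuel bill, and the ~83% operating-cost reduction from the abstract
years = simple_payback(5.0e9, 1.2e9, 0.83)
print(f"Simple payback ~ {years:.1f} years")
```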

  16. Characterization of microbial communities in exhaust air treatment systems of large-scale pig housing facilities.

    Science.gov (United States)

    Haneke, J; Lee, N M; Gaul, T W; Van den Weghe, H F A

    2010-01-01

    Exhaust air treatment has gained importance as an essential factor in intensive livestock areas due to rising emissions to the environment. Wet filter walls of multi-stage exhaust air treatment systems precipitate gaseous ammonia and dust particles from the exhaust air into the washing water. Microbial communities in the biomass that developed in the washing water of five large-scale exhaust air treatment units at pig housing facilities were investigated by fluorescence in situ hybridization (FISH) and 16S rDNA sequence analyses. No "standard" nitrifying bacteria were found in the washing water. Instead, mainly α-Proteobacteria, aggregating β- and γ-Proteobacteria, a large number of Actinobacteria, as well as individual Planctomycetales and Crenarchaeota were detected after more than twelve months of operation. The main Proteobacteria species present were affiliated with the families Alcaligenaceae, Comamonadaceae and Xanthomonadaceae. Furthermore, we investigated the consumption of inorganic nitrogen compounds in the washing water of one exhaust air treatment unit during a fattening period with and without pH control. Maintaining the pH at 6.0 resulted in a ca. fivefold higher ammonium concentration and a ca. fourfold lower concentration of oxidized nitrogen compounds after the fattening period was finished.

  17. Simulation of sandstone degradation using large-scale slake durability index testing device

    Directory of Open Access Journals (Sweden)

    Chaowarin Walsri

    2012-11-01

    Full Text Available Large-scale slake durability index tests have been performed on Khok Kruat (KK), Phu Kradung (PK) and Phra Wihan (PW) sandstones. A rotating drum with a diameter of 64 cm and a length of 40 cm was fabricated to accommodate ten rock fragments with a nominal size of 10 cm. Both large-scale and standard tests were performed under dry and wet conditions. The large-scale test yields rock deterioration twice as great as the small-scale test, primarily due to the greater energy imposed on the rock fragments. The weight losses under wet conditions are 12%, 8%, and 3% greater than under dry conditions for the KK, PK, and PW sandstones, respectively. After 10 test cycles the water absorption values for the PW, KK and PK sandstones are 12%, 3%, and 2%, respectively. Rock degradation under the rapid cooling-heating cycles in the laboratory is about 18 times faster than under field conditions in the northeast of Thailand.
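
    The index itself is straightforward to compute: the retained dry weight after each cycle divided by the initial dry weight; the sketch below shows the bookkeeping with assumed weights, not the study's data.

```python
# Illustrative sketch (not the study's data): slake durability index after each
# cycle, Id(n) = 100 * W_n / W_0, and the corresponding cumulative weight loss.
initial_weight = 5000.0                                  # g, ten ~10 cm fragments (assumed)
retained = [4920.0, 4855.0, 4790.0, 4730.0, 4675.0]      # g after cycles 1..5 (assumed)

for n, w in enumerate(retained, start=1):
    sdi = 100.0 * w / initial_weight
    loss = 100.0 - sdi
    print(f"cycle {n}: Id = {sdi:.1f} %, cumulative weight loss = {loss:.1f} %")
```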

  18. LUCI: A facility at DUSEL for large-scale experimental study of geologic carbon sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Peters, C. A.; Dobson, P.F.; Oldenburg, C.M.; Wang, J. S. Y.; Onstott, T.C.; Scherer, G.W.; Freifeld, B.M.; Ramakrishnan, T.S.; Stabinski, E.L.; Liang, K.; Verma, S.

    2010-10-01

    LUCI, the Laboratory for Underground CO{sub 2} Investigations, is an experimental facility being planned for the DUSEL underground laboratory in South Dakota, USA. It is designed to study vertical flow of CO{sub 2} in porous media over length scales representative of leakage scenarios in geologic carbon sequestration. The plan for LUCI is a set of three vertical column pressure vessels, each of which is {approx}500 m long and {approx}1 m in diameter. The vessels will be filled with brine and sand or sedimentary rock. Each vessel will have an inner column to simulate a well for deployment of down-hole logging tools. The experiments are configured to simulate CO{sub 2} leakage by releasing CO{sub 2} into the bottoms of the columns. The scale of the LUCI facility will permit measurements to study CO{sub 2} flow over pressure and temperature variations that span supercritical to subcritical gas conditions. It will enable observation or inference of a variety of relevant processes such as buoyancy-driven flow in porous media, Joule-Thomson cooling, thermal exchange, viscous fingering, residual trapping, and CO{sub 2} dissolution. Experiments are also planned for reactive flow of CO{sub 2} and acidified brines in caprock sediments and well cements, and for CO{sub 2}-enhanced methanogenesis in organic-rich shales. A comprehensive suite of geophysical logging instruments will be deployed to monitor experimental conditions as well as provide data to quantify vertical resolution of sensor technologies. The experimental observations from LUCI will generate fundamental new understanding of the processes governing CO{sub 2} trapping and vertical migration, and will provide valuable data to calibrate and validate large-scale model simulations.

  19. Testing, development and demonstration of large scale solar district heating systems

    DEFF Research Database (Denmark)

    Furbo, Simon; Fan, Jianhua; Perers, Bengt

    2015-01-01

    In 2013-2014 the project “Testing, development and demonstration of large scale solar district heating systems” was carried out within the Sino-Danish Renewable Energy Development Programme, the so-called RED programme, jointly developed by the Chinese and Danish governments. In the project, Danish know-how on solar heating plants and solar heating test technology has been transferred from Denmark to China, large solar heating systems have been promoted in China, test capabilities for solar collectors and large-scale solar heating systems have been improved in China, and Danish-Chinese cooperation

  20. Large-scale decontamination and decommissioning technology demonstration project at a former uranium metal production facility

    Energy Technology Data Exchange (ETDEWEB)

    Martineit, R.A.; Borgman, T.D.; Peters, M.S.; Stebbins, L.L. [and others

    1997-03-05

    The Department of Energy's (DOE) Office of Science and Technology Decontamination and Decommissioning (D&D) Focus Area, led by the Federal Energy Technology Center, has been charged with improving upon baseline D&D technologies with the goal of demonstrating and validating more cost-effective and safer technologies to characterize, deactivate, survey, decontaminate, dismantle, and dispose of surplus structures, buildings, and their contents at DOE sites. The D&D Focus Area's approach to verifying the benefits of the improved D&D technologies is to use them in large-scale technology demonstration (LSTD) projects at several DOE sites. The Fernald Environmental Management Project (FEMP) was selected to host one of the first three LSTDs awarded by the D&D Focus Area. The FEMP is a DOE facility near Cincinnati, Ohio, that was formerly engaged in the production of high quality uranium metal. The FEMP is a Superfund site which has completed its RI/FS process and is currently undergoing environmental restoration. With the FEMP's selection to host an LSTD, the FEMP was immediately faced with some challenges. The primary challenge was that this LSTD was to be integrated into the FEMP's Plant 1 D&D Project, an ongoing D&D project for which a firm fixed-price contract had been issued to the D&D contractor. Thus, interferences with the baseline D&D project could have significant financial implications. Other challenges included defining and selecting meaningful technology demonstrations, finding and selecting technology providers, and integrating the technologies into the baseline D&D project. To date, twelve technologies have been selected, and six have been demonstrated. The technology demonstrations have yielded a high proportion of "winners." All demonstrated technologies will be evaluated for incorporation into the FEMP's baseline D&D strategy.

  1. Using Multimedia in Large-Scale Computer-Based Testing Programs.

    Science.gov (United States)

    Bennett, R. E.; Goodman, M.; Hessinger, J.; Kahn, H.; Ligget, J.; Marshall, G.; Zack, J.

    1999-01-01

    Discusses the use of multimedia in large-scale computer-based testing programs to measure problem solving and related cognitive constructs more effectively. Considers the incorporation of dynamic stimuli such as audio, video, and animation, and gives examples in history, physical education, and the sciences. (Author/LRW)

  2. Large-Scale Pressurizable Fire Test Facility-Fire 1

    Science.gov (United States)

    1982-12-30


  3. Maps4Science - National Roadmap for Large-Scale Research Facilities 2011 (NWO Application form)

    NARCIS (Netherlands)

    Van Oosterom, P.J.M.; Van der Wal, T.; De By, R.A.

    2011-01-01

    The Netherlands is historically known as one of the world's best-measured countries. It is continuing this tradition today with unequalled new datasets, such as the nationwide large-scale topographic map, our unique digital height map (nationwide coverage; ten very accurate 3D points for every Dutch m2)

  4. Large scale calcium channel gene rearrangements in episodic ataxia and hemiplegic migraine: implications for diagnostic testing.

    Science.gov (United States)

    Labrum, R W; Rajakulendran, S; Graves, T D; Eunson, L H; Bevan, R; Sweeney, M G; Hammans, S R; Tubridy, N; Britton, T; Carr, L J; Ostergaard, J R; Kennedy, C R; Al-Memar, A; Kullmann, D M; Schorge, S; Temple, K; Davis, M B; Hanna, M G

    2009-11-01

    Episodic ataxia type 2 (EA2) and familial hemiplegic migraine type 1 (FHM1) are autosomal dominant disorders characterised by paroxysmal ataxia and migraine, respectively. Point mutations in CACNA1A, which encodes the neuronal P/Q-type calcium channel, have been detected in many cases of EA2 and FHM1. The genetic basis of typical cases without CACNA1A point mutations is not fully known. Standard DNA sequencing methods may miss large scale genetic rearrangements such as deletions and duplications. The authors investigated whether large scale genetic rearrangements in CACNA1A can cause EA2 and FHM1. The authors used multiplex ligation dependent probe amplification (MLPA) to screen for intragenic CACNA1A rearrangements. The authors identified five previously unreported large scale deletions in CACNA1A in seven families with episodic ataxia and in one case with hemiplegic migraine. One of the deletions (exon 6 of CACNA1A) segregated with episodic ataxia in a four generation family with eight affected individuals previously mapped to 19p13. In addition, the authors identified the first pathogenic duplication in CACNA1A in an index case with isolated episodic diplopia without ataxia and in a first degree relative with episodic ataxia. Large scale deletions and duplications can cause CACNA1A associated channelopathies. Direct DNA sequencing alone is not sufficient as a diagnostic screening test.
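
    The dosage analysis underlying MLPA screening can be sketched as follows: each probe's peak area is normalized to reference probes and compared with a control sample, and dosage quotients near 0.5 or 1.5 suggest heterozygous deletion or duplication; the thresholds and example values below are common conventions and assumptions, not the authors' pipeline.

```python
# Illustrative sketch (not the authors' pipeline): flagging exon deletions and
# duplications from MLPA peak areas via normalised dosage quotients. Thresholds
# (~0.7 and ~1.3) are a common convention; all values below are assumptions.
import numpy as np

def dosage_quotients(sample_peaks, control_peaks, ref_probes):
    """sample_peaks/control_peaks: dict probe -> peak area; ref_probes: reference probes."""
    s_ref = np.mean([sample_peaks[r] for r in ref_probes])
    c_ref = np.mean([control_peaks[r] for r in ref_probes])
    return {p: (sample_peaks[p] / s_ref) / (control_peaks[p] / c_ref)
            for p in sample_peaks}

sample = {"CACNA1A_ex6": 480.0, "CACNA1A_ex20": 1010.0, "ref1": 1000.0, "ref2": 990.0}
control = {"CACNA1A_ex6": 1000.0, "CACNA1A_ex20": 1005.0, "ref1": 1000.0, "ref2": 1000.0}

for probe, dq in dosage_quotients(sample, control, ["ref1", "ref2"]).items():
    call = "deletion" if dq < 0.7 else "duplication" if dq > 1.3 else "normal"
    print(f"{probe}: dosage quotient = {dq:.2f} -> {call}")
```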

  5. Examining item-position effects in large-scale assessment using the Linear Logistic Test Model

    Directory of Open Access Journals (Sweden)

    CHRISTINE HOHENSINN

    2008-09-01

    Full Text Available When administering large-scale assessments, item-position effects are of particular importance because the applied test designs very often contain several test booklets with the same items presented at different test positions. Establishing such position effects would be most critical; it would mean that the estimated item parameters do not depend exclusively on the items’ difficulties due to content but also on their presentation positions. As a consequence, item calibration would be biased. By means of the linear logistic test model (LLTM), item-position effects can be tested. In this paper, the results of a simulation study demonstrating how the LLTM is indeed able to detect certain position effects in the framework of a large-scale assessment are presented first. Second, empirical item-position effects of a specific large-scale competence assessment in mathematics (4th grade students) are analyzed using the LLTM. The results indicate that a small fatigue effect seems to take place. The most important consequence of this paper is that it is advisable to run pertinent simulation studies before analyzing empirical data; the reason is that, for the given example, the suggested likelihood-ratio test neither holds the nominal type-I risk nor qualifies as “robust”, and furthermore occasionally shows very low power.

  6. Violent wave impacts on vertical and inclined walls: Large scale model tests

    DEFF Research Database (Denmark)

    Obhrai, C.; Bullock, G.; Wolters, G.

    2005-01-01

    New data are presented from large-scale model tests in which combined measurements of wave pressure and aeration have been made on the front of a vertical and an inclined wall. The shape of the breaking wave was found to have a significant effect on the distribution of the wave impact pressures on the wall. The characteristics of violent wave impacts are discussed and related to the impulse on the structure.

  7. Testing the Big Bang: Light elements, neutrinos, dark matter and large-scale structure

    Science.gov (United States)

    Schramm, David N.

    1991-01-01

    Several experimental and observational tests of the standard cosmological model are examined. In particular, a detailed discussion is presented regarding: (1) nucleosynthesis, the light element abundances, and neutrino counting; (2) the dark matter problems; and (3) the formation of galaxies and large-scale structure. Comments are made on the possible implications of the recent solar neutrino experimental results for cosmology. An appendix briefly discusses the 17 keV thing and the cosmological and astrophysical constraints on it.

  8. Large-scale Samples Irradiation Facility at the IBR-2 Reactor in Dubna

    CERN Document Server

    Cheplakov, A P; Golubyh, S M; Kaskanov, G Ya; Kulagin, E N; Kukhtin, V V; Luschikov, V I; Shabalin, E P; León-Florián, E; Leroy, C

    1998-01-01

    The irradiation facility at the beam line no.3 of the IBR-2 reactor of the Frank Laboratory for Neutron Physics is described. The facility is aimed at irradiation studies of various objects with area up to 800 cm$^2$ both at cryogenic and ambient temperatures. The energy spectra of neutrons are reconstructed by the method of threshold detector activation. The neutron fluence and $\\gamma$ dose rates are measured by means of alanine and thermoluminescent dosimeters. The boron carbide and lead filters or $(n/\\gamma)$ converter provide beams of different ratio of doses induced by neutrons and photons. For the lead filter, the flux of fast neutrons with energy more than 0.1 MeV is $1.4 \\cdot 10^{10}$ \\fln and the neutron dose is about 96\\% of the total radiation dose. For the $(n/\\gamma)$ converter, the $\\gamma$ dose rate is $\\sim$500 Gy h$^{-1}$ which is about 85\\% of the total dose. The radiation hardness tests of GaAs electronics and materials for the ATLAS detector to be put into operation at the Large Hadron ...

  9. Active self-testing noise measurement sensors for large-scale environmental sensor networks.

    Science.gov (United States)

    Domínguez, Federico; Cuong, Nguyen The; Reinoso, Felipe; Touhafi, Abdellah; Steenhaut, Kris

    2013-12-13

    Large-scale noise pollution sensor networks consist of hundreds of spatially distributed microphones that measure environmental noise. These networks provide historical and real-time environmental data to citizens and decision makers and are therefore a key technology to steer environmental policy. However, the high cost of certified environmental microphone sensors renders large-scale environmental networks prohibitively expensive. Several environmental network projects have started using off-the-shelf low-cost microphone sensors to reduce their costs, but these sensors have higher failure rates and produce lower quality data. To offset this disadvantage, we developed a low-cost noise sensor that actively checks its condition and, indirectly, the integrity of the data it produces. The main design concept is to embed a 13 mm speaker in the noise sensor casing and, by regularly scheduling a frequency sweep, estimate the evolution of the microphone's frequency response over time. This paper presents our noise sensor's hardware and software design together with the results of a test deployment in a large-scale environmental network in Belgium. Our middle-range-value sensor (around €50) effectively detected all experienced malfunctions, in laboratory tests and outdoor deployments, with a few false positives. Future improvements could further lower the cost of our sensor below €10.
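
    The self-test idea can be sketched as follows: drive a logarithmic sweep through the embedded speaker, estimate the microphone's transfer function, and flag the sensor if the response drifts from a stored baseline; the recorded signal is simulated here, and the drift threshold and other values are assumptions, not the sensor firmware.

```python
# Illustrative sketch (not the sensor firmware): frequency-sweep self-test.
# The "measured" microphone output is simulated with an assumed gain and noise.
import numpy as np
from scipy import signal

fs = 16000                                        # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)
sweep = signal.chirp(t, f0=100, f1=8000, t1=2.0, method="logarithmic")

rng = np.random.default_rng(0)
measured = 0.8 * sweep + 0.01 * rng.standard_normal(t.size)   # simulated mic output

f, pxx = signal.welch(sweep, fs=fs, nperseg=2048)
_, pxy = signal.csd(sweep, measured, fs=fs, nperseg=2048)
response_db = 20 * np.log10(np.abs(pxy) / pxx)    # transfer-function magnitude, dB

baseline_db = 0.0                                 # assumed stored factory response
band = (f > 200) & (f < 6000)
drift = np.max(np.abs(response_db[band] - baseline_db))
print("self-test:", "FAIL" if drift > 3.0 else "OK", f"(max drift {drift:.1f} dB)")
```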

  10. Active Self-Testing Noise Measurement Sensors for Large-Scale Environmental Sensor Networks

    Directory of Open Access Journals (Sweden)

    Federico Domínguez

    2013-12-01

    Full Text Available Large-scale noise pollution sensor networks consist of hundreds of spatially distributed microphones that measure environmental noise. These networks provide historical and real-time environmental data to citizens and decision makers and are therefore a key technology to steer environmental policy. However, the high cost of certified environmental microphone sensors renders large-scale environmental networks prohibitively expensive. Several environmental network projects have started using off-the-shelf low-cost microphone sensors to reduce their costs, but these sensors have higher failure rates and produce lower quality data. To offset this disadvantage, we developed a low-cost noise sensor that actively checks its condition and, indirectly, the integrity of the data it produces. The main design concept is to embed a 13 mm speaker in the noise sensor casing and, by regularly scheduling a frequency sweep, estimate the evolution of the microphone’s frequency response over time. This paper presents our noise sensor’s hardware and software design together with the results of a test deployment in a large-scale environmental network in Belgium. Our middle-range-value sensor (around €50) effectively detected all experienced malfunctions, in laboratory tests and outdoor deployments, with a few false positives. Future improvements could further lower the cost of our sensor below €10.

  11. The Observatorio Astrofisico de Javalambre. A planned facility for large scale surveys

    Science.gov (United States)

    Moles, M.; Cenarro, A. J.; Cristóbal-Hornillos, D.; Gruel, N.; Marín Franch, A.; Valdivielso, L.; Viironen, K.

    2011-11-01

    All-sky surveys play a fundamental role in the development of Astrophysics. The need for large-scale surveys comes from two basic motivations: one is to make an inventory of sources as complete as possible and allow for their classification into families. The other is to attack problems demanding the sampling of large volumes to give a detectable signal. New challenges, in particular in the domain of Cosmology, are giving impulse to a new kind of large-scale survey, combining area coverage, depth and spectral information accurate enough to recover the redshift and spectral energy distribution (SED) of the detected objects. New instruments are needed to satisfy the requirements of those large-scale surveys, in particular large Etendue telescopes. The Observatorio Astrofisico de Javalambre, OAJ, project includes a telescope of 2.5 m aperture, with a wide field of view, 3 degrees in diameter, and excellent image quality over the whole field. Taking into account that it is going to be fully devoted to carrying out surveys, it will be the highest effective Etendue telescope to date. The project is completed with a smaller, wide-field auxiliary telescope. The Observatory is being built at Pico del Buitre, Sierra de Javalambre, Teruel, a site with excellent seeing and low sky surface brightness. The institution in charge of the Observatory is the Centro de Estudios de Fisica del Cosmos de Aragon, CEFCA, a new center created in Teruel for the operation and scientific exploitation of the Javalambre Observatory. CEFCA will also be in charge of data management and archiving. The data will be made accessible to the community. The first planned scientific project is a multi-narrow-band photometric survey covering 8,000 square degrees, designed to produce precise SEDs and photometric redshifts accurate at the 0.3% level. A total of 42 band-pass filters of 100-120 Å width, covering most of the optical spectral range, will be used. In this sense it is the development, at a much

  12. Testing the big bang: Light elements, neutrinos, dark matter and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N. (Chicago Univ., IL (United States) Fermi National Accelerator Lab., Batavia, IL (United States))

    1991-06-01

    In this series of lectures, several experimental and observational tests of the standard cosmological model are examined. In particular, detailed discussion is presented regarding nucleosynthesis, the light element abundances and neutrino counting; the dark matter problems; and the formation of galaxies and large-scale structure. Comments will also be made on the possible implications of the recent solar neutrino experimental results for cosmology. An appendix briefly discusses the "17 keV thing" and the cosmological and astrophysical constraints on it. 126 refs., 8 figs., 2 tabs.

  13. Boolean networks using the chi-square test for inferring large-scale gene regulatory networks

    Directory of Open Access Journals (Sweden)

    Lee Jae K

    2007-02-01

    Full Text Available Abstract Background Boolean network (BN) modeling is a commonly used method for constructing gene regulatory networks from time series microarray data. However, its major drawback is that its computation time is very high, often making it impractical to construct large-scale gene networks. We propose a variable selection method that not only reduces BN computation times significantly but also obtains optimal network constructions, by using chi-square statistics to test independence in contingency tables. Results Both the computation time and the accuracy of the network structures estimated by the proposed method are compared with those of the original BN methods on simulated and real yeast cell cycle microarray gene expression data sets. Our results reveal that the proposed chi-square testing (CST)-based BN method significantly improves the computation time, while its ability to identify all the true network mechanisms is effectively the same as that of full-search BN methods. The proposed BN algorithm is approximately 70.8 and 7.6 times faster than the original BN algorithm when the error sizes of the Best-Fit Extension problem are 0 and 1, respectively. Further, the false positive error rate of the proposed CST-based BN algorithm tends to be less than that of the original BN. Conclusion The CST-based BN method dramatically improves the computation time of the original BN algorithm. Therefore, it can efficiently infer large-scale gene regulatory network mechanisms.
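
    The variable-selection step can be sketched as follows: for each candidate regulator, build the 2x2 contingency table between its binarized value at time t and the target gene at time t+1, and keep only candidates whose chi-square test is significant; the data below are random placeholders, and this is not the authors' Best-Fit Extension solver.

```python
# Minimal sketch of the variable-selection idea (not the authors' solver):
# chi-square independence tests pre-select candidate regulators for a target
# gene before the Boolean network search. Data are random placeholders.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(1)
n_genes, n_times = 20, 60
expr = rng.integers(0, 2, size=(n_genes, n_times))       # binarized time series
expr[5, 1:] = expr[0, :-1]                                # gene 5 follows gene 0

def candidate_regulators(expr, target, alpha=0.01):
    keep = []
    y = expr[target, 1:]                                  # target at time t+1
    for g in range(expr.shape[0]):
        if g == target:
            continue
        x = expr[g, :-1]                                  # candidate at time t
        table = np.array([[np.sum((x == a) & (y == b)) for b in (0, 1)]
                          for a in (0, 1)])
        _, p, _, _ = chi2_contingency(table)
        if p < alpha:
            keep.append((g, p))
    return keep

print(candidate_regulators(expr, target=5))
```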

  14. Facile synthesis of large-scale Ag nanosheet-assembled films with sub-10 nm gaps as highly active and homogeneous SERS substrates

    Science.gov (United States)

    Li, Zhongbo; Meng, Guowen; Liang, Ting; Zhang, Zhuo; Zhu, Xiaoguang

    2013-01-01

    We report a facile, low-cost synthetic approach to large-scale Ag nanosheet-assembled films with a high density of uniformly distributed sub-10 nm gaps between adjacent nanosheets on Si substrates via galvanic cell reactions. The distribution density of Ag nanosheets on the substrates could be tailored by tuning the duration of the HF etching and the concentration of citric acid in the solution. Furthermore, in conjunction with conventional photolithography, highly uniform patterned Ag nanosheet-assembled structures with different morphologies can be achieved on Si substrates via galvanic-cell-induced growth. Using rhodamine 6G as a standard test molecule, the large-scale Ag nanosheet-assembled films exhibit a highly active and homogeneous surface-enhanced Raman scattering (SERS) effect and also show promising potential as reliable SERS substrates for the rapid detection of trace polychlorinated biphenyls (PCBs).

  15. "Second order" exploratory data analysis of the large scale gas injection test (lasgit) dataset, focused around known gas migration events

    NARCIS (Netherlands)

    Bennett, D.; Cuss, R.; Vardon, P.J.; Harrington, J.; Sedighi, M.; Shaw, R.; Thomas, H.

    2013-01-01

    Within large-scale experimental datasets a wealth of small scale information can typically be found. An example of such an experiment is the Large Scale Gas Injection Test (Lasgit). A toolkit has been developed to facilitate the investigation of the small scale or ‘second order’ detail contained wit

  16. Large-scale laser-microwave synchronization for attosecond photon science facilities

    Energy Technology Data Exchange (ETDEWEB)

    Shafak, Kemal

    2017-04-15

    Low-noise transfer of time and frequency standards over large distances provides high temporal resolution for ambitious scientific explorations such as sensitive imaging of astronomical objects using multi-telescope arrays, comparison of distant optical clocks or gravitational-wave detection using large laser interferometers. In particular, rapidly expanding photon science facilities such as X-ray free-electron lasers (FELs) and attoscience centers have the most challenging synchronization requirements of sub-fs timing precision to generate ultrashort X-ray pulses for the benefit of creating super-microscopes with sub-atomic spatiotemporal resolution. The critical task in these facilities is to synchronize various pulsed lasers and microwave sources across multi-kilometer distances as required for seeded FELs and attosecond pump-probe experiments. So far, there has been no timing distribution system meeting this strict requirement. Therefore, insufficient temporal precision provided by the current synchronization systems hinders the development of attosecond hard X-ray photon science facilities. The aim of this thesis is to devise a timing distribution system satisfying the most challenging synchronization requirements in science mandated by the next-generation photon science facilities. Using the pulsed-optical timing distribution approach, attosecond timing precision is realized by thoroughly investigating and eliminating the remaining noise sources in the synchronization system. First, optical and microwave timing detection schemes are further developed to support long-term stable, attosecond-precision measurements. Second, the feasibility of the master laser to support a kilometer-scale timing network with attosecond precision is examined by experimentally characterizing its free-running timing jitter and improving its long-term frequency stability with a sophisticated environmental insulation. Third, nonlinear pulse propagation inside optical fibers is studied

  17. The new large-scale international facility for antiproton and ion research in Europe, FAIR

    Energy Technology Data Exchange (ETDEWEB)

    Rosner, Guenther [Facility for Antiprotons and Ion Research (FAIR), Darmstadt (Germany)

    2012-07-01

    Full text: FAIR is currently the largest project in nuclear and particle physics worldwide, with investment costs of 1.6B euro in its first phase. It has been founded by Finland, France, Germany, India, Poland, Romania, Russia, Slovenia and Sweden in Oct. 2010. The facility will provide the international scientific community with a unique and technically innovative particle accelerator system to perform cutting-edge research in the sciences concerned with the basic structure of matter in: nuclear and particle physics, atomic and anti-matter physics, high density plasma physics, and applications in condensed matter physics, biology and the bio-medical sciences. The work horse of FAIR will be a 1.1 km circumference double ring of rapidly cycling 100 and 300 Tm synchrotrons, which will be used to produce high intensity secondary beams of anti-protons and very short-lived radioactive ions. A subsequent suite of cooler and storage rings will deliver anti-proton and heavy-ion beams of unprecedented quality regarding intensity and resolution. Large experimental facilities are presently being prototyped by the APPA, CBM, NuSTAR and PANDA Collaborations to be used by a global community of more than 3000 scientists from 2018. (author)

  18. A safe, effective, and facility compatible cleaning in place procedure for affinity resin in large-scale monoclonal antibody purification.

    Science.gov (United States)

    Wang, Lu; Dembecki, Jill; Jaffe, Neil E; O'Mara, Brian W; Cai, Hui; Sparks, Colleen N; Zhang, Jian; Laino, Sarah G; Russell, Reb J; Wang, Michelle

    2013-09-20

    Cleaning-in-place (CIP) for column chromatography plays an important role in therapeutic protein production. A robust and efficient CIP procedure ensures product quality, improves column life time and reduces the cost of the purification processes, particularly for those using expensive affinity resins, such as MabSelect protein A resin. Cleaning efficiency, resin compatibility, and facility compatibility are the three major aspects to consider in CIP process design. Cleaning MabSelect resin with 50mM sodium hydroxide (NaOH) along with 1M sodium chloride is one of the most popular cleaning procedures used in biopharmaceutical industries. However, high concentration sodium chloride is a leading cause of corrosion in the stainless steel containers used in large scale manufacture. Corroded containers may potentially introduce metal contaminants into purified drug products. Therefore, it is challenging to apply this cleaning procedure into commercial manufacturing due to facility compatibility and drug safety concerns. This paper reports a safe, effective and environmental and facility-friendly cleaning procedure that is suitable for large scale affinity chromatography. An alternative salt (sodium sulfate) is used to prevent the stainless steel corrosion caused by sodium chloride. Sodium hydroxide and salt concentrations were optimized using a high throughput screening approach to achieve the best combination of facility compatibility, cleaning efficiency and resin stability. Additionally, benzyl alcohol is applied to achieve more effective microbial control. Based on the findings, the recommended optimum cleaning strategy is cleaning MabSelect resin with 25 mM NaOH, 0.25 M Na2SO4 and 1% benzyl alcohol solution every cycle, followed by a more stringent cleaning using 50 mM NaOH with 0.25 M Na2SO4 and 1% benzyl alcohol at the end of each manufacturing campaign. A resin life cycle study using the MabSelect affinity resin demonstrates that the new cleaning strategy

  19. Testing the Minimum Variance Method for Estimating Large Scale Velocity Moments

    CERN Document Server

    Agarwal, Shankar; Watkins, Richard

    2012-01-01

    The estimation and analysis of large-scale bulk flow moments of peculiar velocity surveys is complicated by non-spherical survey geometry, the non-uniform sampling of the matter velocity field by the survey objects, and the typically large measurement errors of the measured line-of-sight velocities. Previously we have developed an optimal "minimum variance" (MV) weighting scheme for using peculiar velocity data to estimate bulk flow moments for idealized dense and isotropic surveys with Gaussian radial distributions that avoids many of these complications. These moments are designed to be easy to interpret and are comparable between surveys. In this paper, we test the robustness of our MV estimators using numerical simulations. Using MV weights, we estimate the underlying bulk flow moments for DEEP, SFI++ and COMPOSITE mock catalogues extracted from the LasDamas and the Horizon Run numerical simulations and compare these estimates to the true moments calculated directly from the simulation boxes. We show that...
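
    For context, the sketch below implements the simpler maximum-likelihood bulk-flow estimator that the MV weights are designed to improve on, applied to a random mock catalogue; the assumed bulk flow, errors and sample size are placeholders, not values from the paper.

```python
# Sketch of the simpler maximum-likelihood bulk-flow estimator (not the MV
# weighting scheme itself): given line-of-sight velocities S_n with errors
# sigma_n along unit vectors rhat_n, solve A U = b with
# A_ij = sum rhat_i rhat_j / sigma^2 and b_i = sum S rhat_i / sigma^2.
# The mock catalogue below is random; all values are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n = 500
rhat = rng.normal(size=(n, 3))
rhat /= np.linalg.norm(rhat, axis=1, keepdims=True)       # random sky directions
u_true = np.array([300.0, -100.0, 50.0])                  # km/s, assumed bulk flow
sigma = np.full(n, 250.0)                                 # km/s measurement errors
s = rhat @ u_true + rng.normal(0.0, sigma)                # observed LOS velocities

w = 1.0 / sigma ** 2
A = (rhat * w[:, None]).T @ rhat                          # 3x3 normal matrix
b = rhat.T @ (w * s)
u_est = np.linalg.solve(A, b)
cov = np.linalg.inv(A)                                    # error covariance of U

print("estimated bulk flow [km/s]:", np.round(u_est, 1))
print("1-sigma errors      [km/s]:", np.round(np.sqrt(np.diag(cov)), 1))
```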

  20. Microfluidic very large-scale integration for biochips: Technology, testing and fault-tolerant design

    DEFF Research Database (Denmark)

    Araci, Ismail Emre; Pop, Paul; Chakrabarty, Krishnendu

    2015-01-01

    Microfluidic biochips are replacing the conventional biochemical analyzers by integrating all the necessary functions for biochemical analysis using microfluidics. Biochips are used in many application areas, such as in vitro diagnostics, drug discovery, biotech and ecology. The focus of this paper is on continuous-flow biochips, where the basic building block is a microvalve. By combining these microvalves, more complex units such as mixers, switches and multiplexers can be built, hence the name of the technology, “microfluidic Very Large-Scale Integration” (mVLSI). A roadblock in the deployment of microfluidic biochips is their low reliability and lack of test techniques to screen defective devices before they are used for biochemical analysis. Defective chips lead to repetition of experiments, which is undesirable due to high reagent cost and limited availability of samples. This paper...

  1. Large scale steam flow test: Pressure drop data and calculated pressure loss coefficients

    Energy Technology Data Exchange (ETDEWEB)

    Meadows, J.B.; Spears, J.R.; Feder, A.R.; Moore, B.P.; Young, C.E. [Bettis Atomic Power Lab., Pittsburgh, PA (United States)

    1993-12-01

    This report presents the results of large-scale steam flow testing, at 3 million to 7 million lbs/hr, conducted at approximate steam qualities of 25, 45, 70 and 100 percent (dry, saturated). It is concluded from the test data that reasonable estimates of piping component pressure loss coefficients for single-phase flow in complex piping geometries can be calculated using the available engineering literature. This includes the effects of nearby upstream and downstream components, compressibility, and internal obstructions, such as splitters and ladder rungs, on individual piping components. Despite expected uncertainties in the data resulting from the complexity of the piping geometry and two-phase flow, the test data support the conclusion that the predicted dry-steam K-factors are accurate and provide useful insight into the effect of entrained liquid on the flow resistance. The K-factors calculated from the wet-steam test data were compared to two-phase K-factors based on the Martinelli-Nelson pressure drop correlations. This comparison supports the concept of a two-phase multiplier for estimating the resistance of piping with liquid entrained in the flow. The test data in general appear to be reasonably consistent with the shape of a curve based on the Martinelli-Nelson correlation over the tested range of steam quality.
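
    The data reduction described here can be sketched with the usual loss-coefficient definition ΔP = Kρv²/2 and a two-phase multiplier applied to the dry-steam result; the flow rates, density, area and multiplier below are illustrative assumptions, not values from the report.

```python
# Illustrative sketch (not the report's procedure or values): back out a loss
# coefficient K from a measured pressure drop, then estimate a two-phase
# pressure drop with a simple multiplier applied to the dry-steam result.
def loss_coefficient(dp_pa, mass_flow_kg_s, rho_kg_m3, area_m2):
    v = mass_flow_kg_s / (rho_kg_m3 * area_m2)        # mean velocity
    return dp_pa / (0.5 * rho_kg_m3 * v ** 2)

def two_phase_dp(k, mass_flow_kg_s, rho_gas, area_m2, phi2):
    """phi2: two-phase multiplier (e.g. from a Martinelli-Nelson-type correlation)."""
    v = mass_flow_kg_s / (rho_gas * area_m2)
    return phi2 * k * 0.5 * rho_gas * v ** 2

# All numbers below are assumptions for illustration only.
k = loss_coefficient(dp_pa=35e3, mass_flow_kg_s=500.0, rho_kg_m3=13.0, area_m2=0.5)
print(f"dry-steam K ~ {k:.2f}")
print(f"wet-steam dP ~ {two_phase_dp(k, 500.0, 13.0, 0.5, phi2=1.8) / 1e3:.1f} kPa")
```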

  2. Large scale gas injection test (Lasgit) performed at the Aespoe Hard Rock Laboratory. Summary report 2008

    Energy Technology Data Exchange (ETDEWEB)

    Cuss, R.J.; Harrington, J.F.; Noy, D.J. (British Geological Survey (United Kingdom))

    2010-02-15

    This report describes the set-up, operation and observations from the first 1,385 days (3.8 years) of the large scale gas injection test (Lasgit) experiment conducted at the Aespoe Hard Rock Laboratory. During this time the bentonite buffer has been artificially hydrated, giving new insight into the evolution of the buffer. After 2 years (849 days) of artificial hydration, one canister filter was selected for a series of hydraulic and gas tests, a stage that lasted 268 days. The results from the gas test showed that the full-scale bentonite buffer behaved in a similar way to previous laboratory experiments. This confirms the up-scaling of laboratory observations and adds considerable information on the stress responses throughout the deposition hole. During the gas testing stage, the buffer continued to be artificially hydrated. Hydraulic results, from controlled and uncontrolled events, show that the buffer continues to mature and has yet to reach full maturation. Lasgit has yielded high quality data relating to the hydration of the bentonite and the evolution in hydrogeological properties adjacent to the deposition hole. The initial hydraulic and gas injection tests confirm the correct working of all control and data acquisition systems. Lasgit has been in successful operation for in excess of 1,385 days

  3. Using landscape ecology to test hypotheses about large-scale abundance patterns in migratory birds

    Science.gov (United States)

    Flather, C.H.; Sauer, J.R.

    1996-01-01

    The hypothesis that Neotropical migrant birds may be undergoing widespread declines due to land use activities on the breeding grounds has been examined primarily by synthesizing results from local studies. Growing concern for the cumulative influence of land use activities on ecological systems has heightened the need for large-scale studies to complement what has been observed at local scales. We investigated possible landscape effects on Neotropical migrant bird populations for the eastern United States by linking two large-scale inventories designed to monitor breeding-bird abundances and land use patterns. The null hypothesis of no relation between landscape structure and Neotropical migrant abundance was tested by correlating measures of landscape structure with bird abundance, while controlling for the geographic distance among samples. Neotropical migrants as a group were more 'sensitive' to landscape structure than either temperate migrants or permanent residents. Neotropical migrants tended to be more abundant in landscapes with a greater proportion of forest and wetland habitats, fewer edge habitats, large forest patches, and with forest habitats well dispersed throughout the scene. Permanent residents showed few correlations with landscape structure and temperate migrants were associated with habitat diversity and edge attributes rather than with the amount, size, and dispersion of forest habitats. The association between Neotropical migrant abundance and forest fragmentation differed among physiographic strata, suggesting that land-scape context affects observed relations between bird abundance and landscape structure. Finally, associations between landscape structure and temporal trends in Neotropical migrant abundance were negatively correlated with forest habitats. These results suggest that extrapolation of patterns observed in some landscapes is not likely to hold regionally, and that conservation policies must consider the variation in landscape

  4. Testing Early Universe Theories Using Large Scale Structure: Moving Beyond Phenomenology

    Science.gov (United States)

    Shandera, Sarah

    models of the very early universe, while being computationally efficient. We will run N-body simulations of these models, exploring the signals of non-Gaussianity in the initial conditions in the power spectrum, bispectrum, and mass function of dark matter halos and comparing with analytic prescriptions for calculating those quantities. Mathematically, a non-Gaussian probability distribution is specified by its full hierarchy of moments, so implementing generic non-Gaussian initial conditions is a difficult task. We will use the structure of moments that result from early universe theories to ensure that the simulations capture key physical features of the models. The output of the simulations will allow us to determine how best to test for, confirm, or rule out the patterns expected from inflation and alternative theories. This project will increase our understanding of the possible origins of Large Scale Cosmic Structures and Dark Matter (8), and has applications to Dark Energy and the Cosmic Microwave Background (9) through complementary predictions for scalar and tensor modes in the CMB. The project directly contributes to answering the Astrophysics Science Question "How did the universe originate and evolve to produce the galaxies, stars, and planets we see today?" and Astrophysics Science Area Objective 1: Understand the origin and destiny of the universe, and the nature of black holes, dark energy, dark matter, and gravity. The project is timely, since data from large surveys are already available and will arrive at an accelerated pace in the next several years. In addition, the Planck satellite will provide either evidence of non-Gaussianity or significantly improved constraints by 2013 and this project will develop our ability to check the consistency of that result with LSS observations.
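
    For context (this is the standard template in the literature rather than a detail taken from the abstract), the most commonly simulated form of primordial non-Gaussianity is the local type, in which the potential is a quadratic function of a Gaussian field, and the leading non-Gaussian statistic is the bispectrum:

      \Phi(\mathbf{x}) = \phi(\mathbf{x}) + f_{\mathrm{NL}} \left[ \phi^{2}(\mathbf{x}) - \langle \phi^{2} \rangle \right], \qquad \phi \ \text{Gaussian},

      \langle \Phi(\mathbf{k}_{1}) \Phi(\mathbf{k}_{2}) \Phi(\mathbf{k}_{3}) \rangle = (2\pi)^{3} \, \delta_{D}(\mathbf{k}_{1} + \mathbf{k}_{2} + \mathbf{k}_{3}) \, B_{\Phi}(k_{1}, k_{2}, k_{3}).

    Different early-universe models predict different amplitudes and shapes for B_Phi and for the higher moments, which is what the simulations described above are designed to discriminate.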

  5. Facile fabrication of large-scale stable superhydrophobic surfaces with carbon sphere films by burning rapeseed oil

    Science.gov (United States)

    Qu, Mengnan; He, Jinmei; Cao, Biyun

    2010-10-01

    Stable anti-corrosive superhydrophobic surfaces were successfully prepared with carbon nanosphere films by depositing the soot of burning rapeseed oil. The method is extremely cheap, facile and time-saving, and avoids special equipment, special reagents and complex process control. The method is suitable for the large-scale preparation of superhydrophobic surfaces, and the substrate can be easily changed. The as-prepared surfaces showed stable superhydrophobicity and anti-corrosive properties even in many corrosive solutions, such as acidic or basic solutions over a wide pH range. The as-prepared superhydrophobic surfaces were carefully characterized by field emission scanning electron microscopy and transmission electron microscopy to confirm the synergistic binary geometric structures at the micro- and nanometer scales. These results may open a new avenue in superhydrophobic paint research using these easily obtained carbon nanospheres.

  6. Static Loads Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Provides the capability to perform large-scale structural loads testing on spacecraft and other structures. Results from these tests can be used to verify...

  7. Large-scale ELISA testing of Spanish red deer for paratuberculosis.

    Science.gov (United States)

    Reyes-García, R; Pérez-de-la-Lastra, J M; Vicente, J; Ruiz-Fons, F; Garrido, J M; Gortázar, C

    2008-07-15

    A role of wildlife species as paratuberculosis reservoirs is strongly suspected based on field and molecular epidemiologic evidence. This paper presents the first large-scale data on enzyme-linked immunosorbent assay (ELISA) detection of antibodies against Mycobacterium avium subspecies paratuberculosis (MAP) in red deer from Spain, and tests the effect of host and environmental risk factors on antibody levels. A total of 257 out of 852 serum samples tested positive, yielding a total seroprevalence of 30.16% (95% CI 27.08-33.24). Sampling locality, presence of cattle and increasing age explained the variation in the individual ELISA optical density (OD) results. Data presented in this study strongly suggest that Spanish red deer are exposed to MAP. While contact with cattle was statistically significant, some wild populations showed the highest positivity to the ELISA. The results support the need for a careful study of MAP prevalence based on culture and molecular tools to clarify whether deer play a significant role as paratuberculosis reservoirs for livestock, and whether deer paratuberculosis affects hunting harvest, trophy quality, or wild animal welfare in Spain.
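
    As a quick arithmetic check of the reported interval (assuming the usual normal-approximation binomial confidence interval, which appears consistent with the figures quoted above):

      \hat{p} = \frac{257}{852} = 0.3016, \qquad \hat{p} \pm 1.96 \sqrt{\frac{\hat{p}(1-\hat{p})}{n}} = 0.3016 \pm 1.96 \sqrt{\frac{0.3016 \times 0.6984}{852}} = 0.3016 \pm 0.0308,

    which reproduces the stated 95% CI of roughly 27.1-33.2%.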

  8. Improving large-scale testing capability by modifying the 40- by 80-foot wind tunnel

    Science.gov (United States)

    Mort, K. W.; Soderman, P. T.; Eckert, W. T.

    1977-01-01

    Interagency studies conducted during the last several years have indicated the need to improve full-scale testing capabilities. The studies showed that the most effective trade between test capability and facility cost was provided by repowering the existing Ames Research Center 40- by 80-foot wind tunnel to increase the maximum speed from about 100 m/s (200 knots) to about 150 m/s (300 knots) and by adding a new 24- by 37-m (80- by 120-ft) test section powered for about a 50-m/s (100-knot) maximum speed. This paper reviews the design of the facility, a few of its test capabilities, and some of its unique features.

  9. A Polar Rover for Large-scale Scientific Surveys: Design, Implementation and Field Test Results

    Directory of Open Access Journals (Sweden)

    Yuqing He

    2015-10-01

    Full Text Available Exploration of polar regions is of great importance to scientific research. Unfortunately, due to the harsh environment, most of the regions on the Antarctic continent are still unreachable for humankind. Therefore, in 2011, the Chinese National Antarctic Research Expedition (CHINARE) launched a project to design a rover to conduct large-scale scientific surveys on the Antarctic. The main challenges for the rover are twofold: one is the mobility, i.e., how to make a rover that could survive the harsh environment and safely move on the uneven, icy and snowy terrain; the other is the autonomy, in that the robot should be able to move at a relatively high speed with little or no human intervention so that it can explore a large region in a limited time interval under the communication constraints. In this paper, the corresponding techniques, especially the polar rover’s design and autonomous navigation algorithms, are introduced in detail. Subsequently, an experimental report of the field tests on the Antarctic is given to show some preliminary evaluation of the rover. Finally, experiences and existing challenging problems are summarized.

  11. Vibration Mitigation without Dissipative Devices: First Large-Scale Testing of a State Switched Inducer

    Directory of Open Access Journals (Sweden)

    Daniel Tirelli

    2014-01-01

    Full Text Available A new passive device for mitigating cable vibrations is proposed and its efficiency is assessed on 45-meter long taut cables through a series of free and forced vibration tests. It consists of a unilateral spring attached perpendicularly to the cable near the anchorage. Because of its ability to change the cable dynamic behaviour through intermittent activation, the device has been called a state switched inducer (SSI). The cable behaviour is shown to be deeply modified by the SSI: the forced vibration response is anharmonic and substantially reduced in amplitude, whereas the free vibration decay is greatly accelerated through a beating phenomenon. The vibration mitigation effect is mainly due to the activation and coupling of various vibration modes, as evidenced in the response spectra of the equipped cable. This first large-scale experimental campaign shows that the SSI outperforms classical passive devices, thus paving the way to a new kind of low-cost vibration mitigation system which does not rely on dissipation.

  12. Large-Scale Pumping Test Recommendations for the 200-ZP-1 Operable Unit

    Energy Technology Data Exchange (ETDEWEB)

    Spane, Frank A.

    2010-09-08

    CH2M Hill Plateau Remediation Company (CHPRC) is currently assessing aquifer characterization needs to optimize pump-and-treat remedial strategies (e.g., extraction well pumping rates, pumping schedule/design) in the 200-ZP-1 operable unit (OU), and in particular for the immediate area of the 241 TX-TY Tank Farm. Specifically, CHPRC is focusing on hydrologic characterization opportunities that may be available for newly constructed and planned ZP-1 extraction wells. These new extraction wells will be used to further refine the 3-dimensional subsurface contaminant distribution within this area and will be used in concert with other existing pump-and-treat wells to remediate the existing carbon tetrachloride contaminant plume. Currently, 14 extraction wells are actively used in the Interim Record of Decision ZP-1 pump-and-treat system for the purpose of remediating the existing carbon tetrachloride contamination in groundwater within this general area. As many as 20 new extraction wells and 17 injection wells may be installed to support final pump-and-treat operations within the OU area. It should be noted that although the report specifically refers to the 200-ZP-1 OU, the large-scale test recommendations are also applicable to the adjacent 200-UP-1 OU area. This is because of the similar hydrogeologic conditions exhibited within these two adjoining OU locations.

  13. Testing cosmological models with large-scale power modulation using microwave background polarization observations

    CERN Document Server

    Bunn, Emory F; Zheng, Haoxuan

    2016-01-01

    We examine the degree to which observations of large-scale cosmic microwave background (CMB) polarization can shed light on the puzzling large-scale power modulation in maps of CMB anisotropy. We consider a phenomenological model in which the observed anomaly is caused by modulation of large-scale primordial curvature perturbations, and calculate Fisher information and error forecasts for future polarization data, constrained by the existing CMB anisotropy data. Because a significant fraction of the available information is contained in correlations with the anomalous temperature data, it is essential to account for these constraints. We also present a systematic approach to finding a set of normal modes that maximize the available information, generalizing the well-known Karhunen-Loeve transformation to take account of the constraints from the temperature data. A polarization map covering at least $\\sim 60\\%$ of the sky should be able to provide a $3\\sigma$ detection of modulation at the level favored by the...
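
    Such forecasts are normally built on the Fisher-matrix formalism for Gaussian-distributed data; for reference (these are the generic expressions, not results specific to this paper), with model parameters theta, data mean mu(theta) and covariance C(theta),

      F_{ij} = \frac{\partial \boldsymbol{\mu}^{T}}{\partial \theta_{i}} C^{-1} \frac{\partial \boldsymbol{\mu}}{\partial \theta_{j}} + \frac{1}{2} \operatorname{Tr}\!\left[ C^{-1} \frac{\partial C}{\partial \theta_{i}} C^{-1} \frac{\partial C}{\partial \theta_{j}} \right], \qquad \sigma(\theta_{i}) \geq \sqrt{(F^{-1})_{ii}},

    so the forecast uncertainty on the modulation amplitude is read off from the inverse Fisher matrix once the constraints from the existing temperature data are imposed.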

  14. FutureGen 2.0 Oxy-combustion Large Scale Test – Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Kenison, LaVesta [URS, Pittsburgh, PA (United States); Flanigan, Thomas [URS, Pittsburgh, PA (United States); Hagerty, Gregg [URS, Pittsburgh, PA (United States); Gorrie, James [Air Liquide, Kennesaw, GA (United States); Leclerc, Mathieu [Air Liquide, Kennesaw, GA (United States); Lockwood, Frederick [Air Liquide, Kennesaw, GA (United States); Falla, Lyle [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Macinnis, Jim [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Fedak, Mathew [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Yakle, Jeff [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Williford, Mark [Futuregen Industrial Alliance, Inc., Morgan County, IL (United States); Wood, Paul [Futuregen Industrial Alliance, Inc., Morgan County, IL (United States)

    2016-04-01

    The primary objectives of the FutureGen 2.0 CO2 Oxy-Combustion Large Scale Test Project were to site, permit, design, construct, and commission an oxy-combustion boiler, gas quality control system, air separation unit, and CO2 compression and purification unit, together with the necessary supporting and interconnection utilities. The project was to demonstrate at commercial scale (168 MWe gross) the capability to cleanly produce electricity through coal combustion at a retrofitted, existing coal-fired power plant, thereby resulting in near-zero emissions of all commonly regulated air emissions, as well as 90% CO2 capture in steady-state operations. The project was to be fully integrated in terms of project management, capacity, capabilities, technical scope, cost, and schedule with the companion FutureGen 2.0 CO2 Pipeline and Storage Project, a separate but complementary project whose objective was to safely transport, permanently store and monitor the CO2 captured by the Oxy-combustion Power Plant Project. The FutureGen 2.0 Oxy-Combustion Large Scale Test Project successfully achieved all technical objectives inclusive of front-end-engineering and design, and advanced design required to accurately estimate and contract for the construction, commissioning, and start-up of a commercial-scale "ready to build" power plant using oxy-combustion technology, including full integration with the companion CO2 Pipeline and Storage project. Ultimately the project did not proceed to construction due to insufficient time to complete necessary EPC contract negotiations and commercial financing prior to expiration of federal co-funding, which triggered a DOE decision to close out its participation in the project. Through the work that was completed, valuable technical, commercial, and programmatic lessons were learned. This project has significantly advanced the development of near-zero emission technology and will

  15. The quest for better quality-of-life - learning from large-scale shaking table tests

    Science.gov (United States)

    Nakashima, M.; Sato, E.; Nagae, T.; Kunio, F.; Takahito, I.

    2010-12-01

    Earthquake engineering has its origins in the practice of “learning from actual earthquakes and earthquake damages.” That is, we recognize serious problems by witnessing the actual damage to our structures, and then we develop and apply engineering solutions to solve these problems. This tradition in earthquake engineering, i.e., “learning from actual damage,” was an obvious engineering response to earthquakes and arose naturally as a practice in a civil and building engineering discipline that traditionally places more emphasis on experience than do other engineering disciplines. But with the rapid progress of urbanization, as society becomes denser, and as the many components that form our society interact with increasing complexity, the potential damage with which earthquakes threaten the society also increases. In such an era, the approach of ”learning from actual earthquake damages” becomes unacceptably dangerous and expensive. Among the practical alternatives to the old practice is to “learn from quasi-actual earthquake damages.” One tool for experiencing earthquake damages without attendant catastrophe is the large shaking table. E-Defense, the largest one we have, was developed in Japan after the 1995 Hyogoken-Nanbu (Kobe) earthquake. Since its inauguration in 2005, E-Defense has conducted over forty full-scale or large-scale shaking table tests, applied to a variety of structural systems. The tests supply detailed data on actual behavior and collapse of the tested structures, offering the earthquake engineering community opportunities to experience and assess the actual seismic performance of the structures, and to help society prepare for earthquakes. Notably, the data were obtained without having to wait for the aftermaths of actual earthquakes. Earthquake engineering has always been about life safety, but in recent years maintaining the quality of life has also become a critical issue. Quality-of-life concerns include nonstructural

  16. Analysing Item Position Effects due to Test Booklet Design within Large-Scale Assessment

    Science.gov (United States)

    Hohensinn, Christine; Kubinger, Klaus D.; Reif, Manuel; Schleicher, Eva; Khorramdel, Lale

    2011-01-01

    For large-scale assessments, usually booklet designs administering the same item at different positions within a booklet are used. Therefore, the occurrence of position effects influencing the difficulty of the item is a crucial issue. Not taking learning or fatigue effects into account would result in a bias of estimated item difficulty. The…

  17. Digital Image Correlation Techniques Applied to Large Scale Rocket Engine Testing

    Science.gov (United States)

    Gradl, Paul R.

    2016-01-01

    Rocket engine hot-fire ground testing is necessary to understand component performance, reliability and engine system interactions during development. The J-2X upper stage engine completed a series of developmental hot-fire tests that derived performance of the engine and components, validated analytical models and provided the necessary data to identify where design changes, process improvements and technology development were needed. The J-2X development engines were heavily instrumented to provide the data necessary to support these activities which enabled the team to investigate any anomalies experienced during the test program. This paper describes the development of an optical digital image correlation technique to augment the data provided by traditional strain gauges which are prone to debonding at elevated temperatures and limited to localized measurements. The feasibility of this optical measurement system was demonstrated during full scale hot-fire testing of J-2X, during which a digital image correlation system, incorporating a pair of high speed cameras to measure three-dimensional, real-time displacements and strains was installed and operated under the extreme environments present on the test stand. The camera and facility setup, pre-test calibrations, data collection, hot-fire test data collection and post-test analysis and results are presented in this paper.
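
    The core idea of digital image correlation — tracking the displacement of a small speckle-pattern subset between a reference and a deformed image by maximizing a normalized cross-correlation score — can be sketched as follows. This is a generic, integer-pixel illustration only, not the stereo, sub-pixel system used on the test stand; the function names and the synthetic data are hypothetical.

      # Generic digital image correlation sketch: integer-pixel subset tracking by
      # zero-normalized cross-correlation (ZNCC). Synthetic data, illustrative only.
      import numpy as np

      def zncc(a, b):
          """Zero-normalized cross-correlation of two equal-size image subsets."""
          a = a - a.mean()
          b = b - b.mean()
          denom = np.sqrt((a * a).sum() * (b * b).sum())
          return (a * b).sum() / denom if denom > 0 else 0.0

      def track_subset(ref, cur, top, left, size=21, search=10):
          """Integer-pixel displacement (du, dv) of one square subset between frames."""
          template = ref[top:top + size, left:left + size]
          best_score, best_uv = -2.0, (0, 0)
          for dv in range(-search, search + 1):
              for du in range(-search, search + 1):
                  r, c = top + dv, left + du
                  if r < 0 or c < 0 or r + size > cur.shape[0] or c + size > cur.shape[1]:
                      continue
                  score = zncc(template, cur[r:r + size, c:c + size])
                  if score > best_score:
                      best_score, best_uv = score, (du, dv)
          return best_uv, best_score

      # Synthetic demo: shift a random speckle pattern by a known amount and recover it.
      rng = np.random.default_rng(1)
      ref = rng.random((120, 120))
      cur = np.roll(np.roll(ref, -2, axis=0), 3, axis=1)   # pattern displaced by (du, dv) = (3, -2)
      print(track_subset(ref, cur, top=50, left=50))        # expect ((3, -2), ~1.0)

    In a full DIC system the matching is refined to sub-pixel accuracy and, with a calibrated stereo camera pair, converted into three-dimensional displacement and strain fields.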

  18. Energetic and Economic Assessment of Pipe Network Effects on Unused Energy Source System Performance in Large-Scale Horticulture Facilities

    Directory of Open Access Journals (Sweden)

    Jae Ho Lee

    2015-04-01

    Full Text Available As the use of fossil fuel has increased in recent times with rapid industrial development, not only in construction but also in agriculture, the problems of heating costs and global warming are getting worse. Therefore, the introduction of more reliable and environmentally friendly alternative energy sources has become urgent, and the same trend is found in large-scale horticulture facilities. In this study, among many alternative energy sources, we investigated the reserves and potential of various unused energy sources which have substantial potential but are currently wasted due to limitations in their utilization. This study investigated the effect of the distance between the greenhouse and the actual heat source by taking into account the heat transfer taking place inside the pipe network, and considered CO2 emissions and economic aspects to determine the optimal heat source. Payback period analysis against initial investment cost shows that a heat pump based on a power plant’s waste heat has the shortest payback period (7.69 years) at a distance of 0 km, whereas a heat pump based on geothermal heat has the shortest payback period (10.17 years) at a distance of 5 km, indicating that heat pumps utilizing geothermal heat are the most effective option once the heat transfer inside the pipe network between the greenhouse and the actual heat source is taken into account.

  19. Large-scale Demonstration and Deployment Project for D&D of Fuel Storage Canals and Associated Facilities at INEEL

    Energy Technology Data Exchange (ETDEWEB)

    Whitmill, Larry Joseph

    2001-12-01

    The Department of Energy (DOE) Office of Science and Technology (OST), Deactivation and Decommissioning Focus Area (DDFA), sponsored a Large Scale Demonstration and Deployment Project (LSDDP) at the Idaho National Engineering and Environmental Laboratory (INEEL) under management of the DOE National Energy Technology Laboratory (NETL). The INEEL LSDDP is one of several LSDDPs sponsored by DOE. The LSDDP process integrates field demonstrations into actual decontamination and decommissioning (D&D) operations by comparing new or improved technologies against existing baseline technologies using a side-by-side comparison. The goals are (a) to identify technologies that are cheaper, safer, faster, and cleaner (produce less waste), and (b) to incorporate those technologies into D&D baseline operations. The INEEL LSDDP reviewed more than 300 technologies, screened 141, and demonstrated 17. These 17 technologies have been deployed a total of 70 times at facilities other than those where the technology was demonstrated, and 10 have become baseline at the INEEL. Fifteen INEEL D&D needs have been modified or removed from the Needs Management System as a direct result of using these new technologies. Conservatively, the ten-year projected cost savings at the INEEL resulting from use of the technologies demonstrated in this INEEL LSDDP exceed $39 million.

  20. Are Teacher Perspectives Useful? Incorporating EFL Teacher Feedback in the Development of a Large-Scale International English Test

    Science.gov (United States)

    So, Youngsoon

    2014-01-01

    This article presents a case study incorporating English teachers' perspectives into the development of a large-scale international English assessment, the recently developed "TOEFL Junior"® Comprehensive test. It discusses how stakeholder feedback gathered during test development supports the validity argument for score…

  1. Recent Developments in Language Assessment and the Case of Four Large-Scale Tests of ESOL Ability

    Science.gov (United States)

    Stoynoff, Stephen

    2009-01-01

    This review article surveys recent developments and validation activities related to four large-scale tests of L2 English ability: the iBT TOEFL, the IELTS, the FCE, and the TOEIC. In addition to describing recent changes to these tests, the paper reports on validation activities that were conducted on the measures. The results of this research…

  3. Estimation Source Parameters of Large-Scale Chemical Surface Explosions and Recent Underground Nuclear Tests

    Science.gov (United States)

    Gitterman, Y.; Kim, S.; Hofstetter, R.

    2013-12-01

    Large-scale surface explosions were conducted by the Geophysical Institute of Israel at Sayarim Military Range (SMR), Negev desert: 82 tons of strong HE explosives in August 2009, and 10 and 100 tons of ANFO explosives in January 2011. The main goal was to provide strong controlled sources in different wind conditions for calibration of IMS infrasound stations. Numerous dense observations of blast waves were provided by high-pressure, acoustic and seismic sensors at near-source distances. Records of large ( 2000 tons) ANFO surface shots at White Sands Military Range (WSMR) were also analyzed for secondary shock (SS) time delay. The secondary shocks were revealed on the records in the range 1.5-60 km and showed consistency with the SMR data, thus extending the charge and distance range for the developed SS delay relationship. The obtained results suggest that measured SS delays can provide important information about the character of an explosion source, and can be used as a new, simple, cost-effective yield estimator for explosions with a known type of explosive. The new results are compared with analogous available data from surface nuclear explosions. Special distinctions in air-blast waves, resulting from the different source phenomenology (energy release), are revealed and analyzed. Two underground nuclear explosions conducted by North Korea in 2009 and 2013 were recorded by several stations of the Israel Seismic Network. Pronounced minima (spectral nulls) at 1.2-1.3 Hz were revealed in the spectra of teleseismic P-waves. For a ground-truth explosion with a shallow source depth (relative to an earthquake), this phenomenon can be interpreted in terms of the interference between the down-going P-wave energy and the pP phase reflected from the Earth's surface. A similar effect was observed before at ISN stations for the Pakistan explosion (28.05.98) at a different frequency (1.7 Hz), indicating a source effect rather than a site effect. Based on the dependence of the null frequency on the near-surface acoustic velocity and the source depth, the depth of

  4. Testing on a Large Scale Running the ATLAS Data Acquisition and High Level Trigger Software on 700 PC Nodes

    CERN Document Server

    Burckhart-Chromek, Doris; Adragna, P; Alexandrov, L; Amorim, A; Armstrong, S; Badescu, E; Baines, J T M; Barros, N; Beck, H P; Bee, C; Blair, R; Bogaerts, J A C; Bold, T; Bosman, M; Caprini, M; Caramarcu, C; Ciobotaru, M; Comune, G; Corso-Radu, A; Cranfield, R; Crone, G; Dawson, J; Della Pietra, M; Di Mattia, A; Dobinson, Robert W; Dobson, M; Dos Anjos, A; Dotti, A; Drake, G; Ellis, Nick; Ermoline, Y; Ertorer, E; Falciano, S; Ferrari, R; Ferrer, M L; Francis, D; Gadomski, S; Gameiro, S; Garitaonandia, H; Gaudio, G; George, S; Gesualdi-Mello, A; Gorini, B; Green, B; Haas, S; Haberichter, W N; Hadavand, H; Haeberli, C; Haller, J; Hansen, J; Hauser, R; Hillier, S J; Höcker, A; Hughes-Jones, R E; Joos, M; Kazarov, A; Kieft, G; Klous, S; Kohno, T; Kolos, S; Korcyl, K; Kordas, K; Kotov, V; Kugel, A; Landon, M; Lankford, A; Leahu, L; Leahu, M; Lehmann-Miotto, G; Le Vine, M J; Liu, W; Maeno, T; Männer, R; Mapelli, L; Martin, B; Masik, J; McLaren, R; Meessen, C; Meirosu, C; Mineev, M; Misiejuk, A; Morettini, P; Mornacchi, G; Müller, M; Garcia-Murillo, R; Nagasaka, Y; Negri, A; Padilla, C; Pasqualucci, E; Pauly, T; Perera, V; Petersen, J; Pope, B; Albuquerque-Portes, M; Pretzl, K; Prigent, D; Roda, C; Ryabov, Yu; Salvatore, D; Schiavi, C; Schlereth, J L; Scholtes, I; Sole-Segura, E; Seixas, M; Sloper, J; Soloviev, I; Spiwoks, R; Stamen, R; Stancu, S; Strong, S; Sushkov, S; Szymocha, T; Tapprogge, S; Teixeira-Dias, P; Torres, R; Touchard, F; Tremblet, L; Ünel, G; Van Wasen, J; Vandelli, W; Vaz-Gil-Lopes, L; Vermeulen, J C; von der Schmitt, H; Wengler, T; Werner, P; Wheeler, S; Wickens, F; Wiedenmann, W; Wiesmann, M; Wu, X; Yasu, Y; Yu, M; Zema, F; Zobernig, H; Computing In High Energy and Nuclear Physics

    2006-01-01

    The ATLAS Data Acquisition (DAQ) and High Level Trigger (HLT) software system will initially comprise 2000 PC nodes which take part in the control, event readout, second level trigger and event filter operations. This large number of PCs will only be purchased before data taking in 2007. The large CERN IT LXBATCH facility provided the opportunity to run, in July 2005, online functionality tests over a period of 5 weeks on a stepwise increasing farm size from 100 up to 700 dual PC nodes. The interplay of the control and monitoring software with the event readout, event building and trigger software was exercised for the first time as an integrated system at this large scale. Also new was running algorithms in the online environment for trigger selection and in the event filter processing tasks at a larger scale. A mechanism has been developed to package the offline software together with the DAQ/HLT software and to distribute it efficiently via peer-to-peer software to this large PC cluster. T...

  6. Indicators to examine quality of large scale survey data: an example through district level household and facility survey.

    Directory of Open Access Journals (Sweden)

    Kakoli Borkotoky

    Full Text Available BACKGROUND: Large scale surveys are the main source of data pertaining to all the social and demographic indicators, hence their quality is of great concern. In this paper, we discuss the indicators used to examine the quality of data. We focus on age misreporting, incompleteness and inconsistency of information, and skipping of questions on reproductive and sexual health related issues. In order to observe the practical consequences of errors in a survey, the District Level Household and Facility Survey (DLHS-3) is used as an example dataset. METHODS: Whipple's and Myers' indices are used to identify age misreporting. Age displacements are identified by estimating downward and upward transfers for women from bordering age groups of the eligible age range. Skipping patterns are examined by recording the responses to the questions which precede the sections on birth history, immunization, and reproductive and sexual health. RESULTS: The study observed errors in age reporting in all the states, but the extent of misreporting differs by state and individual characteristics. Illiteracy, rural residence and poor economic condition are the major factors that lead to age misreporting. Females were excluded from the eligible age group to reduce the duration of the interview. The study further observed that respondents tend to skip questions on HIV/RTI and other questions which follow a set of questions. CONCLUSION: The study concludes that age misreporting, inconsistency and incomplete response are three sources of error that need to be considered carefully before drawing conclusions from any survey. DLHS-3 also suffers from age misreporting, particularly for females in the reproductive ages. In view of the coverage of the survey, it may not be possible to control age misreporting completely, but some extra effort to probe for better answers may help improve the quality of data in the survey.
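
    Of the measures named above, Whipple's index has a simple closed form: it compares the number of reported ages ending in 0 or 5 within ages 23-62 to one fifth of all reported ages in that range, so 100 indicates no heaping and 500 indicates that every reported age ends in 0 or 5. A minimal sketch (the standard definition, not the survey's exact implementation; the toy data are hypothetical):

      # Whipple's index for age heaping on digits 0 and 5 (standard definition).
      def whipples_index(ages):
          """Return Whipple's index computed over reported ages 23-62.

          100 means no preference for ages ending in 0 or 5; 500 means every
          reported age in the range ends in 0 or 5.
          """
          in_range = [a for a in ages if 23 <= a <= 62]
          if not in_range:
              return float("nan")
          heaped = sum(1 for a in in_range if a % 5 == 0)
          return 500.0 * heaped / len(in_range)

      # Toy example: heavy heaping on multiples of 5 pushes the index well above 100.
      reported = [25] * 40 + [30] * 35 + [27, 33, 41, 48, 52, 59] * 5
      print(round(whipples_index(reported), 1))   # about 357, i.e. strong heaping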

  7. The restricted stochastic user equilibrium with threshold model: Large-scale application and parameter testing

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Kjær; Nielsen, Otto Anker; Watling, David P.

    2017-01-01

    -pairs, and comparisons are performed with respect to a previously proposed RSUE model as well as an existing link-based mixed Multinomial Probit (MNP) SUE model. The results show that the RSUET has very attractive computation times for large-scale applications and demonstrate that the threshold addition to the RSUE...... highlight that the RSUET outperforms the MNP SUE in terms of convergence, calculation time and behavioural realism. The choice set composition is validated by using 16,618 observed route choices collected by GPS devices in the same network and observing their reproduction within the equilibrated choice sets...

  8. mJ range all-fiber MOPA prototype with hollow-core fiber beam delivery designed for large scale laser facilities seeding (Conference Presentation)

    Science.gov (United States)

    Scol, Florent; Gouriou, Pierre; Perrin, Arnaud; Gleyze, Jean-François; Valentin, Constance; Bouwmans, Géraud; Hugonnot, Emmanuel

    2017-03-01

    The Laser Megajoule (LMJ) is a French large scale laser facility dedicated to inertial fusion research. Its front-ends are based on fiber laser technology and generate highly controlled beams in the nanojoule range. Scaling the energy of those fiber seeders to the millijoule range is one route being explored to upgrade the LMJ architecture. We report on a fully integrated, narrow-linewidth, all-fiber MOPA prototype at 1053 nm designed to meet the stringent requirements of large-scale laser facility seeding. We achieve 750 µJ temporally-shaped pulses of a few nanoseconds at 1 kHz. Thanks to its original longitudinal geometry and its wide output core (26 µm MFD), the Yb-doped tapered fiber used in the power amplifier stage ensures single-mode operation and negligible spectro-temporal distortions. The transport of 30 kW peak power pulses (from the tapered fiber) in a 17 m long large-mode-area (39 µm) hollow-core (HC) fiber is presented and highlights issues in managing frequency-modulation to amplitude-modulation conversion. An S² measurement of this fiber allows this conversion to be attributed to slightly multimode behavior (< 13 dB of extinction between the fundamental mode and higher-order modes). Other HC fibers exhibiting truly single-mode behavior (< 20 dB) have been tested and the comparison will be presented at the conference. Finally, fiber spatial beam shaping from a coherent Gaussian beam to a coherent top-hat intensity profile beam in the mJ range with a specifically designed and fabricated fiber will also be presented.

  9. Fermi Observations of Resolved Large-Scale Jets: Testing the IC/CMB Model

    Science.gov (United States)

    Breiding, Peter; Meyer, Eileen T.; Georganopoulos, Markos

    2017-01-01

    It has been observed with the Chandra X-ray Observatory since the early 2000s that many powerful quasar jets show X-ray emission on the kpc scale (Harris & Krawczynski, 2006). In many cases these X-rays cannot be explained by the extension of the radio-optical spectrum produced by synchrotron emitting electrons in the jet, since the observed X-ray flux is too high and the X-ray spectral index too hard. A widely accepted model for the X-ray emission first proposed by Celotti et al. 2001 and Tavecchio et al. 2000 posits that the X-rays are produced when relativistic electrons in the jet up-scatter ambient cosmic microwave background (CMB) photons via inverse Compton scattering from microwave to X-ray energies (the IC/CMB model). However, explaining the X-ray emission for these jets with the IC/CMB model requires high levels of IC/CMB γ-ray emission (Georganopoulos et al., 2006), which we search for using the Fermi/LAT γ-ray space telescope. Another viable model for the large scale jet X-ray emission, favored by the results of Meyer et al. 2015 and Meyer & Georganopoulos 2014, is an alternate population of synchrotron emitting electrons. In contrast with the second synchrotron interpretation, the IC/CMB model requires jets with high kinetic powers, which can exceed the Eddington luminosity (Dermer & Atoyan 2004 and Atoyan & Dermer 2004), and be very fast on the kpc scale with Γ~10 (Celotti et al. 2001 and Tavecchio et al. 2000). New results from data obtained with the Fermi/LAT will be shown for several quasars not in the Fermi/LAT 3FGL catalog whose large scale X-ray jets are attributed to IC/CMB. Additionally, recent work on the γ-ray bright blazar AP Librae will be shown which helps to constrain some models attempting to explain the high energy component of its SED, which extends from X-ray to TeV energies (e.g., Zacharias & Wagner 2016 & Petropoulou et al. 2016).

  10. Large scale static tests of a tilt-nacelle V/STOL propulsion/attitude control system

    Science.gov (United States)

    1978-01-01

    The concept of a combined V/STOL propulsion and aircraft attitude control system was subjected to large scale engine tests. The tilt nacelle/attitude control vane package consisted of the T55 powered Hamilton Standard Q-Fan demonstrator. Vane forces, moments, thermal and acoustic characteristics as well as the effects on propulsion system performance were measured under conditions simulating hover in and out of ground effect.

  11. The Large Scale Bias of Dark Matter Halos: Numerical Calibration and Model Tests

    CERN Document Server

    Tinker, Jeremy L; Kravtsov, Andrey V; Klypin, Anatoly; Warren, Michael S; Yepes, Gustavo; Gottlober, Stefan

    2010-01-01

    We measure the clustering of dark matter halos in a large set of collisionless cosmological simulations of the flat LCDM cosmology. Halos are identified using the spherical overdensity algorithm, which finds the mass around isolated peaks in the density field such that the mean density is Delta times the background. We calibrate fitting functions for the large scale bias that are adaptable to any value of Delta we examine. We find a ~6% scatter about our best fit bias relation. Our fitting functions couple to the halo mass functions of Tinker et al. (2008) such that the bias of all dark matter is normalized to unity. We demonstrate that the bias of massive, rare halos is higher than that predicted in the modified ellipsoidal collapse model of Sheth, Mo, & Tormen (2001), and approaches the predictions of the spherical collapse model for the rarest halos. Halo bias results based on friends-of-friends halos identified with linking length 0.2 are systematically lower than for halos with the canonical Delta=200 o...
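
    For orientation, bias fitting functions of this kind are expressed in terms of the peak height nu = delta_c / sigma(M); the calibration described above uses a functional form of the type below, with coefficients that depend on the overdensity Delta (the parameterization is quoted only schematically here, so treat it as illustrative rather than as the exact published fit):

      \nu \equiv \frac{\delta_{c}}{\sigma(M)}, \qquad \delta_{c} = 1.686, \qquad b(\nu) = 1 - A \, \frac{\nu^{a}}{\nu^{a} + \delta_{c}^{a}} + B \, \nu^{b} + C \, \nu^{c},

    with the coefficients constrained so that \int b(\nu)\, f(\nu)\, d\nu = 1, i.e. the dark matter as a whole is unbiased, matching the normalization condition stated in the abstract.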

  12. Engine Test Facility (ETF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Air Force Arnold Engineering Development Center's Engine Test Facility (ETF) test cells are used for development and evaluation testing of propulsion systems for...

  13. TRAC code assessment using data from SCTF Core-III, a large-scale 2D/3D facility

    Energy Technology Data Exchange (ETDEWEB)

    Boyack, B.E.; Shire, P.R.; Harmony, S.C.; Rhee, G.

    1988-01-01

    Nine tests from the SCTF Core-III configuration have been analyzed using TRAC-PF1/MOD1. The objectives of these assessment activities were to obtain a better understanding of the phenomena occurring during the refill and reflood phases of a large-break loss-of-coolant accident, to determine the accuracy to which key parameters are calculated, and to identify deficiencies in key code correlations and models that provide closure for the differential equations defining thermal-hydraulic phenomena in pressurized water reactors. Overall, the agreement between calculated and measured values of peak cladding temperature is reasonable. In addition, TRAC adequately predicts many of the trends observed in both the integral effect and separate effect tests conducted in SCTF Core-III. The importance of assessment activities that consider potential contributors to discrepancies between the measured and calculated results arising from three sources are described as those related to (1) knowledge about the facility configuration and operation, (2) facility modeling for code input, and (3) deficiencies in code correlations and models. An example is provided. 8 refs., 7 figs., 2 tabs.

  14. High-Stakes Accountability: Student Anxiety and Large-Scale Testing

    Science.gov (United States)

    von der Embse, Nathaniel P.; Witmer, Sara E.

    2014-01-01

    This study examined the relationship between student anxiety about high-stakes testing and their subsequent test performance. The FRIEDBEN Test Anxiety Scale was administered to 1,134 11th-grade students, and data were subsequently collected on their statewide assessment performance. Test anxiety was a significant predictor of test performance…

  15. Large-scale groundwater modeling using global datasets: a test case for the Rhine-Meuse basin

    Directory of Open Access Journals (Sweden)

    E. H. Sutanudjaja

    2011-09-01

    Full Text Available The current generation of large-scale hydrological models does not include a groundwater flow component. Large-scale groundwater models, involving aquifers and basins of multiple countries, are still rare mainly due to a lack of hydro-geological data which are usually only available in developed countries. In this study, we propose a novel approach to construct large-scale groundwater models by using global datasets that are readily available. As the test-bed, we use the combined Rhine-Meuse basin that contains groundwater head data used to verify the model output. We start by building a distributed land surface model (30 arc-second resolution to estimate groundwater recharge and river discharge. Subsequently, a MODFLOW transient groundwater model is built and forced by the recharge and surface water levels calculated by the land surface model. Results are promising despite the fact that we still use an offline procedure to couple the land surface and MODFLOW groundwater models (i.e. the simulations of both models are separately performed. The simulated river discharges compare well to the observations. Moreover, based on our sensitivity analysis, in which we run several groundwater model scenarios with various hydro-geological parameter settings, we observe that the model can reasonably well reproduce the observed groundwater head time series. However, we note that there are still some limitations in the current approach, specifically because the offline-coupling technique simplifies the dynamic feedbacks between surface water levels and groundwater heads, and between soil moisture states and groundwater heads. Also the current sensitivity analysis ignores the uncertainty of the land surface model output. Despite these limitations, we argue that the results of the current model show a promise for large-scale groundwater modeling practices, including for data-poor environments and at the global scale.

  16. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    Science.gov (United States)

    Qi, Sen; Mitchell, Ross E

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using the Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in the educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrant a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data over the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially identifying the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  17. Large scale genomic testing within herd does not affect contribution margin

    DEFF Research Database (Denmark)

    Hjortø, Line; Ettema, Jehan; Sørensen, Anders Christian

    2013-01-01

    A Danish study from 2012 shows that genomic testing of all or part of the females in a Holstein herd gives roughly the same economic result as not performing any genomic testing at all.

  18. Fabrication and testing of gas-filled targets for large-scale plasma experiments on nova

    Energy Technology Data Exchange (ETDEWEB)

    Stone, G.F.; Rivers, C.J.; Spragge, M.R.; Wallace, R.J.

    1996-06-01

    The proposed next-generation ICF facility, the National Ignition Facility (NIF) is designed to produce energy gain from x-ray heated "indirect-drive" fuel capsules. For indirect-drive targets, laser light heats the inside of the Au hohlraum wall and produces x rays which in turn heat and implode the capsule to produce fusion conditions in the fuel. Unlike Nova targets, in NIF-scale targets laser light will propagate through several millimeters of gas, producing a plasma, before impinging upon the Au hohlraum wall. The purpose of the gas-produced plasma is to provide sufficient pressure to keep the radiating Au surface from expanding excessively into the hohlraum cavity. Excessive expansion of the Au wall interacts with the laser pulse and degrades the drive symmetry of the capsule implosion. The authors have begun an experimental campaign on the Nova laser to study the effect of hohlraum gas on both laser-plasma interaction and implosion symmetry. In their current NIF target design, the calculated plasma electron temperature is T_e ≈ 3 keV and the electron density is N_e ≈ 10^21 cm^-3.

  19. The Search for the Holy Grail: Content-Referenced Score Interpretations from Large-Scale Tests

    Science.gov (United States)

    Marion, Scott F.

    2015-01-01

    The measurement industry is in crisis. The public outcry against "over testing" and the opt-out movement are symptoms of a larger sociopolitical battle being fought over Common Core, teacher evaluation, federal intrusion, and a host of other issues, but much of the vitriol is directed at the tests and the testing industry. If we, as…

  20. Textiles Performance Testing Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Textiles Performance Testing Facilities has the capabilities to perform all physical wet and dry performance testing, and visual and instrumental color analysis...

  1. GPS Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Global Positioning System (GPS) Test Facility Instrumentation Suite (GPSIS) provides great flexibility in testing receivers by providing operational control of...

  2. Large-scale inhomogeneity in sapphire test masses revealed by Rayleigh scattering imaging

    Science.gov (United States)

    Yan, Zewu; Ju, Li; Eon, François; Gras, Slawomir; Zhao, Chunnong; Jacob, John; Blair, David G.

    2004-03-01

    Rayleigh scattering in test masses can introduce noise and reduce the sensitivity of laser interferometric gravitational wave detectors. In this paper, we present laser Rayleigh scattering imaging as a technique to investigate sapphire test masses. The system provides three-dimensional Rayleigh scattering mapping of entire test masses and quantitative evaluation of the Rayleigh scattering coefficient. Rayleigh scattering mapping of two sapphire samples reveals point defects as well as inhomogeneous structures in the samples. We present results showing significant non-uniform scattering within two 4.5 kg sapphire test masses manufactured by the heat exchanger method.

  3. Development and Execution of a Large-scale DDT Tube Test for IHE Material Qualification

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Gary Robert [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Broilo, Robert M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lopez-Pulliam, Ian Daniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vaughan, Larry Dean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-24

    Insensitive High Explosive (IHE) Materials are defined in Chapter IX of the DOE Explosive Safety Standard (DOE-STD-1212-2012) as being materials that are mass-detonable explosives that are so insensitive that the probability of accidental initiation or transition from burning to detonation is negligible [1]. There are currently a number of tests included in the standard that are required to qualify a material as IHE; however, none of the tests directly evaluates the transition from burning to detonation (aka deflagration-to-detonation transition, DDT). Currently, there is a DOE complex-wide effort to revisit the IHE definition in DOE-STD-1212-2012 and change the qualification requirements. The proposal lays out a new approach, requiring fewer, but more appropriate tests, for IHE Material qualification. One of these new tests is the Deflagration-to-Detonation Test. According to the redefinition proposal, the purpose of the new deflagration-to-detonation test is “to demonstrate that an IHE material will not undergo deflagration-to-detonation under stockpile relevant conditions of scale, confinement, and material condition. Inherent in this test design is the assumption that ignition does occur, with onset of deflagration. The test design will incorporate large margins and replicates to account for the stochastic nature of DDT events.” In short, the philosophy behind this approach is that if a material fails to undergo DDT in a significant over-test, then it is extremely unlikely to do so in realistic conditions. This effort will be valuable for the B61 LEP to satisfy its need to qualify the new production lots of PBX 9502. The work described in this report is intended as a preliminary investigation to support the proposed design of an overly conservative, easily fielded DDT test for the updated IHE Material Qualification standard. Specifically, we evaluated the aspects of confinement, geometry, material morphology and temperature. We also developed and tested a

  4. Using Raters from India to Score a Large-Scale Speaking Test

    Science.gov (United States)

    Xi, Xiaoming; Mollaun, Pam

    2011-01-01

    We investigated the scoring of the Speaking section of the Test of English as a Foreign Language[TM] Internet-based (TOEFL iBT[R]) test by speakers of English and one or more Indian languages. We explored the extent to which raters from India, after being trained and certified, were able to score the TOEFL examinees with mixed first languages…

  6. Proportional and Integral Thermal Control System for Large Scale Heating Tests

    Science.gov (United States)

    Fleischer, Van Tran

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) Flight Loads Laboratory is a unique national laboratory that supports thermal, mechanical, thermal/mechanical, and structural dynamics research and testing. A Proportional Integral thermal control system was designed and implemented to support thermal tests. A thermal control algorithm supporting a quartz lamp heater was developed based on the Proportional Integral control concept and a linearized heating process. The thermal control equations were derived and expressed in terms of power levels, integral gain, proportional gain, and differences between thermal setpoints and skin temperatures. Besides the derived equations, user's predefined thermal test information generated in the form of thermal maps was used to implement the thermal control system capabilities. Graphite heater closed-loop thermal control and graphite heater open-loop power level were added later to fulfill the demand for higher temperature tests. Verification and validation tests were performed to ensure that the thermal control system requirements were achieved. This thermal control system has successfully supported many milestone thermal and thermal/mechanical tests for almost a decade with temperatures ranging from 50 F to 3000 F and temperature rise rates from -10 F/s to 70 F/s for a variety of test articles having unique thermal profiles and test setups.
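
    The control law described above — heater power commanded from proportional and integral terms on the difference between a thermal setpoint and the measured skin temperature — can be sketched as a discrete PI update. The gains, power limits, and sample readings below are illustrative assumptions, not the Flight Loads Laboratory implementation.

      # Minimal discrete proportional-integral (PI) heater power law: the commanded power
      # follows the error between the thermal setpoint and the measured skin temperature.
      # Gains, power limits, and temperatures are illustrative only.
      def make_pi_controller(kp, ki, dt, p_min=0.0, p_max=100.0):
          integral = 0.0
          def update(setpoint, skin_temp):
              nonlocal integral
              error = setpoint - skin_temp            # temperature error (deg F)
              integral += error * dt                  # accumulated error
              power = kp * error + ki * integral      # commanded power level (%)
              return min(max(power, p_min), p_max)    # clamp to lamp power limits
          return update

      controller = make_pi_controller(kp=0.8, ki=0.05, dt=1.0)
      # Commanded power for a few successive skin-temperature readings against a 400 F setpoint.
      for skin_temp in (70.0, 250.0, 390.0, 400.0):
          print(skin_temp, controller(setpoint=400.0, skin_temp=skin_temp))

    A production system would typically add anti-windup on the integral term and per-zone gains and setpoints driven by the predefined thermal maps mentioned above.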

  7. Large-scale tests of insulated conduit for the ITER CS coil

    Science.gov (United States)

    Reed, R. P.; Walsh, R. P.; Schutz, J. B.

    Compression-fatigue tests at 77 K were conducted on test modules of insulated Incoloy 908 conduit. To replicate the operating conditions for the ITER central solenoid (CS) full-scale coil, fatigue loads up to 3.6 MN were applied for 10^5 cycles; no mechanical breakdowns occurred. The conduits were insulated with a preimpregnated resin system, a tetraglycidyl diaminodiphenyl methane (TGDM) epoxy cured with DDS aromatic amine. The conduits were joined by vacuum-pressure impregnation with a diglycidyl ether of bisphenol-F epoxy/anhydride-cured resin system. In the 4×4 stacked-conduit test modules, the layer insulation (a high-pressure laminate of TGDM epoxy cured with DDS aromatic amine) was inserted. Periodically during the tests, breakdown voltage was measured across the conduits of both turn and layer insulation; throughout the test, breakdown voltages were at least 46 kV. The addition of a barrier increased structural and electrical reliability.

  8. A fast multilocus test with adaptive SNP selection for large-scale genetic-association studies

    KAUST Repository

    Zhang, Han

    2013-09-11

    As increasing evidence suggests that multiple correlated genetic variants could jointly influence the outcome, a multilocus test that aggregates association evidence across multiple genetic markers in a considered gene or a genomic region may be more powerful than a single-marker test for detecting susceptibility loci. We propose a multilocus test, AdaJoint, which adopts a variable selection procedure to identify a subset of genetic markers that jointly show the strongest association signal, and defines the test statistic based on the selected genetic markers. The P-value from the AdaJoint test is evaluated by a computationally efficient algorithm that effectively adjusts for multiple comparisons, and is hundreds of times faster than the standard permutation method. Simulation studies demonstrate that AdaJoint has the most robust performance among several commonly used multilocus tests. We perform multilocus analysis of over 26,000 genes/regions on two genome-wide association studies of pancreatic cancer. Compared with its competitors, AdaJoint identifies a much stronger association between the gene CLPTM1L and pancreatic cancer risk (6.0 × 10⁻⁸), with the signal optimally captured by two correlated single-nucleotide polymorphisms (SNPs). Finally, we show AdaJoint as a powerful tool for mapping cis-regulating methylation quantitative trait loci on normal breast tissues, and find many CpG sites whose methylation levels are jointly regulated by multiple SNPs nearby.
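
    As a rough illustration of the general idea only (this is not the AdaJoint statistic, and the paper's fast analytic multiple-comparison adjustment is not reproduced), the sketch below greedily selects a small subset of SNPs maximizing a joint regression fit and evaluates it with the naive permutation procedure that AdaJoint is reported to outperform by orders of magnitude; all names and defaults are illustrative:

```python
import numpy as np

def joint_r2(X, y):
    """R^2 of an ordinary least-squares fit of the phenotype y on the SNP columns X."""
    X1 = np.column_stack([np.ones(len(y)), X])          # add an intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

def adaptive_statistic(G, y, max_snps=2):
    """Greedy forward selection of up to max_snps columns of G maximizing joint R^2."""
    chosen, best = [], 0.0
    for _ in range(max_snps):
        r2, j = max((joint_r2(G[:, chosen + [j]], y), j)
                    for j in range(G.shape[1]) if j not in chosen)
        if r2 <= best:
            break
        chosen, best = chosen + [j], r2
    return best, chosen

def permutation_pvalue(G, y, max_snps=2, n_perm=999, seed=0):
    """Permutation p-value; the adaptive selection is redone on every permuted
    phenotype so that the multiple-comparison burden of the selection step is
    accounted for."""
    rng = np.random.default_rng(seed)
    observed, _ = adaptive_statistic(G, y, max_snps)
    exceed = sum(adaptive_statistic(G, rng.permutation(y), max_snps)[0] >= observed
                 for _ in range(n_perm))
    return (1 + exceed) / (1 + n_perm)
```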

  9. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft2) that was designed to:Evaluate and characterize the effect of flame and thermal...

  10. Ouellette Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Thermal Test Facility is a joint Army/Navy state-of-the-art facility (8,100 ft2) that was designed to: Evaluate and characterize the effect of flame and thermal...

  11. Develop and test an Internally Cooled, Cabled Superconductor (ICCS) for large scale MHD magnets

    Science.gov (United States)

    Marston, P. G.; Hale, J. R.; Dawson, A. M.

    1990-04-01

    The work included four principal tasks: (1) development of a design requirements definition for a retrofit MHD magnet system; (2) analysis of an internally cooled, cabled superconductor (ICCS) to use in that design; (3) design of an experiment to test a subscale version of that conductor, which is a NbTi, copper stabilized superconductor; and (4) proof-of-concept testing of the conductor. The program was carried forth through the third task with very successful development and test of a conventional ICCS conductor with 27 multifilamentary copper-superconductor composite strands and a new concept conductor in which, in each triplet, two strands were pure copper and the third strand was a multifilamentary composite. In reviewing the magnet design and the premises for the conductor design it became obvious that an extra barrier might be highly effective in enhancing magnet stability and protection. This concept was developed and a sample conductor manufactured and tested in comparison with an identical conductor lacking such an additional barrier. Results of these conductor tests confirm the potential value of such a barrier. Since the work of tasks 1 through 3 has been reported in detail in quarterly and semiannual reports, as well as in special reports prepared throughout the course of this project, this report reviews early work briefly and then discusses this last phase in great detail.

  12. Liquid Methane Testing With a Large-Scale Spray Bar Thermodynamic Vent System

    Science.gov (United States)

    Hastings, L. J.; Bolshinskiy, L. G.; Hedayat, A.; Flachbart, R. H.; Sisco, J. D.; Schnell, A. R.

    2014-01-01

    NASA's Marshall Space Flight Center conducted liquid methane testing in November 2006 using the multipurpose hydrogen test bed outfitted with a spray bar thermodynamic vent system (TVS). The basic objective was to identify any unusual or unique thermodynamic characteristics associated with densified methane that should be considered in the design of space-based TVSs. Thirteen days of testing were performed with total tank heat loads ranging from 720 to 420 W at a fill level of approximately 90%. It was noted that as the fluid passed through the Joule-Thomson expansion, thermodynamic conditions consistent with the pervasive presence of metastability were indicated. This Technical Publication describes conditions that correspond with metastability and its detrimental effects on TVS performance. The observed conditions were primarily functions of methane densification and helium pressurization; therefore, assurance must be provided that metastable conditions have been circumvented in future applications of thermodynamic venting to in-space methane storage.

  13. Develop and test an internally cooled, cabled superconductor (ICCS) for large scale MHD magnets

    Energy Technology Data Exchange (ETDEWEB)

    Marston, P.G.; Hale, J.R.; Dawson, A.M.

    1990-04-30

    The work conducted under DOE/PETC Contract DE-AC22-84PC70512 has included four principal tasks: (1) development of a Design Requirements Definition for a retrofit MHD magnet system, (2) analysis of an internally cooled, cabled superconductor (ICCS) to use in that design, (3) design of an experiment to test a subscale version of that conductor, which is a NbTi, copper stabilized superconductor, and (4) proof-of-concept testing of the conductor. The program was carried forth through the third task with very successful development and test of a conventional ICCS conductor with 27 multifilamentary copper-superconductor composite strands and a new concept conductor in which, in each triplet, two strands were pure copper and the third strand was a multifilamentary composite. In reviewing the magnet design and the premises for the conductor design it became obvious that, since the principal source of perturbation in MHD magnets is slippage between coils, or between turns in a coil, which produces frictional heat that must flow through the conductor sheath and the helium to the superconductor strands, an extra barrier might be highly effective in enhancing magnet stability and protection. This concept was developed and a sample conductor manufactured and tested in comparison with an identical conductor lacking such an additional barrier. Results of these conductor tests confirm the potential value of such a barrier. As the work of tasks 1 through 3 has been reported in detail in quarterly and semiannual reports, as well as in special reports prepared throughout the course of this project, this report reviews early work briefly and then discusses this last phase in great detail. 8 refs., 36 figs.

  14. Paranormal psychic believers and skeptics: a large-scale test of the cognitive differences hypothesis.

    Science.gov (United States)

    Gray, Stephen J; Gallo, David A

    2016-02-01

    Belief in paranormal psychic phenomena is widespread in the United States, with over a third of the population believing in extrasensory perception (ESP). Why do some people believe, while others are skeptical? According to the cognitive differences hypothesis, individual differences in the way people process information about the world can contribute to the creation of psychic beliefs, such as differences in memory accuracy (e.g., selectively remembering a fortune teller's correct predictions) or analytical thinking (e.g., relying on intuition rather than scrutinizing evidence). While this hypothesis is prevalent in the literature, few have attempted to empirically test it. Here, we provided the most comprehensive test of the cognitive differences hypothesis to date. In 3 studies, we used online screening to recruit groups of strong believers and strong skeptics, matched on key demographics (age, sex, and years of education). These groups were then tested in laboratory and online settings using multiple cognitive tasks and other measures. Our cognitive testing showed that there were no consistent group differences on tasks of episodic memory distortion, autobiographical memory distortion, or working memory capacity, but skeptics consistently outperformed believers on several tasks tapping analytical or logical thinking as well as vocabulary. These findings demonstrate cognitive similarities and differences between these groups and suggest that differences in analytical thinking and conceptual knowledge might contribute to the development of psychic beliefs. We also found that psychic belief was associated with greater life satisfaction, demonstrating benefits associated with psychic beliefs and highlighting the role of both cognitive and noncognitive factors in understanding these individual differences.

  15. Testing of a Stitched Composite Large-Scale Multi-Bay Pressure Box

    Science.gov (United States)

    Jegley, Dawn; Rouse, Marshall; Przekop, Adam; Lovejoy, Andrew

    2016-01-01

    NASA has created the Environmentally Responsible Aviation (ERA) Project to develop technologies to reduce aviation's impact on the environment. A critical aspect of this pursuit is the development of a lighter, more robust airframe to enable the introduction of unconventional aircraft configurations. NASA and The Boeing Company have worked together to develop a structural concept that is lightweight and an advancement beyond state-of-the-art composite structures. The Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) is an integrally stiffened panel design where elements are stitched together. The PRSEUS concept is designed to maintain residual load carrying capabilities under a variety of damage scenarios. A series of building block tests were evaluated to explore the fundamental assumptions related to the capability and advantages of PRSEUS panels. The final step in the building block series is an 80%-scale pressure box representing a portion of the center section of a Hybrid Wing Body (HWB) transport aircraft. The testing of this article under maneuver load and internal pressure load conditions is the subject of this paper. The experimental evaluation of this article, along with the other building block tests and the accompanying analyses, has demonstrated the viability of a PRSEUS center body for the HWB vehicle. Additionally, much of the development effort is also applicable to traditional tube-and-wing aircraft, advanced aircraft configurations, and other structures where weight and through-the-thickness strength are design considerations.

  16. Pressurized burner test facility

    Energy Technology Data Exchange (ETDEWEB)

    Maloney, D.J.; Norton, T.S.; Hadley, M.A. [Morgantown Energy Technology Center, WV (United States)

    1993-06-01

    The Morgantown Energy Technology Center (METC) is currently fabricating a high-pressure burner test facility. The facility was designed to support the development of gas turbine combustion systems fired on natural gas and coal-derived gaseous fuels containing fuel-bound nitrogen. Upon completion of fabrication and shake-down testing in October 1993, the facility will be available for use by industrial and university partners through Cooperative Research and Development Agreements (CRADAs) or through other cooperative arrangements. This paper describes the burner test facility and associated operating parameter ranges and informs interested parties of the availability of the facility.

  17. Development, installation and testing of a large-scale tidal current turbine

    Energy Technology Data Exchange (ETDEWEB)

    Thake, J.

    2005-10-15

    This report summarises the findings of the Seaflow project to investigate the feasibility of building and operating a commercial scale marine current horizontal axis tidal turbine and to evaluate the long-term economics of producing electricity using tidal turbines. Details are given of competitive tidal stream technologies and their commercial status, the selection of the site on the North Devon coast of the UK, and the evaluation of the design, manufacture, testing, installation, commissioning, and maintenance of the turbine. The organisations working on the Seaflow project and cost estimates are discussed.

  18. Facile preparation of monodisperse, impurity-free, and antioxidation copper nanoparticles on a large scale for application in conductive ink.

    Science.gov (United States)

    Zhang, Yu; Zhu, Pengli; Li, Gang; Zhao, Tao; Fu, Xianzhu; Sun, Rong; Zhou, Feng; Wong, Ching-ping

    2014-01-08

    Monodisperse copper nanoparticles with high purity and antioxidation properties are synthesized quickly (only 5 min) on a large scale (multigram amounts) by a modified polyol process using slightly soluble Cu(OH)2 as the precursor, L-ascorbic acid as the reductant, and PEG-2000 as the protectant. The resulting copper nanoparticles have a size distribution of 135 ± 30 nm and do not suffer significant oxidation even after being stored for 30 days under ambient conditions. The copper nanoparticles can be well-dispersed in an oil-based ink, which can be silk-screen printed onto flexible substrates and then converted into conductive patterns after heat treatment. An optimal electrical resistivity of 15.8 μΩ cm is achieved, which is only 10 times larger than that of bulk copper. The synthesized copper nanoparticles could be considered as a cheap and effective material for printed electronics.
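
    As a quick arithmetic check, taking the handbook resistivity of bulk copper to be $\rho_{\mathrm{Cu}} \approx 1.68\ \mu\Omega\,\mathrm{cm}$ (an assumed reference value, not one given in the paper),

$$\frac{\rho_{\mathrm{printed}}}{\rho_{\mathrm{Cu}}} \approx \frac{15.8\ \mu\Omega\,\mathrm{cm}}{1.68\ \mu\Omega\,\mathrm{cm}} \approx 9.4,$$

    which is consistent with the stated factor of roughly ten.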

  19. Testing large-scale (an)isotropy of ultra-high energy cosmic rays

    CERN Document Server

    Koers, Hylke B J

    2008-01-01

    We present a simple yet powerful method to test models of cosmic-ray (CR) origin using the distribution of CR arrival directions. The method is statistically unambiguous in the sense that it is binless and does not invoke scanning over unknown parameters, and general in the sense that it can be applied to any model that predicts a continuous distribution of CRs over the sky. We show that it provides a powerful discrimination between an isotropic distribution and predictions from the "matter tracer" model, a benchmark model that assumes small CR deflections and a continuous distribution of sources tracing the distribution of matter in the Universe. Our method is competitive or superior in statistical power to existing methods, and is especially sensitive in the case of relatively few high energy events. Applying the method to the present data we find that neither an isotropic distribution nor the matter tracer model can be excluded. Based on estimates of its statistical power, we expect that the proposed test ...

  20. Microfluidic very large scale integration (VLSI) modeling, simulation, testing, compilation and physical synthesis

    CERN Document Server

    Pop, Paul; Madsen, Jan

    2016-01-01

    This book presents the state-of-the-art techniques for the modeling, simulation, testing, compilation and physical synthesis of mVLSI biochips. The authors describe a top-down modeling and synthesis methodology for the mVLSI biochips, inspired by microelectronics VLSI methodologies. They introduce a modeling framework for the components and the biochip architecture, and a high-level microfluidic protocol language. Coverage includes a topology graph-based model for the biochip architecture, and a sequencing graph to model the biochemical application, showing how the application model can be obtained from the protocol language. The techniques described facilitate programmability and automation, enabling developers in the emerging, large biochip market. · Presents the current models used for the research on compilation and synthesis techniques of mVLSI biochips in a tutorial fashion; · Includes a set of "benchmarks" that are presented in great detail, including the source code of several of the techniques p...
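
    As a toy illustration of the sequencing-graph idea described above (the operations and precedence edges below are invented for illustration and are not taken from the book), a protocol can be encoded as a precedence graph and executed in any topological order:

```python
# Nodes are biochemical operations; edges point to operations that must follow them.
sequencing_graph = {
    "dispense_sample":  ["mix_1"],
    "dispense_reagent": ["mix_1"],
    "mix_1":            ["detect"],
    "detect":           [],
}

def topological_order(graph):
    """Depth-first topological sort: one valid execution order for the protocol."""
    order, seen = [], set()
    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for successor in graph[node]:
            visit(successor)
        order.append(node)
    for node in graph:
        visit(node)
    return order[::-1]

print(topological_order(sequencing_graph))
```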

  1. Interacting dark energy models in Cosmology and large-scale structure observational tests

    CERN Document Server

    Marcondes, Rafael J F

    2016-01-01

    Modern Cosmology offers us a great understanding of the universe with striking precision, made possible by the modern technologies of the newest generations of telescopes. The standard cosmological model, however, is not absent of theoretical problems and open questions. One possibility that has been put forward is the existence of a coupling between dark sectors. The idea of an interaction between the dark components could help physicists understand why we live in an epoch of the universe where dark matter and dark energy are comparable in terms of energy density, which can be regarded as a coincidence given that their time evolutions are completely different. We introduce the interaction phenomenologically and proceed to test models of interaction with observations of redshift-space distortions. In a flat universe composed only of those two fluids, we consider separately two forms of interaction, through terms proportional to the densities of both dark energy and dark matter. An analytic expression for the ...
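
    A common phenomenological way to write such a coupling at the background level (the abstract does not specify the exact form used in the thesis, so this is only an indicative parametrization) is through the continuity equations

$$\dot\rho_c + 3H\rho_c = Q, \qquad \dot\rho_x + 3H(1+w)\rho_x = -Q,$$

    with, for example, $Q = 3H\xi\rho_c$ or $Q = 3H\xi\rho_x$ corresponding to the two forms proportional to the dark matter density $\rho_c$ and the dark energy density $\rho_x$; here $w$ is the dark energy equation of state and $\xi$ a dimensionless coupling constant.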

  2. Comparing the reinforcement capacity of welded steel mesh and a thin spray-on liner using large scale laboratory tests

    Institute of Scientific and Technical Information of China (English)

    Zhenjun Shan; Porter Ian; Nemcik Jan; Baafi Ernest

    2014-01-01

    Steel mesh is used as a passive skin confinement medium to supplement the active support provided by rock bolts for roof and rib control in underground coal mines. Thin spray-on liners (TSL) are believed to have the potential to take the place of steel mesh as the skin confinement medium in underground mines. To confirm this belief, large scale laboratory experiments were conducted to compare the behaviour of welded steel mesh and a TSL, when used in conjunction with rock bolts, in reinforcing strata with weak bedding planes and strata prone to guttering, two common rock conditions which exist in coal mines. It was found that while the peak load taken by the simulated rock mass with weak bedding planes acting as the control sample (no skin confinement) was 2494 kN, the corresponding value of the sample with 5 mm thick TSL reinforcement reached 2856 kN. The peak load of the steel mesh reinforced sample was only 2321 kN, but this was attributed to the fact that one of the rock bolts broke during the test. The TSL reinforced sample had a similar post-yield behaviour as the steel mesh reinforced one. The results of the large scale guttering test indicated that a TSL is better than steel mesh in restricting rock movement and thus inhibiting the formation of gutters in the roof.

  3. PANDA: A Multipurpose Integral Test Facility for LWR Safety Investigations

    OpenAIRE

    2012-01-01

    The PANDA facility is a large scale, multicompartmental thermal hydraulic facility suited for investigations related to the safety of current and advanced LWRs. The facility is multipurpose, and the applications cover integral containment response tests, component tests, primary system tests, and separate effect tests. Experimental investigations carried on in the PANDA facility have been embedded in international projects, most of which under the auspices of the EU and OECD and with the supp...

  4. Structural Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Provides a wide variety of testing equipment, fixtures and facilities to perform both unique aviation component testing as well as common types of materials testing...

  5. Structural Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Provides a wide variety of testing equipment, fixtures and facilities to perform both unique aviation component testing as well as common types of materials testing...

  6. Mark 1 Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Mark I Test Facility is a state-of-the-art space environment simulation test chamber for full-scale space systems testing. A $1.5M upgrade in fiscal year...

  7. Performance on large-scale science tests: Item attributes that may impact achievement scores

    Science.gov (United States)

    Gordon, Janet Victoria

    , characteristics of test items themselves and/or opportunities to learn. Suggestions for future research are made.

  8. Large-scale renewal of electric facility for industrial waterworks; Kogyo yosui denki setsubi no daikibo renewal

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-02-29

    The receiving/transformation, power and supervisory control facilities were renewed for the Tomo industrial waterworks of the Bureau of Public Utilities, Gunma Prefecture. (1) The receiving/transformation facility adopts 6 kV 2-circuit power receiving and 2-circuit power supply to every facility for easy maintenance and a stable power supply. (2) The power facility adopts PWM converter type inverters for the water supply pumps as a harmonics control measure, and a CC300M control center (including IPMAT-L) for the power circuits to minimize the number of relays. The instrumentation facility sets alarms through the CRT supervisory control equipment, eliminating troublesome manual changes of settings. (3) The supervisory control facility adopts an EIC integrated system composed of a micronet-OPS8000 operator station and a UNISEQUE ADC4000 process control station. As in the power supply system, a 2-circuit arrangement is adopted for the I/O of common information to reduce risk. Automatic control modes can be switched and various parameters set easily and systematically on the CRT screens. (translated by NEDO)

  9. Pavement Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Comprehensive Environmental and Structural Analyses The ERDC Pavement Testing Facility, located on the ERDC Vicksburg campus, was originally constructed to provide...

  10. Pavement Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Comprehensive Environmental and Structural AnalysesThe ERDC Pavement Testing Facility, located on the ERDC Vicksburg campus, was originally constructed to provide an...

  11. Concurrent Validity and Feasibility of Short Tests Currently Used to Measure Early Childhood Development in Large Scale Studies.

    Science.gov (United States)

    Rubio-Codina, Marta; Araujo, M Caridad; Attanasio, Orazio; Muñoz, Pablo; Grantham-McGregor, Sally

    2016-01-01

    In low- and middle-income countries (LIMCs), measuring early childhood development (ECD) with standard tests in large scale surveys and evaluations of interventions is difficult and expensive. Multi-dimensional screeners and single-domain tests ('short tests') are frequently used as alternatives. However, their validity in these circumstances is unknown. We examined the feasibility, reliability, and concurrent validity of three multi-dimensional screeners (Ages and Stages Questionnaires (ASQ-3), Denver Developmental Screening Test (Denver-II), Battelle Developmental Inventory screener (BDI-2)) and two single-domain tests (MacArthur-Bates Short-Forms (SFI and SFII), WHO Motor Milestones (WHO-Motor)) in 1,311 children 6-42 months in Bogota, Colombia. The scores were compared with those on the Bayley Scales of Infant and Toddler Development (Bayley-III), taken as the 'gold standard'. The Bayley-III was given at a center by psychologists; whereas the short tests were administered in the home by interviewers, as in a survey setting. Findings indicated good internal validity of all short tests except the ASQ-3. The BDI-2 took a long time to administer and was expensive, while the single-domain tests were quickest and cheapest and the Denver-II and ASQ-3 were intermediate. Concurrent validity of the multi-dimensional tests' cognitive, language, and fine motor scales with the corresponding Bayley-III scale was low below 19 months. However, it increased with age, becoming moderate-to-high over 30 months. In contrast, gross motor scales' concurrence was high under 19 months and then decreased. Of the single-domain tests, the WHO-Motor had high validity with gross motor under 16 months, and the SFI and SFII expressive scales showed moderate correlations with language under 30 months. Overall, the Denver-II was the most feasible and valid multi-dimensional test and the ASQ-3 performed poorly under 31 months. By domain, gross motor development had the highest concurrence below 19

  12. Environmental Test Facility (ETF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Environmental Test Facility (ETF) provides non-isolated shock testing for stand-alone equipment and full size cabinets under MIL-S-901D specifications. The ETF...

  13. Ballistic Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Ballistic Test Facility is comprised of two outdoor and one indoor test ranges, which are all instrumented for data acquisition and analysis. Full-size aircraft...

  14. Corrosion Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Corrosion Testing Facility is part of the Army Corrosion Office (ACO). It is a fully functional atmospheric exposure site, called the Corrosion Instrumented Test...

  15. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial... features, include methods for noise removal, e.g., labeling of beyond building-perimeter devices, and methods for quantification of area densities and flows, e.g., building enter and exit events, and for classifying the behavior of people, e.g., into user roles such as visitor, hospitalized or employee... noise removal of beyond building perimeter devices. We furthermore present detailed statistics from our analysis regarding people's presence, movement and roles, and example types of visualizations that both highlight their potential as inspection tools for planners and provide interesting insights...
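
    A minimal sketch of one such noise-removal step, flagging devices that are likely beyond the building perimeter because they were seen only briefly or by too few access points, might look as follows; the trace format and thresholds are assumptions for illustration and are not those used in the paper:

```python
from collections import defaultdict

def label_perimeter_devices(traces, min_dwell_s=300, min_aps=2):
    """Return {device_id: True} for devices likely outside the building perimeter.

    `traces` is an iterable of (timestamp_s, device_id, ap_id) tuples; the
    dwell-time and access-point thresholds are illustrative only.
    """
    stats = defaultdict(lambda: {"first": None, "last": None, "aps": set()})
    for t, device, ap in traces:
        s = stats[device]
        s["first"] = t if s["first"] is None else min(s["first"], t)
        s["last"] = t if s["last"] is None else max(s["last"], t)
        s["aps"].add(ap)
    return {device: (s["last"] - s["first"] < min_dwell_s or len(s["aps"]) < min_aps)
            for device, s in stats.items()}
```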

  16. Construction and testing of a large scale prototype of a silicon tungsten electromagnetic calorimeter for a future lepton collider

    CERN Document Server

    Rouëné, J.

    2013-01-01

    The CALICE collaboration is preparing large scale prototypes of highly granular calorimeters for detectors to be operated at a future linear electron positron collider. After several beam campaigns at DESY, CERN and FNAL, the CALICE collaboration has demonstrated the principle of highly granular electromagnetic calorimeters with a first prototype called physics prototype. The next prototype, called technological prototype, addresses the engineering challenges which come along with the realisation of highly granular calorimeters. This prototype will comprise 30 layers where each layer is composed of four 9×9 cm² silicon wafers. The front end electronics is integrated into the detector layers. The size of each pixel is 5×5 mm². This prototype enters its construction phase. We present results of the first layers of the technological prototype obtained during beam test campaigns in spring and summer 2012. According to these results the signal over noise ratio of the detector exceeds the R&D goal of 10:1.

  17. Concurrent Validity and Feasibility of Short Tests Currently Used to Measure Early Childhood Development in Large Scale Studies

    Science.gov (United States)

    Rubio-Codina, Marta; Araujo, M. Caridad; Attanasio, Orazio; Muñoz, Pablo; Grantham-McGregor, Sally

    2016-01-01

    In low- and middle-income countries (LIMCs), measuring early childhood development (ECD) with standard tests in large scale surveys and evaluations of interventions is difficult and expensive. Multi-dimensional screeners and single-domain tests (‘short tests’) are frequently used as alternatives. However, their validity in these circumstances is unknown. We examined the feasibility, reliability, and concurrent validity of three multi-dimensional screeners (Ages and Stages Questionnaires (ASQ-3), Denver Developmental Screening Test (Denver-II), Battelle Developmental Inventory screener (BDI-2)) and two single-domain tests (MacArthur-Bates Short-Forms (SFI and SFII), WHO Motor Milestones (WHO-Motor)) in 1,311 children 6–42 months in Bogota, Colombia. The scores were compared with those on the Bayley Scales of Infant and Toddler Development (Bayley-III), taken as the ‘gold standard’. The Bayley-III was given at a center by psychologists; whereas the short tests were administered in the home by interviewers, as in a survey setting. Findings indicated good internal validity of all short tests except the ASQ-3. The BDI-2 took a long time to administer and was expensive, while the single-domain tests were quickest and cheapest and the Denver-II and ASQ-3 were intermediate. Concurrent validity of the multi-dimensional tests’ cognitive, language, and fine motor scales with the corresponding Bayley-III scale was low below 19 months. However, it increased with age, becoming moderate-to-high over 30 months. In contrast, gross motor scales’ concurrence was high under 19 months and then decreased. Of the single-domain tests, the WHO-Motor had high validity with gross motor under 16 months, and the SFI and SFII expressive scales showed moderate correlations with language under 30 months. Overall, the Denver-II was the most feasible and valid multi-dimensional test and the ASQ-3 performed poorly under 31 months. By domain, gross motor development had the highest concurrence

  18. Instrument Thermal Test Bed - A unique two phase test facility

    Science.gov (United States)

    Swanson, Theodore; Didion, Jeffrey

    1991-01-01

    The Instrument Thermal Test Bed (ITTB) is a modular, large-scale test facility which provides a medium for ground testing and flight qualification of spacecraft thermal control components and system configurations. The initial 'shake-down' operations are discussed herein. Operational parameters and performance characteristics were determined and quantified on a preliminary basis. The ITTB was successfully operated at evaporator power loads ranging from 600 W to 9600 W as well as in both capillary pumped and series hybrid pumped modes.

  19. Thermal distortion test facility

    Science.gov (United States)

    Stapp, James L.

    1995-02-01

    The thermal distortion test facility (TDTF) at Phillips Laboratory provides precise measurements of the distortion of mirrors that occurs when their surfaces are heated. The TDTF has been used for several years to evaluate mirrors being developed for high-power lasers. The facility has recently undergone some significant upgrades to improve the accuracy with which mirrors can be heated and the resulting distortion measured. The facility and its associated instrumentation are discussed.

  20. Exploring the Demands on Nurses Working in Health Care Facilities During a Large-Scale Natural Disaster

    Directory of Open Access Journals (Sweden)

    Gillian C. Scrymgeour

    2016-06-01

    Nurses are pivotal to an effective societal response to a range of critical events, including disasters. This presents nurses with significant and complex challenges that require them to function effectively under highly stressful circumstances, often for prolonged periods of time. The exponential growth in the number of disasters means that knowledge of disaster preparedness and how this knowledge can be implemented to facilitate the development of resilient and adaptive nurses and health care organizations represents an important adjunct to nurse education, policy development, and research considerations. Although this topic has attracted, and continues to attract, attention in the literature, a lack of systematic understanding of the contingencies makes it difficult to clearly differentiate what is known and what gaps remain in this literature. Providing a sound footing for future research can be facilitated by first systematically reviewing the relevant literature. Focused themes were identified and analyzed using an ecological and interactive systems framework. Ten of the 12 retained studies included evacuation, revealing that evacuation is more likely to occur in an aged care facility than a hospital. The unpredictability of an event also highlighted organizational, functional, and competency issues in regard to the complexity of decision making and overall preparedness. The integrative review also identified that the unique roles, competencies, and demands on nurses working in hospitals and residential health care facilities during a natural disaster appear invisible within the highly visible event.

  1. Ice-Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel

    Science.gov (United States)

    Broeren, Andy P.; Potapczuk, Mark G.; Lee, Sam; Malone, Adam M.; Paul, Benard P., Jr.; Woodard, Brian S.

    2016-01-01

    Icing simulation tools and computational fluid dynamics codes are reaching levels of maturity such that they are being proposed by manufacturers for use in certification of aircraft for flight in icing conditions with increasingly less reliance on natural-icing flight testing and icing-wind-tunnel testing. Sufficient high-quality data to evaluate the performance of these tools is not currently available. The objective of this work was to generate a database of ice-accretion geometry that can be used for development and validation of icing simulation tools as well as for aerodynamic testing. Three large-scale swept wing models were built and tested at the NASA Glenn Icing Research Tunnel (IRT). The models represented the Inboard (20% semispan), Midspan (64% semispan) and Outboard stations (83% semispan) of a wing based upon a 65% scale version of the Common Research Model (CRM). The IRT models utilized a hybrid design that maintained the full-scale leading-edge geometry with a truncated afterbody and flap. The models were instrumented with surface pressure taps in order to acquire sufficient aerodynamic data to verify the hybrid model design capability to simulate the full-scale wing section. A series of ice-accretion tests were conducted over a range of total temperatures from -23.8 deg C to -1.4 deg C with all other conditions held constant. The results showed the changing ice-accretion morphology from rime ice at the colder temperatures to highly 3-D scallop ice in the range of -11.2 deg C to -6.3 deg C. Warmer temperatures generated highly 3-D ice accretion with glaze ice characteristics. The results indicated that the general scallop ice morphology was similar for all three models. Icing results were documented for limited parametric variations in angle of attack, drop size and cloud liquid-water content (LWC). The effect of velocity on ice accretion was documented for the Midspan and Outboard models for a limited number of test cases. The data suggest that

  2. Lessons from a Large-Scale Assessment: Results from Free Response Pre- and Post-testing in Electricity and Magnetism

    CERN Document Server

    Thacker, Beth; Chapagain, Ganesh; Rusuriye, Vanelet; Dulli, Hani

    2014-01-01

    As part of a large-scale assessment project at a large university, we administered weekly pre-tests and bi-weekly post-tests in the recitation sections of our introductory classes over four semesters from Spring 2010 through Fall 2011. The post-tests were administered as graded quizzes and were developed to assess problem solving, laboratory, calculational and conceptual skills that had been the focus of instruction in lab, lecture and recitation sessions. They were not comprehensive, but gave us 'snapshots' of students' abilities throughout the semester. They were used in conjunction with other forms of assessment, such as conceptual inventories, to give us a broader picture of the state of our undergraduate classes, recitation sections and laboratories. The written pre- and post-tests, which required students to show their work and explain their reasoning, yielded different information on students' skills than the conceptual inventories. On almost all of the questions, the students in the inquiry-based cour...

  3. Wind Tunnel Testing Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — NASA Ames Research Center is pleased to offer the services of our premier wind tunnel facilities that have a broad range of proven testing capabilities to customers...

  4. Climatic Environmental Test Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — RTTC has an extensive suite of facilities for supporting MIL-STD-810 testing, toinclude: Temperature/Altitude, Rapid Decompression, Low/High Temperature,Temperature...

  5. Urban Test Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — RTTC has access to various facilities for use in urban testing applications,including an agreement with the Hazardous Devices School (HDS): a restrictedaccess Urban...

  6. Toroid magnet test facility

    CERN Multimedia

    2002-01-01

    Because of its exceptional size, it was not feasible to assemble and test the Barrel Toroid - made of eight coils - as an integrated toroid on the surface, prior to its final installation underground in LHC interaction point 1. It was therefore decided to test these eight coils individually in a dedicated test facility.

  7. Multi-parameter decoupling and slope tracking control strategy of a large-scale high altitude environment simulation test cabin

    Institute of Scientific and Technical Information of China (English)

    Li Ke; Liu Wangkai; Wang Jun; Huang Yong; Liu Meng

    2014-01-01

    A large-scale high altitude environment simulation test cabin was developed to accurately control the temperatures and pressures encountered at high altitudes. The system was developed to provide slope-tracking dynamic control of the two parameters, temperature and pressure, and to overcome the control difficulties inherent in a large-inertia lag element within a complex control system composed of a turbine refrigeration device, a vacuum device and a liquid nitrogen cooling device. The system includes multi-parameter decoupling of the cabin itself to avoid damage to the air refrigeration turbine caused by improper operation. Based on analysis of the dynamic characteristics and modeling of the variations in temperature, pressure and rotation speed, an intelligent controller was implemented that combines decoupling and fuzzy arithmetic with an expert PID controller to control the test parameters through a decoupling and slope-tracking control strategy. The control system employs centralized management in an open industrial Ethernet architecture with an industrial computer at its core. The simulation and field debugging and running results show that this method solves the problems of the poor anti-interference performance typical of a conventional PID and of overshooting that can readily damage equipment. The steady-state characteristics meet the system requirements.
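
    A minimal sketch of the slope-tracking idea, in which each setpoint is advanced toward its target at a bounded rate before being handed to its (decoupled) control loop; the interface is a hypothetical simplification and does not reproduce the fuzzy/expert-PID controller described above:

```python
def ramp_setpoint(current_sp, target, max_rate, dt):
    """Advance the setpoint toward `target` by at most `max_rate` per second,
    so the controlled variable follows a prescribed slope rather than a step."""
    step = max_rate * dt
    if abs(target - current_sp) <= step:
        return target
    return current_sp + step if target > current_sp else current_sp - step

# Example (illustrative numbers): temperature and pressure setpoints ramped
# independently each control cycle, then fed to their respective controllers.
temp_sp = ramp_setpoint(current_sp=20.0, target=-55.0, max_rate=0.5, dt=1.0)
pres_sp = ramp_setpoint(current_sp=101.3, target=19.4, max_rate=0.2, dt=1.0)
```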

  8. Testing gravity on large scales by combining weak lensing with galaxy clustering using CFHTLenS and BOSS CMASS

    Science.gov (United States)

    Alam, Shadab; Miyatake, Hironao; More, Surhud; Ho, Shirley; Mandelbaum, Rachel

    2017-03-01

    We measure a combination of gravitational lensing, galaxy clustering and redshift-space distortions (RSDs) called EG. The quantity EG probes both parts of metric potential and is insensitive to galaxy bias and σ8. These properties make it an attractive statistic to test lambda cold dark matter, general relativity and its alternate theories. We have combined CMASS Data Release 11 with CFHTLenS and recent measurements of β from RSD analysis, and find EG(z = 0.57) = 0.42 ± 0.056, a 13 per cent measurement in agreement with the prediction of general relativity EG(z = 0.57) = 0.396 ± 0.011 using the Planck 2015 cosmological parameters. We have corrected our measurement for various observational and theoretical systematics. Our measurement is consistent with the first measurement of EG using cosmic microwave background lensing in place of galaxy lensing at small scales, but shows 2.8σ tension when compared with their final results including large scales. This analysis with future surveys will provide improved statistical error and better control over systematics to test general relativity and its alternate theories.
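
    For reference, the general-relativity prediction quoted above follows from the standard expression (with the usual growth-rate approximation)

$$E_G^{\mathrm{GR}}(z) = \frac{\Omega_{m,0}}{f(z)}, \qquad f(z) \simeq \Omega_m(z)^{0.55},$$

    so that with the Planck 2015 value $\Omega_{m,0} \approx 0.31$ one gets $\Omega_m(0.57) \approx 0.63$, $f \approx 0.78$ and $E_G \approx 0.40$, in line with the quoted $0.396 \pm 0.011$.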

  9. Surface-subsurface flow modeling: an example of large-scale research at the new NEON user facility

    Science.gov (United States)

    Powell, H.; McKnight, D. M.

    2009-12-01

    Theoretical constructs, such as the River Continuum Concept, that aim to elucidate general mechanistic underpinnings of freshwater ecosystem function via testable hypotheses about relative rates of photosynthesis and respiration, for example, may be readily examined using data collected at hourly time scales at the NEON facility once constructed. By taking advantage of NEON data and adding PI-driven research to the Observatory, we can further our understanding of the relative roles of water flow, nutrients, temperature, and light on freshwater ecosystem function and structure.

  10. Evaluation of Multiplex-Based Antibody Testing for Use in Large-Scale Surveillance for Yaws: a Comparative Study.

    Science.gov (United States)

    Cooley, Gretchen M; Mitja, Oriol; Goodhew, Brook; Pillay, Allan; Lammie, Patrick J; Castro, Arnold; Moses, Penias; Chen, Cheng; Ye, Tun; Ballard, Ronald; Martin, Diana L

    2016-05-01

    WHO has targeted yaws for global eradication by 2020. The program goals are to interrupt the transmission in countries where yaws is endemic and to certify countries as yaws free where yaws was endemic in the past. Absence of new rapid plasma reagin (RPR) seroreactivity in young children is required for certification of elimination at a country level. We sought to evaluate whether antibody responses to specific treponemal antigens measured in a high-throughput multiplex bead array (MBA) assay differentiate past versus current infection and whether a nontreponemal lipoidal antigen test can be incorporated into the MBA. Serum and dried blood spot specimens collected for yaws surveillance projects in Ghana, Vanuatu, and Papua New Guinea (PNG) were run on MBA to measure antibodies against recombinant p17 (rp17) and treponemal membrane protein A (TmpA) treponemal antigens. Results were compared to standard treponemal laboratory (TPPA or TPHA [TPP(H)A]) and quantitative RPR test data. Of 589 specimens, 241 were TPP(H)A(+)/RPR(+), 88 were TPP(H)A(+)/RPR(-), 6 were TPP(H)A(-)/RPR(+), and 254 were negative for both tests. Compared to TPP(H)A, reactive concordance of rp17 was 93.7%, while reactive concordance of TmpA was only 81.9%. TmpA-specific reactivity showed good correlation with RPR titers (R² = 0.41), whereas antibodies against the lipoidal antigen used in RPR testing (cardiolipin) were not detected in the MBA. Our results suggest that TmpA can be used as a treponemal antigen marker for recent or active infection and potentially replace RPR in a high-throughput multiplex tool for large-scale yaws surveillance.

  11. Large scale characterization of unsaturated soil properties in a semi-arid region combining infiltration, pedotransfer functions and evaporation tests

    Science.gov (United States)

    Shabou, Marouen; Angulo-Jaramillo, Rafael; Lassabatère, Laurent; Boulet, Gilles; Mougenot, Bernard; Lili Chabaane, Zohra; Zribi, Mehrez

    2016-04-01

    Water resource management is a major issue in semi-arid regions, especially where irrigated agriculture is dominant on soils with highly variable clay content. Indeed, topsoil clay content has a significant influence on infiltration and evaporation processes and therefore on the estimation of the volume of water needed for crops. In this poster we present several methods to estimate wilting point and field capacity volumetric water contents and saturated hydraulic conductivity of the Kairouan plain (680 km²), central Tunisia (North Africa). The first method relies on the Beerkan Estimation of Soil Transfer parameters (BEST) method, which consists in local estimation of unsaturated soil hydraulic properties from a single-ring infiltration test, combined with the use of pedotransfer functions applied to the different soil types of the Kairouan plain. Results are obtained over six different topsoil texture classes along the Kairouan plain. Saturated hydraulic conductivity is high for coarse-textured and some of the fine-textured soils due to a shrinkage-cracking macropore soil structure. The saturated hydraulic conductivity values are 1.31 × 10⁻⁵ m s⁻¹ and 1.71 × 10⁻⁵ m s⁻¹, respectively. The second method is based on evaporation tests on different test plots. It consists of analyzing soil moisture profile changes during the dry down periods to detect the time-to-stress, which can be obtained from observation of soil moisture variation, albedo measurements and variation of soil temperature. Results show that the parameters estimated with the evaporation method are close to those obtained by combining the BEST method and pedotransfer functions. The results validate that combining local infiltration tests and pedotransfer functions is a promising tool for the large scale hydraulic characterization of regions with strong spatial variability of soil properties.

  12. Absolute pitch among students at the Shanghai Conservatory of Music: a large-scale direct-test study.

    Science.gov (United States)

    Deutsch, Diana; Li, Xiaonuo; Shen, Jing

    2013-11-01

    This paper reports a large-scale direct-test study of absolute pitch (AP) in students at the Shanghai Conservatory of Music. Overall note-naming scores were very high, with high scores correlating positively with early onset of musical training. Students who had begun training at age ≤5 yr scored 83% correct not allowing for semitone errors and 90% correct allowing for semitone errors. Performance levels were higher for white key pitches than for black key pitches. This effect was greater for orchestral performers than for pianists, indicating that it cannot be attributed to early training on the piano. Rather, accuracy in identifying notes of different names (C, C#, D, etc.) correlated with their frequency of occurrence in a large sample of music taken from the Western tonal repertoire. There was also an effect of pitch range, so that performance on tones in the two-octave range beginning on Middle C was higher than on tones in the octave below Middle C. In addition, semitone errors tended to be on the sharp side. The evidence also ran counter to the hypothesis, previously advanced by others, that the note A plays a special role in pitch identification judgments.
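
    A minimal sketch of the two scoring rules described (exact note naming versus also accepting semitone errors), operating on pitch-class names only and ignoring octave and enharmonic spelling, is shown below; it is illustrative and is not the study's scoring procedure:

```python
NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

def score_absolute_pitch(responses, targets, allow_semitone=False):
    """Fraction of correctly named pitches; optionally count answers that are
    one semitone off (in either direction) as correct."""
    correct = 0
    for response, target in zip(responses, targets):
        distance = abs(NOTE_NAMES.index(response) - NOTE_NAMES.index(target))
        distance = min(distance, 12 - distance)   # wrap around the pitch-class circle
        if distance == 0 or (allow_semitone and distance == 1):
            correct += 1
    return correct / len(targets)

# Example: one exact hit, one semitone error.
print(score_absolute_pitch(['C', 'F#'], ['C', 'G']))                       # 0.5
print(score_absolute_pitch(['C', 'F#'], ['C', 'G'], allow_semitone=True))  # 1.0
```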

  13. Testing gravity on large scales by combining weak lensing with galaxy clustering using CFHTLenS and BOSS CMASS

    CERN Document Server

    Alam, Shadab; More, Surhud; Ho, Shirley; Mandelbaum, Rachel

    2016-01-01

    We measure a combination of gravitational lensing, galaxy clustering, and redshift-space distortions called $E_G$. The quantity $E_G$ probes both parts of metric potential and is insensitive to galaxy bias and $\sigma_8$. These properties make it an attractive statistic to test $\Lambda$CDM, General Relativity and its alternate theories. We have combined CMASS DR11 with CFHTLenS and recent measurements of $\beta$ from RSD analysis, and find $E_G(z = 0.57) = 0.42 \pm 0.056$, a 13\% measurement in agreement with the prediction of general relativity $E_G(z = 0.57) = 0.396 \pm 0.011$ using the Planck 2015 cosmological parameters. We have corrected our measurement for various observational and theoretical systematics. Our measurement is consistent with the first measurement of $E_G$ using CMB lensing in place of galaxy lensing (Pullen et al. 2015a) at small scales, but shows 2.8$\sigma$ tension when compared with their final results including large scales. This analysis with future surveys will provide improved ...

  14. A cryogenic test facility

    Science.gov (United States)

    Veenendaal, Ian

    The next generation, space-borne instruments for far infrared spectroscopy will utilize large diameter, cryogenically cooled telescopes in order to achieve unprecedented sensitivities. Low background, ground-based cryogenic facilities are required for the cryogenic testing of materials, components and subsystems. The Test Facility Cryostat (TFC) at the University of Lethbridge is a large volume, closed cycle, 4K cryogenic facility, developed for this purpose. This thesis discusses the design and performance of the facility and associated external instrumentation. An apparatus for measuring the thermal properties of materials is presented, and measurements of the thermal expansion and conductivity of carbon fibre reinforced polymers (CFRPs) at cryogenic temperatures are reported. Finally, I discuss the progress towards the design and fabrication of a demonstrator cryogenic, far infrared Fourier transform spectrometer.

  15. Large-scale experimental facility for emergency condition investigation of a new generation NPP WWER-640 reactor with passive safety systems

    Energy Technology Data Exchange (ETDEWEB)

    Aniskevich, Y.N.; Vasilenko, V.A.; Zasukha, V.K.; Migrov, Y.A.; Khabensky, V.B. [Research Inst. of Technology NITI (Russian Federation)

    1997-12-31

    The creation of the large-scale integral experimental facility (KMS) is called for by the programme of experimental investigations to justify the engineering decisions on the safety of the design of the new-generation NPP with the WWER-640 reactor. Construction of the KMS to its full scope will allow experimental investigation of practically all physical phenomena and processes occurring during accidents at NPPs with WWER-type reactors, including heat and mass exchange at low coolant flow rates, which is typical of passive safety system operation, processes during large-leak accidents, and the complex interconnected processes in the reactor unit, the passive safety systems and the containment under conditions of long-term heat removal to the ultimate heat sink. The KMS is being constructed at the Research Institute of Technology (NITI), Sosnovy Bor, Leningrad region, Russia. (orig.). 5 refs.

  16. Combined electron-beam and coagulation purification of molasses distillery slops. Features of the method, technical and economic evaluation of large-scale facility

    Energy Technology Data Exchange (ETDEWEB)

    Pikaev, A.K. E-mail: pikaev@ipc.rssi.ru; Ponomarev, A.V.; Bludenko, A.V.; Minin, V.N.; Elizar'eva, L.M.

    2001-04-01

    The paper summarizes the results obtained from the study of a combined electron-beam and coagulation method for the purification of molasses distillery slops from a distillery producing ethyl alcohol by fermentation of grain, potato, beet and some other plant materials. The method consists in preliminary mixing of industrial wastewater with municipal wastewater, electron-beam treatment of the mixture and subsequent coagulation. Technical and economic evaluation of a large-scale facility (output of 7000 m³ day⁻¹) with two powerful cascade electron accelerators (total maximum beam power of 400 kW) for treatment of the wastewater by the above method was carried out. It was calculated that the cost of purification of the wastes is equal to 0.25 US$ m⁻³, which is noticeably less than in the case of the existing method.

  17. Combined electron-beam and coagulation purification of molasses distillery slops. Features of the method, technical and economic evaluation of large-scale facility

    Science.gov (United States)

    Pikaev, A. K.; Ponomarev, A. V.; Bludenko, A. V.; Minin, V. N.; Elizar'eva, L. M.

    2001-04-01

    The paper summarizes the results obtained from the study of a combined electron-beam and coagulation method for the purification of molasses distillery slops from a distillery producing ethyl alcohol by fermentation of grain, potato, beet and some other plant materials. The method consists in preliminary mixing of industrial wastewater with municipal wastewater, electron-beam treatment of the mixture and subsequent coagulation. Technical and economic evaluation of a large-scale facility (output of 7000 m³ day⁻¹) with two powerful cascade electron accelerators (total maximum beam power of 400 kW) for treatment of the wastewater by the above method was carried out. It was calculated that the cost of purification of the wastes is equal to 0.25 US$ m⁻³, which is noticeably less than in the case of the existing method.

  18. Erosion and break-up of light-gas layers by a horizontal jet in a multi-vessel, large-scale containment test system

    Energy Technology Data Exchange (ETDEWEB)

    Zboray, Robert, E-mail: robert.zboray@psi.ch; Mignot, Guillaume; Kapulla, Ralf; Paladino, Domenico

    2015-09-15

    The distribution and eventual stratification of hydrogen released during a hypothetical severe accident and the stability of the stratification formed in the early phase of the transient is of particular safety concern in Light Water Reactors (LWRs). The large-scale containment test facility PANDA (PSI, Switzerland) has been used to perform a series of four tests examining the erosion and break-up of stratified light-gas layers in the frame of the OECD SETH-2 project. The ultimate goal of the test program is to set-up an experimental data base of high-quality and high-density data that can challenge and validate 3D containment codes like e.g. GOTHIC, GASFLOW or MARS and validate the applicability of CFD codes like FLUENT or CFX for LWR containment problems. The test series discussed here focuses on the erosion of a stratified, helium-rich layer by horizontal steam injection at different locations below the layer. An approach with step-wise increasing complexity has been chosen to examine this problem allowing control over the rate of pressure increase and the occurrence of condensation. The step-wise approach enables a thorough understanding of the influence of different phenomena like position of steam injection, diffusion, pressurization and condensation on the behavior and erosion of the stratified layer.

  19. Test facilities for VINCI®

    Science.gov (United States)

    Greuel, Dirk; Schäfer, Klaus; Schlechtriem, Stefan

    2013-09-01

    With the replacement of the current upper-stage ESC-A of the Ariane 5 launcher by an enhanced cryogenic upper-stage, ESA's Ariane 5 Midterm Evolution (A5-ME) program aims to raise the launcher's payload capacity in geostationary transfer orbit from 10 to 12 tons, an increase of 20 %. Increasing the in-orbit delivery capability of the A5-ME launcher requires a versatile, high-performance, evolved cryogenic upper-stage engine suitable for delivering multiple payloads to all kinds of orbits, ranging from low earth orbit to geostationary transfer orbit with increased perigee. In order to meet these requirements the re-ignitable liquid oxygen/liquid hydrogen expander cycle engine VINCI® currently under development is designated to power the future upper stage, featuring a design performance of 180 kN of thrust and 464 s of specific impulse. Since 2010 development tests for the VINCI® engine have been conducted at the test benches P3.2 and P4.1 at DLR test site in Lampoldshausen under the ESA A5-ME program. For the VINCI® combustion chamber development the P3.2 test facility is used, which is the only European thrust chamber test facility. Originally erected for the development of the thrust chamber of the Vulcain engine, in 2003 the test facility was modified so that today it is able to simulate vacuum conditions for the ignition and startup of the VINCI® combustion chamber. To maintain the test operations under vacuum conditions over an entire mission life of the VINCI® engine, including re-ignition following long and short coasting phases, between 2000 and 2005 the test facility P4.1 was completely rebuilt into a new high-altitude simulation facility. During the past two P4.1 test campaigns in 2010 and 2011 a series of important milestones were reached in the development of the VINCI® engine. In preparation for future activities within the frame of ESA's A5-ME program, DLR has already started the engineering of a stage test facility for the prospective upper stage

  20. Testing Students with Special Educational Needs in Large-Scale Assessments - Psychometric Properties of Test Scores and Associations with Test Taking Behavior.

    Science.gov (United States)

    Pohl, Steffi; Südkamp, Anna; Hardt, Katinka; Carstensen, Claus H; Weinert, Sabine

    2016-01-01

    Assessing competencies of students with special educational needs in learning (SEN-L) poses a challenge for large-scale assessments (LSAs). For students with SEN-L, the available competence tests may fail to yield test scores of high psychometric quality that are, at the same time, measurement invariant with respect to the test scores of general education students. We investigated whether we can identify a subgroup of students with SEN-L for which measurement-invariant competence measures of adequate psychometric quality may be obtained with tests available in LSAs. We furthermore investigated whether differences in test-taking behavior may explain unsatisfactory psychometric properties and measurement non-invariance of test scores within LSAs. We relied on person-fit indices and mixture distribution models to identify students with SEN-L for whom test scores with satisfactory psychometric properties and measurement invariance may be obtained. We also captured differences in test-taking behavior related to guessing and missing responses. As a result, we identified a subgroup of students with SEN-L for whom competence scores of adequate psychometric quality, measurement invariant to those of general education students, were obtained. Concerning test-taking behavior, a small number of students unsystematically picked response options; removing these students from the sample slightly improved item fit. Furthermore, two different patterns of missing responses were identified that explain, to some extent, problems in the assessment of students with SEN-L.
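
    As an illustration of the person-fit idea mentioned above, the sketch below computes the standardized log-likelihood statistic lz for the dichotomous Rasch model on toy data; both the choice of index and the data are assumptions made here for illustration, not the study's actual procedure.

    ```python
    # Minimal sketch (assumed formulation, not the study's code): the standardized
    # log-likelihood person-fit statistic l_z for the dichotomous Rasch model.
    # Strongly negative values flag response vectors that are unlikely under the
    # model, e.g. from unsystematic guessing.
    import numpy as np

    def lz_person_fit(x, theta, delta):
        """x: 0/1 item responses, theta: person ability, delta: item difficulties."""
        p = 1.0 / (1.0 + np.exp(-(theta - delta)))          # Rasch success probabilities
        l0 = np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
        mean = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
        var = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
        return (l0 - mean) / np.sqrt(var)

    delta = np.linspace(-2, 2, 20)                           # 20 items, easy to hard
    rng = np.random.default_rng(0)
    model_consistent = (rng.random(20) < 1 / (1 + np.exp(delta))).astype(int)  # theta = 0
    random_guesser = rng.integers(0, 2, 20)                  # picks options at random
    print(lz_person_fit(model_consistent, 0.0, delta))       # typically near 0
    print(lz_person_fit(random_guesser, 0.0, delta))         # typically negative
    ```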

  1. Large-scale laboratory testing of bedload-monitoring technologies: overview of the StreamLab06 Experiments

    Science.gov (United States)

    Marr, Jeffrey D.G.; Gray, John R.; Davis, Broderick E.; Ellis, Chris; Johnson, Sara; Gray, John R.; Laronne, Jonathan B.; Marr, Jeffrey D.G.

    2010-01-01

    A 3-month-long, large-scale flume experiment involving research and testing of selected conventional and surrogate bedload-monitoring technologies was conducted in the Main Channel at the St. Anthony Falls Laboratory under the auspices of the National Center for Earth-surface Dynamics. These experiments, dubbed StreamLab06, involved 25 researchers and volunteers from academia, government, and the private sector. The research channel was equipped with a sediment-recirculation system and a sediment-flux monitoring system that allowed continuous measurement of sediment flux in the flume and provided a data set against which the samplers were evaluated. Selected bedload-measurement technologies were tested under a range of flow and sediment-transport conditions. The experiment was conducted in two phases. The bed material in phase I was well-sorted siliceous sand (0.6-1.8 mm median diameter); a gravel mixture (1-32 mm median diameter) composed the bed material in phase II. Four conventional bedload samplers (a standard Helley-Smith, Elwha, BLH-84, and Toutle River II (TR-2) sampler) were manually deployed in both experiment phases. Bedload traps were deployed in phase II. Two surrogate bedload samplers, stationary-mounted down-looking 600 kHz and 1200 kHz acoustic Doppler current profilers, were also deployed in phase II. This paper presents an overview of the experiment, including the specific data-collection technologies used and the ambient hydraulic, sediment-transport, and environmental conditions measured as part of the experiment. All data collected as part of the StreamLab06 experiments are, or will be, available to the research community.

  2. Distributed Energy Resources Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — NREL's Distributed Energy Resources Test Facility (DERTF) is a working laboratory for interconnection and systems integration testing. This state-of-the-art facility...

  4. Testing gene-environment interaction in large-scale case-control association studies: possible choices and comparisons.

    Science.gov (United States)

    Mukherjee, Bhramar; Ahn, Jaeil; Gruber, Stephen B; Chatterjee, Nilanjan

    2012-02-01

    Several methods for screening gene-environment interaction have recently been proposed that address the issue of using gene-environment independence in a data-adaptive way. In this report, the authors present a comparative simulation study of power and type I error properties of 3 classes of procedures: 1) the standard 1-step case-control method; 2) the case-only method that requires an assumption of gene-environment independence for the underlying population; and 3) a variety of hybrid methods, including empirical-Bayes, 2-step, and model averaging, that aim at gaining power by exploiting the assumption of gene-environment independence and yet can protect against false positives when the independence assumption is violated. These studies suggest that, although the case-only method generally has maximum power, it has the potential to create substantial false positives in large-scale studies even when a small fraction of markers are associated with the exposure under study in the underlying population. All the hybrid methods perform well in protecting against such false positives and yet can retain substantial power advantages over standard case-control tests. The authors conclude that, for future genome-wide scans for gene-environment interactions, major power gain is possible by using alternatives to standard case-control analysis. Whether a case-only type scan or one of the hybrid methods should be used depends on the strength and direction of gene-environment interaction and association, the level of tolerance for false positives, and the nature of replication strategies.
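
    To make the contrast between the compared procedures concrete, the sketch below runs a standard case-control interaction test and a case-only test on the same simulated cohort. It is a minimal illustration using synthetic data and statsmodels, not the authors' analysis code.

    ```python
    # Minimal sketch (not the authors' code): standard case-control interaction
    # test vs. case-only test for a binary gene G and binary exposure E.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 20000
    G = rng.binomial(1, 0.3, n)
    E = rng.binomial(1, 0.4, n)                        # G and E independent here
    logit_p = -3.0 + 0.3 * G + 0.4 * E + 0.5 * G * E   # true interaction on log-odds scale
    D = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))
    df = pd.DataFrame({"G": G, "E": E, "D": D})

    # 1) Standard 1-step case-control test: interaction term in a logistic model.
    cc = smf.logit("D ~ G * E", data=df).fit(disp=False)
    print("case-control interaction:", cc.params["G:E"], "p =", cc.pvalues["G:E"])

    # 2) Case-only test: G-E association among cases only; valid only if G and E
    #    are independent in the underlying population (the assumption the hybrid
    #    methods relax in a data-adaptive way).
    co = smf.logit("E ~ G", data=df[df.D == 1]).fit(disp=False)
    print("case-only log-OR:", co.params["G"], "p =", co.pvalues["G"])
    ```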

  5. Pressurized burner test facility

    Energy Technology Data Exchange (ETDEWEB)

    Maloney, D.J.; Norton, T.S.; Hadley, M.A.

    1993-09-01

    The US Department of Energy's METC has recently completed construction and commissioning of a new high-pressure combustion research facility. Utilities servicing the facility enable combustion tests at scales up to 3 MW (10 MM Btu/h) and pressures in excess of 3000 kPa (30 atm). These include a preheated, high-pressure air supply that can deliver up to 1.7 kg/s (3.7 lbs/s) of combustion air, and a high-pressure, natural gas compressor that can deliver 0.8 kg/s (0.19 lbs/s). In the summer of 1994 METC's syngas generator is scheduled to come on line, at which time combustion tests on a range of fuel gases from low to medium to high heating values will be possible. The syngas generator will simulate a range of fuel gas compositions characteristic of coal gasification product streams. As part of the combustion facility, a high-pressure burner test facility is currently being constructed to support the development of gas turbine combustion systems fired on natural gas and coal-derived gaseous fuels containing fuel-bound nitrogen. The facility, illustrated in Figure 1, is a 61-centimeter (24-inch) diameter, refractory-lined vessel of modular construction, offering the flexibility to test a variety of NOx control concepts. Burner test modules are sandwiched between gas inlet and sampling plenums with a maximum combustion test zone of 2.2 m (90 inches) in length. Modules are custom designed for specific burners.

  6. National Solar Thermal Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The National Solar Thermal Test Facility (NSTTF) is the only test facility in the United States of its type. This unique facility provides experimental engineering...

  8. Modal analysis of measurements from a large-scale VIV model test of a riser in linearly sheared flow

    Science.gov (United States)

    Lie, H.; Kaasen, K. E.

    2006-05-01

    Large-scale model testing of a tensioned steel riser in well-defined sheared current was performed at Hanøytangen outside Bergen, Norway, in 1997. The length of the model was 90 m and the diameter was 3 cm. The aim of the present work is to revisit these data to improve the understanding of vortex-induced vibrations (VIV) for cases with a very high order of responding modes, and in particular to study whether, and under which circumstances, the riser motions are single-mode or multi-mode. The measurement system consisted of 29 biaxial gauges for bending moment. The signals are processed to yield curvature and displacement and further to identify modes of vibration. A modal approach is used successfully, employing a combination of signal filtering and least-squares fitting of precalculated mode shapes. As part of the modal analysis, it is demonstrated that the equally spaced instrumentation limits the maximum extractable mode number to the number of instrumentation locations. This imposed a constraint on the analysis of in-line (IL) vibration, which occurs at higher frequencies and involves higher modes than cross-flow (CF) vibration. The analysis has shown that, in general, the riser response was irregular (i.e. broad-banded) and that the degree of irregularity increases with the flow speed. In some tests distinct spectral peaks could be seen, corresponding to a dominating mode. No occurrences of single-mode response (lock-in) were seen. The IL response is more broad-banded than the CF response and contains higher frequencies. The average value of the displacement r.m.s. over the length of the riser is computed to indicate the magnitude of VIV motion during one test. In the CF direction the average displacement is typically 1/4 of the diameter, almost independent of the flow speed. For the IL direction the values are in the range 0.05-0.08 of the diameter. The peak frequency taken from the spectra of the CF displacement at the riser midpoint shows approximately
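
    The least-squares modal fit described above can be illustrated in a few lines. The sketch below uses sinusoidal mode shapes for a tensioned riser and synthetic sensor data; the values and the sensor layout are assumptions for illustration, not the experiment's actual processing chain.

    ```python
    # Minimal sketch (assumptions, not the paper's code): least-squares modal
    # decomposition of displacements measured at equally spaced sensors along a
    # tensioned riser, using precalculated sinusoidal mode shapes.
    import numpy as np

    L = 90.0                       # riser length [m], as in the Hanoytangen test
    n_sensors = 29                 # biaxial bending-moment gauges
    z = np.linspace(L / (n_sensors + 1), L * n_sensors / (n_sensors + 1), n_sensors)

    n_modes = 25                   # must not exceed the number of sensor locations
    Phi = np.array([np.sin(k * np.pi * z / L) for k in range(1, n_modes + 1)]).T

    # y: measured cross-flow displacement at the sensors for one time instant
    # (synthetic here: mode 12 dominant, amplitude ~0.25 diameters, plus noise)
    rng = np.random.default_rng(1)
    y = 0.25 * 0.03 * np.sin(12 * np.pi * z / L) + 1e-4 * rng.standard_normal(n_sensors)

    # Least-squares fit of modal weights; in practice this is repeated per time
    # step after band-pass filtering to obtain modal time histories.
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    print("dominant mode:", np.argmax(np.abs(w)) + 1)
    ```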

  9. Analyses of large scale tests addressing the performance of a containment cooler and its effect on gas distribution

    Energy Technology Data Exchange (ETDEWEB)

    Andreani, M.; Mignot, G. [Paul Scherrer Inst., Villigen (Switzerland)

    2011-07-01

    The performance of containment coolers and their effect on the hydrogen risk in the case of an accident with core overheating are issues that need to be addressed by means of simulation tools. Four tests performed in the PANDA facility within the OECD SETH-2 project provide a new database to evaluate the capability of the codes to predict the cooling effectiveness of a cooler and its effect on flow patterns and light gas distribution. All tests have been simulated with the GOTHIC code using a three-dimensional mesh and a rather detailed model for the cooler tube bundle. In general, the results obtained are in reasonable agreement with the data, although some major discrepancies have also been observed, which are mostly due to the limited detail permitted by the relatively coarse mesh adopted for all tests of the SETH-2 project. (author)

  10. The DWT power spectrum analysis of the large scale structure in the universe: Method and simulation tests

    Institute of Scientific and Technical Information of China (English)

    Yang, Xiaohu

    2001-01-01

    [1] Vogeley, M. S., Szalay, A. S., Eigenmode analysis of galaxy redshift surveys. I. Theory and methods, ApJ, 1996, 465: 34-53. [2] Fang, L. Z., Pando, J., Large-scale structures revealed by wavelet decomposition, The 5th Current Topics of Astrofundamental Physics (eds. Sanchez, N., Zichichi, A.), Singapore: World Scientific, 1997. [3] Pando, J., Fang, L. Z., Detecting the non-Gaussian spectrum of QSO's Lyα absorption line distribution, A&A, 1998, 340: 335-342. [4] Xu, W., Fang, L. Z., Deng, Z. G., Scale invariance of rich cluster abundance: A possible test for models of structure formation, ApJ, 1998, 508: 472-482. [5] Pando, J., Valls-Gabaud, D., Fang, L. Z., Evidence for scale-scale correlations in the cosmic microwave background radiation, PRL, 1998, 81: 4568-4571. [6] Feng, L. L., Fang, L. Z., Non-Gaussianity and the recovery of the mass power spectrum from the Lyα forest, ApJ, 2000, 535: 519-529. [7] Feng, L. L., Deng, Z. G., Fang, L. Z., Breaking degeneracy of dark matter models by the scale-scale correlations of galaxies, ApJ, 2000, 530: 53-61. [8] Fang, L. Z., Feng, L. L., Measuring the galaxy power spectrum and scale-scale correlations with multiresolution-decomposed covariance-I. Method, ApJ, 2000, 539: 9-22. [9] Tegmark, M., Hamilton, A. J. S., Vogeley, M. S. et al., Measuring the galaxy power spectrum with future redshift surveys, ApJ, 1998, 499: 555-576. [10] Bardeen, J. M., Bond, J. R., Kaiser, N. et al., The statistics of peaks of Gaussian random fields, ApJ, 1986, 304: 15-61. [11] Peacock, J. A., Dodds, S. J., Linear power spectrum of cosmological mass fluctuations, MNRAS, 1994, 267: 1020-1034. [12] White, S. D. M., Efstathiou, G., Frenk, C. S., The amplitude of mass fluctuations in the universe, MNRAS, 1993, 262: 1023-1028. [13] Peacock, J. A., Dodds, S. J., Non-linear evolution of cosmological power spectra, MNRAS, 1996, 280: L19-L26. [14] Loveday, J., Peterson, B. A., Efstathiou, G. et al., The

  11. Hot Hydrogen Test Facility

    Science.gov (United States)

    Swank, W. David; Carmack, Jon; Werner, James E.; Pink, Robert J.; Haggard, DeLon C.; Johnson, Ryan

    2007-01-01

    The core of a nuclear thermal rocket will operate at high temperatures and in hydrogen. One of the important parameters in evaluating the performance of a nuclear thermal rocket is the specific impulse, ISP. This quantity is proportional to the square root of the propellant's absolute temperature and inversely proportional to the square root of its molecular weight. Therefore, high-temperature hydrogen is the favored propellant of nuclear thermal rocket designers. Previous work has shown that one of the life-limiting phenomena for nuclear thermal rocket cores is mass loss of fuel to flowing hydrogen at high temperatures. The hot hydrogen test facility located at the Idaho National Laboratory (INL) is designed to test the suitability of different core materials in 2500°C hydrogen flowing at 1500 liters per minute. The facility is intended to test low-activity uranium-containing materials but is also suited for testing cladding and coating materials. In this first installment the facility is described. Automated data acquisition, flow and temperature control, vessel compatibility with various core geometries, and overall capabilities are discussed.
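
    For reference, the specific-impulse scaling invoked above can be written out explicitly; this is the standard ideal-nozzle relation from rocket propulsion texts, not an equation quoted from the paper:

    ```latex
    % Ideal-nozzle exhaust: Isp scales as sqrt(T_c / M)
    I_{\mathrm{sp}} \;\propto\; \sqrt{\frac{T_c}{\mathcal{M}}},
    \qquad
    I_{\mathrm{sp}} \;\approx\; \frac{1}{g_0}
    \sqrt{\frac{2\gamma}{\gamma-1}\,\frac{R_u T_c}{\mathcal{M}}
    \left[1-\left(\frac{p_e}{p_c}\right)^{(\gamma-1)/\gamma}\right]}
    ```

    with chamber temperature T_c, molecular weight M, ratio of specific heats γ, universal gas constant R_u, and nozzle pressure ratio p_e/p_c; this is why hot, low-molecular-weight hydrogen is the propellant of choice.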

  12. A Hybrid Approach Using an Artificial Bee Algorithm with Mixed Integer Programming Applied to a Large-Scale Capacitated Facility Location Problem

    Directory of Open Access Journals (Sweden)

    Guillermo Cabrera G.

    2012-01-01

    Full Text Available We present a hybridization of two different approaches applied to the well-known Capacitated Facility Location Problem (CFLP. The Artificial Bee algorithm (BA is used to select a promising subset of locations (warehouses which are solely included in the Mixed Integer Programming (MIP model. Next, the algorithm solves the subproblem by considering the entire set of customers. The hybrid implementation allows us to bypass certain inherited weaknesses of each algorithm, which means that we are able to find an optimal solution in an acceptable computational time. In this paper we demonstrate that BA can be significantly improved by use of the MIP algorithm. At the same time, our hybrid implementation allows the MIP algorithm to reach the optimal solution in a considerably shorter time than is needed to solve the model using the entire dataset directly within the model. Our hybrid approach outperforms the results obtained by each technique separately. It is able to find the optimal solution in a shorter time than each technique on its own, and the results are highly competitive with the state-of-the-art in large-scale optimization. Furthermore, according to our results, combining the BA with a mathematical programming approach appears to be an interesting research area in combinatorial optimization.
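
    The hybrid structure described above can be summarized in a short sketch. The code below is a heavily simplified illustration of the idea only: random subset proposals stand in for the bee search, and a greedy assignment stands in for the exact MIP over the restricted warehouse set. It is not the authors' implementation.

    ```python
    # Simplified sketch of the hybrid idea: a heuristic proposes subsets of
    # candidate warehouses, and only that subset is handed to a subproblem solver
    # for the restricted capacitated facility location problem.
    import numpy as np

    rng = np.random.default_rng(0)
    n_fac, n_cust, subset_size = 8, 15, 4
    fixed_cost = rng.uniform(50, 100, n_fac)
    capacity = rng.uniform(40, 60, n_fac)
    demand = rng.uniform(5, 15, n_cust)
    assign_cost = rng.uniform(1, 20, (n_fac, n_cust))

    def solve_restricted(open_set):
        """Greedy assignment to the cheapest open facility with remaining
        capacity (a stand-in for the exact MIP over the restricted set)."""
        cap = capacity.copy()
        total = fixed_cost[list(open_set)].sum()
        for j in np.argsort(-demand):                    # big customers first
            for i in sorted(open_set, key=lambda i: assign_cost[i, j]):
                if cap[i] >= demand[j]:
                    cap[i] -= demand[j]
                    total += assign_cost[i, j]
                    break
            else:
                return np.inf                            # infeasible subset
        return total

    best_subset, best_cost = None, np.inf
    for _ in range(200):                                 # "bee"-style subset search
        cand = tuple(rng.choice(n_fac, subset_size, replace=False))
        cost = solve_restricted(cand)
        if cost < best_cost:
            best_subset, best_cost = cand, cost
    print(best_subset, round(best_cost, 1))
    ```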

  13. Self-Assessments or Tests? Comparing Cross-National Differences in Patterns and Outcomes of Graduates' Skills Based on International Large-Scale Surveys

    Science.gov (United States)

    Humburg, Martin; van der Velden, Rolf

    2015-01-01

    In this paper an analysis is carried out whether objective tests and subjective self-assessments in international large-scale studies yield similar results when looking at cross-national differences in the effects of skills on earnings, and skills patterns across countries, fields of study and gender. The findings indicate that subjective skills…

  14. Fatigue in high speed aluminium craft: Evaluating a design methodology for estimating the fatigue life using large scale tests and full scale trials

    NARCIS (Netherlands)

    Drummen, I.; Schiere, M.; Tuitman, J.T.

    2013-01-01

    Within the VOMAS project, a methodology has been developed to estimate the fatigue life of high-speed aluminium crafts. This paper presents the large scale test and full scale trials which were done to acquire data for evaluating the developed methodology and presents results of this evaluation. Dur

  15. Modeling the thermal-hydrologic processes in a large-scale underground heater test in partially saturated fractured tuff

    Science.gov (United States)

    Birkholzer, J. T.; Tsang, Y. W.

    2000-02-01

    The Drift Scale Test (DST) is being conducted in an underground facility at Yucca Mountain, Nevada, to probe the coupled thermal, hydrological, mechanical, and chemical processes likely to occur in the fractured rock mass around a potential high-level nuclear waste repository. Thermal-hydrological processes in the DST have been simulated using a three-dimensional numerical model. The model incorporates the realistic test configuration and all available site-specific measurements pertaining to the thermal and hydrological properties of the unsaturated fractured tuff of the test block. The modeled predictions were compared to the extensive set of measured data collected in the first year of this 8-year-long test. The mean error between the predictions and measurement at 12 months of heating for over 1600 temperature sensors is about 2°C. Heat-pipe signature in the temperature data, indicating two-phase regions of liquid-vapor counterflow, is seen in both the measurements and simulated results. The redistribution of moisture content in the rock mass (resulting from vaporization and condensation) was probed by periodic air-injection testing and geophysical measurements. Good agreement also occurred between the model predictions and these measurements. The general agreement between predictions from the numerical simulations and the measurements of the thermal test indicates that our fundamental understanding of the coupled thermal-hydrologic processes at Yucca Mountain is sound. However, effects of spatial heterogeneity from discrete fractures that are observed in the temperature data are not matched by simulations from the numerical model, which treat the densely spaced fractures as a continuum.

  16. Concurrent Validity and Feasibility of Short Tests Currently Used to Measure Early Childhood Development in Large Scale Studies: Methodology and Results

    OpenAIRE

    Rubio-Codina, Marta; Araujo, María Caridad; Attanasio, Orazio P.; Grantham-McGregor, Sally

    2016-01-01

    In low- and middle-income countries (LMICs) measuring early childhood development (ECD) with standard tests in large scale surveys (i.e. evaluations of interventions) is difficult and expensive. Multi-dimensional screeners and single-domain tests ('short tests') are frequently used as alternatives. However, their validity in these circumstances is unknown. We examine the feasibility, reliability, and concurrent validity of three multi-dimensional screeners -the Ages and Stages Questionnaires ...

  17. TESLA Test Facility. Status

    Energy Technology Data Exchange (ETDEWEB)

    Aune, B. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)]; TESLA Collaboration

    1996-01-01

    The TESLA Test Facility (TTF), under construction at DESY by an international collaboration, is an R and D test bed for the superconducting option for future linear e+/e- colliders. It consists of an infrastructure to process and test the cavities and of a 500 MeV linac. The infrastructure has been installed and is fully operational. It includes a complex of clean rooms, an ultra-clean water plant, a chemical etching installation and an ultra-high vacuum furnace. The linac will consist of four cryo-modules, each containing eight 1 meter long nine-cell cavities operated at 1.3 GHz. The base accelerating field is 15 MV/m. A first injector will deliver a low charge per bunch beam, with the full average current (8 mA in pulses of 800 μs). A more powerful injector based on RF gun technology will ultimately deliver a beam with high charge and low emittance to allow measurements necessary to qualify the TESLA option and to demonstrate the possibility of operating a free electron laser based on the Self-Amplified-Spontaneous-Emission principle. Overview and status of the facility will be given. Plans for the future use of the linac are presented. (R.P.). 19 refs.

  18. CLIC Test Facility 3

    CERN Multimedia

    Kossyvakis, I; Faus-golfe, A; Nguyen, F

    2007-01-01

    The design of CLIC is based on a two-beam scheme, where short pulses of high power 30 GHz RF are extracted from a drive beam running parallel to the main beam. The 3rd generation CLIC Test Facility (CTF3) will demonstrate the generation of the drive beam with the appropriate time structure, the extraction of 30 GHz RF power from this beam, as well as acceleration of a probe beam with 30 GHz RF cavities. The project makes maximum use of existing equipment and infrastructure of the LPI complex, which became available after the closure of LEP.

  19. Results of Large-Scale Testing on Effects of Anti-Foam Agent on Gas Retention and Release

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Charles W.; Guzman-Leong, Consuelo E.; Arm, Stuart T.; Butcher, Mark G.; Golovich, Elizabeth C.; Jagoda, Lynette K.; Park, Walter R.; Slaugh, Ryan W.; Su, Yin-Fong; Wend, Christopher F.; Mahoney, Lenna A.; Alzheimer, James M.; Bailey, Jeffrey A.; Cooley, Scott K.; Hurley, David E.; Johnson, Christian D.; Reid, Larry D.; Smith, Harry D.; Wells, Beric E.; Yokuda, Satoru T.

    2008-01-03

    The U.S. Department of Energy (DOE) Office of River Protection’s Waste Treatment Plant (WTP) will process and treat radioactive waste that is stored in tanks at the Hanford Site. The waste treatment process in the pretreatment facility will mix both Newtonian and non-Newtonian slurries in large process tanks. Process vessels mixing non-Newtonian slurries will use pulse jet mixers (PJMs), air sparging, and recirculation pumps. An anti-foam agent (AFA) will be added to the process streams to prevent surface foaming, but may also increase gas holdup and retention within the slurry. The work described in this report addresses gas retention and release in simulants with AFA through testing and analytical studies. Gas holdup and release tests were conducted in a 1/4-scale replica of the lag storage vessel operated in the Pacific Northwest National Laboratory (PNNL) Applied Process Engineering Laboratory using a kaolin/bentonite clay and AZ-101 HLW chemical simulant with non-Newtonian rheological properties representative of actual waste slurries. Additional tests were performed in a small-scale mixing vessel in the PNNL Physical Sciences Building using liquids and slurries representing major components of typical WTP waste streams. Analytical studies were directed at discovering how the effect of AFA might depend on gas composition and predicting the effect of AFA on gas retention and release in the full-scale plant, including the effects of mass transfer to the sparge air. The work at PNNL was part of a larger program that included tests conducted at Savannah River National Laboratory (SRNL) that is being reported separately. SRNL conducted gas holdup tests in a small-scale mixing vessel using the AZ-101 high-level waste (HLW) chemical simulant to investigate the effects of different AFAs, their components, and of adding noble metals. Full-scale, single-sparger mass transfer tests were also conducted at SRNL in water and AZ-101 HLW simulant to provide data for PNNL

  20. Conditional sampling technique to test the applicability of the Taylor hypothesis for the large-scale coherent structures

    Science.gov (United States)

    Hussain, A. K. M. F.

    1980-01-01

    Comparisons of the distributions of large-scale structures in turbulent flow with distributions based on time-dependent signals from stationary probes and the Taylor hypothesis are presented. The study investigated the near field of a 7.62 cm circular air jet at a Reynolds number of 32,000, in which coherent structures were induced through small-amplitude controlled excitation and stable vortex pairing in the jet column mode. Hot-wire and X-wire anemometry were employed to establish phase-averaged spatial distributions of longitudinal and lateral velocities, coherent Reynolds stress and vorticity, background turbulent intensities, streamlines, and pseudo-stream functions. The Taylor hypothesis was used to calculate spatial distributions of the phase-averaged properties, with the results indicating that use of the local time-average velocity or the streamwise velocity produces large distortions.
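
    For reference, the Taylor (frozen-turbulence) hypothesis tested above converts time derivatives measured at a fixed probe into streamwise spatial derivatives through a convection velocity U_c; in its standard form (not specific to this study):

    ```latex
    \frac{\partial}{\partial x} \;\simeq\; -\,\frac{1}{U_c}\,\frac{\partial}{\partial t}
    ```

    The choice of U_c (local time-average velocity, instantaneous streamwise velocity, or the convection velocity of the coherent structures) is precisely what drives the distortions reported above.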

  2. LARGE SCALE GLAZED

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    World-famous architects today challenge the exposure of concrete in their architecture. It is my hope to be able to complement these. I try to develop new aesthetic potentials for concrete and ceramics, in large scales that have not been seen before in the ceramic area. It is expected to result...

  3. Data Analysis, Pre-Ignition Assessment, and Post-Ignition Modeling of the Large-Scale Annular Cookoff Tests

    Energy Technology Data Exchange (ETDEWEB)

    G. Terrones; F.J. Souto; R.F. Shea; M.W. Burkett; E.S. Idar

    2005-09-30

    In order to understand the implications that cookoff of plastic-bonded explosive-9501 could have on safety assessments, we analyzed the available data from the large-scale annular cookoff (LSAC) assembly series of experiments. In addition, we examined recent data regarding hypotheses about pre-ignition that may be relevant to post-ignition behavior. Based on the post-ignition data from Shot 6, which had the most complete set of data, we developed an approximate equation of state (EOS) for the gaseous products of deflagration. Implementation of this EOS into the multimaterial hydrodynamics computer program PAGOSA yielded good agreement with the inner-liner collapse sequence for Shot 6 and with other data, such as velocity interferometer system for any reflector and resistance wires. A metric to establish the degree of symmetry based on the concept of time of arrival to pin locations was used to compare numerical simulations with experimental data. Several simulations were performed to elucidate the mode of ignition in the LSAC and to determine the possible compression levels that the metal assembly could have been subjected to during post-ignition.

  4. Arc Heated Scramjet Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Arc Heated Scramjet Test Facility is an arc heated facility which simulates the true enthalpy of flight over the Mach number range of about 4.7 to 8 for free-jet...

  5. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at Pollard Auditorium, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Pugh, C.E.; Bass, B.R.; Keeney, J.A. [comps.] [Oak Ridge National Lab., TN (United States)]

    1993-10-01

    This report contains 40 papers that were presented at the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing held at the Pollard Auditorium, Oak Ridge, Tennessee, during the week of October 26-29, 1992. The papers are printed in the order of their presentation in each session and describe recent large-scale fracture (brittle and/or ductile) experiments, analyses of these experiments, and comparisons between predictions and experimental results. The goal of the meeting was to allow international experts to examine the fracture behavior of various materials and structures under conditions relevant to nuclear reactor components and operating environments. The emphasis was on the ability of various fracture models and analysis methods to predict the wide range of experimental data now available. The individual papers have been cataloged separately.

  6. A3 Altitude Test Facility

    Science.gov (United States)

    Dulreix, Lionel J.

    2009-01-01

    This slide presentation shows drawings, diagrams, and photographs of the A3 Altitude Test Facility. It includes a review of the A3 facility requirements and drawings of the various sections of the facility, including the engine deck and superstructure, test cell and thrust takeout, structure and altitude support systems, chemical steam generators, and the subscale diffuser. There are also pictures of the construction site and of the facility under construction. A schematic diagram of the A3 steam system is also shown.

  7. Development and performance of a large-scale, transonic turbine blade cascade facility for aerodynamic studies of merging coolant-mainstream flows

    Science.gov (United States)

    Al-Sayeh, Amjad Isaaf

    1998-11-01

    A new, large scale, linear cascade facility of turbine blades has been developed for the experimental exploration of the aerodynamic aspects of film cooling technology. Primary interest is in the mixing of the ejected coolant with the mainstream, at both subsonic and supersonic mainstream Mach numbers at the cascade exit. In order to achieve a spatial resolution adequate for the exploration of details on the scale of the coolant ejection holes, the cascade dimensions were maximized, within the limitations of the air supply system. The cascade contains four blades (three passages) with 14.05 cm axial chord, 17.56 cm span and a design total turning angle of 130.6 degrees. Exit Mach numbers range from 0.6 to 1.5 and Reynolds numbers from 0.5 to 1.5 million. The air supply system capacity allows run times up to five minutes at maximum flow rates. A coolant supply system has been built to deliver mixtures of SF6 and air to simulate coolant/mainstream density ratios up to 2. The cascade contains several novel features. A full-perimeter bleed slot upstream of the blades is used to remove the approach boundary layer from all four walls, to improve the degree of two-dimensionality. The exit flow is bounded by two adjustable tailboards that are hinged at the trailing edges and actuated to set the exit flow direction according to the imposed pressure ratio. The boards are perforated and subjected to mass removal near the blades, to minimize the undesirable reflection of shocks and expansion waves. A probe actuator is incorporated that allows continuous positioning of probes in the exhaust stream, in both the streamwise and pitchwise directions. Diagnostic methods include extensive surface pressure taps on the approach and exhaust ducts and on the blade surfaces. The large size permitted as many as 19 taps on the trailing edge itself. Shadowgraph and schlieren are available. A three-prong wake probe has been constructed to simultaneously measure total and static pressures

  8. Reducing Plug and Process Loads for a Large Scale, Low Energy Office Building: NREL's Research Support Facility; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, C.; Pless, S.; Sheppy, M.; Torcellini, P.

    2011-02-01

    This paper documents the design and operational plug and process load energy efficiency measures needed to allow a large-scale office building to reach ultra-high-efficiency building goals. The appendices of this document contain a wealth of documentation pertaining to plug and process load design in the RSF, including a list of the equipment that was selected for use.

  9. Research and Practice of the System of Computer-assisted Oral English Test on a Large Scale

    Institute of Scientific and Technical Information of China (English)

    Bo Jiang

    2012-01-01

    The establishment of a computer-assisted oral test system is one way to improve college students' English communication skills. This thesis gives a brief analysis of how to set up a scientific system in colleges, starting from the characteristics of oral activities, oral ability and its features, the patterns of English oral tests, and the abilities tested.

  10. Thermal Radiation Source Test Facility,

    Science.gov (United States)

    1984-01-01

    Key words: thermal radiation source; thermal test facility. Defense Nuclear Agency's Field Command, located at Kirtland AFB in New Mexico, has recently upgraded its thermal test facility, which is used to evaluate damage and survivability in a nuclear environment. The thermal test facility was first established in 1979 and used large

  11. Ultrasonic Nondestructive Evaluation of Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) During Large-Scale Load Testing and Rod Push-Out Testing

    Science.gov (United States)

    Johnston, Patrick H.; Juarez, Peter D.

    2016-01-01

    The Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) is a structural concept developed by the Boeing Company to address the complex structural design aspects associated with a pressurized hybrid wing body (HWB) aircraft configuration. The HWB has long been a focus of NASA's Environmentally Responsible Aviation (ERA) project, following a building-block approach to structures development, culminating with the testing of a nearly full-scale multi-bay box (MBB), representing a segment of the pressurized, non-circular fuselage portion of the HWB. PRSEUS is an integral structural concept wherein skins, frames, stringers and tear straps made of a variable number of layers of dry warp-knit carbon-fiber stacks are stitched together, then resin-infused and cured in an out-of-autoclave process. The PRSEUS concept has the potential for reducing the weight and cost and increasing the structural efficiency of transport aircraft structures. A key feature of PRSEUS is the damage-arresting nature of the stitches, which enables the use of fail-safe design principles. During the load testing of the MBB, ultrasonic nondestructive evaluation (NDE) was used to monitor several sites of intentional barely visible impact damage (BVID) as well as to survey the areas surrounding the failure cracks after final loading to catastrophic failure. The damage-arresting ability of PRSEUS was confirmed by the results of NDE. In parallel with the large-scale structural testing of the MBB, mechanical tests were conducted of the PRSEUS rod-to-overwrap bonds, as measured by pushing the rod axially from a short length of stringer.

  12. Submarine Escape Set Test Facilities

    Directory of Open Access Journals (Sweden)

    G.S.N. Murthy

    2009-07-01

    Full Text Available Submarine Escape Set (SES) is used by submariners to escape from a sunken submarine. This set caters to the breathing needs of the submariner under water until he reaches the surface. Evaluation of such life-saving equipment is of paramount importance. This paper describes the submarine escape set and the various constructional features and schedules of operation of the indigenously designed test facilities that can evaluate the SES. The test facility is divided into two parts: the reducer test facility and the breathing bag test facility. The equipment has been rigorously tested and accepted by the Indian Navy. Two such test facilities have been developed, one of which is installed at INS Satavahana, Visakhapatnam, and both are working satisfactorily.

  13. A Large-Scale Empirical Evaluation of Cross-Validation and External Test Set Validation in (Q)SAR.

    Science.gov (United States)

    Gütlein, Martin; Helma, Christoph; Karwath, Andreas; Kramer, Stefan

    2013-06-01

    (Q)SAR model validation is essential to ensure the quality of inferred models and to indicate future model predictivity on unseen compounds. Proper validation is also one of the requirements of regulatory authorities in order to accept the (Q)SAR model and to approve its use in real-world scenarios as an alternative testing method. However, at the same time, the question of how to validate a (Q)SAR model, in particular whether to employ variants of cross-validation or external test set validation, is still under discussion. In this paper, we empirically compare k-fold cross-validation with external test set validation. To this end, we introduce a workflow allowing us to realistically simulate the common problem setting of building predictive models for relatively small datasets. The workflow allows us to apply the built and validated models to large amounts of unseen data and to compare the performance of the different validation approaches. The experimental results indicate that cross-validation produces better-performing (Q)SAR models than external test set validation and reduces the variance of the results, while at the same time underestimating the performance on unseen compounds. The experimental results reported in this paper suggest that, contrary to current conception in the community, cross-validation may play a significant role in evaluating the predictivity of (Q)SAR models.
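
    The two validation schemes being compared can be made concrete with a short sketch. The example below uses scikit-learn on synthetic data; it only illustrates the generic procedures, not the paper's workflow, datasets, or learners.

    ```python
    # Minimal sketch (not from the paper): k-fold cross-validation vs. a single
    # held-out external test set for a simple (Q)SAR-like classifier.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score, train_test_split

    X, y = make_classification(n_samples=400, n_features=50, random_state=0)

    # External test set validation: one random split, evaluated once.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    ext_score = RandomForestClassifier(random_state=0).fit(X_tr, y_tr).score(X_te, y_te)

    # 10-fold cross-validation: every compound is used for testing exactly once.
    cv_scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=10)

    print(f"external test set accuracy: {ext_score:.3f}")
    print(f"10-fold CV accuracy: {cv_scores.mean():.3f} +/- {cv_scores.std():.3f}")
    ```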

  14. Controlling Guessing Bias in the Dichotomous Rasch Model Applied to a Large-Scale, Vertically Scaled Testing Program

    Science.gov (United States)

    Andrich, David; Marais, Ida; Humphry, Stephen Mark

    2016-01-01

    Recent research has shown how the statistical bias in Rasch model difficulty estimates induced by guessing in multiple-choice items can be eliminated. Using vertical scaling of a high-profile national reading test, it is shown that the dominant effect of removing such bias is a nonlinear change in the unit of scale across the continuum. The…
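
    For reference, the dichotomous Rasch model underlying the analysis gives the probability that person n answers item i correctly as a function of ability θ_n and difficulty δ_i (standard form, not reproduced from the article):

    ```latex
    P(X_{ni}=1 \mid \theta_n, \delta_i)
    \;=\; \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}
    ```

    Because the model contains no guessing parameter, successful guessing on hard multiple-choice items inflates observed success rates for low-ability examinees and tends to make those items appear easier than they are; removing that statistical bias is what produces the change in the unit of scale referred to above.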

  15. Large-Scale Liquid Hydrogen Tank Rapid Chill and Fill Testing for the Advanced Shuttle Upper Stage Concept

    Science.gov (United States)

    Flachbart, R. H.; Hedayat, A.; Holt, K. A.; Sims, J.; Johnson, E. F.; Hastings, L. J.; Lak, T.

    2013-01-01

    Cryogenic upper stages in the Space Shuttle program were prohibited primarily due to a safety risk of a 'return to launch site' abort. An upper stage concept addressed this concern by proposing that the stage be launched empty and filled using shuttle external tank residuals after the atmospheric pressure could no longer sustain an explosion. However, only about 5 minutes was allowed for tank fill. Liquid hydrogen testing was conducted within a near-ambient environment using the multipurpose hydrogen test bed 638.5 ft3 (18m3) cylindrical tank with a spray bar mounted longitudinally inside. Although the tank was filled within 5 minutes, chilldown of the tank structure was incomplete, and excessive tank pressures occurred upon vent valve closure. Elevated tank wall temperatures below the liquid level were clearly characteristic of film boiling. The test results have substantial implications for on-orbit cryogen transfer since the formation of a vapor film would be much less inhibited due to the reduced gravity. However, the heavy tank walls could become an asset in normal gravity testing for on-orbit transfer, i.e., if film boiling in a nonflight weight tank can be inhibited in normal gravity, then analytical modeling anchored with the data could be applied to reduced gravity environments with increased confidence.

  16. PANDA: A Multipurpose Integral Test Facility for LWR Safety Investigations

    Directory of Open Access Journals (Sweden)

    Domenico Paladino

    2012-01-01

    Full Text Available The PANDA facility is a large-scale, multicompartmental thermal hydraulic facility suited for investigations related to the safety of current and advanced LWRs. The facility is multipurpose, and the applications cover integral containment response tests, component tests, primary system tests, and separate effect tests. Experimental investigations carried out in the PANDA facility have been embedded in international projects, most of which were under the auspices of the EU and OECD and with the support of a large number of organizations (regulatory bodies, technical support organizations, national laboratories, electric utilities, industries) worldwide. The paper provides an overview of the research programs performed in the PANDA facility in relation to BWR containment systems and those planned for PWR containment systems.

  17. Introducing Distance and Measurement in General Relativity: Changes for the Standard Tests and the Cosmological Large-Scale

    Directory of Open Access Journals (Sweden)

    Crothers S. J.

    2005-10-01

    Full Text Available Relativistic motion in the gravitational field of a massive body is governed by the external metric of a spherically symmetric extended object. Consequently, any solution for the point-mass is inadequate for the treatment of such motions since it pertains to a fictitious object. I therefore develop herein the physics of the standard tests of General Relativity by means of the generalised solution for the field external to a sphere of incompressible homogeneous fluid.

  18. Rapid implementation of an integrated large-scale HIV counseling and testing, malaria, and diarrhea prevention campaign in rural Kenya.

    Directory of Open Access Journals (Sweden)

    Eric Lugada

    Full Text Available BACKGROUND: Integrated disease prevention in low resource settings can increase coverage, equity and efficiency in controlling high burden infectious diseases. A public-private partnership with the Ministry of Health, CDC, Vestergaard Frandsen and CHF International implemented a one-week integrated multi-disease prevention campaign. METHOD: Residents of Lurambi, Western Kenya were eligible for participation. The aim was to offer services to at least 80% of those aged 15-49. 31 temporary sites in strategically dispersed locations offered: HIV counseling and testing, 60 male condoms, an insecticide-treated bednet, a household water filter for women or an individual filter for men, and for those testing positive, a 3-month supply of cotrimoxazole and referral for follow-up care and treatment. FINDINGS: Over 7 days, 47,311 people attended the campaign with a 96% uptake of the multi-disease preventive package. Of these, 99.7% were tested for HIV (87% in the target 15-49 age group; 80% had previously never tested. 4% of those tested were positive, 61% were women (5% of women and 3% of men, 6% had median CD4 counts of 541 cell/µL (IQR; 356, 754. 386 certified counselors attended to an average 17 participants per day, consistent with recommended national figures for mass campaigns. Among women, HIV infection varied by age, and was more likely with an ended marriage (e.g. widowed vs. never married, OR.3.91; 95% CI. 2.87-5.34, and lack of occupation. In men, quantitatively stronger relationships were found (e.g. widowed vs. never married, OR.7.0; 95% CI. 3.5-13.9. Always using condoms with a non-steady partner was more common among HIV-infected women participants who knew their status compared to those who did not (OR.5.4 95% CI. 2.3-12.8. CONCLUSION: Through integrated campaigns it is feasible to efficiently cover large proportions of eligible adults in rural underserved communities with multiple disease preventive services simultaneously achieving

  19. Small-scale and large-scale testing of photo-electrochemically activated leaching technology in Aprelkovo and Delmachik Mines

    Science.gov (United States)

    Sekisov, AG; Lavrov, AYu; Rubtsov, YuI

    2017-02-01

    The paper gives a description of tests and trials of the technology of heap gold leaching from rebellious ore in Aprelkovo and Delmachik Mines. Efficiency of leaching flowsheets with the stage-wise use of activated solutions of different reagents, including active forms of oxygen, is evaluated. Carbonate-peroxide solutions are used at the first stage of leaching to oxidize sulfide and sulfide-arsenide ore minerals to recover iron and copper from them. The second stage leaching uses active cyanide solutions to leach encapsulated and disperse gold and silver.

  20. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as a control supporting tool for daily operation and performance prediction of central solar heating plants. Finally the CSHP technology is put into perspective with respect to alternatives, and a short discussion on the barriers and breakthrough of the technology is given....

  1. Large-scale thermal convection of viscous fluids in a faulted system: 3D test case for numerical codes

    Science.gov (United States)

    Magri, Fabien; Cacace, Mauro; Fischer, Thomas; Kolditz, Olaf; Wang, Wenqing; Watanabe, Norihiro

    2017-04-01

    In contrast to simple homogeneous 1D and 2D systems, no appropriate analytical solutions exist to test the onset of thermal convection against numerical models of complex 3D systems that account for variable fluid density and viscosity as well as permeability heterogeneity (e.g. the presence of faults). Owing to the importance of thermal convection for the transport of energy and minerals, the development of a benchmark test for density/viscosity-driven flow is crucial to ensure that the applied numerical models accurately simulate the physical processes at hand. The present study proposes a 3D test case for the simulation of thermal convection in a faulted system that accounts for temperature-dependent fluid density and viscosity. The linear stability analysis recently developed by Malkovsky and Magri (2016) is used to estimate the critical Rayleigh number above which thermal convection of viscous fluids is triggered. The numerical simulations are carried out using the finite element technique. OpenGeoSys (Kolditz et al., 2012) and Moose (Gaston et al., 2009) results are compared to those obtained using the commercial software FEFLOW (Diersch, 2014) to test the ability of widely applied codes to match both the critical Rayleigh number and the dynamical features of convective processes. The methodology and Rayleigh expressions given in this study can be applied to any numerical model that deals with 3D geothermal processes in faulted basins, such as the Tiberias Basin (Magri et al., 2016). References Kolditz, O., Bauer, S., Bilke, L., Böttcher, N., Delfs, J. O., Fischer, T., U. J. Görke, T. Kalbacher, G. Kosakowski, McDermott, C. I., Park, C. H., Radu, F., Rink, K., Shao, H., Shao, H.B., Sun, F., Sun, Y., Sun, A., Singh, K., Taron, J., Walther, M., Wang, W., Watanabe, N., Wu, Y., Xie, M., Xu, W., Zehner, B., 2012. OpenGeoSys: an open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THM/C) processes in porous media. Environmental
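
    For orientation, the onset criterion referred to above is usually cast in terms of a porous-medium (Lapwood-type) Rayleigh number. The classical constant-viscosity form for a horizontal layer heated from below is shown here as a reference point only; the study's own expressions additionally account for temperature-dependent viscosity:

    ```latex
    \mathrm{Ra} \;=\; \frac{\rho_f\, g\, \beta\, \Delta T\, k\, H}{\mu\, \kappa_m},
    \qquad \mathrm{Ra}_c \;=\; 4\pi^2 \;\approx\; 39.5
    ```

    where ρ_f is the fluid density, β the thermal expansivity, ΔT the temperature difference across the layer, k the permeability, H the layer thickness, μ the dynamic viscosity, and κ_m the effective thermal diffusivity of the saturated medium; convection sets in once Ra exceeds Ra_c.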

  2. Electromagnetic Interface Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Electromagnetic Interface Testing facility supports such testing as Emissions, Field Strength, Mode Stirring, EMP Pulser, 4 Probe Monitoring/Leveling System, and...

  3. Thermal energy storage testing facilities

    Science.gov (United States)

    Schoenhals, R. J.; Anderson, S. H.; Stevens, L. W.; Laster, W. R.; Elter, M. R.

    Development of a prototype testing facility for performance evaluation of electrically heated thermal energy storage units is discussed. Laboratory apparatus and test procedures are being evaluated by means of measurements and analysis. Testing procedures were improved, and test results were acquired for commercially available units. A 30 kW central unit and several smaller individual room-size units were tested.

  4. Testing the Large-Scale Environments of Cool-core and Noncool-core Clusters with Clustering Bias

    CERN Document Server

    Medezinski, Elinor; Coupon, Jean; Cen, Renyue; Gaspari, Massimo; Strauss, Michael A; Spergel, David N

    2016-01-01

    There is a well observed bimodality in X-ray astronomy between cool-core (CC) and non-cool-core (NCC) clusters, but the origin of this distinction is still largely unknown. Competing theories can be divided into internal (inside-out), in which internal physical processes transform or maintain the NCC phase, and external (outside-in), in which the cluster type is determined by its initial conditions, which in turn lead to different formation histories (i.e., assembly bias). We propose a new method that uses the relative assembly bias of CC to NCC clusters, as determined via the two-point cluster-galaxy cross-correlation function (CCF), to test whether formation history plays a role in determining their nature. We apply our method to 48 ACCEPT clusters, which have well resolved central entropies, and cross-correlate with the SDSS-III/BOSS LOWZ galaxy catalog. We find that the relative bias of NCC over CC clusters is $b = 1.42 \\pm 0.35$ ($1.6\\sigma$ different from unity). Our measurement is limited by the small ...

  5. Testing the Large-scale Environments of Cool-core and Non-cool-core Clusters with Clustering Bias

    Science.gov (United States)

    Medezinski, Elinor; Battaglia, Nicholas; Coupon, Jean; Cen, Renyue; Gaspari, Massimo; Strauss, Michael A.; Spergel, David N.

    2017-02-01

    There are well-observed differences between cool-core (CC) and non-cool-core (NCC) clusters, but the origin of this distinction is still largely unknown. Competing theories can be divided into internal (inside-out), in which internal physical processes transform or maintain the NCC phase, and external (outside-in), in which the cluster type is determined by its initial conditions, which in turn leads to different formation histories (i.e., assembly bias). We propose a new method that uses the relative assembly bias of CC to NCC clusters, as determined via the two-point cluster-galaxy cross-correlation function (CCF), to test whether formation history plays a role in determining their nature. We apply our method to 48 ACCEPT clusters, which have well resolved central entropies, and cross-correlate with the SDSS-III/BOSS LOWZ galaxy catalog. We find that the relative bias of NCC over CC clusters is b = 1.42 ± 0.35 (1.6σ different from unity). Our measurement is limited by the small number of clusters with core entropy information within the BOSS footprint, 14 CC and 34 NCC clusters. Future compilations of X-ray cluster samples, combined with deep all-sky redshift surveys, will be able to better constrain the relative assembly bias of CC and NCC clusters and determine the origin of the bimodality.
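
    Schematically, and assuming simple linear bias (a simplification made here, not spelled out in the abstract), the relative bias quoted above follows from the ratio of the two cluster-galaxy cross-correlation functions, since the galaxy bias and the matter correlation function cancel:

    ```latex
    \frac{\xi_{\mathrm{NCC}\times g}(r)}{\xi_{\mathrm{CC}\times g}(r)}
    \;\simeq\; \frac{b_{\mathrm{NCC}}\, b_g\, \xi_{mm}(r)}{b_{\mathrm{CC}}\, b_g\, \xi_{mm}(r)}
    \;=\; \frac{b_{\mathrm{NCC}}}{b_{\mathrm{CC}}} \;\equiv\; b_{\mathrm{rel}}
    ```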

  6. Large-scale test of the natural refuge strategy for delaying insect resistance to transgenic Bt crops.

    Science.gov (United States)

    Jin, Lin; Zhang, Haonan; Lu, Yanhui; Yang, Yihua; Wu, Kongming; Tabashnik, Bruce E; Wu, Yidong

    2015-02-01

    The "natural refuge strategy" for delaying insect resistance to transgenic cotton that produces insecticidal proteins from Bacillus thuringiensis (Bt) relies on refuges of host plants other than cotton that do not make Bt toxins. We tested this widely adopted strategy by comparing predictions from modeling with data from a four-year field study of cotton bollworm (Helicoverpa armigera) resistance to transgenic cotton producing Bt toxin Cry1Ac in six provinces of northern China. Bioassay data revealed that the percentage of resistant insects increased from 0.93% in 2010 to 5.5% in 2013. Modeling predicted that the percentage of resistant insects would exceed 98% in 2013 without natural refuges, but would increase to only 1.1% if natural refuges were as effective as non-Bt cotton refuges. Therefore, the results imply that natural refuges delayed resistance, but were not as effective as an equivalent area of non-Bt cotton refuges. The percentage of resistant insects with nonrecessive inheritance of resistance increased from 37% in 2010 to 84% in 2013. Switching to Bt cotton producing two or more toxins and integrating other control tactics could slow further increases in resistance.

  7. Carbon fiber plume sampling for large scale fire tests at Dugway Proving Ground. [fiber release during aircraft fires

    Science.gov (United States)

    Chovit, A. R.; Lieberman, P.; Freeman, D. E.; Beggs, W. C.; Millavec, W. A.

    1980-01-01

    Carbon fiber sampling instruments were developed: passive collectors made of sticky bridal veil mesh, and active instruments using a light emitting diode (LED) source. These instruments measured the number or number rate of carbon fibers released from carbon/graphite composite material when the material was burned in a 10.7 m (35 ft) dia JP-4 pool fire for approximately 20 minutes. The instruments were placed in an array suspended from a 305 m by 305 m (1000 ft by 1000 ft) Jacob's Ladder net held vertically aloft by balloons and oriented crosswind approximately 140 meters downwind of the pool fire. Three tests were conducted during which released carbon fiber data were acquired. These data were reduced and analyzed to obtain the characteristics of the released fibers including their spatial and size distributions and estimates of the number and total mass of fibers released. The results of the data analyses showed that 2.5 to 3.5 × 10^8 single carbon fibers were released during the 20 minute burn of 30 to 50 kg mass of initial, unburned carbon fiber material. The mass released as single carbon fibers was estimated to be between 0.1 and 0.2% of the initial, unburned fiber mass.

  8. The DWT power spectrum analysis of the large scale structure in the universe: Method and simulation tests

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on the discrete wavelet transformation (DWT), we present a pixelized method of estimating the power spectra of galaxy samples. Because wavelets are localized in both physical and wavenumber space, the DWT power spectrum equals the corresponding band average of the Fourier power spectrum. The DWT estimator is optimized in the sense that the spatial resolution adapts automatically to the perturbation wavelength being studied. Under the assumption of ergodicity, the spatial average of local DWT fluctuation modes provides a fair estimate of the ensemble average. We test DWT spectra of four typical cold dark matter (CDM) structure formation models with numerical simulations. To consider the influence of various observational effects on the DWT spectra, we introduce irregular survey geometries, a given sampling rate, radial selection effects and redshift distortion effects into our mock samples. The numerical results show that, owing to its local properties, the DWT spectrum is less affected by the sampling rate, survey geometry, and statistical ensemble fluctuations. With a fast wavelet decomposition algorithm, the DWT can be used to analyze large survey samples, which is of direct significance for precise measurement of the cosmological parameters from the next generation of galaxy redshift surveys.
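
    A minimal 1-D illustration of the band-power idea (using the PyWavelets package on a synthetic signal; the authors' estimator works on 3-D galaxy samples, so this is only a sketch of the principle): the mean squared detail coefficients at each decomposition level give a band-averaged power.

      import numpy as np
      import pywt

      # Synthetic 1-D "density" field: one long-wavelength mode plus noise.
      rng = np.random.default_rng(0)
      n = 4096
      x = np.arange(n)
      delta = np.sin(2 * np.pi * x / 256) + 0.3 * rng.standard_normal(n)

      # Discrete wavelet decomposition; detail coefficients at level j probe a
      # wavenumber band around k ~ pi / 2**j in grid units.
      coeffs = pywt.wavedec(delta, "db4", level=8)
      details = coeffs[1:]              # coeffs[0] is the coarsest approximation

      for j, d in enumerate(reversed(details), start=1):
          band_power = float(np.mean(d ** 2))   # band-averaged DWT power, level j
          print(f"level {j:2d} (k ~ pi/2^{j}): P_DWT = {band_power:.4f}")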

  9. Eta vs. sigma: review of past results, Gallus-Klemp test, and large-scale wind skill in ensemble experiments

    Science.gov (United States)

    Mesinger, Fedor; Veljovic, Katarina

    2017-01-01

    To determine the effect of switching between the eta and the sigma coordinate in numerical weather prediction involving topography, five sets of tests were performed. The eta version did better in all of them, particularly in precipitation scores and in more accurate placement of storms. However, a problem of flow separation in the lee of the bell-shaped topography discovered by Gallus and Klemp seemed to many to suggest that the eta coordinate is ill-suited for high-resolution models. Flow separation is shown not to occur following a refinement of the eta discretization. To identify a primary cause of the improvement in 250 hPa winds previously demonstrated by Eta ensemble members over their ECMWF driver members, ten of the Eta members were rerun switched to sigma. At a critical time, the Eta members in eta mode showed a tendency for more accurate tilt of a 250 hPa trough than the members run in sigma mode. The experiment was rerun for a more recent and higher resolution ECMWF ensemble, and for an increased number of members. The advantage of the Eta over ECMWF is seen again, even though this time the Eta resolution during the first 10 days of the experiment was about the same as that of the driver members. Rerunning the Eta ensemble switched to sigma again showed an advantage in the Eta/eta 250 hPa wind scores used, again associated with an upper-air trough's movement across the Rockies. Better positioning of lee lows ahead of these troughs in the Eta/eta configuration is suggested to make a significant contribution to its better precipitation scores. Implications of these experiments for regional climate modeling are discussed as well.

  10. EMI Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Supports electromagnetic interference/radio frequency interference (EMI/RFI) testing of flight hardware. It is also used to support custom RF testing up to...

  11. Stereotype Threat, Inquiring about Test Takers' Race and Gender, and Performance on Low-Stakes Tests in a Large-Scale Assessment. Research Report. ETS RR-15-02

    Science.gov (United States)

    Stricker, Lawrence J.; Rock, Donald A.; Bridgeman, Brent

    2015-01-01

    This study explores stereotype threat on low-stakes tests used in a large-scale assessment, math and reading tests in the Education Longitudinal Study of 2002 (ELS). Issues identified in laboratory research (though not observed in studies of high-stakes tests) were assessed: whether inquiring about their race and gender is related to the…

  12. Performance monitoring of large-scale autonomously healed concrete beams under four-point bending through multiple non-destructive testing methods

    Science.gov (United States)

    Karaiskos, G.; Tsangouri, E.; Aggelis, D. G.; Van Tittelboom, K.; De Belie, N.; Van Hemelrijck, D.

    2016-05-01

    Concrete is still the leading structural material due to its low production cost and great structural design flexibility. Although it is distinguished by high durability and compressive strength, it is vulnerable to a range of ambient and operational degradation factors which all too frequently result in crack formation that can adversely affect its mechanical performance. The autonomous healing system, using an encapsulated, expansive, polyurethane-based healing agent embedded in the concrete, is triggered by crack formation and propagation and promises material repair and an extension of operational service life. As shown in our previous studies, cracks formed on small-scale concrete beams are sealed and repaired by filling them with the healing agent. In the present study, crack formation and propagation in autonomously healed, large-scale concrete beams are thoroughly monitored through a combination of non-destructive testing (NDT) methods: ultrasonic pulse velocity (UPV) using embedded low-cost, aggregate-size piezoelectric transducers, acoustic emission (AE) and digital image correlation (DIC). The integrated ultrasonic, acoustic and optical monitoring system introduces an experimental configuration that detects and locates four-point bending fracture in large-scale concrete beams, detects the activation of the healing process and evaluates the subsequent concrete repair.
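
    A minimal sketch (invented path length and times of flight, not data from the paper) of the ultrasonic pulse velocity bookkeeping used with an embedded transducer pair, where a velocity drop flags cracking and a partial recovery indicates healing:

      # UPV between an embedded emitter/receiver pair: velocity = path / time.
      PATH_M = 0.25                              # assumed transducer spacing, m

      def upv(time_of_flight_us):
          return PATH_M / (time_of_flight_us * 1e-6)   # m/s

      # Hypothetical readings: intact -> cracked -> after autonomous healing.
      for label, tof_us in (("intact", 62.0), ("cracked", 95.0), ("healed", 70.0)):
          print(f"{label:8s} UPV ~ {upv(tof_us):5.0f} m/s")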

  13. The Integral Test Facility Karlstein

    Directory of Open Access Journals (Sweden)

    Stephan Leyer

    2012-01-01

    The Integral Test Facility Karlstein (INKA) test facility was designed and erected to test the performance of the passive safety systems of KERENA, the new AREVA Boiling Water Reactor design. The experimental program included single component/system tests of the Emergency Condenser, the Containment Cooling Condenser and the Passive Core Flooding System. Integral system tests, including also the Passive Pressure Pulse Transmitter, will be performed to simulate transients and Loss of Coolant Accident scenarios at the test facility. The INKA test facility represents the KERENA Containment with a volume scaling of 1 : 24. Component heights and levels are in full scale. The reactor pressure vessel is simulated by the accumulator vessel of the large valve test facility of Karlstein, a vessel with a design pressure of 11 MPa and a storage capacity of 125 m3. The vessel is fed by a Benson boiler with a maximum power supply of 22 MW. The INKA multi-compartment pressure suppression Containment meets the requirements of modern and existing BWR designs. As a result of the large power supply at the facility, INKA is capable of simulating various accident scenarios, including a full train of passive systems, starting with the initiating event, for example a pipe rupture.

  14. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
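
    A small illustration (not taken from the report) of the combinatorial explosion mentioned above, contrasted with a greedy nearest-neighbour association step that scales only polynomially and is adequate when targets are well separated:

      import math
      import numpy as np

      # One-to-one association of n detections with n tracks admits n! hypotheses,
      # which is what multi-hypothesis trackers must prune.
      for n in (5, 10, 20):
          print(f"{n} tracks / {n} detections: {math.factorial(n):.3e} hypotheses")

      # Greedy nearest-neighbour association: O(n*m) distance evaluations.
      def greedy_associate(tracks, detections):
          pairs, used = [], set()
          for ti, t in enumerate(tracks):
              d2 = [np.inf if di in used else float(np.sum((t - d) ** 2))
                    for di, d in enumerate(detections)]
              di = int(np.argmin(d2))
              used.add(di)
              pairs.append((ti, di))
          return pairs

      tracks = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
      detections = np.array([[9.5, 0.3], [0.2, -0.1], [0.1, 10.2]])
      print(greedy_associate(tracks, detections))   # [(0, 1), (1, 0), (2, 2)]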

  16. Analysis of long-term mechanical grooming on large-scale test panels coated with an antifouling and a fouling-release coating.

    Science.gov (United States)

    Hearin, John; Hunsucker, Kelli Z; Swain, Geoffrey; Stephens, Abraham; Gardner, Harrison; Lieberman, Kody; Harper, Michael

    2015-01-01

    Long-term grooming tests were conducted on two large-scale test panels, one coated with a fluorosilicone fouling-release (FR) coating, and one coated with a copper based ablative antifouling (AF) coating. Mechanical grooming was performed weekly or bi-weekly using a hand operated, electrically powered, rotating brush tool. The results indicate that weekly grooming was effective at removing loose or heavy biofilm settlement from both coatings, but could not prevent the permanent establishment of low-profile tenacious biofilms. Weekly grooming was very effective at preventing macrofouling establishment on the AF coating. The effectiveness of weekly grooming at preventing macrofouling establishment on the FR coating varied seasonally. The results suggest that frequent mechanical grooming is a viable method to reduce the fouling rating of ships' hulls with minimal impact to the coating. Frequent grooming could offer significant fuel savings while reducing hull cleaning frequencies and dry dock maintenance requirements.

  17. Low emissions combustor test facility

    Energy Technology Data Exchange (ETDEWEB)

    Maloney, D.J.; Hadley, M.S.; Norton, T.S.

    1993-12-01

    The Morgantown Energy Technology Center (METC) is in the process of constructing a low emissions combustor test and research (LECTR) facility designed to support the development of low emissions gas turbine combustion systems fired on natural gas and coal derived gaseous fuels containing fuel bound nitrogen. The LECTR facility is a major test station located within METC's new combustion facility. The heart of this test station is a 60 centimeter (24 inch) diameter, refractory lined pressure vessel made up of a series of flanged modules. The facility design offers the flexibility to test a variety of low emissions combustion concepts at pressures up to 3 MPa (30 atm). Upon completion of fabrication and shake-down testing in January of 1994, the facility will be available for use by industrial and university partners through Cooperative Research and Development Agreements (CRADAs) or through other cooperative arrangements. This paper is intended to describe the LECTR facility and associated operating parameter ranges and to inform interested parties of the facility availability.

  18. Solenoid Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Current Configuration: Accommodate a device under test up to 2.8 m diameter, 0.7 m height and 15,000 lbs. weight. Up to 10 g/s, 4.5 K helium flow. Up to 250 A test...

  19. LLNL superconducting magnets test facility

    Energy Technology Data Exchange (ETDEWEB)

    Manahan, R; Martovetsky, N; Moller, J; Zbasnik, J

    1999-09-16

    The FENIX facility at Lawrence Livermore National Laboratory was upgraded and refurbished in 1996-1998 for testing CICC superconducting magnets. The FENIX facility was used for superconducting high-current, short-sample tests for fusion programs in the late 1980s to early 1990s. The new facility includes a 4-m diameter vacuum vessel, two refrigerators, a 40 kA, 42 V computer-controlled power supply, a new switchyard with a dump resistor, a new helium distribution valve box, several sets of power leads, a data acquisition system and other auxiliary systems, which provide considerable flexibility in testing a wide variety of superconducting magnets over a wide range of parameters. The detailed parameters and capabilities of this test facility and its systems are described in the paper.

  20. Elevated Fixed Platform Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Elevated Fixed Platform (EFP) is a helicopter recovery test facility located at Lakehurst, NJ. It consists of a 60 by 85 foot steel and concrete deck built atop...

  1. Reverberant Acoustic Test Facility (RATF)

    Data.gov (United States)

    Federal Laboratory Consortium — The very large Reverberant Acoustic Test Facility (RATF) at the NASA Glenn Research Center (GRC), Plum Brook Station, is currently under construction and is due to...

  2. Freshwater Treatment and Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Freshwater Treatment and Test Facility, located at SANGB, has direct year-round access to water from Lake St. Clair and has a State of Michigan approved National...

  3. Test Facility for Volumetric Absorber

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, M.; Dibowski, G.; Pfander, M.; Sack, J. P.; Schwarzbozl, P.; Ulmer, S.

    2006-07-01

    Long-term testing of volumetric absorber modules is an indispensable step toward gaining the experience and reliability required for the commercialization of the open volumetric receiver technology. While solar tower test facilities are necessary for performance measurements of complete volumetric receivers, the long-term stability of individual components can be tested in less expensive test setups. To qualify the aging effects of operating cycles on single elements of new absorber materials and designs, a test facility was developed and constructed in the framework of the KOSMOSOL project. In order to provide the required level of concentrated solar radiation, the absorber test facility is integrated into a parabolic dish system at the Plataforma Solar de Almeria (PSA) in Spain. Several new designs of ceramic absorbers were developed and tested in recent months. (Author)

  4. Gamma Irradiation Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — DMEA has a unique total dose testing laboratory accredited by the American Association for Laboratory Accreditation (A2LA). The lab's two J.L. Shepherd...

  5. Ice Adhesion Testing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Uses: Evaluate and compare the relative performance of materials and surface coatings based on their ability to aid in ice removal; test the effectiveness of de-icing...

  6. Thermal energy storage testing facility

    Science.gov (United States)

    Schoenhals, R. J.; Lin, C. P.; Kuehlert, H. F.; Anderson, S. H.

    1981-03-01

    Development of a prototype testing facility for performance evaluation of electrically heated thermal energy storage units is described. Laboratory apparatus and test procedures were evaluated by means of measurements and analysis. A 30 kW central unit and several smaller individual room-size units were tested.

  7. Large-scale numerical simulations of star formation put to the test: Comparing synthetic images and actual observations for statistical samples of protostars

    CERN Document Server

    Frimann, Søren; Haugbølle, Troels

    2015-01-01

    (abridged) Context: Both observations and simulations of embedded protostars have progressed rapidly in recent years. Bringing them together is an important step in advancing our knowledge about the earliest phases of star formation. Aims: To compare synthetic continuum images and SEDs, created from large-scale numerical simulations, to observational studies, thereby aiding the interpretation of observations and testing the fidelity of the simulations. Methods: The radiative transfer code RADMC-3D is used to create synthetic continuum images and SEDs of protostellar systems in a large numerical simulation of a molecular cloud. More than 13000 unique radiative transfer models are produced of a variety of different protostellar systems. Results: Over the course of 0.76 Myr more than 500 protostars are formed in the simulation - primarily within two sub-clusters. Synthetic SEDs are used to calculate evolutionary tracers Tbol and Lsmm/Lbol. It is shown that, while the observed distributions of tracers are w...
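
    A hedged sketch of how the two evolutionary tracers are evaluated from a sampled SED, assuming the standard Myers & Ladd (1993) definition of Tbol and a made-up SED (the wavelengths and flux densities below are invented):

      import numpy as np

      def trapz(y, x):
          # simple trapezoid rule
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      # Hypothetical SED samples: wavelength (micron) and flux density S_nu (Jy).
      wav_um = np.array([3.6, 24.0, 70.0, 160.0, 350.0, 850.0, 1300.0])
      s_nu = np.array([0.01, 0.30, 2.50, 4.00, 2.00, 0.40, 0.10])

      c_um = 2.99792458e14                  # speed of light, micron/s
      nu = c_um / wav_um                    # frequencies, Hz
      order = np.argsort(nu)
      nu, s_nu = nu[order], s_nu[order]

      # Bolometric temperature: Tbol = 1.25e-11 K/Hz times the flux-weighted
      # mean frequency (Myers & Ladd 1993).
      t_bol = 1.25e-11 * trapz(nu * s_nu, nu) / trapz(s_nu, nu)

      # Lsmm/Lbol: fraction of the flux integral emitted longward of 350 micron.
      mask = nu <= c_um / 350.0
      l_ratio = trapz(s_nu[mask], nu[mask]) / trapz(s_nu, nu)

      print(f"Tbol ~ {t_bol:.0f} K   Lsmm/Lbol ~ {l_ratio:.3f}")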

  8. A radiant heating test facility for space shuttle orbiter thermal protection system certification

    Science.gov (United States)

    Sherborne, W. D.; Milhoan, J. D.

    1980-01-01

    A large-scale radiant heating test facility was constructed so that thermal certification tests can be performed on the new generation of thermal protection systems developed for the space shuttle orbiter. This facility simulates surface thermal gradients, on-orbit cold-soak temperatures down to 200 K, entry heating temperatures up to 1710 K in an oxidizing environment, and the dynamic entry pressure environment. The capabilities of the facility and the development of new test equipment are presented.

  9. Thermal energy storage test facility

    Science.gov (United States)

    Ternes, M. P.

    1980-01-01

    The thermal behavior of prototype thermal energy storage (TES) units in both heating and cooling modes is determined. Improved and advanced storage systems are developed and performance standards are proposed. The design and construction of a thermal cycling facility for determining the thermal behavior of full-scale TES units is described. The facility has the capability for testing with both liquid and air heat transport, at variable heat input/extraction rates, over a temperature range of 0 to 280°F.

  10. (abstract) Cryogenic Telescope Test Facility

    Science.gov (United States)

    Luchik, T. S.; Chave, R. G.; Nash, A. E.

    1995-01-01

    An optical test Dewar is being constructed with the unique capability to test mirrors of diameter ≤ 1 m and f-number ≤ 6 at temperatures from 300 K down to 4.2 K with a ZYGO Mark IV interferometer. The design and performance of this facility will be presented.

  11. Manual for operation of the multipurpose thermalhydraulic test facility TOPFLOW (Transient Two Phase Flow Test Facility); Betriebshandbuch fuer die Mehrzweck-Thermohydraulikversuchsanlage TOPFLOW

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, M.; Carl, H.; Schuetz, H.; Pietruske, H.; Lenk, S. [SAAS Systemanalyse und Automatisierungsservice GmbH, Possendorf (Germany)

    2004-07-01

    The Forschungszentrum Rossendorf (FZR) e. V. is constructing a new large-scale test facility, TOPFLOW, for thermal-hydraulic single-effect tests. The acronym stands for transient two-phase flow test facility. It will mainly be used for the investigation of generic and applied steady-state and transient two-phase flow phenomena and for the development and validation of models for computational fluid dynamics (CFD) codes. The manual of the test facility must always be available to the staff in the control room and is a binding requirement for operation of the facility by the personnel as well as for any reconstruction of the facility. (orig./GL)

  12. Similar Materials in Large-scale Shaking Table Model Test

    Institute of Scientific and Technical Information of China (English)

    邹威; 许强; 刘汉香

    2012-01-01

    On the basis of analyzing and summarizing recent research on the rock-like similar materials commonly used in physical simulation experiments in geotechnical engineering, similar materials for shaking table model tests were prepared from barite powder, quartz sand, gypsum and glycerol. Through mechanical tests of different material proportions, and taking the water content of the similar materials into account, the influence of the content of each constituent and of changes in water content on the physical and mechanical properties of the similar materials was studied and analyzed. Water content was found to have a strong influence on the compressive strength, cohesion, elastic modulus and friction angle of the materials. Based on the test results, the proportions of the similar material for the shaking table model test were finally fixed. The tests proved that the mechanical indices of this material are stable, that it satisfies the selection requirements for similar materials, and that it is convenient for casting relatively large model specimens in a single pour. It is therefore suitable for physical simulation experiments such as large-scale shaking table model tests.

  13. Implementation and Operational Research: Population-Based Active Tuberculosis Case Finding During Large-Scale Mobile HIV Testing Campaigns in Rural Uganda.

    Science.gov (United States)

    Ssemmondo, Emmanuel; Mwangwa, Florence; Kironde, Joel L; Kwarisiima, Dalsone; Clark, Tamara D; Marquez, Carina; Charlebois, Edwin D; Petersen, Maya L; Kamya, Moses R; Havlir, Diane V; Chamie, Gabriel

    2016-11-01

    Active tuberculosis (TB) screening outside clinics and in communities may reduce undiagnosed TB. To determine the yield of TB screening during community-based HIV testing campaigns (CHC) in 7 rural Ugandan communities within an ongoing cluster-randomized trial of universal HIV testing and treatment (SEARCH, NCT:01864603), we offered sputum microscopy to participants with prolonged cough (>2 weeks). We determined the number of persons needed to screen to identify one TB case, and the number of cases identified that linked to clinic and completed TB treatment. Of 36,785 adults enumerated in 7 communities, 27,214 (74%) attended CHCs, and HIV testing uptake was >99%, with 941 (3.5%) HIV-infected adults identified. Five thousand seven hundred eighty-six adults (21%) reported cough and 2876 (11%) reported cough >2 weeks. Staff obtained sputum in 1099/2876 (38%) participants with prolonged cough and identified 10 adults with AFB-positive sputum; 9 new diagnoses and 1 known case already under treatment. The number needed to screen to identify one new TB case was 3024 adults overall: 320 adults with prolonged cough and 80 HIV-infected adults with prolonged cough. All 9 newly diagnosed AFB+ participants were linked to TB care within 2 weeks and initiated TB treatment. In a rural Ugandan setting, TB screening as an adjunct to large-scale mobile HIV testing campaigns provides an opportunity to increase TB case detection.
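
    A quick arithmetic check of the number-needed-to-screen figures quoted above, using only the counts reported in the abstract:

      attended = 27214          # adults attending the community health campaigns
      prolonged_cough = 2876    # adults reporting cough for more than 2 weeks
      new_tb_cases = 9          # new AFB-positive diagnoses

      # Number needed to screen = people screened per new case found.
      print(round(attended / new_tb_cases))          # ~3024 adults overall
      print(round(prolonged_cough / new_tb_cases))   # ~320 adults with prolonged cough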

  14. Large-Scale Testing of Effects of Anti-Foam Agent on Gas Holdup in Process Vessels in the Hanford Waste Treatment Plant - 8280

    Energy Technology Data Exchange (ETDEWEB)

    Mahoney, Lenna A.; Alzheimer, James M.; Arm, Stuart T.; Guzman-Leong, Consuelo E.; Jagoda, Lynette K.; Stewart, Charles W.; Wells, Beric E.; Yokuda, Satoru T.

    2008-06-03

    The Hanford Waste Treatment Plant (WTP) will vitrify the radioactive wastes stored in underground tanks. These wastes generate and retain hydrogen and other flammable gases that create safety concerns for the vitrification process tanks in the WTP. An anti-foam agent (AFA) will be added to the WTP process streams. Prior testing in a bubble column and a small-scale impeller-mixed vessel indicated that gas holdup in a high-level waste chemical simulant with AFA was up to 10 times that in clay simulant without AFA. This raised a concern that major modifications to the WTP design or qualification of an alternative AFA might be required to satisfy plant safety criteria. However, because the mixing and gas generation mechanisms in the small-scale tests differed from those expected in WTP process vessels, additional tests were performed in a large-scale prototypic mixing system with in situ gas generation. This paper presents the results of this test program. The tests were conducted at Pacific Northwest National Laboratory in a ¼-scale model of the lag storage process vessel using pulse jet mixers and air spargers. Holdup and release of gas bubbles generated by hydrogen peroxide decomposition were evaluated in waste simulants containing an AFA over a range of Bingham yield stresses and gas generation rates. Results from the ¼-scale test stand showed that, contrary to the small-scale impeller-mixed tests, gas holdup in clay without AFA is comparable to that in the chemical waste simulant with AFA. The test stand, simulants, scaling and data-analysis methods, and results are described in relation to previous tests and anticipated WTP operating conditions.

  15. Utilisation of MSWI bottom ash as sub-base in road construction: first results from a large-scale test site.

    Science.gov (United States)

    Hjelmar, Ole; Holm, Jesper; Crillesen, Kim

    2007-01-31

    The preferred management option for municipal solid waste incinerator (MSWI) bottom ash in Denmark is utilisation rather than landfilling, but the current environmental quality criteria for bottom ash to be utilised in bulk quantities are rather strict. To evaluate the impact and risk assessments, upon which those criteria are based, a large-scale test site has been established. Three different MSWI bottom ashes have been used as sub-base in six test units ranging from 100 to 200 m2 with top covers of asphalt, flagstones and pebbles, respectively. All units, except one, are equipped with bottom liners and leachate collection equipment. The test site provides information on the leachate quality and quantity as a function of time under different conditions and on the flow pattern in asphalt and flagstone covered roads and squares with MSWI bottom ash sub-base. In addition, the leaching behaviour of the bottom ashes has been studied in the laboratory. The test site was established in October 2002 and the project is still ongoing. Water balance results indicate that the water flow distribution is strongly influenced by lateral flow on or in the upper part of the bottom ash layer and possibly by preferential flow. Comparisons between eluates from laboratory leaching tests on the bottom ashes and observations of the leachate from the site as a function of L/S show fairly good agreement for salts but less agreement for some trace elements. Most likely, this is partly due to the fact that the pH observed in the leachate from the field sites is lower than that observed in the eluates from the laboratory leaching tests.

  16. Data for generation of all Tables and Figures for CTEP publication in 2015 pertaining to large-scale diesel gensets tested

    Data.gov (United States)

    U.S. Environmental Protection Agency — particulate and gaseous emissions and particle optical properties for emissions from large-scale diesel gensets with and without aftermarket PM controls. This...

  17. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

    To establish planar biomimetic membranes across large-scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro-structured 8 x 8 aperture partition arrays with average aperture diameters of 301 ± 5 µm. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnologically and physiologically relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further developments of sensitive biosensor assays...

  18. Aircraft Test & Evaluation Facility (Hush House)

    Data.gov (United States)

    Federal Laboratory Consortium — The Aircraft Test and Evaluation Facility (ATEF), or Hush House, is a noise-abated ground test sub-facility. The facility's controlled environment provides 24-hour...

  19. Large scale cluster computing workshop

    Energy Technology Data Exchange (ETDEWEB)

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of a scale of 1000s of processors, used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) To determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed. (2) To compare and record experiences gained with such tools. (3) To produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP. (4) To identify and connect groups with similar interests within HENP and the larger clustering community.

  20. Evaluation of Interface Shear Strength Properties of Geogrid Reinforced Foamed Recycled Glass Using a Large-Scale Direct Shear Testing Apparatus

    Directory of Open Access Journals (Sweden)

    Arul Arulrajah

    2015-01-01

    The interface shear strength properties of geogrid-reinforced recycled foamed glass (FG) were determined using a large-scale direct shear test (DST) apparatus. Triaxial geogrid was used as the geogrid reinforcement. The geogrid increases the confinement of FG particles during shear; consequently the geogrid-reinforced FG exhibits smaller vertical displacement and dilatancy ratio than FG alone at the same normal stress. The failure envelopes of geogrid-reinforced FG at peak and critical states coincide and yield a unique linear line, possibly attributed to the crushing of FG particles and the rearrangement of crushed FG after the peak shear state. The interface shear strength coefficient α is approximately constant at 0.9. This value can be used as the interface parameter for designing a reinforced embankment and mechanically stabilized earth (MSE) wall when FG is used as a lightweight backfill and triaxial geogrid is used as an extensible earth reinforcement. This research will enable FG, recently assessed as suitable for lightweight backfills, to be used together with geogrids in a sustainable manner as a lightweight MSE wall. The geogrid carries tensile forces, while FG reduces bearing stresses imposed on the in situ soil. The use of geogrid-reinforced FG is thus significant from engineering, economical, and environmental perspectives.
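
    An illustrative calculation (the Mohr-Coulomb parameters below are made up, not the paper's measured envelopes) of how the interface shear strength coefficient α is obtained from large-scale direct shear results:

      import math

      def shear_strength(c_kpa, phi_deg, sigma_n_kpa):
          # Mohr-Coulomb: tau = c + sigma_n * tan(phi)
          return c_kpa + sigma_n_kpa * math.tan(math.radians(phi_deg))

      # Hypothetical envelopes: unreinforced FG vs the geogrid/FG interface.
      for sigma_n in (30.0, 60.0, 120.0):               # applied normal stress, kPa
          tau_fg = shear_strength(15.0, 40.0, sigma_n)  # FG alone
          tau_int = shear_strength(12.0, 38.0, sigma_n) # geogrid/FG interface
          print(f"sigma_n = {sigma_n:5.0f} kPa   alpha = {tau_int / tau_fg:.2f}")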

  1. Testing deviations from $\Lambda$CDM with growth rate measurements from 6 Large Scale Structure Surveys at $\mathbf{z=0.06}$ to 1

    CERN Document Server

    Alam, Shadab; Silvestri, Alessandra

    2015-01-01

    We use measurements from the Planck satellite mission and galaxy redshift surveys over the last decade to test three of the basic assumptions of the standard model of cosmology, $\Lambda$CDM: the spatial curvature of the universe, the nature of dark energy and the laws of gravity on large scales. We obtain improved constraints on several scenarios that violate one or more of these assumptions. We measure $w_0=-0.94\pm0.17$ (18% measurement) and $1+w_a=1.16\pm0.36$ (31% measurement) for models with a time-dependent equation of state, which is an improvement over current best constraints \citep{Aubourg2014}. In the context of modified gravity, we consider popular scalar-tensor models as well as a parametrization of the growth factor. In the case of one-parameter $f(R)$ gravity models with a $\Lambda$CDM background, we constrain $B_0 < 1.36 \times 10^{-5}$ (1$\sigma$ C.L.), which is an improvement by a factor of 4 on the current best \citep{XU2015}. We provide the very first constraint on the coupling para...

  2. Quality assurance for point-of-care testing of oral anticoagulation: a large-scale evaluation of the Hemochron Junior Signature Microcoagulation System.

    Science.gov (United States)

    Maddox, J M; Bogo, P H; McGregor, E; Pippard, M J; Kerr, R

    2009-04-01

    We report the first large-scale evaluation of the Hemochron Junior Signature (HJS) Microcoagulation System for community monitoring of oral anticoagulation and establishment of a programme of internal and external quality assurance. Over 1600 HJS results, with a simultaneous venous sample for central analysis, were obtained over a 19-month period. Monitoring of an initial period of HJS results (n = 135) revealed an International Normalized Ratio (INR) overestimation (mean +1.05), with only 27% of results within 0.5 of the central laboratory INR. A correction factor was introduced which reduced the INR bias to +0.07 and improved the percentage of results within 0.5 of the central laboratory INR to 76% (n = 353). A revised correction factor was later introduced to adjust for an underestimation at higher INR values. This changed the INR bias to -0.05, with 76% of results within 0.5 of the central laboratory INR (n = 1174). Local external quality assurance samples were distributed monthly with a total of 791 samples during the study period. Overall, 84% of test results were within 15% of the median value (range 73-97% per month). These results emphasize the value of a robust quality assurance programme when using point-of-care devices for community monitoring of oral anticoagulation.
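
    A minimal sketch (random synthetic pairs and a simple additive correction; the paper does not state the form of its correction factor) of how a point-of-care INR bias and the "within 0.5 INR" agreement statistic can be computed:

      import numpy as np

      # Hypothetical paired readings: point-of-care (POC) INR vs laboratory INR.
      rng = np.random.default_rng(1)
      lab_inr = rng.uniform(1.5, 4.5, 200)
      poc_inr = lab_inr + 1.0 + rng.normal(0.0, 0.3, 200)  # device reads ~1 INR high

      def agreement(device, lab):
          diff = device - lab
          return np.mean(diff), 100.0 * np.mean(np.abs(diff) <= 0.5)

      # Additive correction factor derived from the observed mean bias.
      correction = np.mean(poc_inr - lab_inr)
      corrected = poc_inr - correction

      b0, w0 = agreement(poc_inr, lab_inr)
      b1, w1 = agreement(corrected, lab_inr)
      print(f"before correction: bias = {b0:+.2f} INR, {w0:.0f}% within 0.5")
      print(f"after correction:  bias = {b1:+.2f} INR, {w1:.0f}% within 0.5")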

  3. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas. Surveys varied subject areas and reports on individual results of research in the field. Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  4. Reducing Data Center Loads for a Large-Scale, Low-Energy Office Building: NREL's Research Support Facility (Book)

    Energy Technology Data Exchange (ETDEWEB)

    Sheppy, M.; Lobato, C.; Van Geet, O.; Pless, S.; Donovan, K.; Powers, C.

    2011-12-01

    This publication details the design, implementation strategies, and continuous performance monitoring of NREL's Research Support Facility data center. Data centers are energy-intensive spaces that facilitate the transmission, receipt, processing, and storage of digital data. These spaces require redundancies in power and storage, as well as infrastructure, to cool computing equipment and manage the resulting waste heat (Tschudi, Xu, Sartor, and Stein, 2003). Data center spaces can consume more than 100 times the energy of standard office spaces (VanGeet 2011). The U.S. Environmental Protection Agency (EPA) reported that data centers used 61 billion kilowatt-hours (kWh) in 2006, which was 1.5% of the total electricity consumption in the U.S. (U.S. EPA, 2007). Worldwide, data centers now consume more energy annually than Sweden (New York Times, 2009). Given their high energy consumption and conventional operation practices, there is a potential for huge energy savings in data centers. The National Renewable Energy Laboratory (NREL) is world renowned for its commitment to green building construction. In June 2010, the laboratory finished construction of a 220,000-square-foot (ft²), LEED Platinum, Research Support Facility (RSF), which included a 1,900-ft² data center. The RSF will expand to 360,000 ft² with the opening of an additional wing in December 2011. The project's request for proposals (RFP) set a whole-building demand-side energy use requirement of a nominal 35 kBtu/ft² per year. On-site renewable energy generation will offset the annual energy consumption. To support the RSF's energy goals, NREL's new data center was designed to minimize its energy footprint without compromising service quality. Several implementation challenges emerged during the design, construction, and first 11 months of operation of the RSF data center. This document highlights these challenges and describes in detail how NREL successfully

  5. Performance of powder-filled evacuated panel insulation in a manufactured home roof cavity: Tests in the Large Scale Climate Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Petrie, T.W.; Kosny, J.; Childs, P.W.

    1996-03-01

    A full-scale section of half the top of a single-wide manufactured home has been studied in the Large Scale Climate Simulator (LSCS) at the Oak Ridge National Laboratory. A small roof cavity with little room for insulation at the eaves is often the case with single-wide units and limits practical ways to improve thermal performance. The purpose of the current tests was to obtain steady-state performance data for the roof cavity of the manufactured home test section when the roof cavity was insulated with fiberglass batts, blown-in rock wool insulation or combinations of these insulations and powder-filled evacuated panel (PEP) insulation. Four insulation configurations were tested: (A) a configuration with two layers of nominal R_US-7 h·ft²·°F/Btu (R_SI-1.2 m²·K/W) fiberglass batts; (B) a layer of PEPs and one layer of the fiberglass batts; (C) four layers of the fiberglass batts; and (D) an average 4.1 in. (10.4 cm) thick layer of blown-in rock wool at an average density of 2.4 lb/ft³ (38 kg/m³). Effects of additional sheathing were determined for Configurations B and C. With Configuration D over the ceiling, two layers of expanded polystyrene (EPS) boards, each about the same thickness as the PEPs, were installed over the trusses instead of the roof. Aluminum foils facing the attic and over the top layer of EPS were added. The top layer of EPS was then replaced by PEPs.

  6. Comparative evaluation of seven different sample treatment approaches for large-scale multiclass sport drug testing in urine by liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Domínguez-Romero, Juan C; García-Reyes, Juan F; Molina-Díaz, Antonio

    2014-09-26

    Sample preparation is a critical step in large-scale multiclass analysis such as sport drug testing. Due to the wide heterogeneity of the analytes and the complexity of the matrix, the selection of a correct sample preparation method is essential, seeking a compromise between good recoveries for most of the analytes and cleanliness of the extract. In the present work, seven sample preparation procedures based on solid-phase extraction (SPE) (with 5 different cartridges), liquid-liquid extraction (LLE) and sorbent-supported liquid extraction (SLE) were evaluated for multiclass sport drug testing in urine. The selected SPE sorbents were polymeric cartridges Agilent PLEXA™ and Oasis HLB™, mixed-mode cation and anion exchange cartridges Oasis MAX™ and MCX™, and C18 cartridges. LLE was performed using tert-butyl methyl ether and SLE was carried out using Agilent Chem Elut™ cartridges. To evaluate the proposed extraction procedures, a list of 189 compounds was selected as representative of different groups of doping agents, including 34 steroids, 14 glucocorticosteroids, 24 diuretics and masking agents, 11 stimulants, 9 beta-agonists, 16 beta-blockers, 6 Selective Estrogen Receptor Modulators (SERMs), 24 narcotics and 22 other drugs of abuse/sport drugs. Blank urine samples were spiked at two concentration levels, 2.5 and 25 μg L⁻¹, and extracted with the different extraction protocols (n=6). The analysis of the extracts was carried out by liquid chromatography electrospray time-of-flight mass spectrometry. The use of solid-phase extraction with polymer cartridges provided high recoveries for most of the analytes tested and was found to be the most suitable method for this type of application, given additional advantages such as low sample and solvent consumption along with increased automation and throughput.

  7. Dynamics of large-scale ionospheric inhomogeneities caused by a powerful radio emission of the Sura facility from the data collected onto ground-based GNSS network

    Science.gov (United States)

    Kogogin, D. A.; Nasyrov, I. A.; Grach, S. M.; Shindin, A. V.; Zagretdinov, R. V.

    2017-01-01

    The measurements of variations in the total electron content of the Earth's ionosphere along the GPS satellite signal propagation path are described. The signal parameters were measured at a network of receivers at three distant sites: Sura (Vasilsursk), Zelenodolsk, and Kazan. They are arranged along the geomagnetic latitude of the Sura facility used for short-wave radio irradiation of the ionosphere. One feature of the experiment is that, owing to the angular size of the Sura array pattern, the radio path between a GPS satellite and Vasilsursk crossed the disturbed region, whereas the radio paths between a GPS satellite and Zelenodolsk and between a GPS satellite and Kazan did not. Variations in the total electron content of up to 0.15-0.3 TECU were revealed at all three sites during four experimental campaigns (March 2010, March 2013, May 2013, and November 2013). The lateral scale of an ionospheric disturbance stimulated by a high-power radio wave and the velocity of its west-to-east propagation along the geomagnetic latitude were 30-60 km and 270-350 m/s, respectively. A decrease in the total electron content (down to 0.55 TECU) was recorded along the Kazan-Zelenodolsk-Vasilsursk line, which is connected with the solar terminator transit; the lateral scale of the related ionospheric inhomogeneities was 65-80 km.
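
    A hedged sketch of how slant total electron content is obtained from dual-frequency GPS code observations (the standard geometry-free combination; the pseudorange values below are invented, and this is not the authors' processing chain):

      # Dual-frequency GPS TEC from code pseudoranges P1, P2 (metres).
      F1 = 1575.42e6   # L1 carrier frequency, Hz
      F2 = 1227.60e6   # L2 carrier frequency, Hz

      def slant_tec_tecu(p1_m, p2_m):
          # STEC [el/m^2] = f1^2 f2^2 / (40.3 (f1^2 - f2^2)) * (P2 - P1)
          stec = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2)) * (p2_m - p1_m)
          return stec / 1.0e16          # 1 TECU = 1e16 electrons/m^2

      # Hypothetical pseudoranges differing by ~3.2 m of ionospheric delay.
      print(f"STEC ~ {slant_tec_tecu(22_000_000.0, 22_000_003.2):.1f} TECU")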

  8. Design and operation of an outdoor microalgae test facility

    Energy Technology Data Exchange (ETDEWEB)

    Weissman, J.C.; Tillett, D.M.; Goebel, R.P. (Microbial Products, Inc., Vacaville, CA (USA))

    1989-10-01

    The objective of the project covered in this report is to establish and operate a facility in the American Southwest to test the concept of producing microalgae on a large scale. The microalgae would then be used as a feedstock for producing liquid fuels. The site chosen for this project was an existing water research station in Roswell, New Mexico; the climate and water resources are representative of those in the Southwest. For this project, researchers tested specific designs, modes of operation, and strains of microalgae; proposed and evaluated modifications to technological concepts; and assessed the progress toward meeting cost objectives.

  9. Large-Scale Aquifer Test at the Bemidji, Minnesota, Oil-Spill Site: Implications for Modeling Multiphase Flow, Natural Attenuation, and LNAPL Remediation

    Science.gov (United States)

    Herkelrath, W. N.; Delin, G. N.

    2005-12-01

    A large-scale aquifer test was carried out at a crude oil spill site near Bemidji, Minnesota. The spill occurred in 1979 when a pipeline ruptured, spreading oil over a large area and creating three subsurface "pools" of high oil saturation near the water table. USGS scientists, in cooperation with researchers from several universities, have investigated the fate and transport of separate phase oil and hydrocarbons dissolved in ground water at this site since 1983. The primary goal of the aquifer test was to estimate parameters used in modeling processes such as subsurface flow of oil and water as well as natural attenuation of dissolved hydrocarbons in the plume. A secondary goal was to evaluate the effects of the oil on the parameters. Our aquifer test was carried out in July 2005 beneath the "north" oil pool, which occupies a 20x100 meter footprint. Prior to the test, the water table was about 6 meters below land surface, and the oil thickness in wells at the center of the pool was about 0.4 meters. A pumping well was installed near the center of the oil pool and screened 4-10 meters below the floating oil. During the test, water was pumped out at about 240 liters/min for 48 hours. Water levels were monitored in 21 wells that were screened below the water table and did not contain oil. Data loggers and pressure transducers were used to monitor 17 of these wells, and 4 wells were measured by hand using a tape. In 20 other wells that were screened at the water table and contained oil, depths to the oil-air and oil-water interfaces were monitored by hand using an oil-interface meter. Preliminary results indicate that oil thickness in wells within about 5 meters of the pumped well increased rapidly during the test to more than a meter. Oil also entered the top of the pumped well screen and filled the well bore to a thickness of about 3 meters. Preliminary analysis of water table drawdown vs. time data implies that the horizontal hydraulic conductivity is about 60 m
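
    A hedged sketch of one common way to turn drawdown-versus-time data into a hydraulic conductivity estimate, the Cooper-Jacob straight-line approximation (the time-drawdown record and saturated thickness below are invented, and this may not be the method the authors used):

      import numpy as np

      # Hypothetical observation-well record: time (min) and drawdown (m).
      t_min = np.array([10, 20, 40, 80, 160, 320, 640, 1280, 2880])
      s_m = np.array([0.050, 0.080, 0.110, 0.140, 0.170, 0.200, 0.230, 0.260, 0.295])

      Q = 240.0 / 1000.0 / 60.0      # pumping rate: 240 L/min -> m^3/s
      b = 10.0                       # assumed saturated thickness contributing, m

      # Cooper-Jacob: drawdown is linear in log10(t); the slope per log cycle
      # gives transmissivity T = 2.303 * Q / (4 * pi * slope), and K = T / b.
      slope, _ = np.polyfit(np.log10(t_min), s_m, 1)
      T = 2.303 * Q / (4.0 * np.pi * slope)
      K = T / b

      print(f"slope = {slope:.3f} m per log cycle")
      print(f"T ~ {T:.1e} m^2/s,  K ~ {K * 86400:.0f} m/day")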

  10. Large Scale Magnetostrictive Valve Actuator

    Science.gov (United States)

    Richard, James A.; Holleman, Elizabeth; Eddleman, David

    2008-01-01

    Marshall Space Flight Center's Valves, Actuators and Ducts Design and Development Branch developed a large-scale magnetostrictive valve actuator. The potential advantages of this technology are faster, more efficient valve actuators that consume less power, provide precise position control, and deliver higher flow rates than conventional solenoid valves. Magnetostrictive materials change dimensions when a magnetic field is applied; this property is referred to as magnetostriction. Magnetostriction is caused by the alignment of the magnetic domains in the material's crystalline structure with the applied magnetic field lines. Typically, the material changes shape by elongating in the axial direction and constricting in the radial direction, resulting in no net change in volume. All hardware fabrication and testing are complete. This paper will discuss the potential applications of the technology, give an overview of the as-built actuator design, describe problems that were uncovered during development testing, review the test data and evaluate weaknesses of the design, and discuss areas for improvement in future work. This actuator holds promise as a low-power, high-load, proportionally controlled actuator for valves requiring loads of 440 to 1500 newtons.

  11. Millimeter-wave Instrumentation Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Millimeter-wave Instrumentation Test Facility conducts basic research in propagation phenomena, remote sensing, and target signatures. The facility has a breadth...

  12. Thermal energy storage test facility

    Science.gov (United States)

    Ternes, M. P.

    1981-03-01

    Two loops making up the facility, using either air or liquid as the thermal transport fluid, are described. These loops will be capable of cycling residential-size thermal energy storage units through conditions simulating solar or off-peak electricity applications to evaluate the unit's performance. Construction of the liquid cycling loop was completed, and testing of thermal stratification techniques for hot and cold water is reported.

  13. Nondestructive testing and monitoring of stiff large-scale structures by measuring 3D coordinates of cardinal points using electronic distance measurements in a trilateration architecture

    Science.gov (United States)

    Parker, David H.

    2017-04-01

    By using three or more electronic distance measurement (EDM) instruments, such as commercially available laser trackers, in an unconventional trilateration architecture, 3-D coordinates of specialized retroreflector targets attached to cardinal points on a structure can be measured with an absolute uncertainty of less than one part per million. For example, the 3-D coordinates of a structure within a 100 meter cube can be measured within a volume of a 0.1 mm cube (the thickness of a sheet of paper). Relative dynamic movements, such as vibrations at 30 Hz, are typically measured 10 times better, i.e., within a 0.01 mm cube. Measurements of such accuracy open new areas for nondestructive testing and finite element model confirmation of stiff, large-scale structures, such as buildings, bridges, cranes, boilers, tank cars, nuclear power plant containment buildings, post-tensioned concrete, and the like, by measuring the response to applied loads, changes over the life of the structure, or changes following an accident, fire, earthquake, modification, etc. The sensitivity of these measurements makes it possible to measure parameters such as linearity, hysteresis, creep, symmetry, damping coefficient, and the like. For example, cracks exhibit a highly non-linear response when strains are reversed from compression to tension. Because the measurements are 3-D, unexpected movements, such as transverse motion produced by an axial load, could give an indication of an anomaly, such as an asymmetric crack or material property in a beam, delamination of concrete, or other asymmetry due to failures. Details of the specialized retroreflector are included.
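
    A minimal sketch of the trilateration step itself (synthetic station geometry and simulated distances, not the instrument's software): with three or more measured distances, the target coordinates follow from a Gauss-Newton least-squares adjustment.

      import numpy as np

      # Known EDM instrument positions (m) and a "true" target used for simulation.
      stations = np.array([[0.0, 0.0, 0.0],
                           [100.0, 0.0, 0.0],
                           [0.0, 100.0, 0.0],
                           [50.0, 50.0, 30.0]])
      target_true = np.array([40.0, 60.0, 10.0])

      # Simulated distance observations with 10-micrometre noise.
      rng = np.random.default_rng(2)
      d_obs = np.linalg.norm(stations - target_true, axis=1) + rng.normal(0, 1e-5, 4)

      # Gauss-Newton trilateration from a crude initial guess.
      x = np.array([10.0, 10.0, 0.0])
      for _ in range(10):
          diff = x - stations
          d_calc = np.linalg.norm(diff, axis=1)
          J = diff / d_calc[:, None]          # Jacobian of distances w.r.t. x
          r = d_obs - d_calc                  # residuals
          dx, *_ = np.linalg.lstsq(J, r, rcond=None)
          x = x + dx
          if np.linalg.norm(dx) < 1e-9:
              break

      print("estimated position (m):", np.round(x, 6))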

  14. A large-scale solar greenhouse dryer using polycarbonate cover: Modeling and testing in a tropical environment of Lao People's Democratic Republic

    Energy Technology Data Exchange (ETDEWEB)

    Janjai, Serm; Intawee, Poolsak; Kaewkiew, Jinda; Sritus, Chanoke [Solar Energy Research Laboratory, Department of Physics, Faculty of Science, Silpakorn University, Nakhon Pathom 73000 (Thailand); Khamvongsa, Vathsana [Department of Physics, Faculty of Natural Science, National University of Laos, P O Box 7322, Vientiane (Lao People' s Democratic Republic)

    2011-03-15

    A large-scale solar greenhouse dryer with a loading capacity of 1000 kg of fruits or vegetables has been developed and tested at field levels. The dryer has a parabolic shape and is covered with polycarbonate sheets. The base of the dryer is a black concrete floor with an area of 7.5 x 20.0 m². Nine DC fans powered by three 50-W solar cell modules are used to ventilate the dryer. The dryer was installed at Champasak (15.13 N, 105.79 E) in Lao People's Democratic Republic (Lao PDR). It is routinely used to dry chilli, banana and coffee. To assess the experimental performances of the dryer, air temperature, air relative humidity and product moisture contents were measured. One thousand kilograms of banana with the initial moisture content of 68% (wb) was dried within 5 days, compared to 7 days required for natural sun drying under the same weather conditions. Also three hundred kilograms of chilli with the initial moisture content of 75% (wb) was dried within 3 days while the natural sun drying needed 5 days. Two hundred kilograms of coffee with the initial moisture content of 52% (wb) was dried within 2 days as compared to 4 days required for natural sun drying. The chilli, coffee and banana dried in this dryer were completely protected from insects, animals and rain. Furthermore, good quality of dried products was obtained. The payback period of the dryer is estimated to be 2.5 years. A system of partial differential equations describing heat and moisture transfer during drying of chilli, coffee and banana in the greenhouse dryer was developed. These equations were solved by using the finite difference method. The simulated results agree well with the experimental data. This model can be used to provide the design data for this type of dryer in other locations. (author)
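
    A much-simplified illustration of the time stepping involved (a lumped thin-layer drying law advanced with an explicit Euler step; the rate constant, equilibrium moisture content, and duration are invented, and this is not the authors' full heat-and-moisture PDE system):

      # Thin-layer drying: dM/dt = -k (M - Me), with M on a dry basis.
      def dry(m0_wb, me_db=0.05, k_per_h=0.06, hours=120, dt=0.1):
          m = m0_wb / (1.0 - m0_wb)                 # wet basis -> dry basis
          history = [(0.0, m0_wb)]
          for i in range(1, int(hours / dt) + 1):
              m += -k_per_h * (m - me_db) * dt      # explicit finite-difference step
              history.append((i * dt, m / (1.0 + m)))   # store wet-basis value
          return history

      for t, m_wb in dry(m0_wb=0.68)[::240]:        # report once per 24 h
          print(f"t = {t:5.1f} h   moisture ~ {m_wb * 100:4.1f} % (wb)")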

  15. The GALATEA Test-facility

    Science.gov (United States)

    Abt, I.; Doenmez, B.; Garbini, L.; Irlbeck, S.; Palermo, M.; Schulz, O.

    GALATEA is a test facility designed to study the properties of germanium detectors in detail. It is a powerful high-precision tool to investigate bulk and surface effects in germanium detectors. A vacuum tank houses an infrared-screened volume with a cooled detector inside. A system of three stages allows a complete scan of the detector. At the moment, a 19-fold segmented germanium detector is under investigation. The main feature of GALATEA is that there is no material between source and detector. This allows the usage of alpha and beta sources as well as of a laser beam to study surface effects. The experimental setup is described.

  16. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential risk to the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.
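
    A hedged sketch of the nondimensional heat release rate and a flame height estimate, using the widely quoted Heskestad correlation (ambient properties and the assumed heat release per unit area are illustrative values, and this correlation is not necessarily the one fitted in the report):

      import math

      RHO = 1.2       # ambient air density, kg/m^3
      CP = 1.0        # specific heat of air, kJ/(kg*K)
      T_AMB = 293.0   # ambient temperature, K
      G = 9.81        # gravitational acceleration, m/s^2

      def q_star(q_dot_kw, d_m):
          # Nondimensional heat release rate for a pool fire of diameter d.
          return q_dot_kw / (RHO * CP * T_AMB * math.sqrt(G * d_m) * d_m ** 2)

      def flame_height_heskestad(q_dot_kw, d_m):
          # Heskestad correlation: L/D = 3.7 * Q*^(2/5) - 1.02.
          return d_m * (3.7 * q_star(q_dot_kw, d_m) ** 0.4 - 1.02)

      for d in (21.0, 81.0):                        # the two test diameters, m
          q_dot = 2000.0 * math.pi * d ** 2 / 4.0   # assumed ~2 MW/m^2 heat release
          print(f"D = {d:4.0f} m   Q* = {q_star(q_dot, d):.2f}   "
                f"L ~ {flame_height_heskestad(q_dot, d):.0f} m")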

  17. Interaction of a cumulus cloud ensemble with the large-scale environment. III - Semi-prognostic test of the Arakawa-Schubert cumulus parameterization

    Science.gov (United States)

    Lord, S. J.

    1982-01-01

    The verification of the Arakawa and Schubert (1974) cumulus parameterization is continued using a semiprognostic approach. Observed data from Phase III of GATE are used to provide estimates of the large-scale forcing of a cumulus ensemble at each observation time. Instantaneous values of the precipitation and the warming and drying due to cumulus convection are calculated using the parameterization. The results show that the calculated precipitation agrees very well with estimates from the observed large-scale moisture budget and from radar observations. The calculated vertical profiles of cumulus warming and drying are also quite similar to those observed. It is shown that the closure assumption adopted in the parameterization (the cloud-work function quasi-equilibrium) results in errors of generally less than 10% in the calculated precipitation. The sensitivity of the parameterization to some assumptions of the cloud ensemble model and to the solution method for the cloud-base mass flux is investigated.

  18. Safeguards instruments for Large-Scale Reprocessing Plants

    Energy Technology Data Exchange (ETDEWEB)

    Hakkila, E.A. [Los Alamos National Lab., NM (United States); Case, R.S.; Sonnier, C. [Sandia National Labs., Albuquerque, NM (United States)

    1993-06-01

    Between 1987 and 1992 a multi-national forum known as LASCAR (Large Scale Reprocessing Plant Safeguards) met to assist the IAEA in development of effective and efficient safeguards for large-scale reprocessing plants. The US provided considerable input for safeguards approaches and instrumentation. This paper reviews and updates instrumentation of importance in measuring plutonium and uranium in these facilities.

  19. Facility for testing ice drills

    Science.gov (United States)

    Nielson, Dennis L.; Delahunty, Chris; Goodge, John W.; Severinghaus, Jeffery P.

    2017-05-01

    The Rapid Access Ice Drill (RAID) is designed for subsurface scientific investigations in Antarctica. Its objectives are to drill rapidly through ice, to core samples of the transition zone and bedrock, and to leave behind a borehole observatory. These objectives required the engineering and fabrication of an entirely new drilling system that included a modified mining-style coring rig, a unique fluid circulation system, a rod skid, a power unit, and a workshop with areas for the storage of supplies and consumables. An important milestone in fabrication of the RAID was the construction of a North American Test (NAT) facility where we were able to test drilling and fluid processing functions in an environment that is as close as possible to that expected in Antarctica. Our criteria for site selection were that the area should be cold during the winter months, be located in an area of low heat flow, and be at relatively high elevation. We selected a site for the facility near Bear Lake, Utah, USA. The general design of the NAT well (NAT-1) started with a 27.3 cm (10.75 in.) outer casing cemented in a 152 m deep hole. Within that casing, we hung a 14 cm (5.5 in.) casing string, and, within that casing, a column of ice was formed. The annulus between the 14 and 27.3 cm casings provided the path for circulation of a refrigerant. After in-depth study, we chose to use liquid CO2 to cool the hole. In order to minimize the likelihood of the casing splitting due to the volume increase associated with freezing water, the hole was first cooled and then ice was formed in increments from the bottom upward. First, ice cubes were placed in the inner liner and then water was added. Using this method, a column of ice was incrementally prepared for drilling tests. The drilling tests successfully demonstrated the functioning of the RAID system. Reproducing such a facility for testing of other ice drilling systems could be advantageous to other research programs in the future.

  20. Large Scale Dynamos in Stars

    Science.gov (United States)

    Vishniac, Ethan T.

    2015-01-01

    We show that a differentially rotating conducting fluid automatically creates a magnetic helicity flux with components along the rotation axis and in the direction of the local vorticity. This drives a rapid growth in the local density of current helicity, which in turn drives a large scale dynamo. The dynamo growth rate derived from this process is not constant, but depends inversely on the large scale magnetic field strength. This dynamo saturates when buoyant losses of magnetic flux compete with the large scale dynamo, providing a simple prediction for magnetic field strength as a function of Rossby number in stars. Increasing anisotropy in the turbulence produces a decreasing magnetic helicity flux, which explains the flattening of the B/Rossby number relation at low Rossby numbers. We also show that the kinetic helicity is always a subdominant effect. There is no kinematic dynamo in real stars.

  1. Large-scale circuit simulation

    Science.gov (United States)

    Wei, Y. P.

    1982-12-01

    The simulation of VLSI (Very Large Scale Integration) circuits falls beyond the capabilities of conventional circuit simulators like SPICE. On the other hand, conventional logic simulators can only give the results of logic levels 1 and 0, with the attendant loss of detail in the waveforms. The aim of developing large-scale circuit simulation is to bridge the gap between conventional circuit simulation and logic simulation. This research investigates new approaches for fast and relatively accurate time-domain simulation of MOS (Metal Oxide Semiconductor), LSI (Large Scale Integration) and VLSI circuits. New techniques and new algorithms are studied in the following areas: (1) analysis sequencing, (2) nonlinear iteration, (3) the modified Gauss-Seidel method, and (4) latency criteria and the timestep control scheme. The developed methods have been implemented in a simulation program, PREMOS, which can be used as a design verification tool for MOS circuits.
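
    The abstract only names the techniques, so the following is a hedged toy illustration rather than the PREMOS implementation: a Gauss-Seidel relaxation of a small linear nodal system G·v = i with a crude latency test that skips nodes whose voltages have stopped changing. The conductance matrix and injected currents are made-up values.

```python
# Toy Gauss-Seidel nodal solver with a crude latency check:
# nodes whose last update fell below a tolerance are skipped ("latent")
# until another node changes again. Values are illustrative only.
import numpy as np

G = np.array([[ 3.0, -1.0, -1.0],       # nodal conductance matrix (S), made up
              [-1.0,  4.0, -2.0],
              [-1.0, -2.0,  5.0]])
i = np.array([1.0, 0.0, 2.0])           # injected currents (A), made up

v = np.zeros(3)
latent = np.zeros(3, dtype=bool)
tol = 1e-9

for sweep in range(200):
    max_delta = 0.0
    for k in range(3):
        if latent[k]:
            continue                    # skip latent (inactive) nodes
        # Gauss-Seidel update: solve row k for v[k] using the latest values
        v_new = (i[k] - G[k] @ v + G[k, k] * v[k]) / G[k, k]
        delta = abs(v_new - v[k])
        v[k] = v_new
        if delta < tol:
            latent[k] = True            # node considered inactive for now
        else:
            latent[:] = False           # a change wakes up all nodes again
        max_delta = max(max_delta, delta)
    if max_delta < tol and latent.all():
        break

print("node voltages:", v)
```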

  2. GERDA test facilities in Munich

    Energy Technology Data Exchange (ETDEWEB)

    Jelen, M.; Abt, I.; Caldwell, A.; Liu Jing; Kroeninger, K.; Lenz, D.; Liu Xiang; Majorovits, B.; Schubert, J. [Max-Planck-Inst. fuer Physik, Muenchen (Germany)

    2007-07-01

    The GERDA (Germanium Detector Array) experiment is designed to search for neutrinoless double-beta decay of {sup 76}Ge. Germanium detectors enriched in {sup 76}Ge will be submerged in pure liquid argon. The cryogenic liquid is used as a cooling liquid for the detectors and as shielding against gamma radiation. Several test facilities are currently under construction at the MPI Munich. Prototype Germanium detectors are tested under conditions close to the experimental setup of GERDA. Detector parameters are determined in a specialized vacuum test stand as well as directly in liquid argon. A new vacuum test stand named Galatea is under construction. It will be used to expose germanium detectors to {alpha}- and {beta}-particles and to study their response to surface events. This yields information about dead layers and the response to surface contamination. (orig.)

  3. Survey of solar thermal test facilities

    Energy Technology Data Exchange (ETDEWEB)

    Masterson, K.

    1979-08-01

    The facilities that are presently available for testing solar thermal energy collection and conversion systems are briefly described. Facilities that are known to meet ASHRAE standard 93-77 for testing flat-plate collectors are listed. The DOE programs and test needs for distributed concentrating collectors are identified. Existing and planned facilities that meet these needs are described and continued support for most of them is recommended. The needs and facilities that are suitable for testing components of central receiver systems, several of which are located overseas, are identified. The central contact point for obtaining additional details and test procedures for these facilities is the Solar Thermal Test Facilities Users' Association in Albuquerque, N.M. The appendices contain data sheets and tables which give additional details on the technical capabilities of each facility. Also included is the 1975 Aerospace Corporation report on test facilities that is frequently referenced in the present work.

  4. Very Large Scale Integration (VLSI).

    Science.gov (United States)

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  5. Electromagnetic Interference (EMI) and TEMPEST Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Electromagnetic Interference (EMI), Electromagnetic Compatibility (EMC) and TEMPEST testing are conducted at EPG's Blacktail Canyon Test Facility in one of its two...

  6. The New LOTIS Test Facility

    Science.gov (United States)

    Bell, R. M.; Cuzner, G.; Eugeni, C.; Hutchison, S. B.; Merrick, A. J.; Robins, G. C.; Bailey, S. H.; Ceurden, B.; Hagen, J.; Kenagy, K.; Martin, H. M.; Tuell, M.; Ward, M.; West, S. C.

    2008-01-01

    The Large Optical Test and Integration Site (LOTIS) at the Lockheed Martin Space Systems Company in Sunnyvale, CA is designed for the verification and testing of optical systems. The facility consists of an 88 foot temperature-stabilized vacuum chamber that also functions as a class 10k vertical-flow cleanroom. Many problems were encountered in the design and construction phases; the industry capability to build large chambers is very weak. Through many delays and extra engineering efforts, the final product is very good. With 11 Thermal Conditioning Units and precision RTDs, the temperature is uniform and stable within 1 °F, providing an ideal environment for precision optical testing. Within this chamber, atop an advanced micro-g vibration-isolation bench, are the 6.5 meter diameter LOTIS Collimator and Scene Generator and the LOTIS alignment and support equipment. The optical payloads are also placed on the vibration bench in the chamber for testing. This optical system is designed to operate in both air and vacuum, providing test imagery in an adaptable suite of visible/near infrared (VNIR) and midwave infrared (MWIR) point sources, and combined-bandwidth visible-through-MWIR point sources, for testing of large-aperture optical payloads. The heart of the system is the LOTIS Collimator, a 6.5 m f/15 telescope, which projects scenes with wavefront errors <85 nm rms out to a 0.75 mrad field of view (FOV). Using field lenses, performance can be extended to a maximum field of view of 3.2 mrad. The LOTIS Collimator incorporates an extensive integrated wavefront sensing and control system to verify the performance of the system.

  7. Solar Thermal Propulsion Test Facility

    Science.gov (United States)

    1999-01-01

    Researchers at the Marshall Space Flight Center (MSFC) have designed, fabricated, and tested the first solar thermal engine, a non-chemical rocket engine that produces lower thrust but has better thrust efficiency than a chemical combustion engine. MSFC turned to solar thermal propulsion in the early 1990s due to its simplicity, safety, low cost, and commonality with other propulsion systems. Solar thermal propulsion works by acquiring and redirecting solar energy to heat a propellant. This photograph shows a fully assembled solar thermal engine placed inside the vacuum chamber at the test facility prior to testing. The 20- by 24-ft heliostat mirror (not shown in this photograph) has a dual-axis control that keeps a reflection of the sunlight on the 18-ft diameter concentrator mirror, which then focuses the sunlight to a 4-in focal point inside the vacuum chamber. The focal point has 10 kilowatts of intense solar power. As part of MSFC's Space Transportation Directorate, the Propulsion Research Center serves as a national resource for research of advanced, revolutionary propulsion technologies. The mission is to move the Nation's capabilities beyond the confines of conventional chemical propulsion into an era of aircraft-like access to Earth orbit, rapid travel throughout the solar system, and exploration of interstellar space.

  8. Successful start for new CLIC test facility

    CERN Multimedia

    2004-01-01

    A new test facility is being built to study key feasibility issues for a possible future linear collider called CLIC. Commissioning of the first part of the facility began in June 2003 and nominal beam parameters have been achieved already.

  9. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  10. CHARM Facility Test Area Radiation Field Description

    CERN Document Server

    Thornton, Adam

    2016-01-01

    Specification document summarising the radiation field of the CHARM facility test area. This will act as a guide to any potential users of the facility as to what they can expect in terms of radiation, given in the form of radiation spectra information and fluence for each test position, along with general radiation maps for the test area and Montrac test location.

  11. DITCM roadside facilities for cooperative systems testing and evaluation

    NARCIS (Netherlands)

    Passchier, I.; Netten, B.D.; Wedemeijer, H.; Maas, S.M.P.; Leeuwen, C.J. van; Schackmann, P.P.M.

    2013-01-01

    Cooperative systems are being developed for large scale deployment in the near future. Validation of the performance of cooperative systems, and evaluation of the impact of cooperative applications is crucial before large scale deployment can proceed. The DITCM test site facilitates testing, evaluat

  12. X-ray computed tomography investigation of structures in Opalinus Clay from large-scale to small-scale after mechanical testing

    Science.gov (United States)

    Kaufhold, Annette; Halisch, Matthias; Zacher, Gerhard; Kaufhold, Stephan

    2016-08-01

    In the past years, X-ray computed tomography (CT) has become more and more common for geoscientific applications and is used from the µm-scale (e.g. for investigations of microfossils or pore-scale structures) up to the dm-scale (full drill cores or soil columns). In this paper we present results from CT imaging and mineralogical investigations of an Opalinus Clay core on different scales and different regions of interest, emphasizing especially the 3-D evaluation and distribution of cracks and their impact on mechanical testing of such material. Enhanced knowledge of the testing behaviour of the Opalinus Clay is of great interest, especially since this material is considered for a long-term radioactive waste disposal and storage facility in Switzerland. Hence, results are compared regarding the mineral (i.e. phase) contrast resolution, the spatial resolution, and the overall scanning speed. With this extensive interdisciplinary scale-down approach it has been possible to characterize the general fracture propagation in comparison to mineralogical and textural features of the Opalinus Clay. Additionally, and as far as we know, a so-called mylonitic zone, located at an intersection of two main fractures, has been observed for the first time for an experimentally deformed Opalinus sample. The multi-scale results are in good accordance with data from naturally deformed Opalinus Clay samples, which enables us to perform systematic research under controlled laboratory conditions. Accompanying 3-D imaging greatly enhances the capability of data interpretation and assessment of such a material.

  13. Testing time for deep water [Deep water test facility in Rotterdam, NL]

    Energy Technology Data Exchange (ETDEWEB)

    Snieckus, Darius

    2000-06-01

    A new deep water test facility in Rotterdam in the Netherlands is described. The construction is a basin measuring 45m by 36m and some 10.5m deep: it can accommodate large scale model tests at depths equivalent to 1000m by using a hydraulic 'moveable' floor buoyed by syntactic foam. For simulation of depths of 3000m it opens its 'deep pit' - a well 5m diameter and 20m deep. The facility can also simulate the winds, waves and currents met offshore in places such as the Shetlands, West Africa and the Gulf of Mexico. The article includes pictures and diagrams of the facility.

  14. N-body simulations of supercluster dynamics to test the viability of large scale structure as a probe of dark energy and dark matter

    Science.gov (United States)

    Pearson, David William

    Many parameters of modern cosmology have been determined to incredible precision at present, including tight constraints on two rather mysterious components of the Universe, dark matter and dark energy. Large Scale structure may be uniquely able to place constraints on both of these components, particularly structures that are loosely gravitationally bound. In such structures, the effects of dark energy's outward push is only slightly less than gravity's inward pull, giving the best chance for detection of dark energy in their dynamics. This work aims to answer whether these structures could potentially serve as a laboratory for studying dark energy, by simulating the dynamics of superclusters both including and excluding its effects. Also, by comparing simulation results with an observational dynamical analysis, dark matter content and possibly the effects of dark energy can be constrained. For this purpose four potentially bound superclusters were identified: the Aquarius, Corona Borealis, Microscopium, and Shapley superclusters. Their dynamics were simulated with N-body software written by the author. It is shown that there is a difference in the line-of-sight velocity dispersions of superclusters depending on whether the effects of dark energy are included or not, but this difference is small enough that it would not be detectable due to observational uncertainties. A new method of supercluster mass estimation, named SCM+FP, is presented, combining knowledge of the dynamics and the spherical collapse model to determine the mass. Also, a new analytical model for the extent of gravitationally bound structure is presented, arising from a simple modification of the spherical collapse model which is supported by simulation results. Further results include the most conclusive evidence to date of extended bound structure in the Corona Borealis supercluster along with evidence that there is extended bound structure in the Shapley supercluster, each with a core of five
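
    No equations are given in this record. As a hedged sketch of the kind of dynamics described (Newtonian gravity on supercluster members plus an optional outward dark-energy term), the code below advances a few point masses with a leapfrog integrator and reports a line-of-sight velocity dispersion; the masses, positions and step sizes are placeholders, not the author's supercluster data or software.

```python
# Toy leapfrog N-body integrator with an optional cosmological-constant term:
# in physical coordinates a particle feels sum_j G m_j (r_j - r_i)/|r_j - r_i|^3
# plus (if dark energy is included) Omega_L * H0^2 * r_i.
# All initial conditions and masses are placeholders.
import numpy as np

G       = 4.301e-9      # gravitational constant, Mpc (km/s)^2 / Msun
H0      = 70.0          # Hubble constant, km/s/Mpc
OMEGA_L = 0.7           # dark-energy density parameter
GYR     = 1.0 / 978.0   # 1 Gyr expressed in the code time unit, Mpc/(km/s)

def accelerations(pos, mass, with_dark_energy=True):
    """Pairwise Newtonian gravity plus optional outward Lambda term."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        d = pos - pos[i]                      # separations, Mpc
        r = np.linalg.norm(d, axis=1)
        r[i] = np.inf                         # no self-force
        acc[i] = np.sum(G * mass[:, None] * d / r[:, None]**3, axis=0)
    if with_dark_energy:
        acc += OMEGA_L * H0**2 * pos          # outward push of the cosmological constant
    return acc

# Placeholder "supercluster": a few 1e15-Msun clusters a few Mpc apart.
rng  = np.random.default_rng(0)
pos  = rng.normal(scale=5.0, size=(6, 3))     # Mpc
vel  = rng.normal(scale=300.0, size=(6, 3))   # km/s
mass = np.full(6, 1.0e15)                     # Msun

dt, n_steps = 0.1 * GYR, 140                  # ~14 Gyr in 0.1 Gyr steps
acc = accelerations(pos, mass)
for _ in range(n_steps):                      # kick-drift-kick leapfrog
    vel += 0.5 * dt * acc
    pos += dt * vel
    acc = accelerations(pos, mass)
    vel += 0.5 * dt * acc

print("line-of-sight (z) velocity dispersion:", vel[:, 2].std(), "km/s")
```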

  15. Controlled Archaeological Test Site (CATS) Facility

    Data.gov (United States)

    Federal Laboratory Consortium — CATS facility is at the Construction Engineering Research Laboratory (CERL), Champaign, IL. This 1-acre test site includes a variety of subsurface features carefully...

  16. National Solar Thermal Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Cameron, C.P.

    1989-12-31

    This is a brief report about a Sandia National Laboratories facility which can provide high thermal flux for simulation of nuclear thermal flash, measurements of the effects of aerodynamic heating on radar transmission, etc.

  17. Strings and large scale magnetohydrodynamics

    CERN Document Server

    Olesen, P

    1995-01-01

    From computer simulations of magnetohydrodynamics one knows that a turbulent plasma becomes very intermittent, with the magnetic fields concentrated in thin flux tubes. This situation looks very "string-like", so we investigate whether strings could be solutions of the magnetohydrodynamics equations in the limit of infinite conductivity. We find that the induction equation is satisfied, and we discuss the Navier-Stokes equation (without viscosity) with the Lorentz force included. We argue that the string equations (with non-universal maximum velocity) should describe the large scale motion of narrow magnetic flux tubes, because of a large reparametrization (gauge) invariance of the magnetic and electric string fields.
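
    For orientation, the ideal-MHD relations the abstract refers to (infinite conductivity, no viscosity, Lorentz force included) take the standard textbook form below; this is generic notation, not the string-specific equations of the paper.

```latex
% Ideal MHD: induction equation at infinite conductivity and the inviscid
% momentum equation with the Lorentz force (textbook form, for orientation).
\frac{\partial \mathbf{B}}{\partial t} = \nabla \times \left( \mathbf{v} \times \mathbf{B} \right),
\qquad
\rho \left( \frac{\partial \mathbf{v}}{\partial t} + \mathbf{v}\cdot\nabla \mathbf{v} \right)
  = -\nabla p + \frac{1}{\mu_0} \left( \nabla \times \mathbf{B} \right) \times \mathbf{B} .
```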

  18. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  19. Aquatic Plant Control Research Program. Large-Scale Operations Management Test (LSOMT) of Insects and Pathogens for Control of Waterhyacinth in Louisiana. Volume 1. Results for 1979-1981.

    Science.gov (United States)

    1985-01-01

    Results for 1979-1981. Authors: Dana R. Sanders, Sr.; Edwin A. Theriot; Patricia Perfetti. Performing organizations: ...Division (ERD), Environmental Laboratory (EL), WES; and Dr. Patricia Perfetti, University of Tennessee-Chattanooga, Chattanooga, Tennessee. Citation: Sanders, D. R., Sr., Theriot, E. A., and Perfetti, P. 1985. "Large-Scale Operations Management Test (LSOMT) of Insects and Pathogens for Control of Waterhyacinth in Louisiana."

  20. How do OSS projects change in number and size? A large-scale analysis to test a model of project growth

    CERN Document Server

    Schweitzer, Frank; Tessone, Claudio J; Xia, Xi

    2015-01-01

    Established Open Source Software (OSS) projects can grow in size if new developers join, but also the number of OSS projects can grow if developers choose to found new projects. We discuss to what extent an established model for firm growth can be applied to the dynamics of OSS projects. Our analysis is based on a large-scale data set from SourceForge (SF) consisting of monthly data for 10 years, for up to 360'000 OSS projects and up to 340'000 developers. Over this time period, we find an exponential growth both in the number of projects and developers, with a remarkable increase of single-developer projects after 2009. We analyze the monthly entry and exit rates for both projects and developers, the growth rate of established projects and the monthly project size distribution. To derive a prediction for the latter, we use modeling assumptions of how newly entering developers choose to either found a new project or to join existing ones. Our model applies only to collaborative projects that are deemed to gro...
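
    The paper's calibrated growth model is not reproduced in this record. The toy simulation below only illustrates the basic entry mechanism described, in which each newly arriving developer either founds a new project or joins an existing one with probability proportional to its current size; the arrival rate and founding probability are placeholders, not SourceForge estimates.

```python
# Toy entry model for OSS project growth: each arriving developer founds a new
# project with probability p_found, otherwise joins an existing project chosen
# with probability proportional to its current size (preferential attachment).
# The rates below are illustrative placeholders, not SourceForge estimates.
import random
from collections import Counter

random.seed(1)
p_found = 0.3              # probability that an arriving developer founds a project (placeholder)
arrivals_per_month = 100   # new developers per month (placeholder)
months = 120               # ten years

project_sizes = [1]        # start from a single one-developer project
for _ in range(months):
    for _ in range(arrivals_per_month):
        if random.random() < p_found:
            project_sizes.append(1)     # found a new project
        else:                           # join an existing project, preferentially by size
            j = random.choices(range(len(project_sizes)), weights=project_sizes, k=1)[0]
            project_sizes[j] += 1

counts = Counter(project_sizes)
print("projects:", len(project_sizes),
      " single-developer share:", round(counts[1] / len(project_sizes), 3))
```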

  1. A summary of recent activities at the National Solar Thermal Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Cameron, C.P.

    1992-09-01

    The United States Department of Energy's National Solar Thermal Test Facility (NSTTF), located at Sandia National Laboratories in Albuquerque, New Mexico, is the major facility for testing of solar thermal components and systems in the United States. Since originally being constructed as the Central Receiver Test Facility in the late 1970's, its mission has been expanded to include distributed receiver technologies, and it now includes line-focus and point-focus collectors, two solar furnaces, and an engine test facility. In addition, the unique capabilities of the facility have been applied to a wide variety of tests unrelated to solar energy, but using the intense heat from concentrated solar radiation or using the large-scale optical systems at the site. In this paper, current activities at the NSTTF are summarized, with an emphasis on activities that have not been described elsewhere.

  2. A summary of recent activities at the National Solar Thermal Test Facility

    Science.gov (United States)

    Cameron, C. P.

    The United States Department of Energy's National Solar Thermal Test Facility (NSTTF), located at Sandia National Laboratories in Albuquerque, New Mexico, is the major facility for testing of solar thermal components and systems in the United States. Since originally being constructed as the Central Receiver Test Facility in the late 1970's, its mission has been expanded to include distributed receiver technologies, and it now includes line-focus and point-focus collectors, two solar furnaces, and an engine test facility. In addition, the unique capabilities of the facility have been applied to a wide variety of tests unrelated to solar energy, but using the intense heat from concentrated solar radiation or using the large-scale optical systems at the site. In this paper, current activities at the NSTTF are summarized, with an emphasis on activities that have not been described elsewhere.

  3. A summary of recent activities at the National Solar Thermal Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Cameron, C.P.

    1992-01-01

    The United States Department of Energy's National Solar Thermal Test Facility (NSTTF), located at Sandia National Laboratories in Albuquerque, New Mexico, is the major facility for testing of solar thermal components and systems in the United States. Since originally being constructed as the Central Receiver Test Facility in the late 1970's, its mission has been expanded to include distributed receiver technologies, and it now includes line-focus and point-focus collectors, two solar furnaces, and an engine test facility. In addition, the unique capabilities of the facility have been applied to a wide variety of tests unrelated to solar energy, but using the intense heat from concentrated solar radiation or using the large-scale optical systems at the site. In this paper, current activities at the NSTTF are summarized, with an emphasis on activities that have not been described elsewhere.

  4. Test Results From The Idaho National Laboratory 15kW High Temperature Electrolysis Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Carl M. Stoots; Keith G. Condie; James E. O'Brien; J. Stephen Herring; Joseph J. Hartvigsen

    2009-07-01

    A 15kW high temperature electrolysis test facility has been developed at the Idaho National Laboratory under the United States Department of Energy Nuclear Hydrogen Initiative. This facility is intended to study the technology readiness of using high temperature solid oxide cells for large scale nuclear powered hydrogen production. It is designed to address larger-scale issues such as thermal management (feed-stock heating, high temperature gas handling, heat recuperation), multiple-stack hot zone design, multiple-stack electrical configurations, etc. Heat recuperation and hydrogen recycle are incorporated into the design. The facility was operated for 1080 hours and successfully demonstrated the largest scale high temperature solid-oxide-based production of hydrogen to date.

  5. Models of large scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Frenk, C.S. (Physics Dept., Univ. of Durham (UK))

    1991-01-01

    The ingredients required to construct models of the cosmic large scale structure are discussed. Input from particle physics leads to a considerable simplification by offering concrete proposals for the geometry of the universe, the nature of the dark matter and the primordial fluctuations that seed the growth of structure. The remaining ingredient is the physical interaction that governs dynamical evolution. Empirical evidence provided by an analysis of a redshift survey of IRAS galaxies suggests that gravity is the main agent shaping the large-scale structure. In addition, this survey implies large values of the mean cosmic density, Ω ≳ 0.5, and is consistent with a flat geometry if IRAS galaxies are somewhat more clustered than the underlying mass. Together with current limits on the density of baryons from Big Bang nucleosynthesis, this lends support to the idea of a universe dominated by non-baryonic dark matter. Results from cosmological N-body simulations evolved from a variety of initial conditions are reviewed. In particular, neutrino dominated and cold dark matter dominated universes are discussed in detail. Finally, it is shown that apparent periodicities in the redshift distributions in pencil-beam surveys arise frequently from distributions which have no intrinsic periodicity but are clustered on small scales. (orig.).

  6. Impact of on-site, small and large scale wastewater treatment facilities on levels and fate of pharmaceuticals, personal care products, artificial sweeteners, pesticides, and perfluoroalkyl substances in recipient waters.

    Science.gov (United States)

    Gago-Ferrero, Pablo; Gros, Meritxell; Ahrens, Lutz; Wiberg, Karin

    2017-12-01

    One of the main risks associated with effluents from both wastewater treatment plants (WWTPs) and on-site sewage treatment facilities (OSSFs) is the release of micropollutants (MPs) into receiving water bodies. However, the impact of MPs present in the effluents of OSSFs on the aquatic environment has not been studied so far. The current study evaluates the impact of the effluents of OSSFs and small-to-large scale WWTPs on natural waters. The discharge of 74 MPs was assessed, including pharmaceuticals, personal care products, pesticides, artificial sweeteners and perfluoroalkyl substances (PFASs). The sampling was carried out within a Swedish catchment and included three sites that are exclusively affected by OSSFs and other sites that are mainly affected by WWTPs or a mixture of sources (7 sites, 28 samples). Results show that although OSSFs serve a much smaller total number of people, the MPs emitted from OSSFs reached the aquatic environment in significant quantities (concentrations of >150 ng L(-1) of ∑MPs). The composition profiles for sites affected by WWTPs were similar and were dominated by sucralose (27% of the ∑MPs), caffeine (27% of the ∑MPs), lamotrigine (10% of the ∑MPs), desvenlafaxine (5% of the ∑MPs), and diclofenac (4% of the ∑MPs). In contrast, the sites affected by OSSFs showed high variability, exhibiting a different profile from those affected by WWTPs and also from each other, demonstrating that OSSFs are not homogeneous sources of MPs. Some specific compounds, such as diethyltoluamide (DEET) and caffeine, were proportionally much more important at sites affected by OSSFs than at sites affected by WWTPs (representing a much higher percentage of the ∑MPs in the OSSFs). In contrast, PFASs did not show high concentration variation among the different sampling sites and the composition profiles were relatively similar, indicating that these substances follow different routes of entry into the aquatic environment.

  7. ELASTIC: A Large Scale Dynamic Tuning Environment

    Directory of Open Access Journals (Sweden)

    Andrea Martínez

    2014-01-01

    The spectacular growth in the number of cores in current supercomputers poses design challenges for the development of performance analysis and tuning tools. To be effective, such analysis and tuning tools must be scalable and be able to manage the dynamic behaviour of parallel applications. In this work, we present ELASTIC, an environment for dynamic tuning of large-scale parallel applications. To be scalable, the architecture of ELASTIC takes the form of a hierarchical tuning network of nodes that perform a distributed analysis and tuning process. Moreover, the tuning network topology can be configured to adapt itself to the size of the parallel application. To guide the dynamic tuning process, ELASTIC supports a plugin architecture. These plugins, called ELASTIC packages, allow the integration of different tuning strategies into ELASTIC. We also present experimental tests conducted using ELASTIC, showing its effectiveness in improving the performance of large-scale parallel applications.

  8. Automation Technology Improvements on SEE Test Facility

    Institute of Scientific and Technical Information of China (English)

    FAN; Hui; LIU; Jian-cheng; SHEN; Dong-jun

    2012-01-01

    When users perform heavy-ion SEE tests in the irradiation facility, the ion beam should be uniform and the beam flux should be suitable for their tests. Users also want the sample position to be easy to locate. These requirements are very important for our facility. This year, our team has put great effort into improving the beam parameter monitoring and automatic control capabilities of the facility. The main jobs are as follows.

  9. Antenna Test Facility (ATF): User Test Planning Guide

    Science.gov (United States)

    Lin, Greg

    2011-01-01

    Test process, milestones and inputs are unknowns to first-time users of the ATF. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  10. Vibration and Acoustic Test Facility (VATF): User Test Planning Guide

    Science.gov (United States)

    Fantasia, Peter M.

    2011-01-01

    Test process, milestones and inputs are unknowns to first-time users of the VATF. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.

  11. Maestro: an orchestration framework for large-scale WSN simulations.

    Science.gov (United States)

    Riliskis, Laurynas; Osipov, Evgeny

    2014-03-18

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.
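
    Maestro's own benchmarking tools are not shown in this record. The snippet below is only a hedged sketch of the selection step described, picking the VM type with the lowest cost per completed simulation subject to a runtime limit; the instance names, runtimes and prices are invented placeholders.

```python
# Pick the VM type that minimises cost per simulation run while meeting a
# runtime limit. Instance names, runtimes and hourly prices are invented
# placeholders, not benchmark results from Maestro.
benchmarks = {
    # instance type: (measured runtime per simulation in hours, price per hour in $)
    "small-vm":  (6.0, 0.10),
    "medium-vm": (2.5, 0.25),
    "large-vm":  (1.0, 0.80),
}
max_runtime_h = 4.0

def cost_per_run(entry):
    runtime, price = entry
    return runtime * price

feasible = {k: v for k, v in benchmarks.items() if v[0] <= max_runtime_h}
best = min(feasible, key=lambda k: cost_per_run(feasible[k]))
print(best, "costs", cost_per_run(feasible[best]), "$ per simulation")
```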

  12. THE COMPONENT TEST FACILITY – A NATIONAL USER FACILITY FOR TESTING OF HIGH TEMPERATURE GAS-COOLED REACTOR (HTGR) COMPONENTS AND SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    David S. Duncan; Vondell J. Balls; Stephanie L. Austad

    2008-09-01

    The Next Generation Nuclear Plant (NGNP) and other High-Temperature Gas-cooled Reactor (HTGR) Projects require research, development, design, construction, and operation of a nuclear plant intended for both high-efficiency electricity production and high-temperature industrial applications, including hydrogen production. During the life cycle stages of an HTGR, plant systems, structures and components (SSCs) will be developed to support this reactor technology. To mitigate technical, schedule, and project risk associated with development of these SSCs, a large-scale test facility is required to support design verification and qualification prior to operational implementation. As a full-scale helium test facility, the Component Test facility (CTF) will provide prototype testing and qualification of heat transfer system components (e.g., Intermediate Heat Exchanger, valves, hot gas ducts), reactor internals, and hydrogen generation processing. It will perform confirmation tests for large-scale effects, validate component performance requirements, perform transient effects tests, and provide production demonstration of hydrogen and other high-temperature applications. Sponsored wholly or in part by the U.S. Department of Energy, the CTF will support NGNP and will also act as a National User Facility to support worldwide development of High-Temperature Gas-cooled Reactor technologies.

  13. Sophisticated test facility to detect land mines

    NARCIS (Netherlands)

    Jong, W. de; Lensen, H.A.; Janssen, Y.H.L.

    1999-01-01

    In the framework of the Dutch government humanitarian demining project 'HOM-2000', an outdoor test facility has been realized to test, improve and develop detection equipment for land mines. This sophisticated facility allows us to assess and compare the performance of the individual and of a combi

  14. TOPFLOW - a new multipurpose thermalhydraulic test facility for the investigation of steady state and transient two phase flow phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Schaffrath, A.; Kruessenberg, A.K.; Weiss, F.P.; Beyer, M.; Carl, H.; Prasser, H.M.; Schuster, J.; Schuetz, P.; Tamme, M.; Zimmermann, W. [Forschungszentrum Rossendorf e.V. (FZR) (Germany). Inst. fuer Sicherheitsforschung; Hicken, E.F. [Forschungszentrum Juelich (Germany). Inst. fuer Sicherheitsforschung und Reaktortechnik

    2001-08-01

    The Forschungszentrum Rossendorf (FZR) e. V. is constructing a new large-scale test facility, TOPFLOW, for thermalhydraulic single effect tests. The acronym stands for transient two phase flow test facility. It will mainly be used for the investigation of generic and applied steady state and transient two phase flow phenomena and the development and validation of models of computational fluid dynamic (CFD) codes. (orig.)

  15. Wake Shield Facility Modal Survey Test in Vibration Acoustic Test Facility

    Science.gov (United States)

    1994-01-01

    Astronaut Ronald M. Sega stands beside the University of Houston's Wake Shield Facility before it undergoes a Modal Survey Test in the Vibration and Acoustic Test Facility Building 49, prior to being flown on space shuttle mission STS-60.

  16. Large-Scale Galaxy Bias

    CERN Document Server

    Desjacques, Vincent; Schmidt, Fabian

    2016-01-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a pedagogical proof of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which includes the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in i...
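
    The record truncates before the expansion itself. At the lowest orders, the perturbative bias expansion referred to here is conventionally written as below, with the tidal field K_ij among the leading local gravitational observables; this is standard notation quoted for orientation, not text from the review.

```latex
% Leading terms of the perturbative bias expansion (standard notation).
\delta_g(\mathbf{x},\tau) = b_1\,\delta(\mathbf{x},\tau)
  + \frac{b_2}{2}\,\delta^2(\mathbf{x},\tau)
  + b_{K^2}\,\left(K_{ij}K^{ij}\right)(\mathbf{x},\tau) + \cdots ,
\qquad
K_{ij} \equiv \left( \frac{\partial_i \partial_j}{\nabla^2} - \frac{\delta_{ij}}{3} \right) \delta .
```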

  17. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U. S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real-world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abroad...

  18. Large Scale Correlation Clustering Optimization

    CERN Document Server

    Bagon, Shai

    2011-01-01

    Clustering is a fundamental task in unsupervised learning. The focus of this paper is the Correlation Clustering functional, which combines positive and negative affinities between the data points. The contribution of this paper is twofold: (i) providing a theoretical analysis of the functional, and (ii) new optimization algorithms which can cope with large scale problems (>100K variables) that are infeasible using existing methods. Our theoretical analysis provides a probabilistic generative interpretation for the functional, and justifies its intrinsic "model-selection" capability. Furthermore, we draw an analogy between optimizing this functional and the well-known Potts energy minimization. This analogy allows us to suggest several new optimization algorithms, which exploit the intrinsic "model-selection" capability of the functional to automatically recover the underlying number of clusters. We compare our algorithms to existing methods on both synthetic and real data. In addition we suggest two new applications t...
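
    The functional is not written out in this record. In its usual form, correlation clustering over an affinity matrix W with positive and negative entries seeks a labeling that keeps positive affinities inside clusters and negative ones between them, with the number of clusters left free (which is the intrinsic model-selection property mentioned above); the formulation below is the standard one, not necessarily the paper's exact notation.

```latex
% Correlation clustering functional in its usual form: W_{ij} may be positive
% (attraction) or negative (repulsion); the number of clusters K is not fixed.
\max_{\ell \,:\, V \to \{1,\dots,K\},\; K \text{ free}} \;\;
  \sum_{i<j} W_{ij}\, \mathbf{1}\!\left[ \ell_i = \ell_j \right].
```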

  19. Large scale in-situ BOrehole and Geofluid Simulator (i.BOGS) for the development and testing of borehole technologies at reservoir conditions

    Science.gov (United States)

    Duda, Mandy; Bracke, Rolf; Stöckhert, Ferdinand; Wittig, Volker

    2017-04-01

    A fundamental problem of technological applications related to the exploration and provision of geothermal energy is the inaccessibility of subsurface processes. As a result, actual reservoir properties can only be determined using (a) indirect measurement techniques such as seismic surveys, machine feedback and geophysical borehole logging, (b) laboratory experiments capable of simulating in-situ properties, but failing to preserve temporal and spatial scales, or vice versa, and (c) numerical simulations. Moreover, technological applications related to the drilling process, the completion and cementation of a wellbore or the stimulation and exploitation of the reservoir are exposed to high pressure and temperature conditions as well as corrosive environments resulting from both rock formation and geofluid characteristics. To address fundamental and applied questions in the context of geothermal energy provision and subsurface exploration in general, one of Europe's largest geoscientific laboratory infrastructures is introduced. The in-situ Borehole and Geofluid Simulator (i.BOGS) allows the simulation of quasi scale-preserving processes at reservoir conditions up to depths of 5000 m and represents a large-scale pressure vessel for iso-/hydrostatic and pore pressures up to 125 MPa and temperatures from -10°C to 180°C. The autoclave can either be filled with large rock core samples (25 cm in diameter, up to 3 m length) or with fluids and technical borehole devices (e.g. pumps, sensors). The pressure vessel is equipped with an ultrasound system for active transmission and passive recording of acoustic emissions, and can be complemented by additional sensors. The i.BOGS forms the basic module of the Match.BOGS, which will finally consist of three modules, i.e. (A) the i.BOGS, (B) the Drill.BOGS, a drilling module to be attached to the i.BOGS capable of applying realistic torques and contact forces to a drilling device that enters the i.BOGS, and (C) the Fluid.BOGS, a geofluid

  20. Counterpart experimental study of ISP-42 PANDA tests on PUMA facility

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jun, E-mail: toyangjun@gmail.com [School of Nuclear Engineering, Purdue University, 400 Central Drive, West Lafayette, IN 47907-1290 (United States); Choi, Sung-Won; Lim, Jaehyok; Lee, Doo-Yong; Rassame, Somboon; Hibiki, Takashi; Ishii, Mamoru [School of Nuclear Engineering, Purdue University, 400 Central Drive, West Lafayette, IN 47907-1290 (United States)

    2013-05-15

    Highlights: ► Counterpart tests were performed on two large-scale BWR integral facilities. ► Similarity of post-LOCA system behaviors observed between two tests. ► Passive core and containment cooling systems work as designed in both tests. -- Abstract: A counterpart test to the Passive Nachwärmeabfuhr und Druckabbau Test Anlage (Passive Decay Heat Removal and Depressurization Test Facility, PANDA) International Standard Problem (ISP)-42 test was conducted at the Purdue University Multi-Dimensional Integral Test Assembly (PUMA) facility. Aimed at supporting code validation on a range of light water reactor (LWR) containment issues, the ISP-42 test consists of six sequential phases (Phases A–F) with separately defined initial and boundary conditions, addressing different stages of the anticipated accident scenario and system responses. The counterpart test was performed from Phases A to D, which are within the scope of the normal integral tests performed on the PUMA facility. A scaling methodology was developed by using the PANDA facility as the prototype and the PUMA facility as the test model, and an engineering scaling has been applied to the PUMA facility. The counterpart test results indicated that functions of passive safety systems, such as passive containment cooling system (PCCS) start-up, gravity-driven cooling system (GDCS) discharge, PCCS normal operation and the overload function, were confirmed in both the PANDA and PUMA facilities, with qualitative similarities.

  1. Energy Systems Test Area (ESTA). Power Systems Test Facilities

    Science.gov (United States)

    Situ, Cindy H.

    2010-01-01

    This viewgraph presentation provides a detailed description of the Johnson Space Center's Power Systems Facility located in the Energy Systems Test Area (ESTA). Facilities and the resources used to support power and battery systems testing are also shown. The contents include: 1) Power Testing; 2) Power Test Equipment Capabilities Summary; 3) Source/Load; 4) Battery Facilities; 5) Battery Test Equipment Capabilities Summary; 6) Battery Testing; 7) Performance Test Equipment; 8) Battery Test Environments; 9) Battery Abuse Chambers; 10) Battery Abuse Capabilities; and 11) Battery Test Area Resources.

  2. Photovoltaic Systems Test Facilities: Existing capabilities compilation

    Science.gov (United States)

    Volkmer, K.

    1982-01-01

    A general description of photovoltaic systems test facilities (PV-STFs) operated under the U.S. Department of Energy's photovoltaics program is given. Descriptions of a number of privately operated facilities having test capabilities appropriate to photovoltaic hardware development are given. A summary of specific, representative test capabilities at the system and subsystem level is presented for each listed facility. The range of system and subsystem test capabilities available to serve the needs of both the photovoltaics program and the private sector photovoltaics industry is given.

  3. Naval Aerodynamics Test Facility (NATF)

    Data.gov (United States)

    Federal Laboratory Consortium — The NATF specializes in Aerodynamics testing of scaled and fullsized Naval models, research into flow physics found on US Navy planes and ships, aerosol testing and...

  4. CryoModule Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — CMTFis able to test complete SRF cryomodules at cryogenic operating temperatures and with RF Power. CMTF will house the PIP-II Injector Experiment allowing test of...

  5. Construction and commissioning test report of the CEDM test facility

    Energy Technology Data Exchange (ETDEWEB)

    Chung, C. H.; Kim, J. T.; Park, W. M.; Youn, Y. J.; Jun, H. G.; Choi, N. H.; Park, J. K.; Song, C. H.; Lee, S. H.; Park, J. K

    2001-02-01

    The test facility for performance verification of the control element drive mechanism (CEDM) of the next generation power plant was installed at the KAERI site. The CEDM features a mechanism consisting of complicated mechanical parts and an electromagnetic control system. Thus, a new CEDM design should go through performance verification tests prior to its application in a reactor. The test facility can simulate reactor operating conditions such as temperature, pressure and water quality, and is equipped with a test chamber to accommodate a CEDM as installed in the power plant. This test facility can be used for the following tests: endurance test, coil cooling test, power measurement and reactivity rod drop test. The commissioning tests for the test facility were performed up to the CEDM test conditions of 320 °C and 150 bar, and the required water chemistry was obtained by operating the on-line water treatment system.

  6. Large Hardness Testers for Field Hardness Testing of Large Scale Casting and Forging Parts

    Institute of Scientific and Technical Information of China (English)

    张凤林

    2016-01-01

    Large hardness testers are mainly used for field hardness testing of large-scale casting and forging parts. The principle, structure, performance, operating method, main features and technical parameters of typical large hardness testers from China and abroad are introduced. The results show that rocker-type hardness testers, which operate manually or semi-automatically, are mainly used to test the hardness of single or small batches of large workpieces. Bridge-type hardness testers operate manually, semi-automatically or fully automatically; they can be used not only to test single or small batches of large workpieces, but also to test batches of large and medium-sized workpieces. Automatic bridge-type hardness testers offer high testing efficiency and can be used to test batches of workpieces piece by piece. With an automatic bridge-type hardness tester as the core, supplemented by workpiece grinding, loading, unloading, clamping and conveying devices and a remote communication function, an on-line automatic hardness testing system can be built.

  7. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction; whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to be able to adequately address the complex coupled phenomena to the level of detail that is necessary.

  8. Design Study of Beijing XFEL Test Facility

    CERN Document Server

    Dai, J P

    2005-01-01

    As part of the R&D for an X-ray Free Electron Laser facility in China, the construction of the Beijing XFEL Test Facility (BTF) has been proposed. A start-to-end simulation of the BTF was performed with the codes PARMELA, ELEGANT and TDA. This paper presents the motivation, the scheme and the simulation results of the BTF.

  9. Ballast Water Treatment Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Provides functionality for the full-scale testing and controlled simulation of ship ballasting operations for assessment of aquatic nuisance species (ANS)...

  10. Battery Post-Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Post-test diagnostics of aged batteries can provide additional information regarding the cause of performance degradation, which, previously, could be only inferred...

  11. Handbook of Large-Scale Random Networks

    CERN Document Server

    Bollobas, Bela; Miklos, Dezso

    2008-01-01

    Covers various aspects of large-scale networks, including mathematical foundations and rigorous results of random graph theory, modeling and computational aspects of large-scale networks, as well as areas in physics, biology, neuroscience, sociology and technical areas

  12. High Speed Networking and Large-scale Simulation in Geodynamics

    Science.gov (United States)

    Kuang, Weijia; Gary, Patrick; Seablom, Michael; Truszkowski, Walt; Odubiyi, Jide; Jiang, Weiyuan; Liu, Dong

    2004-01-01

    Large-scale numerical simulation has been one of the most important approaches for understanding global geodynamical processes. In this approach, peta-scale floating point operations (pflops) are often required to carry out a single physically-meaningful numerical experiment. For example, to model convective flow in the Earth's core and generation of the geomagnetic field (geodynamo), simulation for one magnetic free-decay time (approximately 15000 years) with a modest resolution of 150 in three spatial dimensions would require approximately 0.2 pflops. If such a numerical model is used to predict geomagnetic secular variation over decades and longer, with e.g. an ensemble Kalman filter assimilation approach, approximately 30 (and perhaps more) independent simulations of similar scales would be needed for one data assimilation analysis. Obviously, such a simulation would require an enormous computing resource that exceeds the capacity of a single facility currently available at our disposal. One solution is to utilize a very fast network (e.g. 10Gb optical networks) and available middleware (e.g. Globus Toolkit) to allocate available but often heterogeneous resources for such large-scale computing efforts. At NASA GSFC, we are experimenting with such an approach by networking several clusters for geomagnetic data assimilation research. We shall present our initial testing results in the meeting.

  13. Theoretical Analysis of Effective Thermal Conductivity for the Chinese HTR-PM Heat Transfer Test Facility

    Directory of Open Access Journals (Sweden)

    Cheng Ren

    2017-01-01

    Full Text Available The Chinese high temperature gas-cooled reactor pebble bed module (HTR-PM) demonstration project has attracted increasing attention. In order to support the project, a large-scale heat transfer test facility has been constructed for pebble bed effective thermal conductivity measurement over the whole temperature range (0~1600 °C). Based on different heat transfer mechanisms in the randomly packed pebble bed, three different types of effective thermal conductivity have been theoretically evaluated. A prediction of the total effective thermal conductivity of the pebble bed over the whole temperature range is provided for the optimization of the test facility and guidance of further experiments.
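
    The abstract does not give the underlying correlations, but a minimal sketch of how a total effective conductivity might be assembled from a lumped solid-conduction term and a radiative term is shown below; the form 4*F*sigma*d*T^3 for the radiative contribution is a common packed-bed approximation, and the values of k_solid_contrib, F and d_pebble are assumptions for illustration only, not the paper's model.

    import math

    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def k_eff(T_celsius, k_solid_contrib=2.0, F=0.7, d_pebble=0.06):
        """Illustrative total effective conductivity of a pebble bed [W/(m K)].

        k_solid_contrib : assumed lumped conduction/contact contribution
        F               : assumed radiation exchange factor of the bed
        d_pebble        : pebble diameter in metres (HTR pebbles are ~6 cm)
        The radiative term 4*F*sigma*d*T^3 is a common packed-bed
        approximation; it is not taken from the cited paper.
        """
        T = T_celsius + 273.15
        k_rad = 4.0 * F * SIGMA * d_pebble * T**3
        return k_solid_contrib + k_rad

    for t in (200, 800, 1600):
        print(f"{t:5d} C -> k_eff ~ {k_eff(t):5.1f} W/(m K)")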

  14. 400 Area/Fast Flux Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The 400 Area at Hanford is home primarily to the Fast Flux Test Facility (FFTF), a DOE-owned, formerly operating, 400-megawatt (thermal) liquid-metal (sodium)-cooled...

  15. Large-Scale Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  16. Brookhaven superconducting cable test facility

    Energy Technology Data Exchange (ETDEWEB)

    Forsyth, E.B.; Gibbs, R.J.

    1976-08-17

    Construction has started on an outdoor testing station for flexible ac superconducting power transmission cables. It is intended to serve as an intermediate step between laboratory-scale experiments and qualification testing of prototype-scale cables. The permanent equipment includes a 500 W supercritical helium refrigerator using a screw compressor and multistage turbine expanders. Helium storage for 250,000 cu ft of helium at 250 psi is provided. Initially, the cables will be tested in a horizontal cryostat some 250 ft long. High-voltage 60 Hz tests will be performed with the cable in a series resonant mode with a maximum line-to-ground capability of 240 kV; this is adequate for a 138 kV system design. Impulse testing up to about 650 kV is planned. The cable conductor will be energized by current transformers, initially at about 4 kA and later up to fault levels of 40 kA. The refrigerator is now at the site and testing on a dummy load will commence in the Fall of 1976. The cryostat will be installed in 1977 followed about a year later by the first cable tests.
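
    A quick sanity check of the quoted ratings, under the assumption that the 138 kV figure is the system line-to-line voltage, is sketched below; the resulting margin of roughly 3x nominal is an inference, not a number stated in the abstract.

    import math

    # Rough check of the quoted test capability, assuming 138 kV refers to the
    # system line-to-line voltage (a common US transmission class).
    v_line_to_line = 138e3
    v_line_to_ground = v_line_to_line / math.sqrt(3)   # ~79.7 kV nominal
    v_test_max = 240e3                                  # series-resonant capability

    print(f"nominal line-to-ground: {v_line_to_ground / 1e3:.1f} kV")
    print(f"test margin: {v_test_max / v_line_to_ground:.1f}x nominal")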

  17. Conundrum of the Large Scale Streaming

    CERN Document Server

    Malm, T M

    1999-01-01

    The etiology of the large-scale peculiar velocity (large-scale streaming motion) of clusters seems increasingly tenuous within the context of the gravitational instability hypothesis. Are there any alternative, testable models that might account for such large-scale streaming of clusters?

  18. The Development of Extraversion and Ability: Analysis of Data from a Large-Scale Longitudinal Study of Children Tested at 10-11 and 14-15 Years.

    Science.gov (United States)

    Anthony, W. S.

    1983-01-01

    Results of analysis of correlations collected by Cookson, following Eysenck and Cookson's study of personality and ability in young people, confirm the finding from previous Cattellian test data that the more intelligent children decline in relative extraversion scores and cast doubt on Eysenck's suggestion that introverts gradually show higher…

  19. Psychometric Features of the General Aptitude Test-Verbal Part (GAT-V): A Large-Scale Assessment of High School Graduates in Saudi Arabia

    Science.gov (United States)

    Dimitrov, Dimiter M.; Shamrani, Abdul Rahman

    2015-01-01

    This study examines the psychometric features of a General Aptitude Test-Verbal Part, which is used with assessments of high school graduates in Saudi Arabia. The data supported a bifactor model, with one general factor and three content domains (Analogy, Sentence Completion, and Reading Comprehension) as latent aspects of verbal aptitude.

  1. Cryogenic magnet test facility for FAIR

    CERN Document Server

    Schroeder, C; Marzouki, F; Stafiniac, A; Floch, E; Schnizer, P; Moritz, G; Xiang, Y; Kauschke, M; Meier, J; Hess, G ,

    2009-01-01

    For testing fast-pulsed superconducting model and pre-series magnets for FAIR (Facility for Antiproton and Ion Research), a cryogenic magnet test facility was built at GSI. The facility is able to cool either cold masses in a universal cryostat or complete magnets in their own cryo-module. It is possible to operate bath-cooled, two-phase-cooled, and supercritically cooled magnets with a maximum current up to 11 kA and a ramp rate up to 14 kA/s. Measurements of magnet heat loss, with calorimetric and V-I methods, are available, as are quench and magnetic field measurements. Design and functionality of the test facility will be described. Results of measurements with a supercritically cooled magnet and with a two-phase-cooled SIS100 model magnet will be shown.

  2. Costs and cost-effectiveness of a large-scale mass testing and treatment intervention for malaria in Southern Province, Zambia.

    Science.gov (United States)

    Silumbe, Kafula; Yukich, Joshua O; Hamainza, Busiku; Bennett, Adam; Earle, Duncan; Kamuliwo, Mulakwa; Steketee, Richard W; Eisele, Thomas P; Miller, John M

    2015-05-20

    A cluster-randomized controlled trial of three dry-season rounds of a mass testing and treatment intervention (MTAT) using rapid diagnostic tests (RDTs) and artemether-lumefantrine (AL) was conducted in four districts in Southern Province, Zambia. Data were collected on the costs and logistics of the intervention and paired with effectiveness estimated from a community randomized controlled trial for the purpose of conducting a provider-perspective cost-effectiveness analysis of MTAT vs no MTAT (Standard of Care). Dry-season MTAT in this setting did not reduce malaria transmission sufficiently to permit transition to a case-investigation strategy to then pursue malaria elimination; however, the intervention did substantially reduce malaria illness and was a highly cost-effective intervention for malaria burden reduction in this moderate transmission area. The cost per RDT administered was estimated to be USD4.39 (range: USD1.62-13.96) while the cost per AL treatment administered was estimated to be USD34.74 (range: USD3.87-3,835). The net cost per disability adjusted life year averted (incremental cost-effectiveness ratio) was estimated to be USD804. The intervention appears to be highly cost-effective relative to World Health Organization thresholds for malaria burden reduction in Zambia as compared to no MTAT. However, it was estimated that population-wide mass drug administration is likely to be more cost-effective for burden reduction and for transmission reduction compared to MTAT.
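
    A minimal sketch of the incremental cost-effectiveness ratio (ICER) calculation behind the quoted USD 804 per DALY averted is given below; the cost and DALY inputs are hypothetical values chosen only so that the ratio reproduces the quoted figure.

    # Minimal sketch of an incremental cost-effectiveness ratio (ICER) of the
    # kind reported in the abstract. All inputs below are hypothetical.
    def icer(cost_intervention, cost_comparator, daly_intervention, daly_comparator):
        """Net cost per disability-adjusted life year (DALY) averted."""
        delta_cost = cost_intervention - cost_comparator
        dalys_averted = daly_comparator - daly_intervention
        return delta_cost / dalys_averted

    # hypothetical provider-perspective totals for an MTAT campaign vs standard of care
    print(f"ICER ~ USD {icer(1_608_000, 804_000, 7_000, 8_000):.0f} per DALY averted")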

  3. Massachusetts Large Blade Test Facility Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Rahul Yarala; Rob Priore

    2011-09-02

    Project Objective: The Massachusetts Clean Energy Center (CEC) will design, construct, and ultimately have responsibility for the operation of the Large Wind Turbine Blade Test Facility, which is an advanced blade testing facility capable of testing wind turbine blades up to at least 90 meters in length on three test stands. Background: Wind turbine blade testing is required to meet international design standards, and is a critical factor in maintaining high levels of reliability and mitigating the technical and financial risk of deploying mass-produced wind turbine models. Testing is also needed to identify specific blade design issues that may contribute to reduced wind turbine reliability and performance. Testing is also required to optimize aerodynamics and structural performance and to encourage the development of new technologies and materials, making wind even more competitive. The objective of this project is to accelerate the design and construction of a large wind blade testing facility capable of testing blades with minimum queue times at a reasonable cost. This testing facility will encourage and provide the opportunity for the U.S. wind industry to conduct more rigorous testing of blades to improve wind turbine reliability.

  4. Modified gravity and large scale flows, a review

    Science.gov (United States)

    Mould, Jeremy

    2017-02-01

    Large scale flows have been a challenging feature of cosmography ever since galaxy scaling relations came on the scene 40 years ago. The next generation of surveys will offer a serious test of the standard cosmology.

  5. Large-scale evaluation of log P predictors: local corrections may compensate insufficient accuracy and need of experimentally testing every other compound.

    Science.gov (United States)

    Tetko, Igor V; Poda, Gennadiy I; Ostermann, Claude; Mannhold, Raimund

    2009-11-01

    A large variety of log P calculation methods failed to produce sufficient accuracy in log P prediction for two in-house datasets of more than 96000 compounds, contrary to their significantly better performances on public datasets. Minimum Root Mean Squared Errors (RMSE) of 1.02 and 0.65 were calculated for the Pfizer and Nycomed datasets, respectively, in the 'out-of-box' implementation. Importantly, the use of local corrections (LC) implemented in the ALOGPS program based on experimental in-house log P data significantly reduced the RMSE to 0.59 and 0.48 for the Pfizer and Nycomed datasets, respectively, instantly without retraining the model. Moreover, more than 60% of molecules predicted with the highest confidence in each set had a mean absolute error (MAE) less than 0.33 log units, which is only ca. 10% higher than the estimated variation in experimental log P measurements for the Pfizer dataset. Therefore, following this retrospective analysis, we suggest that the use of the predicted log P values with high confidence may eliminate the need to experimentally test every other compound. This strategy could reduce the cost of measurements for pharmaceutical companies by a factor of 2, increase the confidence in prediction at the analog design stage of drug discovery programs, and could be extended to other ADMET properties.
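
    The error metrics reported above (RMSE and MAE) can be illustrated with the short sketch below; the data are synthetic and the "locally corrected" predictor is simulated, so the numbers bear no relation to the Pfizer or Nycomed results.

    import numpy as np

    def rmse(y_true, y_pred):
        return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

    def mae(y_true, y_pred):
        return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

    # Synthetic data: "experimental" log P vs an out-of-box and a locally corrected predictor.
    rng = np.random.default_rng(0)
    logp_exp = rng.normal(2.5, 1.5, size=1000)
    pred_raw = logp_exp + rng.normal(0.3, 1.0, size=1000)   # biased, noisy predictor
    pred_lc = logp_exp + rng.normal(0.0, 0.5, size=1000)    # after simulated local corrections

    print(f"out-of-box: RMSE={rmse(logp_exp, pred_raw):.2f}  MAE={mae(logp_exp, pred_raw):.2f}")
    print(f"with LC   : RMSE={rmse(logp_exp, pred_lc):.2f}  MAE={mae(logp_exp, pred_lc):.2f}")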

  6. Mapping of Schistosomiasis and Soil-Transmitted Helminths in Namibia: The First Large-Scale Protocol to Formally Include Rapid Diagnostic Tests.

    Directory of Open Access Journals (Sweden)

    José Carlos Sousa-Figueiredo

    Full Text Available Namibia is now ready to begin mass drug administration of praziquantel and albendazole against schistosomiasis and soil-transmitted helminths, respectively. Although historical data identifies areas of transmission of these neglected tropical diseases (NTDs, there is a need to update epidemiological data. For this reason, Namibia adopted a new protocol for mapping of schistosomiasis and geohelminths, formally integrating rapid diagnostic tests (RDTs for infections and morbidity. In this article, we explain the protocol in detail, and introduce the concept of 'mapping resolution', as well as present results and treatment recommendations for northern Namibia.This new protocol allowed a large sample to be surveyed (N = 17,896 children from 299 schools at relatively low cost (7 USD per person mapped and very quickly (28 working days. All children were analysed by RDTs, but only a sub-sample was also diagnosed by light microscopy. Overall prevalence of schistosomiasis in the surveyed areas was 9.0%, highly associated with poorer access to potable water (OR = 1.5, P<0.001 and defective (OR = 1.2, P<0.001 or absent sanitation infrastructure (OR = 2.0, P<0.001. Overall prevalence of geohelminths, more particularly hookworm infection, was 12.2%, highly associated with presence of faecal occult blood (OR = 1.9, P<0.001. Prevalence maps were produced and hot spots identified to better guide the national programme in drug administration, as well as targeted improvements in water, sanitation and hygiene. The RDTs employed (circulating cathodic antigen and microhaematuria for Schistosoma mansoni and S. haematobium, respectively performed well, with sensitivities above 80% and specificities above 95%.This protocol is cost-effective and sensitive to budget limitations and the potential economic and logistical strains placed on the national Ministries of Health. Here we present a high resolution map of disease prevalence levels, and treatment regimens are

  7. Cultural norm fulfillment, interpersonal belonging, or getting ahead? A large-scale cross-cultural test of three perspectives on the function of self-esteem.

    Science.gov (United States)

    Gebauer, Jochen E; Sedikides, Constantine; Wagner, Jenny; Bleidorn, Wiebke; Rentfrow, Peter J; Potter, Jeff; Gosling, Samuel D

    2015-09-01

    What is the function of self-esteem? We classified relevant theoretical work into 3 perspectives. The cultural norm-fulfillment perspective regards self-esteem as a result of adherence to cultural norms. The interpersonal-belonging perspective regards self-esteem as a sociometer of interpersonal belonging. The getting-ahead perspective regards self-esteem as a sociometer of getting ahead in the social world, while regarding low anxiety/neuroticism as a sociometer of getting along with others. The 3 perspectives make contrasting predictions on the relation between the Big Five personality traits and self-esteem across cultures. We tested these predictions in a self-report study (2,718,838 participants from 106 countries) and an informant-report study (837,655 informants from 64 countries). We obtained some evidence for cultural norm fulfillment, but the effect size was small. Hence, this perspective does not satisfactorily account for self-esteem's function. We found a strong relation between Extraversion and higher self-esteem, but no such relation between Agreeableness and self-esteem. These 2 traits are pillars of interpersonal belonging. Hence, the results do not fit the interpersonal-belonging perspective either. However, the results closely fit the getting-ahead perspective. The relation between Extraversion and higher self-esteem is consistent with this perspective, because Extraversion is the Big Five driver for getting ahead in the social world. The relation between Agreeableness and lower neuroticism is also consistent with this perspective, because Agreeableness is the Big Five driver for getting along with others.

  8. Novel insights in the fecal egg count reduction test for monitoring drug efficacy against soil-transmitted helminths in large-scale treatment programs.

    Directory of Open Access Journals (Sweden)

    Bruno Levecke

    2011-12-01

    Full Text Available BACKGROUND: The fecal egg count reduction test (FECRT) is recommended to monitor drug efficacy against soil-transmitted helminths (STHs) in public health. However, the impact of factors inherent to study design (sample size and detection limit of the fecal egg count (FEC) method) and host-parasite interactions (mean baseline FEC and aggregation of FEC across host population) on the reliability of FECRT is poorly understood. METHODOLOGY/PRINCIPAL FINDINGS: A simulation study was performed in which FECRT was assessed under varying conditions of the aforementioned factors. Classification trees were built to explore critical values for these factors required to obtain conclusive FECRT results. The outcome of this analysis was subsequently validated on five efficacy trials across Africa, Asia, and Latin America. Unsatisfactory (<85.0%) sensitivity and specificity results to detect reduced efficacy were found if sample sizes were small (<10) or if sample sizes were moderate (10-49) combined with highly aggregated FEC (k<0.25). FECRT remained inconclusive under any evaluated condition for drug efficacies ranging from 87.5% to 92.5% for a reduced-efficacy-threshold of 90% and from 92.5% to 97.5% for a threshold of 95%. The most discriminatory study design required 200 subjects independent of STH status (including subjects who are not excreting eggs). For this sample size, the detection limit of the FEC method and the level of aggregation of the FEC did not affect the interpretation of the FECRT. Only for a threshold of 90%, mean baseline FEC <150 eggs per gram of stool led to a reduced discriminatory power. CONCLUSIONS/SIGNIFICANCE: This study confirms that the interpretation of FECRT is affected by a complex interplay of factors inherent to both study design and host-parasite interactions. The results also highlight that revision of the current World Health Organization guidelines to monitor drug efficacy is indicated. We, therefore, propose novel guidelines to
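
    A minimal sketch of the arithmetic-mean FECRT estimator, applied to simulated egg counts with negative-binomial aggregation (the k parameter discussed above), is shown below; the sample size, mean FEC, k and efficacy values are illustrative and are not the study's simulation settings.

    import numpy as np

    def fecrt(pre_counts, post_counts):
        """Fecal egg count reduction (%), arithmetic-mean-based estimator."""
        return 100.0 * (1.0 - np.mean(post_counts) / np.mean(pre_counts))

    # Simulate one trial arm with aggregated (negative binomial) egg counts.
    # Parameter values are illustrative, not the study's settings.
    rng = np.random.default_rng(1)
    n, mean_fec, k, true_efficacy = 200, 150, 0.25, 0.90
    pre = rng.negative_binomial(k, k / (k + mean_fec), size=n)
    post = rng.binomial(pre, 1.0 - true_efficacy)   # each egg survives with prob 1 - efficacy

    print(f"estimated FECRT: {fecrt(pre, post):.1f}% (true efficacy {true_efficacy:.0%})")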

  9. Mapping of Schistosomiasis and Soil-Transmitted Helminths in Namibia: The First Large-Scale Protocol to Formally Include Rapid Diagnostic Tests

    Science.gov (United States)

    Sousa-Figueiredo, José Carlos; Stanton, Michelle C.; Katokele, Stark; Arinaitwe, Moses; Adriko, Moses; Balfour, Lexi; Reiff, Mark; Lancaster, Warren; Noden, Bruce H.; Bock, Ronnie; Stothard, J. Russell

    2015-01-01

    Background Namibia is now ready to begin mass drug administration of praziquantel and albendazole against schistosomiasis and soil-transmitted helminths, respectively. Although historical data identifies areas of transmission of these neglected tropical diseases (NTDs), there is a need to update epidemiological data. For this reason, Namibia adopted a new protocol for mapping of schistosomiasis and geohelminths, formally integrating rapid diagnostic tests (RDTs) for infections and morbidity. In this article, we explain the protocol in detail, and introduce the concept of ‘mapping resolution’, as well as present results and treatment recommendations for northern Namibia. Methods/Findings/Interpretation This new protocol allowed a large sample to be surveyed (N = 17 896 children from 299 schools) at relatively low cost (7 USD per person mapped) and very quickly (28 working days). All children were analysed by RDTs, but only a sub-sample was also diagnosed by light microscopy. Overall prevalence of schistosomiasis in the surveyed areas was 9.0%, highly associated with poorer access to potable water (OR = 1.5, P<0.001) and defective (OR = 1.2, P<0.001) or absent sanitation infrastructure (OR = 2.0, P<0.001). Overall prevalence of geohelminths, more particularly hookworm infection, was 12.2%, highly associated with presence of faecal occult blood (OR = 1.9, P<0.001). Prevalence maps were produced and hot spots identified to better guide the national programme in drug administration, as well as targeted improvements in water, sanitation and hygiene. The RDTs employed (circulating cathodic antigen and microhaematuria for Schistosoma mansoni and S. haematobium, respectively) performed well, with sensitivities above 80% and specificities above 95%. Conclusion/Significance This protocol is cost-effective and sensitive to budget limitations and the potential economic and logistical strains placed on the national Ministries of Health. Here we present a high resolution map

  10. A combined cycle engine test facility

    Energy Technology Data Exchange (ETDEWEB)

    Engers, R.; Cresci, D.; Tsai, C. [General Applied Science Laboratories Inc., Ronkonkoma, NY (United States)

    1995-09-01

    Rocket-Based Combined-Cycle (RBCC) engines intended for missiles and/or space launch applications incorporate features of rocket propulsion systems operating in concert with airbreathing engine cycles. Performance evaluation of these types of engines, which are intended to operate from static sea level take-off to supersonic cruise or accelerate to orbit, requires ground test capabilities which integrate rocket component testing with airbreathing engine testing. A combined cycle engine test facility has been constructed in the General Applied Science Laboratories, Inc. (GASL) Aeropropulsion Test Laboratory to meet this requirement. The facility was designed to support the development of an innovative combined cycle engine concept which features a rocket based ramjet combustor. The test requirements included the ability to conduct tests in which the propulsive force was generated by the rocket only, the ramjet only, and simultaneous rocket and ramjet power (combined cycle) to evaluate combustor operation over the entire engine cycle. The test facility provides simulation over the flight Mach number range of 0 to 8 and at various trajectories. The capabilities of the combined cycle engine test facility are presented.

  11. A negative ion source test facility

    Science.gov (United States)

    Melanson, S.; Dehnel, M.; Potkins, D.; Theroux, J.; Hollinger, C.; Martin, J.; Philpott, C.; Stewart, T.; Jackle, P.; Williams, P.; Brown, S.; Jones, T.; Coad, B.; Withington, S.

    2016-02-01

    Progress is being made in the development of an Ion Source Test Facility (ISTF) by D-Pace Inc. in collaboration with Buckley Systems Ltd. in Auckland, NZ. The first phase of the ISTF is to be commissioned in October 2015 with the second phase being commissioned in March 2016. The facility will primarily be used for the development and the commercialization of ion sources. It will also be used to characterize and further develop various D-Pace Inc. beam diagnostic devices.

  12. A negative ion source test facility

    Energy Technology Data Exchange (ETDEWEB)

    Melanson, S.; Dehnel, M., E-mail: morgan@d-pace.com; Potkins, D.; Theroux, J.; Hollinger, C.; Martin, J.; Stewart, T.; Jackle, P.; Withington, S. [D-Pace, Inc., P.O. Box 201, Nelson, British Columbia V1L 5P9 (Canada); Philpott, C.; Williams, P.; Brown, S.; Jones, T.; Coad, B. [Buckley Systems Ltd., 6 Bowden Road, Mount Wellington, Auckland 1060 (New Zealand)

    2016-02-15

    Progress is being made in the development of an Ion Source Test Facility (ISTF) by D-Pace Inc. in collaboration with Buckley Systems Ltd. in Auckland, NZ. The first phase of the ISTF is to be commissioned in October 2015 with the second phase being commissioned in March 2016. The facility will primarily be used for the development and the commercialization of ion sources. It will also be used to characterize and further develop various D-Pace Inc. beam diagnostic devices.

  13. Development of Thermal Shock Test Facility

    Science.gov (United States)

    Lehmann, B.; Varewijck, G.; Dufour, J.-F.

    2012-07-01

    Thermal shock testing is performed to qualify materials and processes for use in space in accordance with ECSS-Q-70-04A. The Fast Thermal Vacuum facility (FTV) has been specially designed to allow testing from -100 °C up to 550 °C. This large temperature test range is achieved by having two separate temperature-controlled compartments. The specimen is placed on a trolley, which moves from one compartment to the other. The challenge in development of the facility was the relatively large size of the compartments (600 mm x 600 mm x 400 mm) and the required vacuum level of p~1E-05 mbar. The FTV was successfully commissioned in September 2010. The presentation summarises the results of the commissioning, facility performance, test data and lessons learned.

  14. Recessed light fixture test facility

    Energy Technology Data Exchange (ETDEWEB)

    Yarbrough, D.W.; Yoo, K.T.; Koneru, P.B.

    1979-07-01

    Test results are presented for the operation of recessed light fixtures in contact with loose fill cellulose insulation. Nine recessed fixtures were operated at different power levels in attic sections in which loose fill cellulose was purposely misapplied. Cellulose insulation was introduced into the ceiling section by pouring to depths of up to nine inches. Maximum steady state temperatures were recorded for 485 combinations of the variables insulation depth, fixture power, and attic temperature. Results are included for operation of fixtures in the absence of cellulose and with barriers to provide needed clearance between the cellulose insulation and the powered fixtures. Observed temperatures on the electrical power cable attached to a fixture and ceiling joists adjacent to powered fixtures are reported. Examination of the data shows excess operating temperatures are encountered when powered fixtures are covered by three inches of loose fill insulation. Dangerous temperatures resulting in fires in some cases were recorded when covered fixtures were operated at above rated power levels. A preliminary analysis indicates that ceiling side heat transfer accounts for 85 to 90% of the heat dissipation from powered fixtures covered by three inches of loose fill cellulosic insulation.

  15. High-Throughput, Large-Scale SNP Genotyping: Bioinformatics Considerations

    OpenAIRE

    Margetic, Nino

    2004-01-01

    In order to provide a high-throughput, large-scale genotyping facility at the national level we have developed a set of inter-dependent information systems. A combination of commercial, publicly-available and in-house developed tools links a series of data repositories based both on flat files and relational databases providing an almost complete semi-automated pipeline.

  16. Characterizing experiments of the PPOOLEX test facility

    Energy Technology Data Exchange (ETDEWEB)

    Puustinen, M.; Laine, J. (Lappeenranta Univ. of Technology, Nuclear Safety Research Unit (Finland))

    2008-07-15

    This report summarizes the results of the characterizing test series in 2007 with the scaled down PPOOLEX facility designed and constructed at Lappeenranta University of Technology. Air and steam/air mixture was blown into the dry well compartment and from there through a DN200 blowdown pipe to the condensation pool (wet well). Altogether eight air and four steam/air mixture experiments, each consisting of several blows (tests), were carried out. The main purpose of the experiment series was to study the general behavior of the facility and the performance of basic instrumentation. Proper operation of automation, control and safety systems was also tested. The test facility is a closed stainless steel vessel divided into two compartments, dry well and wet well. The facility is equipped with high frequency measurements for capturing different aspects of the investigated phenomena. The general behavior of the PPOOLEX facility differs significantly from that of the previous POOLEX facility because of the closed two-compartment structure of the test vessel. Heat-up by several tens of degrees due to compression in both compartments was the most obvious evidence of this. Temperatures also stratified. Condensation oscillations and chugging phenomenon were encountered in those tests where the fraction of non-condensables had time to decrease significantly. A radical change from smooth condensation behavior to oscillating one occurred quite abruptly when the air fraction of the blowdown pipe flow dropped close to zero. The experiments again demonstrated the strong diminishing effect that noncondensable gases have on dynamic unsteady loadings experienced by submerged pool structures. BWR containment like behavior related to the beginning of a postulated steam line break accident was observed in the PPOOLEX test facility during the steam/air mixture experiments. The most important task of the research project, to produce experimental data for code simulation purposes, can be

  17. Hypersonic Tunnel Facility (HTF)

    Data.gov (United States)

    Federal Laboratory Consortium — The Hypersonic Tunnel Facility (HTF) is a blow-down, non-vitiated (clean air) free-jet wind tunnel capable of testing large-scale, propulsion systems at Mach 5, 6,...

  19. Kauai Test Facility hazards assessment document

    Energy Technology Data Exchange (ETDEWEB)

    Swihart, A

    1995-05-01

    The Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the Kauai Test Facility, Barking Sands, Kauai, Hawaii. The Kauai Test Facility's chemical and radiological inventories were screened according to potential airborne impact to onsite and offsite individuals. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance to the Early Severe Health Effects threshold is 4.2 kilometers. The highest emergency classification is a General Emergency at the "Main Complex" and a Site Area Emergency at the Kokole Point Launch Site. The Emergency Planning Zone for the "Main Complex" is 5 kilometers. The Emergency Planning Zone for the Kokole Point Launch Site is the Pacific Missile Range Facility's site boundary.

  20. A Low Temperature Regenerator Test Facility

    Science.gov (United States)

    Kashani, A.; Helvensteijn, B. P. M.; Feller, J. R.; Salerno, L. J.; Kittel, P.

    2008-03-01

    Testing regenerators presents an interesting challenge. When incorporated into a cryocooler, a regenerator is intimately coupled to the other components: expander, heat exchangers, and compressor. It is difficult to isolate the performance of any single component. We have developed a low temperature test facility that will allow us to separate the performance of the regenerator from the rest of the cryocooler. The purpose of the facility is the characterization of test regenerators using novel materials and/or geometries in temperature ranges down to 15 K. It consists of the following elements: The test column has two regenerators stacked in series. The coldest stage regenerator is the device under test. The warmer stage regenerator contains a stack of stainless steel screen, a well-characterized material. A commercial cryocooler is used to fix the temperatures at both ends of the test regenerator, cooling both heat exchangers flanging the regenerator stack. Heaters allow varying the temperatures and allow measurement of the remaining cooling power, and thus, regenerator effectiveness. A linear compressor delivers an oscillating pressure to the regenerator assembly. An inertance tube and reservoir provide the proper phase difference between mass flow and pressure. This phase shift, along with the imposed temperature differential, simulates the conditions of the test regenerator when used in an actual pulse tube cryocooler. This paper presents development details of the regenerator test facility, and test results on a second stage, stainless steel screen test regenerator.

  1. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    small and large scale model tests show no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for waveheights below Hs = 0.5m as the water sunk into the voids between the stones on the crest. For low overtopping scale effects...... are presented as the small-scale model underpredicts the overtopping discharge....

  2. Grid sensitivity capability for large scale structures

    Science.gov (United States)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  3. Test Method for Temperature Uniformity of a Large-Scale Gas-Fired Car-Type Heat Treatment Furnace

    Institute of Scientific and Technical Information of China (English)

    刘剑; 殷恺

    2012-01-01

    The factors influencing the temperature uniformity of a large-scale gas-fired heat treatment furnace and the methods for testing that uniformity are analyzed. By improving the burner type, the charging arrangement, the operating condition of the burners, and the placement of the temperature-control thermocouples, the temperature uniformity of a large gas-fired car-type heat treatment furnace can be improved.
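
    A minimal sketch of how a temperature-uniformity figure might be computed from a thermocouple survey of the furnace working zone is given below; the setpoint, survey readings and the +/- 10 C acceptance band are assumed examples, not values from the article.

    # Minimal sketch of a temperature-uniformity figure from a survey of
    # thermocouples placed in the furnace working zone. The acceptance band
    # (+/- 10 C) is an assumed example, not a value from the article.
    def uniformity(readings_c, setpoint_c, band_c=10.0):
        deviations = [r - setpoint_c for r in readings_c]
        spread = max(readings_c) - min(readings_c)
        ok = all(abs(d) <= band_c for d in deviations)
        return spread, max(deviations), min(deviations), ok

    survey = [846.0, 852.5, 849.0, 855.0, 843.5, 850.0]   # hypothetical survey at 850 C setpoint
    spread, dmax, dmin, ok = uniformity(survey, 850.0)
    print(f"spread {spread:.1f} C, deviations +{dmax:.1f}/{dmin:.1f} C, within band: {ok}")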

  4. A 33-GVA Interrupter Test Facility

    Science.gov (United States)

    1979-06-01

  5. ATLAS Large Scale Thin Gap Chambers

    Energy Technology Data Exchange (ETDEWEB)

    Soha, Aria [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2014-04-29

    This is a technical scope of work (TSW) between the Fermi National Accelerator Laboratory (Fermilab) and the experimenters of the ATLAS sTGC New Small Wheel collaboration who have committed to participate in beam tests to be carried out during the FY2014 Fermilab Test Beam Facility program.

  6. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  8. FAST FLUX TEST FACILITY DRIVER FUEL MEETING

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1966-06-01

    The Pacific Northwest Laboratory has convened this meeting to enlist the best talents of our laboratories and industry in soliciting factual, technical information pertinent to the Pacific Northwest Laboratory's evaluation of the potential fuel systems for the Fast Flux Test Facility. The particular factors emphasized for these fuel systems are those associated with safety, ability to meet testing objectives, and economics. The proceedings include twenty-three presentations, along with a transcript of the discussion following each, as well as a summary discussion.

  9. Large-Scale Shaking Table Model Test for Highway Fill Subgrade in a High-Intensity Earthquake Area

    Institute of Scientific and Technical Information of China (English)

    李金贝; 张鸿儒; 李志强

    2011-01-01

    In order to study the seismic safety of highway fill subgrade, a large-scale shaking table model test scheme was designed for a fill subgrade in accordance with the test requirements. The shaking table array system, the design of the model box and the data acquisition system are first introduced. The treatment of the model boundary conditions and the similarity relations of the fill subgrade model are then analyzed, and the similarity constants are determined by combining previous research results with the characteristics of this test; the model material was determined by using the equivalent-substitution method to handle the similar gradation. The sensor layout was selected on the basis of a numerical analysis of the dynamic response characteristics of the subgrade, and the loading scheme of the model test was prepared according to its purpose. The results show that the design of the large-scale shaking table model test is feasible and that the purpose of the shaking table test was achieved.
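
    To illustrate the kind of similarity constants mentioned above, the sketch below derives the dependent scale factors for a 1-g shaking table test once length, density and acceleration scales are fixed; the 1/20 length scale and equal-density assumption are illustrative and are not the constants adopted in the paper.

    # Sketch of similarity constants for a 1-g shaking table model test from
    # three independent scale factors (length, density, acceleration). The
    # numerical values are assumptions, not the paper's constants.
    import math

    S_L = 1 / 20     # length scale (model / prototype), assumed
    S_rho = 1.0      # density scale, assumed equal-density model material
    S_a = 1.0        # acceleration scale (1-g test)

    S_stress = S_rho * S_a * S_L        # stress scale
    S_time = math.sqrt(S_L / S_a)       # time scale
    S_freq = 1 / S_time                 # frequency scale
    S_E = S_stress                      # elastic modulus (same dimension as stress)

    print(f"stress {S_stress:.4f}, time {S_time:.3f}, frequency {S_freq:.2f}, modulus {S_E:.4f}")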

  10. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
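
    A compact illustration of the basic SQP iteration underlying such algorithms is sketched below for a tiny equality-constrained problem; it uses the full-space KKT system with the exact Lagrangian Hessian rather than the reduced-Hessian quasi-Newton machinery described above, so it is a conceptual sketch only.

    import numpy as np

    def sqp_step(x, lam, grad_f, A, c, B):
        """One SQP iterate for  min f(x)  s.t.  c(x) = 0:
        solve the KKT system  [B A^T; A 0] [dx; dlam] = -[grad_f + A^T lam; c]."""
        n, m = len(x), len(c)
        K = np.block([[B, A.T], [A, np.zeros((m, m))]])
        rhs = -np.concatenate([grad_f + A.T @ lam, c])
        d = np.linalg.solve(K, rhs)
        return x + d[:n], lam + d[n:]

    # Toy problem: min x0^2 + x1^2  s.t.  x0 + x1 = 1  (optimum at [0.5, 0.5]).
    x, lam = np.array([2.0, 0.0]), np.array([0.0])
    for _ in range(3):
        grad_f = 2.0 * x
        A = np.array([[1.0, 1.0]])          # constraint Jacobian
        c = np.array([x[0] + x[1] - 1.0])
        B = 2.0 * np.eye(2)                 # exact Lagrangian Hessian for this quadratic;
                                            # a large-scale code would maintain a quasi-Newton
                                            # (reduced) Hessian approximation instead
        x, lam = sqp_step(x, lam, grad_f, A, c, B)
    print(x)   # -> [0.5 0.5]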

  11. C Reactor overbore test facility review

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, P.A.; Nilson, R.

    1964-04-24

    In 1961, large-size, smooth-bore, Zircaloy process tubes were installed in C-Reactor graphite channels that had been enlarged to 2.275 inches. These tubes were installed to provide a test and demonstration facility for the concept of overboring as a means of securing significant improvement in the production capability of the reactors. After two years of facility operation, it is now appropriate to consider the extent to which original objectives have been achieved, to re-examine the original objectives, and to consider the best future use of this unique facility. This report presents the results of such a review and re-examination.

  12. Testing principle of BOR and its calculation for a large-scale full-containment 16×10⁴ m³ LNG storage tank

    Institute of Scientific and Technical Information of China (English)

    胡小波; 魏玉迎

    2012-01-01

    Taking a large-scale full-containment 16×10⁴ m³ LNG storage tank as an example, this paper describes its structure, introduces the testing principle of the static boil-off rate (BOR), and discusses in detail the heat calculation method used during the test. Based on field measurement experience, it also reviews the standing (hold) period of the tank during the test, the isolation settings of the relevant valves before the standing period, on-site data measurement, post-test data processing, and the restoration of the process flow after the test is completed, and puts forward related process operation suggestions for this type of test. Finally, the factors influencing the test results are discussed. The work can serve as a reference for future BOR testing of similar LNG tanks.
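
    A minimal sketch of the static boil-off rate figure itself, computed from an assumed boil-off gas flow and stored LNG mass, is shown below; all numerical inputs are illustrative and are not measurements from the cited test.

    # Minimal sketch of the static boil-off rate (BOR): the fraction of the
    # stored liquid evaporated per day, inferred from the measured boil-off
    # gas (BOG) flow. All numerical values are assumed for illustration.
    def bor_percent_per_day(bog_mass_flow_kg_h, lng_density_kg_m3, liquid_volume_m3):
        stored_mass_kg = lng_density_kg_m3 * liquid_volume_m3
        return bog_mass_flow_kg_h * 24.0 / stored_mass_kg * 100.0

    # e.g. a 160 000 m3 tank near full, LNG density ~450 kg/m3, BOG flow ~1500 kg/h
    print(f"BOR ~ {bor_percent_per_day(1500.0, 450.0, 150000.0):.3f} %/day")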

  13. Construction of thermal ratchet structural test facility

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyeong Yeon; Kim, J. B.; Yoo, B

    2000-12-01

    The objective of this study is to set up the thermal ratchet test facility to validate the NONSTA code, which is under development for inelastic structural analysis, and to characterize the thermal ratchet behavior through structural thermal ratchet tests. The thermal ratchet phenomenon, a progressive inelastic deformation, can occur in a liquid metal reactor operating at high temperatures above 500 deg C due to the moving axial temperature distribution as the hot free surface rises and falls with the cyclic heat-up and cool-down of reactor operation. Thermal ratcheting can cause severe damage to the reactor structure. The structural ratchet test was performed and the test results were compared with those of the analysis using the Chaboche constitutive model. The fabrication of the ratchet test facility was completed in the first quarter of 2000 and the performance test was carried out in the second quarter; thermocouple noise reduction and laser displacement sensor measurements with the data acquisition system were carried out in the third quarter, and the test results were compared with those of the inelastic structural analysis in the fourth quarter of 2000, showing reasonable agreement.

  14. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  15. Network robustness under large-scale attacks

    CERN Document Server

    Zhou, Qing; Liu, Ruifang; Cui, Shuguang

    2014-01-01

    Network Robustness under Large-Scale Attacks provides the analysis of network robustness under attacks, with a focus on large-scale correlated physical attacks. The book begins with a thorough overview of the latest research and techniques to analyze the network responses to different types of attacks over various network topologies and connection models. It then introduces a new large-scale physical attack model coined as area attack, under which a new network robustness measure is introduced and applied to study the network responses. With this book, readers will learn the necessary tools to evaluate how a complex network responds to random and possibly correlated attacks.

  16. Safety assessment for the rf Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Nagy, A.; Beane, F. (eds.)

    1984-08-01

    The Radio Frequency Test Facility (RFTF) is a part of the Magnetic Fusion Program's rf Heating Experiments. The goal of the Magnetic Fusion Program (MFP) is to develop and demonstrate the practical application of fusion. RFTF is an experimental device which will provide an essential link in the research effort aiming at the realization of fusion power. This report was compiled as a summary of the analysis done to ensure the safe operation of RFTF.

  17. Sensor test facilities and capabilities at the Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    Boyer, W.B.; Burke, L.J.; Gomez, B.J.; Livingston, L.; Nelson, D.S.; Smathers, D.C.

    1996-12-31

    Sandia National Laboratories has recently developed two major field test capabilities for unattended ground sensor systems at the Department of Energy's Nevada Test Site (NTS). The first capability utilizes the NTS large area, varied terrain, and intrasite communications systems for testing sensors for detecting and tracking vehicular traffic. Sensor and ground truth data can be collected at either of two secure control centers. This system also includes an automated ground truth capability that consists of differential Global Positioning Satellite (GPS) receivers on test vehicles and live TV coverage of critical road sections. Finally, there is a high-speed, secure computer network link between the control centers and the Air Force's Theater Air Command and Control Simulation Facility in Albuquerque, NM. The second capability is Bunker 2-300. It is a facility for evaluating advanced sensor systems for monitoring activities in underground cut-and-cover facilities. The main part of the facility consists of an underground bunker with three large rooms for operating various types of equipment. This equipment includes simulated chemical production machinery and controlled seismic and acoustic signal sources. There has been a thorough geologic and electromagnetic characterization of the region around the bunker. Since the facility is in a remote location, it is well-isolated from seismic, acoustic, and electromagnetic interference.

  18. The Anchor Parameter Drift Test in Equating a Large-scale Examination in China

    Institute of Scientific and Technical Information of China (English)

    刘铁川; 戴海琦; 赵玉

    2012-01-01

    In a large-scale examination, common items or anchors are frequently embedded in different test forms for equating. The Non-Equivalent Anchor Test (NEAT) design requires not only that the anchors be representative in content but also that they function equivalently across test forms. As a result of the effects of irrelevant factors, some anchors' parameters may change substantially between administrations. Goldstein (1983) named this phenomenon item parameter drift (IPD). Drifted anchors may cause systematic error in equating (Huiqin, Rogers, & Vukmirovic, 2008), but until now few studies have addressed this issue in China. In the present paper, several approaches for detecting drifted items and minimizing their effect on equating are outlined first. Then, two popular methods for Differential Item Functioning (DIF) detection, the MH test and logistic regression, were used to examine anchors in equating two test forms from a large-scale examination in China. The MH method was carried out with DIFAS 4.0 and the logistic regression with R. To control Type I error, ETS's classification criteria and pseudo R-squareds were also considered when the MH test and logistic regression were performed. Data from the two test forms were fitted and equated with the Three-Parameter Logistic Model (3PLM) after the drifted anchors were deleted; item parameter estimation under the 3PLM was performed with BILOG-MG. Factor analysis suggested that the 3PLM could be used to fit the two test forms. The results showed: (1) Twenty-two anchors were detected with parameter drift; both the difficulty and discrimination parameters of anchors could drift across test forms. (2) Twenty-one of the drifted anchors fitted the 3PLM well in the old test form, but sixteen of them misfitted in the new test form. (3) Equating results with the Mean/Square method before and after the deletion of drifted anchors were very different
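
    A minimal sketch of the logistic-regression DIF/drift check described above, using a likelihood-ratio comparison of nested models, is given below; it assumes numpy, scipy and statsmodels are available and uses synthetic data, so it is illustrative rather than a reproduction of the DIFAS/R analyses.

    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    def lr_dif_test(score, group, correct):
        """Likelihood-ratio DIF test for one anchor item via logistic regression:
        compare P(correct | score) against P(correct | score, group, score*group).
        A significant gain (2 df) flags uniform and/or non-uniform DIF / drift."""
        X0 = sm.add_constant(np.column_stack([score]))
        X1 = sm.add_constant(np.column_stack([score, group, score * group]))
        ll0 = sm.Logit(correct, X0).fit(disp=0).llf
        ll1 = sm.Logit(correct, X1).fit(disp=0).llf
        g2 = 2.0 * (ll1 - ll0)
        return g2, stats.chi2.sf(g2, df=2)

    # Synthetic example: the item is easier for the 'new form' group (a drifted anchor).
    rng = np.random.default_rng(2)
    n = 4000
    group = rng.integers(0, 2, n)                 # 0 = old form, 1 = new form
    theta = rng.normal(0, 1, n)                   # proficiency
    score = theta + rng.normal(0, 0.3, n)         # observed total-score proxy
    p = 1 / (1 + np.exp(-(1.2 * theta - 0.5 + 0.6 * group)))
    correct = rng.binomial(1, p)

    g2, pval = lr_dif_test(score, group, correct)
    print(f"G2 = {g2:.1f}, p = {pval:.2g}")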

  19. Simulation Facilities and Test Beds for Galileo

    Science.gov (United States)

    Schlarmann, Bernhard Kl.; Leonard, Arian

    2002-01-01

    Galileo is the European satellite navigation system, financed by the European Space Agency (ESA) and the European Commission (EC). The Galileo System, currently in its definition phase, will offer seamless global coverage, providing state-of-the-art positioning and timing services. Galileo services will include a standard service targeted at mass market users, an augmented integrity service providing integrity warnings when faults occur, and Public Regulated Services (ensuring a continuity of service for the public users). Other services are under consideration (SAR and integrated communications). Galileo will be interoperable with GPS, and will be complemented by local elements that will enhance the services for specific local users. In the frame of the Galileo definition phase, several system design and simulation facilities and test beds have been defined and developed for the coming phases of the project, or are currently under development. These are mainly the following tools: the Galileo Mission Analysis Simulator, to design the Space Segment, especially to support constellation design, deployment and replacement; the Galileo Service Volume Simulator, to analyse the global performance requirements based on a coverage analysis for different service levels and degraded modes; the Galileo System Simulation Facility, a sophisticated end-to-end simulation tool to assess the navigation performances for a complete variety of users under different operating conditions and different modes; the Galileo Signal Validation Facility, to evaluate signal and message structures for Galileo; and the Galileo System Test Bed (Version 1), to assess and refine the Orbit Determination & Time Synchronisation and Integrity algorithms, through experiments relying on GPS space infrastructure. This paper presents an overview of the so-called "G-Facilities" and describes the use of the different system design tools during the project life cycle in order to design the system with respect to

  20. Large Scale Metal Additive Techniques Review

    Energy Technology Data Exchange (ETDEWEB)

    Nycz, Andrzej [ORNL; Adediran, Adeola I [ORNL; Noakes, Mark W [ORNL; Love, Lonnie J [ORNL

    2016-01-01

    In recent years additive manufacturing has made long strides toward becoming a mainstream production technology. Particularly strong progress has been made in large-scale polymer deposition. However, large-scale metal additive manufacturing has not yet reached parity with large-scale polymer deposition. This paper is a review study of metal additive techniques in the context of building large structures. Current commercial devices are capable of printing metal parts on the order of several cubic feet, compared to hundreds of cubic feet on the polymer side. In order to follow the polymer progress path, several factors are considered: potential to scale, economy, environmental friendliness, material properties, feedstock availability, robustness of the process, quality and accuracy, potential for defects, and post-processing, as well as potential applications. This paper focuses on the current state of the art of large-scale metal additive technology with a focus on expanding the geometric limits.

  1. Clemson University Wind Turbine Drivetrain Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Tuten, James Maner [Clemson Univ., SC (United States); Haque, Imtiaz [Clemson Univ., SC (United States); Rigas, Nikolaos [Clemson Univ., SC (United States)

    2016-03-30

    In November of 2009, Clemson University was awarded a competitive grant from the U.S. Department of Energy to design, build and operate a facility for full-scale, highly accelerated mechanical testing of next-generation wind turbine drivetrain technologies. The primary goal of the project was to design, construct, commission, and operate a state-of-the-art sustainable facility that permits full-scale highly accelerated testing of advanced drivetrain systems for large wind turbines. The secondary goal was to meet the objectives of the American Recovery and Reinvestment Act of 2009, especially in job creation, and provide a positive impact on economically distressed areas in the United States, and preservation and economic recovery in an expeditious manner. The project was executed according to a managed cooperative agreement with the Department of Energy and was an extraordinary success. The resultant new facility is located in North Charleston, SC, providing easy transportation access by rail, road or ship and operates on an open access model such that it is available to the U.S. Wind Industry for research, analysis, and evaluation activities. The 72 m by 97 m facility features two mechanical dynamometer test bays for evaluating the torque and blade dynamic forces experienced by the rotors of wind turbine drivetrains. The dynamometers are rated at 7.5 MW and 15 MW of low speed shaft power and are configured as independent test areas capable of simultaneous operation. All six degrees of freedom, three linear and three rotational, for blade and rotor dynamics are replicated through the combination of a drive motor, speed reduction gearbox and a controllable hydraulic load application unit (LAU). This new LAU setup readily supports accelerated lifetime mechanical testing and load analysis for the entire drivetrain system of the nacelle and easily simulates a wide variety of realistic operating scenarios in a controlled laboratory environment. The development of these

  2. Modular High Current Test Facility at LLNL

    Energy Technology Data Exchange (ETDEWEB)

    Tully, L K; Goerz, D A; Speer, R D; Ferriera, T J

    2008-05-20

    This paper describes the 1 MA, 225 kJ test facility in operation at Lawrence Livermore National Laboratory (LLNL). The capacitor bank is constructed from three parallel 1.5 mF modules. The modules are capable of switching simultaneously or sequentially via solid dielectric puncture switches. The bank nominally operates up to 10 kV and reaches peak current with all three cabled modules in approximately 30 µs. Parallel output plates from the bank allow for cable or busbar interfacing to the load. This versatile bank is currently in use for code validation experiments, railgun related activities, switch testing, and diagnostic development.
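
    The quoted energy rating is consistent with the stated bank parameters, as the short check below shows.

    # Quick consistency check of the quoted ratings: three 1.5 mF modules in
    # parallel charged to 10 kV store E = 1/2 * C * V^2.
    C_total = 3 * 1.5e-3        # farads
    V = 10e3                    # volts
    E = 0.5 * C_total * V**2
    print(f"stored energy at 10 kV: {E / 1e3:.0f} kJ")   # -> 225 kJ, matching the rating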

  3. Experimental simulation of microinteractions in large scale explosions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, X.; Luo, R.; Yuen, W.W.; Theofanous, T.G. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    This paper presents data and analysis of recent experiments conducted in the SIGMA-2000 facility to simulate microinteractions in large scale explosions. Specifically, the fragmentation behavior of a high temperature molten steel drop under high pressure (beyond critical) conditions is investigated. The current data demonstrate, for the first time, the effect of high pressure in suppressing the thermal effect of fragmentation under supercritical conditions. The results support the microinteractions idea, and the ESPROSE.m prediction of fragmentation rate. (author)

  4. The ESO Adaptive Optics Facility under Test

    Science.gov (United States)

    Arsenault, Robin; Madec, Pierre-Yves; Paufique, Jerome; La Penna, Paolo; Stroebele, Stefan; Vernet, Elise; Pirard, Jean-François; Hackenberg, Wolfgang; Kuntschner, Harald; Kolb, Johann; Muller, Nicolas; Le Louarn, Miska; Amico, Paola; Hubin, Norbert; Lizon, Jean-Louis; Ridings, Rob; Abad, Jose; Fischer, Gert; Heinz, Volker; Kiekebusch, Mario; Argomedo, Javier; Conzelmann, Ralf; Tordo, Sebastien; Donaldson, Rob; Soenke, Christian; Duhoux, Philippe; Fedrigo, Enrico; Delabre, Bernard; Jost, Andrea; Duchateau, Michel; Downing, Mark; Moreno, Javier; Manescau, Antonio; Bonaccini Calia, Domenico; Quattri, Marco; Dupuy, Christophe; Guidolin, Ivan; Comin, Mauro; Guzman, Ronald; Buzzoni, Bernard; Quentin, Jutta; Lewis, Steffan; Jolley, Paul; Kraus, Max; Pfrommer, Thomas; Garcia-Rissmann, Aurea; Biasi, Roberto; Gallieni, Daniele; Stuik, Remko

    2013-12-01

    The Adaptive Optics Facility project has received most of its subsystems in Garching and the ESO Integration Hall has become the central operation location for the next phase of the project. The main test bench ASSIST and the 2nd Generation M2-Unit (hosting the Deformable Secondary Mirror) have been granted acceptance late 2012. The DSM will now undergo a series of tests on ASSIST to qualify its optical performance which launches the System Test Phase of the AOF. The tests will validate the AO modules operation with the DSM: first the GRAAL adaptive optics module for Hawk-I in natural guide star AO mode on-axis and then its Ground Layer AO mode. This will be followed by the GALACSI (for MUSE) Wide-Field-Mode (GLAO) and then the more challenging Narrow-Field-Mode (LTAO). We will report on the status of the subsystems at the time of the conference but also on the performance of the delivered ASSIST test bench, the DSM and the 20 Watt Sodium fiber Laser pre-production unit which has validated all specifications before final manufacturing of the serial units. We will also present some considerations and tools to ensure an efficient operation of the Facility in Paranal.

  5. Facility effluent monitoring plan for the fast flux test facility

    Energy Technology Data Exchange (ETDEWEB)

    Nickels, J M; Dahl, N R

    1992-11-01

    A facility effluent monitoring plan is required by the US Department of Energy in US Department of Energy Order 5400.1 for any operations that involve hazardous materials and radioactive substances that could affect employee or public safety or the environment. A Facility Effluent Monitoring Plan determination was performed during calendar year 1991, and the evaluation confirmed the need for a facility effluent monitoring plan. This facility effluent monitoring plan assesses the effluent monitoring systems and evaluates whether they are adequate to ensure public health and safety as specified in applicable federal, state, and local requirements.

  6. The Great Plains Wind Power Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, John

    2014-01-31

    This multi-year, multi-faceted project was focused on the continued development of a nationally-recognized facility for the testing, characterization, and improvement of grid-connected wind turbines, integrated wind-water desalination systems, and related educational and outreach topics. The project involved numerous faculty and graduate students from various engineering departments, as well as others from the departments of Geosciences (in particular the Atmospheric Science Group) and Economics. It was organized through the National Wind Institute (NWI), which serves as an intellectual hub for interdisciplinary and transdisciplinary research, commercialization and education related to wind science, wind energy, wind engineering and wind hazard mitigation at Texas Tech University (TTU). Largely executed by an academic based team, the project resulted in approximately 38 peer-reviewed publications, 99 conference presentations, the development/expansion of several experimental facilities, and two provisional patents.

  7. The Great Plains Wind Power Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, John [Texas Tech Univ., Lubbock, TX (United States)

    2014-01-30

    This multi-year, multi-faceted project was focused on the continued development of a nationally-recognized facility for the testing, characterization, and improvement of grid-connected wind turbines, integrated wind-water desalination systems, and related educational and outreach topics. The project involved numerous faculty and graduate students from various engineering departments, as well as others from the departments of Geosciences (in particular the Atmospheric Science Group) and Economics. It was organized through the National Wind Institute (NWI), which serves as an intellectual hub for interdisciplinary and transdisciplinary research, commercialization and education related to wind science, wind energy, wind engineering and wind hazard mitigation at Texas Tech University (TTU). Largely executed by an academic based team, the project resulted in approximately 38 peer-reviewed publications, 99 conference presentations, the development/expansion of several experimental facilities, and two provisional patents.

  8. Advanced Test Reactor National Scientific User Facility

    Energy Technology Data Exchange (ETDEWEB)

    Frances M. Marshall; Jeff Benson; Mary Catherine Thelen

    2011-08-01

    The Advanced Test Reactor (ATR), at the Idaho National Laboratory (INL), is a large test reactor for providing the capability for studying the effects of intense neutron and gamma radiation on reactor materials and fuels. The ATR is a pressurized, light-water, high flux test reactor with a maximum operating power of 250 MWth. The INL also has several hot cells and other laboratories in which irradiated material can be examined to study material irradiation effects. In 2007 the US Department of Energy (DOE) designated the ATR as a National Scientific User Facility (NSUF) to facilitate greater access to the ATR and the associated INL laboratories for material testing research by a broader user community. This paper highlights the ATR NSUF research program and the associated educational initiatives.

  9. Test facility for rewetting experiments at CDTN

    Energy Technology Data Exchange (ETDEWEB)

    Rezende, Hugo C.; Mesquita, Amir Z.; Ladeira, Luiz C.D.; Santos, Andre A.C., E-mail: hcr@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (SETRE/CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Tecnologia de Reatores

    2015-07-01

    One of the most important subjects in nuclear reactor safety analysis is the rewetting of the reactor core after a Loss-of-Coolant Accident (LOCA) in a Light Water Reactor (LWR). Several codes for the prediction of the rewetting evolution are under development based on experimental results. In a Pressurized Water Reactor (PWR) the reflooding phase of a LOCA is when the fuel rods are rewetted from the bottom of the core to its top after having been totally uncovered and dried out. Out-of-pile reflooding experiments performed with electrically heated fuel rod simulators show different quench behavior depending on the rod geometry. A test facility for rewetting experiments (ITR - Instalacao de Testes de Remolhamento) has been constructed at the Thermal Hydraulics Laboratory of the Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), with the objective of performing investigations on basic phenomena that occur during the reflood phase of a LOCA in a PWR, using tubular and annular test sections. This paper presents the design aspects of the facility and the current stage of the work. The mechanical aspects of the installation, as well as its instrumentation, are described. Two typical tests are presented and the results are compared with theoretical calculations using a computer code. (author)
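
    A common way to reduce such reflooding data is to estimate the quench-front velocity from the times at which thermocouples at known elevations rewet. The sketch below is an illustrative post-processing example with hypothetical elevations and quench times; it is not the ITR instrumentation chain or the CDTN analysis code.

        # Illustrative estimate of quench-front velocity from thermocouple quench times.
        # Elevations and times are hypothetical placeholders, not ITR/CDTN data.
        import numpy as np

        elevations_m = np.array([0.5, 1.0, 1.5, 2.0])         # thermocouple elevations along the rod (assumed)
        quench_times_s = np.array([12.0, 26.0, 41.0, 57.0])   # time at which each elevation rewets (assumed)

        # A linear fit of elevation versus quench time gives the mean quench-front velocity.
        velocity_m_per_s, _ = np.polyfit(quench_times_s, elevations_m, 1)
        print(f"Mean quench-front velocity: {velocity_m_per_s*1000:.1f} mm/s")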

  10. Operation of the nuclear fuel cycle test facilities -Operation of the hot test loop facilities

    Energy Technology Data Exchange (ETDEWEB)

    Chun, S. Y.; Jeong, M. K.; Park, C. K.; Yang, S. K.; Won, S. Y.; Song, C. H.; Jeon, H. K.; Jeong, H. J.; Cho, S.; Min, K. H.; Jeong, J. H.

    1997-01-01

    The performance and reliability of a newly designed advanced nuclear fuel and reactor should be verified by performing thermal hydraulics tests. In the thermal hydraulics research team, the thermal hydraulics tests associated with the development of an advanced nuclear fuel and reactor have been carried out with test facilities such as the Hot Test Loop, operated under high temperature and pressure conditions, the Cold Test Loop, the RCS Loop and the B and C Loop. The objective of this project is to obtain the available experimental data and to develop advanced measuring techniques by taking full advantage of the facilities. The facilities operated by the thermal hydraulics research team have been maintained and repaired in order to carry out the thermal hydraulics tests necessary for providing these data. The performance tests for the double-grid-type bottom end piece, which was improved for debris filtering effectiveness, were performed using the PWR Hot Test Loop. The CANDU Hot Test Loop was operated to carry out the pressure drop tests and strength tests of CANFLEX fuel. The Cold Test Loop was used to obtain local velocity data in subchannels within the HANARO fuel bundle and to study the thermal mixing characteristics of a PWR fuel bundle. The RCS thermal hydraulic loop was constructed, and experiments have been carried out to measure the critical heat flux. In the B and C Loop, performance tests for each component were carried out. (author). 19 tabs., 78 figs., 19 refs.

  11. Clemson University Wind Turbine Drivetrain Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Tuten, James Maner [Clemson Univ., SC (United States); Haque, Imtiaz [Clemson Univ., SC (United States); Rigas, Nikolaos [Clemson Univ., SC (United States)

    2016-03-30

    In November of 2009, Clemson University was awarded a competitive grant from the U.S. Department of Energy to design, build and operate a facility for full-scale, highly accelerated mechanical testing of next-generation wind turbine drivetrain technologies. The primary goal of the project was to design, construct, commission, and operate a state-of-the-art sustainable facility that permits full-scale highly accelerated testing of advanced drivetrain systems for large wind turbines. The secondary goal was to meet the objectives of the American Recovery and Reinvestment Act of 2009, especially in job creation, and provide a positive impact on economically distressed areas in the United States, and preservation and economic recovery in an expeditious manner. The project was executed according to a managed cooperative agreement with the Department of Energy and was an extraordinary success. The resultant new facility is located in North Charleston, SC, providing easy transportation access by rail, road or ship and operates on an open access model such that it is available to the U.S. Wind Industry for research, analysis, and evaluation activities. The 72 m by 97 m facility features two mechanical dynamometer test bays for evaluating the torque and blade dynamic forces experienced by the rotors of wind turbine drivetrains. The dynamometers are rated at 7.5 MW and 15 MW of low speed shaft power and are configured as independent test areas capable of simultaneous operation. All six degrees of freedom, three linear and three rotational, for blade and rotor dynamics are replicated through the combination of a drive motor, speed reduction gearbox and a controllable hydraulic load application unit (LAU). This new LAU setup readily supports accelerated lifetime mechanical testing and load analysis for the entire drivetrain system of the nacelle and easily simulates a wide variety of realistic operating scenarios in a controlled laboratory environment. The development of these
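
    For orientation, the quoted low-speed-shaft power ratings can be converted to shaft torque via T = P/ω. The sketch below uses an assumed rotor speed of 12 rpm purely for illustration, since the abstract does not state the operating speed range.

        # Rough conversion of the quoted low-speed-shaft power ratings to shaft torque.
        # The 12 rpm rotor speed is an assumed illustrative value; the abstract gives no speed.
        import math

        def shaft_torque_Nm(power_W, speed_rpm):
            """Torque = power / angular speed for a rotating shaft."""
            omega = 2.0 * math.pi * speed_rpm / 60.0   # rad/s
            return power_W / omega

        for power_MW in (7.5, 15.0):
            torque = shaft_torque_Nm(power_MW * 1e6, speed_rpm=12.0)
            print(f"{power_MW:4.1f} MW at 12 rpm -> {torque/1e6:.1f} MN·m")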

  12. Usability Testing and Analysis Facility (UTAF)

    Science.gov (United States)

    Wong, Douglas T.

    2010-01-01

    This slide presentation reviews the work of the Usability Testing and Analysis Facility (UTAF) at NASA Johnson Space Center. It is one of the Space Human Factors Laboratories in the Habitability and Human Factors Branch (SF3) at NASA Johnson Space Center. The primary focus of the UTAF is to perform human factors evaluation and usability testing of crew/vehicle interfaces. The presentation reviews the UTAF expertise and capabilities, the processes and methodologies, and the equipment available. It also reviews the programs that the facility has supported, detailing the human engineering activities in support of the design of the Orion spacecraft, testing of the EVA integrated spacesuit, and work done for the design of the lunar projects of the Constellation Program: Altair, the Lunar Electric Rover, and Outposts.

  13. NASA Plum Brook's B-2 test facility-Thermal vacuum and propellant test facility

    Science.gov (United States)

    Kudlac, Maureen; Weaver, Harold; Cmar, Mark

    2012-06-01

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) Plum Brook Station (PBS) Spacecraft Propulsion Research Facility, commonly referred to as B-2, is NASA's third largest thermal vacuum facility. It is the largest designed to store and transfer large quantities of liquid hydrogen and liquid oxygen, and is perfectly suited to support developmental testing of upper stage chemical propulsion systems as well as fully integrated stages. The facility is also capable of providing thermal-vacuum simulation services to support testing of large lightweight structures, Cryogenic Fluid Management (CFM) systems, electric propulsion test programs, and other In-Space propulsion programs. A recently completed integrated system test demonstrated the refurbished thermal vacuum capabilities of the facility. The test used the modernized data acquisition and control system to monitor the facility. The heat sink provided a uniform temperature environment of approximately 77 K. The modernized infrared lamp array produced a nominal heat flux of 1.4 kW/m². With the lamp array and heat sink operating simultaneously, the thermal systems produced a heat flux pattern simulating radiation to space on one surface and solar exposure on the other surface.

  14. NASA Plum Brook's B-2 Test Facility: Thermal Vacuum and Propellant Test Facility

    Science.gov (United States)

    Kudlac, Maureen T.; Weaver, Harold F.; Cmar, Mark D.

    2012-01-01

    The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) Plum Brook Station (PBS) Spacecraft Propulsion Research Facility, commonly referred to as B-2, is NASA's third largest thermal vacuum facility. It is the largest designed to store and transfer large quantities of liquid hydrogen and liquid oxygen, and is perfectly suited to support developmental testing of upper stage chemical propulsion systems as well as fully integrated stages. The facility is also capable of providing thermal-vacuum simulation services to support testing of large lightweight structures, Cryogenic Fluid Management (CFM) systems, electric propulsion test programs, and other In-Space propulsion programs. A recently completed integrated system test demonstrated the refurbished thermal vacuum capabilities of the facility. The test used the modernized data acquisition and control system to monitor the facility. The heat sink provided a uniform temperature environment of approximately 77 K. The modernized infrared lamp array produced a nominal heat flux of 1.4 kW/sq m. With the lamp array and heat sink operating simultaneously, the thermal systems produced a heat flux pattern simulating radiation to space on one surface and solar exposure on the other surface.
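
    A crude estimate of what such a thermal environment does to a test article can be made from a single-surface radiative balance: a black surface absorbing the nominal 1.4 kW/sq m lamp flux while radiating to the 77 K heat sink settles near 400 K. The sketch below shows the arithmetic under those idealized assumptions (unit emissivity and absorptivity, full view of the sink); it is not taken from the test report.

        # Crude single-surface equilibrium estimate (illustrative, not from the test report):
        # a black surface absorbs the nominal lamp flux and radiates to the 77 K heat sink.
        SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)
        q_lamp = 1.4e3        # nominal lamp heat flux, W/m^2
        T_sink = 77.0         # heat-sink temperature, K

        T_eq = ((q_lamp + SIGMA * T_sink**4) / SIGMA) ** 0.25
        print(f"Equilibrium surface temperature ~ {T_eq:.0f} K")   # roughly 397 K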

  15. Design and construction of thermal striping test facility

    Energy Technology Data Exchange (ETDEWEB)

    Han, D. H.; Kim, J. M.; Nam, H. Y.; Choi, J. H.; Choi, B. H.; Jeong, J. Y.; Jeong, K. C.; Park, J. H.; Kim, T. J.; Kim, B. H

    2003-12-01

    A test facility was designed and constructed to generate experimental data for the validation of turbulence models for analyzing thermal striping phenomena. The test facility consists mainly of a test section, a heat transfer system and a control system. In this report, the design and construction process of the test facility is described in detail.

  16. Test Facility For Thermal Imaging Systems

    Science.gov (United States)

    Fontanella, Jean-Claude

    1981-10-01

    The test facility is designed to measure the main performance figures of thermal imaging systems: optical transfer function, minimum resolvable temperature difference (MRTD), noise equivalent temperature difference (NETD) and spectral response. The infrared sources are slits, MRTD four-bar patterns or the output slit of a monochromator, which are placed in the focal plane of two collimators. The response of the system can be measured either on the display using a photometer or in the video signal using a transient recorder. Most of the measurements are controlled by a minicomputer. Typical results are presented.
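
    For reference, the noise equivalent temperature difference measured in the video signal is conventionally defined as the blackbody temperature difference that produces a signal equal to the RMS noise. In the usual textbook form (which may differ in detail from this facility's own data reduction),

        \mathrm{NETD} = \frac{\sigma_{\mathrm{noise}}}{\Delta S / \Delta T},

    where \Delta S is the change in the video signal produced by a source-to-background temperature difference \Delta T and \sigma_{\mathrm{noise}} is the RMS noise measured in the same signal.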

  17. Thermal effects testing at the National Solar Thermal Test Facility

    Science.gov (United States)

    Ralph, Mark E.; Cameron, Christopher P.; Ghanbari, Cheryl M.

    The National Solar Thermal Test Facility is operated by Sandia National Laboratories and located on Kirtland Air Force Base in Albuquerque, New Mexico. The permanent features of the facility include a heliostat field and associated receiver tower, two solar furnaces, two point-focus parabolic concentrators, and an Engine Test Facility. The heliostat field contains 220 computer-controlled mirrors, which reflect concentrated solar energy to test stations on a 61-m tower. The field produces a peak flux density of 250 W/sq cm that is uniform over a 15-cm diameter with a total beam power of over 5 MWt. One solar furnace produces flux levels of 270 W/sq cm over a 6-mm diameter and delivers a total power of 16 kWt. A second furnace produces flux levels up to 1000 W/sq cm over a 4 cm diameter and total power of 60 kWt. Both furnaces include shutters and attenuators that can provide square or shaped pulses. The two 11-m diameter tracking parabolic point-focusing concentrators at the facility can each produce peak flux levels of 1500 W/sq cm over a 2.5-cm diameter and total power of 75 kWt. High-speed shutters have been used to produce square pulses.

  18. Human pescadillo induces large-scale chromatin unfolding

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hao; FANG Yan; HUANG Cuifen; YANG Xiao; YE Qinong

    2005-01-01

    The human pescadillo gene encodes a protein with a BRCT domain. Pescadillo plays an important role in DNA synthesis, cell proliferation and transformation. Since BRCT domains have been shown to induce large-scale chromatin unfolding, we tested the role of Pescadillo in the regulation of large-scale chromatin unfolding. To this end, we isolated the coding region of Pescadillo from human mammary MCF10A cells. Compared with the reported sequence, the isolated Pescadillo contains an in-frame deletion from amino acid 580 to 582. Targeting Pescadillo to an amplified, lac operator-containing chromosome region in the mammalian genome results in large-scale chromatin decondensation. This unfolding activity maps to the BRCT domain of Pescadillo. These data provide a new clue to understanding the vital role of Pescadillo.

  19. Large Scale Anomalies of the Cosmic Microwave Background with Planck

    DEFF Research Database (Denmark)

    Frejsel, Anne Mette

    This thesis focuses on the large scale anomalies of the Cosmic Microwave Background (CMB) and their possible origins. The investigations consist of two main parts. The first part is on statistical tests of the CMB, and the consistency of both maps and power spectrum. We find that the Planck data...... is very consistent, while the WMAP 9 year release appears more contaminated by non-CMB residuals than the 7 year release. The second part is concerned with the anomalies of the CMB from two approaches. One is based on an extended inflationary model as the origin of one specific large scale anomaly, namely....... Here we find evidence that the Planck CMB maps contain residual radiation in the loop areas, which can be linked to some of the large scale CMB anomalies: the point-parity asymmetry, the alignment of quadrupole and octupole and the dipole modulation....

  20. Large Scale Anomalies of the Cosmic Microwave Background with Planck

    DEFF Research Database (Denmark)

    Frejsel, Anne Mette

    This thesis focuses on the large scale anomalies of the Cosmic Microwave Background (CMB) and their possible origins. The investigations consist of two main parts. The first part is on statistical tests of the CMB, and the consistency of both maps and power spectrum. We find that the Planck data...... is very consistent, while the WMAP 9 year release appears more contaminated by non-CMB residuals than the 7 year release. The second part is concerned with the anomalies of the CMB from two approaches. One is based on an extended inflationary model as the origin of one specific large scale anomaly, namely....... Here we find evidence that the Planck CMB maps contain residual radiation in the loop areas, which can be linked to some of the large scale CMB anomalies: the point-parity asymmetry, the alignment of quadrupole and octupole and the dipole modulation....

  1. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...
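
    The core computation in such a study is straightforward to reproduce on any image-metadata table. The sketch below is a minimal illustration assuming a hypothetical CSV with width and height columns; it is not the authors' dataset or pipeline.

        # Minimal sketch of an aggregate width/height-proportion analysis of the kind described.
        # The CSV file and column names are hypothetical placeholders, not the study's data.
        import pandas as pd

        df = pd.read_csv("artworks.csv")              # expected columns: width_cm, height_cm
        ratio = df["width_cm"] / df["height_cm"]      # proportion > 1 means landscape format

        print(ratio.describe())                                            # central tendency and spread
        print("share within 1% of square:", ((ratio - 1.0).abs() < 0.01).mean())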

  2. Large-scale Complex IT Systems

    CERN Document Server

    Sommerville, Ian; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challenges and issues in the development of large-scale complex, software-intensive systems. Central to this is the notion that we cannot separate software from the socio-technical environment in which it is used.

  3. Topological Routing in Large-Scale Networks

    DEFF Research Database (Denmark)

    Pedersen, Jens Myrup; Knudsen, Thomas Phillip; Madsen, Ole Brun

    2004-01-01

    A new routing scheme, Topological Routing, for large-scale networks is proposed. It allows for efficient routing without large routing tables as known from traditional routing schemes. It presupposes a certain level of order in the networks, known from Structural QoS. The main issues in applying...... Topological Routing to large-scale networks are discussed. Hierarchical extensions are presented along with schemes for shortest path routing, fault handling and path restoration. Further research in the area is discussed and perspectives on the prerequisites for practical deployment of Topological Routing...

  4. Topological Routing in Large-Scale Networks

    DEFF Research Database (Denmark)

    Pedersen, Jens Myrup; Knudsen, Thomas Phillip; Madsen, Ole Brun

    A new routing scheme, Topological Routing, for large-scale networks is proposed. It allows for efficient routing without large routing tables as known from traditional routing schemes. It presupposes a certain level of order in the networks, known from Structural QoS. The main issues in applying...... Topological Routing to large-scale networks are discussed. Hierarchical extensions are presented along with schemes for shortest path routing, fault handling and path restoration. Further research in the area is discussed and perspectives on the prerequisites for practical deployment of Topological Routing...

  5. Large scale topic modeling made practical

    DEFF Research Database (Denmark)

    Wahlgreen, Bjarne Ørum; Hansen, Lars Kai

    2011-01-01

    Topic models are of broad interest. They can be used for query expansion and result structuring in information retrieval and as an important component in services such as recommender systems and user adaptive advertising. In large scale applications both the size of the database (number of docume...... topics at par with a much larger case specific vocabulary....
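
    Large-scale topic modelling of this kind is typically made tractable with online (mini-batch) variational inference, so that the corpus never has to be held in memory at once. The sketch below illustrates the idea with scikit-learn's LDA implementation and a placeholder corpus; it is a generic stand-in, not the authors' method or vocabulary-reduction scheme.

        # Sketch of large-scale topic modelling with online (mini-batch) variational inference.
        # scikit-learn's LDA is used as a stand-in; this is not the authors' specific method.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        docs = ["solar thermal test facility", "wind turbine drivetrain testing",
                "large scale structure of the universe"]          # placeholder corpus

        vectorizer = CountVectorizer(max_features=10_000, stop_words="english")
        X = vectorizer.fit_transform(docs)

        lda = LatentDirichletAllocation(n_components=10, learning_method="online",
                                        batch_size=1024, random_state=0)
        lda.fit(X)                       # for corpora too large for memory, call partial_fit per batch
        print(lda.transform(X).shape)    # document-topic proportions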

  6. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    Science.gov (United States)

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…
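
    When the error sources listed above are treated as independent, their contributions to the reported statistics are often summarized by adding variances, e.g.

        \sigma^2_{\text{total}} \approx \sigma^2_{\text{measurement}} + \sigma^2_{\text{sampling}} + \sigma^2_{\text{equating}},

    a simplified decomposition given here only to fix ideas; the article's own treatment of each component may be more elaborate.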

  7. Thermal effects testing at the National Solar Thermal Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Ralph, M.E.; Cameron, C.P. (Sandia National Labs., Albuquerque, NM (United States)); Ghanbari, C.M. (Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States))

    1992-01-01

    The National Solar Thermal Test Facility is operated by Sandia National Laboratories and located on Kirtland Air Force Base in Albuquerque, New Mexico. The permanent features of the facility include a heliostat field and associated receiver tower, two solar furnaces, two point-focus parabolic concentrators, and an Engine Test Facility. The heliostat field contains 220 computer-controlled mirrors, which reflect concentrated solar energy to test stations on a 61-m tower. The field produces a peak flux density of 250 W/cm² that is uniform over a 15-cm diameter with a total beam power of over 5 MWt. The solar beam has been used to simulate aerodynamic heating for several customers. Thermal nuclear blasts have also been simulated using a high-speed shutter in combination with heliostat control. The shutter can accommodate samples up to 1 m × 1 m and it has been used by several US and Canadian agencies. A glass-windowed wind tunnel is also available in the Solar Tower. It provides simultaneous exposure to the thermal flux and air flow. Each solar furnace at the facility includes a heliostat, an attenuator, and a parabolic concentrator. One solar furnace produces flux levels of 270 W/cm² over a 6-mm diameter and delivers a total power of 16 kWt. A second furnace produces flux levels up to 1000 W/cm² over a 4 cm diameter and total power of 60 kWt. Both furnaces include shutters and attenuators that can provide square or shaped pulses. The two 11 m diameter tracking parabolic point-focusing concentrators at the facility can each produce peak flux levels of 1500 W/cm² over a 2.5 cm diameter and total power of 75 kWt. High-speed shutters have been used to produce square pulses.

  8. Thermal effects testing at the National Solar Thermal Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Ralph, M.E.; Cameron, C.P. [Sandia National Labs., Albuquerque, NM (United States); Ghanbari, C.M. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States)

    1992-12-31

    The National Solar Thermal Test Facility is operated by Sandia National Laboratories and located on Kirtland Air Force Base in Albuquerque, New Mexico. The permanent features of the facility include a heliostat field and associated receiver tower, two solar furnaces, two point-focus parabolic concentrators, and an Engine Test Facility. The heliostat field contains 220 computer-controlled mirrors, which reflect concentrated solar energy to test stations on a 61-m tower. The field produces a peak flux density of 250 W/cm² that is uniform over a 15-cm diameter with a total beam power of over 5 MWt. The solar beam has been used to simulate aerodynamic heating for several customers. Thermal nuclear blasts have also been simulated using a high-speed shutter in combination with heliostat control. The shutter can accommodate samples up to 1 m × 1 m and it has been used by several US and Canadian agencies. A glass-windowed wind tunnel is also available in the Solar Tower. It provides simultaneous exposure to the thermal flux and air flow. Each solar furnace at the facility includes a heliostat, an attenuator, and a parabolic concentrator. One solar furnace produces flux levels of 270 W/cm² over a 6-mm diameter and delivers a total power of 16 kWt. A second furnace produces flux levels up to 1000 W/cm² over a 4 cm diameter and total power of 60 kWt. Both furnaces include shutters and attenuators that can provide square or shaped pulses. The two 11 m diameter tracking parabolic point-focusing concentrators at the facility can each produce peak flux levels of 1500 W/cm² over a 2.5 cm diameter and total power of 75 kWt. High-speed shutters have been used to produce square pulses.

  9. A test facility for hypervelocity rarefied flows

    Science.gov (United States)

    Macrossan, M. N.; Chiu, H.-H.; Mee, D. J.

    2001-08-01

    This paper describes a rarefied hypervelocity test facility producing gas speeds greater than 7 km/s. The X1 expansion tube at The University of Queensland has been used to produce nitrogen flows at 8.9 and 9.5 km/s with test flow durations of 50 and 40 μs respectively. Rarefied flow is indicated by values of the freestream breakdown parameter >0.1 (Cheng's rarefaction parameter tank. Nominal conditions in the expansion are derived from CFD predictions. Measured bar gauge (Pitot) pressures show that the flow is radially uniform when the Pitot pressure has decreased by a factor ten. The measured bar gauge pressures are an increasing fraction of the expected Pitot pressure as the rarefaction parameters increase.

  10. First results of the ITER-relevant negative ion beam test facility ELISE (invited).

    Science.gov (United States)

    Fantz, U; Franzen, P; Heinemann, B; Wünderlich, D

    2014-02-01

    An important step in the European R&D roadmap towards the neutral beam heating systems of ITER is the new test facility ELISE (Extraction from a Large Ion Source Experiment) for large-scale extraction from a half-size ITER RF source. The test facility was constructed in recent years at the Max-Planck-Institut für Plasmaphysik in Garching and is now operational. ELISE is gaining early experience of the performance and operation of large RF-driven negative hydrogen ion sources with plasma illumination of a source area of 1 × 0.9 m² and an extraction area of 0.1 m² using 640 apertures. First results in volume operation, i.e., without caesium seeding, are presented.
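
    A quick consistency check of the quoted geometry (illustrative only, not from the paper): dividing the 0.1 m² extraction area evenly over the 640 apertures gives roughly 1.6 cm² per aperture, i.e. an equivalent circular diameter of about 14 mm.

        # Back-of-envelope geometry implied by the quoted ELISE numbers (illustrative only).
        import math

        extraction_area_m2 = 0.1        # total extraction area quoted in the abstract
        n_apertures = 640

        area_per_aperture = extraction_area_m2 / n_apertures              # ~1.56e-4 m^2
        diameter_mm = 2.0 * math.sqrt(area_per_aperture / math.pi) * 1e3
        print(f"Equivalent aperture diameter ~ {diameter_mm:.1f} mm")      # ~14 mm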

  11. Material testing facilities and programs for plasma-facing component testing

    Science.gov (United States)

    Linsmeier, Ch.; Unterberg, B.; Coenen, J. W.; Doerner, R. P.; Greuner, H.; Kreter, A.; Linke, J.; Maier, H.

    2017-09-01

    Component development for operation in a large-scale fusion device requires thorough testing and qualification for the intended operational conditions. In particular, environments are necessary that are comparable to the real operating conditions, while at the same time allowing for in situ/in vacuo diagnostics and flexible operation, even beyond design limits during testing. Various electron and neutral particle devices provide the capabilities for high heat load tests, suited for material samples and components from lab-scale dimensions up to full-size parts, including parts containing toxic materials like beryllium and parts activated by neutron irradiation. To simulate the conditions specific to a fusion plasma, both at the first wall and in the divertor of fusion devices, linear plasma devices allow for tests of erosion and hydrogen isotope recycling behavior under well-defined and controlled conditions. Finally, the complex conditions in a fusion device (including the effects caused by magnetic fields) are exploited for component and material tests by exposing test mock-ups or material samples to a fusion plasma by means of manipulator systems. These allow for easy exchange of test pieces in a tokamak or stellarator device without opening the vessel. Such a chain of test devices and qualification procedures is required for the development of plasma-facing components which then can be successfully operated in future fusion power devices. The various available as well as newly planned devices and test stands, together with their specific capabilities, are presented in this manuscript. Results from experimental programs on test facilities illustrate their significance for the qualification of plasma-facing materials and components. An extended set of references provides access to the current status of material and component testing capabilities in the international fusion programs.

  12. A facile in situ self-assembly strategy for large-scale fabrication of CHS@MOF yolk/shell structure and its catalytic application in a flow system.

    Science.gov (United States)

    Gao, Hongyi; Luan, Yi; Chaikittikul, Kullapat; Dong, Wenjun; Li, Jie; Zhang, Xiaowei; Jia, Dandan; Yang, Mu; Wang, Ge

    2015-03-04

    A hierarchical yolk/shell copper hydroxysulfates@MOF (CHS@MOF, where MOF = metal-organic frameworks) structure was fabricated from a homogeneous yolk/shell CHS template composed of an active shell and a stabilized core via a facile self-template strategy at room temperature. The active shell of the template served as the source of metal ion and was in situ transformed into a well-defined MOF crystal shell, and the relatively stabilized core retained its own nature during the formation of the MOF shell. The strategy of in situ transformation of CHS shell to MOF shell avoided the self-nucleation of MOF in the solution and complex multistep procedures. Furthermore, a flow reaction system using CHS@MOF as self-supported stationary-phase catalyst was developed, which demonstrated excellent catalytic performance for aldehyde acetalization with ethanol, and high yields and selectivities were achieved under mild conditions.

  13. Imaging HF-induced large-scale irregularities above HAARP

    Science.gov (United States)

    Djuth, Frank T.; Reinisch, Bodo W.; Kitrosser, David F.; Elder, John H.; Snyder, A. Lee; Sales, Gary S.

    2006-02-01

    The University of Massachusetts-Lowell digisonde is used with the HAARP high-frequency (HF), ionospheric modification facility to obtain radio images of artificially-produced, large-scale, geomagnetic field-aligned irregularities. F region irregularities generated with the HAARP beam pointed in the vertical and geomagnetic field-aligned directions are examined in a smooth background plasma. It is found that limited large-scale irregularity production takes place with vertical transmissions, whereas there is a dramatic increase in the number of source irregularities with the beam pointed parallel to the geomagnetic field. Strong irregularity production appears to be confined to within ~5° of the geomagnetic zenith and does not fill the volume occupied by the HF beam. A similar effect is observed in optical images of artificial airglow.

  14. Large-scale multimedia modeling applications

    Energy Technology Data Exchange (ETDEWEB)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications.

  15. Evaluating Large-Scale Interactive Radio Programmes

    Science.gov (United States)

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  16. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects such as the construction of roads, railways and other civil engineering (water)works are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  17. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data dissem

  18. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
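
    For the two problem classes mentioned, the standard direct-differentiation identities give the flavor of such sensitivity calculations (shown here as a generic sketch, not necessarily the paper's exact procedure). For linear statics with stiffness K(p), displacements u and loads f,

        K \frac{\partial u}{\partial p} = \frac{\partial f}{\partial p} - \frac{\partial K}{\partial p} u,

    and for a free-vibration eigenpair (\lambda, \phi) with mass matrix M and the normalization \phi^{T} M \phi = 1,

        \frac{\partial \lambda}{\partial p} = \phi^{T}\left(\frac{\partial K}{\partial p} - \lambda \frac{\partial M}{\partial p}\right)\phi .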

  19. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that cont

  20. Ensemble methods for large scale inverse problems

    NARCIS (Netherlands)

    Heemink, A.W.; Umer Altaf, M.; Barbu, A.L.; Verlaan, M.

    2013-01-01

    Variational data assimilation, also sometimes simply called the ‘adjoint method’, is used very often for large scale model calibration problems. Using the available data, the uncertain parameters in the model are identified by minimizing a certain cost function that measures the difference between t

  1. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  2. Inflation, large scale structure and particle physics

    Indian Academy of Sciences (India)

    S F King

    2004-02-01

    We review experimental and theoretical developments in inflation and its application to structure formation, including the curvaton idea. We then discuss a particle physics model of supersymmetric hybrid inflation at the intermediate scale in which the Higgs scalar field is responsible for large scale structure, and show how such a theory is completely natural in the framework of extra dimensions with an intermediate string scale.

  3. Qualification Program of Korea Heat Load Test Facility KoHLT-EB for ITER Plasma Facing Components

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk-Kwon; Park, Seoung Dae; Jin, Hyung Gon; Lee, Eo Hwak; Yoon, Jae-Sung; Lee, Dong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Qualification tests were performed to evaluate the high heat flux test facility for PFCs and fusion reactor materials. For the thermal fatigue test, two types of tungsten mock-ups were fabricated. The cooling performance was tested under operating conditions similar to those of ITER and a fusion reactor. After completion of the preliminary mock-up tests and facility qualification, the high heat flux test facility will be used for performance tests of various plasma facing components and fusion reactor materials. Preliminary thermo-hydraulic and performance tests have been conducted using various test mock-ups for plasma facing components in high heat flux test facilities around the world. Previous heat flux tests in Korea were performed using graphite heater facilities. Several facilities equipped with an electron beam as a uniform heat source were built for tokamak PFCs in the EU, Russia and the US. These heat flux test facilities are used for cyclic heat flux testing of PFCs, each serving its own purpose at FZJ in the EU, SNL in the US, and the Efremov Institute in Russia. For this purpose, KoHLT-EB was constructed, and the facility will be used for ITER TBM performance tests with small-scale and large-scale mock-ups and a prototype. It has also been used for other fusion applications under the domestic R and D program, such as developing plasma facing components (PFCs) for the ITER first wall and tungsten divertor and for heat transfer experiments. The Korea heat load test facility using an electron beam, KoHLT-EB, was constructed for high heat flux tests to verify plasma facing components, including the ITER TBM first wall.

  4. Design, Evaluation and Test Technology Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The mission of this facility, which is composed of numerous specialized facilities, is to provide capabilities to simulate a wide range of environments for component...

  5. Design, Evaluation and Test Technology Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The mission of this facility, which is composed of numerous specialized facilities, is to provide capabilities to simulate a wide range of environments for component...

  6. Nuclear thermal propulsion test facility requirements and development strategy

    Science.gov (United States)

    Allen, George C.; Warren, John; Clark, J. S.

    1991-01-01

    The Nuclear Thermal Propulsion (NTP) subpanel of the Space Nuclear Propulsion Test Facilities Panel evaluated facility requirements and strategies for nuclear thermal propulsion systems development. High pressure, solid core concepts were considered as the baseline for the evaluation, with low pressure concepts an alternative. The work of the NTP subpanel revealed that a wealth of facilities already exists to support NTP development, and that only a few new facilities must be constructed. Some modifications to existing facilities will be required. Present funding emphasis should be on long-lead-time items for the major new ground test facility complex and on facilities supporting nuclear fuel development, hot hydrogen flow test facilities, and low power critical facilities.

  7. Nuclear thermal propulsion test facility requirements and development strategy

    Science.gov (United States)

    Allen, George C.; Clark, John S.; Warren, John; Perkins, David R.; Martinell, John

    1992-01-01

    The Nuclear Thermal Propulsion (NTP) subpanel of the Space Nuclear Propulsion Test Facilities Panel evaluated facility requirements and strategies for nuclear thermal propulsion systems development. High pressure, solid core concepts were considered as the baseline for the evaluation, with low pressure concepts an alternative. The work of the NTP subpanel revealed that a wealth of facilities already exists to support NTP development, and that only a few new facilities must be constructed. Some modifications to existing facilities will be required. Present funding emphasis should be on long-lead-time items for the major new ground test facility complex and on facilities supporting nuclear fuel development, hot hydrogen flow test facilities, and low power critical facilities.

  8. Basic Design of a LWR Fuel Compatibility Test Facility (PLUTO)

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Chang Hwan; Chun, Se Young; Kim, Bok Deuk; Park, Jong Kuk; Chun, Tae Hyun; Kim, Hyoung Kyu; Oh, Dong Seok

    2009-04-15

    KAERI is carrying out a project to develop a compatibility test facility and the relevant technology for LWR fuel assemblies. The project includes the compatibility test and the long-term wear test for dual fuel assemblies, as well as the pressure drop test, uplift force test, flow-induced vibration test, damping test, and debris filtering capability test for a single fuel assembly. This fuel assembly compatibility test facility is named PLUTO, from Performance Test Facility for Fuel Assembly Hydraulics and Vibrations. PLUTO will initially be constructed for PWR fuel assemblies, and testing of fuel assemblies for other reactor types will also be considered.

  9. Gas cooled fast breeder reactor design for a circulator test facility (modified HTGR circulator test facility)

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    A GCFR helium circulator test facility sized for full design conditions is proposed for meeting the above requirements. The circulator will be mounted in a large vessel containing high pressure helium which will permit testing at the same power, speed, pressure, temperature and flow conditions intended in the demonstration plant. The electric drive motor for the circulator will obtain its power from an electric supply and distribution system in which electric power will be taken from a local utility. The conceptual design described in this report is the result of close interaction between the General Atomic Company (GA), designer of the GCFR, and The Ralph M. Parson Company, architect/engineer for the test facility. A realistic estimate of total project cost is presented, together with a schedule for design, procurement, construction, and inspection.

  10. The large-scale structure of vacuum

    CERN Document Server

    Albareti, F D; Maroto, A L

    2014-01-01

    The vacuum state in quantum field theory is known to exhibit an important number of fundamental physical features. In this work we explore the possibility that this state could also present a non-trivial space-time structure on large scales. In particular, we will show that by imposing the renormalized vacuum energy-momentum tensor to be conserved and compatible with cosmological observations, the vacuum energy of sufficiently heavy fields behaves at late times as non-relativistic matter rather than as a cosmological constant. In this limit, the vacuum state supports perturbations whose speed of sound is negligible and accordingly allows the growth of structures in the vacuum energy itself. This large-scale structure of vacuum could seed the formation of galaxies and clusters very much in the same way as cold dark matter does.

  11. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological...... limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its...... main focus. Here the general perception of the nature and role in society of large scale networks as a fundamental infrastructure is analysed. This analysis focuses on the effects of the technical DDN projects and on the perception of network infrastructure as expressed by key decision makers...

  12. Quantum Signature of Cosmological Large Scale Structures

    CERN Document Server

    Capozziello, S; De Siena, S; Illuminati, F; Capozziello, Salvatore; Martino, Salvatore De; Siena, Silvio De; Illuminati, Fabrizio

    1998-01-01

    We demonstrate that to all large scale cosmological structures where gravitation is the only overall relevant interaction assembling the system (e.g. galaxies), there is associated a characteristic unit of action per particle whose order of magnitude coincides with the Planck action constant $h$. This result extends the class of physical systems for which quantum coherence can act on macroscopic scales (as e.g. in superconductivity) and agrees with the absence of screening mechanisms for the gravitational forces, as predicted by some renormalizable quantum field theories of gravity. It also seems to support those lines of thought invoking that large scale structures in the Universe should be connected to quantum primordial perturbations as requested by inflation, that the Newton constant should vary with time and distance and, finally, that gravity should be considered as an effective interaction induced by quantization.

  13. Process Principles for Large-Scale Nanomanufacturing.

    Science.gov (United States)

    Behrens, Sven H; Breedveld, Victor; Mujica, Maritza; Filler, Michael A

    2017-06-07

    Nanomanufacturing-the fabrication of macroscopic products from well-defined nanoscale building blocks-in a truly scalable and versatile manner is still far from our current reality. Here, we describe the barriers to large-scale nanomanufacturing and identify routes to overcome them. We argue for nanomanufacturing systems consisting of an iterative sequence of synthesis/assembly and separation/sorting unit operations, analogous to those used in chemicals manufacturing. In addition to performance and economic considerations, phenomena unique to the nanoscale must guide the design of each unit operation and the overall process flow. We identify and discuss four key nanomanufacturing process design needs: (a) appropriately selected process break points, (b) synthesis techniques appropriate for large-scale manufacturing, (c) new structure- and property-based separations, and (d) advances in stabilization and packaging.

  14. Large-scale structure of the Universe

    Energy Technology Data Exchange (ETDEWEB)

    Shandarin, S.F.; Doroshkevich, A.G.; Zel'dovich, Ya.B. (Inst. Prikladnoj Matematiki, Moscow, USSR)

    1983-01-01

    A review of the theory of the large-scale structure of the Universe is given, including the formation of clusters and superclusters of galaxies as well as large voids. Particular attention is paid to the theory of a neutrino-dominated Universe - the cosmological model where neutrinos with a rest mass of several tens of eV dominate the mean density. The evolution of small perturbations is discussed, and estimates of microwave background radiation fluctuations are given for different angular scales. The adiabatic theory of structure formation in the Universe, known as the 'pancake' scenario, and the successive fragmentation of pancakes are described. This scenario is based on an approximate nonlinear theory of gravitational instability. Results of numerical experiments modeling the processes of large-scale structure formation are discussed.

  15. Large-scale structure of the universe

    Energy Technology Data Exchange (ETDEWEB)

    Shandarin, S.F.; Doroshkevich, A.G.; Zel'dovich, Y.B.

    1983-01-01

    A survey is given of theories for the origin of large-scale structure in the universe: clusters and superclusters of galaxies, and vast black regions practically devoid of galaxies. Special attention is paid to the theory of a neutrino-dominated universe: a cosmology in which electron neutrinos with a rest mass of a few tens of electron volts would contribute the bulk of the mean density. The evolution of small perturbations is discussed, and estimates are made for the temperature anisotropy of the microwave background radiation on various angular scales. The nonlinear stage in the evolution of smooth irrotational perturbations in a low-pressure medium is described in detail. Numerical experiments simulating large-scale structure formation processes are discussed, as well as their interpretation in the context of catastrophe theory.
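
    The approximate nonlinear theory of gravitational instability underlying the pancake picture in both of the reviews above is the Zel'dovich approximation, quoted here in standard notation for orientation (not verbatim from the reviews):

        \mathbf{r}(\mathbf{q},t) = a(t)\left[\mathbf{q} - b(t)\,\nabla_{\mathbf{q}}\Phi_0(\mathbf{q})\right], \qquad
        \rho(\mathbf{r},t) = \frac{\bar{\rho}(t)}{[1 - b(t)\lambda_1][1 - b(t)\lambda_2][1 - b(t)\lambda_3]},

    where \lambda_1 \ge \lambda_2 \ge \lambda_3 are the eigenvalues of the deformation tensor \partial^2 \Phi_0 / \partial q_i \partial q_j and b(t) is the linear growth factor; the density formally diverges when 1 - b(t)\lambda_1 \to 0, marking collapse into a flattened "pancake".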

  16. Wireless Secrecy in Large-Scale Networks

    CERN Document Server

    Pinto, Pedro C; Win, Moe Z

    2011-01-01

    The ability to exchange secret information is critical to many commercial, governmental, and military networks. The intrinsically secure communications graph (iS-graph) is a random graph which describes the connections that can be securely established over a large-scale network, by exploiting the physical properties of the wireless medium. This paper provides an overview of the main properties of this new class of random graphs. We first analyze the local properties of the iS-graph, namely the degree distributions and their dependence on fading, target secrecy rate, and eavesdropper collusion. To mitigate the effect of the eavesdroppers, we propose two techniques that improve secure connectivity. Then, we analyze the global properties of the iS-graph, namely percolation on the infinite plane, and full connectivity on a finite region. These results help clarify how the presence of eavesdroppers can compromise secure communication in a large-scale network.

  17. Measuring Bulk Flows in Large Scale Surveys

    CERN Document Server

    Feldman, H A; Feldman, Hume A.; Watkins, Richard

    1993-01-01

    We follow a formalism presented by Kaiser to calculate the variance of bulk flows in large scale surveys. We apply the formalism to a mock survey of Abell clusters à la Lauer & Postman and find the variance in the expected bulk velocities in a universe with CDM, MDM and IRAS-QDOT power spectra. We calculate the velocity variance as a function of the 1-D velocity dispersion of the clusters and the size of the survey.
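
    In linear theory the variance of the bulk velocity measured within a window of characteristic scale R takes the standard form (consistent with the Kaiser formalism, though conventions for the window function vary):

        \langle v_B^2(R) \rangle = \frac{H_0^2 f^2}{2\pi^2} \int_0^{\infty} P(k)\, \widetilde{W}^2(kR)\, dk,

    where P(k) is the matter power spectrum, f \simeq \Omega_m^{0.6} is the linear growth rate, and \widetilde{W} is the Fourier transform of the survey window; the different power spectra quoted above enter only through P(k).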

  18. Statistical characteristics of Large Scale Structure

    OpenAIRE

    Demianski; Doroshkevich

    2002-01-01

    We investigate the mass functions of different elements of the Large Scale Structure -- walls, pancakes, filaments and clouds -- and the impact of transverse motions -- expansion and/or compression -- on their statistical characteristics. Using the Zel'dovich theory of gravitational instability we show that the mass functions of all structure elements are approximately the same and the mass of all elements is found to be concentrated near the corresponding mean mass. At high redshifts, both t...

  19. Topologies for large scale photovoltaic power plants

    OpenAIRE

    Cabrera Tobar, Ana; Bullich Massagué, Eduard; Aragüés Peñalba, Mònica; Gomis Bellmunt, Oriol

    2016-01-01

    © 2016 Elsevier Ltd. All rights reserved. The concern of increasing renewable energy penetration into the grid together with the reduction of prices of photovoltaic solar panels during the last decade have enabled the development of large scale solar power plants connected to the medium and high voltage grid. Photovoltaic generation components, the internal layout and the ac collection grid are being investigated for ensuring the best design, operation and control of these power plants. This ...

  20. Solar Thermal Propulsion Test Facility at MSFC

    Science.gov (United States)

    1999-01-01

    This photograph shows an overall view of the Solar Thermal Propulsion Test Facility at the Marshall Space Flight Center (MSFC). The 20- by 24-ft heliostat mirror, shown at the left, has dual-axis control that keeps a reflection of the sunlight on an 18-ft diameter concentrator mirror (right). The concentrator mirror then focuses the sunlight to a 4-in focal point inside the vacuum chamber, shown at the front of the concentrator mirror. Researchers at MSFC have designed, fabricated, and tested the first solar thermal engine, a non-chemical rocket engine that produces lower thrust but has better thrust efficiency than a chemical combustion engine. MSFC turned to solar thermal propulsion in the early 1990s due to its simplicity, safety, low cost, and commonality with other propulsion systems. Solar thermal propulsion works by acquiring and redirecting solar energy to heat a propellant. As part of MSFC's Space Transportation Directorate, the Propulsion Research Center serves as a national resource for research of advanced, revolutionary propulsion technologies. The mission is to move the Nation's capabilities beyond the confines of conventional chemical propulsion into an era of aircraft-like access to Earth orbit, rapid travel throughout the solar system, and exploration of interstellar space.

  1. EFFLUENT TREATMENT FACILITY PEROXIDE DESTRUCTION CATALYST TESTING

    Energy Technology Data Exchange (ETDEWEB)

    HALGREN DL

    2008-07-30

    The 200 Area Effluent Treatment Facility (ETF) main treatment train includes the peroxide destruction module (PDM), where the hydrogen peroxide residual from the upstream ultraviolet light/hydrogen peroxide oxidation unit is destroyed. Removal of the residual peroxide is necessary to protect downstream membranes from the strong oxidizer. The main components of the PDM are two reaction vessels utilizing granular activated carbon (GAC) as the reaction media. The PDM experienced a number of operability problems, including frequent plugging, and has not been utilized since the ETF changed to groundwater as the predominant feed. The unit also seemed to be underperforming with regard to peroxide removal during the early periods of operation. It is anticipated that a functional PDM will be required for wastewater from the vitrification plant and other future streams. An alternate media or methodology needs to be identified to replace the GAC in the PDMs. This series of bench-scale tests is intended to develop information to support an engineering study on the options for replacement of the existing GAC method for peroxide destruction at the ETF. A number of different catalysts will be compared, as well as other potential methods such as strong reducing agents. The testing should lead to general conclusions on the viability of different catalysts and identify candidates for further study and evaluation.
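
    For reference, the reaction any replacement medium must promote is the disproportionation of hydrogen peroxide, while a reducing-agent route consumes it stoichiometrically instead. The bisulfite example below is only illustrative, since the report does not state which agents were screened:

        \[
          2\,\mathrm{H_{2}O_{2}} \;\xrightarrow{\text{catalyst (e.g. GAC)}}\; 2\,\mathrm{H_{2}O} + \mathrm{O_{2}}
        \]
        \[
          \mathrm{H_{2}O_{2}} + \mathrm{HSO_{3}^{-}} \;\rightarrow\; \mathrm{H_{2}O} + \mathrm{HSO_{4}^{-}}
        \]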

  2. Large-scale instabilities of helical flows

    CERN Document Server

    Cameron, Alexandre; Brachet, Marc-Étienne

    2016-01-01

    Large-scale hydrodynamic instabilities of periodic helical flows are investigated using 3D Floquet numerical computations. A minimal three-mode analytical model that reproduces and explains some of the full Floquet results is derived. The growth rate $\sigma$ of the most unstable modes (at small scale, low Reynolds number $Re$ and small wavenumber $q$) is found to scale differently in the presence or absence of the anisotropic kinetic alpha (AKA) effect. When an AKA effect is present, the scaling $\sigma \propto q\,Re$ predicted by the AKA effect theory [U. Frisch, Z. S. She, and P. L. Sulem, Physica D: Nonlinear Phenomena 28, 382 (1987)] is recovered for $Re \ll 1$ as expected (with most of the energy of the unstable mode concentrated in the large scales). However, as $Re$ increases, the growth rate is found to saturate and most of the energy is found at small scales. In the absence of the AKA effect, it is found that flows can still have large-scale instabilities, but with a negative eddy-viscosity sca...

  3. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  4. Large-Scale Visual Data Analysis

    Science.gov (United States)

    Johnson, Chris

    2014-04-01

    Modern high performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future high-performance visualization research challenges and opportunities.

  5. Large-scale neuromorphic computing systems

    Science.gov (United States)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  6. Petascale computations for Large-scale Atomic and Molecular collisions

    CERN Document Server

    McLaughlin, Brendan M

    2014-01-01

    Petaflop architectures are currently being utilized efficiently to perform large scale computations in Atomic, Molecular and Optical Collisions. We solve the Schroedinger or Dirac equation for the appropriate collision problem using the R-matrix or R-matrix with pseudo-states approach. We briefly outline the parallel methodology used and implemented for the current suite of Breit-Pauli and DARC codes. Various examples are shown of our theoretical results compared with those obtained from Synchrotron Radiation facilities and from Satellite observations. We also indicate future directions and implementation of the R-matrix codes on emerging GPU architectures.

  7. CICC Joint Development and Test for the Test Facility

    Institute of Scientific and Technical Information of China (English)

    武玉; 翁佩德

    2005-01-01

    The superconducting joint of the NbTi Cable-in-Conduit Conductor (CICC) has been developed and tested on the magnet test facility at the Institute of Plasma Physics, Chinese Academy of Sciences. The CICC is composed of (2NbTi+1Cu)×3×3×(6+1 tube) strands, each 0.85 mm in diameter, and has been developed for a central solenoid model coil. The effective length of the joint is about 500 mm. There are two common fabrication modes: one is to integrate the two CICC terminals with the copper substrate via lead soldering, and the other is to mechanically compress the two parts into an integrated unit. In the current range from 2 kA to 10 kA the joint resistance changes only slightly. Up to now, 11 TF magnets, a central solenoid model coil, a central solenoid prototype coil, and a large PF model coil have been completed with the latter type of joint in the test facility.
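
    The figure of merit in such tests is the joint resistance inferred from the terminal voltage drop at each transport current. The short Python sketch below uses made-up placeholder readings (not ATF/ASIPP data) to show the usual reduction, a least-squares slope of V against I through the origin:

        import numpy as np

        # Placeholder measurements (NOT data from the CICC joint tests): voltage drop
        # across the joint at several transport currents in the 2-10 kA range.
        currents_A = np.array([2000.0, 4000.0, 6000.0, 8000.0, 10000.0])
        voltages_V = np.array([4.1e-6, 8.0e-6, 12.3e-6, 16.2e-6, 20.4e-6])

        # Least-squares slope through the origin: R = sum(I*V) / sum(I^2).
        R_joint = np.sum(currents_A * voltages_V) / np.sum(currents_A ** 2)
        print(f"effective joint resistance: {R_joint:.2e} ohm")   # ~2 nOhm for these numbers

        # Point-by-point values show how little the resistance varies with current.
        for i, v in zip(currents_A, voltages_V):
            print(f"  I = {i / 1e3:5.1f} kA -> R = {v / i:.2e} ohm")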

  8. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    Science.gov (United States)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exist with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in the language is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult
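
    NDAS itself is LabVIEW code built on the Actor Framework, so the snippet below is not taken from it; it is only a minimal Python illustration of the message-passing actor pattern the description credits for the suite's modularity, with hypothetical component names.

        import queue
        import threading

        class Actor:
            """An independent worker that processes messages from its own queue."""
            def __init__(self, name):
                self.name = name
                self.inbox = queue.Queue()
                self.thread = threading.Thread(target=self._run, daemon=True)
                self.thread.start()

            def send(self, msg):
                self.inbox.put(msg)

            def _run(self):
                while True:
                    msg = self.inbox.get()
                    if msg is None:          # poison pill -> shut down
                        break
                    self.handle(msg)

            def handle(self, msg):
                raise NotImplementedError

        class Recorder(Actor):
            def handle(self, msg):
                print(f"[{self.name}] recording sample {msg}")

        class Display(Actor):
            def handle(self, msg):
                print(f"[{self.name}] plotting sample {msg}")

        # A facility launches only the components it needs; others are simply not started.
        recorder, display = Recorder("recorder"), Display("display")
        for sample in (1.0, 2.5, 3.7):
            recorder.send(sample)
            display.send(sample)
        for actor in (recorder, display):
            actor.send(None)
            actor.thread.join()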

  9. Electronics and Telemetry Engineering and Test Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Electronics Laboratory is a fully equipped facility providing the capability to support electronic product development from highly complex weapon system sensors,...

  10. Steam turbine overspeed protection scheme based on a zero-power facility for large-scale thermal power generating units after sudden interruption of the outgoing transmission line

    Institute of Scientific and Technical Information of China (English)

    余智

    2012-01-01

    When the output transmission channel of a large-scale thermal power plant is suddenly interrupted, the operating units will overspeed and the equipment may even be damaged. To avoid such accidents, after analysis and demonstration, tripping the generators by means of a zero-power facility is considered, so as to ensure the safe shutdown of the units.

  11. New facility for testing LHC HTS power leads

    CERN Document Server

    Rabehl, Roger Jon; Fehér, S; Huang, Y; Orris, D; Pischalnikov, Y; Sylvester, C D; Tartaglia, M

    2005-01-01

    A new facility for testing HTS power leads at the Fermilab Magnet Test Facility has been designed and operated. The facility has successfully tested 19 pairs of HTS power leads, which are to be integrated into the Large Hadron Collider Interaction Region cryogenic feed boxes. This paper describes the design and operation of the cryogenics, process controls, data acquisition, and quench management systems. HTS power lead test results from the commissioning phase of the project are also presented.

  12. ORNL instrumentation performance for Slab Core Test Facility (SCTF)-Core I Reflood Test Facility

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, J E; Hess, R A; Hylton, J O

    1983-11-01

    Instrumentation was developed for making measurements in experimental refill-reflood test facilities. These unique instrumentation systems were designed to survive the severe environmental conditions that exist during a simulated pressurized water reactor loss-of-coolant accident (LOCA). Measurements of in-vessel fluid phenomena such as two-phase flow velocity, void fraction, film thickness, and film velocity are required for a better understanding of reactor behavior during LOCAs. The Advanced Instrumentation for Reflood Studies (AIRS) Program fabricated and delivered instrumentation systems and data reduction software algorithms that allowed the above measurements to be made. Data produced by AIRS sensors during three experimental runs in the Japanese Slab Core Test Facility are presented. Although many of the sensors failed before any useful data could be obtained, the remaining probes gave encouraging and useful results. These results are the first of their kind produced during the simulated refill-reflood stage of a LOCA near actual thermohydrodynamic conditions.

  13. Dynamic Response Testing in an Electrically Heated Reactor Test Facility

    Science.gov (United States)

    Bragg-Sitton, Shannon M.; Morton, T. J.

    2006-01-01

    Non-nuclear testing can be a valuable tool in development of a space nuclear power or propulsion system. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Standard testing allows one to fully assess thermal, heat transfer, and stress related attributes of a given system, but fails to demonstrate the dynamic response that would be present in an integrated, fueled reactor system. The integration of thermal hydraulic hardware tests with simulated neutronic response provides a bridge between electrically heated testing and full nuclear testing. By implementing a neutronic response model to simulate the dynamic response that would be expected in a fueled reactor system, one can better understand system integration issues, characterize integrated system response times and response characteristics, and assess potential design improvements at a relatively small fiscal investment. Initial system dynamic response testing was demonstrated on the integrated SAFE-100a heat pipe cooled, electrically heated reactor and heat exchanger hardware, utilizing a one-group solution to the point kinetics equations to simulate the expected neutronic response of the system (Bragg-Sitton, 2005). The current paper applies the same testing methodology to a direct drive gas cooled reactor system, demonstrating the applicability of the testing methodology to any reactor type and demonstrating the variation in system response characteristics in different reactor concepts. In each testing application, core power transients were controlled by a point kinetics model with reactivity feedback based on core average temperature; the neutron generation time and the temperature feedback coefficient are provided as model inputs. Although both system designs utilize a fast spectrum reactor, the method of cooling the reactor differs significantly, leading to a variable system response that can be demonstrated and assessed in a non-nuclear test facility.
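
    A minimal sketch of the control scheme described above, i.e. one-delayed-group point kinetics with reactivity feedback on core average temperature, is given below in Python. Every parameter value is a placeholder chosen for illustration; none describes the SAFE-100a or the direct-drive gas-cooled hardware.

        # One-delayed-group point kinetics with temperature reactivity feedback
        # (illustrative sketch; all parameter values below are placeholders).
        beta, lam, Lam = 0.0065, 0.08, 1.0e-6        # delayed fraction, decay const [1/s], generation time [s]
        alpha_T = -1.0e-5                            # temperature feedback coefficient [dk/k per K]
        P0, T_ref, T_cool = 100.0e3, 900.0, 300.0    # nominal power [W], reference and coolant temps [K]
        mcp = 5.0e4                                  # lumped core heat capacity [J/K]
        hA = P0 / (T_ref - T_cool)                   # loss coefficient chosen for initial thermal equilibrium

        dt, t_end = 1.0e-4, 50.0
        n, C, T = 1.0, beta / (lam * Lam), T_ref     # start at neutronic and thermal equilibrium
        rho_ext = 0.0

        for step in range(int(t_end / dt)):
            if step * dt >= 5.0:                     # small external reactivity step at t = 5 s
                rho_ext = 0.1 * beta
            rho = rho_ext + alpha_T * (T - T_ref)    # net reactivity including feedback
            dn = ((rho - beta) / Lam) * n + lam * C  # neutron (power) equation
            dC = (beta / Lam) * n - lam * C          # delayed-neutron precursors
            dT = (P0 * n - hA * (T - T_cool)) / mcp  # lumped core thermal balance
            n, C, T = n + dn * dt, C + dC * dt, T + dT * dt

        print(f"relative power = {n:.3f}, core average temperature = {T:.1f} K")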

  14. Results from DR and Instrumentation Test Facilities

    CERN Document Server

    Urakawa, Junji

    2005-01-01

    The KEK Accelerator Test Facility (ATF) is a 1.3 GeV storage ring capable of producing ultra-low emittance electron beams and has a beam extraction line for ILC R&D. The ATF has proven to be an ideal place for research with small, stable beams. Single bunches of 2×10^10 electrons and a low-current 20-bunch train with 2.8 ns bunch spacing have been extracted to develop Nano-Cavity BPMs, FONT, Nano Beam Orbit handling (FEATHER), an Optical Diffraction Radiation (ODR) monitor, a precision multi-bunch laser-based beam profile monitor and polarized positron beam generation via backward-Compton scattering by the international collaboration. A set of three cavity BPMs is installed in the ATF extraction line on a set of extremely stiff supports. The KEK group installed another set of three BPMs, with their own support mechanism. The full set of 6 will prove extremely useful. In the DR (Damping Ring), we are researching the fast ion instability, the microwave instability with four sets of damping wigglers, and developing pul...

  15. Large Scale CW ECRH Systems: Some considerations

    Directory of Open Access Journals (Sweden)

    Turkin Y.

    2012-09-01

    Full Text Available Electron Cyclotron Resonance Heating (ECRH) is a key component in the heating arsenal for the next-step fusion devices like W7-X and ITER. These devices are equipped with superconducting coils and are designed to operate steady state. ECRH must thus operate in CW-mode with a large flexibility to comply with various physics demands such as plasma start-up, heating and current drive, as well as configuration and MHD control. The request for many different sophisticated applications results in a growing complexity, which is in conflict with the request for high availability, reliability, and maintainability. ‘Advanced’ ECRH-systems must, therefore, comply with both the complex physics demands and operational robustness and reliability. The W7-X ECRH system is the first CW facility of an ITER relevant size and is used as a test bed for advanced components. Proposals for future developments are presented together with improvements of gyrotrons, transmission components and launchers.

  16. Large-Scale Hollow Retroreflectors for Lunar Laser Ranging at Goddard Space Flight Center

    Science.gov (United States)

    Preston, Alix

    2012-01-01

    Laser ranging to the retroreflector arrays placed on the lunar surface by the Apollo astronauts and the Soviet Luna missions has dramatically increased our understanding of gravitational physics along with Earth and Moon geophysics, geodesy, and dynamics. Although the precision of the range measurements has historically been limited by the ground station capabilities, advances in the APOLLO instrument at the Apache Point facility in New Mexico mean that the precision is now beginning to be limited by errors associated with the lunar arrays. We report here on efforts at Goddard Space Flight Center to develop the next generation of lunar retroreflectors. We will describe a new facility that is being used to design, assemble, and test large-scale hollow retroreflectors. We will also describe results from investigations into various bonding techniques used to assemble the open corner cubes and mirror coatings that have dust mitigation properties.

  17. Large-Scale Hollow Retroreflectors for Lunar Laser Ranging at Goddard Space Flight Center

    Science.gov (United States)

    Preston, Alix M.

    2012-05-01

    Laser ranging to the retroreflector arrays placed on the lunar surface by the Apollo astronauts and the Soviet Luna missions has dramatically increased our understanding of gravitational physics along with Earth and Moon geophysics, geodesy, and dynamics. Although the precision of the range measurements has historically been limited by the ground station capabilities, advances in the APOLLO instrument at the Apache Point facility in New Mexico mean that the precision is now beginning to be limited by errors associated with the lunar arrays. We report here on efforts at Goddard Space Flight Center to develop the next generation of lunar retroreflectors. We will describe a new facility that is being used to design, assemble, and test large-scale hollow retroreflectors. We will also describe results from investigations into various bonding techniques used to assemble the open corner cubes and mirror coatings that have dust mitigation properties.

  18. Instrumentation Development for Large Scale Hypersonic Inflatable Aerodynamic Decelerator Characterization

    Science.gov (United States)

    Swanson, Gregory T.; Cassell, Alan M.

    2011-01-01

    Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology is currently being considered for multiple atmospheric entry applications as the limitations of traditional entry vehicles have been reached. The Inflatable Re-entry Vehicle Experiment (IRVE) has successfully demonstrated this technology as a viable candidate with a 3.0 m diameter vehicle sub-orbital flight. To further this technology, large scale HIADs (6.0-8.5 m) must be developed and tested. To characterize the performance of large scale HIAD technology, new instrumentation concepts must be developed to accommodate the flexible nature of the inflatable aeroshell. Many of the concepts that are under consideration for the HIAD FY12 subsonic wind tunnel test series are discussed below.

  19. Dynamic Response Testing in an Electrically Heated Reactor Test Facility

    Science.gov (United States)

    Bragg-Sitton, Shannon M.; Morton, T. J.

    2006-01-01

    Non-nuclear testing can be a valuable tool in the development of a space nuclear power or propulsion system. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Standard testing allows one to fully assess thermal, heat transfer, and stress related attributes of a given system, but fails to demonstrate the dynamic response that would be present in an integrated, fueled reactor system. The integration of thermal hydraulic hardware tests with simulated neutronic response provides a bridge between electrically heated testing and fueled nuclear testing. By implementing a neutronic response model to simulate the dynamic response that would be expected in a fueled reactor system, one can better understand system integration issues, characterize integrated system response times and response characteristics, and assess potential design improvements at a relatively small fiscal investment. Initial system dynamic response testing was demonstrated on the integrated SAFE-100a heat pipe (HP) cooled, electrically heated reactor and heat exchanger hardware, utilizing a one-group solution to the point kinetics equations to simulate the expected neutronic response of the system. Reactivity feedback calculations were then based on a bulk reactivity feedback coefficient and measured average core temperature. This paper presents preliminary results from similar dynamic testing of a direct drive gas cooled reactor system (DDG), demonstrating the applicability of the testing methodology to any reactor type and demonstrating the variation in system response characteristics in different reactor concepts. Although the HP and DDG designs both utilize a fast spectrum reactor, the method of cooling the reactor differs significantly, leading to a variable system response that can be demonstrated and assessed in a non-nuclear test facility. Planned system upgrades to allow implementation of higher fidelity dynamic testing are also discussed. Proposed DDG

  20. Robust regression for large-scale neuroimaging studies.

    OpenAIRE

    2015-01-01

    PUBLISHED Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypot...

  1. Robust regression for large-scale neuroimaging studies.

    OpenAIRE

    BOKDE, ARUN

    2015-01-01

    PUBLISHED Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypot...

  2. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    Science.gov (United States)

    1985-10-07

    [Scanned report front matter; most of this record's text is OCR residue from the cover form. The legible fragments identify the report as ''Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism'' by G. Agha et al., MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA.]

  3. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province. The average size of a system reached 95 ha. In 1989 there were 98 systems, and the area equipped with them was more than 10 130 ha. The study was conducted in 1986-1998 on 7 large systems with areas ranging from 230 to 520 hectares. After the introduction of the market economy in the early 90s and the ownership changes in agriculture, the large-scale sprinkler systems have undergone significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure, a change in demand structure and an increase in operating costs. There has also been a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered barriers of all kinds: limitations of the system design, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. A survey of the local area was carried out to show the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  4. Argonne to open new facility for advanced vehicle testing

    CERN Multimedia

    2002-01-01

    Argonne National Laboratory will open its Advanced Powertrain Research Facility on Friday, Nov. 15. The facility is North America's only public testing facility for engines, fuel cells, electric drives and energy storage. State-of-the-art performance and emissions measurement equipment is available to support model development and technology validation (1 page).

  5. Design Report for the ½ Scale Air-Cooled RCCS Tests in the Natural convection Shutdown heat removal Test Facility (NSTF)

    Energy Technology Data Exchange (ETDEWEB)

    Lisowski, D. D. [Argonne National Lab. (ANL), Argonne, IL (United States); Farmer, M. T. [Argonne National Lab. (ANL), Argonne, IL (United States); Lomperski, S. [Argonne National Lab. (ANL), Argonne, IL (United States); Kilsdonk, D. J. [Argonne National Lab. (ANL), Argonne, IL (United States); Bremer, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Aeschlimann, R. W. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-06-01

    The Natural convection Shutdown heat removal Test Facility (NSTF) is a large scale thermal hydraulics test facility that has been built at Argonne National Laboratory (ANL). The facility was constructed in order to carry out highly instrumented experiments that can be used to validate the performance of passive safety systems for advanced reactor designs. The facility has principally been designed for testing of Reactor Cavity Cooling System (RCCS) concepts that rely on natural convection cooling for either air- or water-based systems. Standing 25 m in height, the facility is able to supply up to 220 kW at 21 kW/m2 to accurately simulate the heat fluxes at the walls of a reactor pressure vessel. A suite of nearly 400 data acquisition channels, including a sophisticated fiber optic system for high density temperature measurements, guides test operations and provides data to support scaling analysis and modeling efforts. Measurements of system mass flow rate, air and surface temperatures, heat flux, humidity, and pressure differentials, among others, are part of this total generated data set. The following report provides an introduction to the top-level objectives of the program related to passively safe decay heat removal, and a detailed description of the engineering specifications, design features, and dimensions of the test facility at Argonne. Specifications of the sensors and their placement on the test facility will be provided, along with a complete channel listing of the data acquisition system.
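
    The two headline numbers quoted above imply, by simple division, the approximate heated test-section area; the check below is only arithmetic on the stated figures, not a dimension taken from the design report.

        # Consistency check on the quoted NSTF capability: 220 kW delivered at a peak
        # flux of 21 kW/m^2 implies the approximate heated surface area below.
        P_max_kW = 220.0
        q_peak_kW_per_m2 = 21.0
        area_m2 = P_max_kW / q_peak_kW_per_m2
        print(f"implied heated surface area: {area_m2:.1f} m^2")   # roughly 10.5 m^2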

  6. Large-Scale PV Integration Study

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  7. Conformal Anomaly and Large Scale Gravitational Coupling

    CERN Document Server

    Salehi, H

    2000-01-01

    We present a model in which the breakdown of conformal symmetry of a quantum stress-tensor due to the trace anomaly is related to a cosmological effect in a gravitational model. This is done by characterizing the traceless part of the quantum stress-tensor in terms of the stress-tensor of a conformal invariant classical scalar field. We introduce a conformal frame in which the anomalous trace is identified with a cosmological constant. In this conformal frame we establish the Einstein field equations by connecting the quantum stress-tensor with the large scale distribution of matter in the universe.

  8. Large Scale Quantum Simulations of Nuclear Pasta

    Science.gov (United States)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries collectively referred to as ``nuclear pasta'' are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals we present the first results of large scale quantum simulations of pasta phases at baryon densities of 0.03 [...] pasta configurations. This work is supported in part by DOE Grants DE-FG02-87ER40365 (Indiana University) and DE-SC0008808 (NUCLEI SciDAC Collaboration).

  9. Large scale wind power penetration in Denmark

    DEFF Research Database (Denmark)

    Karnøe, Peter

    2013-01-01

    The Danish electricity generating system prepared to adopt nuclear power in the 1970s, yet has become the world's front runner in wind power with a national plan for 50% wind power penetration by 2020. This paper deploys a sociotechnical perspective to explain the historical transformation of "net...... expertise evolves and contributes to the normalization and large-scale penetration of wind power in the electricity generating system. The analysis teaches us how technological paths become locked-in, but also indicates keys for locking them out....

  10. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  11. What is a large-scale dynamo?

    Science.gov (United States)

    Nigro, G.; Pongkitiwanichakul, P.; Cattaneo, F.; Tobias, S. M.

    2017-01-01

    We consider kinematic dynamo action in a sheared helical flow at moderate to high values of the magnetic Reynolds number (Rm). We find exponentially growing solutions which, for large enough shear, take the form of a coherent part embedded in incoherent fluctuations. We argue that at large Rm large-scale dynamo action should be identified by the presence of structures coherent in time, rather than those at large spatial scales. We further argue that although the growth rate is determined by small-scale processes, the period of the coherent structures is set by mean-field considerations.

  12. Large scale phononic metamaterials for seismic isolation

    Energy Technology Data Exchange (ETDEWEB)

    Aravantinos-Zafiris, N. [Department of Sound and Musical Instruments Technology, Ionian Islands Technological Educational Institute, Stylianou Typaldou ave., Lixouri 28200 (Greece); Sigalas, M. M. [Department of Materials Science, University of Patras, Patras 26504 (Greece)

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, thus raising the belief that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials.
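
    The band structures in this record come from full 3D elastic FDTD calculations; the sketch below is only a minimal 1D scalar-wave FDTD loop, with placeholder wave speeds standing in for the concrete/steel layering, to show the kind of explicit time stepping the method uses.

        import numpy as np

        # Minimal 1D FDTD sketch: a scalar wave in a periodic two-material stack.
        nx, dx = 400, 0.5                   # grid points, spacing [m]
        c = np.where((np.arange(nx) // 40) % 2 == 0, 3000.0, 5000.0)  # layer wave speeds [m/s]
        dt = 0.9 * dx / c.max()             # CFL-limited time step

        u_prev = np.zeros(nx)
        u = np.zeros(nx)
        u[nx // 2] = 1.0                    # initial pulse in the middle of the stack

        for _ in range(1000):
            lap = np.zeros(nx)
            lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
            u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
            u_next[0] = u_next[-1] = 0.0    # fixed (reflecting) ends
            u_prev, u = u, u_next

        print("peak displacement after 1000 steps:", float(np.abs(u).max()))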

  13. Hierarchical Engine for Large Scale Infrastructure Simulation

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-15

    HELICS is a new open-source, cyber-physical-energy co-simulation framework for electric power systems. HELICS is designed to support very-large-scale (100,000+ federates) co-simulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross-platform operating system support, the integration of both event-driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.

  14. Colloquium: Large scale simulations on GPU clusters

    Science.gov (United States)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPU) are currently used as a cost-effective platform for computer simulations and big-data processing. Large scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve excellent efficiency for applications in statistical mechanics, particle dynamics and network analysis by using suitable memory access patterns and mechanisms like CUDA streams, profiling tools, etc. Similar concepts and techniques may be applied also to other problems like the solution of Partial Differential Equations.

  15. Post-test thermal-hydraulic analysis of two intermediate LOCA tests at the ROSA facility including uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, J., E-mail: jordi@freixa.net [Paul Scherrer Institut (PSI) 5232 Villigen PSI (Switzerland); Kim, T.-W. [Paul Scherrer Institut (PSI) 5232 Villigen PSI (Switzerland); Manera, A. [University of Michigan, Ann Arbor, MI 48109 (United States)

    2013-11-15

    The OECD/NEA ROSA-2 project aims at addressing thermal-hydraulic safety issues relevant for light water reactors by building up an experimental database at the ROSA Large Scale Test Facility (LSTF). The ROSA facility simulates a PWR Westinghouse design with a four-loop configuration and a nominal power of 3423 MWth. Two intermediate break loss-of-coolant-accident (LOCA) experiments (Tests 1 and 2) have been carried out during 2010. The two tests were analyzed by using the US-NRC TRACE best estimate code, employing the same nodalization previously used for the simulation of small-break LOCA experiments of the ROSA-1 programme. A post-test calculation was performed for each test along with uncertainty analysis providing uncertainty bands for each relevant time trend. Uncertainties in the code modelling capabilities as well as in the initial and boundary conditions were taken into account, following the guidelines and lessons learnt through participation in the OECD/NEA BEMUSE programme. Two different versions of the TRACE code were used in the analysis, providing a qualitatively good prediction of the tests. However, the uncertainty analysis revealed differences between the performances of some models in the two versions. The most relevant parameters of the two experimental tests were falling within the computed uncertainty bands.
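
    The abstract does not state how the uncertainty bands were sampled; a common choice in BEMUSE-style best-estimate-plus-uncertainty work is Wilks' non-parametric tolerance-limit rule, sketched below, which fixes the minimum number of code runs for a given coverage/confidence target.

        import math  # kept for completeness; the loops below only need basic arithmetic

        def wilks_one_sided(beta=0.95, gamma=0.95):
            """Smallest N with 1 - beta**N >= gamma (first-order, one-sided limit)."""
            n = 1
            while 1.0 - beta ** n < gamma:
                n += 1
            return n

        def wilks_two_sided(beta=0.95, gamma=0.95):
            """Smallest N with 1 - beta**N - N*(1-beta)*beta**(N-1) >= gamma."""
            n = 2
            while 1.0 - beta ** n - n * (1.0 - beta) * beta ** (n - 1) < gamma:
                n += 1
            return n

        print(wilks_one_sided())   # 59 runs for a 95%/95% one-sided bound
        print(wilks_two_sided())   # 93 runs for 95%/95% two-sided bands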

  16. Post-test thermal-hydraulic analysis of two intermediate LOCA tests at the ROSA facility including uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, J.; Kim, T-W.; Manera, A. [Paul Scherrer Inst., Villigen (Switzerland)

    2011-07-01

    The OECD/NEA ROSA-2 project aims at addressing thermal-hydraulic safety issues relevant for light water reactors by building up an experimental database at the ROSA Large Scale Test Facility (LSTF). The ROSA facility simulates a PWR Westinghouse design with a four-loop configuration and a nominal power of 3423 MWth. Two intermediate break loss-of-coolant-accident (LOCA) experiments (Test 1 and 2) have been carried out during 2010. The two tests were analyzed by using the US-NRC TRACE best estimate code, employing the same nodalization previously used for the simulation of small-break LOCA experiments of the ROSA-1 program. A post-test calculation was performed for each test along with uncertainty analysis providing uncertainty bands for each relevant time trend. Uncertainties in the code modeling capabilities as well as in the initial and boundary conditions were taken into account, following the guidelines and lessons learnt through participation in the OECD/NEA BEMUSE program. Two different versions of the TRACE code were used in the analysis, providing a qualitatively good prediction of the tests. However, both versions showed deficiencies that need to be addressed. The most relevant parameters of the two experimental tests were falling within the computed uncertainty bands. (author)

  17. Large-Scale Laboratory Facility For Sediment Transport Research

    Data.gov (United States)

    Federal Laboratory Consortium — Effective design and maintenance of inlet navigation and shore protection projects require accurate estimates of the quantity of sand that moves along the beach. The...

  18. Ethanol Production from Biomass: Large Scale Facility Design Project

    Energy Technology Data Exchange (ETDEWEB)

    Berson, R. Eric [Univ. of Louisville, KY (United States)

    2009-10-29

    High solids processing of biomass slurries provides the following benefits: maximized product concentration in the fermentable sugar stream, reduced water usage, and reduced reactor size. However, high solids processing poses mixing and heat transfer problems above about 15% solids for pretreated corn stover due to the high slurry viscosities. Also, highly viscous slurries require high power consumption in conventional stirred tanks since they must be run at high rotational speeds to maintain proper mixing. An 8 liter scraped surface bio-reactor (SSBR) is employed here that is designed to efficiently handle high solids loadings for enzymatic saccharification of pretreated corn stover (PCS) while maintaining power requirements on the order of those for low-viscosity liquids in conventional stirred tanks. Saccharification of biomass exhibits slow reaction rates and incomplete conversion, which may be attributed to enzyme deactivation and loss of activity due to a variety of mechanisms. Enzyme deactivation is classified into two categories here: one, deactivation due to enzyme-substrate interactions and two, deactivation due to all other factors that are grouped together and termed “non-specific” deactivation. A study was conducted to investigate the relative extents of “non-specific” deactivation and deactivation due to “enzyme-substrate interactions”, and a model was developed that describes the kinetics of cellulose hydrolysis by considering the observed deactivation effects. Enzyme-substrate interactions had a much more significant effect on overall deactivation, with a deactivation rate constant about 20X higher than the non-specific deactivation rate constant (0.35 h-1 vs 0.018 h-1). The model is well validated by the experimental data and predicts complete conversion of cellulose within 30 hours in the absence of enzyme-substrate interactions.
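
    Reading the two quoted rate constants through a simple first-order deactivation law (our assumption for illustration; the report's model also couples deactivation to the hydrolysis kinetics) gives a feel for the difference in time scales:

        \[
          \frac{E(t)}{E_{0}} = e^{-k_{d} t}, \qquad
          t_{1/2} = \frac{\ln 2}{k_{d}} \approx
          \begin{cases}
            2.0\ \text{h}, & k_{d} = 0.35\ \text{h}^{-1} \ \text{(enzyme-substrate interactions)} \\
            38.5\ \text{h}, & k_{d} = 0.018\ \text{h}^{-1} \ \text{(non-specific)}
          \end{cases}
        \]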

  19. Robust regression for large-scale neuroimaging studies.

    Science.gov (United States)

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies.
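
    As a toy illustration of the point (not the authors' analysis pipeline, and using synthetic data rather than neuroimaging cohorts), a Huber-type robust fit stays close to the true effect when a few high-leverage observations are corrupted, while ordinary least squares does not:

        import numpy as np
        from sklearn.linear_model import HuberRegressor, LinearRegression

        rng = np.random.default_rng(0)
        n = 200
        X = rng.normal(size=(n, 1))
        y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=n)

        idx = np.argsort(X[:, 0])[-10:]          # corrupt the 10 largest-leverage points
        y[idx] -= 20.0

        ols = LinearRegression().fit(X, y)
        robust = HuberRegressor().fit(X, y)      # default epsilon=1.35 downweights outliers

        print(f"true slope 2.00 | OLS slope {ols.coef_[0]:.2f} | Huber slope {robust.coef_[0]:.2f}")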

  20. Upgrade of the Cryogenic CERN RF Test Facility

    CERN Document Server

    Pirotte, O; Brunner, O; Inglese, V; Koettig, T; Maesen, P; Vullierme, B

    2014-01-01

    With the large number of superconducting radiofrequency (RF) cryomodules to be tested for the former LEP and the present LHC accelerator a RF test facility was erected early in the 1990’s in the largest cryogenic test facility at CERN located at Point 18. This facility consisted of four vertical test stands for single cavities and originally one and then two horizontal test benches for RF cryomodules operating at 4.5 K in saturated helium. CERN is presently working on the upgrade of its accelerator infrastructure, which requires new superconducting cavities operating below 2 K in saturated superfluid helium. Consequently, the RF test facility has been renewed in order to allow efficient cavity and cryomodule tests in superfluid helium and to improve its thermal performances. The new RF test facility is described and its performances are presented.

  1. Upgrade of the cryogenic CERN RF test facility

    Science.gov (United States)

    Pirotte, O.; Benda, V.; Brunner, O.; Inglese, V.; Koettig, T.; Maesen, P.; Vullierme, B.

    2014-01-01

    With the large number of superconducting radiofrequency (RF) cryomodules to be tested for the former LEP and the present LHC accelerator a RF test facility was erected early in the 1990's in the largest cryogenic test facility at CERN located at Point 18. This facility consisted of four vertical test stands for single cavities and originally one and then two horizontal test benches for RF cryomodules operating at 4.5 K in saturated helium. CERN is presently working on the upgrade of its accelerator infrastructure, which requires new superconducting cavities operating below 2 K in saturated superfluid helium. Consequently, the RF test facility has been renewed in order to allow efficient cavity and cryomodule tests in superfluid helium and to improve its thermal performances. The new RF test facility is described and its performances are presented.

  2. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in propulsion test facility design and development by assessing risks, identifying failure modes and predicting...

  3. Computational Modeling in Support of High Altitude Testing Facilities Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Simulation technology plays an important role in rocket engine test facility design and development by assessing risks, identifying failure modes and predicting...

  4. Learning Design at White Sands Test Facility

    Science.gov (United States)

    Grotewiel, Shane

    2010-01-01

    During the Fall of 2010, I spent my time at NASA White Sands Test Facility in Las Cruces, NM as an Undergraduate Student Research Program (USRP) Intern. During that time, I was given three projects to work on: Large Altitude Simulation System (LASS) basket strainer, log books, and the design of a case for touch screen monitors used for simulations. I spent most of my time on the LASS basket strainer. The LASS system has a water feed line with a basket strainer that filters out rust. In 2009, there were three misfires which cost approximately $27,000 and about 8% of the allotted time. The strainer was getting a large change in pressure that would result in a shutdown of the system. I have designed a new basket that will eliminate the large pressure change and it can be used with the old basket strainer housing. The LASS system has three steam generators (modules). Documents pertaining to these modules are stored electronically, and the majority of the documents are not able to be searched with keywords, so they have to be gone through one by one. I have come up with an idea on how to organize these files so that the Propulsion Department may efficiently search through the documents needed. Propulsion also has a LASS simulator that incorporates two touch screen monitors. Currently these monitors are in a six-foot by two-foot metal cabinet on wheels. During simulation these monitors are used in the block house and need to be taken out of the block house when not in use. I have designed different options for hand held cases for storing and transporting the monitors in and out of the block house. The three projects previously mentioned demonstrate my contributions to the Propulsion Department and have given me real-world experience that is essential in becoming a productive engineer.

  5. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Internationalization measures in Large Scale Research Projects. Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities are challenged with little experience on how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, those undertakings constantly have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There is a variety of measures suited to supporting universities in international recruitment. These include e.g. institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes in order to be able to recruit, support and keep the brightest heads in the project.

  6. Large-scale Globally Propagating Coronal Waves

    Directory of Open Access Journals (Sweden)

    Alexander Warmuth

    2015-09-01

    Full Text Available Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous spaced-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the “classical” interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which “pseudo waves” are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  7. The multipurpose thermalhydraulic test facility TOPFLOW: an overview on experimental capabilities, instrumentation and results

    Energy Technology Data Exchange (ETDEWEB)

    Prasser, H.M.; Beyer, M.; Carl, H.; Manera, A.; Pietruske, H.; Schuetz, P.; Weiss, F.P. [Forschungszentrum Rossedorf e.V. (FZR), Dresden (Germany). Inst. fuer Sicherheitsforschung

    2006-08-15

    A new multipurpose thermalhydraulic test facility TOPFLOW (TwO Phase FLOW) was built and put into operation at Forschungszentrum Rossendorf in 2002 and 2003. Since then, it has been mainly used for the investigation of generic and applied steady state and transient two phase flow phenomena and the development and validation of models of computational fluid dynamic (CFD) codes in the frame of the German CFD initiative. The advantage of TOPFLOW lies in the combination of large-scale test channels with a wide operational range of flow velocities, system pressures and temperatures, and the availability of special instrumentation capable of resolving two phase flow phenomena with high spatial and temporal resolution, for example the wire-mesh sensors. (orig.)
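
    As background on the wire-mesh sensors mentioned above (a generic processing sketch, not a description of the TOPFLOW data-reduction software): each crossing point of the electrode grid yields an instantaneous conductivity signal that is normalized by a liquid-only calibration frame to give a local gas fraction.

        import numpy as np

        # Synthetic example of the usual wire-mesh reduction step on a 16x16 grid:
        # normalize each crossing-point signal by its liquid-only calibration value.
        rng = np.random.default_rng(1)
        u_liquid = np.full((16, 16), 1.0)            # calibration frame (pipe full of liquid)
        u_frame = rng.uniform(0.2, 1.0, (16, 16))    # one measured frame (placeholder values)

        void_local = 1.0 - u_frame / u_liquid        # 0 = liquid, 1 = gas at each crossing point
        print(f"cross-section averaged void fraction: {void_local.mean():.2f}")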

  8. Bayesian large-scale structure inference and cosmic web analysis

    CERN Document Server

    Leclercq, Florent

    2015-01-01

    Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...
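
    The BORG algorithm itself combines a physical structure formation model with large-scale MCMC sampling, which is far beyond a few lines of code. As a minimal sketch of the underlying idea, Bayesian reconstruction of a density field from noisy data together with posterior samples for uncertainty quantification, the following toy example infers a 1-D Gaussian field with a Wiener-filter posterior. All spectra, noise levels and grid sizes are invented for illustration and are not taken from the thesis.

      # Minimal sketch (not the BORG algorithm): Bayesian inference of a 1-D
      # Gaussian density field from noisy data, with posterior samples for
      # uncertainty quantification.  All names and numbers are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      n, box = 256, 1000.0                      # grid cells, box size (toy units)
      k = np.fft.rfftfreq(n, d=box / n) * 2 * np.pi
      P = np.zeros_like(k); P[1:] = 1e3 * k[1:] ** -1.5   # toy signal power spectrum
      N = 1e4                                   # white noise power (toy value)

      def sample_field(power):
          """Draw a real Gaussian random field with the given power spectrum."""
          amp = np.sqrt(power * n / box / 2)
          f = amp * (rng.standard_normal(k.size) + 1j * rng.standard_normal(k.size))
          return np.fft.irfft(f, n=n)

      signal = sample_field(P)
      data = signal + sample_field(np.full_like(k, N))    # noisy "survey"

      # Gaussian posterior: Wiener-filter mean plus fluctuations drawn from
      # the posterior covariance P*N/(P+N), which is diagonal in Fourier space.
      d_k = np.fft.rfft(data)
      mean = np.fft.irfft(P / (P + N) * d_k, n=n)
      samples = [mean + sample_field(P * N / (P + N)) for _ in range(100)]
      std = np.std(samples, axis=0)              # per-cell posterior uncertainty
      print(f"rms reconstruction error: {np.std(mean - signal):.2f} "
            f"(mean posterior std ~ {std.mean():.2f})")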

  9. Supermassive black holes, large scale structure and holography

    CERN Document Server

    Mongan, T R

    2013-01-01

    A holographic analysis of large scale structure in the universe estimates the mass of supermassive black holes at the center of large scale structures with matter density varying inversely as the square of the distance from their center. The estimate is consistent with two important test cases involving observations of the supermassive black hole with mass 3.6×10^-6 times the galactic mass in Sagittarius A* near the center of our Milky Way and the 2×10^9 solar mass black hole in the quasar ULAS J112001.48+064124.3 at redshift z=7.085. It is also consistent with upper bounds on central black hole masses in globular clusters M15, M19 and M22 developed using the Jansky Very Large Array in New Mexico.
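
    As a quick plausibility check of the first test case quoted above, the mass fraction can be multiplied by an assumed total Milky Way mass; the value of roughly 10^12 solar masses used below is an illustrative assumption and is not taken from the paper.

      # Back-of-the-envelope check of the Sgr A* estimate quoted above.
      # The Milky Way total mass is an assumed, illustrative value (~1e12 Msun).
      M_galaxy = 1.0e12          # solar masses (assumption)
      fraction = 3.6e-6          # mass fraction from the holographic estimate
      M_sgrA = fraction * M_galaxy
      print(f"Estimated Sgr A* mass: {M_sgrA:.1e} Msun")
      # ~3.6e6 Msun, comparable to the ~4e6 Msun obtained from stellar orbits.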

  10. Distant galaxy clusters in the XMM Large Scale Structure survey

    CERN Document Server

    Willis, J P; Bremer, M N; Pierre, M; Adami, C; Ilbert, O; Maughan, B; Maurogordato, S; Pacaud, F; Valtchanov, I; Chiappetti, L; Thanjavur, K; Gwyn, S; Stanway, E R; Winkworth, C

    2012-01-01

    (Abridged) Distant galaxy clusters provide important tests of the growth of large scale structure in addition to highlighting the process of galaxy evolution in a consistently defined environment at large look back time. We present a sample of 22 distant (z>0.8) galaxy clusters and cluster candidates selected from the 9 deg^2 footprint of the overlapping X-ray Multi Mirror (XMM) Large Scale Structure (LSS), CFHTLS Wide and Spitzer SWIRE surveys. Clusters are selected as extended X-ray sources with an accompanying overdensity of galaxies displaying optical to mid-infrared photometry consistent with z>0.8. Nine clusters have confirmed spectroscopic redshifts in the interval 0.80.8 clusters.

  11. Quantum noise in large-scale coherent nonlinear photonic circuits

    CERN Document Server

    Santori, Charles; Beausoleil, Raymond G; Tezak, Nikolas; Hamerly, Ryan; Mabuchi, Hideo

    2014-01-01

    A semiclassical simulation approach is presented for studying quantum noise in large-scale photonic circuits incorporating an ideal Kerr nonlinearity. A netlist-based circuit solver is used to generate matrices defining a set of stochastic differential equations, in which the resonator field variables represent random samplings of the Wigner quasi-probability distributions. Although the semiclassical approach involves making a large-photon-number approximation, tests on one- and two-resonator circuits indicate satisfactory agreement between the semiclassical and full-quantum simulation results in the parameter regime of interest. The semiclassical model is used to simulate random errors in a large-scale circuit that contains 88 resonators and hundreds of components in total, and functions as a 4-bit ripple counter. The error rate as a function of on-state photon number is examined, and it is observed that the quantum fluctuation amplitudes do not increase as signals propagate through the circuit, an important...
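
    The abstract describes sampling Wigner quasi-probability distributions and integrating the resulting stochastic differential equations. The sketch below applies that semiclassical (truncated-Wigner) recipe to a single driven Kerr resonator rather than an 88-resonator circuit; the equation of motion is the standard damped, driven Kerr model, the integrator is plain Euler-Maruyama, and all parameter values are arbitrary toy choices rather than anything from the paper's netlist-based solver.

      # Semiclassical (truncated-Wigner) sketch for ONE driven Kerr resonator,
      # illustrating the idea of sampling vacuum Wigner distributions and
      # integrating stochastic differential equations.  Parameters are toy values.
      import numpy as np

      rng = np.random.default_rng(1)
      kappa, delta, chi, drive = 1.0, -0.5, 0.01, 3.0   # decay, detuning, Kerr, drive
      dt, steps, ntraj = 1e-3, 20_000, 500

      # Vacuum Wigner samples: symmetric-ordered variance <|alpha|^2> = 1/2.
      alpha = (rng.standard_normal(ntraj) + 1j * rng.standard_normal(ntraj)) / 2

      for _ in range(steps):
          drift = ((-1j * delta - kappa / 2) * alpha
                   - 1j * chi * np.abs(alpha) ** 2 * alpha + drive)
          noise = np.sqrt(kappa * dt / 4) * (rng.standard_normal(ntraj)
                                             + 1j * rng.standard_normal(ntraj))
          alpha = alpha + drift * dt + noise        # Euler-Maruyama step

      # Symmetric-ordered moment minus 1/2 gives the mean photon number estimate.
      n_phot = np.mean(np.abs(alpha) ** 2) - 0.5
      print(f"steady-state photon number ~ {n_phot:.1f}, "
            f"relative fluctuation ~ {np.std(np.abs(alpha)**2) / n_phot:.2f}")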

  12. Team Update on North American Proton Facilities for Radiation Testing

    Science.gov (United States)

    LaBel, Kenneth A.; Turflinger, Thomas; Haas, Thurman; George, Jeffrey; Moss, Steven; Davis, Scott; Kostic, Andrew; Wie, Brian; Reed, Robert; Guertin, Steven; Wert, Jerry; Foster, Charles

    2016-01-01

    In the wake of the closure of the Indiana University Cyclotron Facility (IUCF), this presentation provides an overview of the options for North American proton facilities. This includes those in use by the aerospace community as well as new additions from the cancer therapy regime. In addition, background on proton single-event testing is provided to explain the criteria these facilities must meet for electronics testing.

  13. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  14. Accelerated large-scale multiple sequence alignment

    Directory of Open Access Journals (Sweden)

    Lloyd Scott

    2011-12-01

    Full Text Available Abstract. Background: Multiple sequence alignment (MSA) is a fundamental analysis method used in bioinformatics and many comparative genomic applications. Prior MSA acceleration attempts with reconfigurable computing have only addressed the first stage of progressive alignment and consequently exhibit performance limitations according to Amdahl's Law. This work is the first known to accelerate the third stage of progressive alignment on reconfigurable hardware. Results: We reduce subgroups of aligned sequences into discrete profiles before they are pairwise aligned on the accelerator. Using an FPGA accelerator, an overall speedup of up to 150 has been demonstrated on a large data set when compared to a 2.4 GHz Core2 processor. Conclusions: Our parallel algorithm and architecture accelerates large-scale MSA with reconfigurable computing and allows researchers to solve the larger problems that confront biologists today. Program source is available from http://dna.cs.byu.edu/msa/.
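
    As a compact illustration of the third-stage idea summarized above, reducing two already-aligned subgroups to column profiles and then aligning the profiles pairwise with dynamic programming, the sketch below uses a simple expected sum-of-pairs column score. The scoring values, gap penalty and toy sequences are invented, and nothing here involves FPGA acceleration.

      # Sketch of "third stage" progressive alignment: two aligned subgroups are
      # reduced to column profiles, which are then pairwise aligned with a simple
      # Needleman-Wunsch dynamic program.  Scoring scheme and sequences are toys.
      import numpy as np

      ALPHABET = "ACGT-"
      MATCH, MISMATCH, GAP = 1.0, -1.0, -2.0

      def profile(aligned_seqs):
          """Column-wise symbol frequencies of an already-aligned group."""
          cols = len(aligned_seqs[0])
          prof = np.zeros((cols, len(ALPHABET)))
          for seq in aligned_seqs:
              for j, ch in enumerate(seq):
                  prof[j, ALPHABET.index(ch)] += 1
          return prof / len(aligned_seqs)

      def column_score(p, q):
          """Expected sum-of-pairs score of two profile columns."""
          s = np.full((len(ALPHABET), len(ALPHABET)), MISMATCH)
          np.fill_diagonal(s, MATCH)
          s[-1, :] = s[:, -1] = GAP / 2         # crude handling of gap symbols
          return p @ s @ q

      def align_profiles(p, q):
          """Global (Needleman-Wunsch) alignment score of two profiles."""
          n, m = len(p), len(q)
          dp = np.zeros((n + 1, m + 1))
          dp[:, 0] = np.arange(n + 1) * GAP
          dp[0, :] = np.arange(m + 1) * GAP
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  dp[i, j] = max(dp[i - 1, j - 1] + column_score(p[i - 1], q[j - 1]),
                                 dp[i - 1, j] + GAP,
                                 dp[i, j - 1] + GAP)
          return dp[n, m]

      group1 = ["ACG-T", "ACGAT"]               # toy pre-aligned subgroups
      group2 = ["AC-GT", "ACTGT", "AC-GT"]
      print("profile-profile alignment score:",
            align_profiles(profile(group1), profile(group2)))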

  15. Clumps in large scale relativistic jets

    CERN Document Server

    Tavecchio, F; Celotti, A

    2003-01-01

    The relatively intense X-ray emission from large scale (tens to hundreds of kpc) jets discovered with Chandra likely implies that jets (at least in powerful quasars) are still relativistic at those distances from the active nucleus. In this case the emission is due to Compton scattering off seed photons provided by the Cosmic Microwave Background, which on one hand permits magnetic fields close to equipartition with the emitting particles, and on the other hand minimizes the requirements on the total power carried by the jet. The emission comes from compact (kpc scale) knots, and we here investigate what can be predicted about the possible emission between the bright knots. This is motivated by the fact that bulk relativistic motion makes Compton scattering off the CMB photons efficient even when electrons are cold or mildly relativistic in the comoving frame. This implies relatively long cooling times, dominated by adiabatic losses. Therefore the relativistically moving plasma can emit, by Compton sc...
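
    The efficiency argument above rests on the fact that the CMB energy density seen in the jet frame is boosted by roughly the square of the bulk Lorentz factor. The numbers below evaluate the standard expressions for the boosted energy density and the inverse-Compton cooling time; the redshift, bulk Lorentz factor and electron Lorentz factor are assumed illustrative values, not parameters from the paper.

      # Illustrative numbers for inverse-Compton scattering off the CMB in a
      # relativistic kpc-scale jet.  Redshift, bulk Lorentz factor and electron
      # Lorentz factor are assumed toy values, not taken from the paper.
      a_rad   = 7.566e-16      # radiation constant [J m^-3 K^-4]
      sigma_T = 6.652e-29      # Thomson cross section [m^2]
      m_e, c  = 9.109e-31, 2.998e8

      z, Gamma, gamma_e = 0.5, 10.0, 100.0           # assumptions
      U_cmb = a_rad * (2.725 * (1 + z)) ** 4         # CMB energy density at z
      U_jet = (4.0 / 3.0) * Gamma**2 * U_cmb         # boosted density in jet frame (beta -> 1)
      t_cool = 3 * m_e * c / (4 * sigma_T * gamma_e * U_jet)   # IC cooling time [s]
      print(f"U'_CMB / U_CMB ~ {U_jet / U_cmb:.0f}, "
            f"IC cooling time ~ {t_cool / 3.15e7 / 1e6:.1f} Myr")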

  16. Large-scale parametric survival analysis.

    Science.gov (United States)

    Mittal, Sushil; Madigan, David; Cheng, Jerry Q; Burd, Randall S

    2013-10-15

    Survival analysis has been a topic of active statistical research in the past few decades with applications spread across several areas. Traditional applications usually consider data with only a small number of predictors and a few hundred or thousand observations. Recent advances in data acquisition techniques and computation power have led to considerable interest in analyzing very-high-dimensional data where the number of predictor variables and the number of observations range between 10^4 and 10^6. In this paper, we present a tool for performing large-scale regularized parametric survival analysis using a variant of the cyclic coordinate descent method. Through our experiments on two real data sets, we show that application of regularized models to high-dimensional data avoids overfitting and can provide improved predictive performance and calibration over corresponding low-dimensional models.
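
    The tool described above solves regularized parametric survival models with a cyclic coordinate descent variant on very large data sets. The sketch below conveys the same modelling idea at toy scale: an L2-penalized exponential (constant-hazard) survival regression fitted with a generic quasi-Newton optimizer instead of coordinate descent, on synthetic data.

      # Toy L2-regularized parametric survival regression (exponential hazards).
      # The paper's solver uses cyclic coordinate descent on much larger problems;
      # here a generic quasi-Newton optimizer is used for brevity.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)
      n, p, lam = 2000, 20, 1.0
      X = rng.standard_normal((n, p))
      beta_true = np.zeros(p); beta_true[:3] = [0.8, -0.5, 0.3]   # sparse truth
      t_event = rng.exponential(1.0 / np.exp(X @ beta_true))      # event times
      t_cens = rng.exponential(2.0, n)                            # censoring times
      time = np.minimum(t_event, t_cens)
      event = (t_event <= t_cens).astype(float)                   # 1 = observed

      def neg_penalized_loglik(beta):
          eta = X @ beta                     # log hazard per subject
          loglik = np.sum(event * eta - time * np.exp(eta))
          return -loglik + lam * np.sum(beta**2)

      fit = minimize(neg_penalized_loglik, np.zeros(p), method="L-BFGS-B")
      print("estimated leading coefficients:", np.round(fit.x[:5], 2))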

  17. Curvature constraints from Large Scale Structure

    CERN Document Server

    Di Dio, Enea; Raccanelli, Alvise; Durrer, Ruth; Kamionkowski, Marc; Lesgourgues, Julien

    2016-01-01

    We modified the CLASS code in order to include relativistic galaxy number counts in spatially curved geometries; we present the formalism and study the effect of relativistic corrections on spatial curvature. The new version of the code is now publicly available. Using a Fisher matrix analysis, we investigate how measurements of the spatial curvature parameter $\Omega_K$ with future galaxy surveys are affected by relativistic effects, which influence observations of the large scale galaxy distribution. These effects include contributions from cosmic magnification, Doppler terms and terms involving the gravitational potential. As an application, we consider angle and redshift dependent power spectra, which are especially well suited for model independent cosmological constraints. We compute our results for a representative deep, wide and spectroscopic survey, and our results show the impact of relativistic corrections on the spatial curvature parameter estimation. We show that constraints on the curvature para...
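
    For readers unfamiliar with the Fisher-matrix step mentioned above, the sketch below forecasts parameter errors for a toy two-parameter angular power spectrum with a Gaussian per-multipole covariance. The spectrum model, fiducial values and survey numbers are invented for illustration and have nothing to do with the actual CLASS-based relativistic number-count analysis.

      # Minimal Fisher-matrix forecast for a toy angular power spectrum C_ell(A, n)
      # with Gaussian covariance.  Model, fiducial values and survey numbers are
      # illustrative only.
      import numpy as np

      ells = np.arange(10, 1000)
      f_sky, shot_noise = 0.5, 1e-8             # assumed survey parameters

      def cl(A, n):
          """Toy power-law angular spectrum."""
          return A * (ells / 100.0) ** n

      theta_fid = np.array([1e-6, -1.2])        # fiducial (A, n)
      step = theta_fid * 1e-3                   # finite-difference steps

      # dC_ell/dtheta_i via central differences.
      deriv = []
      for i in range(2):
          dp, dm = theta_fid.copy(), theta_fid.copy()
          dp[i] += step[i]; dm[i] -= step[i]
          deriv.append((cl(*dp) - cl(*dm)) / (2 * step[i]))

      # Gaussian per-ell variance of C_ell: 2 (C_ell + N)^2 / ((2 ell + 1) f_sky).
      var = 2 * (cl(*theta_fid) + shot_noise) ** 2 / ((2 * ells + 1) * f_sky)
      F = np.array([[np.sum(deriv[i] * deriv[j] / var) for j in range(2)]
                    for i in range(2)])
      errors = np.sqrt(np.diag(np.linalg.inv(F)))   # marginalized 1-sigma errors
      print("sigma(A), sigma(n) =", errors)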

  18. Large-scale simulations of reionization

    Energy Technology Data Exchange (ETDEWEB)

    Kohler, Katharina (JILA, Boulder; Fermilab); Gnedin, Nickolay Y. (Fermilab); Hamilton, Andrew J.S. (JILA, Boulder)

    2005-11-01

    We use cosmological simulations to explore the large-scale effects of reionization. Since reionization is a process that involves a large dynamic range--from galaxies to rare bright quasars--we need to be able to cover a significant volume of the universe in our simulation without losing the important small scale effects from galaxies. Here we have taken an approach that uses clumping factors derived from small scale simulations to approximate the radiative transfer on the sub-cell scales. Using this technique, we can cover a simulation size up to 1280 h^-1 Mpc with 10 h^-1 Mpc cells. This allows us to construct synthetic spectra of quasars similar to observed spectra of SDSS quasars at high redshifts and compare them to the observational data. These spectra can then be analyzed for HII region sizes, the presence of the Gunn-Peterson trough, and the Lyman-alpha forest.
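
    The clumping-factor trick mentioned above can be stated in one line: unresolved density fluctuations enhance the recombination rate inside a coarse cell by C = <n^2>/<n>^2 relative to what the cell-averaged density alone would give. The small example below computes C for a mock sub-grid density distribution; the log-normal scatter and the case-B recombination coefficient are illustrative values.

      # Illustration of the sub-grid clumping factor C = <n^2> / <n>^2 that scales
      # the recombination rate inside a coarse cell.  Density field and coefficient
      # values are illustrative.
      import numpy as np

      rng = np.random.default_rng(3)
      # Mock small-scale hydrogen densities inside one coarse cell (log-normal scatter).
      n_H = 2e-7 * rng.lognormal(mean=0.0, sigma=0.8, size=100_000)   # cm^-3

      C = np.mean(n_H**2) / np.mean(n_H)**2          # clumping factor
      alpha_B = 2.6e-13                              # case-B coefficient, cm^3 s^-1 at ~1e4 K
      rate_mean_field = alpha_B * np.mean(n_H)**2    # using the coarse mean density only
      rate_clumped = C * rate_mean_field             # sub-grid-corrected rate
      print(f"clumping factor C ~ {C:.2f}; recombinations boosted by "
            f"{rate_clumped / rate_mean_field:.2f}x")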

  19. Large-Scale Tides in General Relativity

    CERN Document Server

    Ip, Hiu Yan

    2016-01-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the "separate universe" paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation ...

  20. Large scale water lens for solar concentration.

    Science.gov (United States)

    Mondol, A S; Vogel, B; Bastian, G

    2015-06-01

    Properties of large scale water lenses for solar concentration were investigated. These lenses were built from readily available materials: normal tap water and hyper-elastic linear low-density polyethylene foil. Exposed to sunlight, the focal lengths and light intensities in the focal spot were measured and calculated. Their optical properties were modeled with raytracing software based on the lens shape. We achieved a good match of experimental and theoretical data by considering the wavelength-dependent concentration factor, absorption and focal length. The change in light concentration as a function of water volume was examined via the resulting load on the foil and the corresponding change of shape. The latter was extracted from images and modeled by a finite element simulation.
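
    As a rough cross-check of the geometry described above, a water-filled foil that bulges into an approximately spherical cap behaves like a plano-convex thin lens, so f ≈ R/(n-1) with n ≈ 1.33 for water. The radius of curvature used below is an assumed example value, not a measurement from the paper.

      # Thin-lens estimate for a water-filled foil lens approximated as a
      # plano-convex lens (spherical cap on one side).  The radius of curvature
      # is an assumed example value, not a measurement from the paper.
      n_water = 1.333          # refractive index of water (visible light)
      R = 1.5                  # assumed radius of curvature of the bulging side [m]
      f = R / (n_water - 1.0)  # lensmaker's equation, plano-convex, thin lens
      print(f"estimated focal length ~ {f:.1f} m")   # ~4.5 m for these numbers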