WorldWideScience

Sample records for ground-based benchmark exposure

  1. Effects of exposure imprecision on estimation of the benchmark dose

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe

    2004-01-01

    In regression analysis failure to adjust for imprecision in the exposure variable is likely to lead to underestimation of the exposure effect. However, the consequences of exposure error for determination of safe doses of toxic substances have so far not received much attention. The benchmark...... approach is one of the most widely used methods for development of exposure limits. An important advantage of this approach is that it can be applied to observational data. However, in this type of data, exposure markers are seldom measured without error. It is shown that, if the exposure error is ignored......, then the benchmark approach produces results that are biased toward higher and less protective levels. It is therefore important to take exposure measurement error into account when calculating benchmark doses. Methods that allow this adjustment are described and illustrated in data from an epidemiological study...
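
    The attenuation mechanism the abstract describes is easy to reproduce numerically. The sketch below is purely illustrative (all variable names and numbers are hypothetical, not from the study): it simulates classical measurement error in an exposure marker and shows that a benchmark dose computed from the naive regression slope comes out higher, i.e. less protective.

```python
import numpy as np

def benchmark_dose(slope, bmr=1.0):
    """Dose producing a response change of `bmr` under a linear dose-response."""
    return bmr / slope

rng = np.random.default_rng(0)
n = 5000
true_exposure = rng.gamma(shape=2.0, scale=5.0, size=n)
response = 0.5 * true_exposure + rng.normal(0.0, 2.0, size=n)  # true slope 0.5

# Observed exposure marker with classical (non-differential) measurement error
observed = true_exposure + rng.normal(0.0, 5.0, size=n)

slope_true = np.polyfit(true_exposure, response, 1)[0]
slope_naive = np.polyfit(observed, response, 1)[0]  # attenuated toward zero

bmd_true = benchmark_dose(slope_true)
bmd_naive = benchmark_dose(slope_naive)
print(bmd_true < bmd_naive)  # naive BMD is biased upward (less protective)
```

    The attenuation factor is var(X)/(var(X)+var(U)), so the larger the marker's error variance, the higher the naive benchmark dose.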

  2. Benchmarking

    OpenAIRE

    Meylianti S., Brigita

    1999-01-01

    Benchmarking has different meanings to different people. There are five types of benchmarking, namely internal benchmarking, competitive benchmarking, industry/functional benchmarking, process/generic benchmarking and collaborative benchmarking. Each type of benchmarking has its own advantages as well as disadvantages. Therefore it is important to know what kind of benchmarking is suitable for a specific application. This paper will discuss those five types of benchmarking in detail, includ...

  3. The ground based plan

    International Nuclear Information System (INIS)

    1989-01-01

    The paper presents a report of "The Ground Based Plan" of the United Kingdom Science and Engineering Research Council. The ground based plan is a plan for research in astronomy and planetary science by ground based techniques. The report contains a description of: the scientific objectives and technical requirements (the basis for the Plan); the present organisation and funding for the ground based programme; the Plan itself; its main scientific features; and the further objectives of the Plan. (U.K.)

  4. Benchmarking

    OpenAIRE

    Beretta Sergio; Dossi Andrea; Grove Hugh

    2000-01-01

    Due to their particular nature, benchmarking methodologies tend to exceed the boundaries of management techniques and to enter the territory of managerial culture. It is a culture that is also destined to break into the accounting area, not only by strongly supporting the possibility of fixing targets and of measuring and comparing performance (an aspect that is already innovative and worthy of attention), but also by questioning one of the principles (or taboos) of the accounting or...

  5. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

    Benchmarking Benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine whether current benchmark asset pricing models adequately describe the cross-section of stock returns.

  6. Exchange Rate Exposure Management: The Benchmarking Process of Industrial Companies

    DEFF Research Database (Denmark)

    Aabo, Tom

    Based on a cross-case study of Danish industrial companies the paper analyzes the benchmarking of the optimal hedging strategy. A stock market approach is pursued but a serious question mark is put on the validity of the obtained information seen from a corporate value-adding point of view...... The conducted interviews show that empirical reasons behind actual hedging strategies vary considerably - some in accordance with mainstream finance theory, some resting on asymmetric information. The diversity of attitudes seems to be partly a result of different competitive environments, partly a result of practices and strategies that have been established in each company fairly independently over time. The paper argues that hedge benchmarks are useful in their creation process (by forcing a comprehensive analysis) as well as in their final status (by the establishment of a consistent hedging strategy)......

  7. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.; Cook, J.R.

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided
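
    For the ingestion pathway, the spreadsheet-level calculation the subteam used as its most fundamental comparison point is a simple product of concentration, intake rate and dose coefficient. A minimal sketch follows; the function name is invented, and the intake rate and the Cs-137-style dose coefficient are approximate ICRP-type values chosen for illustration, not inputs taken from the report.

```python
def ingestion_dose(conc_bq_per_l, intake_l_per_yr, dose_coeff_sv_per_bq):
    """Annual committed effective dose (Sv) from drinking-water ingestion."""
    return conc_bq_per_l * intake_l_per_yr * dose_coeff_sv_per_bq

# Unit concentration (1 Bq/L), 2 L/day adult intake, Cs-137-style coefficient
dose = ingestion_dose(1.0, 730.0, 1.3e-8)
print(round(dose * 1e6, 2), "uSv/yr")  # about 9.49 uSv/yr
```

    Running the same unit-concentration case through each code and through the hand calculation is exactly the kind of cross-check the benchmarking exercise describes.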

  8. Ground-based photo monitoring

    Science.gov (United States)

    Frederick C. Hall

    2000-01-01

    Ground-based photo monitoring is repeat photography using ground-based cameras to document change in vegetation or soil. Assume those installing the photo location will not be the ones re-photographing it. This requires a protocol that includes: (1) a map to locate the monitoring area, (2) another map diagramming the photographic layout, (3) type and make of film such...

  9. An international pooled analysis for obtaining a benchmark dose for environmental lead exposure in children

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Bellinger, David; Lanphear, Bruce

    2013-01-01

    Lead is a recognized neurotoxicant, but estimating effects at the lowest measurable levels is difficult. An international pooled analysis of data from seven cohort studies reported an inverse and supra-linear relationship between blood lead concentrations and IQ scores in children. The lack...... of a clear threshold presents a challenge to the identification of an acceptable level of exposure. The benchmark dose (BMD) is defined as the dose that leads to a specific known loss. As an alternative to elusive thresholds, the BMD is being used increasingly by regulatory authorities. Using the pooled data...... yielding lower confidence limits (BMDLs) of about 0.1-1.0 μg/dL for the dose leading to a loss of one IQ point. We conclude that current allowable blood lead concentrations need to be lowered and further prevention efforts are needed to protect children from lead toxicity....
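
    Under a simple linear dose-response model (an assumption for illustration only; the pooled analysis itself used more flexible, supra-linear models), the BMD and its lower confidence limit BMDL follow directly from the slope estimate and its standard error. The slope and SE below are hypothetical.

```python
def bmd_and_bmdl(beta_hat, se_beta, bmr=1.0, z=1.645):
    """Linear model IQ_loss = beta * blood_lead. BMD is the dose giving a
    loss of `bmr` IQ points; the BMDL uses the one-sided 95% upper limit
    on the slope, which maps to a lower limit on the dose."""
    bmd = bmr / beta_hat
    bmdl = bmr / (beta_hat + z * se_beta)
    return bmd, bmdl

# Hypothetical slope: 0.5 IQ points lost per ug/dL blood lead, SE 0.1
bmd, bmdl = bmd_and_bmdl(0.5, 0.1)
print(round(bmd, 2), round(bmdl, 2))  # 2.0 1.5
```

    Note the direction of the inequality: uncertainty in the slope always pushes the BMDL below the BMD, which is why regulators quote the BMDL.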

  10. Ground-based observations of exoplanet atmospheres

    NARCIS (Netherlands)

    Mooij, Ernst Johan Walter de

    2011-01-01

    This thesis focuses on the properties of exoplanet atmospheres. The results for ground-based near-infrared secondary eclipse observations of three different exoplanets, TrES-3b, HAT-P-1b and WASP-33b, are presented which have been obtained with ground-based telescopes as part of the GROUSE project.

  11. Estimate of safe human exposure levels for lunar dust based on comparative benchmark dose modeling.

    Science.gov (United States)

    James, John T; Lam, Chiu-Wing; Santana, Patricia A; Scully, Robert R

    2013-04-01

    Brief exposures of Apollo astronauts to lunar dust occasionally elicited upper respiratory irritation; however, no limits were ever set for prolonged exposure to lunar dust. The United States and other spacefaring nations intend to return to the moon for extensive exploration within a few decades. In the meantime, habitats for that exploration, whether mobile or fixed, must be designed to limit human exposure to lunar dust to safe levels. Herein we estimate safe exposure limits for lunar dust collected during the Apollo 14 mission. We instilled three respirable-sized (∼2 μm mass median diameter) lunar dusts (two ground and one unground) and two standard dusts of widely different toxicities (quartz and TiO₂) into the respiratory system of rats. Rats in groups of six were given 0, 1, 2.5 or 7.5 mg of the test dust in a saline-Survanta® vehicle, and biochemical and cellular biomarkers of toxicity in lung lavage fluid were assayed 1 week and 1 month after instillation. By comparing the dose-response curves of sensitive biomarkers, we estimated safe exposure levels for astronauts and concluded that unground lunar dust and dust ground by two different methods were not toxicologically distinguishable. The safe exposure estimates were 1.3 ± 0.4 mg/m³ (jet-milled dust), 1.0 ± 0.5 mg/m³ (ball-milled dust) and 0.9 ± 0.3 mg/m³ (unground, natural dust). We estimate that 0.5-1 mg/m³ of lunar dust is safe for periodic human exposures during long stays in habitats on the lunar surface.
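
    The comparative logic, scaling an accepted limit for a reference dust by the ratio of benchmark doses measured in the same assay, can be sketched as follows. All numbers here are invented for illustration and are not the paper's values.

```python
def scaled_exposure_limit(limit_ref_mg_m3, bmd_test, bmd_ref):
    """Comparative potency: scale a reference dust's accepted exposure limit
    by the ratio of benchmark doses estimated in the same assay."""
    return limit_ref_mg_m3 * (bmd_test / bmd_ref)

# Hypothetical: reference dust limit 3 mg/m3; the test dust's benchmark dose
# is one third of the reference dust's in the instillation assay
limit = scaled_exposure_limit(3.0, bmd_test=2.0, bmd_ref=6.0)
print(limit)  # 1.0 mg/m3
```

    Bracketing the unknown dust between a low-toxicity reference (TiO₂) and a high-toxicity reference (quartz) gives a defensible range rather than a single point.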

  12. Calibration of Ground -based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Yordanova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...

  13. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  14. Illumination compensation in ground based hyperspectral imaging

    Science.gov (United States)

    Wendel, Alexander; Underwood, James

    2017-07-01

    Hyperspectral imaging has emerged as an important tool for analysing vegetation data in agricultural applications. Recently, low altitude and ground based hyperspectral imaging solutions have come to the fore, providing very high resolution data for mapping and studying large areas of crops in detail. However, these platforms introduce a unique set of challenges that need to be overcome to ensure consistent, accurate and timely acquisition of data. One particular problem is dealing with changes in environmental illumination while operating with natural light under cloud cover, which can have considerable effects on spectral shape. In the past this has been commonly achieved by imaging known reference targets at the time of data acquisition, direct measurement of irradiance, or atmospheric modelling. While capturing a reference panel continuously or very frequently allows accurate compensation for illumination changes, this is often not practical with ground based platforms, and impossible in aerial applications. This paper examines the use of an autonomous unmanned ground vehicle (UGV) to gather high resolution hyperspectral imaging data of crops under natural illumination. A process of illumination compensation is performed to extract the inherent reflectance properties of the crops, despite variable illumination. This work adapts a previously developed subspace model approach to reflectance and illumination recovery. Though tested on a ground vehicle in this paper, it is applicable to low altitude unmanned aerial hyperspectral imagery also. The method uses occasional observations of reference panel training data from within the same or other datasets, which enables a practical field protocol that minimises in-field manual labour. This paper tests the new approach, comparing it against traditional methods. Several illumination compensation protocols for high volume ground based data collection are presented based on the results. The findings in this paper are
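
    The core of reference-panel illumination compensation can be sketched in a few lines: observed radiance is modelled as reflectance times illumination, the per-band illumination spectrum is estimated from a panel of known reflectance, and then divided out. This is a toy version under stated assumptions; the paper's subspace model additionally handles the case where panel observations are only occasional.

```python
import numpy as np

def compensate(raw, panel_obs, panel_reflectance):
    """Estimate per-band illumination from a reference panel of known
    reflectance, then divide it out to recover scene reflectance."""
    illumination = panel_obs / panel_reflectance
    return raw / illumination

illum = np.array([1.0, 0.8, 0.6, 0.9, 1.2])      # unknown in the field
true_refl = np.array([0.3, 0.4, 0.5, 0.2, 0.1])  # what we want to recover
panel_refl = np.full(5, 0.99)                    # near-Lambertian white panel

raw_pixel = true_refl * illum                    # what the camera records
panel_pixel = panel_refl * illum

recovered = compensate(raw_pixel, panel_pixel, panel_refl)
print(np.allclose(recovered, true_refl))  # True
```

    The practical difficulty the paper addresses is that `panel_pixel` is rarely captured at the same instant as the scene, so the illumination must be interpolated or modelled between panel observations.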

  15. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  16. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
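
    A single-variable model of the kind described, cost as a power law in aperture diameter, is conventionally fitted by linear regression in log-log space. The data points below are made up for illustration (they are not the paper's dataset, and the fitted exponent is only meant to echo the classic roughly D^2.7 scaling often quoted for ground-based telescopes).

```python
import numpy as np

# Hypothetical (aperture diameter m, cost M$) pairs, not the paper's data
data = np.array([[2.0, 8.0], [4.0, 52.0], [8.0, 338.0], [10.0, 617.0]])
logD, logC = np.log(data[:, 0]), np.log(data[:, 1])

b, log_a = np.polyfit(logD, logC, 1)  # slope = cost-diameter exponent
a = np.exp(log_a)
print(f"cost ~ {a:.2f} * D^{b:.2f}")  # exponent near the classic ~2.7
```

    A lower fitted exponent on recent telescopes is exactly the signature of the "reduced historical cost curve" the abstract reports.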

  17. Space weather effects on ground based technology

    Science.gov (United States)

    Clark, T.

    Space weather can affect a variety of forms of ground-based technology, usually as a result of either the direct effects of the varying geomagnetic field, or as a result of the induced electric field that accompanies such variations. Technologies affected directly by geomagnetic variations include magnetic measurements made during geophysical surveys, and navigation relying on the geomagnetic field as a direction reference, a method that is particularly common in the surveying of well-bores in the oil industry. The most obvious technology affected by induced electric fields during magnetic storms is electric power transmission, where the example of the blackout in Quebec during the March 1989 magnetic storm is widely known. Additionally, space weather effects must be taken into account in the design of active cathodic protection systems on pipelines to protect them against corrosion. Long-distance telecommunication cables may also have to be designed to cope with space weather related effects. This paper reviews the effects of space weather in these different areas of ground-based technology, and provides examples of how mitigation against hazards may be achieved. (The paper does not include the effects of space weather on radio communication or satellite navigation systems).

  18. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W.

    1993-01-01

    -tailed hawk, osprey) (scientific names for both the mammalian and avian species are presented in Appendix B). [In this document, NOAEL refers to both dose (mg contaminant per kg animal body weight per day) and concentration (mg contaminant per kg of food or L of drinking water)]. The 20 wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. The chemicals are some of those that occur at U.S. Department of Energy (DOE) waste sites. The NOAEL-based benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species; LOAEL-based benchmarks represent threshold levels at which adverse effects are likely to become evident. These benchmarks consider contaminant exposure through oral ingestion of contaminated media only. Exposure through inhalation and/or direct dermal exposure are not considered in this report.
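
    Benchmarks of this kind combine a dietary dose estimate with a test-species NOAEL scaled to the wildlife species' body weight. The sketch below is a generic illustration under stated assumptions: the quarter-power scaling exponent is one common cross-species convention, and all species names and numbers are hypothetical, not values from this report.

```python
def scaled_noael(noael_test, bw_test_kg, bw_wildlife_kg, exponent=0.25):
    """Cross-species NOAEL scaling (mg/kg body weight/day). The quarter-power
    exponent is one common convention -- an assumption here, not a quote."""
    return noael_test * (bw_test_kg / bw_wildlife_kg) ** exponent

def dietary_dose(conc_mg_per_kg_food, intake_kg_per_day, bw_kg):
    """Oral dose (mg/kg body weight/day) from ingestion of contaminated food."""
    return conc_mg_per_kg_food * intake_kg_per_day / bw_kg

# Hypothetical: rat NOAEL 10 mg/kg/d scaled to a 4.5 kg fox-sized mammal
benchmark = scaled_noael(10.0, bw_test_kg=0.35, bw_wildlife_kg=4.5)
dose = dietary_dose(conc_mg_per_kg_food=5.0, intake_kg_per_day=0.4, bw_kg=4.5)
print(dose < benchmark)  # exposure below the NOAEL-based benchmark
```

    Comparing the estimated dietary dose against the scaled NOAEL (and, for threshold effects, the scaled LOAEL) is the screening comparison these benchmarks support.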

  19. Library Benchmarking

    Directory of Open Access Journals (Sweden)

    Wiji Suwarno

    2017-02-01

    The term benchmarking is encountered in the implementation of total quality management (TQM, in Indonesian termed holistic quality management), because benchmarking is a tool for finding ideas and learning from other libraries. Benchmarking is a systematic and continuous process of measuring and comparing an organization's business processes, in order to obtain information that can help the organization improve its performance.

  20. Interactive benchmarking

    DEFF Research Database (Denmark)

    Lawson, Lartey; Nielsen, Kurt

    2005-01-01

    We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional...... in the suggested benchmarking tool. The study investigates how different characteristics on dairy farms influences the technical efficiency....

  1. RUNE benchmarks

    DEFF Research Database (Denmark)

    Peña, Alfredo

    This report contains the description of a number of benchmarks with the purpose of evaluating flow models for near-shore wind resource estimation. The benchmarks are designed based on the comprehensive database of observations that the RUNE coastal experiment established from onshore lidar...

  2. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...... in order to obtain a unique selection...

  3. SCIENTIFIC EFFICIENCY OF GROUND-BASED TELESCOPES

    International Nuclear Information System (INIS)

    Abt, Helmut A.

    2012-01-01

    I scanned the six major astronomical journals of 2008 for all 1589 papers that are based on new data obtained from ground-based optical/IR telescopes worldwide. Then I collected data on numbers of papers, citations to them in 3+ years, the most-cited papers, and annual operating costs. These data are assigned to four groups by telescope aperture. For instance, papers from telescopes with an aperture >7 m average 1.29 times as many citations as those from telescopes with an aperture of 2 to 7 m. I wonder why the large telescopes do so relatively poorly and suggest possible reasons. I also found that papers based on archival data, such as the Sloan Digital Sky Survey, amount to 10.6% as many papers and receive 20.6% as many citations as those based on new data. Also, the 577.2 papers based on radio data amount to 36.3% as many papers and received 33.6% as many citations as the 1589 papers based on optical/IR telescopes.

  4. WLUP benchmarks

    International Nuclear Information System (INIS)

    Leszczynski, Francisco

    2002-01-01

    The IAEA-WIMS Library Update Project (WLUP) is in its final stage; the final library will be released in 2002. It is the result of research and development by more than ten investigators over 10 years. The organization of benchmarks for testing and choosing the best set of data has been coordinated by the author of this paper. The organization, naming conventions, contents and documentation of the WLUP benchmarks are presented, together with an updated list of the main parameters for all cases. First, the benchmark objectives and types are given. Then, comparisons of results from different WIMSD libraries are included. Finally, the program QVALUE for analysis and plotting of results is described. Some examples are given. The set of benchmarks implemented in this work is a fundamental tool for testing new multigroup libraries. (author)

  5. Using an Individual Procedure Score Before and After the Advanced Surgical Skills Exposure for Trauma Course Training to Benchmark a Hemorrhage-Control Performance Metric.

    Science.gov (United States)

    Mackenzie, Colin F; Garofalo, Evan; Shackelford, Stacy; Shalin, Valerie; Pugh, Kristy; Chen, Hegang; Puche, Adam; Pasley, Jason; Sarani, Babak; Henry, Sharon; Bowyer, Mark

    2015-01-01

    Test with an individual procedure score (IPS) to assess whether an unpreserved cadaver trauma training course, including upper and lower limb vascular exposure, improves correct identification of surgical landmarks and underlying anatomy, and shortens time to vascular control. Prospective study of performance of 3 vascular exposure and control procedures (axillary, brachial, and femoral arteries) using IPS metrics by 2 colocated and trained evaluators before and after training with the Advanced Surgical Skills Exposure for Trauma (ASSET) course. IPS, including identification of anatomical landmarks, incisions, underlying structures, and time to completion of each procedure, was compared before and after training using repeated measurement models. Audio-video instrumented cadaver laboratory at University of Maryland School of Medicine. A total of 41 second to sixth year surgical residents from surgical programs throughout the Mid-Atlantic States who had not previously taken the ASSET course were enrolled; 40 completed the pre- and post-ASSET performance evaluations. After ASSET training, all components of IPS increased and time shortened for each of the 3 artery exposures. Procedure steps performed correctly increased 57%, anatomical knowledge increased 43%, and time from skin incision to passage of a vessel loop twice around the correct vessel decreased by a mean of 2.5 minutes. An overall vascular trauma readiness index, a comprehensive IPS score for 3 procedures, increased 28% with ASSET training. Improved knowledge of surface landmarks and underlying anatomy is associated with increased IPS, faster procedures, more accurate incision placement, and successful vascular control. Structural recognition during specific procedural steps and anatomical knowledge were key points learned during the ASSET course.
Such training may accelerate acquisition of specific trauma surgery skills to compensate for shortened training hours, infrequent exposure to major vascular injuries, or when just

  6. Regulatory Benchmarking

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    2017-01-01

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The appli......Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators....... The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...
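
    An input-oriented, constant-returns-to-scale DEA efficiency score of the kind used in such regulation can be computed as a small linear program. This is the generic textbook CCR formulation, not any regulator's actual model, and the operator data below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(x, y, unit):
    """Input-oriented CCR (constant returns to scale) DEA efficiency of
    `unit`. x: (n_units, n_inputs) costs; y: (n_units, n_outputs) services."""
    n, m = x.shape
    s = y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimise theta
    A, b = [], []
    for i in range(m):                           # sum(lam * x_i) <= theta * x_i0
        A.append(np.concatenate(([-x[unit, i]], x[:, i])))
        b.append(0.0)
    for r in range(s):                           # sum(lam * y_r) >= y_r0
        A.append(np.concatenate(([0.0], -y[:, r])))
        b.append(-y[unit, r])
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.x[0]

# Three hypothetical grid operators: input = cost, output = energy delivered
x = np.array([[10.0], [20.0], [15.0]])
y = np.array([[100.0], [150.0], [150.0]])
eff = [round(float(dea_efficiency(x, y, k)), 2) for k in range(3)]
print(eff)  # [1.0, 0.75, 1.0]
```

    In a revenue-cap setting, an operator with efficiency 0.75 would be expected to close (part of) a 25% cost gap to the frontier, which is exactly why the data-validation and outlier-detection steps the abstract mentions matter so much.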

  7. Regulatory Benchmarking

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    2017-01-01

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The appli......Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators....... The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...

  8. Tissue Engineering of Cartilage on Ground-Based Facilities

    Science.gov (United States)

    Aleshcheva, Ganna; Bauer, Johann; Hemmersbach, Ruth; Egli, Marcel; Wehland, Markus; Grimm, Daniela

    2016-06-01

    Investigations under simulated microgravity offer the opportunity for a better understanding of the influence of altered gravity on cells and the scaffold-free three-dimensional (3D) tissue formation. To investigate the short-term influence, human chondrocytes were cultivated for 2 h, 4 h, 16 h, and 24 h on a 2D Fast-Rotating Clinostat (FRC) in DMEM/F-12 medium supplemented with 10 % FCS. We detected holes in the vimentin network, perinuclear accumulations of vimentin after 2 h, and changes in the chondrocytes shape visualised by F-actin staining after 4 h of FRC-exposure. Scaffold-free cultivation of chondrocytes for 7 d on the Random Positioning Machine (RPM), the FRC and the Rotating Wall Vessel (RWV) resulted in spheroid formation, a phenomenon already known from spaceflight experiments with chondrocytes (MIR Space Station) and thyroid cancer cells (SimBox/Shenzhou-8 space mission). The experiments enabled by the ESA-CORA-GBF programme gave us an optimal opportunity to study gravity-related cellular processes, validate ground-based facilities for our chosen cell system, and prepare long-term experiments under real microgravity conditions in space.

  9. Benchmarking of Percutaneous Injuries at the Ministry of Health Hospitals of Saudi Arabia in Comparison with the United States Hospitals Participating in Exposure Prevention Information Network (EPINet™)

    Directory of Open Access Journals (Sweden)

    ZA Memish

    2015-01-01

    Background: Exposure to blood-borne pathogens from needle-stick and sharps injuries continues to pose a significant risk to health care workers. These events are of concern because of the risk of transmitting blood-borne diseases such as hepatitis B virus, hepatitis C virus, and the human immunodeficiency virus. Objective: To benchmark different risk factors associated with needle-stick incidents among health care workers in the Ministry of Health hospitals in the Kingdom of Saudi Arabia compared to the US hospitals participating in the Exposure Prevention Information Network (EPINet™). Methods: Prospective surveillance of needle-stick and sharps incidents carried out during the year 2012 using EPINet™ version 1.5, which provides a uniform needle-stick and sharp-injury report form. Results: The annual percutaneous incidents (PIs) rate per 100 occupied beds was 3.2 at the studied MOH hospitals. Nurses were the job category most affected by PIs (59.4%). Most PIs happened in patients' wards in the Ministry of Health hospitals (34.6%). Disposable syringes were the most common cause of PIs (47.2%). Most PIs occurred during use of the syringes (36.4%). Conclusion: Among health care workers, nurses and physicians appear especially at risk of exposure to PIs. Important risk factors for injuries include working in patient rooms and using disposable syringes and devices without safety features. Preventive strategies are warranted, such as continuous training of health care workers with special emphasis on nurses and physicians, encouragement of reporting of such incidents, observation of sharps handling and use, and implementation of safety devices.

  10. KSC ADVANCED GROUND BASED FIELD MILL V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Ground Based Field Mill (AGBFM) network consists of 34 (31 operational) field mills located at Kennedy Space Center (KSC), Florida. The field mills...

  11. Altered operant responding for motor reinforcement and the determination of benchmark doses following perinatal exposure to low-level 2,3,7,8-tetrachlorodibenzo-p-dioxin.

    Science.gov (United States)

    Markowski, V P; Zareba, G; Stern, S; Cox, C; Weiss, B

    2001-06-01

    Pregnant Holtzman rats were exposed to a single oral dose of 0, 20, 60, or 180 ng/kg 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) on the 18th day of gestation. Their adult female offspring were trained to respond on a lever for brief opportunities to run in specially designed running wheels. Once they had begun responding on a fixed-ratio 1 (FR1) schedule of reinforcement, the fixed-ratio requirement for lever pressing was increased at five-session intervals to values of FR2, FR5, FR10, FR20, and FR30. We examined vaginal cytology after each behavior session to track estrous cyclicity. Under each of the FR values, perinatal TCDD exposure produced a significant dose-related reduction in the number of earned opportunities to run, the lever response rate, and the total number of revolutions in the wheel. Estrous cyclicity was not affected. Because of the consistent dose-response relationship at all FR values, we used the behavioral data to calculate benchmark doses based on displacements from modeled zero-dose performance of 1% (ED(01)) and 10% (ED(10)), as determined by a quadratic fit to the dose-response function. The mean ED(10) benchmark dose for earned run opportunities was 10.13 ng/kg with a 95% lower bound of 5.77 ng/kg. The corresponding ED(01) was 0.98 ng/kg with a 95% lower bound of 0.83 ng/kg. The mean ED(10) for total wheel revolutions was calculated as 7.32 ng/kg with a 95% lower bound of 5.41 ng/kg. The corresponding ED(01) was 0.71 ng/kg with a 95% lower bound of 0.60 ng/kg. These values should be viewed from the perspective of current human body burdens, whose average value, based on TCDD toxic equivalents, has been calculated as 13 ng/kg.
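
    The benchmark-dose calculation described, finding the dose at which a fitted quadratic falls 1% or 10% below the modeled zero-dose performance, reduces to solving a quadratic equation. The dose-response numbers below are invented for illustration; they only mimic the study's dose design and do not reproduce its estimates.

```python
import numpy as np

def benchmark_dose_quadratic(b0, b1, b2, fraction):
    """Smallest positive dose at which f(d) = b0 + b1*d + b2*d^2 falls
    `fraction` below the modeled zero-dose response f(0) = b0."""
    drop = fraction * b0
    roots = np.roots([b2, b1, drop])  # solve b2*d^2 + b1*d + drop = 0
    real = roots[np.isreal(roots)].real
    return real[real > 0].min()

# Invented dose-response: dose (ng/kg) vs earned run opportunities
doses = np.array([0.0, 20.0, 60.0, 180.0])
resp = np.array([100.0, 88.0, 70.0, 35.0])
b2, b1, b0 = np.polyfit(doses, resp, 2)   # declining quadratic fit

ed10 = benchmark_dose_quadratic(b0, b1, b2, 0.10)
ed01 = benchmark_dose_quadratic(b0, b1, b2, 0.01)
print(round(ed01, 1), round(ed10, 1))     # ED01 well below ED10
```

    Because the response declines smoothly from zero dose, the ED(01) sits far below the ED(10), which is why the choice of benchmark response fraction matters for regulatory comparisons with body burdens.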

  12. Study of the unknown hemisphere of mercury by ground-based astronomical facilities

    Science.gov (United States)

    Ksanfomality, L. V.

    2011-08-01

    The short exposure method proved to be very productive in ground-based observations of Mercury. Telescopic observations with short exposures, together with computer codes for the processing of data arrays of many thousands of original electronic photos, make it possible to improve the resolution of images from ground-based instruments to almost the diffraction limit. The resulting composite images are comparable with images from spacecraft approaching from a distance of about 1 million km. This paper presents images of the hemisphere of Mercury in longitude sectors 90°-180°W, 215°-350°W, and 50°-90°W, including, among others, areas not covered by spacecraft cameras. For the first time a giant S basin was discovered in the sector of longitudes 250°-290°W, which is the largest formation of this type on terrestrial planets. Mercury exhibits strong phase effects. As a result, the view of the surface changes completely with the change in the planetary phase. But the choice of the phase in studies using spacecraft is limited by orbital characteristics of the mission. Thus, ground-based observations of the planet provide valuable support.
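
    The short-exposure ("lucky imaging") idea, keeping only the sharpest frames and co-adding them after alignment, can be sketched as below. The sharpness metric, the alignment on the brightest pixel, and the synthetic frames are simplifying assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np

def sharpness(frame):
    """Toy sharpness metric: variance of the image gradients."""
    gy, gx = np.gradient(frame)
    return np.var(gx) + np.var(gy)

def lucky_stack(frames, keep_fraction=0.1):
    """Keep the sharpest frames, centre each on its brightest pixel,
    and average them (a toy shift-and-add)."""
    order = sorted(range(len(frames)), key=lambda i: -sharpness(frames[i]))
    keep = order[:max(1, int(len(frames) * keep_fraction))]
    h, w = frames[0].shape
    acc = np.zeros((h, w))
    for i in keep:
        py, px = np.unravel_index(np.argmax(frames[i]), (h, w))
        acc += np.roll(np.roll(frames[i], h // 2 - py, axis=0), w // 2 - px, axis=1)
    return acc / len(keep)

rng = np.random.default_rng(1)
frames = []
for _ in range(50):
    f = rng.normal(0.0, 0.1, (32, 32))
    y, x = rng.integers(8, 24, size=2)
    f[y, x] += 5.0                    # a point source wandering due to seeing
    frames.append(f)

stacked = lucky_stack(frames)
peak = np.unravel_index(np.argmax(stacked), stacked.shape)
print(int(peak[0]), int(peak[1]))     # 16 16
```

    Selecting a small fraction of thousands of frames and co-adding them is what lets the composite approach the diffraction limit despite atmospheric seeing.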

  13. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  14. The COROT ground-based archive and access system

    Science.gov (United States)

    Solano, E.; González-Riestra, R.; Catala, C.; Baglin, A.

    2002-01-01

    A prototype of the COROT ground-based archive and access system is presented here. The system has been developed at the Laboratorio de Astrofisica Espacial y Fisica Fundamental (LAEFF) and is based on the experience gained there with the INES (IUE Newly Extracted Spectra) Archive.

  15. Benchmarking in Foodservice Operations

    National Research Council Canada - National Science Library

    Johnson, Bonnie

    1998-01-01

    The objective of this study was to identify usage of foodservice performance measures, important activities in foodservice benchmarking, and benchmarking attitudes, beliefs, and practices by foodservice directors...

  16. Benchmarking and Performance Measurement.

    Science.gov (United States)

    Town, J. Stephen

    This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…

  17. High energy astrophysics with ground-based gamma ray detectors

    International Nuclear Information System (INIS)

    Aharonian, F; Buckley, J; Kifune, T; Sinnis, G

    2008-01-01

    Recent advances in ground-based gamma ray astronomy have led to the discovery of more than 70 sources of very high energy (Eγ ≥ 100 GeV) gamma rays, falling into a number of source populations including pulsar wind nebulae, shell type supernova remnants, Wolf-Rayet stars, giant molecular clouds, binary systems, the Galactic Center, active galactic nuclei and 'dark' (yet unidentified) galactic objects. We summarize the history of TeV gamma ray astronomy up to the current status of the field including a description of experimental techniques and highlight recent astrophysical results. We also discuss the potential of ground-based gamma ray astronomy for future discoveries and describe possible directions for future instrumental developments.

  18. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    International Nuclear Information System (INIS)

    Casey, Leslie A.

    2014-01-01

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  19. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Leslie A.

    2014-01-13

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  20. Automatic Barometric Updates from Ground-Based Navigational Aids

    Science.gov (United States)

    1990-03-12

    Automatic Barometric Updates from Ground-Based Navigational Aids. US Department of Transportation, Federal Aviation Administration, Office of Safety... tighter vertical spacing controls, particularly for operations near Terminal Control Areas (TCAs), Airport Radar Service Areas (ARSAs), military climb and... E.F., Ruth, J.C., and Williges, B.H. (1987). Speech Controls and Displays. In Salvendy, G., Ed., Handbook of Human Factors/Ergonomics, New York, John...

  1. Biomass burning aerosols characterization from ground based and profiling measurements

    Science.gov (United States)

    Marin, Cristina; Vasilescu, Jeni; Marmureanu, Luminita; Ene, Dragos; Preda, Liliana; Mihailescu, Mona

    2018-04-01

    The study goal is to assess the chemical and optical properties of aerosols present in lofted layers and at the ground. The biomass burning aerosols were evaluated in low-level layers from multi-wavelength lidar measurements, while the chemical composition at ground level was assessed using an Aerosol Chemical Speciation Monitor (ACSM) and an Aethalometer. Classification of aerosol type and specific organic markers were used to explore the potential to sense particles of the same origin at ground level and along vertical profiles.

  2. Silicon carbide optics for space and ground based astronomical telescopes

    Science.gov (United States)

    Robichaud, Joseph; Sampath, Deepak; Wainer, Chris; Schwartz, Jay; Peton, Craig; Mix, Steve; Heller, Court

    2012-09-01

    Silicon Carbide (SiC) optical materials are being applied widely for both space-based and ground-based optical telescopes. The material provides a superior stiffness-to-weight ratio, which is an important metric for the design and fabrication of lightweight space telescopes. The material also has superior thermal properties, with a low coefficient of thermal expansion and a high thermal conductivity. These thermal advantages are important for both space-based and ground-based systems, which typically need to operate under stressing thermal conditions. The paper will review L-3 Integrated Optical Systems - SSG's (L-3 SSG) work in developing SiC optics and SiC optical systems for astronomical observing systems. L-3 SSG has been fielding SiC optical components and systems for over 25 years. Space systems described will emphasize the recently launched Long Range Reconnaissance Imager (LORRI) developed for JHU-APL and NASA-GSFC. The review of ground-based applications of SiC will include L-3 IOS-Brashear's current contract to provide the 0.65 meter diameter, aspheric SiC secondary mirror for the Advanced Technology Solar Telescope (ATST).

  3. Benchmarking in the Netherlands

    International Nuclear Information System (INIS)

    1999-01-01

    In two articles an overview is given of benchmarking activities in the Dutch industry and energy sector. In benchmarking, the operational processes of competing businesses are compared in order to improve one's own performance. Benchmark covenants on energy efficiency between the Dutch government and industrial sectors have contributed to growth in the number of benchmark surveys in the energy-intensive industry in the Netherlands. However, some doubt the effectiveness of the benchmark studies

  4. Augmenting WFIRST Microlensing with a Ground-Based Telescope Network

    Science.gov (United States)

    Zhu, Wei; Gould, Andrew

    2016-06-01

    Augmenting the Wide Field Infrared Survey Telescope (WFIRST) microlensing campaigns with intensive observations from a ground-based network of wide-field survey telescopes would have several major advantages. First, it would enable full two-dimensional (2-D) vector microlens parallax measurements for a substantial fraction of low-mass lenses as well as planetary and binary events that show caustic crossing features. For a significant fraction of the free-floating planet (FFP) events and all caustic-crossing planetary/binary events, these 2-D parallax measurements directly lead to complete solutions (mass, distance, transverse velocity) of the lens object (or lens system). For even more events, the complementary ground-based observations will yield 1-D parallax measurements. Together with the 1-D parallaxes from WFIRST alone, they can probe the entire mass range M > M_Earth. For luminous lenses, such 1-D parallax measurements can be promoted to complete solutions (mass, distance, transverse velocity) by high-resolution imaging. This would provide crucial information not only about the hosts of planets and other lenses, but also enable a much more precise Galactic model. Other benefits of such a survey include improved understanding of binaries (particularly with low mass primaries), and sensitivity to distant ice-giant and gas-giant companions of WFIRST lenses that cannot be detected by WFIRST itself due to its restricted observing windows. Existing ground-based microlensing surveys can be employed if WFIRST is pointed at lower-extinction fields than is currently envisaged. This would come at some cost to the event rate. Therefore the benefits of improved characterization of lenses must be weighed against these costs.

  5. Lidar to lidar calibration of Ground-based Lidar

    DEFF Research Database (Denmark)

    Fernandez Garcia, Sergio; Courtney, Michael

    This report presents the results of the lidar-to-lidar calibration performed for a ground-based lidar. Calibration is here understood as the establishment of a relation between the reference lidar wind speed measurements, with measurement uncertainties provided by a measurement standard, and the corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10-minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with those from the reference lidar is given for information only.

  6. Aquatic Life Benchmarks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...

  7. MODELING ATMOSPHERIC EMISSION FOR CMB GROUND-BASED OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Errard, J.; Borrill, J. [Space Sciences Laboratory, University of California, Berkeley, CA 94720 (United States); Ade, P. A. R. [School of Physics and Astronomy, Cardiff University, Cardiff CF10 3XQ (United Kingdom); Akiba, Y.; Chinone, Y. [High Energy Accelerator Research Organization (KEK), Tsukuba, Ibaraki 305-0801 (Japan); Arnold, K.; Atlas, M.; Barron, D.; Elleflot, T. [Department of Physics, University of California, San Diego, CA 92093-0424 (United States); Baccigalupi, C.; Fabbian, G. [International School for Advanced Studies (SISSA), Trieste I-34014 (Italy); Boettger, D. [Department of Astronomy, Pontifica Universidad Catolica de Chile (Chile); Chapman, S. [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, NS, B3H 4R2 (Canada); Cukierman, A. [Department of Physics, University of California, Berkeley, CA 94720 (United States); Delabrouille, J. [AstroParticule et Cosmologie, Univ Paris Diderot, CNRS/IN2P3, CEA/Irfu, Obs de Paris, Sorbonne Paris Cité (France); Dobbs, M.; Gilbert, A. [Physics Department, McGill University, Montreal, QC H3A 0G4 (Canada); Ducout, A.; Feeney, S. [Department of Physics, Imperial College London, London SW7 2AZ (United Kingdom); Feng, C. [Department of Physics and Astronomy, University of California, Irvine (United States); and others

    2015-08-10

    Atmosphere is one of the most important noise sources for ground-based cosmic microwave background (CMB) experiments. By increasing the optical loading on the detectors, it amplifies their effective noise, while its fluctuations introduce spatial and temporal correlations between detected signals. We present a physically motivated 3D model of the atmosphere's total intensity emission at millimeter and sub-millimeter wavelengths. We derive a new analytical estimate for the correlation between detectors' time-ordered data as a function of the instrument and survey design, as well as several atmospheric parameters such as wind, relative humidity, temperature and turbulence characteristics. Using an original numerical computation, we examine the effect of each physical parameter on the correlations in the time series of a given experiment. We then use a parametric-likelihood approach to validate the modeling and estimate atmosphere parameters from the Polarbear-I project's first-season data set. We derive a new 1.0% upper limit on the linear polarization fraction of atmospheric emission. We also compare our results to previous studies and weather station measurements. The proposed model can be used for realistic simulations of future ground-based CMB observations.

  8. Strong Sporadic E Occurrence Detected by Ground-Based GNSS

    Science.gov (United States)

    Sun, Wenjie; Ning, Baiqi; Yue, Xinan; Li, Guozhu; Hu, Lianhuan; Chang, Shoumin; Lan, Jiaping; Zhu, Zhengping; Zhao, Biqiang; Lin, Jian

    2018-04-01

    The ionospheric sporadic E (Es) layer has a significant impact on radio wave propagation. The traditional instruments employed for Es layer observation, for example, ionosondes, are not densely enough distributed to resolve the morphology and dynamics of the Es layer's spatial distribution. The ground-based Global Navigation Satellite System (GNSS) technique is expected to shed light on the understanding of regional strong Es occurrence, owing to the facts that the critical frequency (foEs) of a strong Es structure is usually high enough to cause pulse-like disturbances in GNSS total electron content (TEC), and that a large number of GNSS receivers have been deployed all over the world. Based on the Chinese ground-based GNSS networks, including the Crustal Movement Observation Network of China and the Beidou Ionospheric Observation Network, a large-scale strong Es event was observed at middle latitudes over China. The strong Es, shown as a band-like structure in the southwest-northeast direction, extended more than 1,000 km. By statistically comparing Es occurrences identified from simultaneous ionosonde and GNSS TEC observations over middle-latitude China, we found that GNSS TEC can be employed to observe strong Es occurrence with an foEs threshold of 14 MHz.
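    The pulse-like TEC disturbances mentioned above suggest a simple detection sketch: detrend the TEC time series with a centered running mean and flag samples that depart from the baseline by more than a threshold. The window length, threshold, and synthetic series below are assumptions for illustration, not the paper's actual processing chain.

```python
def detect_pulses(tec, window=5, threshold=0.5):
    """Flag sample indices where the TEC series departs from a centered
    running mean by more than `threshold` TEC units (pulse-like event)."""
    half = window // 2
    hits = []
    for i in range(half, len(tec) - half):
        baseline = sum(tec[i - half:i + half + 1]) / window
        if abs(tec[i] - baseline) > threshold:
            hits.append(i)
    return hits

# Smooth background TEC with one pulse-like disturbance injected at i = 10
series = [20.0 + 0.01 * i for i in range(20)]
series[10] += 2.0                    # sporadic-E-like TEC pulse
print(detect_pulses(series))         # → [10]
```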

  9. Benchmarking for Higher Education.

    Science.gov (United States)

    Jackson, Norman, Ed.; Lund, Helen, Ed.

    The chapters in this collection explore the concept of benchmarking as it is being used and developed in higher education (HE). Case studies and reviews show how universities in the United Kingdom are using benchmarking to aid in self-regulation and self-improvement. The chapters are: (1) "Introduction to Benchmarking" (Norman Jackson…

  10. Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW) data set measures atmospheric water vapor using ground-based...

  11. Developing integrated benchmarks for DOE performance measurement

    Energy Technology Data Exchange (ETDEWEB)

    Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.

    1992-09-30

    The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome factors in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which then could become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazard and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrently with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.

  12. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other. The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  13. Ground-based PIV and numerical flow visualization results from the Surface Tension Driven Convection Experiment

    Science.gov (United States)

    Pline, Alexander D.; Werner, Mark P.; Hsieh, Kwang-Chung

    1991-01-01

    The Surface Tension Driven Convection Experiment (STDCE) is a Space Transportation System flight experiment to study both transient and steady thermocapillary fluid flows aboard the United States Microgravity Laboratory-1 (USML-1) Spacelab mission planned for June 1992. One of the components of data collected during the experiment is a video record of the flow field. This qualitative data is then quantified using an all-electric, two-dimensional Particle Image Velocimetry (PIV) technique called Particle Displacement Tracking (PDT), which uses a simple space-domain particle tracking algorithm. Results using the ground-based STDCE hardware, with a radiant flux heating mode, and the PDT system are compared to numerical solutions obtained by solving the axisymmetric Navier-Stokes equations with a deformable free surface. The PDT technique is successful in producing a velocity vector field and corresponding stream function from the raw video data which satisfactorily represents the physical flow. A numerical program is used to compute the velocity field and corresponding stream function under identical conditions. Both the PDT system and numerical results were compared to a streak photograph, used as a benchmark, with good correlation.
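    The space-domain particle tracking at the heart of PDT can be sketched as nearest-neighbour matching of particle centroids between two video frames, yielding one displacement (velocity) vector per particle. The matching rule and the synthetic centroids below are illustrative assumptions, not the PDT implementation itself.

```python
import math

def track_particles(frame1, frame2):
    """Pair each particle centroid in frame1 with its nearest neighbour
    in frame2 (simple space-domain matching) and return the displacement
    vector for each pair."""
    vectors = []
    for x1, y1 in frame1:
        x2, y2 = min(frame2, key=lambda p: math.hypot(p[0] - x1, p[1] - y1))
        vectors.append((x2 - x1, y2 - y1))
    return vectors

# Synthetic flow: every particle moves by (+1.0, -0.5) between frames
frame1 = [(0.0, 0.0), (3.0, 4.0), (10.0, 2.0)]
frame2 = [(x + 1.0, y - 0.5) for x, y in frame1]
print(track_particles(frame1, frame2))   # → [(1.0, -0.5)] for all three
```

    Nearest-neighbour matching is only reliable when inter-frame displacements are small compared to inter-particle spacing, which is why PDT uses closely spaced video frames.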

  14. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    Full Text Available In the face of global competition and rising challenges that higher education institutions (HEIs meet, it is imperative to increase innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess institution’s competitive position and learn from the best in order to improve. The primary purpose of the paper is to present in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating premises of using benchmarking in HEIs. It also contains detailed examination of types, approaches and scope of benchmarking initiatives. The thorough insight of benchmarking applications enabled developing classification of benchmarking undertakings in HEIs. The paper includes review of the most recent benchmarking projects and relating them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity. The presented examples were chosen in order to exemplify different approaches to benchmarking in higher education setting. The study was performed on the basis of the published reports from benchmarking projects, scientific literature and the experience of the author from the active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.

  15. Reconstruction of Sky Illumination Domes from Ground-Based Panoramas

    Science.gov (United States)

    Coubard, F.; Lelégard, L.; Brédif, M.; Paparoditis, N.; Briottet, X.

    2012-07-01

    The knowledge of the sky illumination is important for radiometric corrections and for computer graphics applications such as relighting or augmented reality. We propose an approach to compute environment maps, representing the sky radiance, from a set of ground-based images acquired by a panoramic acquisition system, for instance a mobile-mapping system. These images can be affected by important radiometric artifacts, such as bloom or overexposure. A Perez radiance model is estimated with the blue sky pixels of the images, and used to compute additive corrections in order to reduce these radiometric artifacts. The sky pixels are then aggregated in an environment map, which still suffers from discontinuities on stitching edges. The influence of the quality of estimated sky radiance on the simulated light signal is measured quantitatively on a simple synthetic urban scene; in our case, the maximal error for the total sensor radiance is about 10%.
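    The Perez radiance model used here expresses the relative radiance of a sky element as a gradation term (zenith-to-horizon variation) multiplied by an indicatrix term (circumsolar brightening). A minimal sketch of that product form, with illustrative clear-sky-like coefficients rather than values estimated from any image set:

```python
import math

def perez_relative_radiance(theta, gamma, a, b, c, d, e):
    """Perez all-weather sky model: relative radiance of a sky element at
    zenith angle `theta` (rad) and angular distance `gamma` (rad) from the
    sun, as gradation term times indicatrix term."""
    gradation = 1.0 + a * math.exp(b / max(math.cos(theta), 1e-6))
    indicatrix = 1.0 + c * math.exp(d * gamma) + e * math.cos(gamma) ** 2
    return gradation * indicatrix

# Illustrative clear-sky-like coefficients (assumed, not fitted values)
coeffs = (-1.0, -0.32, 10.0, -3.0, 0.45)
near_sun = perez_relative_radiance(0.5, 0.1, *coeffs)
far_sky = perez_relative_radiance(0.5, 1.5, *coeffs)
print(near_sun > far_sky)    # circumsolar brightening → True
```

    Fitting the five coefficients to the blue-sky pixels of the panoramas, as the paper describes, would turn this forward model into the estimation step.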

  16. Ground-based transmission line conductor motion sensor

    International Nuclear Information System (INIS)

    Jacobs, M.L.; Milano, U.

    1988-01-01

    A ground-based conductor-motion-sensing apparatus is provided for remotely sensing movement of electric-power transmission lines, particularly as would occur during the wind-induced condition known as galloping. The apparatus is comprised of a motion sensor and signal-generating means which are placed underneath a transmission line and will sense changes in the electric field around the line due to excessive line motion. The detector then signals a remote station when a condition of galloping is sensed. The apparatus of the present invention is advantageous over the line-mounted sensors of the prior art in that it is easier and less hazardous to install. The system can also be modified so that a signal will only be given when particular conditions, such as a specific temperature range, large-amplitude line motion, or excessive duration of the line motion, are occurring.

  17. RECONSTRUCTION OF SKY ILLUMINATION DOMES FROM GROUND-BASED PANORAMAS

    Directory of Open Access Journals (Sweden)

    F. Coubard

    2012-07-01

    The knowledge of the sky illumination is important for radiometric corrections and for computer graphics applications such as relighting or augmented reality. We propose an approach to compute environment maps, representing the sky radiance, from a set of ground-based images acquired by a panoramic acquisition system, for instance a mobile-mapping system. These images can be affected by important radiometric artifacts, such as bloom or overexposure. A Perez radiance model is estimated with the blue sky pixels of the images, and used to compute additive corrections in order to reduce these radiometric artifacts. The sky pixels are then aggregated in an environment map, which still suffers from discontinuities on stitching edges. The influence of the quality of estimated sky radiance on the simulated light signal is measured quantitatively on a simple synthetic urban scene; in our case, the maximal error for the total sensor radiance is about 10%.

  18. Satellite and Ground Based Monitoring of Aerosol Plumes

    International Nuclear Information System (INIS)

    Doyle, Martin; Dorling, Stephen

    2002-01-01

    Plumes of atmospheric aerosol have been studied using a range of satellite and ground-based techniques. The Sea-viewing Wide Field-of-view Sensor (SeaWiFS) has been used to observe plumes of sulphate aerosol and Saharan dust around the coast of the United Kingdom. Aerosol Optical Thickness (AOT) was retrieved from SeaWiFS for two events: a plume of Saharan dust transported over the United Kingdom from Western Africa, and a period of elevated sulphate experienced over the Eastern region of the UK. Patterns of AOT are discussed and related to the synoptic and mesoscale weather conditions. Further observation of the sulphate aerosol event was undertaken using the Advanced Very High Resolution Radiometer (AVHRR) instrument. Atmospheric back trajectories and weather conditions were studied in order to identify the meteorological conditions which led to this event. Co-located ground-based measurements of PM 10 and PM 2.5 were obtained for 4 sites within the UK, and PM 2.5/10 ratios were calculated in order to identify any unusually high or low ratios (indicating the dominant size fraction within the plume) during either of these events. Calculated percentiles of PM 2.5/10 ratios during the 2 events examined show that these events were notable within the record, but were in no way unique or unusual in the context of a 3 yr monitoring record. Visibility measurements for both episodes have been examined and show that visibility degradation occurred during both the sulphate aerosol and Saharan dust episodes.

  19. Mixed-field GCR Simulations for Radiobiological Research using Ground Based Accelerators

    Science.gov (United States)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis

    Space radiation is comprised of a large number of particle types and energies, which have differential ionization power, from high energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and for dosimetry, electronics parts, and shielding testing using mono-energetic beams of single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and the higher energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create the GCR Z-spectrum in major energy bins. After considering the GCR environment and the energy limitations of NSRL, a GCR reference field is proposed, based on extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding within 20 percent accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to relate chronic GCR exposures of up to 3 years to simulations with cell and animal models of human risks. We discuss possible approaches to mapping important biological time scales in experimental models using ground-based simulation with extended exposures of up to a few weeks and fractionation approaches at a GCR simulator.

  20. MCNP neutron benchmarks

    International Nuclear Information System (INIS)

    Hendricks, J.S.; Whalen, D.J.; Cardon, D.A.; Uhle, J.L.

    1991-01-01

    Over 50 neutron benchmark calculations have recently been completed as part of an ongoing program to validate the MCNP Monte Carlo radiation transport code. The new and significant aspects of this work are as follows: These calculations are the first attempt at a validation program for MCNP and the first official benchmarking of version 4 of the code. We believe the chosen set of benchmarks is a comprehensive set that may be useful for benchmarking other radiation transport codes and data libraries. These calculations provide insight into how well neutron transport calculations can be expected to model a wide variety of problems

  1. Bridge Testing With Ground-Based Interferometric Radar: Experimental Results

    International Nuclear Information System (INIS)

    Chiara, P.; Morelli, A.

    2010-01-01

    Research into innovative non-contact techniques for the vibration measurement of civil engineering structures (also for damage detection and structural health monitoring) is continually directed toward the optimization of measures and methods. Ground-Based Radar Interferometry (GBRI) represents the most recent technique available for static and dynamic control of structures and ground movements. Dynamic tests of bridges and buildings in operational conditions are currently performed: (a) to assess the conformity of the structure to the project design at the end of construction; (b) to identify the modal parameters (i.e. natural frequencies, mode shapes and damping ratios) and to check the variation of any modal parameters over the years; (c) to evaluate the amplitude of the structural response to special load conditions (i.e. strong winds, earthquakes, heavy railway or roadway loads). If such tests are carried out by using a non-contact technique (like GBRI), the classical issues of contact sensors (like accelerometers) are easily overcome. This paper presents and discusses the results of various tests carried out on full-scale bridges by using a Stepped Frequency-Continuous Wave radar system.
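    In radar interferometry the measured quantity is the interferometric phase, and line-of-sight displacement follows from the standard two-way relation d = (λ/4π)·Δφ. A minimal sketch, assuming an illustrative Ku-band wavelength (the record does not specify the radar's operating band):

```python
import math

def los_displacement(delta_phase_rad, wavelength_m):
    """Line-of-sight displacement from an interferometric phase change:
    d = (lambda / (4*pi)) * delta_phi, the two-way path relation."""
    return wavelength_m / (4.0 * math.pi) * delta_phase_rad

# Assumed Ku-band wavelength of ~17.4 mm (illustrative value)
wavelength = 0.0174
d = los_displacement(math.pi / 2, wavelength)   # quarter-cycle phase change
print(d * 1000)                                  # displacement in millimetres
```

    Sub-degree phase resolution thus translates to sub-millimetre displacement sensitivity, which is what makes GBRI competitive with contact sensors for bridge vibration monitoring.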

  2. Bridge Testing With Ground-Based Interferometric Radar: Experimental Results

    Science.gov (United States)

    Chiara, P.; Morelli, A.

    2010-05-01

    Research into innovative non-contact techniques for the vibration measurement of civil engineering structures (also for damage detection and structural health monitoring) is continually directed toward the optimization of measures and methods. Ground-Based Radar Interferometry (GBRI) represents the most recent technique available for static and dynamic control of structures and ground movements. Dynamic tests of bridges and buildings in operational conditions are currently performed: (a) to assess the conformity of the structure to the project design at the end of construction; (b) to identify the modal parameters (i.e. natural frequencies, mode shapes and damping ratios) and to check the variation of any modal parameters over the years; (c) to evaluate the amplitude of the structural response to special load conditions (i.e. strong winds, earthquakes, heavy railway or roadway loads). If such tests are carried out by using a non-contact technique (like GBRI), the classical issues of contact sensors (like accelerometers) are easily overcome. This paper presents and discusses the results of various tests carried out on full-scale bridges by using a Stepped Frequency-Continuous Wave radar system.

  3. Observing Tsunamis in the Ionosphere Using Ground Based GPS Measurements

    Science.gov (United States)

    Galvan, D. A.; Komjathy, A.; Song, Y. Tony; Stephens, P.; Hickey, M. P.; Foster, J.

    2011-01-01

    Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following recent seismic events, including the Tohoku tsunami of March 11, 2011. We observe fluctuations correlated in time, space, and wave properties with this tsunami in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the Tohoku tsunami near the epicenter, at Hawaii, and near the west coast of North America. Disturbance magnitudes are 1-10% of the background TEC value. Observations near the epicenter are compared to estimates of expected tsunami-driven TEC variations produced by Embry-Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement. The potential exists to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for future early warning systems.
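
    The band-pass step described above can be sketched with a crude filter built from the difference of two moving averages, which passes periods between the two window lengths. This is a stand-in for illustration only, not JPL's actual processing; the cadence, amplitudes, and window sizes below are invented.

```python
import math

def moving_average(x, w):
    """Centered moving average with a window of w samples."""
    h = w // 2
    out = []
    for i in range(len(x)):
        seg = x[max(0, i - h): i + h + 1]
        out.append(sum(seg) / len(seg))
    return out

def bandpass(series, short_win, long_win):
    """Crude band-pass: short-window mean minus long-window mean keeps
    periods roughly between the two window lengths."""
    return [s - l for s, l in zip(moving_average(series, short_win),
                                  moving_average(series, long_win))]

# Synthetic 30 s cadence TEC series: a slow background trend plus a
# 15-minute oscillation of 0.2 TECU (~1% of a 20 TECU background).
dt, n = 30.0, 720
tec = [20.0 + 2.0 * math.sin(2 * math.pi * i * dt / 21600)   # slow background
       + 0.2 * math.sin(2 * math.pi * i * dt / 900)          # gravity-wave band
       for i in range(n)]

filtered = bandpass(tec, short_win=5, long_win=61)   # pass ~2.5 to ~30 min
peak = max(abs(v) for v in filtered[100:-100])       # skip window edge effects
print(f"peak filtered amplitude: {peak:.2f} TECU")
```

    The recovered peak is close to the 0.2 TECU oscillation injected on top of the much larger background, mirroring the 1-10% disturbance magnitudes reported above.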

  4. A design for a ground-based data management system

    Science.gov (United States)

    Lambird, Barbara A.; Lavine, David

    1988-01-01

    An initial design for a ground-based data management system that includes intelligent data abstraction and cataloging is described. The large quantity of data on some current and future NASA missions leads to significant problems in providing scientists with quick access to relevant data. Human screening of data for potential relevance to a particular study is time-consuming and costly. Intelligent databases can provide automatic screening when given relevant scientific parameters and constraints. The data management system would provide, at a minimum, information on the availability and range of data, the types available, and the specific time periods covered, together with data quality information and related data sources. The system would inform the user about the primary types of screening, analysis, and presentation available. It would then aid the user in performing the desired tasks, so that the user need only specify the scientific parameters and objectives rather than the specific details for running a particular program. The design contains modules for data abstraction, catalog plan abstraction, a user-friendly interface, and expert systems for data handling, data evaluation, and application analysis. The emphasis is on developing general facilities for data representation, description, analysis, and presentation that can be easily used by scientists directly, thus bypassing the knowledge-acquisition bottleneck. Expert-system technology is used for many aspects of the data management system, including the direct user interface, the interface to the data analysis routines, and the analysis of instrument status.
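
    The automatic screening described above can be illustrated with a toy catalog query; the record fields, values, and thresholds below are invented for this sketch and are not part of the proposed design.

```python
# Hypothetical catalog entries; field names and values are illustrative only.
catalog = [
    {"id": "A1", "instrument": "magnetometer", "start": 1988.1, "end": 1988.4, "quality": 0.9},
    {"id": "A2", "instrument": "magnetometer", "start": 1988.5, "end": 1988.9, "quality": 0.4},
    {"id": "B1", "instrument": "imager",       "start": 1988.2, "end": 1988.3, "quality": 0.8},
]

def screen(catalog, instrument, t0, t1, min_quality):
    """Return records that overlap [t0, t1] and satisfy the scientist's
    constraints, so only potentially relevant data reach human review."""
    return [r for r in catalog
            if r["instrument"] == instrument
            and r["start"] <= t1 and r["end"] >= t0
            and r["quality"] >= min_quality]

hits = screen(catalog, "magnetometer", 1988.0, 1988.6, min_quality=0.5)
print([r["id"] for r in hits])  # → ['A1']
```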

  5. Use of ground-based wind profiles in mesoscale forecasting

    Science.gov (United States)

    Schlatter, Thomas W.

    1985-01-01

    A brief review is presented of recent uses of ground-based wind profile data in mesoscale forecasting. Some of the applications are in real time, and some are after the fact. Not all of the work mentioned here has been published yet, but references are given wherever possible. As Gage and Balsley (1978) point out, sensitive Doppler radars have been used to examine tropospheric wind profiles since the 1970s. It was not until the early 1980s, however, that the potential contribution of these instruments to operational forecasting and numerical weather prediction became apparent. Profiler winds and radiosonde winds compare favorably, usually within a few m/s in speed and 10 degrees in direction (see Hogg et al., 1983), but the obvious advantage of the profiler is its frequent (hourly or more often) sampling of the same volume. The rawinsonde balloon is launched only twice a day and drifts with the wind. In this paper, I will: (1) mention two operational uses of data from a wind profiling system developed jointly by the Wave Propagation and Aeronomy Laboratories of NOAA; (2) describe a number of displays of these same data on a workstation for mesoscale forecasting developed by the Program for Regional Observing and Forecasting Services (PROFS); and (3) explain some interesting diagnostic calculations performed by meteorologists of the Wave Propagation Laboratory.
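
    The profiler-radiosonde agreement quoted above (a few m/s in speed, 10 degrees in direction) comes from comparing paired wind measurements; a minimal sketch of such a comparison, with invented component values, is:

```python
import math

def wind_diff(u1, v1, u2, v2):
    """Speed difference (m/s) and smallest direction difference (deg)
    between two winds given as (u, v) components."""
    s1, s2 = math.hypot(u1, v1), math.hypot(u2, v2)
    # Meteorological convention: direction the wind blows FROM, clockwise from north.
    d1 = math.degrees(math.atan2(-u1, -v1)) % 360
    d2 = math.degrees(math.atan2(-u2, -v2)) % 360
    dd = abs(d1 - d2)
    return s1 - s2, min(dd, 360 - dd)  # handle the 360-degree wrap

# Profiler vs. radiosonde wind at one level (illustrative values).
ds, dd = wind_diff(10.0, 5.0, 9.0, 6.5)
print(f"speed diff: {ds:.1f} m/s, direction diff: {dd:.1f} deg")
```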

  6. Ground-based observations coordinated with Viking satellite measurements

    International Nuclear Information System (INIS)

    Opgenoorth, H.J.; Kirkwood, S.

    1989-01-01

    The instrumentation and the orbit of the Viking satellite made this first Swedish satellite mission ideally suited for coordinated observations with the dense network of ground-based stations in northern Scandinavia. Several arrays of complementing instruments such as magnetometers, all-sky cameras, riometers and doppler radars monitored on a routine basis the ionosphere under the magnetospheric region passed by Viking. For a large number of orbits the Viking passages close to Scandinavia were covered by the operation of specially designed programmes at the European incoherent-scatter facility (EISCAT). First results of coordinated observations on the ground and aboard Viking have shed new light on the most spectacular feature of substorm expansion, the westward-travelling surge. The end of a substorm and the associated decay of a westward-travelling surge have been analysed. EISCAT measurements of high spatial and temporal resolution indicate that the conductivities and electric fields associated with westward-travelling surges are not represented correctly by the existing models. (author)

  7. Ground-based detection of G star superflares with NGTS

    Science.gov (United States)

    Jackman, James A. G.; Wheatley, Peter J.; Pugh, Chloe E.; Gänsicke, Boris T.; Gillen, Edward; Broomhall, Anne-Marie; Armstrong, David J.; Burleigh, Matthew R.; Chaushev, Alexander; Eigmüller, Philipp; Erikson, Anders; Goad, Michael R.; Grange, Andrew; Günther, Maximilian N.; Jenkins, James S.; McCormac, James; Raynard, Liam; Thompson, Andrew P. G.; Udry, Stéphane; Walker, Simon; Watson, Christopher A.; West, Richard G.

    2018-04-01

    We present high cadence detections of two superflares from a bright G8 star (V = 11.56) with the Next Generation Transit Survey (NGTS). We improve upon previous superflare detections by resolving the flare rise and peak, allowing us to fit a solar flare inspired model without the need for arbitrary break points between rise and decay. Our data also enable us to identify substructure in the flares. From changing starspot modulation in the NGTS data we detect a stellar rotation period of 59 hours, along with evidence for differential rotation. We combine this rotation period with the observed ROSAT X-ray flux to determine that the star's X-ray activity is saturated. We calculate the flare bolometric energies as 5.4^{+0.8}_{-0.7} × 10^{34} and 2.6^{+0.4}_{-0.3} × 10^{34} erg and compare our detections with G star superflares detected in the Kepler survey. We find our main flare to be one of the largest amplitude superflares detected from a bright G star. With energies more than 100 times greater than the Carrington event, our flare detections demonstrate the role that ground-based instruments such as NGTS can have in assessing the habitability of Earth-like exoplanets, particularly in the era of PLATO.
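
    The comparison with the Carrington event can be checked with simple arithmetic. The ~10^32 erg figure for the Carrington event used below is a commonly quoted order-of-magnitude estimate assumed here for illustration, not a value from the paper.

```python
# Bolometric flare energies from the abstract, in erg.
flare_energies = [5.4e34, 2.6e34]

# Assumed order-of-magnitude estimate for the 1859 Carrington event.
carrington = 1e32

for e in flare_energies:
    print(f"{e:.1e} erg is about {e / carrington:.0f} times the Carrington event")
```

    Both ratios exceed 100, consistent with the abstract's statement that the flares carry more than 100 times the Carrington event's energy.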

  8. Benchmarking af kommunernes sagsbehandling

    DEFF Research Database (Denmark)

    Amilon, Anna

    From 2007, Ankestyrelsen (the Danish National Social Appeals Board) is to carry out benchmarking of the quality of the municipalities' casework. The purpose of the benchmarking is to develop the design of the practice investigations with a view to better follow-up, and to improve the municipalities' casework. This working paper discusses methods for benchmarking...

  9. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...

  10. The Drill Down Benchmark

    NARCIS (Netherlands)

    P.A. Boncz (Peter); T. Rühl (Tim); F. Kwakkel

    1998-01-01

    Data Mining places specific requirements on DBMS query performance that cannot be evaluated satisfactorily using existing OLAP benchmarks. The DD Benchmark - defined here - provides a practical case and yardstick to explore how well a DBMS is able to support Data Mining applications. It

  11. Benchmarking Tool Kit.

    Science.gov (United States)

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  12. Toxicological benchmarks for wildlife: 1994 Revision

    International Nuclear Information System (INIS)

    Opresko, D.M.; Sample, B.E.; Suter, G.W. II.

    1994-09-01

    The process by which ecological risks of environmental contaminants are evaluated is two-tiered. The first tier is a screening assessment where concentrations of contaminants in the environment are compared to toxicological benchmarks which represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) that are presumed to be nonhazardous to the surrounding biota. The second tier is a baseline ecological risk assessment where toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. The report presents toxicological benchmarks for assessment of effects of 76 chemicals on 8 representative mammalian wildlife species and 31 chemicals on 9 avian wildlife species. The chemicals are some of those that occur at United States Department of Energy waste sites; the wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. Further descriptions of the chosen wildlife species and chemicals are provided in the report. The benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species. These benchmarks consider only contaminant exposure through oral ingestion of contaminated media; exposure through inhalation or direct dermal contact is not considered in this report.
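
    The first-tier screening comparison reduces to a simple quotient of an exposure estimate against the benchmark; the sketch below uses invented doses, not values from the report.

```python
def hazard_quotient(exposure_dose, benchmark_dose):
    """Screening-level quotient: estimated oral dose divided by the
    toxicological benchmark. A quotient >= 1 flags the contaminant for
    the second, baseline risk-assessment tier."""
    return exposure_dose / benchmark_dose

# Illustrative doses in mg contaminant per kg body weight per day.
site_dose, benchmark = 0.8, 0.5

hq = hazard_quotient(site_dose, benchmark)
print(f"HQ = {hq:.1f} -> {'retain for tier 2' if hq >= 1 else 'screen out'}")
```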

  14. How Activists Use Benchmarks

    DEFF Research Database (Denmark)

    Seabrooke, Leonard; Wigan, Duncan

    2015-01-01

    Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views...... are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested...... interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained....

  15. EGS4 benchmark program

    International Nuclear Information System (INIS)

    Yasu, Y.; Hirayama, H.; Namito, Y.; Yashiro, S.

    1995-01-01

    This paper proposes the EGS4 Benchmark Suite, which consists of three programs called UCSAMPL4, UCSAMPL4I and XYZDOS. It also evaluates optimization methods on recent RISC/UNIX systems, such as IBM, HP, DEC, Hitachi and Fujitsu, for the benchmark suite. When a particular compiler option and math library were included in the evaluation process, a system performed significantly better. The observed performance of some of the RISC/UNIX systems was beyond that of so-called mainframes from IBM, Hitachi or Fujitsu. The computer performance of the EGS4 Code System on an HP9000/735 (99 MHz) was defined to be the unit of performance, the EGS4 Unit. The EGS4 Benchmark Suite was also run on various PCs, such as Pentium, i486 and DEC Alpha machines. The performance of recent fast PCs reaches that of recent RISC/UNIX systems. The benchmark programs have also been evaluated for correlation with industry benchmark programs, namely SPECmark. (author)
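
    Normalizing run times to the HP9000/735 reference gives the EGS4 Unit described above; the run times below are invented to show the arithmetic only.

```python
# Wall-clock times (s) for the same benchmark job; values are illustrative.
reference_time = 100.0  # HP9000/735 (99 MHz), defined as 1.0 EGS4 Unit
times = {"system_a": 80.0, "system_b": 125.0, "pc_c": 150.0}

# Performance in EGS4 Units: reference time divided by system time,
# so machines faster than the reference score above 1.0.
for name, t in sorted(times.items()):
    print(f"{name}: {reference_time / t:.2f} EGS4 Units")
```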

  16. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
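
    The manufactured- or analytical-solution style of code verification recommended above can be illustrated by checking the observed order of accuracy of a centered difference scheme against an exact solution:

```python
import math

def max_error(n):
    """Max error of the centered second-difference approximation to u''(x)
    for the analytical solution u(x) = sin(x) on (0, pi)."""
    h = math.pi / n
    err = 0.0
    for i in range(1, n):
        x = i * h
        approx = (math.sin(x - h) - 2 * math.sin(x) + math.sin(x + h)) / h**2
        err = max(err, abs(approx - (-math.sin(x))))  # exact u''(x) = -sin(x)
    return err

# Halving h should cut the error by ~4 for a second-order scheme.
e1, e2 = max_error(100), max_error(200)
order = math.log(e1 / e2, 2)
print(f"observed order of accuracy: {order:.2f}")
```

    An observed order matching the scheme's formal order is the kind of quantitative evidence a code verification benchmark is meant to supply.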

  19. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the associated dose estimates were likewise compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.
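
    A benchmark acceptance check of this kind amounts to comparing paired predictions against a tolerance; the factor-of-2 criterion and the dose rates below are illustrative assumptions, not the report's stated limits.

```python
def within_limits(value, reference, factor=2.0):
    """True if a prediction agrees with a reference code's result to
    within a multiplicative acceptance factor."""
    ratio = value / reference
    return 1.0 / factor <= ratio <= factor

# Hypothetical dose-rate predictions (mSv/h) for one shipping configuration.
riskind_dose, radtran_dose = 0.042, 0.050
print(within_limits(riskind_dose, radtran_dose))  # → True
```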

  1. Monitoring Hydraulic Fracturing Using Ground-Based Controlled Source Electromagnetics

    Science.gov (United States)

    Hickey, M. S.; Trevino, S., III; Everett, M. E.

    2017-12-01

    Hydraulic fracturing allows hydrocarbon production in low permeability formations. Imaging the distribution of fluid used to create a hydraulic fracture can aid in the characterization of fracture properties such as extent of plume penetration as well as fracture azimuth and symmetry. This could contribute to improving the efficiency of an operation, for example, in helping to determine ideal well spacing or the need to refracture a zone. A ground-based controlled-source electromagnetics (CSEM) technique is ideal for imaging the fluid due to the change in field caused by the difference in the conductive properties of the fluid when compared to the background. With advances in high signal to noise recording equipment, coupled with a high-power, broadband transmitter, we can show hydraulic fracture extent and azimuth with minimal processing. A 3D finite element code is used to model the complete well casing along with the layered subsurface. This forward model is used to optimize the survey design and isolate the band of frequencies with the best response. In the field, the results of the modeling are also used to create a custom pseudorandom numeric (PRN) code to control the frequencies transmitted through a grounded dipole source. The receivers record the surface voltage across two grounded dipoles, one parallel and one perpendicular to the transmitter. The data are presented as displays of amplitude ratios across several frequencies, with the associated spatial information. In this presentation, we show field results from multiple basins in the United States, along with the CSEM theory used to create the survey designs.
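
    A maximal-length linear-feedback shift register (LFSR) is one common way to generate the kind of pseudorandom numeric (PRN) code used to key a transmitter; the sketch below is a generic illustration, not the custom code described above.

```python
def lfsr_sequence(nbits, taps, seed, length):
    """Pseudorandom bit stream from an nbits-wide, right-shifting
    linear-feedback shift register; feedback is the XOR of the tap
    bit positions (0 = least significant bit)."""
    state = seed
    out = []
    for _ in range(length):
        out.append(state & 1)          # emit the low bit
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1     # XOR the tap bits
        state = (state >> 1) | (fb << (nbits - 1))
    return out

# A 4-bit register with feedback from bits 1 and 0 walks through all 15
# nonzero states, giving a maximal-length (period-15) m-sequence.
seq = lfsr_sequence(nbits=4, taps=[1, 0], seed=0b1001, length=30)
print(seq[:15] == seq[15:30])  # → True: the sequence repeats with period 15
```

    The broadband, noise-like spectrum of such a sequence is what lets a single transmission carry energy across the band of frequencies selected from the forward model.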

  2. OBSERVATIONAL SELECTION EFFECTS WITH GROUND-BASED GRAVITATIONAL WAVE DETECTORS

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hsin-Yu; Holz, Daniel E. [University of Chicago, Chicago, Illinois 60637 (United States); Essick, Reed; Vitale, Salvatore; Katsavounidis, Erik [LIGO, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2017-01-20

    Ground-based interferometers are not perfect all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean, and as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources’ right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO’s observations and electromagnetic (EM) follow-up. Beyond galactic foregrounds associated with seasonal variations, we find that equatorial observatories can access over 80% of the localization probability, while mid-latitudes will access closer to 70%. Facilities located near the two LIGO sites can observe sources closer to their zenith than their analogs in the south, but the average observation will still be no closer than 44° from zenith. We also find that observatories in Africa or the South Atlantic will wait systematically longer before they can begin observing compared to the rest of the world, though there is a preference for longitudes near the LIGO sites. These effects, along with knowledge of the LIGO antenna pattern, can inform EM follow-up activities and optimization, including the possibility of directing observations even before gravitational-wave events occur.

  3. Project management for complex ground-based instruments: MEGARA plan

    Science.gov (United States)

    García-Vargas, María. Luisa; Pérez-Calpena, Ana; Gil de Paz, Armando; Gallego, Jesús; Carrasco, Esperanza; Cedazo, Raquel; Iglesias, Jorge

    2014-08-01

    The project management of complex instruments for ground-based large telescopes is a challenge in itself. Good management is key to project success in terms of performance, schedule and budget. Being on time has become a strict requirement for two reasons: to assure the arrival at the telescope, given the pressure of demand for new instrumentation at these first world-class telescopes, and to avoid cost overruns. The budget and cash flow are not always as expected and have to be properly handled across the administrative departments of the worldwide-distributed funding centers. The complexity of the organizations, the technological and scientific return to the Consortium partners, and the participation in the project of all kinds of professional centers working in astronomical instrumentation - universities, research centers, small and large private companies, workshops and providers, etc. - make the project management strategy, and the tools and procedures tuned to the project needs, crucial for success. MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is a facility instrument of the 10.4m GTC (La Palma, Spain) working at optical wavelengths that provides both Integral-Field Unit (IFU) and Multi-Object Spectrograph (MOS) capabilities at resolutions in the range R=6,000-20,000. The project is an initiative led by Universidad Complutense de Madrid (Spain) in collaboration with INAOE (Mexico), IAA-CSIC (Spain) and Universidad Politécnica de Madrid (Spain). MEGARA is being developed under contract with GRANTECAN.

  5. Space- and Ground-based Coronal Spectro-Polarimetry

    Science.gov (United States)

    Fineschi, Silvano; Bemporad, Alessandro; Rybak, Jan; Capobianco, Gerardo

    This presentation gives an overview of the near-future perspectives of ultraviolet and visible-light spectro-polarimetric instrumentation for probing coronal magnetism from space-based and ground-based observatories. Spectro-polarimetric imaging of coronal emission-lines in the visible-light wavelength-band provides an important diagnostic tool for coronal magnetism. The interpretation in terms of the Hanle and Zeeman effects of the line polarization in forbidden emission-lines yields information on the direction and strength of the coronal magnetic field. As a study case, this presentation will describe the Torino Coronal Magnetograph (CorMag) for the spectro-polarimetric observation of the FeXIV, 530.3 nm, forbidden emission-line. CorMag - consisting of a Liquid Crystal (LC) Lyot filter and an LC linear polarimeter - has been recently installed on the Lomnicky Peak Observatory 20cm Zeiss coronagraph. The preliminary results from CorMag will be presented. The linear polarization by resonance scattering of coronal permitted line-emission in the ultraviolet (UV) can be modified by magnetic fields through the Hanle effect. Space-based UV spectro-polarimeters would provide an additional tool for the diagnostics of coronal magnetism. As a case study of space-borne UV spectro-polarimeters, this presentation will describe the future upgrade of the Sounding-rocket Coronagraphic Experiment (SCORE) to include the capability of imaging polarimetry of the HI Lyman-alpha line at 121.6 nm. SCORE is a multi-wavelength imager for the emission-lines HeII 30.4 nm and HI 121.6 nm, and for the visible-light broad-band emission of the polarized K-corona. SCORE flew successfully in 2009. This presentation will describe how, in future re-flights, SCORE could observe the expected Hanle effect in the corona with a HI Lyman-alpha polarimeter.
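
    A linear polarimeter's basic data product, the degree and angle of linear polarization, follows from the standard Stokes-parameter relations; the Stokes values below are purely illustrative.

```python
import math

def linear_polarization(I, Q, U):
    """Degree of linear polarization p = sqrt(Q^2 + U^2) / I and
    polarization angle 0.5 * atan2(U, Q), returned in degrees."""
    p = math.sqrt(Q * Q + U * U) / I
    angle = 0.5 * math.degrees(math.atan2(U, Q))
    return p, angle

# Illustrative emission-line Stokes measurements (arbitrary units).
p, ang = linear_polarization(I=1.0, Q=0.02, U=0.02)
print(f"p = {p:.3f}, angle = {ang:.1f} deg")
```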

  6. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    Science.gov (United States)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel
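The simulator's bookkeeping described above, which records why an in-field object was not discovered, can be sketched as a simple classifier. All thresholds, field names, and reason strings below are illustrative assumptions, not values from the actual survey simulator:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    n_obs: int           # detections linked on one night
    trail_arcsec: float  # trail length per exposure
    on_chip_gap: bool    # fell in a focal-plane fill-factor gap

def classify(det, min_obs=3, max_trail=10.0):
    """Split in-field objects into 'discovered' and 'missed' with a reason."""
    if det.on_chip_gap:
        return "missed: array gap"
    if det.trail_arcsec > max_trail:
        return "missed: too trailed"
    if det.n_obs < min_obs:
        return "missed: too few observations"
    return "discovered"

print(classify(Detection(4, 2.0, False)))  # → discovered
print(classify(Detection(2, 2.0, False)))  # → missed: too few observations
```

Tallying these reasons over a simulated survey is what divides the model population into the "discoverable" and "discovered" subsets described in the abstract.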

  7. The high-resolution extraterrestrial solar spectrum (QASUMEFTS) determined from ground-based solar irradiance measurements

    Directory of Open Access Journals (Sweden)

    J. Gröbner

    2017-09-01

    A high-resolution extraterrestrial solar spectrum has been determined from ground-based measurements of direct solar spectral irradiance (SSI) over the wavelength range from 300 to 500 nm using the Langley-plot technique. The measurements were obtained at the Izaña Atmospheric Research Centre of the Agencia Estatal de Meteorología, Tenerife, Spain, during the period 12 to 24 September 2016. This solar spectrum (QASUMEFTS) was combined from medium-resolution (bandpass of 0.86 nm) measurements of the QASUME (Quality Assurance of Spectral Ultraviolet Measurements in Europe) spectroradiometer in the wavelength range from 300 to 500 nm and high-resolution measurements (0.025 nm) from a Fourier transform spectroradiometer (FTS) over the wavelength range from 305 to 380 nm. The Kitt Peak solar flux atlas was used to extend this high-resolution solar spectrum to 500 nm. The expanded uncertainties of this solar spectrum are 2 % between 310 and 500 nm and 4 % at 300 nm. Comparisons of this solar spectrum with solar spectra measured in space (top of the atmosphere) agreed very well in some cases, while in others discrepancies of up to 5 % were observed. The QASUMEFTS solar spectrum represents a benchmark dataset with uncertainties lower than anything previously published. The metrological traceability of the measurements to the International System of Units (SI) is assured by an unbroken chain of calibrations leading to the primary spectral irradiance standard of the Physikalisch-Technische Bundesanstalt in Germany.
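The Langley-plot technique used here can be illustrated with a minimal sketch: under a stable atmosphere, Beer–Lambert attenuation makes log-irradiance linear in airmass, so the extraterrestrial irradiance is recovered as the zero-airmass intercept of a straight-line fit. The numbers below are synthetic, not QASUME data:

```python
import numpy as np

# Beer-Lambert: I(m) = I0 * exp(-tau * m), with m the airmass and I0 the
# extraterrestrial (top-of-atmosphere) irradiance. ln(I) is linear in m,
# so I0 = exp(intercept) of a linear fit, extrapolated to m = 0.

def langley_extrapolation(airmass, irradiance):
    """Return (I0, tau) from a Langley plot of ln(I) versus airmass."""
    slope, intercept = np.polyfit(airmass, np.log(irradiance), 1)
    return np.exp(intercept), -slope

# Synthetic morning series: I0 = 1.6 (arbitrary units), optical depth 0.3
m = np.linspace(1.2, 5.0, 40)
I = 1.6 * np.exp(-0.3 * m)
I0, tau = langley_extrapolation(m, I)
print(round(float(I0), 3), round(float(tau), 3))  # → 1.6 0.3
```

In practice each wavelength channel is fitted separately, and only cloud-free, stable half-days are retained, which is why a high-altitude site such as Izaña is used.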

  8. Foundation Investigation for Ground Based Radar Project-Kwajalein Island, Marshall Islands

    Science.gov (United States)

    1990-04-01

    MISCELLANEOUS PAPER GL-90-5. Foundation Investigation for Ground Based Radar Project -- Kwajalein Island, Marshall Islands, by Donald E. Yule. Results of the foundation investigation for the Ground Based Radar Project -- Kwajalein Island, Marshall Islands, are presented. Geophysical tests comprised surface refraction

  9. Benchmarking and the laboratory

    Science.gov (United States)

    Galloway, M; Nadin, L

    2001-01-01

    This article describes how benchmarking can be used to assess laboratory performance. Two benchmarking schemes are reviewed, the Clinical Benchmarking Company's Pathology Report and the College of American Pathologists' Q-Probes scheme. The Clinical Benchmarking Company's Pathology Report is undertaken by staff based in the clinical management unit, Keele University with appropriate input from the professional organisations within pathology. Five annual reports have now been completed. Each report is a detailed analysis of 10 areas of laboratory performance. In this review, particular attention is focused on the areas of quality, productivity, variation in clinical practice, skill mix, and working hours. The Q-Probes scheme is part of the College of American Pathologists programme in studies of quality assurance. The Q-Probes scheme and its applicability to pathology in the UK is illustrated by reviewing two recent Q-Probe studies: routine outpatient test turnaround time and outpatient test order accuracy. The Q-Probes scheme is somewhat limited by the small number of UK laboratories that have participated. In conclusion, as a result of the government's policy in the UK, benchmarking is here to stay. Benchmarking schemes described in this article are one way in which pathologists can demonstrate that they are providing a cost effective and high quality service. Key Words: benchmarking • pathology PMID:11477112

  10. Shielding benchmark problems, (2)

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Shin, Kazuo; Tada, Keiko.

    1980-02-01

    Shielding benchmark problems prepared by the Working Group for the Assessment of Shielding Experiments in the Research Committee on Shielding Design of the Atomic Energy Society of Japan were compiled by the Shielding Laboratory of the Japan Atomic Energy Research Institute. Fourteen new shielding benchmark problems are presented in addition to the twenty-one problems proposed previously, for evaluating the calculational algorithms and accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method, and for evaluating the nuclear data used in the codes. The present benchmark problems are principally for investigating the backscattering and the streaming of neutrons and gamma rays in two- and three-dimensional configurations. (author)

  11. Long term landslide monitoring with Ground Based SAR

    Science.gov (United States)

    Monserrat, Oriol; Crosetto, Michele; Luzi, Guido; Gili, Josep; Moya, Jose; Corominas, Jordi

    2014-05-01

    In the last decade, Ground-Based SAR (GBSAR) has proven to be a reliable microwave Remote Sensing technique in several application fields, especially for monitoring unstable slopes. GBSAR can provide displacement measurements over areas of a few square kilometres with very high spatial and temporal resolution. This work is focused on the use of the GBSAR technique for long-term landslide monitoring based on a particular data acquisition configuration called discontinuous GBSAR (D-GBSAR). In the most commonly used GBSAR configuration, the radar is left installed in situ, acquiring data periodically, e.g. every few minutes. Deformations are estimated by processing sets of GBSAR images acquired over several weeks or months, without moving the system. By contrast, in D-GBSAR the radar is installed and dismounted at each measurement campaign, revisiting a given site periodically. This configuration is useful for monitoring slow deformation phenomena. In this work, two alternative ways of exploiting the D-GBSAR technique will be presented: the DInSAR technique and the amplitude-based technique. The former is based on the exploitation of the phase component of the acquired SAR images and provides millimetric precision on the deformation estimates. However, this technique has several limitations: the number of measurable points decreases as the period of observation increases, the phase measurements are ambiguous in nature, and the atmospheric phase component can make it inapplicable in some cases, especially when working in natural environments. The second approach, based on the amplitude component of GBSAR images combined with an image matching technique, allows the estimation of displacements over specific targets, avoiding two of the limitations mentioned above (phase unwrapping and the atmospheric contribution) at the cost of reduced deformation measurement precision. Two successful examples of D
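As a rough illustration of the DInSAR side of the technique, interferometric phase converts to line-of-sight displacement through the radar wavelength; the Ku-band wavelength below is a typical GBSAR value assumed for illustration, not a parameter taken from this study:

```python
import math

# For a monostatic radar the two-way path makes one full phase cycle
# (2*pi) correspond to half a wavelength of line-of-sight motion:
#   d_los = -lambda * delta_phi / (4 * pi)

WAVELENGTH_M = 0.0174  # ~17.4 mm, a typical Ku-band GBSAR value (assumed)

def phase_to_displacement(delta_phi_rad):
    """LOS displacement; unambiguous only for |delta_phi| < pi (|d| < lambda/4)."""
    return -WAVELENGTH_M * delta_phi_rad / (4 * math.pi)

# A quarter-cycle phase change maps to lambda/8 of LOS motion:
d = phase_to_displacement(math.pi / 2)
print(round(d * 1000, 3), "mm")  # → -2.175 mm
```

The lambda/4 ambiguity limit in the comment is exactly the phase-unwrapping problem the abstract mentions: between two discontinuous campaigns, motion larger than a quarter wavelength cannot be resolved from phase alone, which motivates the amplitude-based alternative.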

  12. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  13. Benchmarking Swiss electricity grids

    International Nuclear Information System (INIS)

    Walti, N.O.; Weber, Ch.

    2001-01-01

    This extensive article describes a pilot benchmarking project initiated by the Swiss Association of Electricity Enterprises that assessed 37 Swiss utilities. The data collected from these utilities on a voluntary basis included data on technical infrastructure, investments and operating costs. These various factors are listed and discussed in detail. The assessment methods and rating mechanisms that provided the benchmarks are discussed and the results of the pilot study are presented that are to form the basis of benchmarking procedures for the grid regulation authorities under the planned Switzerland's electricity market law. Examples of the practical use of the benchmarking methods are given and cost-efficiency questions still open in the area of investment and operating costs are listed. Prefaces by the Swiss Association of Electricity Enterprises and the Swiss Federal Office of Energy complete the article

  14. Benchmarking and Regulation

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...

  15. Financial Integrity Benchmarks

    Data.gov (United States)

    City of Jackson, Mississippi — This data compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measures the City's debt ratio and bond ratings....

  16. Benchmarking in Foodservice Operations

    National Research Council Canada - National Science Library

    Johnson, Bonnie

    1998-01-01

    .... The design of this study included two parts: (1) eleven expert panelists involved in a Delphi technique to identify and rate importance of foodservice performance measures and rate the importance of benchmarking activities, and (2...

  17. MFTF TOTAL benchmark

    International Nuclear Information System (INIS)

    Choy, J.H.

    1979-06-01

    A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) develop and implement programs to simulate MFTF usage of the data base

  18. Accelerator shielding benchmark problems

    International Nuclear Information System (INIS)

    Hirayama, H.; Ban, S.; Nakamura, T.

    1993-01-01

    Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)

  19. Shielding benchmark problems

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Kawai, Masayoshi; Nakazawa, Masaharu.

    1978-09-01

    Shielding benchmark problems were prepared by the Working Group of Assessment of Shielding Experiments in the Research Comittee on Shielding Design of the Atomic Energy Society of Japan, and compiled by the Shielding Laboratory of Japan Atomic Energy Research Institute. Twenty-one kinds of shielding benchmark problems are presented for evaluating the calculational algorithm and the accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method and for evaluating the nuclear data used in the codes. (author)

  20. Benchmarking electricity distribution

    Energy Technology Data Exchange (ETDEWEB)

    Watts, K. [Department of Justice and Attorney-General, QLD (Australia)

    1995-12-31

    Benchmarking has been described as a method of continuous improvement that involves an ongoing and systematic evaluation and incorporation of external products, services and processes recognised as representing best practice. It is a management tool similar to total quality management (TQM) and business process re-engineering (BPR), and is best used as part of a total package. This paper discusses benchmarking models and approaches and suggests a few key performance indicators that could be applied to benchmarking electricity distribution utilities. Some recent benchmarking studies are used as examples and briefly discussed. It is concluded that benchmarking is a strong tool to be added to the range of techniques that can be used by electricity distribution utilities and other organizations in search of continuous improvement, and that there is now a high level of interest in Australia. Benchmarking represents an opportunity for organizations to approach learning from others in a disciplined and highly productive way, which will complement the other micro-economic reforms being implemented in Australia. (author). 26 refs.

  1. A detrimental soil disturbance prediction model for ground-based timber harvesting

    Science.gov (United States)

    Derrick A. Reeves; Matthew C. Reeves; Ann M. Abbott; Deborah S. Page-Dumroese; Mark D. Coleman

    2012-01-01

    Soil properties and forest productivity can be affected during ground-based harvest operations and site preparation. The degree of impact varies widely depending on topographic features and soil properties. Forest managers who understand site-specific limits to ground-based harvesting can alter harvest method or season to limit soil disturbance. To determine the...

  2. Biosensors for EVA: Improved Instrumentation for Ground-based Studies

    Science.gov (United States)

    Soller, B.; Ellerby, G.; Zou, F.; Scott, P.; Jin, C.; Lee, S. M. C.; Coates, J.

    2010-01-01

    During lunar excursions in the EVA suit, real-time measurement of metabolic rate is required to manage consumables and guide activities to ensure safe return to the base. Metabolic rate, or oxygen consumption (VO2), is normally measured from pulmonary parameters but cannot be determined with standard techniques in the oxygen-rich environment of a spacesuit. Our group has developed novel near infrared spectroscopic (NIRS) methods to calculate muscle oxygen saturation (SmO2), hematocrit, and pH, and we recently demonstrated that we can use our NIRS sensor to measure VO2 on the leg during cycling. Our NSBRI project has 4 objectives: (1) increase the accuracy of the metabolic rate calculation through improved prediction of stroke volume; (2) investigate the relative contributions of calf and thigh oxygen consumption to metabolic rate calculation for walking and running; (3) demonstrate that the NIRS-based noninvasive metabolic rate methodology is sensitive enough to detect a decrement in VO2 in a space analog; and (4) improve instrumentation to allow testing within a spacesuit. Over the past year we have made progress on all four objectives, but the most significant progress was made in improving the instrumentation. The NIRS system currently in use at JSC is based on fiber optics technology. Optical fiber bundles are used to deliver light from a light source in the monitor to the patient, and light reflected back from the patient's muscle to the monitor for spectroscopic analysis. The fiber optic cables are large and fragile, and there is no way to get them in and out of the test spacesuit used for ground-based studies. With complementary funding from the US Army, we undertook a complete redesign of the sensor and control electronics to build a novel system small enough to be used within the spacesuit and portable enough to be used by a combat medic. In the new system the filament lamp used in the fiber optic system was replaced with a novel broadband near infrared

  3. The KMAT: Benchmarking Knowledge Management.

    Science.gov (United States)

    de Jager, Martha

    Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…

  4. Benchmarking the Netherlands. Benchmarking for growth

    International Nuclear Information System (INIS)

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy. In other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs. Prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc.) sense, in other words. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades, the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity growth. Throughout

  5. Benchmarking the Netherlands. Benchmarking for growth

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy. In other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs. Prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc.) sense, in other words. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades, the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity

  6. Benchmarking in Mobarakeh Steel Company

    Directory of Open Access Journals (Sweden)

    Sasan Ghasemi

    2008-05-01

    Benchmarking is considered one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.

  7. Benchmarking in Mobarakeh Steel Company

    OpenAIRE

    Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati

    2008-01-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...

  8. Exoplanets -New Results from Space and Ground-based Surveys

    Science.gov (United States)

    Udry, Stephane

    The exploration of the outer solar system and in particular of the giant planets and their environments is an on-going process with the Cassini spacecraft currently around Saturn, the Juno mission to Jupiter preparing to depart and two large future space missions planned to launch in the 2020-2025 time frame for the Jupiter system and its satellites (Europa and Ganymede) on the one hand, and the Saturnian system and Titan on the other hand [1,2]. Titan, Saturn's largest satellite, is the only other object in our Solar system to possess an extensive nitrogen atmosphere, host to an active organic chemistry, based on the interaction of N2 with methane (CH4). Following the Voyager flyby in 1980, Titan has been intensely studied from the ground-based large telescopes (such as the Keck or the VLT) and by artificial satellites (such as the Infrared Space Observatory and the Hubble Space Telescope) for the past three decades. Prior to Cassini-Huygens, Titan's atmospheric composition was thus known to us from the Voyager missions and also through the explorations by the ISO. Our perception of Titan had thus greatly been enhanced accordingly, but many questions remained as to the nature of the haze surrounding the satellite and the composition of the surface. The recent revelations by the Cassini-Huygens mission have managed to surprise us with many discoveries [3-8] and have yet to reveal more of the interesting aspects of the satellite. The Cassini-Huygens mission to the Saturnian system has been an extraordinary success for the planetary community since the Saturn-Orbit-Insertion (SOI) in July 2004 and again the very successful probe descent and landing of Huygens on January 14, 2005. One of its main targets was Titan. 
Titan was revealed to be a complex world more like the Earth than any other: it has a dense mostly nitrogen atmosphere and active climate and meteorological cycles where the working fluid, methane, behaves under Titan conditions the way that water does on

  9. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  10. Deviating From the Benchmarks

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela

    This paper studies three related questions: To what extent do otherwise similar startups employ different quantities and qualities of human capital at the moment of entry? How persistent are initial human capital choices over time? And how does deviating from human capital benchmarks influence firm..., founders' human capital, and the ownership structure of startups (solo entrepreneurs versus entrepreneurial teams). We then study the survival implications of exogenous deviations from these benchmarks, based on spline models for survival data. Our results indicate that (especially negative) deviations from the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be close to irreversible

  11. Kepler Ground-Based Photometry Proof-of-Concept

    Science.gov (United States)

    Brown, Timothy M.; Latham, D.; Howell, S.; Everett, M.

    2004-01-01

    We report on our efforts to evaluate the feasibility of using the 4-Shooter CCD camera on the 48-inch reflector at the Whipple Observatory to carry out a multi-band photometric survey of the Kepler target region. We also include recommendations for future work. We were assigned 36 nights with the 4-Shooter during 2003 for this feasibility study. Most of the time during the first two dozen nights was dedicated to the development of procedures, test exposures, and a reconnaissance across the Kepler field. The final 12 nights in September and October 2003 were used for "production" observing in the middle of the Kepler field using the full complement of seven filters (SDSS u, g, r, i, z, plus our special Gred and D51 intermediate-band filters). Nine of these 12 nights were clear and photometric, and production observations were obtained at 109 pointings, corresponding to 14.6 square degrees.

  12. An assessment of the performance of global rainfall estimates without ground-based observations

    Directory of Open Access Journals (Sweden)

    C. Massari

    2017-09-01

    Satellite-based rainfall estimates over land have great potential for a wide range of applications, but their validation is challenging due to the scarcity of ground-based observations of rainfall in many areas of the planet. Recent studies have suggested the use of triple collocation (TC) to characterize uncertainties associated with rainfall estimates by using three collocated rainfall products. However, TC requires the simultaneous availability of three products with mutually uncorrelated errors, a requirement which is difficult to satisfy with current global precipitation data sets. In this study, a recently developed method for rainfall estimation from soil moisture observations, SM2RAIN, is demonstrated to facilitate the accurate application of TC within triplets containing two state-of-the-art satellite rainfall estimates and a reanalysis product. The validity of different TC assumptions is indirectly tested via a high-quality ground rainfall product over the contiguous United States (CONUS), showing that SM2RAIN can provide a truly independent source of rainfall accumulation information which uniquely satisfies the assumptions underlying TC. On this basis, TC is applied with SM2RAIN on a global scale in an optimal configuration to calculate, for the first time, reliable global correlations (vs. an unknown truth) of the aforementioned products without using a ground benchmark data set. The analysis is carried out during the period 2007–2012 using daily rainfall accumulation products obtained at 1° × 1° spatial resolution. Results convey the relatively high performance of the satellite rainfall estimates in eastern North and South America, southern Africa, southern and eastern Asia, eastern Australia, and southern Europe, as well as complementary performances between the reanalysis product and SM2RAIN, with the first performing reasonably well in the Northern Hemisphere and the second providing very good performance in the Southern
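A minimal sketch of covariance-based triple collocation, under the abstract's assumption of three products with mutually uncorrelated, zero-mean errors; the synthetic data and the covariance notation Q are illustrative, not taken from the study itself:

```python
import numpy as np

# Build three synthetic "rainfall" products sharing one unobserved truth,
# each with an independent error (the key TC assumption).
rng = np.random.default_rng(0)
n = 200_000
truth = rng.standard_normal(n)
x = truth + 0.5 * rng.standard_normal(n)  # e.g. satellite product 1
y = truth + 0.7 * rng.standard_normal(n)  # e.g. satellite product 2
z = truth + 0.3 * rng.standard_normal(n)  # e.g. an SM2RAIN-type product

def tc_correlations(x, y, z):
    """Squared correlation of each product with the unknown truth,
    from pairwise covariances only (no ground benchmark needed)."""
    Q = np.cov(np.vstack([x, y, z]))
    r2_x = Q[0, 1] * Q[0, 2] / (Q[0, 0] * Q[1, 2])
    r2_y = Q[0, 1] * Q[1, 2] / (Q[1, 1] * Q[0, 2])
    r2_z = Q[0, 2] * Q[1, 2] / (Q[2, 2] * Q[0, 1])
    return r2_x, r2_y, r2_z

r2 = tc_correlations(x, y, z)
# True values here: 1/1.25 = 0.8, 1/1.49 ≈ 0.671, 1/1.09 ≈ 0.917
print([round(float(v), 2) for v in r2])
```

If two products shared error (e.g. two retrievals from the same sensor family), the cross-covariances would be inflated and these estimates biased, which is exactly why the study needs the independence of SM2RAIN within the triplet.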

  13. HPCG Benchmark Technical Specification

    Energy Technology Data Exchange (ETDEWEB)

    Heroux, Michael Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States); Luszczek, Piotr [Univ. of Tennessee, Knoxville, TN (United States)

    2013-10-01

    The High Performance Conjugate Gradient (HPCG) benchmark [cite SNL, UTK reports] is a tool for ranking computer systems based on a simple additive Schwarz, symmetric Gauss-Seidel preconditioned conjugate gradient solver. HPCG is similar to the High Performance Linpack (HPL), or Top 500, benchmark [1] in its purpose, but HPCG is intended to better represent how today’s applications perform. In this paper we describe the technical details of HPCG: how it is designed and implemented, what code transformations are permitted and how to interpret and report results.
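
The kernel HPCG times is a conjugate gradient iteration preconditioned with a symmetric Gauss-Seidel sweep. A minimal dense-matrix sketch of that iteration follows; the real benchmark runs a distributed sparse 27-point-stencil problem and reports sustained throughput, which this illustration does not attempt:

```python
import numpy as np

def sgs_preconditioner(A, r):
    """Apply M^-1 r for M = (D + L) D^-1 (D + U):
    one forward and one backward Gauss-Seidel sweep."""
    D = np.diag(np.diag(A))
    w = np.linalg.solve(np.tril(A), r)         # forward sweep: (D + L) w = r
    return np.linalg.solve(np.triu(A), D @ w)  # backward sweep: (D + U) z = D w

def pcg(A, b, tol=1e-10, maxit=500):
    """Preconditioned conjugate gradient, HPCG-style iteration."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    z = sgs_preconditioner(A, r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = sgs_preconditioner(A, r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

The point of the benchmark is that this iteration stresses memory bandwidth and irregular access (sparse matrix-vector products, triangular sweeps) rather than the dense floating-point throughput HPL rewards.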

  14. Benchmarking for Best Practice

    CERN Document Server

    Zairi, Mohamed

    1998-01-01

    Benchmarking for Best Practice uses up-to-the-minute case-studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. .It is also an ideal textbook on the applications of TQM since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area l

  15. Benchmarking Danish Industries

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette

    2003-01-01

    compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless...... not public. The survey is a cooperative project "Benchmarking Danish Industries" with CIP/Aalborg University, the Danish Technological University, the Danish Technological Institute and Copenhagen Business School as consortia partners. The project has been funded by the Danish Agency for Trade and Industry...

  16. [Do you mean benchmarking?].

    Science.gov (United States)

    Bonnet, F; Solignac, S; Marty, J

    2008-03-01

    The purpose of benchmarking is to establish improvement processes by comparing activities to quality standards. The proposed methodology is illustrated by benchmarking business cases performed inside medical facilities on items such as nosocomial diseases or the organization of surgery facilities. Moreover, the authors have built a specific graphic tool, enhanced with balanced-scorecard numbers and mappings, so that the comparison between different anesthesia-intensive care services that are willing to start an improvement program is easy and relevant. This ready-made application is all the more accurate where detailed tariffs of activities are implemented.

  17. RB reactor benchmark cores

    International Nuclear Information System (INIS)

    Pesic, M.

    1998-01-01

    A selected set of the RB reactor benchmark cores is presented in this paper. The first results of validation of the well-known Monte Carlo code MCNP™ and the adjoining neutron cross-section libraries are given. They confirm the idea behind the proposal of the new U-D2O criticality benchmark system and support the intention to include this system in the next edition of the recent OECD/NEA project, the International Handbook of Evaluated Criticality Safety Experiments, in the near future. (author)

  18. Benchmarking and Performance Management

    Directory of Open Access Journals (Sweden)

    Adrian TANTAU

    2010-12-01

    Full Text Available The relevance of the chosen topic is explained by the meaning of the firm efficiency concept: firm efficiency means the revealed performance (how well the firm performs in the actual market environment) given the basic characteristics of the firms and their markets that are expected to drive their profitability (firm size, market power, etc.). This complex and relative performance could be due to such things as product innovation, management quality and work organization; other factors can also be a cause even if they are not directly observed by the researcher. The critical need for management individuals/groups to continuously improve their firm/company's efficiency and effectiveness, and the need for managers to know the success factors and competitiveness determinants, determine, consequently, which performance measures are most critical to their firm's overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking firm-level performance are critical interdependent activities. Firm-level variables, used to infer performance, are often interdependent due to operational reasons. Hence, managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other types of profitability measures. It uses econometric models to describe performance and then proposes a method to forecast and benchmark it.

  19. Surveys and Benchmarks

    Science.gov (United States)

    Bers, Trudy

    2012-01-01

    Surveys and benchmarks continue to grow in importance for community colleges in response to several factors. One is the press for accountability, that is, for colleges to report the outcomes of their programs and services to demonstrate their quality and prudent use of resources, primarily to external constituents and governing boards at the state…

  20. Spectral Analysis of the Background in Ground-based, Long-slit ...

    Indian Academy of Sciences (India)

    1996-12-08

    Dec 8, 1996 ... Spectral Analysis of the Background in Ground-based, Long-slit ... Figure 1 plots spectra from the 2-D array, after instrumental calibration and before correction for ... which would merit attention and a better understanding.

  1. Ground-Based Global Navigation Satellite System Combined Broadcast Ephemeris Data (daily files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Combined Broadcast Ephemeris Data (daily files of all distinct navigation messages...

  2. Chasing Small Exoplanets with Ground-Based Near-Infrared Transit Photometry

    Science.gov (United States)

    Colon, K. D.; Barentsen, G.; Vinicius, Z.; Vanderburg, A.; Coughlin, J.; Thompson, S.; Mullally, F.; Barclay, T.; Quintana, E.

    2017-11-01

    I will present results from a ground-based survey to measure the infrared radius and other properties of small K2 exoplanets and candidates. The survey is preparation for upcoming discoveries from TESS and characterization with JWST.

  3. High-precision ground-based photometry of exoplanets

    Directory of Open Access Journals (Sweden)

    de Mooij Ernst J.W.

    2013-04-01

    Full Text Available High-precision photometry of transiting exoplanet systems has contributed significantly to our understanding of the properties of their atmospheres. The best targets are the bright exoplanet systems, for which the high number of photons allows very high signal-to-noise ratios. Most current instruments are not optimised for these high-precision measurements: either they have a large read-out overhead to reduce the read noise, and/or their field of view is limited, preventing simultaneous observations of both the target and a reference star. Recently we have proposed a new wide-field imager for the Observatoire du Mont-Mégantic optimised for these bright systems (PI: Jayawardhana). The instrument has a dual-beam design and a field of view of 17' by 17'. The cameras have a read-out time of 2 seconds, significantly reducing read-out overheads. Over the past years we have gained significant experience with how to reach the high precision required for the characterisation of exoplanet atmospheres. Based on our experience we offer the following advice: Get the best calibrations possible; in the case of bad weather, characterise the instrument (e.g. non-linearity, dome flats, bias level), as this is vital for a better understanding of the science data. Observe the target for as long as possible; the out-of-transit baseline is as important as the transit/eclipse itself, and a short baseline can lead to improperly corrected systematics and mis-estimation of the red noise. Keep everything (e.g. position on detector, exposure time) as stable as possible. Take care that the defocus is not too strong: for a large defocus, the contribution of the total flux from the sky background in the aperture could well exceed that of the target, resulting in very strict requirements on the precision at which the background is measured.
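
Much of this advice comes down to careful differential aperture photometry. A minimal sketch of its building block, an aperture sum with annulus-based sky subtraction, is shown below; the function name, radii and the synthetic usage are illustrative, not the instrument's actual pipeline:

```python
import numpy as np

def aperture_photometry(img, cx, cy, r_ap, r_in, r_out):
    """Background-subtracted flux in a circular aperture.

    The sky level is estimated as the median pixel value in an annulus
    (r_in..r_out) around the star and subtracted per aperture pixel --
    the step whose precision the abstract warns about for large defocus.
    """
    yy, xx = np.indices(img.shape)
    d = np.hypot(xx - cx, yy - cy)
    aperture = d <= r_ap
    annulus = (d >= r_in) & (d <= r_out)
    sky_per_pixel = np.median(img[annulus])
    return img[aperture].sum() - sky_per_pixel * aperture.sum()
```

A differential light-curve point is then the ratio of this flux for the target to that of a reference star in the same frame, which cancels common atmospheric and instrumental variations.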

  4. Reactor calculation benchmark PCA blind test results

    International Nuclear Information System (INIS)

    Kam, F.B.K.; Stallmann, F.W.

    1980-01-01

    Further improvement in calculational procedures or a combination of calculations and measurements is necessary to attain 10 to 15% (1 sigma) accuracy for neutron exposure parameters (flux greater than 0.1 MeV, flux greater than 1.0 MeV, and dpa). The calculational modeling of power reactors should be benchmarked in an actual LWR plant to provide final uncertainty estimates for end-of-life predictions and limitations for plant operations. 26 references, 14 figures, 6 tables

  5. Reactor calculation benchmark PCA blind test results

    Energy Technology Data Exchange (ETDEWEB)

    Kam, F.B.K.; Stallmann, F.W.

    1980-01-01

    Further improvement in calculational procedures or a combination of calculations and measurements is necessary to attain 10 to 15% (1 sigma) accuracy for neutron exposure parameters (flux greater than 0.1 MeV, flux greater than 1.0 MeV, and dpa). The calculational modeling of power reactors should be benchmarked in an actual LWR plant to provide final uncertainty estimates for end-of-life predictions and limitations for plant operations. 26 references, 14 figures, 6 tables.

  6. Standard Guide for Benchmark Testing of Light Water Reactor Calculations

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This guide covers general approaches for benchmarking neutron transport calculations in light water reactor systems. A companion guide (Guide E2005) covers use of benchmark fields for testing neutron transport calculations and cross sections in well controlled environments. This guide covers experimental benchmarking of neutron fluence calculations (or calculations of other exposure parameters such as dpa) in more complex geometries relevant to reactor surveillance. Particular sections of the guide discuss: the use of well-characterized benchmark neutron fields to provide an indication of the accuracy of the calculational methods and nuclear data when applied to typical cases; and the use of plant specific measurements to indicate bias in individual plant calculations. Use of these two benchmark techniques will serve to limit plant-specific calculational uncertainty, and, when combined with analytical uncertainty estimates for the calculations, will provide uncertainty estimates for reactor fluences with ...

  7. Benchmarking i den offentlige sektor

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels

    2008-01-01

    In the article we first briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then explain in more detail what benchmarking is, taking four different applications of benchmarking as our starting point. The regulation of utility companies is then treated, after which......

  8. Cloud benchmarking for performance

    OpenAIRE

    Varghese, Blesson; Akgun, Ozgur; Miguel, Ian; Thai, Long; Barker, Adam

    2014-01-01

    Date of Acceptance: 20/09/2014 How can applications be deployed on the cloud to achieve maximum performance? This question has become significant and challenging with the availability of a wide variety of Virtual Machines (VMs) with different performance capabilities in the cloud. The above question is addressed by proposing a six step benchmarking methodology in which a user provides a set of four weights that indicate how important each of the following groups: memory, processor, computa...

  9. Benchmarking reference services: an introduction.

    Science.gov (United States)

    Marshall, J G; Buchanan, H S

    1995-01-01

    Benchmarking is based on the common sense idea that someone else, either inside or outside of libraries, has found a better way of doing certain things and that your own library's performance can be improved by finding out how others do things and adopting the best practices you find. Benchmarking is one of the tools used for achieving continuous improvement in Total Quality Management (TQM) programs. Although benchmarking can be done on an informal basis, TQM puts considerable emphasis on formal data collection and performance measurement. Used to its full potential, benchmarking can provide a common measuring stick to evaluate process performance. This article introduces the general concept of benchmarking, linking it whenever possible to reference services in health sciences libraries. Data collection instruments that have potential application in benchmarking studies are discussed and the need to develop common measurement tools to facilitate benchmarking is emphasized.

  10. Supporting a Diverse Community of Undergraduate Researchers in Satellite and Ground-Based Remote Sensing

    Science.gov (United States)

    Blake, R.; Liou-Mark, J.

    2012-12-01

    The U.S. remains in grave danger of losing its global competitive edge in STEM. To find solutions to this problem, the Obama Administration proposed two new national initiatives: the Educate to Innovate Initiative and the $100 million government/private industry initiative to train 100,000 STEM teachers and graduate 1 million additional STEM students over the next decade. To assist in ameliorating the national STEM plight, the New York City College of Technology has designed its NSF Research Experience for Undergraduate (REU) program in satellite and ground-based remote sensing to target underrepresented minority students. Since the inception of the program in 2008, a total of 45 undergraduate students of which 38 (84%) are considered underrepresented minorities in STEM have finished or are continuing with their research or are pursuing their STEM endeavors. The program is comprised of the three primary components. The first component, Structured Learning Environments: Preparation and Mentorship, provides the REU Scholars with the skill sets necessary for proficiency in satellite and ground-based remote sensing research. The students are offered mini-courses in Geographic Information Systems, MATLAB, and Remote Sensing. They also participate in workshops on the Ethics of Research. Each REU student is a member of a team that consists of faculty mentors, post doctorate/graduate students, and high school students. The second component, Student Support and Safety Nets, provides undergraduates a learning environment that supports them in becoming successful researchers. Special networking and Brown Bag sessions, and an annual picnic with research scientists are organized so that REU Scholars are provided with opportunities to expand their professional community. Graduate school support is provided by offering free Graduate Record Examination preparation courses and workshops on the graduate school application process. Additionally, students are supported by college

  11. BigBOSS: The Ground-Based Stage IV BAO Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, David; Bebek, Chris; Heetderks, Henry; Ho, Shirley; Lampton, Michael; Levi, Michael; Mostek, Nick; Padmanabhan, Nikhil; Perlmutter, Saul; Roe, Natalie; Sholl, Michael; Smoot, George; White, Martin; Dey, Arjun; Abraham, Tony; Jannuzi, Buell; Joyce, Dick; Liang, Ming; Merrill, Mike; Olsen, Knut; Salim, Samir

    2009-04-01

    The BigBOSS experiment is a proposed DOE-NSF Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with an all-sky galaxy redshift survey. The project is designed to unlock the mystery of dark energy using existing ground-based facilities operated by NOAO. A new 4000-fiber R=5000 spectrograph covering a 3-degree diameter field will measure BAO and redshift space distortions in the distribution of galaxies and hydrogen gas spanning redshifts from 0.2 < z < 3.5. The Dark Energy Task Force figure of merit (DETF FoM) for this experiment is expected to be equal to that of a JDEM mission for BAO with the lower risk and cost typical of a ground-based experiment.

  12. Benchmarking HIV health care

    DEFF Research Database (Denmark)

    Podlekareva, Daria; Reekie, Joanne; Mocroft, Amanda

    2012-01-01

    ABSTRACT: BACKGROUND: State-of-the-art care involving the utilisation of multiple health care interventions is the basis for an optimal long-term clinical prognosis for HIV-patients. We evaluated health care for HIV-patients based on four key indicators. METHODS: Four indicators of health care we...... document pronounced regional differences in adherence to guidelines and can help to identify gaps and direct target interventions. It may serve as a tool for assessment and benchmarking the clinical management of HIV-patients in any setting worldwide....

  13. Benchmarking Cloud Storage Systems

    OpenAIRE

    Wang, Xing

    2014-01-01

    With the rise of cloud computing, many cloud storage systems like Dropbox, Google Drive and Mega have been built to provide decentralized and reliable file storage. It is thus of prime importance to know their features, performance, and the best way to make use of them. In this context, we introduce BenchCloud, a tool designed as part of this thesis to conveniently and efficiently benchmark any cloud storage system. First, we provide a study of six commonly-used cloud storage systems to ident...

  14. The COST Benchmark

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Tiesyte, Dalia; Tradisauskas, Nerius

    2006-01-01

    An infrastructure is emerging that enables the positioning of populations of on-line, mobile service users. In step with this, research in the management of moving objects has attracted substantial attention. In particular, quite a few proposals now exist for the indexing of moving objects...... takes into account that the available positions of the moving objects are inaccurate, an aspect largely ignored in previous indexing research. The concepts of data and query enlargement are introduced for addressing inaccuracy. As proof of concepts of the benchmark, the paper covers the application...

  15. Asteroseismology of solar-type stars with Kepler: III. Ground-based data

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Molenda-Żakowicz , J.

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by the Kepler Asteroseis...

  16. Status of advanced ground-based laser interferometers for gravitational-wave detection

    International Nuclear Information System (INIS)

    Dooley, K L; Akutsu, T; Dwyer, S; Puppo, P

    2015-01-01

    Ground-based laser interferometers for gravitational-wave (GW) detection were first constructed starting 20 years ago and as of 2010 collection of several years’ worth of science data at initial design sensitivities was completed. Upgrades to the initial detectors together with construction of brand new detectors are ongoing and feature advanced technologies to improve the sensitivity to GWs. This conference proceeding provides an overview of the common design features of ground-based laser interferometric GW detectors and establishes the context for the status updates of each of the four gravitational-wave detectors around the world: Advanced LIGO, Advanced Virgo, GEO 600 and KAGRA. (paper)

  17. Status of advanced ground-based laser interferometers for gravitational-wave detection

    Science.gov (United States)

    Dooley, K. L.; Akutsu, T.; Dwyer, S.; Puppo, P.

    2015-05-01

    Ground-based laser interferometers for gravitational-wave (GW) detection were first constructed starting 20 years ago and as of 2010 collection of several years’ worth of science data at initial design sensitivities was completed. Upgrades to the initial detectors together with construction of brand new detectors are ongoing and feature advanced technologies to improve the sensitivity to GWs. This conference proceeding provides an overview of the common design features of ground-based laser interferometric GW detectors and establishes the context for the status updates of each of the four gravitational-wave detectors around the world: Advanced LIGO, Advanced Virgo, GEO 600 and KAGRA.

  18. Benchmarking multimedia performance

    Science.gov (United States)

    Zandi, Ahmad; Sudharsanan, Subramania I.

    1998-03-01

    With the introduction of faster processors and special instruction sets tailored to multimedia, a number of exciting applications are now feasible on the desktop. Among these is DVD playback, consisting, among other things, of MPEG-2 video and Dolby Digital audio or MPEG-2 audio. Other multimedia applications such as video conferencing and speech recognition are also becoming popular on computer systems. In view of this tremendous interest in multimedia, a group of major computer companies has formed the Multimedia Benchmarks Committee as part of the Standard Performance Evaluation Corp. to address the performance issues of multimedia applications. The approach is multi-tiered, with three tiers of fidelity from minimal to fully compliant. In each case the fidelity of the bitstream reconstruction as well as the quality of the video or audio output are measured and the system is classified accordingly. At the next step the performance of the system is measured. In many multimedia applications, such as DVD playback, the application needs to run at a specific rate; in this case the measurement of the excess processing power makes all the difference. All these make a system-level, application-based multimedia benchmark very challenging. Several ideas and methodologies for each aspect of the problem will be presented and analyzed.

  19. Core Benchmarks Descriptions

    International Nuclear Information System (INIS)

    Pavlovichev, A.M.

    2001-01-01

    Current regulations require that the design of new fuel cycles for nuclear power installations include a calculational justification performed by certified computer codes. This guarantees that the obtained calculational results will be within the limits of the declared uncertainties indicated in a certificate, issued by the Gosatomnadzor of the Russian Federation (GAN), for the corresponding computer code. A formal justification of the declared uncertainties is the comparison of calculational results obtained by a commercial code with the results of experiments, or of calculational tests computed with a defined uncertainty by certified precision codes of the MCU type or others. The current level of international cooperation allows an enlargement of the bank of experimental and calculational benchmarks acceptable for certification of commercial codes used in the design of fuel loadings with MOX fuel. In particular, work is practically finished on forming the list of calculational benchmarks for certification of the TVS-M code as applied to MOX fuel assembly calculations. The results of these activities are presented.

  20. A benchmarking study

    Directory of Open Access Journals (Sweden)

    H. Groessing

    2015-02-01

    Full Text Available A benchmark study for permeability measurement is presented. Past studies by other research groups, which focused on the reproducibility of 1D permeability measurements, showed high standard deviations of the obtained permeability values (25 %), even though a defined test rig with required specifications was used. Within this study, the reproducibility of capacitive in-plane permeability testing system measurements was benchmarked by comparing results of two research sites using this technology. The reproducibility was compared using a glass fibre woven textile and a carbon fibre non-crimped fabric (NCF). These two material types were taken into consideration due to the different electrical properties of glass and carbon with respect to the dielectric capacitive sensors of the permeability measurement systems. In order to determine the unsaturated permeability characteristics as a function of fibre volume content, the measurements were executed at three different fibre volume contents, including five repetitions. It was found that the stability and reproducibility of the presented in-plane permeability measurement system is very good in the case of the glass fibre woven textiles. This is true for the comparison of the repetition measurements as well as for the comparison between the two different permeameters. These positive results were confirmed by a comparison to permeability values of the same textile obtained with an older-generation permeameter applying the same measurement technology. It was also shown that a correct determination of the grammage and the material density is crucial for a correct correlation of measured permeability values and fibre volume contents.

  1. Benchmarking Using Basic DBMS Operations

    Science.gov (United States)

    Crolotte, Alain; Ghazal, Ahmad

    The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, the TPC-H became less representative of industry trends as vendors keep tuning their database to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
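
The XMarq queries themselves are not given in the abstract, but a minimal harness in the same spirit, timing a scan, an aggregation, and an index access, might look like the sketch below. It uses SQLite for self-containment; the table layout, row counts and query mix are invented for illustration:

```python
import sqlite3
import time

def time_query(conn, sql, repeats=5):
    """Return the best-of-N wall-clock time (seconds) for one query."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        conn.execute(sql).fetchall()
        best = min(best, time.perf_counter() - t0)
    return best

# Hypothetical micro-benchmark table (TPC-H-flavoured names).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lineitem (id INTEGER PRIMARY KEY, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO lineitem (qty, price) VALUES (?, ?)",
    [(i % 50, float(i % 1000)) for i in range(100_000)],
)
conn.commit()

timings = {
    "scan":      time_query(conn, "SELECT COUNT(*) FROM lineitem WHERE price > 500"),
    "aggregate": time_query(conn, "SELECT qty, SUM(price) FROM lineitem GROUP BY qty"),
    "index":     time_query(conn, "SELECT * FROM lineitem WHERE id = 4242"),
}
```

Reporting each basic operation separately, rather than one blended score, is what lets such a framework compare software/hardware combinations without rewarding benchmark-specific tuning.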

  2. Ground-Based VIS/NIR Reflectance Spectra of 25143 Itokawa: What Hayabusa will See and How Ground-Based Data can Augment Analyses

    Science.gov (United States)

    Vilas, Faith; Abell, P. A.; Jarvis, K. S.

    2004-01-01

    Planning for the arrival of the Hayabusa spacecraft at asteroid 25143 Itokawa includes consideration of the expected spectral information to be obtained using the AMICA and NIRS instruments. The rotationally-resolved spatial coverage the asteroid we have obtained with ground-based telescopic spectrophotometry in the visible and near-infrared can be utilized here to address expected spacecraft data. We use spectrophotometry to simulate the types of data that Hayabusa will receive with the NIRS and AMICA instruments, and will demonstrate them here. The NIRS will cover a wavelength range from 0.85 m, and have a dispersion per element of 250 Angstroms. Thus, we are limited in coverage of the 1.0 micrometer and 2.0 micrometer mafic silicate absorption features. The ground-based reflectance spectra of Itokawa show a large component of olivine in its surface material, and the 2.0 micrometer feature is shallow. Determining the olivine to pyroxene abundance ratio is critically dependent on the attributes of the 1.0- and 2.0 micrometer features. With a cut-off near 2,1 micrometer the longer edge of the 2.0- feature will not be obtained by NIRS. Reflectance spectra obtained using ground-based telescopes can be used to determine the regional composition around space-based spectral observations, and possibly augment the longer wavelength spectral attributes. Similarly, the shorter wavelength end of the 1.0 micrometer absorption feature will be partially lost to the NIRS. The AMICA filters mimic the ECAS filters, and have wavelength coverage overlapping with the NIRS spectral range. We demonstrate how merging photometry from AMICA will extend the spectral coverage of the NIRS. Lessons learned from earlier spacecraft to asteroids should be considered.

  3. Benchmarking & European Sustainable Transport Policies

    DEFF Research Database (Denmark)

    Gudmundsson, H.

    2003-01-01

    , Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts to...... contribution to the discussions within the Eusponsored BEST Thematic Network (Benchmarking European Sustainable Transport) which ran from 2000 to 2003....

  4. Benchmarking in Czech Higher Education

    OpenAIRE

    Plaček Michal; Ochrana František; Půček Milan

    2015-01-01

    The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Base...

  5. Power reactor pressure vessel benchmarks

    International Nuclear Information System (INIS)

    Rahn, F.J.

    1978-01-01

    A review is given of the current status of experimental and calculational benchmarks for use in understanding the radiation embrittlement effects in the pressure vessels of operating light water power reactors. The requirements of such benchmarks for application to pressure vessel dosimetry are stated. Recent developments in active and passive neutron detectors sensitive in the ranges of importance to embrittlement studies are summarized and recommendations for improvements in the benchmark are made. (author)

  6. Take-off and Landing Using Ground Based Power - Landing Simulations Using Multibody Dynamics

    NARCIS (Netherlands)

    Wu, P.; Voskuijl, M.; Van Tooren, M.J.L.

    2014-01-01

    A novel take-off and landing system using ground-based power is proposed in the EU FP7 project GABRIEL. The proposed system has the potential benefit of reducing aircraft weight, emissions and noise. A preliminary investigation of the feasibility of the structural design of the connection mechanism

  7. ForestCrowns: a software tool for analyzing ground-based digital photographs of forest canopies

    Science.gov (United States)

    Matthew F. Winn; Sang-Mook Lee; Phillip A. Araman

    2013-01-01

    Canopy coverage is a key variable used to characterize forest structure. In addition, the light transmitted through the canopy is an important ecological indicator of plant and animal habitat and understory climate conditions. A common ground-based method used to document canopy coverage is to take digital photographs from below the canopy. To assist with analyzing...

  8. Estimating and validating ground-based timber harvesting production through computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating ground-based timber harvesting systems production with an object oriented methodology was investigated. The estimation model developed generates stands of trees, simulates chain saw, drive-to-tree feller-buncher, swing-to-tree single-grip harvester felling, and grapple skidder and forwarder extraction activities, and analyzes costs and productivity. It also...

  9. On reconciling ground-based with spaceborne normalized radar cross section measurements

    DEFF Research Database (Denmark)

    Baumgartner, Francois; Munk, Jens; Jezek, K C

    2002-01-01

    This study examines differences in the normalized radar cross section, derived from ground-based versus spaceborne radar data. A simple homogeneous half-space model, indicates that agreement between the two improves as 1) the distance from the scatterer is increased; and/or 2) the extinction...

  10. Validation of the CrIS fast physical NH3 retrieval with ground-based FTIR

    NARCIS (Netherlands)

    Dammers, E.; Shephard, M.W.; Palm, M.; Cady-Pereira, K.; Capps, S.; Lutsch, E.; Strong, K.; Hannigan, J.W.; Ortega, I.; Toon, G.C.; Stremme, W.; Grutter, M.; Jones, N.; Smale, D.; Siemons, J.; Hrpcek, K.; Tremblay, D.; Schaap, M.; Notholt, J.; Willem Erisman, J.

    2017-01-01

    Presented here is the validation of the CrIS (Cross-track Infrared Sounder) fast physical NH3 retrieval (CFPR) column and profile measurements using ground-based Fourier transform infrared (FTIR) observations. We use the total columns and profiles from seven FTIR sites in the Network for the

  11. A cost-performance model for ground-based optical communications receiving telescopes

    Science.gov (United States)

    Lesh, J. R.; Robinson, D. L.

    1986-01-01

    An analytical cost-performance model for a ground-based optical communications receiving telescope is presented. The model considers costs of existing telescopes as a function of diameter and field of view. This, coupled with communication performance as a function of receiver diameter and field of view, yields the appropriate telescope cost versus communication performance curve.
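
    The cost-performance trade described above can be made concrete with a small sketch. The exponents and coefficients below are illustrative assumptions, not the values fitted by Lesh and Robinson; a roughly D^2.7 cost scaling with primary diameter is commonly quoted for ground-based telescopes, and received optical power scales with collecting area (~D^2).

```python
def telescope_cost(diameter_m, fov_deg, c0=1.0, cost_exp=2.7, fov_exp=0.5):
    """Hypothetical cost model: a power law in aperture diameter, inflated
    by field of view (wider fields need larger, better-corrected optics).
    All constants here are illustrative assumptions."""
    return c0 * diameter_m ** cost_exp * (1.0 + fov_deg) ** fov_exp

def link_performance(diameter_m):
    """Received signal power grows with collecting area, i.e. ~ D^2."""
    return diameter_m ** 2

# Trace out a cost-versus-performance curve over candidate diameters.
curve = [(telescope_cost(d, fov_deg=0.1), link_performance(d))
         for d in (1.0, 2.0, 4.0, 8.0)]
```

    Sweeping the diameter like this yields the kind of telescope cost versus communication performance curve the abstract refers to, from which a cost-optimal receiver aperture can be read off for a required link margin.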

  12. Retrieval of liquid water cloud properties from ground-based remote sensing observations

    NARCIS (Netherlands)

    Knist, C.L.

    2014-01-01

    Accurate ground-based remotely sensed microphysical and optical properties of liquid water clouds are essential references to validate satellite-observed cloud properties and to improve cloud parameterizations in weather and climate models. This requires the evaluation of algorithms for retrieval of

  13. Modern developments for ground-based monitoring of fire behavior and effects

    Science.gov (United States)

    Colin C. Hardy; Robert Kremens; Matthew B. Dickinson

    2010-01-01

    Advances in electronic technology over the last several decades have been staggering. The cost of electronics continues to decrease while system performance increases seemingly without limit. We have applied modern techniques in sensors, electronics and instrumentation to create a suite of ground based diagnostics that can be used in laboratory (~ 1 m2), field scale...

  14. Submillimetric motion detection with a 94 GHz ground based synthetic aperture radar

    OpenAIRE

    Martinez Cervera, Arturo; Lort Cuenca, Marc; Aguasca Solé, Alberto; Broquetas Ibars, Antoni

    2015-01-01

    The paper presents the validation and experimental assessment of a 94 GHz (W-Band) CW-FM Radar that can be configured as a Ground Based SAR for high resolution imaging and interferometry. Several experimental campaigns have been carried out to assess the capability of the system to remotely observe submillimetric deformation and vibration in infrastructures.

  15. Ground-based forest harvesting effects on soil physical properties and Douglas-fir growth.

    Science.gov (United States)

    Adrian Ares; Thomas A. Terry; Richard E. Miller; Harry W. Anderson; Barry L. Flaming

    2005-01-01

    Soil properties and forest productivity can be affected by heavy equipment used for harvest and site preparation but these impacts vary greatly with site conditions and operational practices. We assessed the effects of ground-based logging on soil physical properties and subsequent Douglas-fir [Pseudotsuga menziesii (Mirb) Franco] growth on a highly...

  16. Overview of Boundary Layer Clouds Using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Xi, B.; Dong, X.; Wu, P.; Qiu, S.

    2017-12-01

    A comprehensive summary of boundary layer cloud properties based on several of our recent studies will be presented. The analyses include: global cloud fractions and cloud macro-/microphysical properties derived from satellite measurements using both CERES-MODIS and CloudSat/CALIPSO data products; the annual/seasonal/diurnal variations of stratocumulus clouds over different climate regions (mid-latitude land, mid-latitude ocean, and the Arctic) using DOE ARM ground-based measurements at the Southern Great Plains (SGP), Azores (GRW), and North Slope of Alaska (NSA) sites; the impact of environmental conditions on the formation and dissipation of marine boundary layer clouds over the Azores site; and the characterization of Arctic mixed-phase cloud structure and the environmental conditions favorable for the formation and maintenance of mixed-phase clouds over the NSA site. Although these topics are wide-ranging, we will focus on the representativeness of the ground-based measurements over different climate regions, the evaluation of satellite-retrieved cloud properties against these ground-based measurements, and an understanding of the uncertainties in both the satellite and ground-based retrievals and measurements.

  17. MOx Depletion Calculation Benchmark

    International Nuclear Information System (INIS)

    San Felice, Laurence; Eschbach, Romain; Dewi Syarifah, Ratna; Maryam, Seif-Eddine; Hesketh, Kevin

    2016-01-01

    Under the auspices of the NEA Nuclear Science Committee (NSC), the Working Party on Scientific Issues of Reactor Systems (WPRS) has been established to study the reactor physics, fuel performance, radiation transport and shielding, and the uncertainties associated with modelling of these phenomena in present and future nuclear power systems. The WPRS has different expert groups to cover a wide range of scientific issues in these fields. The Expert Group on Reactor Physics and Advanced Nuclear Systems (EGRPANS) was created in 2011 to perform specific tasks associated with reactor physics aspects of present and future nuclear power systems. EGRPANS provides expert advice to the WPRS and the nuclear community on the development needs (data and methods, validation experiments, scenario studies) for different reactor systems and also provides specific technical information regarding: core reactivity characteristics, including fuel depletion effects; core power/flux distributions; and core dynamics and reactivity control. In 2013 EGRPANS published a report that investigated fuel depletion effects in a Pressurised Water Reactor (PWR), entitled 'International Comparison of a Depletion Calculation Benchmark on Fuel Cycle Issues' NEA/NSC/DOC(2013), which documented a benchmark exercise for UO2 fuel rods. This report documents a complementary benchmark exercise that focused on PuO2/UO2 Mixed Oxide (MOX) fuel rods. The results are especially relevant to the back-end of the fuel cycle, including irradiated fuel transport, reprocessing, interim storage and waste repository. Saint-Laurent B1 (SLB1) was the first French reactor to use MOX assemblies. SLB1 is a 900 MWe PWR with 30% MOX fuel loading. The standard MOX assemblies used in the Saint-Laurent B1 reactor include three zones with different plutonium enrichments: high Pu content (5.64%) in the center zone, medium Pu content (4.42%) in the intermediate zone and low Pu content (2.91%) in the peripheral zone.

  18. Benchmarking Academic Anatomic Pathologists

    Directory of Open Access Journals (Sweden)

    Barbara S. Ducatman MD

    2016-10-01

    The most common benchmarks for faculty productivity are derived from Medical Group Management Association (MGMA) or Vizient-AAMC Faculty Practice Solutions Center® (FPSC) databases. The Association of Pathology Chairs has also collected similar survey data for several years. We examined the Association of Pathology Chairs annual faculty productivity data and compared it with MGMA and FPSC data to understand the value, inherent flaws, and limitations of benchmarking data. We hypothesized that the variability in calculated faculty productivity is due to the type of practice model and clinical effort allocation. Data from the Association of Pathology Chairs survey on 629 surgical pathologists and/or anatomic pathologists from 51 programs were analyzed. From review of service assignments, we were able to assign each pathologist to a specific practice model: general anatomic pathologists/surgical pathologists, 1 or more subspecialties, or a hybrid of the 2 models. There were statistically significant differences among academic ranks and practice types. When we analyzed our data using each organization’s methods, the median results for the anatomic pathologists/surgical pathologists general practice model compared to MGMA and FPSC results for anatomic and/or surgical pathology were quite close. Both MGMA and FPSC data exclude a significant proportion of academic pathologists with clinical duties. We used the more inclusive FPSC definition of clinical “full-time faculty” (0.60 clinical full-time equivalent and above). The correlation between clinical full-time equivalent effort allocation, annual days on service, and annual work relative value unit productivity was poor. This study demonstrates that effort allocations are variable across academic departments of pathology and do not correlate well with either work relative value unit effort or reported days on service. Although the Association of Pathology Chairs–reported median work relative

  19. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  20. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  1. Shielding benchmark test

    International Nuclear Information System (INIS)

    Kawai, Masayoshi

    1984-01-01

    Iron data in JENDL-2 have been tested by analyzing shielding benchmark experiments on neutron transmission through iron blocks, performed at KFK using a Cf-252 neutron source and at ORNL using a collimated neutron beam from a reactor. The analyses are made with RADHEAT-V4, a shielding analysis code system developed at JAERI. The calculated results are compared with the measured data. For the KFK experiments, the C/E values are about 1.1. For the ORNL experiments, the calculated values agree with the measured data within an accuracy of 33% for the off-center geometry. The d-t neutron transmission measurements through a carbon sphere made at LLNL are also analyzed preliminarily using the revised JENDL data for fusion neutronics calculations. (author)

  2. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems. The effects of primary cosmic ray particles, and of secondary particle showers produced by nuclear reactions with the atmosphere, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Finally, accumulation of radiation dose from both primary cosmic rays and secondary cosmic-ray-induced particle showers is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO). In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems, as well as on human health, and the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in ground-based, atmospheric flight, and space flight environments. Ground test methods applied to microelectronic components and systems are used in combination with radiation transport and reaction codes to predict the performance of microelectronic systems in their operating environments. Similar radiation transport...

  3. Benchmarking monthly homogenization algorithms

    Science.gov (United States)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data
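
    The break-insertion scheme described above can be sketched in a few lines. This is not the HOME benchmark code: the break probability and size spread are illustrative assumptions, and a per-month Bernoulli draw is used as the standard discrete approximation of a Poisson process.

```python
import random

def insert_breaks(series, monthly_break_prob=0.005, size_sigma=0.8, seed=1):
    """Add step-like inhomogeneities to a homogeneous monthly series.

    Breakpoints occur via a Bernoulli draw each month (a discrete
    approximation of a Poisson process) and each break size is drawn
    from a normal distribution, mirroring the benchmark design above.
    """
    rng = random.Random(seed)
    out, offset = [], 0.0
    for value in series:
        if rng.random() < monthly_break_prob:
            offset += rng.gauss(0.0, size_sigma)  # normally distributed size
        out.append(value + offset)
    return out

def centered_rmse(candidate, truth):
    """Centered RMSE: compare anomalies, so a constant offset between a
    homogenized series and the truth is not penalized."""
    mc = sum(candidate) / len(candidate)
    mt = sum(truth) / len(truth)
    return (sum(((c - mc) - (t - mt)) ** 2
                for c, t in zip(candidate, truth)) / len(truth)) ** 0.5
```

    A homogenization algorithm would take the broken series as input; scoring its output against the withheld truth with `centered_rmse` is one of the performance metrics listed in the abstract.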

  4. Benchmarking foreign electronics technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bostian, C.W.; Hodges, D.A.; Leachman, R.C.; Sheridan, T.B.; Tsang, W.T.; White, R.M.

    1994-12-01

    This report has been drafted in response to a request from the Japanese Technology Evaluation Center's (JTEC) Panel on Benchmarking Select Technologies. Since April 1991, the Competitive Semiconductor Manufacturing (CSM) Program at the University of California at Berkeley has been engaged in a detailed study of quality, productivity, and competitiveness in semiconductor manufacturing worldwide. The program is a joint activity of the College of Engineering, the Haas School of Business, and the Berkeley Roundtable on the International Economy, under sponsorship of the Alfred P. Sloan Foundation, and with the cooperation of semiconductor producers from Asia, Europe and the United States. Professors David A. Hodges and Robert C. Leachman are the project's Co-Directors. The present report for JTEC is primarily based on data and analysis drawn from that continuing program. The CSM program is being conducted by faculty, graduate students and research staff from UC Berkeley's Schools of Engineering and Business, and Department of Economics. Many of the participating firms are represented on the program's Industry Advisory Board. The Board played an important role in defining the research agenda. A pilot study was conducted in 1991 with the cooperation of three semiconductor plants. The research plan and survey documents were thereby refined. The main phase of the CSM benchmarking study began in mid-1992 and will continue at least through 1997. Reports are presented on the manufacture of integrated circuits; data storage; wireless technology; human-machine interfaces; and optoelectronics. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  5. SSI and structural benchmarks

    International Nuclear Information System (INIS)

    Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.

    1987-01-01

    This paper presents the latest results of the ongoing program entitled, Standard Problems for Structural Computer Codes, currently being worked on at BNL for the USNRC, Office of Nuclear Regulatory Research. During FY 1986, efforts were focussed on three tasks, namely, (1) an investigation of ground water effects on the response of Category I structures, (2) the Soil-Structure Interaction Workshop and (3) studies on structural benchmarks associated with Category I structures. The objective of the studies on ground water effects is to verify the applicability and the limitations of the SSI methods currently used by the industry in performing seismic evaluations of nuclear plants which are located at sites with high water tables. In a previous study by BNL (NUREG/CR-4588), it has been concluded that the pore water can influence significantly the soil-structure interaction process. This result, however, is based on the assumption of fully saturated soil profiles. Consequently, the work was further extended to include cases associated with variable water table depths. In this paper, results related to cut-off depths beyond which the pore water effects can be ignored in seismic calculations, are addressed. Comprehensive numerical data are given for soil configurations typical to those encountered in nuclear plant sites. These data were generated by using a modified version of the SLAM code which is capable of handling problems related to the dynamic response of saturated soils. Further, the paper presents some key aspects of the Soil-Structure Interaction Workshop (NUREG/CP-0054) which was held in Bethesda, MD on June 1, 1986. Finally, recent efforts related to the task on the structural benchmarks are described

  6. Ground-Based Global Navigation Satellite System (GNSS) GLONASS Broadcast Ephemeris Data (hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) GLObal NAvigation Satellite System (GLONASS) Broadcast Ephemeris Data (hourly files)...

  7. Loss of signal transduction and inhibition of lymphocyte locomotion in a ground-based model of microgravity

    Science.gov (United States)

    Sundaresan, Alamelu; Risin, Diana; Pellis, Neal R.; McIntire, L. V. (Principal Investigator)

    2002-01-01

    Inflammatory adherence to, and locomotion through, the interstitium is an important component of the immune response. Conditions such as microgravity and modeled microgravity (MMG) severely inhibit lymphocyte locomotion in vitro through gelled type I collagen. We used the NASA rotating wall vessel bioreactor, or slow-turning lateral vessel, as a prototype for MMG in ground-based experiments. Previous experiments from our laboratory revealed that when lymphocytes (human peripheral blood mononuclear cells [PBMCs]) were first activated with phytohemagglutinin followed by exposure to MMG, locomotory capacity was not affected. In the present study, MMG inhibits lymphocyte locomotion in a manner similar to that observed in microgravity. Phorbol myristate acetate (PMA) treatment of PBMCs restored lost locomotory capacity by a maximum of 87%. Augmentation of cellular calcium flux with ionomycin had no restorative effect. Treatment of lymphocytes with mitomycin C prior to exposure to MMG, followed by PMA, restored locomotion to the same extent as when nonmitomycin C-treated lymphocytes were exposed to MMG (80-87%), suggesting that deoxyribonucleic acid replication is not essential for the restoration of locomotion. Thus, direct activation of protein kinase C (PKC) with PMA was effective in restoring locomotion in MMG to levels comparable to those seen in 1g cultures. Therefore, in MMG, lymphocyte calcium signaling pathways were functional, with defects occurring either at the level of PKC or upstream of PKC.

  8. Review for session K - benchmarks

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1980-01-01

    Eight of the papers to be considered in Session K are directly concerned, at least in part, with the Pool Critical Assembly (P.C.A.) benchmark at Oak Ridge. The remaining seven papers in this session, the subject of this review, are concerned with a variety of topics related to the general theme of Benchmarks and will be considered individually

  9. Internal Benchmarking for Institutional Effectiveness

    Science.gov (United States)

    Ronco, Sharron L.

    2012-01-01

    Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multicampuses or a…

  10. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs of the original series. We show that the widely used variants of the Denton (1971) method and the growth
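
    For context, the simplest benchmarking approach that the movement-preservation principle improves upon is pro-rata adjustment, sketched below under the assumption of quarterly data with annual benchmarks. This is not the method proposed in the article; it is the baseline, and it also shows why sign-volatile series are problematic: a year's quarters can sum to zero, making the scaling factor undefined.

```python
def pro_rata_benchmark(quarterly, annual_totals):
    """Scale each year's four quarters so they sum to the annual benchmark.

    Unlike the Denton method, which minimizes the change in period-to-period
    movement, pro-rata scaling can introduce an artificial step between Q4
    and the following Q1, and it fails outright when a year's quarters sum
    to zero (as can happen for sign-volatile series).
    """
    out = []
    for year, total in enumerate(annual_totals):
        quarters = quarterly[4 * year: 4 * year + 4]
        factor = total / sum(quarters)  # undefined if the quarters sum to 0
        out.extend(q * factor for q in quarters)
    return out

# Example: force [1, 1, 1, 1, 2, 2, 2, 2] to annual totals [8, 4].
adjusted = pro_rata_benchmark([1, 1, 1, 1, 2, 2, 2, 2], [8, 4])
```

    Note the discontinuity the example creates: the adjusted series jumps from 2.0 down to 1.0 at the year boundary even though the original series moved upward there.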

  11. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p...

  12. EPA's Benchmark Dose Modeling Software

    Science.gov (United States)

    The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods to EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...
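
    To illustrate the benchmark dose concept (this sketch is not BMDS code): for a quantal one-hit model P(d) = 1 - exp(-b*d) with zero background risk, extra risk equals P(d), so the dose at which extra risk reaches a chosen benchmark response (BMR) has a closed form. BMDS itself fits a suite of models and reports a lower confidence bound (the BMDL) as the point of departure, which this sketch omits; the slope value used below is hypothetical.

```python
import math

def bmd_one_hit(slope_b, bmr=0.10):
    """Benchmark dose for the one-hit model P(d) = 1 - exp(-b * d).

    With background risk P(0) = 0, extra risk equals P(d), so solving
    P(BMD) = BMR gives BMD = -ln(1 - BMR) / b.
    """
    return -math.log(1.0 - bmr) / slope_b

# A hypothetical fitted slope of 0.02 per mg/kg-day at a 10% BMR:
dose = bmd_one_hit(0.02, bmr=0.10)  # about 5.27 mg/kg-day
```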

  13. Benchmark for Strategic Performance Improvement.

    Science.gov (United States)

    Gohlke, Annette

    1997-01-01

    Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)

  14. Benchmarking: A Process for Improvement.

    Science.gov (United States)

    Peischl, Thomas M.

    One problem with the outcome-based measures used in higher education is that they measure quantity but not quality. Benchmarking, or the use of some external standard of quality to measure tasks, processes, and outputs, is partially solving that difficulty. Benchmarking allows for the establishment of a systematic process to indicate if outputs…

  15. Benchmark job – Watch out!

    CERN Multimedia

    Staff Association

    2017-01-01

    On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...

  16. Benchmarking: applications to transfusion medicine.

    Science.gov (United States)

    Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M

    2012-10-01

    Benchmarking is a structured, continuous, collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking, focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institution-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Benchmarking school nursing practice: the North West Regional Benchmarking Group

    OpenAIRE

    Littler, Nadine; Mullen, Margaret; Beckett, Helen; Freshney, Alice; Pinder, Lynn

    2016-01-01

    It is essential that the quality of care is reviewed regularly through robust processes such as benchmarking to ensure all outcomes and resources are evidence-based so that children and young people’s needs are met effectively. This article provides an example of the use of benchmarking in school nursing practice. Benchmarking has been defined as a process for finding, adapting and applying best practices (Camp, 1994). This concept was first adopted in the 1970s ‘from industry where it was us...

  18. Tip-tilt compensation: Resolution limits for ground-based telescopes using laser guide star adaptive optics

    International Nuclear Information System (INIS)

    Olivier, S.S.; Max, C.E.; Gavel, D.T.; Brase, J.M.

    1992-01-01

The angular resolution of long-exposure images from ground-based telescopes equipped with laser guide star adaptive optics systems is fundamentally limited by the accuracy with which the tip-tilt aberrations introduced by the atmosphere can be corrected. Assuming that a natural star is used as the tilt reference, the residual error due to tilt anisoplanatism can significantly degrade the long-exposure resolution even if the tilt reference star is separated from the object being imaged by a small angle. Given the observed distribution of stars in the sky, the need to find a tilt reference star quite close to the object restricts the fraction of the sky over which long-exposure images with diffraction limited resolution can be obtained. In this paper, the authors present a comprehensive performance analysis of tip-tilt compensation systems that use a natural star as a tilt reference, taking into account properties of the atmosphere and of the Galactic stellar populations, and optimizing over the system operating parameters to determine the fundamental limits to the long-exposure resolution. Their results show that for a ten meter telescope on Mauna Kea, if the image of the tilt reference star is uncorrected, about half the sky can be imaged in the V band with long-exposure resolution less than 60 milli-arc-seconds (mas), while if the image of the tilt reference star is fully corrected, about half the sky can be imaged in the V band with long-exposure resolution less than 16 mas. Furthermore, V band images with long-exposure resolution of less than 16 mas may be obtained with a ten meter telescope on Mauna Kea for unresolved objects brighter than magnitude 22 that are fully corrected by a laser guide star adaptive optics system. This level of resolution represents about 70% of the diffraction limit of a ten meter telescope in the V band and is more than a factor of 45 better than the median seeing in the V band on Mauna Kea.

  19. Space debris removal using a high-power ground-based laser

    Energy Technology Data Exchange (ETDEWEB)

    Monroe, D.K.

    1993-12-31

The feasibility and practicality of using a ground-based laser (GBL) to remove artificial space debris are examined. Physical constraints indicate that a reactor-pumped laser (RPL) may be best suited for this mission, because of its capabilities for multimegawatt output, long run-times, and near-diffraction-limited initial beams. Simulations of a laser-powered debris removal system indicate that a 5-MW RPL with a 10-meter-diameter beam director and adaptive optics capabilities can deorbit 1-kg debris from space station altitudes. Larger debris can be deorbited or transferred to safer orbits after multiple laser engagements. A ground-based laser system may be the only realistic way to access and remove some 10,000 separate objects, having velocities in the neighborhood of 7 km/sec, and being spatially distributed over some 10^10 km^3 of space.

  20. Benchmarking Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jakic, I.

    2016-01-01

One of the main tasks an owner has is to keep the business competitive on the market while delivering its product. Being the owner of a nuclear power plant bears the same (or even more complex and stern) responsibility due to safety risks and costs. In the past, nuclear power plant managements could (partly) ignore profit, or it was simply expected and to some degree assured through the various regulatory processes governing electricity rate design. It is obvious now that, with deregulation, utility privatization and a competitive electricity market, key measures of success used at nuclear power plants must include traditional metrics of a successful business (return on investment, earnings and revenue generation) as well as those of plant performance, safety and reliability. In order to analyze the business performance of a (specific) nuclear power plant, benchmarking, as a well-established concept and common method, was used. The domain was conservatively designed, with a well-adjusted framework, but the results still have limited application due to many differences, gaps and uncertainties. (author).

  1. Virtual machine performance benchmarking.

    Science.gov (United States)

    Langer, Steve G; French, Todd

    2011-10-01

    The attractions of virtual computing are many: reduced costs, reduced resources and simplified maintenance. Any one of these would be compelling for a medical imaging professional attempting to support a complex practice on limited resources in an era of ever tightened reimbursement. In particular, the ability to run multiple operating systems optimized for different tasks (computational image processing on Linux versus office tasks on Microsoft operating systems) on a single physical machine is compelling. However, there are also potential drawbacks. High performance requirements need to be carefully considered if they are to be executed in an environment where the running software has to execute through multiple layers of device drivers before reaching the real disk or network interface. Our lab has attempted to gain insight into the impact of virtualization on performance by benchmarking the following metrics on both physical and virtual platforms: local memory and disk bandwidth, network bandwidth, and integer and floating point performance. The virtual performance metrics are compared to baseline performance on "bare metal." The results are complex, and indeed somewhat surprising.
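The CPU-bound metrics mentioned above (integer and floating-point performance) can be timed with a minimal harness. The sketch below is illustrative only; the workloads, iteration counts, and best-of-N repetition strategy are invented here, not taken from the authors' benchmark suite.

```python
import time

def time_op(fn, n_iter=5):
    """Return the best wall-clock time over n_iter runs (seconds)."""
    best = float("inf")
    for _ in range(n_iter):
        t0 = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - t0)
    return best

def integer_work():
    # Integer workload: sum the first 10^6 integers in a Python loop.
    s = 0
    for i in range(1_000_000):
        s += i
    return s

def float_work():
    # Floating-point workload: repeated multiply-add.
    x = 1.0
    for _ in range(1_000_000):
        x = x * 1.0000001 + 1e-9
    return x

print(f"integer loop: {time_op(integer_work):.4f} s")
print(f"float loop:   {time_op(float_work):.4f} s")
```

Running the same harness on bare metal and inside a guest, then comparing the ratios, gives a rough view of virtualization overhead for CPU-bound work; disk and network bandwidth would need separate I/O-bound workloads.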

  2. AER benchmark specification sheet

    International Nuclear Information System (INIS)

    Aszodi, A.; Toth, S.

    2009-01-01

    In the VVER-440/213 type reactors, the core outlet temperature field is monitored with in-core thermocouples, which are installed above 210 fuel assemblies. These measured temperatures are used in determination of the fuel assembly powers and they have important role in the reactor power limitation. For these reasons, correct interpretation of the thermocouple signals is an important question. In order to interpret the signals in correct way, knowledge of the coolant mixing in the assembly heads is necessary. Computational Fluid Dynamics (CFD) codes and experiments can help to understand better these mixing processes and they can provide information which can support the more adequate interpretation of the thermocouple signals. This benchmark deals with the 3D CFD modeling of the coolant mixing in the heads of the profiled fuel assemblies with 12.2 mm rod pitch. Two assemblies of the 23rd cycle of the Paks NPP's Unit 3 are investigated. One of them has symmetrical pin power profile and another possesses inclined profile. (authors)

  3. AER Benchmark Specification Sheet

    International Nuclear Information System (INIS)

    Aszodi, A.; Toth, S.

    2009-01-01

    In the WWER-440/213 type reactors, the core outlet temperature field is monitored with in-core thermocouples, which are installed above 210 fuel assemblies. These measured temperatures are used in determination of the fuel assembly powers and they have important role in the reactor power limitation. For these reasons, correct interpretation of the thermocouple signals is an important question. In order to interpret the signals in correct way, knowledge of the coolant mixing in the assembly heads is necessary. Computational fluid dynamics codes and experiments can help to understand better these mixing processes and they can provide information which can support the more adequate interpretation of the thermocouple signals. This benchmark deals with the 3D computational fluid dynamics modeling of the coolant mixing in the heads of the profiled fuel assemblies with 12.2 mm rod pitch. Two assemblies of the twenty third cycle of the Paks NPPs Unit 3 are investigated. One of them has symmetrical pin power profile and another possesses inclined profile. (Authors)

  4. Benchmarking biofuels; Biobrandstoffen benchmarken

    Energy Technology Data Exchange (ETDEWEB)

    Croezen, H.; Kampman, B.; Bergsma, G.

    2012-03-15

A sustainability benchmark for transport biofuels has been developed and used to evaluate the various biofuels currently on the market. For comparison, electric vehicles, hydrogen vehicles and petrol/diesel vehicles were also included. A range of studies as well as growing insight are making it ever clearer that biomass-based transport fuels may have just as big a carbon footprint as fossil fuels like petrol or diesel, or even bigger. At the request of Greenpeace Netherlands, CE Delft has brought together current understanding on the sustainability of fossil fuels, biofuels and electric vehicles, with particular focus on the performance of the respective energy carriers on three sustainability criteria, with the first weighing the heaviest: (1) Greenhouse gas emissions; (2) Land use; and (3) Nutrient consumption [Dutch] Greenpeace Netherlands asked CE Delft to design a sustainability benchmark for transport biofuels and to score the various biofuels against it. For comparison, driving on electricity, hydrogen, and petrol or diesel were also included. Research and growing insight increasingly show that biomass-based transport fuels sometimes cause just as many, or even more, greenhouse gas emissions than fossil fuels such as petrol and diesel. For Greenpeace Netherlands, CE Delft has summarized the current insights into the sustainability of fossil fuels, biofuels and electric driving. The fuels were assessed on three sustainability criteria, with greenhouse gas emissions weighing heaviest: (1) Greenhouse gas emissions; (2) Land use; and (3) Nutrient use.

  5. Benchmarking in academic pharmacy departments.

    Science.gov (United States)

    Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann

    2010-10-11

This paper discusses benchmarking in academic pharmacy and offers recommendations for potential uses of benchmarking in academic pharmacy departments. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.

  6. Informing hydrological models with ground-based time-lapse relative gravimetry: potential and limitations

    DEFF Research Database (Denmark)

    Bauer-Gottwein, Peter; Christiansen, Lars; Rosbjerg, Dan

    2011-01-01

Coupled hydrogeophysical inversion emerges as an attractive option to improve the calibration and predictive capability of hydrological models. Recently, ground-based time-lapse relative gravity (TLRG) measurements have attracted increasing interest because there is a direct relationship between ... parameter uncertainty decreased significantly when TLRG data were included in the inversion. The forced infiltration experiment caused changes in unsaturated zone storage, which were monitored using TLRG and ground-penetrating radar. A numerical unsaturated zone model was subsequently conditioned on both ...

  7. (DCT-FY08) Target Detection Using Multiple Modality Airborne and Ground Based Sensors

    Science.gov (United States)

    2013-03-01

... resolution SIFT grids in metric-topological SLAM,” in Proc. of the IEEE Conference on Computer Vision and Pattern Recognition, 2009. [4] M. Bosse and R. ... “single camera SLAM,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 29, no. 6, pp. 1052–1067, 2007. [7] D. Nister, O. Naroditsky, and J. Bergen ... “segmentation with ground-based and airborne LIDAR range data,” in Proceedings of the Fourth International Symposium on 3D Data Processing

  8. The SPARC water vapor assessment II: intercomparison of satellite and ground-based microwave measurements

    Science.gov (United States)

    Nedoluha, Gerald E.; Kiefer, Michael; Lossow, Stefan; Gomez, R. Michael; Kämpfer, Niklaus; Lainer, Martin; Forkman, Peter; Christensen, Ole Martin; Oh, Jung Jin; Hartogh, Paul; Anderson, John; Bramstedt, Klaus; Dinelli, Bianca M.; Garcia-Comas, Maya; Hervig, Mark; Murtagh, Donal; Raspollini, Piera; Read, William G.; Rosenlof, Karen; Stiller, Gabriele P.; Walker, Kaley A.

    2017-12-01

    As part of the second SPARC (Stratosphere-troposphere Processes And their Role in Climate) water vapor assessment (WAVAS-II), we present measurements taken from or coincident with seven sites from which ground-based microwave instruments measure water vapor in the middle atmosphere. Six of the ground-based instruments are part of the Network for the Detection of Atmospheric Composition Change (NDACC) and provide datasets that can be used for drift and trend assessment. We compare measurements from these ground-based instruments with satellite datasets that have provided retrievals of water vapor in the lower mesosphere over extended periods since 1996. We first compare biases between the satellite and ground-based instruments from the upper stratosphere to the upper mesosphere. We then show a number of time series comparisons at 0.46 hPa, a level that is sensitive to changes in H2O and CH4 entering the stratosphere but, because almost all CH4 has been oxidized, is relatively insensitive to dynamical variations. Interannual variations and drifts are investigated with respect to both the Aura Microwave Limb Sounder (MLS; from 2004 onwards) and each instrument's climatological mean. We find that the variation in the interannual difference in the mean H2O measured by any two instruments is typically ˜ 1%. Most of the datasets start in or after 2004 and show annual increases in H2O of 0-1 % yr-1. In particular, MLS shows a trend of between 0.5 % yr-1 and 0.7 % yr-1 at the comparison sites. However, the two longest measurement datasets used here, with measurements back to 1996, show much smaller trends of +0.1 % yr-1 (at Mauna Loa, Hawaii) and -0.1 % yr-1 (at Lauder, New Zealand).
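Trends quoted in % yr-1, as above, are commonly derived from a log-linear fit to the time series. A minimal sketch of that calculation follows; the data below are synthetic, not the WAVAS-II measurements.

```python
import math

def percent_trend_per_year(years, values):
    """Ordinary least-squares slope of ln(value) vs. year, in % per year."""
    n = len(years)
    mean_y = sum(years) / n
    mean_l = sum(math.log(v) for v in values) / n
    num = sum((y - mean_y) * (math.log(v) - mean_l)
              for y, v in zip(years, values))
    den = sum((y - mean_y) ** 2 for y in years)
    return 100.0 * num / den

# Synthetic mixing-ratio series growing 0.5 % per year (illustrative only)
years = list(range(2004, 2017))
values = [5.0 * 1.005 ** (y - 2004) for y in years]
print(round(percent_trend_per_year(years, values), 4))  # ~0.4988, i.e. 100*ln(1.005)
```

Fitting in log space makes the slope a fractional (percentage) change per year, which is why instrument drifts and trends can be compared directly across sites with different absolute mixing ratios.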

  9. Testing a ground-based canopy model using the wind river canopy crane

    Science.gov (United States)

    Robert Van Pelt; Malcolm P. North

    1999-01-01

    A ground-based canopy model that estimates the volume of occupied space in forest canopies was tested using the Wind River Canopy Crane. A total of 126 trees in a 0.25 ha area were measured from the ground and directly from a gondola suspended from the crane. The trees were located in a low elevation, old-growth forest in the southern Washington Cascades. The ground-...

  10. The SPARC water vapor assessment II: intercomparison of satellite and ground-based microwave measurements

    Directory of Open Access Journals (Sweden)

    G. E. Nedoluha

    2017-12-01

Full Text Available As part of the second SPARC (Stratosphere–troposphere Processes And their Role in Climate) water vapor assessment (WAVAS-II), we present measurements taken from or coincident with seven sites from which ground-based microwave instruments measure water vapor in the middle atmosphere. Six of the ground-based instruments are part of the Network for the Detection of Atmospheric Composition Change (NDACC) and provide datasets that can be used for drift and trend assessment. We compare measurements from these ground-based instruments with satellite datasets that have provided retrievals of water vapor in the lower mesosphere over extended periods since 1996. We first compare biases between the satellite and ground-based instruments from the upper stratosphere to the upper mesosphere. We then show a number of time series comparisons at 0.46 hPa, a level that is sensitive to changes in H2O and CH4 entering the stratosphere but, because almost all CH4 has been oxidized, is relatively insensitive to dynamical variations. Interannual variations and drifts are investigated with respect to both the Aura Microwave Limb Sounder (MLS; from 2004 onwards) and each instrument's climatological mean. We find that the variation in the interannual difference in the mean H2O measured by any two instruments is typically ∼ 1%. Most of the datasets start in or after 2004 and show annual increases in H2O of 0–1 % yr−1. In particular, MLS shows a trend of between 0.5 % yr−1 and 0.7 % yr−1 at the comparison sites. However, the two longest measurement datasets used here, with measurements back to 1996, show much smaller trends of +0.1 % yr−1 (at Mauna Loa, Hawaii) and −0.1 % yr−1 (at Lauder, New Zealand).

  11. Intercomparison of ground-based ozone and NO2 measurements during the MANTRA 2004 campaign

    Directory of Open Access Journals (Sweden)

    K. Strong

    2007-11-01

Full Text Available The MANTRA (Middle Atmosphere Nitrogen TRend Assessment) 2004 campaign took place in Vanscoy, Saskatchewan, Canada (52° N, 107° W) from 3 August to 15 September, 2004. In support of the main balloon launch, a suite of five zenith-sky and direct-Sun-viewing UV-visible ground-based spectrometers was deployed, primarily measuring ozone and NO2 total columns. Three Fourier transform spectrometers (FTSs) that were part of the balloon payload also performed ground-based measurements of several species, including ozone. Ground-based measurements of ozone and NO2 differential slant column densities from the zenith-viewing UV-visible instruments are presented herein. They are found to partially agree within NDACC (Network for the Detection of Atmospheric Composition Change) standards for instruments certified for process studies and satellite validation. Vertical column densities of ozone from the zenith-sky UV-visible instruments, the FTSs, a Brewer spectrophotometer, and ozonesondes are compared, and found to agree within the combined error estimates of the instruments (15%). NO2 vertical column densities from two of the UV-visible instruments are compared, and are also found to agree within combined error (15%).

  12. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
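The contrast between the two means is easy to see on a skewed workload. In the sketch below the per-query times are invented for illustration:

```python
import math

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    # exp(mean(log x)): numerically safer than multiplying then taking a root
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# Hypothetical per-query times (s) for a 5-query stream; one query is 100x slower.
times = [1.0, 1.0, 1.0, 1.0, 100.0]
print(arithmetic_mean(times))  # 20.8: dominated by the slow query
print(geometric_mean(times))   # ~2.512: the outlier is heavily damped
```

This damping is the crux of the debate: under a geometric mean each query's relative speedup carries equal weight, so a system can improve its reported metric by accelerating already-cheap queries while one pathological query barely moves the score.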

  13. California commercial building energy benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Kinney, Satkartar; Piette, Mary Ann

    2003-07-01

Building energy benchmarking is the comparison of whole-building energy use relative to a set of similar buildings. It provides a useful starting point for individual energy audits and for targeting buildings for energy-saving measures in multiple-site audits. Benchmarking is of interest and practical use to a number of groups. Energy service companies and performance contractors communicate energy savings potential with "typical" and "best-practice" benchmarks while control companies and utilities can provide direct tracking of energy use and combine data from multiple buildings. Benchmarking is also useful in the design stage of a new building or retrofit to determine if a design is relatively efficient. Energy managers and building owners have an ongoing interest in comparing energy performance to others. Large corporations, schools, and government agencies with numerous facilities also use benchmarking methods to compare their buildings to each other. The primary goal of Task 2.1.1 Web-based Benchmarking was the development of a web-based benchmarking tool, dubbed Cal-Arch, for benchmarking energy use in California commercial buildings. While there were several other benchmarking tools available to California consumers prior to the development of Cal-Arch, there were none that were based solely on California data. Most available benchmarking information, including the Energy Star performance rating, were developed using DOE's Commercial Building Energy Consumption Survey (CBECS), which does not provide state-level data. Each database and tool has advantages as well as limitations, such as the number of buildings and the coverage by type, climate regions and end uses. There is considerable commercial interest in benchmarking because it provides an inexpensive method of screening buildings for tune-ups and retrofits. However, private companies who collect and manage consumption data are concerned that the

  14. A Heterogeneous Medium Analytical Benchmark

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1999-01-01

A benchmark, called benchmark BLUE, has been developed for one-group neutral particle (neutron or photon) transport in a one-dimensional sub-critical heterogeneous plane parallel medium with surface illumination. General anisotropic scattering is accommodated through the Green's Function Method (GFM). Numerical Fourier transform inversion is used to generate the required Green's functions, which are kernels to coupled integral equations that give the exiting angular fluxes. The interior scalar flux is then obtained through quadrature. A compound iterative procedure for quadrature order and slab surface source convergence provides highly accurate benchmark-quality results (4 to 5 places of accuracy).

  15. BENCHMARK DOSES FOR CHEMICAL MIXTURES: EVALUATION OF A MIXTURE OF 18 PHAHS.

    Science.gov (United States)

    Benchmark doses (BMDs), defined as doses of a substance that are expected to result in a pre-specified level of "benchmark" response (BMR), have been used for quantifying the risk associated with exposure to environmental hazards. The lower confidence limit of the BMD is used as...
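As a concrete illustration of the definition above: under a simple one-hit extra-risk model, the BMD has a closed form. The model choice and the fitted numbers below are assumptions for the sketch, not values from the cited evaluation.

```python
import math

def bmd_one_hit(b, bmr=0.10):
    """
    Benchmark dose under a one-hit model where extra risk at dose d
    is 1 - exp(-b*d): solving 1 - exp(-b*BMD) = BMR gives
    BMD = -ln(1 - BMR) / b.
    """
    return -math.log(1.0 - bmr) / b

# Hypothetical fitted potency b with an upper 95% confidence bound.
# The BMDL (lower confidence limit on the BMD) follows from the upper
# bound on b, because the BMD decreases as potency increases.
b_hat, b_upper = 0.02, 0.03
print(round(bmd_one_hit(b_hat), 3))    # ~5.268 dose units (BMD)
print(round(bmd_one_hit(b_upper), 3))  # ~3.512 dose units (BMDL)
```

In practice the dose-response model is fit to the observed data and the BMDL is obtained by profile likelihood or bootstrap rather than this closed form, but the ordering is the same: the BMDL, not the BMD itself, is what is carried forward for risk assessment.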

  16. A Global Vision over Benchmarking Process: Benchmarking Based Enterprises

    OpenAIRE

    Sitnikov, Catalina; Giurca Vasilescu, Laura

    2008-01-01

Benchmarking uses the knowledge and the experience of others to improve the enterprise. Starting from an analysis of the enterprise's performance and an identification of its strengths and weaknesses, it should be assessed what must be done in order to improve its activity. Using benchmarking techniques, an enterprise looks at how processes in the value chain are performed. The approach based on the vision “from the whole towards the parts” (a fragmented image of the enterprise’s value chain) redu...

  17. Benchmarking and Learning in Public Healthcare

    DEFF Research Database (Denmark)

    Buckmaster, Natalie; Mouritsen, Jan

    2017-01-01

This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking applications. The present study analyses voluntary benchmarking in a public setting that is oriented towards learning. The study contributes by showing how benchmarking can be mobilised for learning and offers evidence of the effects of such benchmarking for performance outcomes. It concludes that benchmarking can enable learning in public settings but that this requires actors to invest in ensuring that benchmark data are directed towards improvement.

  18. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

Research on relative performance measures, transfer pricing, beyond budgeting initiatives, target costing, piece rates systems and value based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advances. However, whereas extant research primarily has focused on the importance and effects of using external benchmarks, less attention has been directed towards the conditions upon which the market mechanism is performing within organizations. This paper aims to contribute to research by providing more insight into the conditions for the use of external benchmarking as an element in performance management in organizations. Our study explores a particular type of external...

  19. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

In order to learn from the best, in 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all, it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly, ‘sustainable transport’ evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly, policies are not directly comparable across space and context. For these reasons attempting to benchmark ‘sustainable transport policies’ against one another would be a highly complex task, which...

  20. Benchmarking: contexts and details matter.

    Science.gov (United States)

    Zheng, Siyuan

    2017-07-05

    Benchmarking is an essential step in the development of computational tools. We take this opportunity to pitch in our opinions on tool benchmarking, in light of two correspondence articles published in Genome Biology.Please see related Li et al. and Newman et al. correspondence articles: www.dx.doi.org/10.1186/s13059-017-1256-5 and www.dx.doi.org/10.1186/s13059-017-1257-4.

  1. Handbook of critical experiments benchmarks

    International Nuclear Information System (INIS)

    Durst, B.M.; Bierman, S.R.; Clayton, E.D.

    1978-03-01

    Data from critical experiments have been collected together for use as benchmarks in evaluating calculational techniques and nuclear data. These benchmarks have been selected from the numerous experiments performed on homogeneous plutonium systems. No attempt has been made to reproduce all of the data that exists. The primary objective in the collection of these data is to present representative experimental data defined in a concise, standardized format that can easily be translated into computer code input

  2. Analysis of Benchmark 2 results

    International Nuclear Information System (INIS)

    Bacha, F.; Lefievre, B.; Maillard, J.; Silva, J.

    1994-01-01

    The code GEANT315 has been compared to different codes in two benchmarks. We analyze its performances through our results, especially in the thick target case. In spite of gaps in nucleus-nucleus interaction theories at intermediate energies, benchmarks allow possible improvements of physical models used in our codes. Thereafter, a scheme of radioactive waste burning system is studied. (authors). 4 refs., 7 figs., 1 tab

  3. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  4. Benchmarking in Czech Higher Education

    Directory of Open Access Journals (Sweden)

    Plaček Michal

    2015-12-01

Full Text Available The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Based on an analysis of the current situation and existing needs in the Czech Republic, as well as on a comparison with international experience, recommendations for public policy are made, which lie in the design of a model of collaborative benchmarking for Czech economics and management higher-education programs. Because the fully complex model cannot be implemented immediately – which is also confirmed by structured interviews with academics who have practical experience with benchmarking – the final model is designed as a multi-stage model. This approach helps eliminate major barriers to the implementation of benchmarking.

  5. Ground-based structure from motion - multi view stereo (SFM-MVS) for upland soil erosion assessment.

    Science.gov (United States)

    McShane, Gareth; James, Mike; Quinton, John; Farrow, Luke; Glendell, Miriam; Jones, Lee; Kirkham, Matthew; Morgan, David; Evans, Martin; Anderson, Karen; Lark, Murray; Rawlins, Barry; Rickson, Jane; Quine, Timothy; Benaud, Pia; Brazier, Richard

    2016-04-01

    In upland environments, quantifying soil loss through erosion processes at a high resolution can be time consuming, costly and logistically difficult. In this pilot study 'A cost effective framework for monitoring soil erosion in England and Wales', funded by the UK Department for Environment, Food and Rural Affairs (Defra), we evaluate the use of annually repeated ground-based photography surveys, processed using structure-from-motion and multi-view stereo (SfM-MVS) 3-D reconstruction software (Agisoft Photoscan). The aim is to enable efficient but detailed site-scale studies of erosion forms in inaccessible UK upland environments, in order to quantify dynamic processes, such as erosion and mass movement. The evaluation of the SfM-MVS technique is particularly relevant in upland landscapes, where the remoteness and inaccessibility of field sites may render some of the more established survey techniques impractical. We present results from 5 upland sites across the UK, acquired over a 2-year period. Erosion features of varying width (3 m to 35 m) and length (20 m to 60 m), representing a range of spatial scales (from 100 m2 to 1000 m2) were surveyed, in upland habitats including bogs, peatland, upland grassland and moorland. For each feature, around 150 to 600 ground-based photographs were taken at oblique angles over a 10 to 20 minute period, using an uncalibrated Canon 600D SLR camera with a 28 mm lens (focal length set to infinity). Camera settings varied based upon light conditions (exposure 100-400 ISO, aperture F4.5 to F8, shutter speed 1/100 to 1/250 second). For inter-survey comparisons, models were geo-referenced using 20 to 30 ground control points (numbered black markers with a white target) placed around and within the feature, with their co-ordinates measured by survey-grade differential GNSS (Trimble R4). Volumetric estimates of soil loss were quantified using digital surface models (DSMs) derived from the repeat survey data and subtracted from a
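The volumetric estimates mentioned above rest on DSM-of-difference arithmetic: subtract the repeat-survey DSM from the baseline and integrate the surface lowering over the cell area. A minimal sketch (the function and grid values are hypothetical, not from the study):

```python
import numpy as np

def erosion_volume(dsm_before, dsm_after, cell_size):
    """Net volume lost (m^3) between two co-registered DSMs (elevations in m)
    on a square grid: only cells that lowered count as erosion."""
    dz = np.asarray(dsm_before, float) - np.asarray(dsm_after, float)
    return float(np.where(dz > 0.0, dz, 0.0).sum() * cell_size ** 2)

# 2 x 2 grid with 0.5 m cells; one cell lowered by 0.2 m between surveys
before = np.array([[10.0, 10.0], [10.0, 10.0]])
after = np.array([[10.0, 10.0], [10.0, 9.8]])
print(erosion_volume(before, after, 0.5))   # ~0.05 m^3
```

In practice the co-registration error propagated from the GCP survey sets a level-of-detection threshold below which elevation changes should be discarded.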

  6. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator representing the general behavior of the plant under normal and abnormal conditions, or a model of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training vendor/utility personnel, etc. is how well they represent what is known from industrial experience, large integral experiments and separate effects tests. Typically, simulation codes are benchmarked against some of these, with the necessary level of agreement depending on the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result their capabilities are continually enhanced, errors are corrected, new situations outside the original design basis are imposed on the code, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes it is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks is included in the source code, and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients, large integral experiments, and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e., the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer
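The dynamic-benchmarking idea, archived cases re-run on every code release and compared against stored references, can be sketched as a small regression harness. This is an illustrative sketch, not MAAP4's actual mechanism; the names and tolerance are invented:

```python
def run_dynamic_benchmarks(model, benchmarks, rel_tol=0.05):
    """Re-run each archived benchmark case through the current code version
    and flag results drifting more than rel_tol from the stored reference."""
    failures = []
    for name, (inputs, reference) in benchmarks.items():
        if abs(model(inputs) - reference) > rel_tol * abs(reference):
            failures.append(name)
    return failures

# Toy "code": peak temperature as a linear function of power
model = lambda power: 300.0 + 2.0 * power
benchmarks = {
    "plant_transient": (100.0, 500.0),   # archived input / reference result
    "integral_expt": (50.0, 395.0),      # ~1.3% off the reference: passes
}
print(run_dynamic_benchmarks(model, benchmarks))  # -> []
```

Embedding the reference data alongside the source, as described for MAAP4, guarantees that every distributed upgrade is checked against the same benchmark set.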

  7. Retrieval and analysis of atmospheric XCO2 using ground-based spectral observation.

    Science.gov (United States)

    Qin, Xiu-Chun; Lei, Li-Ping; Kawasaki, Masahiro; Masafumi, Ohashi; Takahiro, Kuroki; Zeng, Zhao-Cheng; Zhang, Bing

    2014-07-01

    Atmospheric CO2 column concentration (column-averaged dry-air mole fraction of atmospheric carbon dioxide) data obtained by ground-based hyperspectral observation are an important source of data for the verification and improvement of CO2 retrievals based on satellite hyperspectral observation. However, few studies have been conducted on atmospheric CO2 column concentration retrieval from ground-based hyperspectral observation in China. In the present study, we carried out ground-based hyperspectral observation in the Xilingol Grassland, Inner Mongolia, China, using an observation system consisting of an optical spectrum analyzer, a sun tracker, and other elements. The atmospheric CO2 column concentration was retrieved using the observed hyperspectral data, and the effects of a wavelength shift of the observed spectra and of the meteorological parameters on the retrieval precision of the atmospheric CO2 concentration were evaluated and analyzed. The results show that the mean atmospheric CO2 concentration was 390.9 microg x mL(-1) in the study area during the observing period from July to September. A wavelength shift in the range between -0.012 and 0.042 nm generally leads to a 1 microg x mL(-1) deviation in the CO2 retrievals. This study also revealed that the spectral transmittance is sensitive to meteorological parameters in the 6 357-6 358, 6 360-6 361, and 6 363-6 364 cm(-1) ranges. Comparing the CO2 retrievals derived from meteorological parameters observed synchronously and non-synchronously with the spectral observation showed that the concentration deviation caused by using non-synchronously observed meteorological parameters ranged from 0.11 to 4 microg x mL(-1). These results can serve as references for further improvement of CO2 column concentration retrieval based on spectral observation.
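At its core such a retrieval fits a Beer-Lambert transmittance model to the observed spectrum. A heavily simplified sketch, assuming a single multiplicative scale factor on a precomputed optical-depth spectrum (a real retrieval fits full profiles, instrument line shape and wavelength shift):

```python
import numpy as np

def retrieve_column_scale(tau_model, trans_obs):
    """Least-squares scale factor s for Beer-Lambert T = exp(-s * tau):
    in log space -log(T) = s * tau, so s = <tau, y> / <tau, tau>."""
    tau = np.asarray(tau_model, dtype=float)
    y = -np.log(np.asarray(trans_obs, dtype=float))
    return float(np.dot(tau, y) / np.dot(tau, tau))

# Synthetic check: transmittances generated with a true scale of 1.2
tau = np.array([0.1, 0.3, 0.5, 0.2])
obs = np.exp(-1.2 * tau)
print(retrieve_column_scale(tau, obs))   # recovers ~1.2
```

The recovered scale factor, multiplied by the a-priori column, gives the retrieved column amount; sensitivity to a wavelength shift enters through misalignment between `tau_model` and the observed channels.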

  8. Ground-based SMART-COMMIT Measurements for Studying Aerosol and Cloud Properties

    Science.gov (United States)

    Tsay, Si-Chee

    2008-01-01

    From radiometric principles, it is expected that the properties of extensive aerosols and clouds retrieved from reflected/emitted measurements by satellite (and/or aircraft) should be consistent with those retrieved from transmitted/emitted radiance observed at the surface. Although space-borne remote sensing observations cover a large spatial domain, they are often plagued by contamination from surface signatures. Thus, ground-based in-situ and remote-sensing measurements, where signals come directly from atmospheric constituents, the sun, and/or Earth-atmosphere interactions, provide additional information content for comparisons that quantitatively confirm the usefulness of the integrated surface, aircraft, and satellite data sets. The development and deployment of the SMART-COMMIT (Surface-sensing Measurements for Atmospheric Radiative Transfer - Chemical, Optical & Microphysical Measurements of In-situ Troposphere) mobile facilities aims at the optimal utilization of collocated ground-based observations as constraints to yield higher fidelity satellite retrievals and to determine any sampling bias due to target conditions. To quantify the energetics of the surface-atmosphere system and the atmospheric processes, SMART-COMMIT instruments fall into three categories: flux radiometers, radiance sensors and in-situ probes. In this paper, we demonstrate the capability of SMART-COMMIT in recent field campaigns (e.g., CRYSTAL-FACE, UAE 2, BASE-ASIA, NAMMA) that were designed and executed to study the compelling temporal variability of both anthropogenic and natural aerosols (e.g., biomass-burning smoke, airborne dust) and cirrus clouds. We envision robust approaches in which well-collocated ground-based measurements and space-borne observations will greatly advance our knowledge of extensive aerosols and clouds.

  9. Kepler and Ground-Based Transits of the exo-Neptune HAT-P-11b

    Science.gov (United States)

    Deming, Drake; Sada, Pedro V.; Jackson, Brian; Peterson, Steven W.; Agol, Eric; Knutson, Heather A.; Jennings, Donald E.; Haase, Plynn; Bays, Kevin

    2011-01-01

    We analyze 26 archival Kepler transits of the exo-Neptune HAT-P-11b, supplemented by ground-based transits observed in the blue (B band) and near-IR (J band). Both the planet and host star are smaller than previously believed; our analysis yields Rp = 4.31 ± 0.06 Earth radii and Rs = 0.683 ± 0.009 solar radii, both about 3 sigma smaller than the discovery values. Our ground-based transit data at wavelengths bracketing the Kepler bandpass serve to check the wavelength dependence of stellar limb darkening, and the J-band transit provides a precise and independent constraint on the transit duration. Both the limb darkening and transit duration from our ground-based data are consistent with the new Kepler values for the system parameters. Our smaller radius for the planet implies that its gaseous envelope can be less extensive than previously believed, being very similar to the H-He envelope of GJ 436b and Kepler-4b. HAT-P-11 is an active star, and signatures of star spot crossings are ubiquitous in the Kepler transit data. We develop and apply a methodology to correct the planetary radius for the presence of both crossed and uncrossed star spots. Star spot crossings are concentrated at phases -0.002 and +0.006. This is consistent with inferences from Rossiter-McLaughlin measurements that the planet transits nearly perpendicular to the stellar equator. We identify the dominant phases of star spot crossings with active latitudes on the star, and infer that the stellar rotational pole is inclined at about 12 ± 5 deg to the plane of the sky. We point out that precise transit measurements over long durations could in principle allow us to construct a stellar butterfly diagram to probe the cyclic evolution of magnetic activity on this active K-dwarf star.
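The paper's spot-correction methodology is its own, but the textbook first-order correction for unocculted spots conveys the idea: spots dim the out-of-transit flux, so the apparent depth overestimates the true one. A sketch using that standard relation (not the authors' exact procedure; the parameter values below are invented):

```python
import math

def spot_corrected_radius_ratio(depth_obs, f_spot, contrast):
    """Correct an observed transit depth for unocculted star spots via the
    textbook relation D_true = D_obs * (1 - f_spot * (1 - contrast)), where
    f_spot is the spotted fraction of the disc and contrast = F_spot/F_phot.
    Returns the corrected radius ratio Rp/Rs = sqrt(D_true)."""
    depth_true = depth_obs * (1.0 - f_spot * (1.0 - contrast))
    return math.sqrt(depth_true)

# 0.4% observed depth; 3% spot coverage at 70% surface-brightness contrast
print(spot_corrected_radius_ratio(0.004, 0.03, 0.7))
```

Crossed spots act in the opposite direction (they make the transit look shallower), which is why both populations must be modeled to get an unbiased radius.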

  10. Toward High Altitude Airship Ground-Based Boresight Calibration of Hyperspectral Pushbroom Imaging Sensors

    Directory of Open Access Journals (Sweden)

    Aiwu Zhang

    2015-12-01

    Full Text Available Single-linear-sensor hyperspectral pushbroom imaging from a high altitude airship (HAA) without a three-axis stabilized platform is far more complex than its spaceborne and airborne counterparts. Due to the effects of air pressure, temperature and airflow, large pitch and roll angles tend to appear frequently, producing pushbroom images with severe geometric distortions. Thus, the in-flight calibration procedure is not appropriate for single linear pushbroom sensors on an HAA with no three-axis stabilized platform. In order to address this problem, a new ground-based boresight calibration method is proposed. Firstly, a coordinate transformation model is developed for direct georeferencing (DG) of the linear imaging sensor, and then the linear error equation is derived from it using the Taylor expansion formula. Secondly, the boresight misalignments are worked out using an iterative least squares method with a few ground control points (GCPs) and ground-based side-scanning experiments. The proposed method is demonstrated by three sets of experiments: (i) the stability and reliability of the method is verified through simulation-based experiments; (ii) the boresight calibration is performed using ground-based experiments; and (iii) the validation is done by application to the orthorectification of real hyperspectral pushbroom images from a HAA Earth observation payload system developed by our research team, "LanTianHao". The test results show that the proposed boresight calibration approach significantly improves the quality of georeferencing by reducing the geometric distortions caused by boresight misalignments to the minimum level.
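The second step, solving for boresight misalignments by iterative least squares over GCP residuals, can be sketched with a generic Gauss-Newton loop. Everything here is illustrative (a direction-vector observation model with a finite-difference Jacobian), not the paper's actual DG equations:

```python
import numpy as np

def rot(angles):
    """Rotation matrix from (roll, pitch, yaw) angles in radians."""
    r, p, y = angles
    Rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def estimate_boresight(vec_sensor, vec_truth, iters=15):
    """Gauss-Newton estimate of boresight angles b such that
    vec_truth ~= R(b) @ vec_sensor for each GCP direction vector."""
    b = np.zeros(3)
    eps = 1e-6
    for _ in range(iters):
        res = (vec_truth - vec_sensor @ rot(b).T).ravel()
        J = np.empty((res.size, 3))
        for k in range(3):                      # finite-difference Jacobian
            db = np.zeros(3)
            db[k] = eps
            J[:, k] = ((vec_truth - vec_sensor @ rot(b + db).T).ravel() - res) / eps
        b -= np.linalg.lstsq(J, res, rcond=None)[0]
    return b

# Synthetic GCP directions rotated by a known misalignment (2, -1, 0.5 mrad)
true_b = np.array([0.002, -0.001, 0.0005])
v = np.random.default_rng(0).normal(size=(10, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
est = estimate_boresight(v, v @ rot(true_b).T)
print(est)
```

With an exact observation model the loop recovers the injected milliradian-level angles; real calibrations add GNSS-surveyed GCP coordinates and measurement noise to the residuals.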

  11. Summer planetary-scale oscillations: Aura MLS temperature compared with ground-based radar wind

    Directory of Open Access Journals (Sweden)

    C. E. Meek

    2009-04-01

    Full Text Available The advent of satellite-based sampling brings with it the opportunity to examine virtually any part of the globe. Aura MLS mesospheric temperature data are analysed in a wavelet format for easy identification of possible planetary waves (PW) and aliases masquerading as PW. A calendar year, 2005, of eastward, stationary, and westward waves at a selected latitude is shown in separate panels for the wave number range −3 to +3 and the period range 8 h to 30 days (d). Such a wavelet analysis is made possible by Aura's continuous sampling at all latitudes 82° S–82° N. The data presentation is suitable for examination of years of data. However this paper focuses on the striking feature of a "dish-shaped" upper limit to periods near 2 d in mid-summer, with longer periods appearing towards spring and fall, a feature also commonly seen in radar winds. The most probable cause is suggested to be filtering by the summer jet at 70–80 km, the latter being available from ground-based medium frequency radar (MFR). Classically, the phase velocity of a wave must be greater than that of the jet in order to propagate through it. As an attempt to directly relate satellite and ground-based sampling, a PW event of period 8 d and wave number 2, which appears to be the original rather than an alias, is compared with ground-based radar wind data. An appendix discusses characteristics of satellite data aliases with regard to their periods and amplitudes.
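The critical-line filtering argument can be made concrete: a wave of zonal wave number s and period T at latitude phi has phase speed c = 2*pi*a*cos(phi)/(s*T), and by the criterion quoted above it passes the jet only if c exceeds the jet speed. A sketch (the jet speed and latitude are illustrative):

```python
import math

def zonal_phase_speed(wavenumber, period_days, lat_deg, radius_m=6.371e6):
    """Zonal phase speed (m/s): c = 2*pi*a*cos(lat) / (s * T)."""
    circumference = 2.0 * math.pi * radius_m * math.cos(math.radians(lat_deg))
    return circumference / (wavenumber * period_days * 86400.0)

def propagates_through_jet(c_phase, u_jet):
    """Criterion quoted in the text: the wave passes the jet only if its
    phase speed exceeds the jet speed (no critical level is met)."""
    return abs(c_phase) > abs(u_jet)

# Wave number-2, 8-day wave at 50 deg latitude vs. an illustrative 40 m/s jet
c = zonal_phase_speed(2, 8.0, 50.0)           # ~19 m/s, so it is filtered
print(c, propagates_through_jet(c, 40.0))
```

Longer periods mean slower phase speeds, so mid-summer jet filtering removes them first, which is consistent with the "dish-shaped" upper period limit described above.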

  12. Methane Emissions from Bangladesh: Bridging the Gap Between Ground-based and Space-borne Estimates

    Science.gov (United States)

    Peters, C.; Bennartz, R.; Hornberger, G. M.

    2015-12-01

    Gaining an understanding of methane (CH4) emission sources and atmospheric dispersion is an essential part of climate change research. Large-scale and global studies often rely on satellite observations of column CH4 mixing ratio, whereas high-spatial-resolution estimates rely on ground-based measurements. Extrapolation of ground-based measurements on, for example, rice paddies to broad regional scales is highly uncertain because of spatio-temporal variability. We explore the use of ground-based river stage measurements and independent satellite observations of flooded area, along with satellite measurements of CH4 mixing ratio, to estimate the extent of methane emissions. Bangladesh, which comprises most of the Ganges-Brahmaputra-Meghna (GBM) delta, is a region of particular interest for studying spatio-temporal variation of methane emissions due to (1) broad-scale rice cultivation and (2) seasonal flooding and atmospheric convection during the monsoon. Bangladesh and its deltaic landscape exhibit a broad range of environmental, economic, and social circumstances that are relevant to many nations in South and Southeast Asia. We explore the seasonal enhancement of CH4 in Bangladesh using passive remote sensing spectrometer CH4 products from the SCanning Imaging Absorption SpectroMeter for Atmospheric CHartographY (SCIAMACHY) and the Atmospheric Infrared Sounder (AIRS). The seasonal variation of CH4 is compared to independent estimates of seasonal flooding from water gauge stations and space-based passive microwave water-to-land fractions from the Tropical Rainfall Measuring Mission Microwave Imager (TRMM-TMI). Annual cycles in inundation (natural and anthropogenic) and atmospheric CH4 concentrations show highly correlated seasonal signals. NOAA's HYSPLIT model is used to determine the atmospheric residence time of ground CH4 fluxes. Using the satellite observations, we can narrow the large uncertainty in extrapolation of ground-based CH4 emission estimates from rice paddies

  13. Development of ground-based wind energy in DOM and Corsica - Joint CGEDD / CGEIET report

    International Nuclear Information System (INIS)

    Joannis de Verclos, Christian de; Albrecht, Patrick; Iselin, Philippe; Legait, Benoit; Vignolles, Denis

    2012-09-01

    Addressing the particular cases of the French overseas districts (DOM: Guadeloupe, Martinique, Guyana, Mayotte, La Reunion) and Corsica, this report analyzes four main topics: the objectives and challenges of ground-based wind energy (sustainable development, non-interconnected areas, and the public service of electricity supply), the local situations and their cartography, the legal issues and possible evolution options (energy law, environmental law, urban planning law, local community law), and the modalities of project devolution. The authors highlight the issues which require a new legal framework, notably governance and the devolution procedure.

  14. Remote sensing of high-latitude ionization profiles by ground-based and spaceborne instrumentation

    International Nuclear Information System (INIS)

    Vondrak, R.R.

    1981-01-01

    Ionospheric specification and modeling are now largely based on data provided by active remote sensing with radiowave techniques (ionosondes, incoherent-scatter radars, and satellite beacons). More recently, passive remote sensing techniques have been developed that can be used to monitor quantitatively the spatial distribution of high-latitude E-region ionization. These passive methods depend on the measurement, or inference, of the energy distribution of precipitating kilovolt electrons, the principal source of the nighttime E-region at high latitudes. To validate these techniques, coordinated measurements of the auroral ionosphere have been made with the Chatanika incoherent-scatter radar and a variety of ground-based and spaceborne sensors.

  15. Tests of the gravitational redshift effect in space-born and ground-based experiments

    Science.gov (United States)

    Vavilova, I. B.

    2018-02-01

    This paper provides a brief overview of tests of the gravitational redshift (GRS) effect in ground-based and space-borne experiments. In particular, we consider the GRS effect in the gravitational fields of the Earth, the major planets of the Solar system, and compact stars (white dwarfs and neutron stars), where the effect is confirmed with higher accuracy. We discuss prospects for confirming the GRS effect for galaxies and galaxy clusters in the visible and X-ray ranges of the electromagnetic spectrum.
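In the weak-field limit the GRS effect reduces to z ≈ GM/(Rc²), which already shows why compact stars confirm it with better accuracy than the Earth does. A quick numerical sketch (the white-dwarf parameters are illustrative):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s

def gravitational_redshift(mass_kg, radius_m):
    """Weak-field gravitational redshift z ~ GM/(R c^2) of light escaping
    from radius R of a body of mass M (valid while z << 1)."""
    return G * mass_kg / (radius_m * C ** 2)

# Earth's surface vs. an illustrative Earth-sized 0.6-solar-mass white dwarf
z_earth = gravitational_redshift(5.972e24, 6.371e6)   # ~7e-10
z_wd = gravitational_redshift(0.6 * 1.989e30, 6.371e6)  # ~1.4e-4
print(z_earth, z_wd)
```

The roughly five orders of magnitude between the two values is why white dwarfs (and, more so, neutron stars) give the cleanest astrophysical confirmations, while terrestrial tests need atomic-clock precision.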

  16. On mean wind and turbulence profile measurements from ground-based wind lidars

    DEFF Research Database (Denmark)

    Mikkelsen, Torben

    2009-01-01

    Two types of wind lidars have become available for ground-based vertical mean wind and turbulence profiling: a continuous wave (CW) wind lidar and a pulsed wind lidar. Although both are built upon the same recent 1.55 μm telecom fibre technology, they possess fundamental differences between...... their temporal and spatial resolution capabilities. A literature review of the two lidar systems' spatial and temporal resolution characteristics will be presented, along with the implications for the two lidar types' vertical profile measurements of mean wind and turbulence in the lower atmospheric boundary layer...

  17. Pulsation of IU Per from the Ground-based and ‘Integral’ Photometry

    Directory of Open Access Journals (Sweden)

    Kundra E.

    2013-06-01

    Full Text Available IU Per is an eclipsing semi-detached binary with a pulsating component. Using our own ground-based as well as INTEGRAL satellite photometric observations in the B and V passbands, we derived the geometrical and physical parameters of this system. We detected short-term variations of IU Per in the brightness residuals after subtraction of the synthetic light curves. Analysis of these residuals enabled us to characterize and localize the source of the short-term variations as pulsations of the primary component typical of δ Scuti-type stars.

  18. Liquid Structures and Physical Properties -- Ground Based Studies for ISS Experiments

    Science.gov (United States)

    Kelton, K. F.; Bendert, J. C.; Mauro, N. A.

    2012-01-01

    Studies of electrostatically-levitated supercooled liquids have demonstrated strong short- and medium-range ordering in transition metal and alloy liquids, which can influence phase transitions like crystal nucleation and the glass transition. The structure is also related to the liquid properties. Planned ISS experiments will allow a deeper investigation of these results as well as the first investigations of a new type of coupling in crystal nucleation in primary crystallizing liquids, resulting from a linking of the stochastic processes of diffusion with interfacial-attachment. A brief description of the techniques used for ground-based studies and some results relevant to planned ISS investigations are discussed.

  19. Plant diversity to support humans in a CELSS ground based demonstrator

    Science.gov (United States)

    Howe, J. M.; Hoff, J. E.

    1981-01-01

    A controlled ecological life support system (CELSS) for human habitation, in preparation for future long-duration space flights, is considered. The success of such a system depends upon the feasibility of revitalizing food resources and of meeting human nutritional needs from those resources. Edible higher plants are prime candidates for the photoautotrophic components of this system if nutritionally adequate diets can be derived from these plant sources to support humans. Human nutritional requirements, based on current knowledge, are developed for the inhabitants envisioned in the CELSS ground-based demonstrator, and groups of plant products that can provide the nutrients are identified.

  20. The laser calibration system for the STACEE ground-based gamma ray detector

    CERN Document Server

    Hanna, D

    2002-01-01

    We describe the design and performance of the laser system used for calibration monitoring of components of the STACEE detector. STACEE is a ground based gamma ray detector which uses the heliostats of a solar power facility to collect and focus Cherenkov light onto a system of secondary optics and photomultiplier tubes. To monitor the gain and check the linearity and timing properties of the phototubes and associated electronics, a system based on a dye laser, neutral density filters and optical fibres has been developed. In this paper we describe the system and present some results from initial tests made with it.

  1. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides

    Science.gov (United States)

    Nowell, Lisa H.; Norman, Julia E.; Ingersoll, Christopher G.; Moran, Patrick W.

    2016-01-01

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical
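The mixture indicator described above is a simple sum of concentration/benchmark quotients. A sketch (the pesticide names are real compounds but the benchmark values here are hypothetical, not the published TEBs):

```python
def summed_benchmark_quotient(concentrations, benchmarks):
    """Sum of concentration/benchmark quotients over the pesticides detected
    in a sediment sample; larger sums indicate greater potential toxicity."""
    return sum(concentrations[p] / benchmarks[p]
               for p in concentrations if p in benchmarks)

# Hypothetical sample (ug/kg dry wt) against hypothetical TEB values
sample = {"bifenthrin": 2.0, "chlorpyrifos": 1.5}
teb = {"bifenthrin": 4.0, "chlorpyrifos": 3.0}
print(summed_benchmark_quotient(sample, teb))  # -> 1.0
```

In the study's application, the same sum was computed against LEBs and TEBs separately, with any single-compound quotient above 1 flagging an exceedance of that benchmark.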

  2. Regional Competitive Intelligence: Benchmarking and Policymaking

    OpenAIRE

    Huggins , Robert

    2010-01-01

    Benchmarking exercises have become increasingly popular within the sphere of regional policymaking in recent years. The aim of this paper is to analyse the concept of regional benchmarking and its links with regional policymaking processes. It develops a typology of regional benchmarking exercises and regional benchmarkers, and critically reviews the literature, both academic and policy oriented. It is argued that critics who suggest regional benchmarking is a flawed concept and technique fai...

  3. Benchmarking of human resources management

    Directory of Open Access Journals (Sweden)

    David M. Akinnusi

    2008-11-01

    Full Text Available This paper reviews the role of human resource management (HRM which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.

  4. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  5. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Nowell, Lisa H., E-mail: lhnowell@usgs.gov [U.S. Geological Survey, California Water Science Center, Placer Hall, 6000 J Street, Sacramento, CA 95819 (United States); Norman, Julia E., E-mail: jnorman@usgs.gov [U.S. Geological Survey, Oregon Water Science Center, 2130 SW 5" t" h Avenue, Portland, OR 97201 (United States); Ingersoll, Christopher G., E-mail: cingersoll@usgs.gov [U.S. Geological Survey, Columbia Environmental Research Center, 4200 New Haven Road, Columbia, MO 65021 (United States); Moran, Patrick W., E-mail: pwmoran@usgs.gov [U.S. Geological Survey, Washington Water Science Center, 934 Broadway, Suite 300, Tacoma, WA 98402 (United States)

    2016-04-15

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical/chemical characteristics

  6. Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides

    International Nuclear Information System (INIS)

    Nowell, Lisa H.; Norman, Julia E.; Ingersoll, Christopher G.; Moran, Patrick W.

    2016-01-01

    Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. 
Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical/chemical characteristics
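The summed benchmark-quotient screen described in the abstract can be sketched in a few lines. This is an illustrative reading of that idea only: the compound names, concentrations, and TEB values below are placeholders, not the published benchmarks.

```python
# Hedged sketch of the summed benchmark-quotient screen: for each detected
# pesticide, divide its sediment concentration by its benchmark and sum
# the ratios across the mixture.
def summed_benchmark_quotient(concentrations, benchmarks):
    """Sum of concentration/benchmark ratios over all detected pesticides."""
    return sum(conc / benchmarks[name] for name, conc in concentrations.items())

# Hypothetical whole-sediment concentrations and Threshold Effect Benchmarks
measured = {"bifenthrin": 0.8, "chlorpyrifos": 1.5}   # placeholder values/units
tebs = {"bifenthrin": 0.6, "chlorpyrifos": 5.0}       # placeholder values/units
quotient = summed_benchmark_quotient(measured, tebs)
# Under this screening interpretation, a summed quotient above 1 flags the
# mixture as potentially toxic to benthic invertebrates.
```

The sum treats the component toxicities as additive, which is the usual simplifying assumption behind mixture quotients of this kind.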

  7. Nonparametric estimation of benchmark doses in environmental risk assessment

    Science.gov (United States)

    Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen

    2013-01-01

    An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits' small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
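The isotonic-regression idea behind this abstract can be illustrated with a minimal pool-adjacent-violators (PAVA) fit of quantal dose-response proportions, followed by reading a benchmark dose off the fitted extra-risk curve by linear interpolation. This sketch shows the general technique only; it is not the authors' estimator, and it omits their bootstrap confidence limits. All data values are made up.

```python
# Sketch: isotonic dose-response fit (PAVA) and a benchmark dose read off
# the fitted extra risk. Illustrative only, not the paper's estimator.
def pava(values, weights):
    """Weighted isotonic regression via the pool-adjacent-violators algorithm."""
    blocks, counts = [], []
    for v, w in zip(values, weights):
        blocks.append([float(v), float(w)])
        counts.append(1)
        # Merge backwards while monotonicity is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, w2 = blocks.pop()
            c2 = counts.pop()
            v1, w1 = blocks[-1]
            blocks[-1] = [(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2]
            counts[-1] += c2
    fitted = []
    for (v, _), c in zip(blocks, counts):
        fitted.extend([v] * c)
    return fitted

def benchmark_dose(doses, fitted, bmr=0.10):
    """Smallest dose whose fitted extra risk reaches the benchmark response,
    found by linear interpolation between adjacent doses."""
    p0 = fitted[0]
    for i in range(1, len(doses)):
        r_lo = (fitted[i - 1] - p0) / (1.0 - p0)
        r_hi = (fitted[i] - p0) / (1.0 - p0)
        if r_hi >= bmr:
            if r_hi == r_lo:
                return doses[i]
            frac = (bmr - r_lo) / (r_hi - r_lo)
            return doses[i - 1] + frac * (doses[i] - doses[i - 1])
    return None  # BMR not reached within the tested dose range

# Made-up quantal data: observed response proportions at four doses.
doses = [0.0, 1.0, 2.0, 4.0]
proportions = [0.05, 0.02, 0.20, 0.50]   # non-monotone; PAVA pools the dip
fitted = pava(proportions, [1.0] * 4)
bmd = benchmark_dose(doses, fitted, bmr=0.10)
```

PAVA pools the non-monotone first two proportions into a common value, so the fitted curve is non-decreasing before the extra risk is inverted.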

  8. A Universal De-Noising Algorithm for Ground-Based LIDAR Signal

    Science.gov (United States)

    Ma, Xin; Xiang, Chengzhi; Gong, Wei

    2016-06-01

    Ground-based lidar, working as an effective remote sensing tool, plays an irreplaceable role in the study of the atmosphere, since it can provide vertical atmospheric profiles. However, noise in a lidar signal is unavoidable, which leads to difficulties and complexities when searching for more information. Every de-noising method has its own characteristics but also certain limitations, since the lidar signal varies as the atmosphere changes. In this paper, a universal de-noising algorithm based on signal segmentation and reconstruction is proposed to enhance the SNR of a ground-based lidar signal. The signal segmentation, serving as the keystone of the algorithm, divides the lidar signal into three parts, each processed by a different de-noising method according to its own characteristics. The signal reconstruction is a relatively simple procedure that splices the signal sections end to end. Finally, a series of simulated signal tests and a real dual field-of-view lidar signal show the feasibility of the universal de-noising algorithm.
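The segment-and-splice structure can be sketched as follows. The segment boundaries, window sizes, and the choice of moving-average smoothing are all assumptions for illustration; the paper's actual per-segment de-noising methods are not specified in the abstract.

```python
# Hedged sketch of segment-then-reconstruct de-noising: split the lidar
# return by range, smooth each segment with a method matched to its noise
# level (here: moving averages of different widths), then splice end to end.
def moving_average(signal, window):
    """Centered moving average with shrinking windows at the edges."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def segmented_denoise(signal, bounds=(100, 400)):
    """Split into near/mid/far range segments and de-noise each separately."""
    near = signal[:bounds[0]]
    mid = signal[bounds[0]:bounds[1]]
    far = signal[bounds[1]:]
    # Near range: high SNR, light smoothing; far range: low SNR, heavy smoothing.
    return (moving_average(near, 3)
            + moving_average(mid, 11)
            + moving_average(far, 31))
```

The reconstruction step here is just list concatenation, mirroring the abstract's description of splicing the signal sections end to end.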

  9. Ground-based observation of emission lines from the corona of a red-dwarf star.

    Science.gov (United States)

    Schmitt, J H; Wichmann, R

    2001-08-02

    All 'solar-like' stars are surrounded by coronae, which contain magnetically confined plasma at temperatures above 10^6 K. (Until now, only the Sun's corona could be observed in the optical, as a shimmering envelope during a total solar eclipse.) As the underlying stellar 'surfaces', the photospheres, are much cooler, some non-radiative process must be responsible for heating the coronae. The heating mechanism is generally thought to be magnetic in origin, but is not yet understood even for the case of the Sun. Ultraviolet emission lines first led to the discovery of the enormous temperature of the Sun's corona, but thermal emission from the coronae of other stars has hitherto been detectable only from space, at X-ray wavelengths. Here we report the detection of emission from highly ionized iron (Fe XIII at 3,388.1 Å) in the corona of the red-dwarf star CN Leonis, using a ground-based telescope. The X-ray flux inferred from our data is consistent with previously measured X-ray fluxes, and the non-thermal line width of 18.4 km s^-1 indicates great similarities between solar and stellar coronal heating mechanisms. The accessibility and spectral resolution (45,000) of the ground-based instrument are much better than those of X-ray satellites, so a new window on the study of stellar coronae has been opened.

  10. Proceedings of the 30th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Wetovsky, Marv A [Los Alamos National Laboratory; Aguilar-chang, Julio [Los Alamos National Laboratory; Arrowsmith, Marie [Los Alamos National Laboratory; Arrowsmith, Stephen [Los Alamos National Laboratory; Baker, Diane [Los Alamos National Laboratory; Begnaud, Michael [Los Alamos National Laboratory; Harste, Hans [Los Alamos National Laboratory; Maceira, Monica [Los Alamos National Laboratory; Patton, Howard [Los Alamos National Laboratory; Phillips, Scott [Los Alamos National Laboratory; Randall, George [Los Alamos National Laboratory; Revelle, Douglas [Los Alamos National Laboratory; Rowe, Charlotte [Los Alamos National Laboratory; Stead, Richard [Los Alamos National Laboratory; Steck, Lee [Los Alamos National Laboratory; Whitaker, Rod [Los Alamos National Laboratory; Yang, Xiaoning [Los Alamos National Laboratory

    2008-09-23

    These proceedings contain papers prepared for the 30th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, held 23-25 September, 2008 in Portsmouth, Virginia. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Technical Applications Center (AFTAC), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), and other invited sponsors. The scientific objectives of the research are to improve the United States’ capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  11. Proceedings of the 27th Seismic Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wetovsky, Marvin A. [Editor; Benson, Jody [Editor; Patterson, Eileen F. [Editor

    2005-09-20

    These proceedings contain papers prepared for the 27th Seismic Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, held 20-22 September, 2005 in Rancho Mirage, California. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Technical Applications Center (AFTAC), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  12. Preparing for TESS: Precision Ground-based Light-curves of Newly Discovered Transiting Exoplanets

    Science.gov (United States)

    Li, Yiting; Stefansson, Gudmundur; Mahadevan, Suvrath; Monson, Andy; Hebb, Leslie; Wisniewski, John; Huehnerhoff, Joseph

    2018-01-01

    NASA’s Transiting Exoplanet Survey Satellite (TESS), to be launched in early 2018, is expected to catalog a myriad of transiting exoplanet candidates ranging from Earth-sized to gas giants, orbiting a diverse range of stellar types in the solar neighborhood. In particular, TESS will find small planets orbiting the closest and brightest stars, and will enable detailed atmospheric characterizations of planets with current and future telescopes. In the TESS era, ground-based follow-up resources will play a critical role in validating and confirming the planetary nature of the candidates TESS will discover. Along with confirming the planetary nature of exoplanet transits, high precision ground-based transit observations allow us to put further constraints on exoplanet orbital parameters and transit timing variations. In this talk, we present new observations of transiting exoplanets recently discovered by the K2 mission, using the optical diffuser on the 3.5m ARC Telescope at Apache Point Observatory. These include observations of the mini-Neptunes K2-28b and K2-104b orbiting early-to-mid M-dwarfs. In addition, other recent transit observations performed using the robotic 30cm telescope at Las Campanas Observatory in Chile will be presented.

  13. A hardware-in-the-loop simulation program for ground-based radar

    Science.gov (United States)

    Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna

    2011-06-01

    A radar system created using an embedded computer system needs testing, and the way to test an embedded computer system differs from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the outputs of the radar. More often than not, not all of the building blocks of the radar system are available for testing. This requires the engineer to test parts of the radar system using a "black box" approach. A common way to test software code in a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations. The outputs are compared against the values that are expected, which requires the engineer to use valid test scenarios. We will present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows rapid simulated evaluation of ground-based radar performance in a laboratory environment.

  14. Education and Public Outreach for MSFC's Ground-Based Observations in Support of the HESSI Mission

    Science.gov (United States)

    Adams, Mitzi L.; Hagyard, Mona J.; Newton, Elizabeth K.

    1999-01-01

    A primary focus of NASA is the advancement of science and the communication of these advances to a number of audiences, both within the science research community and outside it. The upcoming High Energy Solar Spectroscopic Imager (HESSI) mission and the MSFC ground-based observing program provide an excellent opportunity to communicate our knowledge of the Sun, its cycle of activity, the role of magnetic fields in that activity, and its effect on our planet. In addition to ground-based support of the HESSI mission, MSFC's Solar Observatory, located in North Alabama, will involve students and the local education community in its day-to-day operations, an experience which is more immediate, personal, and challenging than their everyday educational experience. Further, by taking advantage of the Internet, our program can reach beyond the immediate community. By joining with Fernbank Science Center in Atlanta, Georgia, we will leverage their almost 30 years' experience in science program delivery in diverse situations to create a distance learning opportunity which can encompass the entire Southeast and beyond. This poster will outline our education and public outreach plans in support of the HESSI mission, in which we will target middle and high school students and their teachers.

  15. Proceedings of the 29th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wetovsky, Marvin A. [Editor; Benson, Jody [Editor; Patterson, Eileen F. [Editor

    2007-09-25

    These proceedings contain papers prepared for the 29th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, held 25-27 September, 2007 in Denver, Colorado. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Technical Applications Center (AFTAC), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  16. Proceedings of the 2011 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wetovsky, Marvin A. [Editor; Patterson, Eileen F. [Editor; Sandoval, Marisa N. [Editor

    2011-09-13

    These proceedings contain papers prepared for the Monitoring Research Review 2011: Ground-Based Nuclear Explosion Monitoring Technologies, held 13-15 September, 2011 in Tucson, Arizona. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), National Science Foundation (NSF), and other invited sponsors. The scientific objectives of the research are to improve the United States' capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  17. Ground-based VHE γ ray astronomy with air Cherenkov imaging telescopes

    International Nuclear Information System (INIS)

    Mirzoyan, R.

    2000-01-01

    The history of astronomy has been one of scientific discovery following immediately upon the introduction of new technology. In this report, we briefly review the basic development of the atmospheric Cherenkov light detection technique, particularly the imaging telescope technique, which in recent years has led to the firm establishment of a new branch of experimental astronomy, namely ground-based very high-energy (VHE) γ-ray astronomy. Milestones in the technology and in the analysis of the imaging technique are discussed. The design of the 17 m diameter MAGIC Telescope, currently under construction, is based on the development of new technologies for all of its major parts and sets new standards in the performance of ground-based γ-ray detectors. MAGIC is one of the next major steps in the development of the technique, being the first instrument that will allow measurements in the as-yet-uninvestigated energy gap between 10 and 300 GeV.

  18. Retrieval of tropospheric HCHO in El Salvador using ground based DOAS

    Science.gov (United States)

    Abarca, W.; Gamez, K.; Rudamas, C.

    2017-12-01

    Formaldehyde (HCHO) is the most abundant carbonyl in the atmosphere, being an intermediate product in the oxidation of most volatile organic compounds (VOCs). HCHO is carcinogenic and highly water soluble [1]. HCHO can originate from biomass burning and fossil fuel combustion and has been observed from satellite and ground-based sensors using the Differential Optical Absorption Spectroscopy (DOAS) technique [2]. DOAS products can be used for air quality monitoring, validation of chemical transport models, and validation of satellite tropospheric column density retrievals, among other applications [3]. In this study, we report on column density levels of HCHO measured by ground-based Multi-Axis DOAS at different locations in El Salvador in March 2015. We did not observe large differences in the HCHO column density values at different viewing directions. This result points to a reasonably polluted and hazy atmosphere at the measuring sites, as reported by other authors [4]. Average values ranging from 10^16 to 10^17 molecules/cm^2 have been obtained. The contributions of vehicular traffic and biomass burning to the column density levels at these sites in El Salvador will be discussed. [1] A. R. Garcia et al., Atmos. Chem. Phys. 6, 4545 (2006). [2] E. Peters et al., Atmos. Chem. Phys. 12, 11179 (2012). [3] T. Vlemmix et al., Atmos. Meas. Tech. 8, 941-963 (2015). [4] A. Heckel et al., Atmos. Chem. Phys. 5 (2005).

  19. Validation of ozone monitoring instrument ultraviolet index against ground-based UV index in Kampala, Uganda.

    Science.gov (United States)

    Muyimbwa, Dennis; Dahlback, Arne; Ssenyonga, Taddeo; Chen, Yi-Chun; Stamnes, Jakob J; Frette, Øyvind; Hamre, Børge

    2015-10-01

    The Ozone Monitoring Instrument (OMI) overpass solar ultraviolet (UV) indices have been validated against the ground-based UV indices derived from Norwegian Institute for Air Research UV measurements in Kampala (0.31° N, 32.58° E, 1200 m), Uganda for the period between 2005 and 2014. An excessive use of old cars, which would imply a high loading of absorbing aerosols, could cause the OMI retrieval algorithm to overestimate the surface UV irradiances. The UV index values were found to follow a seasonal pattern with maximum values in March and October. Under all-sky conditions, the OMI retrieval algorithm was found to overestimate the UV index values with a mean bias of about 28%. When only days with radiation modification factor greater than or equal to 65%, 70%, 75%, and 80% were considered, the mean bias between ground-based and OMI overpass UV index values was reduced to 8%, 5%, 3%, and 1%, respectively. The overestimation of the UV index by the OMI retrieval algorithm was found to be mainly due to clouds and aerosols.
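The mean-bias comparison reported above (e.g. the 28% overestimation under all-sky conditions) can be expressed with a generic relative-bias formula. The formula below is a standard definition assumed for illustration, not necessarily the exact statistic used in the paper.

```python
# Sketch: mean relative bias of satellite-retrieved values against
# ground-based reference values, expressed as a percentage.
def mean_bias_percent(satellite, ground):
    """Mean of (satellite - ground)/ground over paired observations, in %."""
    ratios = [(s - g) / g for s, g in zip(satellite, ground)]
    return 100.0 * sum(ratios) / len(ratios)

# Made-up paired UV indices: a positive result means the satellite
# retrieval overestimates the ground-based index on average.
bias = mean_bias_percent([6.4, 10.2, 8.9], [5.0, 8.0, 7.0])
```

Filtering the pairs to days with a high radiation modification factor, as the study does, would simply restrict the input lists before this calculation.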

  20. Proceedings of the 2011 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    International Nuclear Information System (INIS)

    Wetovsky, Marvin A.; Patterson, Eileen F.; Sandoval, Marisa N.

    2011-01-01

    These proceedings contain papers prepared for the Monitoring Research Review 2011: Ground-Based Nuclear Explosion Monitoring Technologies, held 13-15 September, 2011 in Tucson, Arizona. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), National Science Foundation (NSF), and other invited sponsors. The scientific objectives of the research are to improve the United States' capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  1. The Polarization-Sensitive Bolometers for SPICA and their Potential Use for Ground-Based Application

    Science.gov (United States)

    Reveret, Vincent

    2018-01-01

    CEA is leading the development of Safari-POL, an imaging polarimeter aboard the SPICA space observatory (ESA M5). SPICA will be able to reach unprecedented sensitivities thanks to its cooled telescope and its ultra-sensitive detectors. The detector assembly of Safari-POL holds three arrays that are cooled down to 50 mK and correspond to three spectral bands: 100, 200 and 350 microns. The detectors (silicon bolometers) benefit from the Herschel/PACS legacy and are also a big step forward in terms of sensitivity (improved by two orders of magnitude compared to PACS bolometers) and polarimetry capabilities. Indeed, each pixel is intrinsically sensitive to two polarization components (horizontal and vertical). We will present the Safari-POL concept, the first results of measurements made on the detectors, and future plans for possible ground-based instruments using this technology. We will also present the example of the ArTéMiS camera, installed at APEX, which was developed as a ground-based counterpart of the PACS photometer.

  2. Prospects for Ground-Based Detection and Follow-up of TESS-Discovered Exoplanets

    Science.gov (United States)

    Varakian, Matthew; Deming, Drake

    2018-01-01

    The Transiting Exoplanet Survey Satellite (TESS) will monitor over 200,000 main-sequence dwarf stars for exoplanetary transits, with the goal of discovering small planets orbiting stars that are bright enough for follow-up observations. Here we evaluate the prospects for ground-based transit detection and follow-up of the TESS-discovered planets. We focus particularly on the TESS planets that transit only once during each 27.4-day TESS observing window per region, and we calculate to what extent ground-based recovery of additional transits will be possible. Using simulated exoplanet systems from Sullivan et al. and assuming the use of a 60-cm telescope at a high-quality observing site, we project the S/N ratios for transits of such planets. We use Phoenix stellar models for stars with surface temperatures from 2500 K to 12000 K, and we account for limb darkening, red atmospheric noise, and missed transits due to the day-night cycle and poor weather.

  3. Nighttime Aerosol Optical Depth Measurements Using a Ground-based Lunar Photometer

    Science.gov (United States)

    Berkoff, Tim; Omar, Ali; Haggard, Charles; Pippin, Margaret; Tasaddaq, Aasam; Stone, Tom; Rodriguez, Jon; Slutsker, Ilya; Eck, Tom; Holben, Brent

    2015-01-01

    In recent years it was proposed to combine AERONET network photometer capabilities with a high-precision lunar model used for satellite calibration to retrieve columnar nighttime AODs. The USGS lunar model can continuously provide pre-atmosphere, high-precision lunar irradiance determinations for multiple wavelengths at ground sensor locations. When combined with measured irradiances from a ground-based AERONET photometer, atmospheric column transmissions can be determined, yielding nighttime column aerosol AOD and Angstrom coefficients. Additional demonstrations have utilized this approach to further develop calibration methods and to obtain data in polar regions where extended periods of darkness occur. This new capability enables more complete studies of the diurnal behavior of aerosols, and provides feedback for models and satellite retrievals on the nighttime behavior of aerosols. It is anticipated that the nighttime capability of these sensors will be useful for comparisons with satellite lidars such as CALIOP and CATS, in addition to ground-based lidars in MPLNET, at night, when the signal-to-noise ratio is higher than in the daytime and more precise AOD comparisons can be made.

  4. Automatic vetting of planet candidates from ground based surveys: Machine learning with NGTS

    Science.gov (United States)

    Armstrong, David J.; Günther, Maximilian N.; McCormac, James; Smith, Alexis M. S.; Bayliss, Daniel; Bouchy, François; Burleigh, Matthew R.; Casewell, Sarah; Eigmüller, Philipp; Gillen, Edward; Goad, Michael R.; Hodgkin, Simon T.; Jenkins, James S.; Louden, Tom; Metrailler, Lionel; Pollacco, Don; Poppenhaeger, Katja; Queloz, Didier; Raynard, Liam; Rauer, Heike; Udry, Stéphane; Walker, Simon R.; Watson, Christopher A.; West, Richard G.; Wheatley, Peter J.

    2018-05-01

    State-of-the-art exoplanet transit surveys are producing ever-increasing quantities of data. Making the best use of this resource, whether in detecting interesting planetary systems or in determining accurate planetary population statistics, requires new automated methods. Here we describe a machine learning algorithm that forms an integral part of the pipeline for the NGTS transit survey, demonstrating the efficacy of machine learning in selecting planetary candidates from multi-night ground-based survey data. Our method uses a combination of random forests and self-organising maps to rank planetary candidates, achieving an AUC score of 97.6% in ranking 12,368 injected planets against 27,496 false positives in the NGTS data. We build on past examples by using injected transit signals to form a training set, a necessary development for applying similar methods to upcoming surveys. We also make the autovet code used to implement the algorithm publicly accessible. autovet is designed to perform machine-learned vetting of planetary candidates, and can utilise a variety of methods. The apparent robustness of machine learning techniques, whether on space-based or the qualitatively different ground-based data, highlights their importance to future surveys such as TESS and PLATO and the need to better understand their advantages and pitfalls in an exoplanetary context.
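The AUC figure of merit quoted in the abstract has a simple probabilistic reading: the chance that a randomly chosen true planet outranks a randomly chosen false positive. The self-contained sketch below computes it directly from classifier scores; it is a generic rank statistic, not the NGTS autovet code (which combines random forests and self-organising maps).

```python
# Sketch: AUC as the probability that a positive (injected planet) scores
# above a negative (false positive), with ties counted as half a win.
def auc(scores_pos, scores_neg):
    """Pairwise-comparison AUC for two lists of classifier scores."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Perfectly separated scores give AUC = 1.0; random scores hover near 0.5.
perfect = auc([0.9, 0.8], [0.1, 0.2])
```

The O(n*m) pairwise loop is fine for illustration; production pipelines would use a rank-based formulation instead.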

  5. Predicting Electron Population Characteristics in 2-D Using Multispectral Ground-Based Imaging

    Science.gov (United States)

    Grubbs, Guy; Michell, Robert; Samara, Marilia; Hampton, Donald; Jahn, Jorg-Micha

    2018-01-01

    Ground-based imaging and in situ sounding rocket data are compared to electron transport modeling for an active inverted-V type auroral event. The Ground-to-Rocket Electrodynamics-Electrons Correlative Experiment (GREECE) mission successfully launched from Poker Flat, Alaska, on 3 March 2014 at 11:09:50 UT and reached an apogee of approximately 335 km over the aurora. Multiple ground-based electron-multiplying charge-coupled device (EMCCD) imagers were positioned at Venetie, Alaska, and aimed toward magnetic zenith. The imagers observed the intensity of different auroral emission lines (427.8, 557.7, and 844.6 nm) at the magnetic foot point of the rocket payload. Emission line intensity data are correlated with electron characteristics measured by the GREECE onboard electron spectrometer. A modified version of the GLobal airglOW (GLOW) model is used to estimate precipitating electron characteristics based on optical emissions. GLOW predicted the electron population characteristics with 20% error given the observed spectral intensities within 10° of magnetic zenith. Predictions are within 30% of the actual values within 20° of magnetic zenith for inverted-V-type aurora. Therefore, it is argued that this technique can be used, at least in certain types of aurora, such as the inverted-V type presented here, to derive 2-D maps of electron characteristics. These can then be used to further derive 2-D maps of ionospheric parameters as a function of time, based solely on multispectral optical imaging data.

  6. Proceedings of the 27th Seismic Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    International Nuclear Information System (INIS)

    Wetovsky, Marvin A.; Benson, Jody; Patterson, Eileen F.

    2005-01-01

    These proceedings contain papers prepared for the 27th Seismic Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, held 20-22 September, 2005 in Rancho Mirage, California. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Technical Applications Center (AFTAC), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  7. Development of a Ground-Based Atmospheric Monitoring Network for the Global Mercury Observation System (GMOS)

    Directory of Open Access Journals (Sweden)

    Sprovieri F.

    2013-04-01

    Consistent, high-quality measurements of atmospheric mercury (Hg) are necessary in order to better understand Hg emissions, transport, and deposition on a global scale. Although the number of atmospheric Hg monitoring stations has increased in recent years, the available measurement database is limited and there are many regions of the world where measurements have not been extensively performed. Long-term atmospheric Hg monitoring and additional ground-based monitoring sites are needed in order to generate datasets that will offer new insight and information about the global-scale trends of atmospheric Hg emissions and deposition. In the framework of the Global Mercury Observation System (GMOS) project, a coordinated global observational network for atmospheric Hg is being established. The overall research strategy of GMOS is to develop a state-of-the-art observation system able to provide information on the concentration of Hg species in ambient air and precipitation on the global scale. This network is being developed by integrating previously established ground-based atmospheric Hg monitoring stations with newly established GMOS sites that are located both at high-altitude and sea-level locations, as well as in climatically diverse regions. Through the collection of consistent, high-quality atmospheric Hg measurement data, we seek to create a comprehensive assessment of atmospheric Hg concentrations and their dependence on meteorology, long-range atmospheric transport and atmospheric emissions.

  8. Improving Agricultural Water Resources Management Using Ground-based Infrared Thermometry

    Science.gov (United States)

    Taghvaeian, S.

    2014-12-01

    Irrigated agriculture is the largest user of freshwater resources in arid/semi-arid parts of the world. Meeting rapidly growing demands in food, feed, fiber, and fuel while minimizing environmental pollution under a changing climate requires significant improvements in agricultural water management and irrigation scheduling. Although recent advances in remote sensing techniques and hydrological modeling have provided valuable information on agricultural water resources and their management, real improvements will only occur if farmers, the decision makers on the ground, are provided with simple, affordable, and practical tools to schedule irrigation events. This presentation reviews efforts in developing methods based on ground-based infrared thermometry and thermography for day-to-day management of irrigation systems. The results of research studies conducted in Colorado and Oklahoma show that ground-based remote sensing methods can be used effectively in quantifying water stress and consequently triggering irrigation events. Crop water use estimates based on stress indices have also been shown to be in good agreement with estimates based on other methods (e.g. surface energy balance, root zone soil water balance, etc.). Major challenges to the adoption of this approach by agricultural producers include reduced accuracy under cloudy and humid conditions and the inability to forecast the irrigation date, which is critical knowledge, since many irrigators need to decide about irrigations a few days in advance.
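
    The studies above trigger irrigation from canopy-temperature stress indices. A common choice (not named explicitly in the abstract) is the empirical Crop Water Stress Index, which places the measured canopy temperature between wet (fully transpiring) and dry (non-transpiring) reference temperatures. A minimal sketch, with hypothetical argument names:

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Empirical Crop Water Stress Index: 0 means the canopy is as
    cool as a fully transpiring (wet) reference, 1 means it is as
    warm as a non-transpiring (dry) reference. Temperatures in
    degrees C. Illustrative formulation only."""
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    index = (t_canopy - t_wet) / (t_dry - t_wet)
    # clamp to the physically meaningful range [0, 1]
    return max(0.0, min(1.0, index))
```

    An irrigation-scheduling rule might then trigger an event whenever the index exceeds a crop-specific threshold.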

  9. Proceedings of the 2010 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wetovsky, Marvin A [Editor; Patterson, Eileen F [Editor

    2010-09-21

    These proceedings contain papers prepared for the Monitoring Research Review 2010: Ground-Based Nuclear Explosion Monitoring Technologies, held 21-23 September, 2010 in Orlando, Florida. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, National Science Foundation (NSF), Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  10. Recent successes and emerging challenges for coordinated satellite/ground-based magnetospheric exploration and modeling.

    Science.gov (United States)

    Angelopoulos, Vassilis

    With the availability of a distributed constellation of spacecraft (THEMIS, Geotail, Cluster) and increasingly capable ground-based arrays (SuperDARN, THEMIS/GBOs), it is now possible to infer, simply from timing, significant information regarding the mapping of magnetospheric phenomena. Optical, magnetometer and radar data can pinpoint the location and nature of onset signatures. On the other hand, magnetic field modeling constrained by physical boundaries (such as the isotropy boundary), the measured magnetic field, and total pressure values at a distributed network of satellites has proven to do a much better job at correlating ionospheric precipitation and diffuse auroral boundaries to magnetospheric phenomena, such as the inward boundary of the dipolarization fronts. It is now possible to routinely compare in-situ measured phase space densities of ion and electron distributions during ionosphere-magnetosphere conjunctions, in the absence of potential drops. It is also possible not only to infer equivalent current systems from the ground, but to use reconstruction of the ionospheric current system from space to determine the full electrodynamic evolution of the ionosphere and compare with radars. Assimilation of this emerging ground-based and global magnetospheric panoply into a self-consistent magnetospheric model will likely be one of the most fruitful endeavors in magnetospheric exploration during the next few years.

  11. Development and calibration of a ground-based active collector for cloud- and fogwater

    Energy Technology Data Exchange (ETDEWEB)

    Kins, L.; Junkermann, W.; Meixner, F.X.; Muller, K.P.; Ehhalt, D.H.

    1986-04-01

    In spring 1985, field experiments were started to study the scavenging processes of atmospheric trace substances. Besides the chemical analysis of precipitation samples, these studies required simultaneous collection of cloud water for chemical analysis. In particular, a ground-based cloud water collector was needed, suitable for use on top of a TV tower. Existing designs of ground-based cloud or fogwater samplers can be divided into two general classes: a) passive collectors, which utilize the ambient wind to impact the droplets on the collection surface; b) active collectors, which accelerate the droplets to a certain velocity as they approach the collection surface. In one existing design, Teflon strings are extended between two disks spaced 1 m apart. The disadvantage of this collector, for these experiments, was that the collector strings are always exposed to the ambient air, so that contamination by aerosol impact during dry periods cannot be excluded. Furthermore, because of the length of the strings, impacted droplets need a certain time to drain off, during which they remain exposed to the ambient air stream and continue to scavenge trace gases.

  12. The Monitoring Case of Ground-Based Synthetic Aperture Radar with Frequency Modulated Continuous Wave System

    Science.gov (United States)

    Zhang, H. Y.; Zhai, Q. P.; Chen, L.; Liu, Y. J.; Zhou, K. Q.; Wang, Y. S.; Dou, Y. D.

    2017-09-01

    Landslide geological disasters are widely distributed, varied, frequent, intense, and destructive; they have become a class of natural disaster with harmful effects and a wide range of influence. Ground-based synthetic aperture radar is a novel deformation monitoring technology developed in recent years, characterized by a large monitoring area, high accuracy, and long-distance, non-contact measurement. In this paper, a fast ground-based synthetic aperture radar (Fast-GBSAR) based on the frequency modulated continuous wave (FMCW) system is used to collect data on the Ma Liuzui landslide in Chongqing. The device can reduce the atmospheric errors caused by a rapidly changing environment. Landslide deformation can be monitored under severe weather conditions (for example, fog) by Fast-GBSAR, with an acquisition time as short as 5 seconds per scan. The data from the Ma Liuzui landslide in Chongqing are analyzed in this paper. The result verifies that the device can monitor landslide deformation under severe weather conditions.
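
    GB-SAR deformation monitoring ultimately rests on converting the interferometric phase change between acquisitions into line-of-sight displacement via d = -λΔφ/(4π). A minimal sketch, assuming a Ku-band wavelength of about 17.4 mm (typical of Fast-GBSAR-class instruments, but not stated in the abstract), and with the sign convention that a positive phase change means motion away from the radar:

```python
import math

def los_displacement_mm(delta_phase_rad, wavelength_mm=17.4):
    """Convert an interferometric phase change (radians) to a
    line-of-sight displacement (mm): d = -lambda * dphi / (4*pi).
    The default wavelength is an assumed Ku-band value."""
    return -wavelength_mm * delta_phase_rad / (4.0 * math.pi)
```

    Because the measurement is phase-based, displacements between acquisitions larger than a quarter wavelength are ambiguous and would require phase unwrapping, which this sketch omits.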

  13. Potential use of ground-based sensor technologies for weed detection.

    Science.gov (United States)

    Peteinatos, Gerassimos G; Weis, Martin; Andújar, Dionisio; Rueda Ayala, Victor; Gerhards, Roland

    2014-02-01

    Site-specific weed management is the part of precision agriculture (PA) that tries to effectively control weed infestations with the least economic and environmental burden. This can be achieved with the aid of ground-based or near-range sensors in combination with decision rules and precise application technologies. Near-range sensor technologies, developed for mounting on a vehicle, have been emerging for PA applications during the last three decades. These technologies focus on identifying plants and measuring their physiological status with the aid of their spectral and morphological characteristics. Cameras, spectrometers, fluorometers and distance sensors are the most prominent sensors for PA applications. The objective of this article is to describe ground-based sensors that have the potential to be used for weed detection and measurement of weed infestation level. An overview of current sensor systems is presented, describing their concepts, results that have been achieved, already utilized commercial systems and problems that persist. A perspective for the development of these sensors is given. © 2013 Society of Chemical Industry.

  14. "Slow-scanning" in Ground-based Mid-infrared Observations

    Science.gov (United States)

    Ohsawa, Ryou; Sako, Shigeyuki; Miyata, Takashi; Kamizuka, Takafumi; Okada, Kazushi; Mori, Kiyoshi; Uchiyama, Masahito S.; Yamaguchi, Junpei; Fujiyoshi, Takuya; Morii, Mikio; Ikeda, Shiro

    2018-04-01

    Chopping observations with a tip-tilt secondary mirror have conventionally been used in ground-based mid-infrared observations. However, it is not practical for next generation large telescopes to have a large tip-tilt mirror that moves at a frequency larger than a few hertz. We propose an alternative observing method, a "slow-scanning" observation. Images are continuously captured as movie data, while the field of view is slowly moved. The signal from an astronomical object is extracted from the movie data by a low-rank and sparse matrix decomposition. The performance of the "slow-scanning" observation was tested in an experimental observation with Subaru/COMICS. The quality of a resultant image in the "slow-scanning" observation was as good as in a conventional chopping observation with COMICS, at least for a bright point-source object. The observational efficiency in the "slow-scanning" observation was better than that in the chopping observation. The results suggest that the "slow-scanning" observation can be a competitive method for the Subaru telescope and be of potential interest to other ground-based facilities to avoid chopping.
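
    The low-rank and sparse matrix decomposition mentioned above can be sketched as a simple alternating scheme: a truncated SVD models the slowly varying background in the stacked movie frames, and soft-thresholding of the residual isolates the compact astronomical source. This is an illustrative stand-in under those assumptions, not the authors' actual algorithm:

```python
import numpy as np

def lowrank_sparse_split(frames, rank=1, thresh=1.0, n_iter=30):
    """Split a movie matrix (n_frames x n_pixels) into a low-rank
    background L and a sparse component S by alternating a
    truncated SVD with soft-thresholding of the residual."""
    D = np.asarray(frames, dtype=float)
    S = np.zeros_like(D)
    for _ in range(n_iter):
        # low-rank background estimate of the current residual
        U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # sparse source estimate via soft-thresholding
        R = D - L
        S = np.sign(R) * np.maximum(np.abs(R) - thresh, 0.0)
    return L, S
```

    On synthetic data with a smooth background and a single bright pixel, the sparse component S concentrates on the injected source while L absorbs the background.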

  15. Proceedings of the 28th Seismic Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wetovsky, Marvin A. [Editor; Benson, Jody [Editor; Patterson, Eileen F. [Editor

    2006-09-19

    These proceedings contain papers prepared for the 28th Seismic Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, held 19-21 September, 2006 in Orlando, Florida. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Technical Applications Center (AFTAC), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  16. Proceedings of the 2009 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Wetovsky, Marv A [Los Alamos National Laboratory; Aguilar - Chang, Julio [Los Alamos National Laboratory; Anderson, Dale [Los Alamos National Laboratory; Arrowsmith, Marie [Los Alamos National Laboratory; Arrowsmith, Stephen [Los Alamos National Laboratory; Baker, Diane [Los Alamos National Laboratory; Begnaud, Michael [Los Alamos National Laboratory; Harste, Hans [Los Alamos National Laboratory; Maceira, Monica [Los Alamos National Laboratory; Patton, Howard [Los Alamos National Laboratory; Phillips, Scott [Los Alamos National Laboratory; Randall, George [Los Alamos National Laboratory; Rowe, Charlotte [Los Alamos National Laboratory; Stead, Richard [Los Alamos National Laboratory; Steck, Lee [Los Alamos National Laboratory; Whitaker, Rod [Los Alamos National Laboratory; Yang, Xiaoning ( David ) [Los Alamos National Laboratory

    2009-09-21

    These proceedings contain papers prepared for the Monitoring Research Review 2009: Ground-Based Nuclear Explosion Monitoring Technologies, held 21-23 September, 2009 in Tucson, Arizona. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), and other invited sponsors. The scientific objectives of the research are to improve the United States’ capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  17. Proceedings of the 2010 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    International Nuclear Information System (INIS)

    Wetovsky, Marvin A.; Patterson, Eileen F.

    2010-01-01

    These proceedings contain papers prepared for the Monitoring Research Review 2010: Ground-Based Nuclear Explosion Monitoring Technologies, held 21-23 September, 2010 in Orlando, Florida. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, National Science Foundation (NSF), Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  18. Proceedings of the 28th Seismic Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    International Nuclear Information System (INIS)

    Wetovsky, Marvin A.; Benson, Jody; Patterson, Eileen F.

    2006-01-01

    These proceedings contain papers prepared for the 28th Seismic Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, held 19-21 September, 2006 in Orlando, Florida. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Technical Applications Center (AFTAC), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  19. A New Technique to Observe ENSO Activity via Ground-Based GPS Receivers

    Science.gov (United States)

    Suparta, Wayan; Iskandar, Ahmad; Singh, Mandeep Singh Jit

    In an attempt to study the effects of global climate change in the tropics and improve global climate models, this paper aims to detect ENSO events, especially the El Niño phase, by using ground-based GPS receivers. Precipitable water vapor (PWV) obtained from Global Positioning System (GPS) Meteorology measurements, together with the sea surface temperature anomaly (SSTa), is used to examine the response to El Niño activity. Data gathered from four selected stations over Southeast Asia, namely PIMO (Philippines), KUAL (Malaysia), NTUS (Singapore) and BAKO (Indonesia), for the years 2009/2010 were processed. A strong correlation was observed for the PIMO station, with a correlation coefficient of -0.90, significant at the 99% confidence level. In general, the relationship between GPS PWV and SSTa at all stations on a weekly basis showed a negative correlation, indicating that during the El Niño event PWV followed a decreasing trend. This decreasing trend in PWV is caused by the dry conditions that affect the GPS signals through ocean-atmosphere coupling. Based on these promising results, we propose that ground-based GPS receivers are capable of monitoring ENSO activity, a prospective method that was previously unexplored.
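
    The station-by-station comparison above reduces to a Pearson correlation between weekly GPS PWV and SSTa series. A minimal sketch with synthetic (not the paper's) data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length
    series, e.g. weekly GPS PWV versus SSTa."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))
```

    A perfectly anti-correlated pair, such as a linearly decreasing PWV series against a linearly increasing SSTa series, yields r = -1; the -0.90 reported for PIMO indicates a strong but imperfect inverse relationship.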

  20. Proceedings of the 29th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies

    International Nuclear Information System (INIS)

    Wetovsky, Marvin A.; Benson, Jody; Patterson, Eileen F.

    2007-01-01

    These proceedings contain papers prepared for the 29th Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, held 25-27 September, 2007 in Denver, Colorado. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Air Force Technical Applications Center (AFTAC), Air Force Research Laboratory (AFRL), US Army Space and Missile Defense Command, Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  1. Exploring the relationship between monitored ground-based and satellite aerosol measurements over the City of Johannesburg

    CSIR Research Space (South Africa)

    Garland, Rebecca M

    2012-09-01

    Full Text Available This project studied the relationship between aerosol optical depth (AOD) from the Multi-angle Imaging SpectroRadiometer (MISR) instrument on the Terra satellite, and ground-based monitored particulate matter (PM) mass concentrations measured...

  2. Information Technology Management: Select Controls for the Information Security of the Ground-Based Midcourse Defense Communications Network

    National Research Council Canada - National Science Library

    Truex, Kathryn M; Lamar, Karen J; Leighton, George A; Woodruff, Courtney E; Brunetti, Tina N; Russell, Dawn M

    2006-01-01

    ... to the Ground-Based Midcourse Defense Communications Network should read this report to reduce the risk of interruption, misuse, modification, and unauthorized access to information in the system...

  3. Ground-Based Global Navigation Satellite System (GNSS) GPS Broadcast Ephemeris Data (daily files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) GPS Broadcast Ephemeris Data (daily files) from the NASA Crustal Dynamics Data...

  4. Ground-Based Global Navigation Satellite System Mixed Broadcast Ephemeris Data (sub-hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Mixed Broadcast Ephemeris Data (sub-hourly files) from the NASA Crustal Dynamics Data...

  5. Coordinated Ground-Based Observations and the New Horizons Fly-by of Pluto

    Science.gov (United States)

    Young, Eliot; Young, Leslie; Parker, Joel; Binzel, Richard

    2015-04-01

    The New Horizons (NH) spacecraft is scheduled to make its closest approach to Pluto on July 14, 2015. NH carries seven scientific instruments, including separate UV and Visible-IR spectrographs, a long-focal-length imager, two plasma-sensing instruments and a dust counter. There are three arenas in particular in which ground-based observations should augment the NH instrument suite in synergistic ways: IR spectra at wavelengths longer than 2.5 µm (i.e., longer than the coverage of the NH Ralph spectrograph), stellar occultation observations near the time of the fly-by, and thermal surface maps and atmospheric CO abundances based on ALMA observations - we discuss the first two of these. IR spectra in the 3 - 5 µm range cover the CH4 absorption band near 3.3 µm. This band can be an important constraint on the state and areal extent of nitrogen frost on Pluto's surface. If this band depth is close to zero (as was observed by Olkin et al. 2007), it limits the area of nitrogen frost, which is bright at that wavelength. Combined with the NH observations of nitrogen frost at 2.15 µm, the ground-based spectra will determine how much nitrogen frost is diluted with methane, which is a basic constraint on the seasonal cycle of sublimation and condensation that takes place on Pluto (and similar objects like Triton and Eris). There is a fortuitous stellar occultation by Pluto on 29-JUN-2015, only two weeks before the NH closest approach. The occulted star will be the brightest ever observed in a Pluto event, about 2 magnitudes brighter than Pluto itself. The track of the event is predicted to cover parts of Australia and New Zealand. Thanks to HST and ground-based campaigns to find a TNO target reachable by NH, the position of the shadow path will be known at the +/-100 km level, allowing SOFIA and mobile ground-based observers to reliably cover the central flash region. Ground-based & SOFIA observations in visible and IR wavelengths will characterize the haze opacity and vertical

  6. Radiation Detection Computational Benchmark Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operation performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations were assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for

  7. Multiscale benchmarking of drug delivery vectors.

    Science.gov (United States)

    Summers, Huw D; Ware, Matthew J; Majithia, Ravish; Meissner, Kenith E; Godin, Biana; Rees, Paul

    2016-10-01

    Cross-system comparisons of drug delivery vectors are essential to ensure optimal design. An in vitro experimental protocol is presented that separates the role of the delivery vector from that of its cargo in determining the cell response, thus allowing quantitative comparison of different systems. The technique is validated through benchmarking of the dose-response of human fibroblast cells exposed to the cationic molecule polyethylene imine (PEI), delivered as a free molecule and as a cargo on the surface of CdSe nanoparticles and silica microparticles. The exposure metrics are converted to a delivered dose, with the transport properties of the different scale systems characterized by a delivery time, τ. The benchmarking highlights an agglomeration of the free PEI molecules into micron-sized clusters and identifies the metric determining cell death as the total number of PEI molecules presented to cells, determined by the delivery vector dose and the surface density of the cargo. Copyright © 2016 Elsevier Inc. All rights reserved.
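
    The conversion of an exposure metric to a delivered dose parameterized by a delivery time τ can be illustrated with a simple first-order uptake model, dose(t) = D0·(1 - exp(-t/τ)). This saturating form is an assumption chosen for illustration; the paper's actual transport model is not reproduced here:

```python
import math

def delivered_dose(applied_dose, exposure_time, tau):
    """Cumulative dose reaching the cells after exposure_time,
    under an assumed first-order transport model with delivery
    time constant tau (same time units as exposure_time)."""
    return applied_dose * (1.0 - math.exp(-exposure_time / tau))
```

    With this form, a fast vector (small τ) delivers most of its cargo within the exposure window, while a slow vector (large τ) delivers only a fraction, which is why comparing vectors by applied concentration alone can be misleading.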

  8. 3-D neutron transport benchmarks

    International Nuclear Information System (INIS)

    Takeda, T.; Ikeda, H.

    1991-03-01

    A set of 3-D neutron transport benchmark problems proposed by Osaka University to NEACRP in 1988 has been calculated by many participants and the corresponding results are summarized in this report. The results for k-eff, control rod worth, and region-averaged fluxes for the four proposed core models, calculated using various 3-D transport codes, are compared and discussed. The calculational methods used were: Monte Carlo, Discrete Ordinates (Sn), Spherical Harmonics (Pn), Nodal Transport and others. The solutions of the four core models are quite useful as benchmarks for checking the validity of 3-D neutron transport codes.

  9. Strategic behaviour under regulatory benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Jamasb, T. [Cambridge Univ. (United Kingdom). Dept. of Applied Economics; Nillesen, P. [NUON NV (Netherlands); Pollitt, M. [Cambridge Univ. (United Kingdom). Judge Inst. of Management

    2004-09-01

    In order to improve the efficiency of electricity distribution networks, some regulators have adopted incentive regulation schemes that rely on performance benchmarking. Although regulatory benchmarking can influence the "regulation game," the subject has received limited attention. This paper discusses how strategic behaviour can result in inefficient behaviour by firms. We then use the Data Envelopment Analysis (DEA) method with US utility data to examine the implications of illustrative cases of strategic behaviour reported by regulators. The results show that gaming can have significant effects on the measured performance and profitability of firms. (author)
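
    The DEA method referenced above scores each firm by solving a small linear program. A sketch of the input-oriented CCR envelopment model, a standard DEA variant (the paper's exact specification is not reproduced here), using scipy:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns to scale) efficiency of
    decision-making unit o. X has shape (units, inputs), Y has shape
    (units, outputs). Returns theta in (0, 1]; 1 means efficient."""
    n, m = X.shape
    s = Y.shape[1]
    # decision variables: [theta, lambda_1 .. lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0                               # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                       # sum_j lam_j*x_ji <= theta*x_oi
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                       # sum_j lam_j*y_jr >= y_or
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.vstack(A_ub), b_ub=b_ub,
                  bounds=[(0, None)] * (1 + n))
    return float(res.x[0])
```

    For two firms producing one unit of output from inputs of 1 and 2 respectively, the higher-input firm scores 0.5: it could, in principle, produce the same output with half its input. Strategic behaviour matters here because a firm's score depends on the reported inputs and outputs of its peers.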

  10. Atomic Energy Research benchmark activity

    International Nuclear Information System (INIS)

    Makai, M.

    1998-01-01

    The test problems utilized in the validation and verification of computer programs in Atomic Energy Research are collected together here. This is the first step towards issuing a volume in which tests for VVER are collected, along with reference solutions and a number of solutions. The benchmarks do not include the ZR-6 experiments because they have been published, along with a number of comparisons, in the Final Reports of TIC. The present collection focuses on operational and mathematical benchmarks which cover almost the entire range of reactor calculations. (Author)

  11. Atomic oxygen effects on boron nitride and silicon nitride: A comparison of ground based and space flight data

    Science.gov (United States)

    Cross, J. B.; Lan, E. H.; Smith, C. A.; Whatley, W. J.

    1990-01-01

    The effects of atomic oxygen on boron nitride (BN) and silicon nitride (Si3N4) were evaluated in a low Earth orbit (LEO) flight experiment and in a ground-based simulation facility. In both the in-flight and ground-based experiments, these materials were coated on thin (approx. 250 Å) silver films, and the electrical resistance of the silver was measured in situ to detect any penetration of atomic oxygen through the BN and Si3N4 materials. In the presence of atomic oxygen, silver oxidizes to form silver oxide, which has a much higher electrical resistance than pure silver. Permeation of atomic oxygen through BN, as indicated by an increase in the electrical resistance of the silver underneath, was observed in both the in-flight and ground-based experiments. In contrast, no permeation of atomic oxygen through Si3N4 was observed in either the in-flight or ground-based experiments. The ground-based results show good qualitative correlation with the LEO flight results, indicating that ground-based facilities such as the one at Los Alamos National Lab can reproduce space flight data from LEO.

  12. Study of the relations between cloud properties and atmospheric conditions using ground-based digital images

    Science.gov (United States)

    Bakalova, Kalinka

    The aerosol constituents of the earth atmosphere are of great significance for the radiation budget and global climate of the planet. They are the precursors of clouds that in turn play an essential role in these processes and in the hydrological cycle of the Earth. Understanding the complex aerosol-cloud interactions requires a detailed knowledge of the dynamical processes moving the water vapor through the atmosphere, and of the physical mechanisms involved in the formation and growth of cloud particles. Ground-based observations on regional and short time scale provide valuable detailed information about atmospheric dynamics and cloud properties, and are used as a complementary tool to the global satellite observations. The objective of the present paper is to study the physical properties of clouds as displayed in ground-based visible images, and juxtapose them to the specific surface and atmospheric meteorological conditions. The observations are being carried out over the urban area of the city of Sofia, Bulgaria. The data obtained from visible images of clouds enable a quantitative description of texture and morphological features of clouds such as shape, thickness, motion, etc. These characteristics are related to cloud microphysical properties. The changes of relative humidity and the horizontal visibility are considered to be representative of the variations of the type (natural/manmade) and amount of the atmospheric aerosols near the earth surface, and potentially, the cloud drop number concentration. The atmospheric dynamics is accounted for by means of the values of the atmospheric pressure, temperature, wind velocity, etc., observed at the earth's surface. The advantage of ground-based observations of clouds compared to satellite ones is in the high spatial and temporal resolution of the obtained data about the lowermost cloud layer, which in turn is sensitive to the meteorological regimes that determine cloud formation and evolution. It turns out

  13. Laser Guidestar Satellite for Ground-based Adaptive Optics Imaging of Geosynchronous Satellites and Astronomical Targets

    Science.gov (United States)

    Marlow, W. A.; Cahoy, K.; Males, J.; Carlton, A.; Yoon, H.

    2015-12-01

Real-time observation and monitoring of geostationary (GEO) satellites with ground-based imaging systems would be an attractive alternative to fielding high cost, long lead, space-based imagers, but ground-based observations are inherently limited by atmospheric turbulence. Adaptive optics (AO) systems are used to help ground telescopes achieve diffraction-limited seeing. AO systems have historically relied on the use of bright natural guide stars or laser guide stars projected on a layer of the upper atmosphere by ground laser systems. There are several challenges with this approach, such as the sidereal motion of GEO objects relative to natural guide stars and limitations of ground-based laser guide stars: they cannot be used to correct tip-tilt, they are not point sources, and they have finite angular sizes when detected at the receiver. There is a difference between the wavefront error measured using the guide star and that of the target due to the cone effect, which also makes it difficult to use a distributed aperture system with a larger baseline to improve resolution. Inspired by previous concepts proposed by A.H. Greenaway, we propose using a space-based laser guide star projected from a satellite orbiting the Earth. We show that a nanosatellite-based guide star system meets the needs for imaging GEO objects using a low power laser even from 36,000 km altitude. Satellite guide star (SGS) systems would be well above atmospheric turbulence and could provide a small angular size reference source. CubeSats offer inexpensive, frequent access to space at a fraction of the cost of traditional systems, and are now being deployed to geostationary orbits and on interplanetary trajectories. The fundamental CubeSat bus unit of 10 cm cubed can be combined in multiple units and offers a common form factor allowing for easy integration as secondary payloads on traditional launches and rapid testing of new technologies on-orbit.
We describe a 6U CubeSat SGS measuring 10 cm x 20 cm x

  14. Ground-based acoustic parametric generator impact on the atmosphere and ionosphere in an active experiment

    Directory of Open Access Journals (Sweden)

    Y. G. Rapoport

    2017-01-01

We develop the theoretical basics of active experiments with two beams of acoustic waves radiated by a ground-based sound generator. These beams are transformed into atmospheric acoustic gravity waves (AGWs), which have parameters that enable them to penetrate to the altitudes of the ionospheric E and F regions, where they influence the electron concentration of the ionosphere. Acoustic waves are generated by the ground-based parametric sound generator (PSG) at two close frequencies. The main idea of the experiment is to design the output parameters of the PSG to build a cascade scheme of nonlinear wave frequency downshift transformations that provides the necessary conditions for vertical propagation and penetration to ionospheric altitudes. The PSG generates sound waves (SWs) with frequencies f1 = 600 Hz and f2 = 625 Hz and large amplitudes (100–420 m s−1). Each of these waves is modulated with a frequency of 0.016 Hz. The novelty of the proposed analytical–numerical model lies in simultaneously accounting for nonlinearity, diffraction, losses, and dispersion, and in including a two-stage transformation: (1) of the initial acoustic waves into the acoustic wave at the difference frequency Δf = f2 − f1 in the altitude range 0–0.1 km, in a strongly nonlinear regime, and (2) of the difference-frequency acoustic wave into atmospheric acoustic gravity waves at the modulational frequency in the altitude range 0.1–20 km, which then reach the altitudes of the ionospheric E and F regions in a practically linear regime. AGWs nonlinearly transformed from the sound waves launched by the two-frequency ground-based sound generator can increase the transparency of the ionosphere for electromagnetic waves in the HF (MHz) and VLF (kHz) ranges. The developed theoretical model can be used for interpreting an active experiment that includes the PSG impact on the atmosphere–ionosphere system.
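The first stage of the cascade, generation of the difference frequency by a nonlinearity acting on the two tones, can be illustrated numerically. The sketch below assumes a pure square-law nonlinearity, a simplification of the strongly nonlinear acoustics the abstract describes; only the tone frequencies f1 = 600 Hz and f2 = 625 Hz are taken from the text:

```python
import numpy as np

# Two PSG tones passed through an assumed square-law nonlinearity produce
# a component at the difference frequency f2 - f1 = 25 Hz.
fs = 4000.0                               # sample rate (Hz)
t = np.arange(int(fs)) / fs               # 1 s of samples
f1, f2 = 600.0, 625.0                     # PSG tone frequencies (from the abstract)
s = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

s_nl = s ** 2                             # quadratic nonlinearity (assumption)
spec = np.abs(np.fft.rfft(s_nl))
freqs = np.fft.rfftfreq(s_nl.size, d=1 / fs)

# The strongest non-DC component below 100 Hz sits at the difference frequency
low = (freqs > 1) & (freqs < 100)
peak = float(freqs[low][np.argmax(spec[low])])
print(peak)  # 25.0
```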

  15. Benchmarked Library Websites Comparative Study

    KAUST Repository

    Ramli, Rindra M.; Tyhurst, Janis

    2015-01-01

This presentation provides an analysis of services provided by the benchmarked library websites. The exploratory study compares these websites against a list of criteria and presents the services that are most commonly deployed by the selected websites. In addition, the investigators propose a list of services that could be provided via the KAUST library website.

  16. Prismatic Core Coupled Transient Benchmark

    International Nuclear Information System (INIS)

    Ortensi, J.; Pope, M.A.; Strydom, G.; Sen, R.S.; DeHart, M.D.; Gougar, H.D.; Ellis, C.; Baxter, A.; Seker, V.; Downar, T.J.; Vierow, K.; Ivanov, K.

    2011-01-01

The Prismatic Modular Reactor (PMR) is one of the High Temperature Reactor (HTR) design concepts that have existed for some time. Several prismatic units have operated in the world (DRAGON, Fort St. Vrain, Peach Bottom) and one unit is still in operation (HTTR). The deterministic neutronics and thermal-fluids transient analysis tools and methods currently available for the design and analysis of PMRs have lagged behind the state of the art of LWR technologies. This has motivated the development of more accurate and efficient tools for the design and safety evaluation of the PMR. In addition to the work invested in new methods, it is essential to develop appropriate benchmarks to verify and validate the new methods in computer codes. The purpose of this benchmark is to establish a well-defined problem, based on a common given set of data, for comparing methods and tools in core simulation and thermal hydraulics analysis, with a specific focus on transient events. The benchmark working group is currently seeking OECD/NEA sponsorship. This benchmark builds heavily on the success of the PBMR-400 exercise.

  17. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    International Nuclear Information System (INIS)

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-01-01

Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲2%). For confident detection of planets in EWCP events, it is necessary to have both higher cadence monitoring and better photometric accuracy than current follow-up observation systems provide. The next-generation ground-based observation project, the Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using KMTNet. From this study, we find that EWCP events occur with a frequency of >50% in the case of ≲100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳1 M_E planets in EWCP events with deviations ≤2% can be detected with a frequency of >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giants, because KMTNet easily saturates around the peak of such events owing to its constant exposure time. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of planet mass, will be detected in the near future.

  18. Automated Ground-based Time-lapse Camera Monitoring of West Greenland ice sheet outlet Glaciers: Challenges and Solutions

    Science.gov (United States)

    Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.

    2008-12-01

Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the earth science community for decades, and time series analysis of sensor data has provided important information on glacier flow variability by detecting speed and thickness changes, tracking features, and acquiring model input. Thanks to advances in commercial digital camera technology and increased solid-state storage, we activated automatic ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlet glaciers and collected data at one-hour intervals, continuously for more than one year at some but not all sites. We believe that important information on ice dynamics is contained in these data and that terrestrial mono-/stereo-photogrammetry can provide the theoretical and practical fundamentals for data processing, along with digital image processing techniques. Time-lapse images over periods in west Greenland capture various phenomena. Problematic are rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, fox chewing of instrument cables, and the pecking of the plastic window by ravens. Other problems include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. Another obstacle is that a non-metric digital camera introduces large distortion that must be compensated for precise photogrammetric use. Further, a massive number of images need to be processed in a way that is sufficiently computationally efficient. We meet these challenges by 1) identifying problems in possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability.
We experiment with mono and stereo photogrammetric techniques with the aid of automatic correlation matching for efficiently handling the enormous
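The displacement computation via automatic correlation matching can be sketched as an exhaustive normalized cross-correlation search. This is a generic illustration, not the authors' pipeline; the window sizes and synthetic test image are arbitrary, and real photogrammetric processing adds distortion correction and sub-pixel refinement:

```python
import numpy as np

def ncc_displacement(img_a, img_b, y, x, tsize=8, search=5):
    """Estimate the (dy, dx) displacement of a template centred at (y, x)
    in img_a by exhaustive normalized cross-correlation over a small
    search window in img_b."""
    h = tsize // 2
    tpl = img_a[y - h:y + h, x - h:x + h].astype(float)
    tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-12)
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = img_b[y + dy - h:y + dy + h,
                        x + dx - h:x + dx + h].astype(float)
            win = (win - win.mean()) / (win.std() + 1e-12)
            score = float(np.mean(tpl * win))      # normalized correlation
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

# Synthetic check: shift a random texture by (2, -1) pixels and recover it
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(np.roll(img, 2, axis=0), -1, axis=1)
print(ncc_displacement(img, shifted, 32, 32))  # (2, -1)
```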

  19. Lightning discrimination by a ground-based nuclear burst detection system

    International Nuclear Information System (INIS)

    Thornbrough, A.D.

    1978-04-01

Sandia Laboratories is developing for the U.S. Army a Ground-Based Nuclear Burst Detection System to provide pertinent information for its field commanders and higher authorities. The equipment must operate in all kinds of weather and produce very few false alarms under all types of conditions. With these requirements in mind, a study of the effects during thunderstorms, including thousands of lightning flashes, was conducted. The result of these studies was that, with suitable discrimination, the system had no false alarms during a period of high thunderstorm activity in the Albuquerque area from September 13 to October 3, 1977. Data and plots are included for those false alarms that were recorded before the final discriminants were implemented, to provide an inventory of waveshapes for additional analysis

  20. Lightning discrimination by a ground-based nuclear burst detection system

    Energy Technology Data Exchange (ETDEWEB)

    Thornbrough, A.D.

    1978-04-01

Sandia Laboratories is developing for the U.S. Army a Ground-Based Nuclear Burst Detection System to provide pertinent information for its field commanders and higher authorities. The equipment must operate in all kinds of weather and produce very few false alarms under all types of conditions. With these requirements in mind, a study of the effects during thunderstorms, including thousands of lightning flashes, was conducted. The result of these studies was that, with suitable discrimination, the system had no false alarms during a period of high thunderstorm activity in the Albuquerque area from September 13 to October 3, 1977. Data and plots are included for those false alarms that were recorded before the final discriminants were implemented, to provide an inventory of waveshapes for additional analysis.

  1. Using Gaia as an Astrometric Tool for Deep Ground-based Surveys

    Science.gov (United States)

    Casetti-Dinescu, Dana I.; Girard, Terrence M.; Schriefer, Michael

    2018-04-01

    Gaia DR1 positions are used to astrometrically calibrate three epochs' worth of Subaru SuprimeCam images in the fields of globular cluster NGC 2419 and the Sextans dwarf spheroidal galaxy. Distortion-correction ``maps'' are constructed from a combination of offset dithers and reference to Gaia DR1. These are used to derive absolute proper motions in the field of NGC 2419. Notably, we identify the photometrically-detected Monoceros structure in the foreground of NGC 2419 as a kinematically-cold population of stars, distinct from Galactic-field stars. This project demonstrates the feasibility of combining Gaia with deep, ground-based surveys, thus extending high-quality astrometry to magnitudes beyond the limits of Gaia.

  2. Portable laser spectrometer for airborne and ground-based remote sensing of geological CO2 emissions.

    Science.gov (United States)

    Queisser, Manuel; Burton, Mike; Allan, Graham R; Chiarugi, Antonio

    2017-07-15

A 24 kg, suitcase-sized, CW laser remote sensing spectrometer (LARSS) with a ~2 km range has been developed. It has demonstrated its flexibility by measuring both atmospheric CO2 from an airborne platform and terrestrial emission of CO2 from a remote mud volcano, Bledug Kuwu, Indonesia, from a ground-based site. The system scans the CO2 absorption line with 20 discrete wavelengths, as opposed to the typical two-wavelength on-line/off-line instrument. This multi-wavelength approach offers effective quality control, bias control, and confidence estimates for measured CO2 concentrations via spectral fitting. The simplicity, ruggedness, and flexibility of the design allow for easy transportation and use on different platforms, with quick setup in some of the most challenging climatic conditions. While more refinement is needed, the results represent a stepping stone towards widespread use of active one-sided gas remote sensing in the earth sciences.
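Spectral fitting over a multi-wavelength scan can be illustrated with a linearized Beer-Lambert retrieval: each wavelength yields an optical depth, a single path-integrated abundance is fit by least squares, and the residuals serve as the quality control the abstract mentions. The Gaussian line shape, wavelength grid, and noise level below are illustrative assumptions, not LARSS calibration data:

```python
import numpy as np

# Linearized Beer-Lambert: ln(I0/I) = sigma(lambda) * N * L, so the
# path-integrated column N*L follows from a one-parameter least-squares
# fit over all scanned wavelengths.
rng = np.random.default_rng(1)
wl = np.linspace(1570.0, 1575.0, 20)          # 20 discrete wavelengths (nm)
sigma = np.exp(-((wl - 1572.5) / 1.0) ** 2)   # assumed relative cross section
true_col = 0.8                                # "true" optical-depth scale factor

# Noisy measured optical depths at each wavelength
tau_meas = sigma * true_col + rng.normal(0, 0.01, wl.size)

# Least-squares slope through the origin: N*L = sum(sigma*tau) / sum(sigma^2)
col_fit = float(np.sum(sigma * tau_meas) / np.sum(sigma ** 2))
resid = tau_meas - sigma * col_fit            # fit residuals -> quality control
print(round(col_fit, 2), round(float(np.std(resid)), 3))
```

The residual scatter relative to the noise level is what flags a biased or contaminated retrieval, which a two-wavelength instrument cannot do.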

  3. Atmospheric effect on the ground-based measurements of broadband surface albedo

    Directory of Open Access Journals (Sweden)

    T. Manninen

    2012-11-01

Ground-based pyranometer measurements of the (clear-sky) broadband surface albedo are affected by atmospheric conditions (mainly by aerosol particles, water vapour and ozone). A new semi-empirical method for estimating the magnitude of the effect of atmospheric conditions on surface albedo measurements in clear-sky conditions is presented. Global and reflected radiation and/or aerosol optical depth (AOD) at two wavelengths are needed to apply the method. Depending on the aerosol optical depth and the solar zenith angle values, the effect can be as large as 20%. For the cases we tested using data from the Cabauw atmospheric test site in the Netherlands, the atmosphere typically caused up to 5% overestimation of surface albedo with respect to the corresponding black-sky surface albedo values.
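The measurement itself is just the ratio of reflected to global radiation; the atmospheric effect then enters as a correction that grows with aerosol optical depth and solar zenith angle. The sketch below uses a placeholder linear correction with a made-up coefficient k, not the paper's semi-empirical formula:

```python
import numpy as np

def blue_sky_albedo(sw_down, sw_up):
    """Measured clear-sky ('blue-sky') albedo from a pyranometer pair."""
    return sw_up / sw_down

def black_sky_estimate(albedo_meas, aod500, sza_deg, k=0.1):
    """Illustrative correction toward the black-sky albedo.

    The abstract reports overestimates growing with aerosol load and solar
    zenith angle (typically up to 5% at Cabauw, up to 20% in extreme cases).
    The linear form and the coefficient k are placeholder assumptions, NOT
    the paper's semi-empirical method.
    """
    overestimate = k * aod500 / np.cos(np.radians(sza_deg))
    return albedo_meas * (1.0 - overestimate)

albedo = blue_sky_albedo(sw_down=500.0, sw_up=115.0)   # 0.23, typical of grass
corrected = black_sky_estimate(albedo, aod500=0.2, sza_deg=60.0)
print(round(albedo, 3), round(corrected, 3))
```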

  4. Managing a big ground-based astronomy project: the Thirty Meter Telescope (TMT) project

    Science.gov (United States)

    Sanders, Gary H.

    2008-07-01

    TMT is a big science project and its scale is greater than previous ground-based optical/infrared telescope projects. This paper will describe the ideal "linear" project and how the TMT project departs from that ideal. The paper will describe the needed adaptations to successfully manage real world complexities. The progression from science requirements to a reference design, the development of a product-oriented Work Breakdown Structure (WBS) and an organization that parallels the WBS, the implementation of system engineering, requirements definition and the progression through Conceptual Design to Preliminary Design will be summarized. The development of a detailed cost estimate structured by the WBS, and the methodology of risk analysis to estimate contingency fund requirements will be summarized. Designing the project schedule defines the construction plan and, together with the cost model, provides the basis for executing the project guided by an earned value performance measurement system.

  5. Space situational awareness satellites and ground based radiation counting and imaging detector technology

    International Nuclear Information System (INIS)

    Jansen, Frank; Behrens, Joerg; Pospisil, Stanislav; Kudela, Karel

    2011-01-01

    We review the current status from the scientific and technological point of view of solar energetic particles, solar and galactic cosmic ray measurements as well as high energy UV-, X- and gamma-ray imaging of the Sun. These particles and electromagnetic data are an important tool for space situational awareness (SSA) aspects like space weather storm predictions to avoid failures in space, air and ground based technological systems. Real time data acquisition, position and energy sensitive imaging are demanded by the international space weather forecast services. We present how newly developed, highly miniaturized radiation detectors can find application in space in view of future SSA related satellites as a novel space application due to their counting and imaging capabilities.

  6. Conference on the exploitation, maintenance and resale of ground-based photovoltaic plants

    International Nuclear Information System (INIS)

    Roesner, Sven; Christmann, Ralf; Bozonnat, Cedric; Le Pivert, Xavier; Vaassen, Willi; Dumoulin, Cedric; Kiefer, Klaus; Semmel, Andreas; Doose, Eckhard; Bion, Alain; Sanches, Frederico; Daval, Xavier; Pampouille, Antoine; Goetze, Holger; Stahl, Wolf-Ruediger; Merere, Karine

    2017-11-01

This document gathers contributions and debate contents of a conference. A first set of contributions addressed the situation and recent developments of ground-based photovoltaic power plants in France and in Germany, with presentations of the legal frameworks in both countries. The second set addressed the optimisation of such power plants: meteorological prediction and follow-up at the service of production, risks to which these power plants are exposed during operation, and the issue of the right price and good practices for maintenance contracts for these plants. A round table addressed the issue of the balance between optimisation and established practices in a new economic framework. The next set of contributions addressed reasons for and effects of the resale of photovoltaic fleets during their exploitation: actors and financing solutions, value components, points of attention, and a legal view on re-financing contracts. A round table discussed trends and success factors for the re-financing of photovoltaic projects.

  7. Compact binary coalescences in the band of ground-based gravitational-wave detectors

    International Nuclear Information System (INIS)

    Mandel, Ilya; O'Shaughnessy, Richard

    2010-01-01

    As the ground-based gravitational-wave telescopes LIGO, Virgo and GEO 600 approach the era of first detections, we review the current knowledge of the coalescence rates and the mass and spin distributions of merging neutron-star and black-hole binaries. We emphasize the bi-directional connection between gravitational-wave astronomy and conventional astrophysics. Astrophysical input will make possible informed decisions about optimal detector configurations and search techniques. Meanwhile, rate upper limits, detected merger rates and the distribution of masses and spins measured by gravitational-wave searches will constrain astrophysical parameters through comparisons with astrophysical models. Future developments necessary to the success of gravitational-wave astronomy are discussed.

  8. The Holy Grail of Resource Assessment: Low Cost Ground-Based Measurements with Good Accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Marion, Bill; Smith, Benjamin

    2017-06-22

Using performance data from some of the millions of installed photovoltaic (PV) modules with micro-inverters may afford the opportunity to provide the ground-based solar resource data critical for developing PV projects. The method back-solves for the direct normal irradiance (DNI) and the diffuse horizontal irradiance (DHI) from the micro-inverter ac production data. When the derived values of DNI and DHI were used to model the performance of other PV systems, the annual mean bias deviations were within +/- 4%, and only 1% greater than when the PV performance was modeled using high-quality irradiance measurements. An uncertainty analysis shows the method is better suited for modeling PV performance than using satellite-based global horizontal irradiance.
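The closure between the retrieved components and global irradiance, and the mean-bias metric quoted above, can be written compactly. The irradiance values below are hypothetical, not data from the study:

```python
import numpy as np

def ghi_from_components(dni, dhi, sza_deg):
    """Closure relation: GHI = DNI * cos(solar zenith) + DHI."""
    return dni * np.cos(np.radians(sza_deg)) + dhi

def mean_bias_deviation(modeled, measured):
    """Mean bias deviation as a fraction of the measured mean."""
    modeled = np.asarray(modeled, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float((modeled.mean() - measured.mean()) / measured.mean())

# Hypothetical hourly component retrievals, not values from the study
dni = np.array([700.0, 800.0, 650.0])   # direct normal irradiance (W/m^2)
dhi = np.array([90.0, 110.0, 130.0])    # diffuse horizontal irradiance (W/m^2)
sza = np.array([40.0, 30.0, 55.0])      # solar zenith angles (deg)

ghi = ghi_from_components(dni, dhi, sza)
mbd = mean_bias_deviation(ghi * 1.02, ghi)   # a model carrying a +2% bias
print(mbd)
```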

  9. z'-BAND GROUND-BASED DETECTION OF THE SECONDARY ECLIPSE OF WASP-19b

    Energy Technology Data Exchange (ETDEWEB)

Burton, J. R.; Watson, C. A.; Pollacco, D. [Astrophysics Research Centre, Queen's University Belfast, Belfast BT7 1NN (United Kingdom); Littlefair, S. P.; Dhillon, V. S. [Department of Physics and Astronomy, University of Sheffield, Sheffield S3 7RH (United Kingdom); Gibson, N. P. [Department of Physics, University of Oxford, Oxford OX1 3RH (United Kingdom); Marsh, T. R., E-mail: jburton04@qub.ac.uk [Department of Physics and Astronomy, University of Warwick, Coventry CV4 7AL (United Kingdom)

    2012-08-01

We present the ground-based detection of the secondary eclipse of the transiting exoplanet WASP-19b. The observations were made in the Sloan z' band using the ULTRACAM triple-beam CCD camera mounted on the New Technology Telescope. The measurement shows a 0.088% ± 0.019% eclipse depth, matching previous predictions based on H- and K-band measurements. We discuss in detail our approach to the removal of errors arising due to systematics in the data set, in addition to fitting a model transit to our data. This fit returns an eclipse center, T_0, of 2455578.7676 HJD, consistent with a circular orbit. Our measurement of the secondary eclipse depth is also compared to model atmospheres of WASP-19b and is found to be consistent with previous measurements at longer wavelengths for the model atmospheres we investigated.
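The quoted depth and its uncertainty translate directly into a detection significance:

```python
# Detection significance implied by the quoted eclipse depth and its
# 1-sigma uncertainty (0.088% +/- 0.019%, from the abstract).
depth, err = 0.00088, 0.00019
significance = depth / err
print(round(significance, 1))  # ~4.6 sigma
```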

  10. Coastal wind study based on Sentinel-1 and ground-based scanning lidar

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Badger, Merete; Pena Diaz, Alfredo

Winds in the coastal zone are important for near-shore wind farm planning. Recently the Danish Energy Agency gave new options for placing offshore wind farms much closer to the coastlines than previously. The new tender areas are located from 3 to 8 km from the coast. Ground-based scanning lidar located on land can partly cover this area, out to around 15 km. In order to improve wind farm planning for near-shore coastal areas, the project 'Reducing the Uncertainty of Near-shore Energy estimates from meso- and micro-scale wind models' (RUNE) is established. The measurement campaign starts October... The various observation types have advantages and limitations; one advantage of both Sentinel-1 and the scanning lidar is that they observe wind fields covering a large area and so can be combined for studying the spatial variability of winds. Sentinel-1 data are being processed near-real-time at DTU Wind...

  11. Perturbations of ionosphere-magnetosphere coupling by powerful VLF emissions from ground-based transmitters

    International Nuclear Information System (INIS)

    Belov, A. S.; Markov, G. A.; Ryabov, A. O.; Parrot, M.

    2012-01-01

The characteristics of the plasma-wave disturbances stimulated in the near-Earth plasma by powerful VLF radiation from ground-based transmitters are investigated. Radio communication VLF transmitters of about 1 MW in power are shown to produce artificial plasma-wave channels (density ducts) in near-Earth space that originate in the lower ionosphere above the disturbing emission source and extend through the entire ionosphere and magnetosphere of the Earth along the magnetic field lines. Measurements with the onboard equipment of the DEMETER satellite have revealed that under the action of emission from the NWC transmitter, one of the most powerful VLF radio transmitters, the generation of quasi-electrostatic (plasma) waves is observed along most of the satellite trajectory through the disturbed magnetic flux tube. This is probably indicative of stimulated emission of a magnetospheric maser.

  12. Status and plans for future generations of ground-based interferometric gravitational wave antennas

    International Nuclear Information System (INIS)

    Kawamura, Seiji

    2003-01-01

    Several medium- to large-scale ground-based interferometric gravitational-wave antennas have been constructed around the world. Although these antennas of the first generation could detect gravitational waves within a few years, it is necessary to improve the sensitivity of the detectors significantly with advanced technologies to ensure more frequent detection of gravitational waves. Stronger seismic isolation and reduction of thermal noise, especially using cryogenic mirrors, are among the most important technologies that can lead us to the realization of advanced detectors. Some of the advanced technologies are already implemented in some of the existing detectors and others are currently being investigated for the future-generation detectors such as advanced LIGO, LCGT, upgrade of GEO600, AIGO, and EURO. We expect that such advanced detectors will eventually open a new window to the universe and establish a new field, 'gravitational wave astronomy'

  13. Space situational awareness satellites and ground based radiation counting and imaging detector technology

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, Frank, E-mail: frank.jansen@dlr.de [DLR Institute of Space Systems, Robert-Hooke-Str. 7, 28359 Bremen (Germany); Behrens, Joerg [DLR Institute of Space Systems, Robert-Hooke-Str. 7, 28359 Bremen (Germany); Pospisil, Stanislav [Czech Technical University, IEAP, 12800 Prague 2, Horska 3a/22 (Czech Republic); Kudela, Karel [Slovak Academy of Sciences, IEP, 04001 Kosice, Watsonova 47 (Slovakia)

    2011-05-15

    We review the current status from the scientific and technological point of view of solar energetic particles, solar and galactic cosmic ray measurements as well as high energy UV-, X- and gamma-ray imaging of the Sun. These particles and electromagnetic data are an important tool for space situational awareness (SSA) aspects like space weather storm predictions to avoid failures in space, air and ground based technological systems. Real time data acquisition, position and energy sensitive imaging are demanded by the international space weather forecast services. We present how newly developed, highly miniaturized radiation detectors can find application in space in view of future SSA related satellites as a novel space application due to their counting and imaging capabilities.

  14. Quantifying greenhouse gas emissions from coal fires using airborne and ground-based methods

    Science.gov (United States)

    Engle, Mark A.; Radke, Lawrence F.; Heffern, Edward L.; O'Keefe, Jennifer M.K.; Smeltzer, Charles; Hower, James C.; Hower, Judith M.; Prakash, Anupma; Kolker, Allan; Eatwell, Robert J.; ter Schure, Arnout; Queen, Gerald; Aggen, Kerry L.; Stracher, Glenn B.; Henke, Kevin R.; Olea, Ricardo A.; Román-Colón, Yomayara

    2011-01-01

    Coal fires occur in all coal-bearing regions of the world and number, conservatively, in the thousands. These fires emit a variety of compounds including greenhouse gases. However, the magnitude of the contribution of combustion gases from coal fires to the environment is highly uncertain, because adequate data and methods for assessing emissions are lacking. This study demonstrates the ability to estimate CO2 and CH4 emissions for the Welch Ranch coal fire, Powder River Basin, Wyoming, USA, using two independent methods: (a) heat flux calculated from aerial thermal infrared imaging (3.7–4.4 t d−1 of CO2 equivalent emissions) and (b) direct, ground-based measurements (7.3–9.5 t d−1 of CO2 equivalent emissions). Both approaches offer the potential for conducting inventories of coal fires to assess their gas emissions and to evaluate and prioritize fires for mitigation.
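Reporting coal-fire emissions on a CO2-equivalent basis combines the CO2 and CH4 fluxes through a global warming potential. The sketch below uses the IPCC AR4 100-year GWP of 25 for CH4 and illustrative flux values, not the Welch Ranch measurements:

```python
# Combine separately measured CO2 and CH4 mass fluxes (t/d) into a single
# CO2-equivalent flux. GWP100 = 25 for CH4 is the IPCC AR4 value; later
# assessments revise it upward.
GWP100_CH4 = 25.0

def co2_equivalent(co2_t_per_day, ch4_t_per_day, gwp=GWP100_CH4):
    """CO2-equivalent flux (t/d) from CO2 and CH4 fluxes (t/d)."""
    return co2_t_per_day + gwp * ch4_t_per_day

# Illustrative fluxes only, not the study's measurements
print(co2_equivalent(co2_t_per_day=6.0, ch4_t_per_day=0.1))  # 8.5 t/d CO2-eq
```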

  15. Evaluating statistical cloud schemes: What can we gain from ground-based remote sensing?

    Science.gov (United States)

    Grützun, V.; Quaas, J.; Morcrette, C. J.; Ament, F.

    2013-09-01

    Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based remote sensing such as lidar, microwave, and radar to evaluate prognostic distribution moments using the "perfect model approach." This means that we employ a high-resolution weather model as virtual reality and retrieve full three-dimensional atmospheric quantities and virtual ground-based observations. We then use statistics from the virtual observation to validate the modeled 3-D statistics. Since the data are entirely consistent, any discrepancy occurring is due to the method. Focusing on total water mixing ratio, we find that the mean ratio can be evaluated decently but that it strongly depends on the meteorological conditions as to whether the variance and skewness are reliable. Using some simple schematic description of different synoptic conditions, we show how statistics obtained from point or line measurements can be poor at representing the full three-dimensional distribution of water in the atmosphere. We argue that a careful analysis of measurement data and detailed knowledge of the meteorological situation is necessary to judge whether we can use the data for an evaluation of higher moments of the humidity distribution used by a statistical cloud scheme.
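The prognostic moments such a scheme carries (mean, variance, skewness of total water) are straightforward to compute from any sample, whether model columns or virtual retrievals. A sketch with a synthetic, positively skewed sample standing in for real lidar or radar profiles:

```python
import numpy as np

def distribution_moments(q):
    """Mean, variance, and skewness of a sample of total-water mixing
    ratios, i.e. the moments a statistical cloud scheme carries."""
    q = np.asarray(q, dtype=float)
    mean = q.mean()
    var = q.var()
    skew = float(np.mean((q - mean) ** 3) / var ** 1.5) if var > 0 else 0.0
    return float(mean), float(var), skew

# Synthetic mixing ratios (g/kg): a gamma distribution gives the positive
# skewness a dry tail produces; a point measurement that misses this tail
# would bias the retrieved higher moments
rng = np.random.default_rng(2)
q = rng.gamma(shape=2.0, scale=1.5, size=10_000)
m, v, s = distribution_moments(q)
print(round(m, 2), round(v, 2), round(s, 2))
```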

  16. Enhancing our Understanding of Snowfall Modes with Ground-Based Observations

    Science.gov (United States)

    Pettersen, C.; Kulie, M.; Petersen, W. A.; Bliven, L. F.; Wood, N.

    2016-12-01

    Snowfall can be broadly categorized into deep and shallow events based on the vertical distribution of the precipitating ice. Remotely sensed data refine these precipitation categories and aid in discerning the underlying macro- and microphysical mechanisms. The unique patterns in these remotely sensed observations can potentially connect distinct modes of snowfall to specific processes. Though satellites can observe and recognize these patterns in snowfall, the measurements are limited, particularly in cases of shallow and light precipitation, as the snow may be too close to the surface or below the detection limits of the instrumentation. By enhancing satellite measurements with ground-based instrumentation, whether with limited-term field campaigns or long-term strategic sites, we can further our understanding of different snowfall modes and of how they are measured from spaceborne instruments. Presented are three years of data from a ground-based instrument suite consisting of a Micro Rain Radar (MRR; optimized for snow events) and a Precipitation Imaging Package (PIP). These instruments are located at the Marquette, Michigan National Weather Service Weather Forecast Office to: (a) use coincident meteorological measurements and observations to enhance our understanding of the thermodynamic drivers and (b) showcase these instruments in an operational setting to enhance forecasts of shallow snow events. Three winters of MRR and PIP measurements are partitioned, based on meteorological surface observations, into two-dimensional histograms of reflectivity and particle size distribution data. These statistics improve our interpretation of deep versus shallow precipitation. Additionally, these statistical techniques are applied to similar datasets from Global Precipitation Measurement field campaigns for further insight into cloud and precipitation macro- and microphysical processes.
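The partitioning step described above amounts to binning paired radar reflectivity and particle-size measurements into a two-dimensional histogram. A minimal sketch, with hypothetical bin edges and sample values (not the campaign data):

```python
# Sketch: binning paired reflectivity (dBZ) and particle-size (mm)
# measurements into a 2-D histogram. All values are hypothetical.

def histogram2d(xs, ys, x_edges, y_edges):
    """Count (x, y) pairs falling into each 2-D bin (right edge open)."""
    counts = [[0] * (len(y_edges) - 1) for _ in range(len(x_edges) - 1)]
    for x, y in zip(xs, ys):
        for i in range(len(x_edges) - 1):
            if x_edges[i] <= x < x_edges[i + 1]:
                for j in range(len(y_edges) - 1):
                    if y_edges[j] <= y < y_edges[j + 1]:
                        counts[i][j] += 1
    return counts

refl = [5.0, 12.0, 18.0, 25.0]   # reflectivity, dBZ
dmax = [0.5, 1.0, 1.5, 2.5]      # particle size, mm
h = histogram2d(refl, dmax, [0, 10, 20, 30], [0, 1, 2, 3])
print(h)  # [[1, 0, 0], [0, 2, 0], [0, 0, 1]]
```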

  17. Ground based mobile isotopic methane measurements in the Front Range, Colorado

    Science.gov (United States)

    Vaughn, B. H.; Rella, C.; Petron, G.; Sherwood, O.; Mielke-Maday, I.; Schwietzke, S.

    2014-12-01

    Increased development of unconventional oil and gas resources in North America has given rise to attempts to monitor and quantify fugitive emissions of methane from the industry. Emission estimates of methane from oil and gas basins can vary significantly from one study to another as well as from EPA or State estimates. New efforts are aimed at reconciling bottom-up, or inventory-based, emission estimates of methane with top-down estimates based on atmospheric measurements from aircraft, towers, mobile ground-based vehicles, and atmospheric models. Attributing airborne measurements of regional methane fluxes to specific sources is informed by ground-based measurements of methane. Stable isotopic measurements (δ13C) of methane help distinguish between emissions from the O&G industry, Confined Animal Feed Operations (CAFO), and landfills, but analytical challenges typically limit meaningful isotopic measurements to individual point sampling. We are developing a toolbox to use δ13CH4 measurements to assess the partitioning of methane emissions for regions with multiple methane sources. The method was applied to the Denver-Julesburg Basin. Here we present data from continuous isotopic measurements obtained over a wide geographic area by using MegaCore, a 1500 ft. tube that is constantly filled with sample air while driving, then subsequently analyzed at slower rates using cavity ring-down spectroscopy (CRDS). Pressure, flow and calibration are tightly controlled, allowing precise attribution of methane enhancements to their point of collection. Comparisons with point measurements are needed to confirm regional values and further constrain flux estimates and models. This effort was made in conjunction with several major field campaigns in the Colorado Front Range in July-August 2014, including FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment), DISCOVER-AQ, and the Air Water Gas NSF Sustainability Research Network at the University of Colorado.

  18. An evaluation of IASI-NH3 with ground-based Fourier transform infrared spectroscopy measurements

    Directory of Open Access Journals (Sweden)

    E. Dammers

    2016-08-01

    Full Text Available Global distributions of atmospheric ammonia (NH3) measured with satellite instruments such as the Infrared Atmospheric Sounding Interferometer (IASI) contain valuable information on NH3 concentrations and variability in regions not yet covered by ground-based instruments. Due to their large spatial coverage and (bi-)daily overpasses, the satellite observations have the potential to increase our knowledge of the distribution of NH3 emissions and associated seasonal cycles. However, the observations remain poorly validated, with only a handful of available studies, often using only surface measurements without any vertical information. In this study, we present the first validation of the IASI-NH3 product using ground-based Fourier transform infrared spectroscopy (FTIR) observations. Using a recently developed consistent retrieval strategy, NH3 concentration profiles have been retrieved using observations from nine Network for the Detection of Atmospheric Composition Change (NDACC) stations around the world between 2008 and 2015. We demonstrate the importance of strict spatio-temporal collocation criteria for the comparison. Large differences in the regression results are observed for changing intervals of spatial criteria, mostly due to terrain characteristics and the short lifetime of NH3 in the atmosphere. The seasonal variations of both datasets are consistent for most sites. Correlations are found to be high at sites in areas with considerable NH3 levels, whereas correlations are lower at sites with low atmospheric NH3 levels close to the detection limit of the IASI instrument. A combination of the observations from all sites (Nobs = 547) gives a mean relative difference of −32.4 ± 56.3 %, a correlation r of 0.8, and a slope of 0.73. These results give an improved estimate of the IASI-NH3 product performance compared to the previous upper-bound estimates (−50 to +100 %).

  19. Validation of OMI UV measurements against ground-based measurements at a station in Kampala, Uganda

    Science.gov (United States)

    Muyimbwa, Dennis; Dahlback, Arne; Stamnes, Jakob; Hamre, Børge; Frette, Øyvind; Ssenyonga, Taddeo; Chen, Yi-Chun

    2015-04-01

    We present solar ultraviolet (UV) irradiance data measured with a NILU-UV instrument at a ground site in Kampala (0.31°N, 32.58°E), Uganda, for the period 2005-2014. The data were analyzed and compared with UV irradiances inferred from the Ozone Monitoring Instrument (OMI) for the same period. Kampala is located on the shores of Lake Victoria, Africa's largest freshwater lake, which may influence the climate and weather conditions of the region. The region also sees heavy use of aging vehicles, which may contribute to a high anthropogenic loading of absorbing aerosols. The OMI surface UV algorithm does not account for absorbing aerosols, which may lead to systematic overestimation of surface UV irradiances inferred from OMI satellite data. We retrieved UV index values from OMI UV irradiances and validated them against the ground-based UV index values obtained from NILU-UV measurements. The UV index values were found to follow a seasonal pattern similar to that of the clouds and the rainfall. OMI-inferred UV index values were overestimated with a mean bias of about 28% under all-sky conditions, but the mean bias was reduced to about 8% under clear-sky conditions, when only days with a radiation modification factor (RMF) greater than 65% were considered. However, when days with RMF greater than 70, 75, and 80% were considered, OMI-inferred UV index values were found to agree with the ground-based UV index values to within 5, 3, and 1%, respectively. In the validation we identified clouds and aerosols, present in 88% of the measurements, as the main cause of the OMI-inferred overestimation of the UV index.
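The validation statistic reported above is a mean relative bias of satellite against ground UV-index values, recomputed under increasingly strict clear-sky (RMF) thresholds. A minimal sketch of that calculation, with hypothetical data values:

```python
# Sketch: mean relative bias of satellite vs. ground UV-index values,
# optionally restricted to near-clear-sky days via a radiation
# modification factor (RMF) threshold. All data are hypothetical.

def mean_bias_percent(sat, ground, rmf, rmf_min=0.0):
    """Mean of (sat - ground)/ground in %, over days with RMF >= rmf_min."""
    diffs = [(s - g) / g * 100.0
             for s, g, r in zip(sat, ground, rmf) if r >= rmf_min]
    return sum(diffs) / len(diffs)

sat    = [10.0, 12.0, 9.0]   # satellite-inferred UV index
ground = [8.0, 10.0, 9.0]    # ground-based UV index
rmf    = [50.0, 70.0, 80.0]  # radiation modification factor, %

print(round(mean_bias_percent(sat, ground, rmf), 2))            # all-sky
print(round(mean_bias_percent(sat, ground, rmf, rmf_min=65), 2))  # clear-sky
```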

  20. Introducing the VISAGE project - Visualization for Integrated Satellite, Airborne, and Ground-based data Exploration

    Science.gov (United States)

    Gatlin, P. N.; Conover, H.; Berendes, T.; Maskey, M.; Naeger, A. R.; Wingo, S. M.

    2017-12-01

    A key component of NASA's Earth observation system is its field experiments, for intensive observation of particular weather phenomena, or for ground validation of satellite observations. These experiments collect data from a wide variety of airborne and ground-based instruments, on different spatial and temporal scales, often in unique formats. The field data are often used with high volume satellite observations that have very different spatial and temporal coverage. The challenges inherent in working with such diverse datasets make it difficult for scientists to rapidly collect and analyze the data for physical process studies and validation of satellite algorithms. The newly-funded VISAGE project will address these issues by combining and extending nascent efforts to provide on-line data fusion, exploration, analysis and delivery capabilities. A key building block is the Field Campaign Explorer (FCX), which allows users to examine data collected during field campaigns and simplifies data acquisition for event-based research. VISAGE will extend FCX's capabilities beyond interactive visualization and exploration of coincident datasets, to provide interrogation of data values and basic analyses such as ratios and differences between data fields. The project will also incorporate new, higher level fused and aggregated analysis products from the System for Integrating Multi-platform data to Build the Atmospheric column (SIMBA), which combines satellite and ground-based observations into a common gridded atmospheric column data product; and the Validation Network (VN), which compiles a nationwide database of coincident ground- and satellite-based radar measurements of precipitation for larger scale scientific analysis. The VISAGE proof-of-concept will target "golden cases" from Global Precipitation Measurement Ground Validation campaigns. This presentation will introduce the VISAGE project, initial accomplishments and near term plans.

  1. Ground-based Observations and Atmospheric Modelling of Energetic Electron Precipitation Effects on Antarctic Mesospheric Chemistry

    Science.gov (United States)

    Newnham, D.; Clilverd, M. A.; Horne, R. B.; Rodger, C. J.; Seppälä, A.; Verronen, P. T.; Andersson, M. E.; Marsh, D. R.; Hendrickx, K.; Megner, L. S.; Kovacs, T.; Feng, W.; Plane, J. M. C.

    2016-12-01

    The effect of energetic electron precipitation (EEP) on the seasonal and diurnal abundances of nitric oxide (NO) and ozone in the Antarctic middle atmosphere during March 2013 to July 2014 is investigated. Geomagnetic storm activity during this period, close to solar maximum, was driven primarily by impulsive coronal mass ejections. Near-continuous ground-based atmospheric measurements have been made by a passive millimetre-wave radiometer deployed at Halley station (75°37'S, 26°14'W, L = 4.6), Antarctica. This location is directly under the region of radiation-belt EEP, at the extremity of magnetospheric substorm-driven EEP, and deep within the polar vortex during austral winter. Superposed epoch analyses of the ground-based data, together with NO observations made by the Solar Occultation For Ice Experiment (SOFIE) onboard the Aeronomy of Ice in the Mesosphere (AIM) satellite, show enhanced mesospheric NO following moderate geomagnetic storms (Dst ≤ -50 nT). Measurements by co-located 30 MHz riometers indicate simultaneous increases in ionisation at 75-90 km directly above Halley when the Kp index is ≥ 4. Direct NO production by EEP in the upper mesosphere, versus downward transport of NO from the lower thermosphere, is evaluated using a new version of the Whole Atmosphere Community Climate Model incorporating the full Sodankylä Ion Neutral Chemistry model (WACCM SIC). Model ionization rates are derived from the Medium Energy Proton and Electron Detector (MEPED) of the second-generation Space Environment Monitor (SEM-2) on the Polar orbiting Operational Environmental Satellites (POES). The model data are compared with observations to quantify the impact of EEP on stratospheric and mesospheric odd nitrogen (NOx), odd hydrogen (HOx), and ozone.
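The superposed epoch analysis mentioned above aligns time-series segments on event onsets (here, geomagnetic storms) and averages them sample by sample, so that storm-related signals reinforce while unrelated variability averages out. A minimal sketch with hypothetical data:

```python
# Sketch of superposed epoch analysis: extract fixed-length windows
# around each event onset and average them sample-by-sample.
# The series values and epoch indices are hypothetical.

def superposed_epoch(series, epochs, before, after):
    """Average `series` over windows [epoch - before, epoch + after)."""
    windows = [series[e - before:e + after] for e in epochs
               if e - before >= 0 and e + after <= len(series)]
    n = len(windows)
    return [sum(w[i] for w in windows) / n for i in range(before + after)]

no_density = [1, 1, 5, 3, 1, 1, 7, 3, 1, 1]  # hypothetical NO signal
storm_onsets = [2, 6]                         # hypothetical epoch indices
print(superposed_epoch(no_density, storm_onsets, before=1, after=2))
# -> [1.0, 6.0, 3.0]
```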

  2. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground and a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary-layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer; upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud-top measurements when deep convective clouds are present. Unlike techniques such as the LCL, this method is not limited to boundary-layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
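The geometric idea behind combining imaging with a wind profile can be sketched simply: a cloud feature advected at the wind speed v and observed near zenith moving at angular rate omega lies at height h ≈ v / omega. This is only the underlying geometry, not the paper's full retrieval, and the numbers below are hypothetical:

```python
# Hedged sketch of the height-from-angular-motion geometry:
# h = v / omega for a feature tracked near zenith. This ignores the
# retrieval details of the actual method; all numbers are hypothetical.

import math

def cloud_base_height(wind_speed_ms, angular_speed_deg_s):
    """Height (m) of a feature advected at wind_speed_ms (m/s) and seen
    moving at angular_speed_deg_s (deg/s) near zenith."""
    omega = math.radians(angular_speed_deg_s)  # convert to rad/s
    return wind_speed_ms / omega

# 10 m/s wind at cloud level, feature crossing at 0.3 deg/s -> ~1.9 km
print(round(cloud_base_height(10.0, 0.3)))
```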

  3. Nutritional status assessment in semiclosed environments: ground-based and space flight studies in humans

    Science.gov (United States)

    Smith, S. M.; Davis-Street, J. E.; Rice, B. L.; Nillen, J. L.; Gillman, P. L.; Block, G.

    2001-01-01

    Adequate nutrition is critical during long-term spaceflight, as is the ability to easily monitor dietary intake. A comprehensive nutritional status assessment profile was designed for use before, during and after flight. It included assessment of both dietary intake and biochemical markers of nutritional status. A spaceflight food-frequency questionnaire (FFQ) was developed to evaluate intake of key nutrients during spaceflight. The nutritional status assessment protocol was evaluated during two ground-based closed-chamber studies (60 and 91 d; n = 4/study), and was implemented for two astronauts during 4-mo stays on the Mir space station. Ground-based studies indicated that the FFQ, administered daily or weekly, adequately estimated intake of key nutrients. Chamber subjects maintained prechamber energy intake and body weight. Astronauts tended to eat 40-50% of WHO-predicted energy requirements, and lost >10% of preflight body mass. Serum ferritin levels were lower after the chamber stays, despite adequate iron intake. Red blood cell folate concentrations were increased after the chamber studies. Vitamin D stores were decreased by >40% on chamber egress and after spaceflight. Mir crew members had decreased levels of most nutritional indices, but these are difficult to interpret given the insufficient energy intake and loss of body mass. Spaceflight food systems can provide adequate intake of macronutrients, although, as expected, micronutrient intake is a concern for any closed or semiclosed food system. These data demonstrate the utility and importance of nutritional status assessment during spaceflight and of the FFQ during extended-duration spaceflight.

  4. How ground-based observations can support satellite greenhouse gas retrievals

    Science.gov (United States)

    Butler, J. H.; Tans, P. P.; Sweeney, C.; Dlugokencky, E. J.

    2012-04-01

    Global society will eventually accelerate efforts to reduce greenhouse gas emissions in a variety of ways. These would likely involve international treaties, national policies, and regional strategies that will affect a number of economic, social, and environmental sectors. Some strategies will work better than others and some will not work at all. Because trillions of dollars will be involved in pursuing greenhouse gas emission reductions - through realignment of energy production, improvement of efficiencies, institution of taxes, implementation of carbon trading markets, and use of offsets - it is imperative that society be given all the tools at its disposal to ensure the ultimate success of these efforts. Providing independent, globally coherent information on the success of these efforts will give considerable strength to treaties, policies, and strategies. Doing this will require greenhouse gas observations greatly expanded from what we have today. Satellite measurements may ultimately be indispensable in achieving global coverage, but the requirements for accuracy and continuity of measurements over time are demanding if the data are to be relevant. Issues such as those associated with sensor drift, aging electronics, and retrieval artifacts present challenges that can be addressed in part by close coordination with ground-based and in situ systems. This presentation identifies the information that ground-based systems provide very well, but it also looks at what would be deficient even in a greatly expanded surface system, where satellites can fill these gaps, and how on-going, ground and in situ measurements can aid in addressing issues associated with accuracy, long-term continuity, and retrieval artifacts.

  5. Ground-based telescope pointing and tracking optimization using a neural controller.

    Science.gov (United States)

    Mancini, D; Brescia, M; Schipani, P

    2003-01-01

    Neural network (NN) models have emerged as important components in applications of adaptive control theory. Their generalization capability, based on acquired knowledge, together with execution rapidity and the ability to correlate input stimuli, are basic attributes that make NNs an extremely powerful tool for the on-line control of complex systems. From a control-system point of view, not only accuracy and speed but also, in some cases, a high level of adaptive capability is required in order to match all working phases of the whole system during its lifetime. This is particularly relevant for a new-generation ground-based telescope control system. In fact, strong changes in system speed and instantaneous position-error tolerance are necessary, especially in the case of trajectory disturbances induced by wind shake. The classical control scheme adopted in such systems is based on the proportional-integral (PI) filter, already applied and implemented on a large number of new-generation telescopes and considered a standard in this technological environment. In this paper we introduce a new approach, the neural variable structure proportional integral (NVSPI), related to the implementation of a standard multilayer perceptron network in new-generation ground-based Alt-Az telescope control systems. Its main purpose is to improve the adaptive capability of the variable structure proportional integral model, an innovative control scheme recently introduced by the authors [Proc SPIE (1997)] and based on a modified version of the classical PI control model, in terms of flexibility and accuracy of the dynamic response range, also in the presence of wind noise effects. The realization of a powerful, well tested and validated telescope model simulation system made it possible to directly compare the performance of the two control schemes on simulated tracking trajectories, revealing extremely encouraging results in terms of NVSPI control robustness and
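The classical PI filter that serves as the baseline above can be sketched as a discrete-time controller: the output is a weighted sum of the instantaneous position error and its accumulated integral. The gains, time step, and error values below are hypothetical, not those of any telescope axis:

```python
# Minimal discrete PI controller sketch (gains and inputs hypothetical):
# output = Kp * error + Ki * integral(error dt)

class PI:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, error):
        """One control cycle: accumulate the error and return the command."""
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

ctrl = PI(kp=2.0, ki=0.5, dt=0.1)
print(ctrl.step(1.0))  # 2.0*1.0 + 0.5*0.1  = 2.05
print(ctrl.step(0.5))  # 2.0*0.5 + 0.5*0.15 = 1.075
```

The NVSPI scheme described in the abstract replaces the fixed gains with values adapted on-line by a multilayer perceptron; the sketch shows only the classical baseline.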

  6. Characterizing the Vertical Distribution of Aerosols using Ground-based Multiwavelength Lidar Data

    Science.gov (United States)

    Ferrare, R. A.; Thorsen, T. J.; Clayton, M.; Mueller, D.; Chemyakin, E.; Burton, S. P.; Goldsmith, J.; Holz, R.; Kuehn, R.; Eloranta, E. W.; Marais, W.; Newsom, R. K.; Liu, X.; Sawamura, P.; Holben, B. N.; Hostetler, C. A.

    2016-12-01

    Observations of aerosol optical and microphysical properties are critical for developing and evaluating aerosol transport model parameterizations and assessing global aerosol-radiation impacts on climate. During the Combined HSRL And Raman lidar Measurement Study (CHARMS), we investigated the synergistic use of ground-based Raman lidar and High Spectral Resolution Lidar (HSRL) measurements to retrieve aerosol properties aloft. Continuous (24/7) operation of these co-located lidars during the ten-week CHARMS mission (mid-July through September 2015) allowed the acquisition of a unique, multiwavelength ground-based lidar dataset for studying aerosol properties above the Southern Great Plains (SGP) site. The ARM Raman lidar measured profiles of aerosol backscatter, extinction and depolarization at 355 nm as well as profiles of water vapor mixing ratio and temperature. The University of Wisconsin HSRL simultaneously measured profiles of aerosol backscatter, extinction and depolarization at 532 nm and aerosol backscatter at 1064 nm. Recent advances in both lidar retrieval theory and algorithm development demonstrate that vertically-resolved retrievals using such multiwavelength lidar measurements of aerosol backscatter and extinction can help constrain both the aerosol optical (e.g. complex refractive index, scattering, etc.) and microphysical properties (e.g. effective radius, concentrations) as well as provide qualitative aerosol classification. Based on this work, the NASA Langley Research Center (LaRC) HSRL group developed automated algorithms for classifying and retrieving aerosol optical and microphysical properties, demonstrated these retrievals using data from the unique NASA/LaRC airborne multiwavelength HSRL-2 system, and validated the results using coincident airborne in situ data. We apply these algorithms to the CHARMS multiwavelength (Raman+HSRL) lidar dataset to retrieve aerosol properties above the SGP site. We present some profiles of aerosol effective

  7. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    Science.gov (United States)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
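The classification step described above is a weighted k-nearest-neighbour vote over texture feature vectors. A minimal pure-Python sketch with inverse-distance weighting; the two-element feature vectors, labels and k below are hypothetical stand-ins for the 45 texture features and cloud classes:

```python
# Sketch: inverse-distance-weighted KNN classification over texture
# feature vectors. Features, labels and k are hypothetical.

import math
from collections import defaultdict

def weighted_knn(train, query, k=3):
    """train: list of (feature_vector, label); returns the winning label."""
    nearest = sorted(
        (math.dist(vec, query), label) for vec, label in train
    )[:k]
    votes = defaultdict(float)
    for d, label in nearest:
        votes[label] += 1.0 / (d + 1e-9)  # closer neighbours weigh more
    return max(votes, key=votes.get)

train = [([0.10, 0.20], "CB"), ([0.15, 0.25], "CB"),
         ([0.90, 0.80], "TCU"), ([0.85, 0.75], "TCU")]
print(weighted_knn(train, [0.12, 0.22], k=3))  # "CB"
```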

  8. Methods for the performance enhancement and the error characterization of large diameter ground-based diffractive telescopes.

    Science.gov (United States)

    Zhang, Haolin; Liu, Hua; Lizana, Angel; Xu, Wenbin; Campos, Juan; Lu, Zhenwu

    2017-10-30

    This paper is devoted to the improvement of ground-based telescopes based on diffractive primary lenses, which provide a larger aperture and relaxed surface tolerances compared to non-diffractive telescopes. We performed two different studies devised to thoroughly characterize and improve the performance of ground-based diffractive telescopes. On the one hand, we experimentally validated the suitability of the stitching error theory, useful for characterizing the error performance of subaperture diffractive telescopes. On the other hand, we proposed a novel ground-based telescope incorporated in a Cassegrain architecture, leading to a telescope with enhanced performance. To test the stitching error theory, a 300 mm diameter, 2000 mm focal length transmissive stitching diffractive telescope, based on a three-belt subaperture primary lens, was designed and implemented. The telescope achieves a 78 cy/mm resolution within a 0.15 degree field of view over a working wavelength range of 582.8 nm to 682.8 nm, without any stitching error. However, the long optical track (35.49 m) introduces air turbulence that reduces the final image contrast in the ground-based test. To improve on this result, a compact Cassegrain ground-based diffractive (CGD) telescope of the same diameter, with a total track distance of 1.267 m, was implemented for the same wavelength range. The ground-based CGD telescope provides higher resolution and better contrast than the transmissive configuration. Star and resolution tests were experimentally performed to compare the CGD and transmissive configurations, confirming the suitability of the proposed ground-based CGD telescope.

  9. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Wegner, P.; Wettig, T.

    2003-09-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC. (orig.)

  10. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Stueben, H.; Wegner, P.; Wettig, T.; Wittig, H.

    2004-01-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC.

  11. A Ground-Based Analog for CNS Exposure to Space Radiation: A System for Integrating Microbeam Technology and Neuronal Culture

    Data.gov (United States)

    National Aeronautics and Space Administration — Problem Statement: The connection between radiation-induced neuronal damage and deficits in behavior and cellular function is still largely unknown. Previous studies...

  12. Tourism Destination Benchmarking: Evaluation and Selection of the Benchmarking Partners

    Directory of Open Access Journals (Sweden)

    Luštický Martin

    2012-03-01

    Full Text Available Tourism development has an irreplaceable role in the regional policy of almost all countries. This is due to its undeniable benefits for the local population with regards to the economic, social and environmental spheres. Tourist destinations compete for visitors in the tourism market and subsequently get into a relatively sharp competitive struggle. The main goal of regional governments and destination management institutions is to succeed in this struggle by increasing the competitiveness of their destination. The quality of strategic planning and final strategies is a key factor of competitiveness. Even though the tourism sector is not a typical field where benchmarking methods are widely used, such approaches can be successfully applied. The paper focuses on key phases of the benchmarking process, which lie in the search for suitable reference partners. The partners are consequently selected to meet general requirements to ensure the quality of strategies. Following from this, some specific characteristics are developed according to the SMART approach. The paper tests this procedure with an expert evaluation of eight selected regional tourism strategies of regions in the Czech Republic, Slovakia and Great Britain. In this way it validates the selected criteria in an international setting. Hence, it makes it possible to find strengths and weaknesses of the selected strategies and at the same time facilitates the discovery of suitable benchmarking partners.

  13. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  14. Benchmarking clinical photography services in the NHS.

    Science.gov (United States)

    Arbon, Giles

    2015-01-01

    Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.

  15. A Ground-based validation of GOSAT-observed atmospheric CO2 in Inner-Mongolian grasslands

    International Nuclear Information System (INIS)

    Qin, X; Lei, L; Zeng, Z; Kawasaki, M; Oohasi, M

    2014-01-01

    Atmospheric carbon dioxide (CO2) is a long-lived greenhouse gas that significantly contributes to global warming. Long-term, continuous measurements of atmospheric CO2 to investigate its global distribution and concentration variations are important for accurately understanding its potential climatic effects. Satellite measurements from space can offer atmospheric CO2 data for climate change research. For that, ground-based measurements are required to validate and improve the precision of satellite-measured CO2. We implemented an observation experiment of CO2 column densities in the Xilinguole grasslands of Inner Mongolia, China, using a ground-based measurement system consisting mainly of an optical spectrum analyzer (OSA), a sun tracker and a notebook controller. Measurements from our ground-based system were analyzed and compared with those from the Greenhouse gases Observing SATellite (GOSAT). The ground-based measurements had an average value of 389.46 ppm, which was 2.4 ppm larger than that from GOSAT, with a standard deviation of 3.4 ppm. This difference is slightly larger than that between GOSAT and the Total Carbon Column Observing Network (TCCON). This study highlights the usefulness of the ground-based OSA measurement system for analyzing atmospheric CO2 column densities, which is expected to supplement the current TCCON network.
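The comparison statistics quoted above (a mean difference and its standard deviation between paired ground-based and satellite values) can be sketched as follows; the sample column densities are hypothetical, not the study's data:

```python
# Sketch: mean difference and standard deviation between paired
# ground-based and satellite CO2 column values. Data are hypothetical.

import statistics

def paired_bias(ground, sat):
    """Return (mean, sample std dev) of the paired differences."""
    diffs = [g - s for g, s in zip(ground, sat)]
    return statistics.mean(diffs), statistics.stdev(diffs)

ground_ppm = [389.0, 390.5, 388.8, 391.0]  # hypothetical OSA values
sat_ppm    = [387.0, 388.0, 386.0, 389.5]  # hypothetical GOSAT values

bias, spread = paired_bias(ground_ppm, sat_ppm)
print(round(bias, 2), round(spread, 2))  # 2.2 0.57
```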

  16. Assessment of NASA airborne laser altimetry data using ground-based GPS data near Summit Station, Greenland

    Science.gov (United States)

    Brunt, Kelly M.; Hawley, Robert L.; Lutz, Eric R.; Studinger, Michael; Sonntag, John G.; Hofton, Michelle A.; Andrews, Lauren C.; Neumann, Thomas A.

    2017-03-01

    A series of NASA airborne lidars have been used in support of satellite laser altimetry missions. These airborne laser altimeters have been deployed for satellite instrument development, for spaceborne data validation, and to bridge the data gap between satellite missions. We used data from ground-based Global Positioning System (GPS) surveys of an 11 km long track near Summit Station, Greenland, to assess the surface-elevation bias and measurement precision of three airborne laser altimeters including the Airborne Topographic Mapper (ATM), the Land, Vegetation, and Ice Sensor (LVIS), and the Multiple Altimeter Beam Experimental Lidar (MABEL). Ground-based GPS data from the monthly ground-based traverses, which commenced in 2006, allowed for the assessment of nine airborne lidar surveys associated with ATM and LVIS between 2007 and 2016. Surface-elevation biases for these altimeters - over the flat, ice-sheet interior - are less than 0.12 m, while assessments of measurement precision are 0.09 m or better. Ground-based GPS positions determined both with and without differential post-processing techniques provided internally consistent solutions. Results from the analyses of ground-based and airborne data provide validation strategy guidance for the Ice, Cloud, and land Elevation Satellite 2 (ICESat-2) elevation and elevation-change data products.
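The assessment described above amounts to differencing airborne lidar elevations against a ground-based GPS profile at common along-track positions; the mean difference gives the elevation bias and its spread the precision. A minimal sketch, with hypothetical positions and elevations rather than the Summit Station data:

```python
# Sketch: lidar elevations assessed against a GPS traverse by linear
# interpolation along track. All coordinates/elevations are hypothetical.
import statistics

# GPS traverse samples: (along-track distance m, surface elevation m)
gps = [(0.0, 3210.00), (500.0, 3210.40), (1000.0, 3210.90), (1500.0, 3211.20)]

def gps_elev(x):
    """Linearly interpolate the GPS profile at along-track position x."""
    for (x0, z0), (x1, z1) in zip(gps, gps[1:]):
        if x0 <= x <= x1:
            return z0 + (z1 - z0) * (x - x0) / (x1 - x0)
    raise ValueError("position outside GPS track")

# Airborne lidar footprints: (along-track distance m, elevation m)
lidar = [(250.0, 3210.26), (750.0, 3210.70), (1250.0, 3211.08)]

residuals = [z - gps_elev(x) for x, z in lidar]
bias = statistics.mean(residuals)        # surface-elevation bias
precision = statistics.stdev(residuals)  # spread of residuals as precision proxy
print(f"bias = {bias:+.3f} m, precision = {precision:.3f} m")
```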

  17. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    Science.gov (United States)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground-based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground-based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  18. How Benchmarking and Higher Education Came Together

    Science.gov (United States)

    Levy, Gary D.; Ronco, Sharron L.

    2012-01-01

    This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…

  19. WWER-1000 Burnup Credit Benchmark (CB5)

    International Nuclear Information System (INIS)

    Manolova, M.A.

    2002-01-01

    In the paper, the specification of the first phase (depletion calculations) of the WWER-1000 Burnup Credit Benchmark is given. The second phase, criticality calculations for the WWER-1000 fuel pin cell, will be specified after evaluation of the results obtained in the first phase. The proposed benchmark is a continuation of the WWER benchmark activities in this field (Author)

  20. Benchmarking and Learning in Public Healthcare

    DEFF Research Database (Denmark)

    Buckmaster, Natalie; Mouritsen, Jan

    2017-01-01

    This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking...

  1. Geothermal Heat Pump Benchmarking Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1997-01-17

    A benchmarking study was conducted on behalf of the Department of Energy to determine the critical factors in successful utility geothermal heat pump programs. A successful program is one that has achieved significant market penetration. Successfully marketing geothermal heat pumps has presented some major challenges to the utility industry. However, select utilities have developed programs that generate significant GHP sales. This benchmarking study concludes that there are three factors critical to the success of utility GHP marketing programs: (1) top management marketing commitment; (2) an understanding of the fundamentals of marketing and business development; and (3) an aggressive competitive posture. To generate significant GHP sales, competitive market forces must be used. However, because utilities have functioned only in a regulated arena, these companies and their leaders are unschooled in competitive business practices. Therefore, a lack of experience coupled with an intrinsically non-competitive culture yields an industry environment that impedes the generation of significant GHP sales in many, but not all, utilities.

  2. The development of code benchmarks

    International Nuclear Information System (INIS)

    Glass, R.E.

    1986-01-01

    Sandia National Laboratories has undertaken a code benchmarking effort to define a series of cask-like problems having both numerical solutions and experimental data. The development of the benchmarks includes: (1) model problem definition, (2) code intercomparison, and (3) experimental verification. The first two steps are complete and a series of experiments is planned. The experiments will examine the elastic/plastic behavior of cylinders for both the end and side impacts resulting from a nine meter drop. The cylinders will be made from stainless steel and aluminum to give a range of plastic deformations. This paper presents the results of analyses simulating the model's behavior using materials properties for stainless steel and aluminum.

  3. Benchmarking Variable Selection in QSAR.

    Science.gov (United States)

    Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars

    2012-02-01

    Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60 % without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
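Forward selection, one of the multivariate methods named in this abstract, greedily adds the feature that most improves a model score until nothing helps. The sketch below uses a toy scoring function in place of the study's random-forest/QSAR setup; the descriptor names are hypothetical:

```python
# Sketch of greedy forward selection. The scoring function and descriptor
# names are toy stand-ins, not the benchmarked QSAR configuration.

def forward_select(features, score, max_keep):
    """Repeatedly add the single feature that most improves the score."""
    selected = []
    while len(selected) < max_keep:
        best, best_score = None, score(selected)
        for f in features:
            if f in selected:
                continue
            s = score(selected + [f])
            if s > best_score:
                best, best_score = f, s
        if best is None:          # no remaining feature improves the score
            break
        selected.append(best)
    return selected

# Toy score: pretend only 'logP' and 'tpsa' carry predictive signal.
useful = {"logP": 0.4, "tpsa": 0.3}
def toy_score(subset):
    return sum(useful.get(f, 0.0) for f in subset)

picked = forward_select(["mw", "logP", "tpsa", "nring"], toy_score, 3)
print(picked)  # uninformative descriptors are never added
```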

  4. Closed-loop neuromorphic benchmarks

    CSIR Research Space (South Africa)

    Stewart, TC

    2015-11-01

    Full Text Available. Closed-Loop Neuromorphic Benchmarks. Terrence C. Stewart (1*), Travis DeWolf (1), Ashley Kleinhans (2), Chris Eliasmith (1). 1 Centre for Theoretical Neuroscience, University of Waterloo, Waterloo, ON, Canada; 2 Mobile Intelligent Autonomous Systems group, Council for Scientific and Industrial Research, Pretoria, South Africa. Submitted to: Frontiers in Neuroscience. Correspondence*: Terrence C. Stewart, Centre for Theoretical Neuroscience.

  5. Investible benchmarks & hedge fund liquidity

    OpenAIRE

    Freed, Marc S; McMillan, Ben

    2011-01-01

    A lack of commonly accepted benchmarks for hedge fund performance has permitted hedge fund managers to attribute to skill returns that may actually accrue from market risk factors and illiquidity. Recent innovations in hedge fund replication permit us to estimate the extent of this misattribution. Using an option-based model, we find evidence that the value of the liquidity options that investors implicitly grant managers when they invest may account for part or even all of hedge fund returns. C...

  6. TEMIS UV product validation using NILU-UV ground-based measurements in Thessaloniki, Greece

    Science.gov (United States)

    Zempila, Melina-Maria; van Geffen, Jos H. G. M.; Taylor, Michael; Fountoulakis, Ilias; Koukouli, Maria-Elissavet; van Weele, Michiel; van der A, Ronald J.; Bais, Alkiviadis; Meleti, Charikleia; Balis, Dimitrios

    2017-06-01

    This study aims to cross-validate ground-based and satellite-based models of three photobiological UV effective dose products: the Commission Internationale de l'Éclairage (CIE) erythemal UV, the production of vitamin D in the skin, and DNA damage, using high-temporal-resolution surface-based measurements of solar UV spectral irradiances from a synergy of instruments and models. The satellite-based Tropospheric Emission Monitoring Internet Service (TEMIS; version 1.4) UV daily dose data products were evaluated over the period 2009 to 2014 with ground-based data from a Norsk Institutt for Luftforskning (NILU)-UV multifilter radiometer located at the northern midlatitude super-site of the Laboratory of Atmospheric Physics, Aristotle University of Thessaloniki (LAP/AUTh), in Greece. For the NILU-UV effective dose rates retrieval algorithm, a neural network (NN) was trained to learn the nonlinear functional relation between NILU-UV irradiances and collocated Brewer-based photobiological effective dose products. Then the algorithm was subjected to sensitivity analysis and validation. The correlation of the NN estimates with target outputs was high (r = 0.988 to 0.990) and with a very low bias (0.000 to 0.011 in absolute units) proving the robustness of the NN algorithm. For further evaluation of the NILU NN-derived products, retrievals of the vitamin D and DNA-damage effective doses from a collocated Yankee Environmental Systems (YES) UVB-1 pyranometer were used. For cloud-free days, differences in the derived UV doses are better than 2 % for all UV dose products, revealing the reference quality of the ground-based UV doses at Thessaloniki from the NILU-UV NN retrievals. The TEMIS UV doses used in this study are derived from ozone measurements by the SCIAMACHY/Envisat and GOME2/MetOp-A satellite instruments, over the European domain in combination with SEVIRI/Meteosat-based diurnal cycle of the cloud cover fraction per 0.5° × 0.5° (lat × long) grid cells. TEMIS

  7. TEMIS UV product validation using NILU-UV ground-based measurements in Thessaloniki, Greece

    Directory of Open Access Journals (Sweden)

    M.-M. Zempila

    2017-06-01

    Full Text Available This study aims to cross-validate ground-based and satellite-based models of three photobiological UV effective dose products: the Commission Internationale de l'Éclairage (CIE) erythemal UV, the production of vitamin D in the skin, and DNA damage, using high-temporal-resolution surface-based measurements of solar UV spectral irradiances from a synergy of instruments and models. The satellite-based Tropospheric Emission Monitoring Internet Service (TEMIS; version 1.4) UV daily dose data products were evaluated over the period 2009 to 2014 with ground-based data from a Norsk Institutt for Luftforskning (NILU)-UV multifilter radiometer located at the northern midlatitude super-site of the Laboratory of Atmospheric Physics, Aristotle University of Thessaloniki (LAP/AUTh), in Greece. For the NILU-UV effective dose rates retrieval algorithm, a neural network (NN) was trained to learn the nonlinear functional relation between NILU-UV irradiances and collocated Brewer-based photobiological effective dose products. Then the algorithm was subjected to sensitivity analysis and validation. The correlation of the NN estimates with target outputs was high (r = 0.988 to 0.990) and with a very low bias (0.000 to 0.011 in absolute units) proving the robustness of the NN algorithm. For further evaluation of the NILU NN-derived products, retrievals of the vitamin D and DNA-damage effective doses from a collocated Yankee Environmental Systems (YES) UVB-1 pyranometer were used. For cloud-free days, differences in the derived UV doses are better than 2 % for all UV dose products, revealing the reference quality of the ground-based UV doses at Thessaloniki from the NILU-UV NN retrievals. The TEMIS UV doses used in this study are derived from ozone measurements by the SCIAMACHY/Envisat and GOME2/MetOp-A satellite instruments, over the European domain in combination with SEVIRI/Meteosat-based diurnal cycle of the cloud cover fraction per 0.5° × 0.5

  8. NASA's Newest Orbital Debris Ground-based Telescope Assets: MCAT and UKIRT

    Science.gov (United States)

    Lederer, S.; Frith, J.; Pace, L. F.; Cowardin, H. M.; Hickson, P.; Glesne, T.; Maeda, R.; Buckalew, B.; Nishimoto, D.; Douglas, D.; Stansbery, E. G.

    2014-09-01

    NASA's Orbital Debris Program Office (ODPO) will break ground on Ascension Island in 2014 to build the newest optical (0.30-1.06 microns) ground-based telescope asset dedicated to the study of orbital debris. The Meter Class Autonomous Telescope (MCAT) is a 1.3m optical telescope designed to track objects in orbits ranging from Low Earth Orbit (LEO) to Geosynchronous Earth Orbit (GEO). Ascension Island is located in the South Atlantic Ocean, offering longitudinal sky coverage not afforded by the Ground-based Electro-Optical Deep Space Surveillance (GEODSS) network. With a fast-tracking dome, a suite of visible wide-band filters, and a time-delay integration (TDI) capable camera, MCAT is capable of multiple observing modes ranging from tracking cataloged debris targets to surveying the overall debris environment. Access to the United Kingdom Infrared Telescope (UKIRT) will extend our spectral coverage into the near- (0.8-5 micron) and mid- to far-infrared (8-25 micron) regime. UKIRT is a 3.8m telescope located on Mauna Kea on the Big Island of Hawaii. At nearly 14,000 feet and above the atmospheric inversion layer, this is one of the premier astronomical sites in the world and is an ideal setting for an infrared telescope. An unprecedented one-third of this telescope's time has been allocated to collect orbital debris data for NASA's ODPO over a 2-year period. UKIRT has several instruments available to obtain low-resolution spectroscopy in both the near-IR and the mid/far-IR. Infrared spectroscopy is ideal for constraining the material types, albedos and sizes of debris targets, and potentially gaining insight into reddening effects caused by space weathering. In addition, UKIRT will be used to acquire broadband photometric imaging at GEO with the Wide Field Camera (WFCAM) for studying known objects of interest as well as collecting data in survey-mode to discover new targets. Results from the first stage of the debris campaign will be presented. The combination of

  9. Ground-based Polarization Remote Sensing of Atmospheric Aerosols and the Correlation between Polarization Degree and PM2.5

    International Nuclear Information System (INIS)

    Cheng, Chen; Zhengqiang, Li; Weizhen, Hou; Yisong, Xie; Donghui, Li; Kaitao, Li; Ying, Zhang

    2014-01-01

    Ground-based polarization remote sensing adds polarization information to traditional intensity detection, providing a new method to detect atmospheric aerosol properties. In this paper, polarization measurements obtained by a new multi-wavelength sun photometer, CE318-DP, are used for ground-based remote sensing of atmospheric aerosols. In addition, a polarized vector radiative transfer model is introduced to simulate the DOLP (Degree Of Linear Polarization) under different sky conditions. Finally, a correlation analysis between the mass density of PM2.5 and the multi-wavelength, multi-angular DOLP is carried out. The result shows that DOLP has a high correlation with the mass density of PM2.5 (R² > 0.85). As a consequence, this work provides a new method to estimate the mass density of PM2.5 using the comprehensive network of ground-based sun photometers
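The reported correlation is the coefficient of determination of a fit between DOLP and PM2.5 mass density. A minimal sketch of that R² computation via a least-squares line, using illustrative value pairs rather than the study's measurements:

```python
# Sketch: R^2 of a least-squares line relating polarization degree (DOLP)
# to PM2.5 mass density. The value pairs below are illustrative only.

def r_squared(x, y):
    """Coefficient of determination of the ordinary least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

dolp = [0.10, 0.15, 0.22, 0.30, 0.41]   # degree of linear polarization
pm25 = [18.0, 30.0, 44.0, 61.0, 85.0]   # ug/m^3, hypothetical
print(f"R^2 = {r_squared(dolp, pm25):.3f}")
```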

  10. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  11. HS06 benchmark for an ARM server

    International Nuclear Information System (INIS)

    Kluth, Stefan

    2014-01-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  12. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA, HANFORD, WASHINGTON

    International Nuclear Information System (INIS)

    Petersen, S.W.

    2010-01-01

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected with the densest coverage occurring adjacent to the Columbia River Corridor. Time domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground based geophysical methods, and to link results to the underlying geology and/or hydrogeology.
Specific goals of this project are as follows: (1) Test ground based geophysical techniques for the efficacy in delineating underlying geology; (2) Use ground

  13. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA HANFORD WASHINGTON

    Energy Technology Data Exchange (ETDEWEB)

    PETERSEN SW

    2010-12-02

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected with the densest coverage occurring adjacent to the Columbia River Corridor. Time domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground based geophysical methods, and to link results to the underlying geology and/or hydrogeology.
Specific goals of this project are as follows: (1) Test ground based geophysical techniques for the efficacy in delineating underlying geology; (2) Use ground

  14. Evaluation of the National Solar Radiation Database (NSRDB) Using Ground-Based Measurements

    Science.gov (United States)

    Xie, Y.; Sengupta, M.; Habte, A.; Lopez, A.

    2017-12-01

    Solar resource information is essential for a wide spectrum of applications including renewable energy, climate studies, and solar forecasting. It can be obtained from ground-based measurement stations and/or from modeled datasets. While measurements provide data for the development and validation of solar resource models and other applications, modeled data expand the ability to address the needs for increased accuracy and spatial and temporal resolution. The National Renewable Energy Laboratory (NREL) has developed and regularly updates modeled solar resource data through the National Solar Radiation Database (NSRDB). The recent NSRDB dataset was developed using the physics-based Physical Solar Model (PSM) and provides gridded solar irradiance (global horizontal irradiance (GHI), direct normal irradiance (DNI), and diffuse horizontal irradiance) at a 4-km by 4-km spatial and half-hourly temporal resolution covering the 18 years from 1998-2015. A comprehensive validation of the performance of the NSRDB (1998-2015) was conducted to quantify the accuracy of the spatial and temporal variability of the solar radiation data. Further, the study assessed the ability of the NSRDB (1998-2015) to accurately capture inter-annual variability, which is essential information for solar energy conversion projects and grid integration studies. Comparisons of the NSRDB (1998-2015) with nine selected ground-based measurement stations were conducted under both clear- and cloudy-sky conditions. These locations provide high-quality data covering a variety of geographical locations and climates. The comparison of the NSRDB to the ground-based data demonstrated that biases were within +/-5% for GHI and +/-10% for DNI. A comprehensive uncertainty estimation methodology was established to analyze the performance of the gridded NSRDB and includes all sources of uncertainty at various time-averaged periods, a method that is not often used in model evaluation. Further, the study analyzed the inter
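A bias stated as "within +/-5%" is the mean model-minus-measurement difference expressed as a percentage of the mean measured value. The sketch below shows that calculation with illustrative irradiance values, not NSRDB or station data:

```python
# Sketch: relative bias of modeled irradiance against a ground station,
# checked against the +/-5% (GHI) acceptance band mentioned in the text.
# All irradiance values are illustrative stand-ins.

def relative_bias_pct(modeled, measured):
    """Mean bias as a percentage of the mean measured value."""
    bias = sum(m - g for m, g in zip(modeled, measured)) / len(measured)
    return 100.0 * bias / (sum(measured) / len(measured))

ghi_model = [510.0, 620.0, 480.0, 700.0]    # W/m^2, modeled
ghi_ground = [500.0, 615.0, 490.0, 695.0]   # W/m^2, measured

b = relative_bias_pct(ghi_model, ghi_ground)
print(f"GHI relative bias = {b:+.2f}% (within +/-5%: {abs(b) <= 5.0})")
```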

  15. Validation of OMI erythemal doses with multi-sensor ground-based measurements in Thessaloniki, Greece

    Science.gov (United States)

    Zempila, Melina Maria; Fountoulakis, Ilias; Taylor, Michael; Kazadzis, Stelios; Arola, Antti; Koukouli, Maria Elissavet; Bais, Alkiviadis; Meleti, Chariklia; Balis, Dimitrios

    2018-06-01

    The aim of this study is to validate the Ozone Monitoring Instrument (OMI) erythemal dose rates using ground-based measurements in Thessaloniki, Greece. In the Laboratory of Atmospheric Physics of the Aristotle University of Thessaloniki, a Yankee Environmental System UVB-1 radiometer measures the erythemal dose rates every minute, and a Norsk Institutt for Luftforskning (NILU) multi-filter radiometer provides multi-filter-based irradiances that were used to derive erythemal dose rates for the period 2005-2014. Both these datasets were independently validated against collocated UV irradiance spectra from a Brewer MkIII spectrophotometer. Cloud detection was performed based on measurements of the global horizontal radiation from a Kipp & Zonen pyranometer and from NILU measurements in the visible range. The satellite versus ground observation validation was performed taking into account the effect of temporal averaging, limitations related to OMI quality control criteria, cloud conditions, the solar zenith angle and atmospheric aerosol loading. Aerosol optical depth was also retrieved using a collocated CIMEL sunphotometer in order to assess its impact on the comparisons. The effect of satellite versus ground-based differences in total ozone columns on the erythemal dose comparisons was also investigated. Since most public awareness alerts are based on UV Index (UVI) classifications, an analysis and assessment of OMI capability for retrieving UVIs was also performed. An overestimation of the OMI erythemal product by 3-6% and 4-8% with respect to ground measurements is observed when examining overpass and noontime estimates, respectively. The comparisons revealed a relatively small solar zenith angle dependence, with the OMI data showing a slight dependence on aerosol load, especially at high aerosol optical depth values. A mean underestimation of 2% in OMI total ozone columns under cloud-free conditions was found to lead to an overestimation in OMI erythemal

  16. Geospace Science from Ground-based Magnetometer Arrays: Advances in Sensors, Data Collection, and Data Integration

    Science.gov (United States)

    Mann, Ian; Chi, Peter

    2016-07-01

    Networks of ground-based magnetometers now provide the basis for the diagnosis of magnetic disturbances associated with solar wind-magnetosphere-ionosphere coupling on a truly global scale. Advances in sensor and digitisation technologies offer increases in sensitivity in fluxgate, induction coil, and new micro-sensor technologies, including the promise of hybrid sensors. Similarly, advances in remote connectivity provide the capacity for truly real-time monitoring of global dynamics at cadences sufficient for monitoring and, in many cases, resolving system-level spatio-temporal ambiguities, especially in combination with conjugate satellite measurements. A wide variety of the plasma-physical processes active in driving geospace dynamics can be monitored based on the response of the electrical current system, including those associated with changes in global convection, magnetospheric substorms and nightside tail flows, as well as those due to solar wind changes in both dynamic pressure and in response to rotations of the direction of the IMF. Significantly, any changes to the dynamical system must be communicated by the propagation of long-period Alfven and/or compressional waves. These wave populations hence provide diagnostics not only of the energy transport by the wave fields themselves, but also a mechanism for diagnosing the structure of the background plasma medium through which the waves propagate. Ultra-low frequency (ULF) waves are especially significant in offering a monitor for mass density profiles, often invisible to particle detectors because of their very low energy, through the application of a variety of magneto-seismology and cross-phase techniques. Renewed scientific interest in the plasma waves associated with near-Earth substorm dynamics, including magnetosphere-ionosphere coupling at substorm onset and their relation to magnetotail flows, as well the importance of global scale ultra-low frequency waves for the energisation, transport

  17. Cross Validation of Rain Drop Size Distribution between GPM and Ground Based Polarmetric radar

    Science.gov (United States)

    Chandra, C. V.; Biswas, S.; Le, M.; Chen, H.

    2017-12-01

    Dual-frequency precipitation radar (DPR) on board the Global Precipitation Measurement (GPM) core satellite provides reflectivity measurements at two independent frequencies, Ku- and Ka-band. Dual-frequency retrieval algorithms have traditionally been developed through forward, backward, and recursive approaches. However, these algorithms suffer from the "dual-value" problem when they retrieve the median volume diameter from the dual-frequency ratio (DFR) in the rain region. To this end, a hybrid method has been proposed to perform raindrop size distribution (DSD) retrieval for GPM using a linear constraint on the DSD along the rain profile to avoid the "dual-value" problem (Le and Chandrasekar, 2015). In the current GPM level 2 algorithm (Iguchi et al. 2017, Algorithm Theoretical Basis Document), the Solver module retrieves a vertical profile of drop size distribution from dual-frequency observations and path-integrated attenuations. The algorithm details can be found in Seto et al. (2013). On the other hand, ground-based polarimetric radars have long been used to estimate drop size distributions (e.g., Gorgucci et al. 2002). In addition, coincident GPM and ground-based observations have been cross-validated using careful overpass analysis. In this paper, we perform cross validation of raindrop size distribution retrievals from three sources, namely the hybrid method, the standard products from the Solver module, and DSD retrievals from ground polarimetric radars. The results are presented from two NEXRAD radars located in Dallas-Fort Worth, Texas (i.e., the KFWS radar) and Melbourne, Florida (i.e., the KMLB radar). The results demonstrate the ability of DPR observations to produce DSD estimates, which can subsequently be used to generate global DSD maps. References: Seto, S., T. Iguchi, T. Oki, 2013: The basic performance of a precipitation retrieval algorithm for the Global Precipitation Measurement mission's single/dual-frequency radar measurements. IEEE Transactions on Geoscience and
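The dual-frequency ratio at the heart of these retrievals is, in decibel units, simply the difference of the Ku- and Ka-band reflectivities; at small DFR values the mapping to drop diameter is non-unique, which is the "dual-value" problem the hybrid method avoids. A minimal sketch of the DFR computation, with illustrative reflectivity profiles:

```python
# Sketch: dual-frequency ratio (DFR) from coincident Ku- and Ka-band
# reflectivities in dBZ. The profile values are illustrative only.

def dfr_db(z_ku_db, z_ka_db):
    """DFR in dB: the Ku-band minus Ka-band reflectivity (both in dBZ)."""
    return z_ku_db - z_ka_db

profile_ku = [30.0, 32.5, 35.0]   # dBZ along a hypothetical rain profile
profile_ka = [28.5, 30.2, 31.8]   # dBZ, coincident Ka-band

dfr_profile = [dfr_db(ku, ka) for ku, ka in zip(profile_ku, profile_ka)]
print(dfr_profile)
```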

  18. MetaSensing's FastGBSAR: ground based radar for deformation monitoring

    Science.gov (United States)

    Rödelsperger, Sabine; Meta, Adriano

    2014-10-01

    The continuous monitoring of ground deformation and structural movement has become an important task in engineering. MetaSensing introduces a novel sensor system, the Fast Ground Based Synthetic Aperture Radar (FastGBSAR), based on innovative technologies that have already been successfully applied to airborne SAR applications. The FastGBSAR allows the remote sensing of deformations of a slope or infrastructure from distances of up to 4 km. The FastGBSAR can be set up in two different configurations: in Real Aperture Radar (RAR) mode it is capable of accurately measuring displacements along a linear range profile, ideal for monitoring vibrations of structures like bridges and towers (displacement accuracy up to 0.01 mm); modal parameters can be determined within half an hour. Alternatively, in the Synthetic Aperture Radar (SAR) configuration it produces two-dimensional displacement images with an acquisition time of less than 5 seconds, ideal for monitoring areal structures like dams, landslides and open-pit mines (displacement accuracy up to 0.1 mm). The MetaSensing FastGBSAR is the first ground-based SAR instrument on the market able to produce two-dimensional deformation maps at this high acquisition rate. In this way, deformation time series with high temporal and spatial resolution can be generated, giving detailed information useful for determining the deformation mechanisms involved and eventually predicting an incoming failure. The system is fully portable and can be quickly installed on bedrock or a basement. Data acquisition and processing can be fully automated, leading to low effort in instrument operation and maintenance. Due to the short acquisition time of the FastGBSAR, the coherence between two acquisitions is very high and phase unwrapping is simplified enormously. This yields a high density of resolution cells with good quality and high reliability of the acquired deformations. The deformation maps can directly be used as input into an Early
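Radar interferometers of this kind infer line-of-sight displacement from the phase change between acquisitions via d = (lambda / 4 pi) * delta-phi, the standard two-way propagation relation. The sketch below assumes a Ku-band wavelength of about 17.5 mm; the FastGBSAR's actual operating frequency is not taken from the text:

```python
# Sketch: line-of-sight displacement from an interferometric phase change,
# the principle behind GB-SAR deformation maps. The wavelength is an
# assumed Ku-band value, not a FastGBSAR specification.
import math

WAVELENGTH_M = 0.0175   # ~17.5 mm, assumed Ku-band wavelength

def los_displacement_mm(delta_phase_rad):
    """Two-way propagation: d = (lambda / (4*pi)) * delta_phi, in mm."""
    return 1000.0 * WAVELENGTH_M * delta_phase_rad / (4.0 * math.pi)

# A phase change of pi radians corresponds to a quarter wavelength.
print(f"{los_displacement_mm(math.pi):.3f} mm")
```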

  19. Integration between ground based and satellite SAR data in landslide mapping: The San Fratello case study

    Science.gov (United States)

    Bardi, Federica; Frodella, William; Ciampalini, Andrea; Bianchini, Silvia; Del Ventisette, Chiara; Gigli, Giovanni; Fanti, Riccardo; Moretti, Sandro; Basile, Giuseppe; Casagli, Nicola

    2014-10-01

    The potential use of the integration of PSI (Persistent Scatterer Interferometry) and GB-InSAR (Ground-based Synthetic Aperture Radar Interferometry) for landslide hazard mitigation was evaluated for mapping and monitoring activities of the San Fratello landslide (Sicily, Italy). Intense and exceptional rainfall events are the main factors that triggered several slope movements in the study area, which is susceptible to landslides because of its steep slopes and silty-clayey sedimentary cover. In the last three centuries, the town of San Fratello was affected by three large landslides, which developed in different periods: the oldest occurred in 1754, damaging the northeastern sector of the town; in 1922 a large landslide completely destroyed a wide area on the western hillside of the town. In this paper, attention is focused on the most recent landslide, which occurred on 14 February 2010: in this case, the phenomenon produced the failure of a large sector of the eastern hillside, causing severe damage to buildings and infrastructure. In particular, several slow-moving rotational and translational slides occurred in the area, making it suitable for monitoring ground instability through different InSAR techniques. PS-InSAR™ (permanent scatterers SAR interferometry) techniques, using ERS-1/ERS-2, ENVISAT, RADARSAT-1, and COSMO-SkyMed SAR images, were applied to analyze ground displacements during the pre- and post-event phases. Moreover, during the post-event phase in March 2010, a GB-InSAR system able to acquire data continuously every 14 min was installed, collecting ground displacement maps for a period of about three years, until March 2013. Through the integration of space-borne and ground-based data sets, ground deformation velocity maps were obtained, providing a more accurate delimitation of the February 2010 landslide boundary with respect to the traditional geomorphological field survey carried out. The integration of GB-InSAR and PSI techniques proved to

  20. New advanced netted ground based and topside radio diagnostics for Space Weather Program

    Science.gov (United States)

    Rothkaehl, Hanna; Krankowski, Andrzej; Morawski, Marek; Atamaniuk, Barbara; Zakharenkova, Irina; Cherniak, Iurii

    2014-05-01

    To give a more detailed and complete understanding of the physical plasma processes that govern solar-terrestrial space, and to develop qualitative and quantitative models of magnetosphere-ionosphere-thermosphere coupling, it is necessary to design and build the next generation of instruments for space diagnostics and monitoring. Novel ground-based wide-area sensor networks, such as the LOFAR (Low Frequency Array) radar facility, comprising wide-band, vector-sensing radio receivers, together with multi-spacecraft plasma diagnostics, should help solve outstanding problems of space physics and describe long-term environmental changes. The LOw Frequency ARray (LOFAR) is a new, fully digital radio telescope designed for frequencies between 30 MHz and 240 MHz, located in Europe. Three new LOFAR stations will be installed in Poland by summer 2015, distributed among three sites: Lazy (east of Krakow), Borowiec near Poznan, and Baldy near Olsztyn, all connected via dedicated PIONIER links to Poznan. Each site will host one LOFAR station (96 high-band + 96 low-band antennas). They will work most of the time as part of the European network; however, when less loaded, they can operate as a national network. The new digital radio frequency analyzer (RFA) on board the low-orbiting RELEC satellite was designed to monitor and investigate ionospheric plasma properties. This two-point diagnostic, combining ground-based instruments with a topside-ionosphere space plasma experiment, can be a useful new tool for monitoring and diagnosing turbulent plasma properties. The RFA on board the RELEC satellite is the first in a series of experiments planned for launch into the near-Earth environment.
    In order to improve and validate the large-scale and small-scale ionospheric structures, we will use the GPS observations collected at the IGS/EPN network to reconstruct diurnal variations of TEC using all satellite passes over individual GPS stations and the

  1. Validation of CALIPSO space-borne-derived attenuated backscatter coefficient profiles using a ground-based lidar in Athens, Greece

    Directory of Open Access Journals (Sweden)

    R. E. Mamouri

    2009-09-01

    We present initial aerosol validation results of the space-borne lidar CALIOP (onboard the CALIPSO satellite) Level 1 attenuated backscatter coefficient profiles, using coincident observations performed with a ground-based lidar in Athens, Greece (37.9° N, 23.6° E). A multi-wavelength ground-based backscatter/Raman lidar system has been operating since 2000 at the National Technical University of Athens (NTUA) in the framework of the European Aerosol Research LIdar NETwork (EARLINET), the first lidar network for tropospheric aerosol studies on a continental scale. Since July 2006, a total of 40 coincident aerosol ground-based lidar measurements have been performed over Athens during CALIPSO overpasses. The ground-based measurements were performed each time CALIPSO passed over the station location within a maximum distance of 100 km. The duration of the ground-based lidar measurements was approximately two hours, centred on the satellite overpass time. From the analysis of the correlative ground-based/satellite lidar measurements, a mean bias with respect to the CALIPSO profiles of the order of 22% for daytime measurements and of 8% for nighttime measurements was found for altitudes between 3 and 10 km. The mean bias becomes much larger for altitudes lower than 3 km (of the order of 60%), which is attributed to increased aerosol horizontal inhomogeneity within the Planetary Boundary Layer, resulting in the observation of possibly different air masses by the two instruments. In cases of aerosol layers underlying cirrus clouds, comparison results for aerosol tropospheric profiles become worse. This is attributed to the significant multiple-scattering effects in cirrus clouds experienced by CALIPSO, which result in an attenuation less than that measured by the ground-based lidar.
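    The bias statistic quoted above can be illustrated with a short sketch. The averaging scheme here (mean relative difference over a layer, with the ground-based lidar as reference) is our assumption about the comparison metric, not necessarily the paper's exact definition.

```python
import statistics


def mean_relative_bias_percent(ground_profile, satellite_profile):
    """Mean relative difference (%) between coincident backscatter profiles,
    taking the ground-based lidar as the reference."""
    diffs = [(s - g) / g * 100.0
             for g, s in zip(ground_profile, satellite_profile)]
    return statistics.mean(diffs)
```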

  2. A review of ground-based heavy-ion radiobiology relevant to space radiation risk assessment: Part II. Cardiovascular and immunological effects

    Energy Technology Data Exchange (ETDEWEB)

    Blakely, Eleanor A.; Chang, Polly Y.

    2007-02-26

    The future of manned space flight depends on an analysis of the numerous potential risks of travel into deep space. Currently no radiation dose limits have been established for these exploratory missions. To set these standards, more information is needed about potential acute and late effects on human physiology from appropriate radiation exposure scenarios, including pertinent radiation types and dose rates. Cancer risks have long been considered the most serious late effect from chronic, daily, relatively low-dose exposures to the complex space radiation environment. However, other late effects from space radiation exposure scenarios are under study in ground-based accelerator facilities and have revealed some unique particle radiation effects not observed with conventional radiation. A comprehensive review of pertinent literature that considers tissue effects of radiation leading to functional detriments in specific organ systems has recently been published (NCRP, National Council on Radiation Protection and Measurements, Information Needed to Make Radiation Protection Recommendations for Space Missions Beyond Low-Earth Orbit, Report 153, Bethesda, MD, 2006). This paper highlights the review of two non-cancer concerns from this report: cardiovascular and immunological effects.

  3. Argonne Code Center: Benchmark problem book.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1977-06-01

    This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. It is the second supplement to the original benchmark book, which was first published in February 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, published in December 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement, Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing those objectives are outlined in the original edition of the benchmark book (ANL-7416, February 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below, followed by the contributors to the earlier editions of the benchmark book.

  4. Benchmarks

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  5. Computer-aided mathematical analysis of probability of intercept for ground-based communication intercept system

    Science.gov (United States)

    Park, Sang Chul

    1989-09-01

    We develop a mathematical analysis model to calculate the probability of intercept (POI) for a ground-based communication intercept (COMINT) system. The POI is a measure of the effectiveness of the intercept system. We define the POI as the product of the probability of detection and the probability of coincidence. The probability of detection is a measure of the receiver's capability to detect a signal in the presence of noise. The probability of coincidence is the probability that an intercept system is available, actively listening in the proper frequency band, in the right direction, and at the same time that the signal is received. We investigate the behavior of the POI with respect to the observation time, the separation distance, antenna elevations, the frequency of the signal, and the receiver bandwidths. We observe that the coincidence characteristic between the receiver scanning parameters and the signal parameters is the key factor in determining the time needed to obtain a given POI. This model can be used to find the optimal parameter combination to maximize the POI in a given scenario. We then expand the model to multiple systems. The analysis is conducted on a personal computer for portability, and the model is flexible enough to be implemented easily under different situations.
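    The definition in the abstract can be captured directly in code. The cumulative-probability helper below additionally assumes independent scan opportunities, which is our simplification for illustration rather than the paper's model.

```python
import math


def probability_of_intercept(p_detection: float, p_coincidence: float) -> float:
    """POI as defined in the abstract: the product of the probability of
    detection and the probability of coincidence."""
    return p_detection * p_coincidence


def scans_to_reach(poi_per_scan: float, target_poi: float) -> int:
    """Number of scan opportunities needed so that the cumulative probability
    1 - (1 - poi)^n reaches the target, assuming independent scans."""
    return math.ceil(math.log(1.0 - target_poi) / math.log(1.0 - poi_per_scan))
```

    For example, with a per-scan detection probability of 0.9 and coincidence probability of 0.5, the per-scan POI is 0.45, and under the independence assumption eight scan opportunities are needed to exceed a 0.99 cumulative intercept probability.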

  6. Architectural design of a ground-based deep-space optical reception antenna

    Science.gov (United States)

    Kerr, E. L.

    1989-01-01

    An architectural design of a ground-based antenna (telescope) for receiving optical communications from deep space is presented. Physical and optical parameters, and their effects on performance and cost considerations, are described. The channel capacity of the antenna is 100 kbits/s from Saturn and 5 Mbits/s from Mars. A novel sunshade is designed to permit optical communication even when the deep-space laser source is as close to the Sun as 12 deg. Inserts in the tubes of the sunshade permit operation at solar elongations as small as 6 or 3 deg. The Nd:YAG source laser and the Fraunhofer filter (a narrow-band predetection optical filter) are tuned to match the Doppler shifts of the source and background. A typical Saturn-to-Earth data link can reduce its source power requirement from 8.2 W to 2 W of laser output by employing a Fraunhofer filter instead of a conventional multilayer dielectric filter.
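    The Doppler tuning mentioned above can be quantified with the first-order shift Δλ = λ·v/c. The 1064 nm Nd:YAG wavelength is the standard line for that laser, while the 30 km/s line-of-sight velocity is an illustrative figure of ours, not taken from this record.

```python
C_M_PER_S = 299_792_458.0  # speed of light


def doppler_shift_nm(wavelength_nm: float, radial_velocity_m_s: float) -> float:
    """First-order Doppler wavelength shift for a given line-of-sight velocity."""
    return wavelength_nm * radial_velocity_m_s / C_M_PER_S


# A 30 km/s Earth-spacecraft radial velocity shifts the 1064 nm Nd:YAG line
# by about 0.1 nm, which a narrow-band predetection filter must track.
shift = doppler_shift_nm(1064.0, 30_000.0)
```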

  7. Coupling Fine-Scale Root and Canopy Structure Using Ground-Based Remote Sensing

    Directory of Open Access Journals (Sweden)

    Brady S. Hardiman

    2017-02-01

    Ecosystem physical structure, defined by the quantity and spatial distribution of biomass, influences a range of ecosystem functions. Remote sensing tools permit the non-destructive characterization of canopy and root features, potentially providing opportunities to link above- and belowground structure at fine spatial resolution in functionally meaningful ways. To test this possibility, we employed ground-based portable canopy LiDAR (PCL) and ground penetrating radar (GPR) along co-located transects in forested sites spanning multiple stages of ecosystem development and, consequently, of structural complexity. We examined canopy and root structural data for coherence (i.e., correlation in the frequency of spatial variation) at multiple spatial scales ≤10 m within each site using wavelet analysis. Forest sites varied substantially in vertical canopy and root structure, with leaf area index and root mass becoming more even vertically as forests aged. In all sites, above- and belowground structure, characterized as mean maximum canopy height and root mass, exhibited significant coherence at a scale of 3.5–4 m, and results suggest that the scale of coherence may increase with stand age. Our findings demonstrate that canopy and root structure are linked at characteristic spatial scales, which provides the basis to optimize scales of observation. Our study highlights the potential, and limitations, of fusing LiDAR and radar technologies to quantitatively couple above- and belowground ecosystem structure.

  8. Detection Techniques of Microsecond Gamma-Ray Bursts Using Ground-based Telescopes

    International Nuclear Information System (INIS)

    Krennrich, F.; Le Bohec, S.; Weekes, T. C.

    2000-01-01

    Gamma-ray observations above 200 MeV are conventionally made by satellite-based detectors. The EGRET detector on the Compton Gamma Ray Observatory has provided good sensitivity for the detection of bursts lasting for more than 200 ms. Theoretical predictions of high-energy gamma-ray bursts produced by the quantum mechanical decay of primordial black holes (Hawking) suggest the emission of bursts on shorter timescales. The final stage of a primordial black hole results in a burst of gamma rays, peaking around 250 MeV and lasting for a tenth of a microsecond or longer, depending on the particle physics. In this work we show that there is an observational window for using ground-based imaging Cerenkov detectors to measure gamma-ray burst emission at energies E>200 MeV. This technique, with a sensitivity for bursts lasting nanoseconds to several microseconds, is based on the detection of multiphoton-initiated air showers. (c) 2000 The American Astronomical Society

  9. Ground-based adaptive optics coronagraphic performance under closed-loop predictive control

    Science.gov (United States)

    Males, Jared R.; Guyon, Olivier

    2018-01-01

    The discovery of the exoplanet Proxima b highlights the potential for the coming generation of giant segmented mirror telescopes (GSMTs) to characterize terrestrial, potentially habitable, planets orbiting nearby stars with direct imaging. This will require continued development and implementation of optimized adaptive optics systems feeding coronagraphs on the GSMTs. Such development should proceed with an understanding of the fundamental limits imposed by atmospheric turbulence. Here, we seek to address this question with a semianalytic framework for calculating the post-coronagraph contrast in a closed-loop adaptive optics system. We start with the temporal power spectra of the Fourier basis calculated assuming frozen-flow turbulence, and then apply closed-loop transfer functions. We include the benefits of a simple predictive controller, which we show could provide over a factor of 1400 gain in raw point spread function contrast at 1 λ/D on bright stars, and more than a factor of 30 gain on an I=7.5 mag star such as Proxima. More sophisticated predictive control can be expected to improve this even further. Assuming a photon-noise-limited observing technique such as high-dispersion coronagraphy, these gains in raw contrast will decrease integration times by the same large factors. Predictive control of atmospheric turbulence should therefore be seen as one of the key technologies that will enable ground-based telescopes to characterize terrestrial planets.

  10. A ground-based near-infrared emission spectrum of the exoplanet HD 189733b.

    Science.gov (United States)

    Swain, Mark R; Deroo, Pieter; Griffith, Caitlin A; Tinetti, Giovanna; Thatte, Azam; Vasisht, Gautam; Chen, Pin; Bouwman, Jeroen; Crossfield, Ian J; Angerhausen, Daniel; Afonso, Cristina; Henning, Thomas

    2010-02-04

    Detection of molecules using infrared spectroscopy probes the conditions and compositions of exoplanet atmospheres. Water (H2O), methane (CH4), carbon dioxide (CO2), and carbon monoxide (CO) have been detected in two hot Jupiters. These previous results relied on space-based telescopes that do not provide spectroscopic capability in the 2.4-5.2 μm spectral region. Here we report ground-based observations of the dayside emission spectrum for HD 189733b between 2.0-2.4 μm and 3.1-4.1 μm, where we find a bright emission feature. Where overlap with space-based instruments exists, our results are in excellent agreement with previous measurements. A feature at approximately 3.25 μm is unexpected and difficult to explain with models that assume local thermodynamic equilibrium (LTE) conditions at the 1 bar to 10^-6 bar pressures typically sampled by infrared measurements. The most likely explanation for this feature is that it arises from non-LTE emission from CH4, similar to what is seen in the atmospheres of planets in our own Solar System. These results suggest that non-LTE effects may need to be considered when interpreting measurements of strongly irradiated exoplanets.

  11. Evidence of Urban Precipitation Anomalies from Satellite and Ground-Based Measurements

    Science.gov (United States)

    Shepherd, J. Marshall; Manyin, M.; Negri, Andrew

    2004-01-01

    Urbanization is one of the extreme cases of land use change. Most of the world's population has moved to urban areas. Although currently only 1.2% of the land is considered urban, the spatial coverage and density of cities are expected to increase rapidly in the near future. It is estimated that by the year 2025, 60% of the world's population will live in cities. Human activity in urban environments also alters weather and climate processes. However, our understanding of the effects of urbanization on the total Earth weather-climate system is incomplete. Recent literature continues to provide evidence that anomalies in precipitation exist over and downwind of major cities. Current and future research efforts are actively seeking to verify these findings and understand potential cause-effect relationships. The novelty of this study is that it utilizes rainfall data from multiple satellite data sources (e.g., TRMM precipitation radar, the TRMM geosynchronous/rain gauge merged product, and SSM/I) and ground-based measurements to identify spatial anomalies and temporal trends in precipitation for cities around the world. Early results will be presented and placed within the context of weather prediction, climate assessment, and societal applications.

  12. A Ground-Based Validation System of Teleoperation for a Space Robot

    Directory of Open Access Journals (Sweden)

    Xueqian Wang

    2012-10-01

    Teleoperation of space robots is very important for future on-orbit service. In order to ensure that tasks are accomplished successfully, ground experiments are required to verify the function and validity of the teleoperation system before a space robot is launched. In this paper, a ground-based validation subsystem is developed as part of a teleoperation system. The subsystem is mainly composed of four parts: the input verification module, the onboard verification module, the dynamics and image workstation, and the communication simulator. The input verification module, consisting of the hardware and software of the master, is used to verify the input ability. The onboard verification module, consisting of the same hardware and software as the onboard processor, is used to verify the processor's computing ability and execution schedule. In addition, the dynamics and image workstation calculates the dynamic response of the space robot and target, and generates emulated camera images, including the hand-eye cameras, the global-vision camera and the rendezvous camera. The communication simulator provides realistic communication conditions, i.e., time delays and communication bandwidth limits. Lastly, we integrated a teleoperation system and conducted many experiments on it. Experimental results show that the ground system is very useful for verifying teleoperation technology.

  13. Evidence of rock slope breathing using ground-based InSAR

    Science.gov (United States)

    Rouyet, Line; Kristensen, Lene; Derron, Marc-Henri; Michoud, Clément; Blikra, Lars Harald; Jaboyedoff, Michel; Lauknes, Tom Rune

    2017-07-01

    Ground-Based Interferometric Synthetic Aperture Radar (GB-InSAR) campaigns were performed in the summers of 2011 and 2012 in the Romsdalen valley (Møre & Romsdal county, western Norway) in order to assess displacements on the Mannen/Børa rock slope. Located 1 km to the northwest, a second GB-InSAR system continuously monitors the large Mannen rockslide. The availability of two GB-InSAR positions provides wide coverage of the rock slope, including a slight dataset overlap valuable for validation. A phenomenon of rock slope breathing is detected in a remote and hard-to-access area in mid-slope. Millimetric upward displacements are recorded in August 2011. Analysis of the 2012 GB-InSAR campaign, combined with the large dataset from the continuous station, shows that the slope is affected by an inflation/deflation phenomenon of between 5 and 10 mm along the line-of-sight. The pattern is not homogeneous in time, and inversions of movement have a seasonal recurrence. These seasonal changes are confirmed by satellite InSAR observations and may be caused by hydrogeological variations. In addition, the combination of GB-InSAR results, in situ measurements and satellite InSAR analyses contributes to a better overview of movement distribution over the whole area.

  14. On advanced estimation techniques for exoplanet detection and characterization using ground-based coronagraphs

    Science.gov (United States)

    Lawson, Peter R.; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2012-07-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  15. Ground-based infrared surveys: imaging the thermal fields at volcanoes and revealing the controlling parameters.

    Science.gov (United States)

    Pantaleo, Michele; Walter, Thomas

    2013-04-01

    Temperature monitoring is a widespread procedure in the frame of volcano hazard monitoring, since temperature changes are expected to reflect changes in volcanic activity. We propose a new approach within thermal monitoring, which is meant to shed light on the parameters controlling the fluid pathways and the fumarole sites by using infrared measurements. Ground-based infrared cameras allow one to remotely image the spatial distribution, geometric pattern and amplitude of fumarole fields on volcanoes at metre to centimetre resolution. Infrared mosaics and time series are generated and interpreted, by integrating geological field observations and modeling, to define the setting of the volcanic degassing system at shallow level. We present results for different volcano morphologies and show that lithology, structure and topography control the appearance of fumarole fields through the creation of permeability contrasts. We also show that the relative importance of those parameters is site-dependent. Deciphering the setting of the degassing system is essential for hazard assessment studies because it would improve our understanding of how the system responds to endogenous or exogenous modification.

  16. Ozone ground-based measurements by the GASCOD near-UV and visible DOAS system

    Science.gov (United States)

    Giovanelli, G.; Bonasoni, P.; Cervino, M.; Evangelisti, F.; Ravegnani, F.

    1994-01-01

    GASCOD, a near-ultraviolet and visible differential optical absorption spectrometer, was developed at CNR's FISBAT Institute in Bologna, Italy, and first tested at the Terra Nova Bay station in Antarctica (74.6 deg S, 164.6 deg E) during the 1988-1990 summer expeditions of PNRA (the Italian national research program in Antarctica, 'Programma Nazionale di Ricerche in Antartide'). A comparison with coincident O3 total column measurements taken in the same Antarctic area is presented, as is another comparison performed in Italy. Also introduced is an updated model for solar zenith measurements taken from a ground-based, upward-looking GASCOD spectrometer, which was employed for the 1991-92 winter campaign at Åre-Östersund in Sweden (63.3 deg N, 13.1 deg E) during EASOE (the European Arctic Stratospheric Ozone Experiment). The GASCOD can examine spectra from 300 to 700 nm, in 50 nm steps, by moving the spectrometer's grating. At present, it takes measurements of solar zenith radiation in the 310-342 nm range for O3 and in the 405-463 nm range for NO2.

  17. PSC and volcanic aerosol routine observations in Antarctica by UV-visible ground-based spectrometry

    Science.gov (United States)

    Sarkissian, A.; Pommereau, J. P.; Goutail, F.

    1994-01-01

    Polar stratospheric clouds (PSCs) and stratospheric aerosol can be observed by ground-based UV-visible spectrometry by looking at the variation of the color of the sky during twilight. A radiative transfer model shows that reddenings are caused by thin, high-altitude (22-28 km) layers of scatterers, while thick, low-altitude (12-20 km) layers result in blueings. The color index method, applied to 4 years of observations at Dumont d'Urville (67 deg S) from 1988 to 1991, shows that dense PSCs are uncommon, probably because the station is located at the edge of the vortex. More unexpected is the existence of a systematic seasonal variation of the color of the twilight sky - bluer in spring - which reveals the formation of a dense scattering layer at or just above the tropopause at the end of the winter. Large scattering layers are reported above the station in 1991, first in August around 12-14 km, and later in September at 22-24 km. They are attributed to volcanic aerosol from Mt Hudson and Mt Pinatubo respectively, both of which erupted in 1991. Inspection of the data shows that the lower layer rapidly entered the polar vortex but the higher one did not, remaining outside and demonstrating that the vortex was isolated at 22-26 km.

  18. Ground-based thermal imaging of stream surface temperatures: Technique and evaluation

    Science.gov (United States)

    Bonar, Scott A.; Petre, Sally J.

    2015-01-01

    We evaluated a ground-based handheld thermal imaging system for measuring water temperatures using data from eight southwestern USA streams and rivers. We found handheld thermal imagers could provide considerably more spatial information on water temperature (for our unit, one image = 19,600 individual temperature measurements) than traditional methods could supply without a prohibitive amount of effort. Furthermore, they could provide measurements of stream surface temperature almost instantaneously, compared with most traditional handheld thermometers (e.g., >20 s/reading). Spatial temperature analysis is important for measuring subtle temperature differences across waterways and for identifying warm and cold groundwater inputs. Handheld thermal imaging is less expensive and less equipment-intensive than airborne thermal imaging methods and is useful under riparian canopies. Disadvantages of handheld thermal imagers include their currently higher cost than thermometers, their susceptibility to interference when used incorrectly, and their slightly lower accuracy than traditional temperature measurement methods. Thermal imagers can only measure surface temperature, but this usually corresponds to subsurface temperatures in well-mixed streams and rivers. Using thermal imaging in select applications, such as where spatial investigations of water temperature are needed, or in conjunction with stationary temperature data loggers or handheld electronic or liquid-in-glass thermometers to characterize stream temperatures by both time and space, could provide valuable information on stream temperature dynamics. These tools will become increasingly important to fisheries biologists as costs continue to decline.

  19. Real-time threat evaluation in a ground based air defence environment

    Directory of Open Access Journals (Sweden)

    JN Roux

    2008-06-01

    In a military environment, a ground-based air defence operator is required to evaluate the tactical situation in real time and protect Defended Assets (DAs) on the ground against aerial threats by assigning available Weapon Systems (WSs) to engage enemy aircraft. Since this aerial environment requires rapid operational planning and decision making in stress situations, the associated responsibilities are typically divided between a number of operators and computerized systems that aid these operators during the decision-making processes. One such Decision Support System (DSS), a threat evaluation and weapon assignment system, assigns threat values to aircraft (with respect to DAs) in real time and uses these values to propose possible engagements of observed enemy aircraft by anti-aircraft WSs. In this paper a design of the threat evaluation part of such a DSS is put forward. The design follows the structured approach suggested in [Roux JN & van Vuuren JH, 2007, Threat evaluation and weapon assignment decision support: A review of the state of the art, ORiON, 23(2), pp. 151-187], phasing in a suite of increasingly complex qualitative and quantitative model components as more (reliable) data become available.
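    A deliberately minimal sketch of the proposal step described above: rank aircraft by their computed threat values and propose one engagement per available weapon system. The cited framework (Roux & van Vuuren, 2007) phases in far richer qualitative and quantitative models; this greedy rule is only an illustration, and the aircraft identifiers are hypothetical.

```python
def propose_engagements(threat_values, available_weapon_systems):
    """Propose engagements for the highest-threat aircraft, one per available
    weapon system. threat_values maps aircraft id -> threat value in [0, 1]."""
    ranked = sorted(threat_values.items(), key=lambda item: item[1], reverse=True)
    return [aircraft for aircraft, _ in ranked[:available_weapon_systems]]
```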

  20. Characterization of aerosol pollution events in France using ground-based and POLDER-2 satellite data

    Directory of Open Access Journals (Sweden)

    M. Kacenelenbogen

    2006-01-01

    Full Text Available We analyze the relationship between daily fine particle mass concentration (PM2.5) and columnar aerosol optical thickness derived from the Polarization and Directionality of Earth's Reflectances (POLDER) satellite sensor. The study is focused over France during the POLDER-2 lifetime between April and October 2003. We first compared the POLDER-derived aerosol optical thickness (AOT) with the integrated volume size distribution derived from ground-based Sun photometer observations. The good correlation (R=0.72) with the sub-micron volume fraction indicates that POLDER-derived AOT is sensitive to the fine aerosol mass concentration. Considering 1974 match-up data points over 28 fine particle monitoring sites, the POLDER-2 derived AOT is fairly well correlated with collocated PM2.5 measurements, with a correlation coefficient of 0.55. The correlation coefficient reaches a maximum of 0.80 for particular sites. We have analyzed the probability of finding the appropriate air quality category (AQC), as defined by the U.S. Environmental Protection Agency (EPA), from POLDER-2 AOT measurements. The probability can be up to 88.8% (±3.7%) for the "Good" AQC and 89.1% (±3.6%) for the "Moderate" AQC.
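    The two quantities reported above (a correlation coefficient between AOT and PM2.5, and a probability of landing in the right air quality category) can be sketched on synthetic data. The linear AOT-to-PM2.5 link, the noise level and the category breakpoint below are placeholders, not the EPA's exact values:

```python
import numpy as np

# Synthetic AOT / PM2.5 pairs with an assumed linear linkage plus noise.
rng = np.random.default_rng(1)
aot = rng.uniform(0.05, 0.6, 500)
pm25 = 40.0 * aot + rng.normal(0.0, 5.0, 500)

# Fit AOT -> PM2.5 and report Pearson's correlation coefficient.
slope, intercept = np.polyfit(aot, pm25, 1)
pm25_hat = slope * aot + intercept
r = np.corrcoef(aot, pm25)[0, 1]
print("correlation R = %.2f" % r)

def category(x, good_limit=15.0):  # placeholder breakpoint, not EPA's
    return "Good" if x <= good_limit else "Moderate"

# Fraction of days where the AOT-predicted category matches the true one.
match = np.mean([category(a) == category(b) for a, b in zip(pm25_hat, pm25)])
print("probability of matching category: %.1f%%" % (100 * match))
```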

  1. GROUND-BASED TRANSIT OBSERVATIONS OF THE SUPER-EARTH 55 Cnc e

    Energy Technology Data Exchange (ETDEWEB)

    De Mooij, E. J. W. [Astronomy and Astrophysics, University of Toronto, Toronto (Canada); López-Morales, M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA (United States); Karjalainen, R.; Hrudkova, M. [Isaac Newton Group of Telescopes, La Palma (Spain); Jayawardhana, Ray, E-mail: demooij@astro.utoronto.ca [Physics and Astronomy, York University, Toronto (Canada)

    2014-12-20

    We report the first ground-based detections of the shallow transit of the super-Earth exoplanet 55 Cnc e using a 2 m class telescope. Using differential spectrophotometry, we observed one transit in 2013 and another in 2014, with average spectral resolutions of ∼700 and ∼250, spanning the Johnson BVR photometric bands. We find a white light planet-to-star radius ratio of 0.0190{sub −0.0027}{sup +0.0023} from the 2013 observations and 0.0200{sub −0.0018}{sup +0.0017} from the 2014 observations. The two data sets combined result in a radius ratio of 0.0198{sub −0.0014}{sup +0.0013}. These values are all in agreement with previous space-based results. Scintillation noise in the data prevents us from placing strong constraints on the presence of an extended hydrogen-rich atmosphere. Nevertheless, our detections of 55 Cnc e in transit demonstrate that moderate-sized telescopes on the ground will be capable of routine follow-up observations of super-Earth candidates discovered by the Transiting Exoplanet Survey Satellite around bright stars. We expect it also will be possible to place constraints on the atmospheric characteristics of those planets by devising observational strategies to minimize scintillation noise.
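    The combination of the two epochs can be approximated by an inverse-variance weighted mean after symmetrizing the quoted asymmetric uncertainties; the paper's joint MCMC fit is more careful, so this is only a rough consistency sketch:

```python
import numpy as np

# Radius ratios Rp/Rs from the 2013 and 2014 transits, with symmetrized
# 1-sigma errors (average of the quoted upper and lower uncertainties).
ratios = np.array([0.0190, 0.0200])
sigmas = np.array([(0.0027 + 0.0023) / 2, (0.0018 + 0.0017) / 2])

weights = 1.0 / sigmas**2
combined = np.sum(weights * ratios) / np.sum(weights)
combined_err = np.sqrt(1.0 / np.sum(weights))

print("combined Rp/Rs = %.4f +/- %.4f" % (combined, combined_err))
print("transit depth  = %.0f ppm" % (combined**2 * 1e6))
```

    The weighted mean lands near the quoted combined value of 0.0198, as expected for consistent measurements.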

  2. Estimating atmospheric visibility using synergy of MODIS data and ground-based observations

    Science.gov (United States)

    Komeilian, H.; Mohyeddin Bateni, S.; Xu, T.; Nielson, J.

    2015-05-01

    Dust events are intricate climatic processes, which can have adverse effects on human health, safety, and the environment. In this study, two data mining approaches, namely, back-propagation artificial neural network (BP ANN) and support vector regression (SVR), were used to estimate atmospheric visibility through the synergistic use of Moderate Resolution Imaging Spectroradiometer (MODIS) Level 1B (L1B) data and ground-based observations at fourteen stations in the province of Khuzestan (southwestern Iran), during 2009-2010. Reflectance and brightness temperature in different bands (from MODIS) along with in situ meteorological data were input to the models to estimate atmospheric visibility. The results show that both models can accurately estimate atmospheric visibility. The visibility estimates from the BP ANN had a root-mean-square error (RMSE) and Pearson's correlation coefficient (R) of 0.67 and 0.69, respectively. The corresponding RMSE and R from the SVR model were 0.59 and 0.71, implying that the SVR approach outperforms the BP ANN.
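    The evaluation step (RMSE and Pearson's R on held-out data) can be sketched as follows. A plain least-squares fit stands in for the BP ANN and SVR models, and the three predictors are invented stand-ins for MODIS reflectance, brightness temperature and station meteorology:

```python
import numpy as np

# Synthetic training/validation data with an assumed linear relationship.
rng = np.random.default_rng(2)
n = 300
X = rng.normal(size=(n, 3))                  # reflectance, Tb, wind speed
y = 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2] + rng.normal(0.0, 0.3, n)

# Fit on the first 200 samples, evaluate on the remaining 100.
A = np.c_[X[:200], np.ones(200)]
coef, *_ = np.linalg.lstsq(A, y[:200], rcond=None)
pred = np.c_[X[200:], np.ones(100)] @ coef

rmse = np.sqrt(np.mean((y[200:] - pred) ** 2))
r = np.corrcoef(y[200:], pred)[0, 1]
print("RMSE = %.2f, R = %.2f" % (rmse, r))
```

    The paper's nonlinear models would replace the least-squares step; the scoring on held-out stations is the same.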

  3. Spent coffee grounds-based activated carbon preparation for sequestering of malachite green

    Science.gov (United States)

    Lim, Jun-Wei; Lam, Keat-Ying; Bashir, Mohammed J. K.; Yeong, Yin-Fong; Lam, Man-Kee; Ho, Yeek-Chia

    2016-11-01

    The key aim of the reported work was to optimize the fabrication factors of spent coffee grounds-based activated carbon (SCG-bAC) used to sequester Malachite Green (MG) from aqueous solution via an adsorption process. The fabrication factors of impregnation ratio with ortho-phosphoric acid, activation temperature and activation time were simultaneously optimized by central composite design (CCD) of response surface methodology (RSM), targeting maximum removal of MG. At the optimum condition, 96.3% of MG was successfully removed by SCG-bAC at an impregnation ratio with ortho-phosphoric acid of 0.50, activation temperature of 554°C and activation time of 31.4 min. A statistical model that could predict the MG removal percentage was also derived and statistically confirmed to be significant. Subsequently, the MG adsorption equilibrium data were found to be well fitted by the Langmuir isotherm model, indicating the predominance of monolayer adsorption of MG on the SCG-bAC surface. To conclude, the findings from this study unveil the potential of spent coffee grounds as an alternative precursor for fabricating low-cost AC for the treatment of wastewater loaded with the MG pollutant.
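    The response-surface idea behind the CCD optimization can be illustrated for a single factor: fit a quadratic to removal versus activation temperature and solve for the stationary point. The data points below are invented and one-dimensional, whereas the study optimized three factors simultaneously:

```python
import numpy as np

# Invented removal (%) measured at five activation temperatures (deg C).
temp = np.array([450.0, 500.0, 550.0, 600.0, 650.0])
removal = np.array([82.0, 91.0, 96.0, 93.0, 84.0])

# Second-order response surface in one factor: removal = c2*T^2 + c1*T + c0.
c2, c1, c0 = np.polyfit(temp, removal, 2)
t_opt = -c1 / (2.0 * c2)                     # vertex of the parabola
print("predicted optimum temperature: %.0f C" % t_opt)
print("predicted removal there: %.1f%%" % np.polyval([c2, c1, c0], t_opt))
```

    RSM generalizes this to a multi-factor quadratic surface whose stationary point gives the joint optimum.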

  4. Remote sensing of Sonoran Desert vegetation structure and phenology with ground-based LiDAR

    Science.gov (United States)

    Sankey, Joel B.; Munson, Seth M.; Webb, Robert H.; Wallace, Cynthia S.A.; Duran, Cesar M.

    2015-01-01

    Long-term vegetation monitoring efforts have become increasingly important for understanding ecosystem response to global change. Many traditional methods for monitoring can be infrequent and limited in scope. Ground-based LiDAR is one remote sensing method that offers a clear advancement to monitor vegetation dynamics at high spatial and temporal resolution. We determined the effectiveness of LiDAR to detect intra-annual variability in vegetation structure at a long-term Sonoran Desert monitoring plot dominated by cacti, deciduous and evergreen shrubs. Monthly repeat LiDAR scans of perennial plant canopies over the course of one year had high precision. LiDAR measurements of canopy height and area were accurate with respect to total station survey measurements of individual plants. We found an increase in the number of LiDAR vegetation returns following the wet North American Monsoon season. This intra-annual variability in vegetation structure detected by LiDAR was attributable to a drought deciduous shrub Ambrosia deltoidea, whereas the evergreen shrub Larrea tridentata and cactus Opuntia engelmannii had low variability. Benefits of using LiDAR over traditional methods to census desert plants are more rapid, consistent, and cost-effective data acquisition in a high-resolution, 3-dimensional context. We conclude that repeat LiDAR measurements can be an effective method for documenting ecosystem response to desert climatology and drought over short time intervals and at detailed-local spatial scale.
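    A minimal sketch of deriving canopy-structure metrics from a ground-based LiDAR point cloud, assuming a simple gridded maximum-height model (the kind of metric that can be checked against total station surveys of individual plants); the point cloud is synthetic:

```python
import numpy as np

# Synthetic point cloud: x, y positions over a 10 m x 10 m plot and
# return heights z, with a taller shrub band between x = 4 and 6 m.
rng = np.random.default_rng(3)
pts = rng.uniform(0.0, 10.0, size=(5000, 2))
z = rng.uniform(0.0, 1.5, size=5000)
z[(pts[:, 0] > 4) & (pts[:, 0] < 6)] += 1.0

# Grid into 1 m cells and keep the highest return per cell as canopy height.
grid = np.zeros((10, 10))
ix = np.clip(pts[:, 0].astype(int), 0, 9)
iy = np.clip(pts[:, 1].astype(int), 0, 9)
np.maximum.at(grid, (ix, iy), z)

print("mean canopy height: %.2f m" % grid.mean())
print("tallest cell: %.2f m" % grid.max())
```

    Repeating the same reduction on monthly scans would expose intra-annual changes in return counts and canopy height.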

  5. A ground-based optical transmission spectrum of WASP-6b

    International Nuclear Information System (INIS)

    Jordán, Andrés; Espinoza, Néstor; Rabus, Markus; Eyheramendy, Susana; Sing, David K.; Désert, Jean-Michel; Bakos, Gáspár Á.; Fortney, Jonathan J.; López-Morales, Mercedes; Szentgyorgyi, Andrew; Maxted, Pierre F. L.; Triaud, Amaury H. M. J.

    2013-01-01

    We present a ground-based optical transmission spectrum of the inflated sub-Jupiter-mass planet WASP-6b. The spectrum was measured in 20 spectral channels from 480 nm to 860 nm using a series of 91 spectra over a complete transit event. The observations were carried out using multi-object differential spectrophotometry with the Inamori-Magellan Areal Camera and Spectrograph on the Baade Telescope at Las Campanas Observatory. We model systematic effects on the observed light curves using principal component analysis on the comparison stars and allow for the presence of short and long memory correlation structure in our Monte Carlo Markov Chain analysis of the transit light curves for WASP-6. The measured transmission spectrum presents a general trend of decreasing apparent planetary size with wavelength and lacks evidence for broad spectral features of Na and K predicted by clear atmosphere models. The spectrum is consistent with that expected for scattering that is more efficient in the blue, as could be caused by hazes or condensates in the atmosphere of WASP-6b. WASP-6b therefore appears to be yet another massive exoplanet with evidence for a mostly featureless transmission spectrum, underscoring the importance that hazes and condensates can have in determining the transmission spectra of exoplanets.
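    The principal-component systematics correction can be sketched with plain linear algebra: extract PCs from the comparison-star ensemble and fit them (plus an offset) to the target light curve. Everything below is synthetic, and the paper's treatment of correlated noise in the MCMC is omitted:

```python
import numpy as np

# Synthetic series of 91 flux measurements: 6 comparison stars sharing a
# common systematic trend, and a target with the same trend plus a transit.
rng = np.random.default_rng(4)
n_time, n_comp = 91, 6
trend = 0.01 * np.sin(np.linspace(0.0, 3.0, n_time))
comps = trend[:, None] + rng.normal(0.0, 0.002, (n_time, n_comp))
target = 1.0 + trend + rng.normal(0.0, 0.002, n_time)
target[40:55] -= 0.01                          # transit dip

# Principal components of the mean-subtracted comparison ensemble.
U, s, Vt = np.linalg.svd(comps - comps.mean(axis=0), full_matrices=False)
basis = np.c_[np.ones(n_time), U[:, :2]]       # offset + first two PCs

# Fit the systematics model to the target and remove it.
coef, *_ = np.linalg.lstsq(basis, target, rcond=None)
detrended = target - basis @ coef + 1.0
print("scatter before: %.4f, after: %.4f" % (target.std(), detrended.std()))
```

    In a real analysis the transit model is fitted jointly with the systematics so the dip is not absorbed by the PCs.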

  6. Mobile Ground-Based Radar Sensor for Localization and Mapping: An Evaluation of two Approaches

    Directory of Open Access Journals (Sweden)

    Damien Vivet

    2013-08-01

    Full Text Available This paper is concerned with robotic applications using a ground-based radar sensor for simultaneous localization and mapping problems. In mobile robotics, radar technology is interesting because of its long range and the robustness of radar waves to atmospheric conditions, making these sensors well-suited for extended outdoor robotic applications. Two localization and mapping approaches using data obtained from a 360° field of view microwave radar sensor are presented and compared. The first method is a trajectory-oriented simultaneous localization and mapping technique, which makes no landmark assumptions and avoids the data association problem. The estimation of the ego-motion makes use of the Fourier-Mellin transform for registering radar images in a sequence, from which the rotation and translation of the sensor motion can be estimated. The second approach exploits the consequences of using a rotating range sensor in high-speed robotics: in such a situation, combinations of movements create distortions in the collected data. Velocimetry is achieved here by explicitly analysing these measurement distortions. As a result, the trajectory of the vehicle and then the radar map of outdoor environments can be obtained. The evaluation of experimental results obtained by the two methods is presented on real-world data from a vehicle moving at 30 km/h over a 2.5 km course.
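    The translation part of Fourier-Mellin registration reduces to phase correlation, which can be sketched directly with FFTs (rotation would first be handled by log-polar resampling, omitted here). The radar "images" below are synthetic:

```python
import numpy as np

# Two synthetic scans where the second is the first shifted by (3, 7) pixels.
rng = np.random.default_rng(5)
img1 = rng.random((64, 64))
img2 = np.roll(img1, shift=(3, 7), axis=(0, 1))

# Phase correlation: normalized cross-power spectrum, peak gives the shift.
F1, F2 = np.fft.fft2(img1), np.fft.fft2(img2)
cross = np.conj(F1) * F2
corr = np.fft.ifft2(cross / np.abs(cross)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
print("recovered shift:", dy, dx)             # recovered shift: 3 7
```

    Normalizing by the magnitude keeps only phase information, which makes the correlation peak sharp even for low-contrast scans.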

  7. Suitability assessment of OPC UA as the backbone of ground-based observatory control systems

    International Nuclear Information System (INIS)

    Pessemier, W.; Raskin, G.; Van Winckel, H.; Deconinck, G.; Saey, P.

    2012-01-01

    A common requirement of modern observatory control systems is to allow interaction between various heterogeneous subsystems in a transparent way. However, the integration of off-the-shelf (OTS) industrial products - such as Programmable Logic Controllers (PLCs) and Supervisory Control And Data Acquisition (SCADA) software - has long been hampered by the lack of an adequate interfacing method. With the advent of the Unified Architecture (UA) version of OPC (Object Linking and Embedding for Process Control), the limitations of the original industry accepted interface are now lifted, and also much more functionality has been defined. In this paper the most important features of OPC UA are matched against the requirements of ground-based observatory control systems in general and in particular of the 1.2 m Mercator Telescope. We investigate the opportunities of the 'information modelling' idea behind OPC UA, which could allow an extensive standardization in the field of astronomical instrumentation, similar to the efforts emerging in several industry domains. Because OPC UA is designed for both horizontal and vertical integration of heterogeneous subsystems, we explore its capabilities to serve as the backbone of a dependable and scalable observatory control system, treating industrial components like PLCs no differently than custom software components. Performance measurements and tests with a sample of OTS OPC UA products are presented. (authors)

  8. Remote sensing of the lightning heating effect duration with ground-based microwave radiometer

    Science.gov (United States)

    Jiang, Sulin; Pan, Yun; Lei, Lianfa; Ma, Lina; Li, Qing; Wang, Zhenhui

    2018-06-01

    Artificially triggered lightning events from May 26, 2017 to July 16, 2017 at the Guangzhou Field Experiment Site for Lightning Research and Test (GFESL) were remotely sensed with a ground-based microwave radiometer for the first time, in order to obtain the features of the lightning heating effect. The microwave radiometer antenna was adjusted to point at a certain elevation angle towards the expected discharge path of the artificially triggered lightning. Eight of the 16 successfully triggered lightning events were captured, and brightness temperature data at four frequencies in the K and V bands were obtained. Time series analysis of the data shows that artificially triggered lightning can make the radiometer generate brightness temperature pulses, with amplitudes in the range of 2.0 K to 73.8 K. The brightness temperature pulses associated with 7 events can be used to estimate the duration of the lightning heating effect by counting the number of pulses in the continuous pulse sequence and using the sampling interval between the four frequencies. The maximum duration of the lightning heating effect is 1.13 s, the minimum is 0.172 s, and the average is 0.63 s.
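    The duration estimate reduces to counting contiguous above-threshold samples and multiplying by the sampling interval. The interval, background level and pulse amplitude below are invented:

```python
import numpy as np

# Synthetic brightness-temperature record: a 290 K background with a
# 0.6 s heating episode raised by 25 K, sampled every 0.02 s (assumed).
dt = 0.02
t = np.arange(0.0, 2.0, dt)
tb = np.full(t.size, 290.0)
tb[30:60] += 25.0

# Simple threshold detector: duration = flagged samples x sampling interval.
pulse = tb > tb.min() + 2.0
duration = pulse.sum() * dt
print("estimated heating duration: %.2f s" % duration)   # 0.60 s
```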

  9. Ground-based grasslands data to support remote sensing and ecosystem modeling of terrestrial primary production

    Science.gov (United States)

    Olson, R. J.; Scurlock, J. M. O.; Turner, R. S.; Jennings, S. V.

    1995-01-01

    Estimating terrestrial net primary production (NPP) using remote-sensing tools and ecosystem models requires adequate ground-based measurements for calibration, parameterization, and validation. These data needs were strongly endorsed at a recent meeting of ecosystem modelers organized by the International Geosphere-Biosphere Program's (IGBP's) Data and Information System (DIS) and its Global Analysis, Interpretation, and Modelling (GAIM) Task Force. To meet these needs, a multinational, multiagency project is being coordinated by the IGBP DIS to compile existing NPP data from field sites and to regionalize NPP point estimates to various-sized grid cells. Progress at Oak Ridge National Laboratory (ORNL) on compiling NPP data for grasslands as part of the IGBP DIS data initiative is described. Site data and associated documentation from diverse field studies are being acquired for selected grasslands and are being reviewed for completeness, consistency, and adequacy of documentation, including a description of sampling methods. Data are being compiled in a database with spatial, temporal, and thematic characteristics relevant to remote sensing and global modeling. NPP data are available from the ORNL Distributed Active Archive Center (DAAC) for biogeochemical dynamics. The ORNL DAAC is part of the Earth Observing System Data and Information System, of the US National Aeronautics and Space Administration.

  10. Monitoring geospace disturbances through coordinated space-borne and ground-based magnetometer observations

    Science.gov (United States)

    Balasis, Georgios

    2014-05-01

    Recently, automated methods of deriving the characteristics of ultra-low-frequency (ULF) waves in the magnetosphere have been developed (Balasis et al., 2012, 2013), which can be effectively applied to the huge datasets from the new ESA Swarm mission, in order to retrieve, on an operational basis, new information about the near-Earth electromagnetic environment. Processing Swarm measurements with these methods will help to elucidate the processes influencing the generation and propagation of ULF waves, which in turn play a crucial role in magnetospheric dynamics. Moreover, a useful platform based on a combination of wavelet transforms and artificial neural networks has been developed to monitor the wave evolution from the outer boundaries of Earth's magnetosphere through the topside ionosphere down to the surface. Data from a Low Earth Orbit (LEO) satellite (CHAMP) and two magnetospheric missions (Cluster and Geotail) along with three ground-based magnetic networks (CARISMA, GIMA and IMAGE), during the Halloween 2003 magnetic superstorm when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, are used to demonstrate the potential of the analysis technique in studying wave evolution in detail.

  11. New efforts using helicopter-borne and ground based electromagnetics for mineral exploration

    Science.gov (United States)

    Meyer, U.; Siemon, B.; Noell, U.; Gutzmer, J.; Spitzer, K.; Becken, M.

    2014-12-01

    Throughout the last decades mineral resources, especially rare earth elements, have gained steadily growing importance in industry and therefore also in exploration. New targets for mineral investigations came into focus, and known sources have been and will be revisited. Since most mining for mineral resources in the past took place in the upper hundred metres below the surface, new techniques have made deeper mining economically feasible. Consequently, mining engineers need the best possible knowledge about the full spatial extent of prospective geological structures, including their maximum depths. Especially in Germany and Europe, policy has shifted towards relying not only on the global mineral trade market but also on national resources, where available. BGR and partners therefore started research programs on different levels to evaluate and develop new technologies for environmentally friendly, non-invasive spatial exploration using airborne and partly ground-based electromagnetic methods. Mining waste heaps have been explored for valuable residual minerals (research project ROBEHA), a promising tin-bearing ore body is being explored by airborne electromagnetics (research project E3), and a new airborne technology aims to reach investigation depths of about 1 km (research project DESMEX). First results of the projects ROBEHA and E3 will be presented and the project layout of DESMEX will be discussed.

  12. Statistical retrieval of thin liquid cloud microphysical properties using ground-based infrared and microwave observations

    Science.gov (United States)

    Marke, Tobias; Ebell, Kerstin; Löhnert, Ulrich; Turner, David D.

    2016-12-01

    In this article, liquid water cloud microphysical properties are retrieved by a combination of microwave and infrared ground-based observations. Clouds containing liquid water occur frequently in most climate regimes and play a significant role in terms of interaction with radiation. Small perturbations in the amount of liquid water contained in the cloud can cause large variations in the radiative fluxes. This effect is enhanced for thin clouds (low liquid water path, LWP), making an accurate retrieval of cloud properties crucial. Due to large relative errors in retrieving low LWP values from observations in the microwave domain, and a high sensitivity of infrared methods when the LWP is low, a synergistic retrieval based on a neural network approach is built to estimate both LWP and cloud effective radius (reff). These statistical retrievals can be applied without high computational demand but imply constraints, such as prior information on cloud phase and cloud layering. The neural network retrievals are able to retrieve LWP and reff for thin clouds with a mean relative error of 9% and 17%, respectively. This is demonstrated using synthetic observations of a microwave radiometer (MWR) and a spectrally highly resolved infrared interferometer. The accuracy and robustness of the synergistic retrievals are confirmed by a low bias in a radiative closure study for the downwelling shortwave flux, even for marginally invalid scenes. Also, broadband infrared radiance observations, in combination with the MWR, have the potential to retrieve LWP with a higher accuracy than an MWR-only retrieval.
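    A heavily simplified sketch of the neural-network synergy: a tiny one-hidden-layer network, trained by plain gradient descent, maps two simulated channels (one microwave-like, one infrared-like) to LWP. The forward model, noise levels and network size are all invented and far cruder than the paper's setup:

```python
import numpy as np

# Synthetic training set: LWP in g/m^2 drives a linear microwave signal
# and a saturating infrared signal, each with additive noise.
rng = np.random.default_rng(6)
lwp = rng.uniform(5.0, 60.0, 2000)
mw = 0.01 * lwp + rng.normal(0.0, 0.05, 2000)
ir = 1.0 - np.exp(-0.08 * lwp) + rng.normal(0.0, 0.01, 2000)
X = np.c_[mw, ir]
y = lwp / 60.0                                 # normalized target

# One hidden layer of 8 tanh units, trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
w2 = rng.normal(0.0, 0.5, 8); b2 = 0.0
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    out = h @ w2 + b2
    err = out - y
    gh = np.outer(err, w2) * (1 - h**2)        # backpropagated error
    W1 -= lr * (X.T @ gh) / len(y); b1 -= lr * gh.mean(axis=0)
    w2 -= lr * (h.T @ err) / len(y); b2 -= lr * err.mean()

out = np.tanh(X @ W1 + b1) @ w2 + b2           # final forward pass
rel_err = np.abs(out - y).mean() / y.mean()
print("mean relative LWP error: %.0f%%" % (100 * rel_err))
```

    The paper's retrieval additionally outputs reff and is trained on radiative-transfer simulations rather than this toy forward model.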

  13. FINDING EXTRATERRESTRIAL LIFE USING GROUND-BASED HIGH-DISPERSION SPECTROSCOPY

    International Nuclear Information System (INIS)

    Snellen, I. A. G.; Le Poole, R.; Brogi, M.; Birkby, J.; De Kok, R. J.

    2013-01-01

    Exoplanet observations promise one day to unveil the presence of extraterrestrial life. Atmospheric compounds in strong chemical disequilibrium would point to large-scale biological activity just as oxygen and methane do in the Earth's atmosphere. The cancellation of both the Terrestrial Planet Finder and Darwin missions means that it is unlikely that a dedicated space telescope to search for biomarker gases in exoplanet atmospheres will be launched within the next 25 years. Here we show that ground-based telescopes provide a strong alternative for finding biomarkers in exoplanet atmospheres through transit observations. Recent results on hot Jupiters show the enormous potential of high-dispersion spectroscopy to separate the extraterrestrial and telluric signals, making use of the Doppler shift of the planet. The transmission signal of oxygen from an Earth-twin orbiting a small red dwarf star is only a factor of three smaller than that of carbon monoxide recently detected in the hot Jupiter τ Boötis b, albeit such a star will be orders of magnitude fainter. We show that if Earth-like planets are common, the planned extremely large telescopes can detect oxygen within a few dozen transits. Ultimately, large arrays of dedicated flux-collector telescopes equipped with high-dispersion spectrographs can provide the large collecting area needed to perform a statistical study of life-bearing planets in the solar neighborhood.
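    The core of the high-dispersion technique is cross-correlating the observed spectrum with a template over trial Doppler velocities. The sketch below injects a 30 km/s shift into a synthetic spectrum with made-up line positions near the O2 A band and recovers it:

```python
import numpy as np

c = 299792.458                                   # speed of light, km/s
wave = np.linspace(760.0, 770.0, 4000)           # wavelength grid, nm

# Template with a few invented absorption lines.
template = np.ones_like(wave)
for line in (761.0, 763.5, 766.2, 768.9):
    template -= 0.3 * np.exp(-0.5 * ((wave - line) / 0.02) ** 2)

# "Observed" spectrum: template redshifted by 30 km/s, plus noise.
v_true = 30.0
observed = np.interp(wave, wave * (1 + v_true / c), template)
observed += np.random.default_rng(7).normal(0.0, 0.05, wave.size)

# Cross-correlate shifted templates against the observation.
velocities = np.arange(-100.0, 100.5, 0.5)
ccf = [np.dot(np.interp(wave, wave * (1 + v / c), template) - 1.0,
              observed - 1.0) for v in velocities]
v_best = velocities[int(np.argmax(ccf))]
print("recovered velocity: %.1f km/s" % v_best)
```

    In practice the planetary signal is buried far below the noise in any single spectrum, and it is the combination of many lines and many exposures, after removing telluric and stellar contributions, that makes the cross-correlation peak detectable.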

  14. Reaching for the stars - New developments in ground-based astronomy

    CERN Document Server

    CERN. Geneva

    2015-01-01

    I will briefly review the state-of-the-art in ground-based astronomy - both on the telescope side and the instrument side. Interesting parallels can be drawn in cost, construction and operations with particle physics facilities. I will then present some recent results in the two hottest topics in astronomy, driving the requests for more advanced facilities: exoplanets and the hunt for life beyond the solar system (calling for Extremely Large Telescopes); and cosmology and the understanding of dark energy (calling for large survey telescopes). This will lead to a description of the latest telescope project developments on the ground: the on-going construction of the Large Synoptic Survey Telescope on a quest to better understand dark energy, and the start of the construction of three Extremely Large Telescopes by European and US-led international consortia, hoping to find life on planets around nearby stars.   ATS Seminars Organisers: H. Burkhardt (BE), M. Modena (TE), T. Stora (EN)

  15. The SPQR experiment: detecting damage to orbiting spacecraft with ground-based telescopes

    Science.gov (United States)

    Paolozzi, Antonio; Porfilio, Manfredi; Currie, Douglas G.; Dantowitz, Ronald F.

    2007-09-01

    The objective of the Specular Point-like Quick Reference (SPQR) experiment was to evaluate the possibility of improving the resolution of ground-based telescopic imaging of manned spacecraft in orbit. The concept was to reduce image distortions due to atmospheric turbulence by evaluating the Point Spread Function (PSF) of a point-like light reference and processing the spacecraft image accordingly. The target spacecraft was the International Space Station (ISS) and the point-like reference was provided by a laser beam emitted by the ground station and reflected back to the telescope by a Cube Corner Reflector (CCR) mounted on an ISS window. The ultimate objective of the experiment was to demonstrate that it is possible to image spacecraft in Low Earth Orbit (LEO) with a resolution of 20 cm, which would have probably been sufficient to detect the damage which caused the Columbia disaster. The experiment was successfully performed from March to May 2005. The paper provides an overview of the SPQR experiment.

  16. Ground-based grasslands data to support remote sensing and ecosystem modeling of terrestrial primary production

    Energy Technology Data Exchange (ETDEWEB)

    Olson, R.J.; Turner, R.S. [Oak Ridge National Lab., TN (United States)]; Scurlock, J.M.O. [King's College London (England)]; Jennings, S.V. [Tennessee Univ., Knoxville, TN (United States)]

    1995-12-31

    Estimating terrestrial net primary production (NPP) using remote-sensing tools and ecosystem models requires adequate ground-based measurements for calibration, parameterization, and validation. These data needs were strongly endorsed at a recent meeting of ecosystem modelers organized by the International Geosphere-Biosphere Programme's (IGBP's) Data and Information System (DIS) and its Global Analysis, Interpretation, and Modelling (GAIM) Task Force. To meet these needs, a multinational, multiagency project is being coordinated by the IGBP DIS to compile existing NPP data from field sites and to regionalize NPP point estimates to various-sized grid cells. Progress at Oak Ridge National Laboratory (ORNL) on compiling NPP data for grasslands as part of the IGBP DIS data initiative is described. Site data and associated documentation from diverse field studies are being acquired for selected grasslands and are being reviewed for completeness, consistency, and adequacy of documentation, including a description of sampling methods. Data are being compiled in a database with spatial, temporal, and thematic characteristics relevant to remote sensing and global modeling. NPP data are available from the ORNL Distributed Active Archive Center (DAAC) for biogeochemical dynamics. The ORNL DAAC is part of the Earth Observing System Data and Information System, of the US National Aeronautics and Space Administration.

  17. Component design challenges for the ground-based SP-100 nuclear assembly test

    International Nuclear Information System (INIS)

    Markley, R.A.; Disney, R.K.; Brown, G.B.

    1989-01-01

    The SP-100 ground engineering system (GES) program involves a ground test of the nuclear subsystems to demonstrate their design. The GES nuclear assembly test (NAT) will be performed in a simulated space environment within a vessel maintained at ultrahigh vacuum. The NAT employs a radiation shielding system that is comprised of both prototypical and nonprototypical shield subsystems to attenuate the reactor radiation leakage and also nonprototypical heat transport subsystems to remove the heat generated by the reactor. The reactor is cooled by liquid lithium, which will operate at temperatures prototypical of the flight system. In designing the components for these systems, a number of design challenges were encountered in meeting the operational requirements of the simulated space environment (and where necessary, prototypical requirements) while also accommodating the restrictions of a ground-based test facility with its limited available space. This paper presents a discussion of the design challenges associated with the radiation shield subsystem components and key components of the heat transport systems

  18. The emission function of ground-based light sources: State of the art and research challenges

    Science.gov (United States)

    Solano Lamphar, Héctor Antonio

    2018-05-01

    To understand the night sky radiance generated by the light emissions of urbanised areas, researchers are currently proposing various theoretical approaches. The distribution of the radiant intensity as a function of the zenith angle is one of the least understood properties in modelling skyglow. This is due to the collective effects of the artificial radiation emitted from ground-based light sources. The emission function is a key property in characterising the sky brightness under arbitrary conditions, and is therefore required by modellers, environmental engineers, urban planners, light pollution researchers, and experimentalists who study the diffuse light of the night sky. As a matter of course, the emission function considers the public lighting system, which is in fact the main generator of skyglow. Still, other classes of light-emitting devices have been gaining importance owing to their overuse and the urban sprawl of recent years. This paper addresses the importance of the emission function in modelling skyglow and the factors involved in its characterisation. On this subject, the author's intention is to organise, integrate, and evaluate previously published research in order to state the progress of current research toward clarifying this topic.

  19. Overview of diffraction gratings technologies for spaceflight satellites and ground-based telescopes

    Science.gov (United States)

    Cotel, A.; Liard, A.; Desserouer, F.; Pichon, P.

    2017-11-01

    Diffraction gratings are widely used in space-flight satellites for spectrograph instruments and in ground-based telescopes in astronomy. They are among the key optical components of such systems and have to exhibit very high optical performance. HORIBA Jobin Yvon S.A.S. (part of the HORIBA Group) has been at the forefront of such grating development for more than 40 years. During the past decades, HORIBA Jobin Yvon (HJY) has developed a unique expertise in diffraction grating design and manufacturing processes for holographic, ruled or etched gratings. We will present in this paper an overview of diffraction grating technologies especially designed for space and astronomy applications. We will first review the heritage of the company in this field with the space qualification of different grating types. Then, we will describe several key grating technologies developed for specific space or astronomy projects: ruled blazed low-groove-density plane reflection gratings, high-groove-density holographic toroidal and spherical gratings, and finally transmission Fused Silica Etched (FSE) grism-assembled gratings. We will not cover the Volume Phase Holographic (VPHG) grating type, which is also used in astronomy.

  20. Comparison of GOME tropospheric NO2 columns with NO2 profiles deduced from ground-based in situ measurements

    Science.gov (United States)

    Schaub, D.; Boersma, K. F.; Kaiser, J. W.; Weiss, A. K.; Folini, D.; Eskes, H. J.; Buchmann, B.

    2006-08-01

    Nitrogen dioxide (NO2) vertical tropospheric column densities (VTCs) retrieved from the Global Ozone Monitoring Experiment (GOME) are compared to coincident ground-based tropospheric NO2 columns. The ground-based columns are deduced from in situ measurements at different altitudes in the Alps for 1997 to June 2003, yielding a unique long-term comparison of GOME NO2 VTC data retrieved by a collaboration of KNMI (Royal Netherlands Meteorological Institute) and BIRA/IASB (Belgian Institute for Space Aeronomy) with independently derived tropospheric NO2 profiles. A first comparison relates the GOME retrieved tropospheric columns to the tropospheric columns obtained by integrating the ground-based NO2 measurements. For a second comparison, the tropospheric profiles constructed from the ground-based measurements are first multiplied with the averaging kernel (AK) of the GOME retrieval. The second approach makes the comparison independent from the a priori NO2 profile used in the GOME retrieval. This allows splitting the total difference between the column data sets into two contributions: one that is due to differences between the a priori and the ground-based NO2 profile shapes, and another that can be attributed to uncertainties in both the remaining retrieval parameters (such as, e.g., surface albedo or aerosol concentration) and the ground-based in situ NO2 profiles. For anticyclonic clear sky conditions the comparison indicates a good agreement between the columns (n=157, R=0.70/0.74 for the first/second comparison approach, respectively). The mean relative difference (with respect to the ground-based columns) is -7% with a standard deviation of 40% and GOME on average slightly underestimating the ground-based columns. Both data sets show a similar seasonal behaviour with a distinct maximum of spring NO2 VTCs. Further analysis indicates small GOME columns being systematically smaller than the ground-based ones. 
The influence of different shapes in the a priori and
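The two comparison approaches described in this record can be sketched numerically. The following is a minimal illustration, not the authors' code: the 4-layer partial-column profile and averaging-kernel values are invented for the example, and real GOME kernels have many more layers.

```python
import numpy as np

def smooth_with_kernel(ground_profile, avg_kernel):
    """Weight a ground-based partial-column profile (molecules/cm^2 per
    layer) with the satellite column averaging kernel, so that both
    column estimates share the same vertical sensitivity."""
    return float(np.dot(avg_kernel, ground_profile))

# Hypothetical 4-layer example: partial columns and kernel values.
profile = np.array([4.0e14, 2.0e14, 1.0e14, 0.5e14])
kernel = np.array([0.6, 0.9, 1.1, 1.2])

direct_column = float(profile.sum())                    # first approach
smoothed_column = smooth_with_kernel(profile, kernel)   # second approach

# Relative difference with respect to the ground-based column, the
# statistic behind the -7% mean difference quoted in the abstract.
rel_diff = (smoothed_column - direct_column) / direct_column * 100.0
```

Comparing `smoothed_column` rather than `direct_column` against the satellite column removes the part of the discrepancy that is only due to the a priori profile shape.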

  1. Comparison of GOME tropospheric NO2 columns with NO2 profiles deduced from ground-based in situ measurements

    Directory of Open Access Journals (Sweden)

    D. Schaub

    2006-01-01

    Full Text Available Nitrogen dioxide (NO2 vertical tropospheric column densities (VTCs retrieved from the Global Ozone Monitoring Experiment (GOME are compared to coincident ground-based tropospheric NO2 columns. The ground-based columns are deduced from in situ measurements at different altitudes in the Alps for 1997 to June 2003, yielding a unique long-term comparison of GOME NO2 VTC data retrieved by a collaboration of KNMI (Royal Netherlands Meteorological Institute and BIRA/IASB (Belgian Institute for Space Aeronomy with independently derived tropospheric NO2 profiles. A first comparison relates the GOME retrieved tropospheric columns to the tropospheric columns obtained by integrating the ground-based NO2 measurements. For a second comparison, the tropospheric profiles constructed from the ground-based measurements are first multiplied with the averaging kernel (AK of the GOME retrieval. The second approach makes the comparison independent from the a priori NO2 profile used in the GOME retrieval. This allows splitting the total difference between the column data sets into two contributions: one that is due to differences between the a priori and the ground-based NO2 profile shapes, and another that can be attributed to uncertainties in both the remaining retrieval parameters (such as, e.g., surface albedo or aerosol concentration and the ground-based in situ NO2 profiles. For anticyclonic clear sky conditions the comparison indicates a good agreement between the columns (n=157, R=0.70/0.74 for the first/second comparison approach, respectively. The mean relative difference (with respect to the ground-based columns is −7% with a standard deviation of 40% and GOME on average slightly underestimating the ground-based columns. Both data sets show a similar seasonal behaviour with a distinct maximum of spring NO2 VTCs. Further analysis indicates small GOME columns being systematically smaller than the ground-based ones. 
The influence of different shapes in the a

  2. Observations of temporal change of nighttime cloud cover from Himawari 8 and ground-based sky camera over Chiba, Japan

    Science.gov (United States)

    Lagrosas, N.; Gacal, G. F. B.; Kuze, H.

    2017-12-01

    Detection of nighttime cloud from Himawari 8 is implemented using the difference of digital numbers from bands 13 (10.4 µm) and 7 (3.9 µm). A digital number difference of -1.39×10^4 can be used as a threshold to separate clouds from clear sky conditions. To look at observations from the ground over Chiba, a digital camera (Canon PowerShot A2300) is used to take images of the sky every 5 minutes at an exposure time of 5 s at the Center for Environmental Remote Sensing, Chiba University. From these images, cloud cover values are obtained using a threshold algorithm (Gacal et al., 2016). Ten-minute nighttime cloud cover values from these two datasets are compared and analyzed from 29 May to 05 June 2017 (20:00-03:00 JST). When compared with lidar data, the camera can detect thick high-level clouds up to 10 km. The results show that during clear sky conditions (02-03 June), both camera and satellite cloud cover values show 0% cloud cover. During cloudy conditions (05-06 June), the camera shows almost 100% cloud cover while satellite cloud cover values range from 60 to 100%. These low values can be attributed to the presence of low-level thin clouds (~2 km above the ground) as observed from the National Institute for Environmental Studies lidar located inside Chiba University. This difference of cloud cover values shows that the camera can produce accurate cloud cover values of low-level clouds that are sometimes not detected by satellites. The opposite occurs when high-level clouds are present (01-02 June). Derived satellite cloud cover shows almost 100% during the whole night while the ground-based camera shows cloud cover values that range from 10 to 100% during the same time interval. The fluctuating values can be attributed to the presence of thin clouds located at around 6 km from the ground and the presence of low-level clouds (~1 km). Since the camera relies on the reflected city lights, it is possible that the high-level thin clouds are not observed by the camera but is
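The band-difference test described in this record can be sketched as follows. This is a hedged illustration, not the authors' implementation: the sign convention (pixels below the threshold flagged as cloudy) and the digital-number scaling are assumptions.

```python
import numpy as np

# Threshold on the band 13 (10.4 um) minus band 7 (3.9 um) digital-number
# difference, taken from the abstract.
THRESHOLD = -1.39e4

def nighttime_cloud_mask(dn_band13, dn_band7):
    """Boolean cloud mask (True = cloudy) from Himawari-8 digital numbers.
    Flagging pixels *below* the threshold as cloudy is an assumption."""
    diff = np.asarray(dn_band13, dtype=float) - np.asarray(dn_band7, dtype=float)
    return diff < THRESHOLD

def cloud_cover_percent(mask):
    """Scene cloud cover in percent, as used in the 10-minute comparison."""
    mask = np.asarray(mask)
    return 100.0 * mask.sum() / mask.size
```

For example, a two-pixel scene with differences of 0 and -2×10^4 yields a mask of [clear, cloudy] and a 50% cloud cover.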

  3. NASA Software Engineering Benchmarking Effort

    Science.gov (United States)

    Godfrey, Sally; Rarick, Heather

    2012-01-01

    Benchmarking was very interesting and provided a wealth of information: (1) we saw potential solutions to some of our "top 10" issues, and (2) we obtained an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes, and (2) many of the organizations were interested in future collaboration, such as sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/partners: (1) they expressed a desire to participate in our training and to provide feedback on procedures, and (2) they welcomed the opportunity to provide feedback on working with NASA

  4. NEACRP thermal fission product benchmark

    International Nuclear Information System (INIS)

    Halsall, M.J.; Taubman, C.J.

    1989-09-01

    The objective of the thermal fission product benchmark was to compare the range of fission product data in use at the present time. A simple homogeneous problem was set with 200 atoms H/1 atom U235, to be burnt up to 1000 days and then decay for 1000 days. The problem was repeated with 200 atoms H/1 atom Pu239, 20 atoms H/1 atom U235 and 20 atoms H/1 atom Pu239. There were ten participants and the submissions received are detailed in this report. (author)

  5. Benchmark neutron porosity log calculations

    International Nuclear Information System (INIS)

    Little, R.C.; Michael, M.; Verghese, K.; Gardner, R.P.

    1989-01-01

    Calculations have been made for a benchmark neutron porosity log problem with the general purpose Monte Carlo code MCNP and the specific purpose Monte Carlo code McDNL. For accuracy and timing comparison purposes the CRAY XMP and MicroVax II computers have been used with these codes. The CRAY has been used for an analog version of the MCNP code while the MicroVax II has been used for the optimized variance reduction versions of both codes. Results indicate that the two codes give the same results within calculated standard deviations. Comparisons are given and discussed for accuracy (precision) and computation times for the two codes

  6. Ground-Based Observations and Modeling of the Visibility and Radar Reflectivity in a Radiation Fog Layer

    NARCIS (Netherlands)

    Boers, R.; Baltink, K.H.; Hemink, H.J.; Bosveld, F.C.; Moerman, M.

    2013-01-01

    The development of a radiation fog layer at the Cabauw Experimental Site for Atmospheric Research (51.97°N, 4.93°E) on 23 March 2011 was observed with ground-based in situ and remote sensing observations to investigate the relationship between visibility and radar reflectivity. The fog layer thickness

  7. Productivity and cost estimators for conventional ground-based skidding on steep terrain using preplanned skid roads

    Science.gov (United States)

    Michael D. Erickson; Curt C. Hassler; Chris B. LeDoux

    1991-01-01

    Continuous time and motion study techniques were used to develop productivity and cost estimators for the skidding component of ground-based logging systems operating on steep terrain using preplanned skid roads. Comparisons of productivity and costs were analyzed for an overland random access skidding method versus a skidding method utilizing a network of preplanned...

  8. Predicted buffer zones to protect temporary pond invertebrates from ground-based insecticide applications against desert locusts.

    NARCIS (Netherlands)

    Lahr, J.; Gadji, B.; Dia, D.

    2000-01-01

    To estimate safe downwind distances (i.e. buffer zone widths) for temporary ponds from ULV-treatments with current locust insecticides, experimental trials with two ground-based sprayers, the hand-held Micro-Ulva® and the vehicle-mounted Ulva-Mast® X15 Mark I, were conducted with fenitrothion

  9. Characteristics of Volcanic Stratospheric Aerosol Layer Observed by CALIOP and Ground Based Lidar at Equatorial Atmosphere Radar Site

    Science.gov (United States)

    Abo, Makoto; Shibata, Yasukuni; Nagasawa, Chikao

    2018-04-01

    We investigated the relation between major tropical volcanic eruptions in the equatorial region and stratospheric aerosol data collected by ground-based lidar observations at the Equatorial Atmosphere Radar site between 2004 and 2015 and by CALIOP observations at low latitudes between 2006 and 2015. We found characteristic dynamic behavior of volcanic stratospheric aerosol layers over the equatorial region.

  10. Validation of GOME (ERS-2) NO2 vertical column data with ground-based measurements at Issyk-Kul (Kyrgyzstan)

    Science.gov (United States)

    Ionov, D.; Sinyakov, V.; Semenov, V.

    Starting from 1995 the global monitoring of atmospheric nitrogen dioxide has been carried out by the measurements of the nadir-viewing GOME spectrometer aboard the ERS-2 satellite. Continuous validation of these data by means of comparisons with well-controlled ground-based measurements is important to ensure the quality of GOME data products and improve related retrieval algorithms. At the station of Issyk-Kul (Kyrgyzstan) ground-based spectroscopic observations of the NO2 vertical column have been performed since 1983. The station is located on the northern shore of Issyk-Kul lake, 1650 meters above sea level (42.6 N, 77.0 E). The site is equipped with a grating spectrometer for twilight measurements of zenith-scattered solar radiation in the visible range, and applies the DOAS technique to retrieve the NO2 vertical column. It is included in the list of NDSC stations as a complementary one. The present study is focused on validation of GOME NO2 vertical column data, based on an 8-year comparison with correlative ground-based measurements at the Issyk-Kul station in 1996-2003. Within the investigation, the agreement of both individual and monthly averaged GOME measurements with corresponding twilight ground-based observations is examined. This agreement is analyzed with respect to different conditions (season, sun elevation), temporal/spatial criteria choice (actual overpass location, correction for diurnal variation) and data processing (GDP version 2.7, 3.0). In addition, NO2 vertical columns were integrated from simultaneous stratospheric profile measurements by the NASA HALOE and SAGE-II/III satellite instruments and introduced to explain the differences with ground-based observations. In particular cases, NO2 vertical profiles retrieved from the twilight ground-based measurements at Issyk-Kul were also included in the comparison. Overall, summertime GOME NO2 vertical columns were found to be systematically lower than ground-based data. 
This work was supported by International Association

  11. Airborne and Ground-Based Measurements Using a High-Performance Raman Lidar

    Science.gov (United States)

    Whiteman, David N.; Rush, Kurt; Rabenhorst, Scott; Welch, Wayne; Cadirola, Martin; McIntire, Gerry; Russo, Felicita; Adam, Mariana; Venable, Demetrius; Connell, Rasheen; hide

    2010-01-01

    A high-performance Raman lidar operating in the UV portion of the spectrum has been used to acquire, for the first time using a single lidar, simultaneous airborne profiles of the water vapor mixing ratio, aerosol backscatter, aerosol extinction, aerosol depolarization and research mode measurements of cloud liquid water, cloud droplet radius, and number density. The Raman Airborne Spectroscopic Lidar (RASL) system was installed in a Beechcraft King Air B200 aircraft and was flown over the mid-Atlantic United States during July-August 2007 at altitudes ranging between 5 and 8 km. During these flights, despite suboptimal laser performance and subaperture use of the telescope, all RASL measurement expectations were met, except that of aerosol extinction. Following the Water Vapor Validation Experiment Satellite/Sondes (WAVES_2007) field campaign in the summer of 2007, RASL was installed in a mobile trailer for ground-based use during the Measurements of Humidity and Validation Experiment (MOHAVE-II) field campaign held during October 2007 at the Jet Propulsion Laboratory's Table Mountain Facility in southern California. This ground-based configuration of the lidar hardware is called Atmospheric Lidar for Validation, Interagency Collaboration and Education (ALVICE). During the MOHAVE-II field campaign, during which only nighttime measurements were made, ALVICE demonstrated significant sensitivity to lower-stratospheric water vapor. Numerical simulation and comparisons with a cryogenic frost-point hygrometer are used to demonstrate that a system with the performance characteristics of RASL/ALVICE should indeed be able to quantify water vapor well into the lower stratosphere with extended averaging from an elevated location like Table Mountain. The same design considerations that optimize Raman lidar for airborne use on a small research aircraft are, therefore, shown to yield significant dividends in the quantification of lower-stratospheric water vapor. The MOHAVE

  12. Ultraviolet radiation modelling from ground-based and satellite measurements on Reunion Island, southern tropics

    Directory of Open Access Journals (Sweden)

    K. Lamy

    2018-01-01

    Full Text Available Surface ultraviolet radiation (SUR) is not an increasing concern after the implementation of the Montreal Protocol and the recovery of the ozone layer (Morgenstern et al., 2008). However, large uncertainties remain in the prediction of future changes of SUR (Bais et al., 2015). Several studies pointed out that UV-B impacts the biosphere (Erickson et al., 2015), especially the aquatic system, which plays a central part in the biogeochemical cycle (Hader et al., 2007). It can affect phytoplankton productivity (Smith and Cullen, 1995). This influence can result in either positive or negative feedback on climate (Zepp et al., 2007). Global circulation model simulations predict an acceleration of the Brewer-Dobson circulation over the next century (Butchart, 2014), which would lead to a decrease in ozone levels in the tropics and an enhancement at higher latitudes (Hegglin and Shepherd, 2009). Reunion Island is located in the tropics (21° S, 55° E), in a part of the world where the amount of ozone in the ozone column is naturally low. In addition, this island is mountainous and the marine atmosphere is often clean with low aerosol concentrations. Thus, measurements show much higher SUR than at other sites at the same latitude or at midlatitudes. Ground-based measurements of SUR have been taken on Reunion Island by a Bentham DTMc300 spectroradiometer since 2009. This instrument is affiliated with the Network for the Detection of Atmospheric Composition Change (NDACC). In order to quantify the future evolution of SUR in the tropics, it is necessary to validate a model against present observations. This study is designed to be a preliminary parametric and sensitivity study of SUR modelling in the tropics. We developed a local parameterisation using the Tropospheric Ultraviolet and Visible Model (TUV; Madronich, 1993) and compared the output of TUV to multiple years of Bentham spectral measurements. This comparison started in early 2009 and continued until 2016

  13. Preservation of Multiple Mammalian Tissues to Maximize Science Return from Ground Based and Spaceflight Experiments.

    Science.gov (United States)

    Choi, Sungshin; Ray, Hami E; Lai, San-Huei; Alwood, Joshua S; Globus, Ruth K

    2016-01-01

    Even with recent scientific advancements, challenges posed by limited resources and capabilities at the time of sample dissection continue to limit the collection of high quality tissues from experiments that can be conducted only infrequently and at high cost, such as in space. The resources and time it takes to harvest tissues post-euthanasia, and the methods and duration of long-duration storage, potentially have negative impacts on sample quantity and quality, thereby limiting the scientific outcome that can be achieved. The goals of this study were to optimize methods for both sample recovery and science return from rodent experiments, with possible relevance to both ground-based and spaceflight studies. The first objective was to determine the impacts of tissue harvest time post-euthanasia, preservation methods, and storage duration, focusing on RNA quality and enzyme activities in liver and spleen as indices of sample quality. The second objective was to develop methods that will maximize science return by dissecting multiple tissues after long-duration storage in situ at -80°C. Tissues of C57Bl/6J mice were dissected and preserved at various time points post-euthanasia and stored at -80°C for up to 11 months. In some experiments, tissues were recovered from frozen carcasses which had been stored at -80°C for up to 7 months. RNA quantity and quality were assessed by measuring RNA Integrity Number (RIN) values using an Agilent Bioanalyzer. Additionally, the quality of tissues was assessed by measuring activities of hepatic enzymes (catalase, glutathione reductase and GAPDH). Fresh tissues were collected up to one hour post-euthanasia, and stored up to 11 months at -80°C, with minimal adverse effects on the RNA quality of either livers or RNAlater-preserved spleens. Liver enzyme activities were similar to those of positive controls, with no significant effect observed at any time point. Tissues dissected from frozen carcasses that had been stored for up to 7

  14. Ultraviolet radiation modelling from ground-based and satellite measurements on Reunion Island, southern tropics

    Science.gov (United States)

    Lamy, Kévin; Portafaix, Thierry; Brogniez, Colette; Godin-Beekmann, Sophie; Bencherif, Hassan; Morel, Béatrice; Pazmino, Andrea; Metzger, Jean Marc; Auriol, Frédérique; Deroo, Christine; Duflot, Valentin; Goloub, Philippe; Long, Charles N.

    2018-01-01

    Surface ultraviolet radiation (SUR) is not an increasing concern after the implementation of the Montreal Protocol and the recovery of the ozone layer (Morgenstern et al., 2008). However, large uncertainties remain in the prediction of future changes of SUR (Bais et al., 2015). Several studies pointed out that UV-B impacts the biosphere (Erickson et al., 2015), especially the aquatic system, which plays a central part in the biogeochemical cycle (Hader et al., 2007). It can affect phytoplankton productivity (Smith and Cullen, 1995). This influence can result in either positive or negative feedback on climate (Zepp et al., 2007). Global circulation model simulations predict an acceleration of the Brewer-Dobson circulation over the next century (Butchart, 2014), which would lead to a decrease in ozone levels in the tropics and an enhancement at higher latitudes (Hegglin and Shepherd, 2009). Reunion Island is located in the tropics (21° S, 55° E), in a part of the world where the amount of ozone in the ozone column is naturally low. In addition, this island is mountainous and the marine atmosphere is often clean with low aerosol concentrations. Thus, measurements show much higher SUR than at other sites at the same latitude or at midlatitudes. Ground-based measurements of SUR have been taken on Reunion Island by a Bentham DTMc300 spectroradiometer since 2009. This instrument is affiliated with the Network for the Detection of Atmospheric Composition Change (NDACC). In order to quantify the future evolution of SUR in the tropics, it is necessary to validate a model against present observations. This study is designed to be a preliminary parametric and sensitivity study of SUR modelling in the tropics. We developed a local parameterisation using the Tropospheric Ultraviolet and Visible Model (TUV; Madronich, 1993) and compared the output of TUV to multiple years of Bentham spectral measurements. This comparison started in early 2009 and continued until 2016. Only

  15. PhotoSpec - Ground-based Remote Sensing of Solar-Induced Chlorophyll Fluorescence: First Results

    Science.gov (United States)

    Grossmann, K.; Magney, T. S.; Frankenberg, C.; Seibt, U.; Pivovaroff, A. L.; Hurlock, S. C.; Stutz, J.

    2016-12-01

    Solar-Induced Chlorophyll Fluorescence (SIF) emitted from vegetation can be used as a proxy for photosynthetic activity and is observable on a global scale from space. However, many issues on a leaf-to-canopy scale remain poorly understood, such as influences on the SIF signal from environmental conditions, water stress, or radiation. We have developed a novel ground-based spectrometer system for measuring SIF from natural ecosystems. The instrumental set-up, requirements, and measurement technique are based on decades of experience using Differential Optical Absorption Spectroscopy (DOAS), an established method to measure atmospheric trace gases. The instrument consists of three thermally stabilized commercial spectrometers that are linked to a 2D scanning telescope unit via optical fiber bundles, and also includes a commercial photosynthetic active radiation (PAR) sensor. The spectrometers cover a SIF retrieval wavelength range at high spectral resolution (670 - 780 nm, 0.1 nm FWHM), and also provide moderate resolution spectra (400 - 800 nm, 1.5 nm FWHM) to retrieve vegetation indices and the photochemical reflectance index (PRI). We report on results of the first continuous field measurements of this novel system at Stunt Ranch Santa Monica Mountains UC Reserve, where the PhotoSpec instrument was monitoring SIF of four native Californian shrubland species with different adaptations to seasonal summer drought. We report on the correlation with CO2 fluxes over both the growing season and the hot summer period in 2016. We also show detailed measurements of the diurnal cycle of the SIF signal of single broad leaves, as well as dark-light transitions, under controlled experimental conditions. In addition to demonstrating the instrumental set-up, retrieval algorithm, and instrument performance, our results illustrate that SIF measurements at the leaf to ecosystem scale are needed to understand and interpret the SIF signals retrieved at larger scales.

  16. Sounding rocket/ground-based observation campaign to study Medium-Scale Traveling Ionospheric Disturbances (MSTID)

    Science.gov (United States)

    Yamamoto, M.; Yokoyama, T.; Saito, A.; Otsuka, Y.; Yamamoto, M.; Abe, T.; Watanabe, S.; Ishisaka, K.; Saito, S.; Larsen, M.; Pfaff, R. F.; Bernhardt, P. A.

    2012-12-01

    An observation campaign is under preparation in which sounding rockets S-520-27 and S-310-42 will be launched from the Uchinoura Space Center of JAXA while ground-based instruments measure waves in the ionosphere. It is scheduled for July/August 2013. The main purpose of the experiment is to reveal the generation mechanism of Medium-Scale Traveling Ionospheric Disturbances (MSTID). The MSTID is an ionospheric wave with 1-2 hour periodicity, 100-200 km horizontal wavelength, and southwestward propagation. It is enhanced in the summer nighttime of the mid-latitude ionosphere. The MSTID is not simply an atmospheric-wave modulation of the ionosphere, but shows similarity to characteristics of the Perkins instability. A problem is that the growth rate of the Perkins instability is too small to explain the phenomena. We now hypothesize a generation mechanism in which electromagnetic coupling of the F and E regions helps rapid growth of the MSTID, especially at its initial stage. In the observation campaign, we will use the sounding rocket S-520-27 for in-situ measurement of ionospheric parameters, i.e., electron density and electric fields. Wind velocity measurements in both the F and E regions are very important as well. For the F-region winds, we will conduct a lithium-release experiment under full-moon conditions. This is a big technical challenge. The other rocket, S-310-42, will be used for the E-region wind measurement with a TMA release. On the ground, we will use GEONET (the Japanese vast GPS receiver network) to monitor the horizontal distribution of GPS-TEC in real time. In the presentation we will show the MSTID characteristics and the proposed generation mechanism, and discuss the plan and current status of the project.

  17. Method for validating cloud mask obtained from satellite measurements using ground-based sky camera.

    Science.gov (United States)

    Letu, Husi; Nagao, Takashi M; Nakajima, Takashi Y; Matsumae, Yoshiaki

    2014-11-01

    Error propagation in Earth's atmospheric, oceanic, and land surface parameters of satellite products caused by misclassification of the cloud mask is a critical issue for improving the accuracy of satellite products. Thus, characterizing the accuracy of the cloud mask is important for investigating the influence of the cloud mask on satellite products. In this study, we proposed a method for validating cloud masks derived from multiwavelength satellite data using ground-based sky camera (GSC) data. First, a cloud cover algorithm for GSC data was developed using a sky index and a bright index. Then, cloud masks derived from Moderate Resolution Imaging Spectroradiometer (MODIS) satellite data by two cloud-screening algorithms (i.e., MOD35 and CLAUDIA) were validated using the GSC cloud mask. The results indicate that MOD35 is likely to classify ambiguous pixels as "cloudy," whereas CLAUDIA is likely to classify them as "clear." Furthermore, the influence of error propagation caused by misclassification of the MOD35 and CLAUDIA cloud masks on MODIS-derived reflectance, brightness temperature, and normalized difference vegetation index (NDVI) in clear and cloudy pixels was investigated using sky camera data. The results show that the influence of the error propagation by the MOD35 cloud mask on the MODIS-derived monthly mean reflectance, brightness temperature, and NDVI for clear pixels is significantly smaller than that by the CLAUDIA cloud mask; the influence of the error propagation by the CLAUDIA cloud mask on MODIS-derived monthly mean cloud products for cloudy pixels is significantly smaller than that by the MOD35 cloud mask.
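A sky camera cloud cover algorithm of the kind described in this record can be sketched as follows. This is only an illustrative sketch: the sky index here uses a common (blue - red)/(blue + red) formulation with an invented threshold and assumes RGB channel order, which are not necessarily the definitions or values used by the authors.

```python
import numpy as np

def sky_index(rgb):
    """Per-pixel sky index (B - R) / (B + R); clear sky is strongly blue
    (index near 1), clouds are grayish (index near 0)."""
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float)
    return (b - r) / np.clip(b + r, 1e-6, None)

def cloud_mask(rgb, si_threshold=0.12):
    """Boolean mask (True = cloudy): pixels with a low sky index are
    flagged cloudy. The threshold value is illustrative only."""
    return sky_index(rgb) < si_threshold

def cloud_cover_percent(rgb):
    """Fraction of cloudy pixels in the image, in percent."""
    m = cloud_mask(rgb)
    return 100.0 * m.sum() / m.size
```

A GSC cloud mask computed this way can then be compared pixel-by-pixel (after projection) or as scene cloud cover against the satellite cloud masks.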

  18. Development and verification of ground-based tele-robotics operations concept for Dextre

    Science.gov (United States)

    Aziz, Sarmad

    2013-05-01

    The Special Purpose Dexterous Manipulator (Dextre) is the latest addition to the on-orbit segment of the Mobile Servicing System (MSS); Canada's contribution to the International Space Station (ISS). Launched in March 2008, the advanced two-armed robot is designed to perform various ISS maintenance tasks on robotically compatible elements and on-orbit replaceable units using a wide variety of tools and interfaces. The addition of Dextre has increased the capabilities of the MSS, and has introduced significant complexity to ISS robotics operations. While the initial operations concept for Dextre was based on human-in-the-loop control by the on-orbit astronauts, the complexities of robotic maintenance and the associated costs of training and maintaining the operator skills required for Dextre operations demanded a reexamination of the old concepts. A new approach to ISS robotic maintenance was developed in order to utilize the capabilities of Dextre safely and efficiently, while at the same time reducing the costs of on-orbit operations. This paper will describe the development, validation, and on-orbit demonstration of the operations concept for ground-based tele-robotics control of Dextre. It will describe the evolution of the new concepts from the experience gained from the development and implementation of the ground control capability for the Space Station Remote Manipulator System; Canadarm 2. It will discuss the various technical challenges faced during the development effort, such as requirements for high positioning accuracy, force/moment sensing and accommodation, failure tolerance, complex tool operations, and the novel operational tools and techniques developed to overcome them. The paper will also describe the work performed to validate the new concepts on orbit and will discuss the results and lessons learned from the on-orbit checkout and commissioning of Dextre using the newly developed tele-robotics techniques and capabilities.

  19. Ground-based remote sensing of tropospheric water vapour isotopologues within the project MUSICA

    Directory of Open Access Journals (Sweden)

    M. Schneider

    2012-12-01

    Full Text Available Within the project MUSICA (MUlti-platform remote Sensing of Isotopologues for investigating the Cycle of Atmospheric water), long-term tropospheric water vapour isotopologue data records are provided for ten globally distributed ground-based mid-infrared remote sensing stations of the NDACC (Network for the Detection of Atmospheric Composition Change). We present a new method allowing for an extensive and straightforward characterisation of the complex nature of such isotopologue remote sensing datasets. We demonstrate that the MUSICA humidity profiles are representative for most of the troposphere with a vertical resolution ranging from about 2 km (in the lower troposphere) to 8 km (in the upper troposphere) and with an estimated precision of better than 10%. We find that the sensitivity with respect to the isotopologue composition is limited to the lower and middle troposphere, whereby we estimate a precision of about 30‰ for the ratio between the two isotopologues HD¹⁶O and H₂¹⁶O. The measurement noise, the applied atmospheric temperature profiles, the uncertainty in the spectral baseline, and the cross-dependence on humidity are the leading error sources. We introduce an a posteriori correction method of the cross-dependence on humidity, and we recommend applying it to isotopologue ratio remote sensing datasets in general. In addition, we present mid-infrared CO2 retrievals and use them for demonstrating the MUSICA network-wide data consistency. In order to indicate the potential of long-term isotopologue remote sensing data if provided with a well-documented quality, we present a climatology and compare it to simulations of an isotope incorporated AGCM (Atmospheric General Circulation Model). We identify differences in the multi-year mean and seasonal cycles that significantly exceed the estimated errors, thereby indicating deficits in the modeled atmospheric water cycle.

  20. Ground-based remote sensing of tropospheric water vapour isotopologues within the project MUSICA

    Science.gov (United States)

    Schneider, M.; Barthlott, S.; Hase, F.; González, Y.; Yoshimura, K.; García, O. E.; Sepúlveda, E.; Gomez-Pelaez, A.; Gisi, M.; Kohlhepp, R.; Dohe, S.; Blumenstock, T.; Wiegele, A.; Christner, E.; Strong, K.; Weaver, D.; Palm, M.; Deutscher, N. M.; Warneke, T.; Notholt, J.; Lejeune, B.; Demoulin, P.; Jones, N.; Griffith, D. W. T.; Smale, D.; Robinson, J.

    2012-12-01

    Within the project MUSICA (MUlti-platform remote Sensing of Isotopologues for investigating the Cycle of Atmospheric water), long-term tropospheric water vapour isotopologue data records are provided for ten globally distributed ground-based mid-infrared remote sensing stations of the NDACC (Network for the Detection of Atmospheric Composition Change). We present a new method allowing for an extensive and straightforward characterisation of the complex nature of such isotopologue remote sensing datasets. We demonstrate that the MUSICA humidity profiles are representative for most of the troposphere with a vertical resolution ranging from about 2 km (in the lower troposphere) to 8 km (in the upper troposphere) and with an estimated precision of better than 10%. We find that the sensitivity with respect to the isotopologue composition is limited to the lower and middle troposphere, whereby we estimate a precision of about 30‰ for the ratio between the two isotopologues HD¹⁶O and H₂¹⁶O. The measurement noise, the applied atmospheric temperature profiles, the uncertainty in the spectral baseline, and the cross-dependence on humidity are the leading error sources. We introduce an a posteriori correction method of the cross-dependence on humidity, and we recommend applying it to isotopologue ratio remote sensing datasets in general. In addition, we present mid-infrared CO2 retrievals and use them for demonstrating the MUSICA network-wide data consistency. In order to indicate the potential of long-term isotopologue remote sensing data if provided with a well-documented quality, we present a climatology and compare it to simulations of an isotope incorporated AGCM (Atmospheric General Circulation Model). We identify differences in the multi-year mean and seasonal cycles that significantly exceed the estimated errors, thereby indicating deficits in the modeled atmospheric water cycle.

  1. Multiple ground-based and satellite observations of global Pi 2 magnetic pulsations

    International Nuclear Information System (INIS)

    Yumoto, K.; Takahashi, K.; Sakurai, T.; Sutcliffe, P.R.; Kokubun, S.; Luehr, H.; Saito, T.; Kuwashima, M.; Sato, N.

    1990-01-01

    Four Pi 2 magnetic pulsations, observed on the ground at L = 1.2-6.9 in the interval from 2300 UT on May 22 to 0300 UT on May 23, 1985, provide new evidence of the global nature of Pi 2 pulsations in the inner (L ≲ 7) region of the magnetosphere bounded by the plasma sheet during quiet geomagnetic conditions. In the present study, magnetic data have been collected from stations distributed widely both in local time and in latitude, including conjugate stations, and from the AMPTE/CCE spacecraft located in the magnetotail. On the basis of high time resolution magnetic field data, the following characteristics of Pi 2 have been established: the horizontal components, H and D, of the Pi 2 oscillate nearly in antiphase and in phase, respectively, between the high- and low-latitude stations in the midnight southern hemisphere. Both the H and D components of the Pi 2 have nearly in-phase relationships between the nightside and the dayside stations at low latitude. The Pi 2 amplitude is larger at the high-latitude station and decreases toward lower latitudes. The dominant periods of the Pi 2 are nearly identical at all stations. Although a direct coincidence between spacecraft-observed and ground-based global Pi 2 events does not exist for these events, the Pi 2 events are believed to be a forced field line oscillation of global scale, coupled with the magnetospheric cavity resonance wave in the inner magnetosphere during the substorm expansive phase.

  2. Coastal change analysis of Lovells Island using high resolution ground based LiDAR imagery

    Science.gov (United States)

    Ly, Jennifer K.

    Many methods have been employed to study coastline change. These methods range from historical map analysis to GPS surveys to modern airborne LiDAR and satellite imagery. These previously used methods can be time consuming, labor intensive, and expensive and have varying degrees of accuracy and temporal coverage. Additionally, it is often difficult to apply such techniques in direct response to an isolated event within an appropriate temporal framework. Here we utilize a new ground based Canopy Biomass LiDAR (CBL) system built at The University of Massachusetts Boston (in collaboration with the Rochester Institute of Technology) in order to identify and analyze coastal change on Lovells Island, Boston Harbor. Surveys of a bluff developing in an eroding drumlin and beach cusps on a high-energy cobble beach on Lovells Island were conducted in June, September and December of 2013. At each site for each survey, the CBL was set up and multiple scans of each feature were taken on a predetermined transect that was established parallel to the high-water mark at distances relative to the scale of the bluff and cusps. The scans from each feature were compiled, integrated and visualized using Meshlab. Results from our surveys indicate that the highly portable and easy to deploy CBL system produces images of exceptional clarity, with the capacity to resolve small-scale changes to coastal features and systems. The CBL, while still under development (and coastal surveying protocols with it are just being established), appears to be an ideal tool for analyzing coastal geological features and is anticipated to prove to be a useful tool for the observation and analysis of coastal change. Furthermore, there is significant potential for utilizing the low cost ultra-portable CBL in frequent deployments to develop small-scale erosion rate and sediment budget analyses.

  3. Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests

    Science.gov (United States)

    Cowardin, H.

    2015-01-01

    Existing DoD and NASA satellite breakup models are based on a key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), which has supported many applications and matched on-orbit events involving older satellite designs reasonably well over the years. In order to update and improve the breakup models and the NASA Size Estimation Model (SEM) for events involving more modern satellite designs, the NASA Orbital Debris Program Office has worked in collaboration with the University of Florida to replicate a hypervelocity impact using a satellite built with modern-day spacecraft materials and construction techniques. The spacecraft, called DebriSat, was intended to be representative of modern LEO satellites, and all major design decisions were reviewed and approved by subject matter experts at the Aerospace Corporation. DebriSat is composed of seven major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing flight hardware designs and fabricated with the same materials. All fragments down to 2 mm in size will be characterized by material, size, shape, and bulk density, and the associated data will be stored in a database for multiple users to access. Laboratory radar and optical measurements will be performed on a subset of fragments to provide a better understanding of the data products for orbital debris acquired from ground-based radars and telescopes. The resulting data analysis from DebriSat will be used to update breakup models and to develop the first optical SEM, in conjunction with updates to the current NASA SEM. The characterization of the fragmentation will be discussed in the subsequent presentation.

  4. Solar energy prediction and verification using operational model forecasts and ground-based solar measurements

    International Nuclear Information System (INIS)

    Kosmopoulos, P.G.; Kazadzis, S.; Lagouvardos, K.; Kotroni, V.; Bais, A.

    2015-01-01

    The present study focuses on predictions of solar energy, and their verification, using ground-based solar measurements from the Hellenic Network for Solar Energy and the National Observatory of Athens network, as well as solar radiation operational forecasts provided by the MM5 mesoscale model. The evaluation was carried out independently for the different networks, for two forecast horizons (1 and 2 days ahead), for the seasons of the year, for varying solar elevation, for the indicative energy potential of the area, and for four classes of cloud cover based on the calculated clearness index (k_t): CS (clear sky), SC (scattered clouds), BC (broken clouds) and OC (overcast). The seasonal dependence showed relative root mean square error (rRMSE) values ranging from 15% (summer) to 60% (winter), while the solar elevation dependence revealed high effectiveness and reliability near local noon (rRMSE ∼30%). An increase of the errors with cloudiness was also observed. For CS with mean GHI (global horizontal irradiance) ∼650 W/m², the errors are 8%; for SC, 20%; for BC and OC the errors were greater (>40%) but correspond to much lower radiation levels (<120 W/m²) and consequently lower energy potential impact. The total energy potential for each ground station ranges from 1.5 to 1.9 MWh/m², while the mean monthly forecast error was found to be consistently below 10%. - Highlights: • Long-term measurements under different atmospheric conditions are needed for energy forecasting model evaluations. • The total energy potential at the Greek sites presented ranges from 1.5 to 1.9 MWh/m². • Mean monthly energy forecast errors are within 10% for all cases analyzed. • Cloud presence results in an additional forecast error that varies with the cloud cover.
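
    The class-wise evaluation described above can be sketched as follows; the clearness-index thresholds and the normalisation of the rRMSE by the mean measured GHI are illustrative assumptions, not the paper's exact definitions:

    ```python
    import numpy as np

    # Sketch (assumed workflow): classify observations by clearness index k_t
    # and compute the relative RMSE of forecast vs. measured GHI per class.

    def clearness_class(kt):
        # Illustrative thresholds; the paper does not list its class bounds.
        if kt >= 0.7:
            return "CS"   # clear sky
        if kt >= 0.5:
            return "SC"   # scattered clouds
        if kt >= 0.3:
            return "BC"   # broken clouds
        return "OC"       # overcast

    def rrmse(forecast, measured):
        forecast = np.asarray(forecast, float)
        measured = np.asarray(measured, float)
        rmse = np.sqrt(np.mean((forecast - measured) ** 2))
        return 100.0 * rmse / np.mean(measured)  # percent of mean measured GHI

    ghi_meas = np.array([650.0, 640.0, 660.0, 655.0])  # toy clear-sky sample
    ghi_fcst = np.array([600.0, 690.0, 610.0, 705.0])
    err = rrmse(ghi_fcst, ghi_meas)
    ```

    Grouping hours by `clearness_class` before calling `rrmse` reproduces the per-class error stratification (CS vs. SC vs. BC/OC) that the study reports.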

  5. Satellite- and ground-based observations of atmospheric water vapor absorption in the 940 nm region

    International Nuclear Information System (INIS)

    Albert, P.; Smith, K.M.; Bennartz, R.; Newnham, D.A.; Fischer, J.

    2004-01-01

    Ground-based measurements of direct absorption of solar radiation between 9000 and 13,000 cm⁻¹ (770-1100 nm) with a spectral resolution of 0.05 cm⁻¹ are compared with line-by-line simulations of atmospheric absorption based on different molecular databases (HITRAN 2000, HITRAN 99, HITRAN 96 and ESA-WVR). Differences between measurements and simulations can be substantially reduced by scaling the individual line intensities with spectral- and database-dependent scaling factors. Scaling factors are calculated for the selected databases using a Marquardt non-linear least-squares fit together with a forward model for 100 cm⁻¹ wide intervals between 10,150 and 11,250 cm⁻¹, as well as for the water vapor absorption channels of the Medium Resolution Imaging Spectrometer (MERIS) onboard the European Space Agency's (ESA) ENVISAT platform and the Modular Optoelectronic Scanner (MOS) on the Indian IRS-P3 platform, developed by the German Aerospace Centre (DLR). For the latter, the scaling coefficients are converted into correction factors for the retrieved total columnar water vapor content and used for a comparison of MOS-based retrievals of total columnar atmospheric water vapor above cloud-free land surfaces with radio soundings. The scaling factors determined for 100 cm⁻¹ wide intervals range from 0.85 for the ESA-WVR molecular database to 1.15 for HITRAN 96. The best agreement between measurements and simulations is achieved with HITRAN 99 and HITRAN 2000, respectively, using scaling factors between 0.9 and 1. The effects on the satellite-based retrievals of columnar atmospheric water vapor range from 2% (HITRAN 2000) to 12% (ESA-WVR).
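
    In the optically thin limit, scaling every line intensity in a window by a common factor s simply scales the optical depth, so the fit for s becomes linear. A hypothetical sketch of this reduced problem (the actual retrieval uses a Marquardt non-linear least-squares fit with a full forward model; the numbers here are invented):

    ```python
    import numpy as np

    # Synthetic window: database optical depths tau0(nu) and a "measured"
    # transmission generated with a true scaling factor of 0.92.
    tau0 = np.linspace(0.01, 0.3, 50)
    s_true = 0.92
    t_meas = np.exp(-s_true * tau0)

    # With T(nu) = exp(-s * tau0(nu)), -ln(T) = s * tau0 is linear in s;
    # the least-squares solution has the closed form of the normal equation.
    y = -np.log(t_meas)
    s_fit = np.sum(tau0 * y) / np.sum(tau0 ** 2)  # recovers s_true
    ```

    With real, noisy spectra and saturated lines the relation is no longer linear, which is why an iterative Marquardt fit against the forward model is used instead.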

  6. Eight-component retrievals from ground-based MAX-DOAS observations

    Directory of Open Access Journals (Sweden)

    H. Irie

    2011-06-01

    We attempt for the first time to retrieve lower-tropospheric vertical profile information for 8 quantities from ground-based Multi-Axis Differential Optical Absorption Spectroscopy (MAX-DOAS) observations. The components retrieved are the aerosol extinction coefficients at two wavelengths, 357 and 476 nm, and the NO2, HCHO, CHOCHO, H2O, SO2, and O3 volume mixing ratios. A Japanese MAX-DOAS profile retrieval algorithm, version 1 (JM1), is applied to observations performed at Cabauw, the Netherlands (51.97° N, 4.93° E), in June–July 2009 during the Cabauw Intercomparison campaign of Nitrogen Dioxide measuring Instruments (CINDI). Of the retrieved profiles, we focus here on the lowest-layer data (mean values at altitudes 0–1 km), where the sensitivity is usually highest owing to the longest light path. In support of the capability of the multi-component retrievals, we find reasonable overall agreement with independent data sets, including a regional chemical transport model (CHIMERE) and in situ observations performed near the surface (2–3 m) and at the 200-m height level of the tall tower in Cabauw. Plumes of enhanced HCHO and SO2 were likely affected by biogenic and ship emissions, respectively, and an improvement in their emission strengths is suggested for better agreement between CHIMERE simulations and MAX-DOAS observations. Analysis of air mass factors indicates that the horizontal spatial representativeness of MAX-DOAS observations is about 3–15 km (depending mainly on aerosol extinction), comparable to or better than the spatial resolution of current UV-visible satellite observations and model calculations. These results demonstrate that MAX-DOAS provides multi-component data useful for the evaluation of satellite observations and model calculations and can play an important role in bridging different data sets having different spatial resolutions.

  7. Simulation of submillimetre atmospheric spectra for characterising potential ground-based remote sensing observations

    Directory of Open Access Journals (Sweden)

    E. C. Turner

    2016-11-01

    The submillimetre is an understudied region of the Earth's atmospheric electromagnetic spectrum. Prior technological gaps and relatively high opacity due to the prevalence of rotational water vapour lines at these wavelengths have slowed progress from a ground-based remote sensing perspective; however, emerging superconducting detector technologies in the field of astronomy offer the potential to address key atmospheric science challenges with new instrumental methods. A site study, with a focus on the polar regions, is performed to assess theoretical feasibility by simulating the downwelling (zenith angle = 0°) clear-sky submillimetre spectrum from 30 mm (10 GHz) to 150 µm (2000 GHz) at six locations under annual mean, summer, winter, daytime, night-time and low-humidity conditions. Vertical profiles of temperature, pressure and 28 atmospheric gases are constructed by combining radiosonde, meteorological reanalysis and atmospheric chemistry model data. The sensitivity of the simulated spectra to the choice of water vapour continuum model and spectroscopic line database is explored. For the atmospheric trace species hypobromous acid (HOBr), hydrogen bromide (HBr), the perhydroxyl radical (HO2) and nitrous oxide (N2O), the emission lines producing the largest change in brightness temperature are identified. Signal strengths, centre frequencies, bandwidths, estimated minimum integration times and maximum receiver noise temperatures are determined for all cases. HOBr, HBr and HO2 produce brightness temperature peaks in the mK to µK range, whereas the N2O peaks are in the K range. The optimal submillimetre remote sensing lines for the four species are shown to vary significantly between location and scenario, strengthening the case for future hyperspectral instruments that measure over a broad wavelength range. The techniques presented here provide a framework that can be applied to additional species of interest and taken forward to simulate

  8. Investigation of ground-based microwave radiometer calibration techniques at 530 hPa

    Directory of Open Access Journals (Sweden)

    G. Maschwitz

    2013-10-01

    Ground-based microwave radiometers (MWR) are becoming more and more common for remotely sensing the atmospheric temperature and humidity profile as well as the path-integrated cloud liquid water content. The calibration accuracy of the state-of-the-art MWR HATPRO-G2 (Humidity And Temperature Profiler – Generation 2) was investigated during the second phase of the Radiative Heating in Underexplored Bands Campaign (RHUBC-II) in northern Chile (5320 m above mean sea level, 530 hPa), conducted by the Atmospheric Radiation Measurement (ARM) program between August and October 2009. This study assesses the quality of the two frequently used liquid nitrogen and tipping curve calibrations by performing a detailed error propagation study, which exploits the unique atmospheric conditions of RHUBC-II. Both methods are known to have open issues concerning systematic offsets and calibration repeatability. For the tipping curve calibration an uncertainty of ±0.1 to ±0.2 K (K-band) and ±0.6 to ±0.7 K (V-band) is found. The uncertainty in the tipping curve calibration is mainly due to atmospheric inhomogeneities and the assumed air mass correction for the Earth's curvature. For the liquid nitrogen calibration the estimated uncertainty of ±0.3 to ±1.6 K is dominated by the uncertainty of the reflectivity of the liquid nitrogen target. A direct comparison between the two calibration techniques shows that for six of the nine channels that can be calibrated with both methods, they agree within the assessed uncertainties. For the other three channels the unexplained discrepancy is below 0.5 K. Systematic offsets, which may cause the disagreement of both methods within their estimated uncertainties, are discussed.
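
    The tipping-curve calibration referenced above can be illustrated in its textbook form. The numbers below (mean radiating temperature, opacity, elevation angles) are invented for the sketch, and the plane-parallel air mass deliberately ignores the Earth-curvature correction whose uncertainty the paper actually assesses:

    ```python
    import numpy as np

    # Sketch of the tipping-curve principle: at air mass a = 1/sin(elev), the
    # sky brightness temperature follows
    #   Tb(a) = Tmr * (1 - exp(-tau * a)) + Tbg * exp(-tau * a),
    # with Tmr the mean radiating temperature and Tbg the cosmic background.
    # The zenith opacity tau is the slope of ln((Tmr - Tbg)/(Tmr - Tb)) vs. a;
    # a near-zero fitted intercept checks the calibration.

    Tmr, Tbg, tau_true = 265.0, 2.73, 0.08
    elev = np.array([90.0, 42.0, 30.0, 19.5])       # elevation angles, degrees
    a = 1.0 / np.sin(np.radians(elev))               # plane-parallel air mass
    Tb = Tmr * (1 - np.exp(-tau_true * a)) + Tbg * np.exp(-tau_true * a)

    y = np.log((Tmr - Tbg) / (Tmr - Tb))
    slope, intercept = np.polyfit(a, y, 1)           # slope -> zenith opacity
    ```

    In practice the fit is done on measured Tb with an assumed Tmr; errors in Tmr, atmospheric inhomogeneity, and the air-mass model are exactly the terms the error propagation study quantifies.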

  9. Investigation of tropical cirrus cloud properties using ground based lidar measurements

    Science.gov (United States)

    Dhaman, Reji K.; Satyanarayana, Malladi; Krishnakumar, V.; Mahadevan Pillai, V. P.; Jayeshlal, G. S.; Raghunath, K.; Venkat Ratnam, M.

    2016-05-01

    Cirrus clouds play a significant role in the Earth's radiation budget; knowledge of the geometrical and optical properties of cirrus clouds is therefore essential for climate modeling. In this paper, measurements of cirrus cloud microphysical and optical properties are made using ground-based lidar over an inland tropical station, Gadanki (13.5°N, 79.2°E), Andhra Pradesh, India. The variation of cirrus microphysical and optical properties with mid-cloud temperature is also studied. The cirrus cloud mean height is generally observed in the range of 9-17 km, with a peak occurrence at 13-14 km. The cirrus mid-cloud temperature ranges from -81°C to -46°C. The cirrus geometrical thickness ranges from 0.9-4.5 km. On cirrus occurrence days, sub-visual, thin and dense cirrus occurred 37.5%, 50% and 12.5% of the time, respectively. The monthly cirrus optical depth ranges from 0.01-0.47, but most (>80%) of the cirrus have values less than 0.1. Optical depth shows a strong dependence on cirrus geometrical thickness and mid-cloud height. The monthly mean cirrus extinction ranges from 2.8×10⁻⁶ to 8×10⁻⁵, and the depolarization ratio and lidar ratio vary from 0.13 to 0.77 and 2 to 52 sr, respectively. A positive correlation exists for both optical depth and extinction with mid-cloud temperature. The lidar ratio shows a scattered behavior with mid-cloud temperature.
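
    The cloud optical depth underlying the sub-visual/thin/dense split above is the vertical integral of the retrieved extinction profile across the cloud layer. A minimal sketch with invented numbers; the class thresholds are the commonly used Sassen-type values, which the abstract does not state explicitly:

    ```python
    import numpy as np

    # Sketch: integrate an extinction profile (km^-1) over the cloud layer (km)
    # to get the cloud optical depth (COD), then classify the cirrus.
    z = np.linspace(12.0, 14.0, 201)            # cloud layer heights, 10 m steps
    alpha = np.full_like(z, 0.025)              # extinction coefficient, km^-1

    # Trapezoidal integration, written out explicitly.
    cod = np.sum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(z))
    thickness = z[-1] - z[0]                    # geometrical thickness, km

    def cirrus_class(cod):
        # Commonly used thresholds (assumed, not from the abstract).
        if cod < 0.03:
            return "sub-visual"
        return "thin" if cod < 0.3 else "dense"
    ```

    For this constant-extinction toy layer the COD is 0.05, which falls in the "thin" class that dominates the Gadanki statistics.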

  10. Spatio-temporal representativeness of ground-based downward solar radiation measurements

    Science.gov (United States)

    Schwarz, Matthias; Wild, Martin; Folini, Doris

    2017-04-01

    Surface solar radiation (SSR) is most directly observed with ground-based pyranometer measurements. Besides measurement uncertainties arising from the pyranometer instrument itself, errors attributed to the limited spatial representativeness of observations from single sites for their large-scale surroundings also have to be taken into account when using such measurements for energy balance studies. In this study, the spatial representativeness of 157 homogeneous European downward surface solar radiation time series from the Global Energy Balance Archive (GEBA) and the Baseline Surface Radiation Network (BSRN) was examined for the period 1983-2015, using the high-resolution (0.05°) surface solar radiation data set from the Satellite Application Facility on Climate Monitoring (CM-SAF SARAH) as a proxy for the spatio-temporal variability of SSR. By correlating deseasonalized monthly SSR time series from surface observations against single collocated satellite-derived SSR time series, a mean spatial correlation pattern was calculated and validated against purely observation-based patterns. Generally, correlations decrease with increasing distance from the station, with high correlations (R² = 0.7) in proximity to the observational sites (±0.5°). When correlating surface observations against time series from spatially averaged satellite-derived SSR data (thereby simulating coarser and coarser grids), very high correspondence between sites and the collocated pixels was found for pixel sizes up to several degrees. Moreover, special focus was put on the quantification of errors which arise in conjunction with spatial sampling when estimating the temporal variability and trends for a larger region from a single surface observation site. For 15-year trends on a 1° grid, errors due to spatial sampling on the order of half of the measurement uncertainty for monthly mean values were found.
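
    The correlation analysis described above rests on deseasonalizing each monthly series before comparing station and satellite pixels. A sketch of that step with synthetic data (the annual-cycle removal mirrors the description; the series themselves are invented):

    ```python
    import numpy as np

    # Sketch (assumed analysis): remove the mean annual cycle from a monthly
    # series, then correlate a station series with a collocated pixel series.

    def deseasonalize(monthly):
        x = np.asarray(monthly, float).reshape(-1, 12)  # years x 12 months
        return (x - x.mean(axis=0)).ravel()             # anomalies vs mean cycle

    rng = np.random.default_rng(0)
    cycle = 100.0 * np.sin(np.arange(12) / 12.0 * 2 * np.pi)  # seasonal cycle
    station = np.tile(cycle, 10) + rng.normal(0, 5, 120)      # 10 years of data
    pixel = station + rng.normal(0, 2, 120)                   # nearby pixel

    r = np.corrcoef(deseasonalize(station), deseasonalize(pixel))[0, 1]
    ```

    Without deseasonalizing, the shared annual cycle would inflate the correlation and mask the spatial decorrelation with distance that the study maps.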

  11. Weak-lensing detection of intracluster filaments with ground-based data

    Science.gov (United States)

    Maturi, Matteo; Merten, Julian

    2013-11-01

    According to the current standard model of cosmology, matter in the Universe arranges itself along a network of filamentary structures. These filaments connect the main nodes of this so-called "cosmic web", which are clusters of galaxies. Although its large-scale distribution is clearly characterized by numerical simulations, constraining the dark-matter content of the cosmic web observationally turns out to be difficult. The natural method of choice is gravitational lensing. However, the direct detection and mapping of the elusive filament signal is challenging, and in this work we present two methods that are specifically tailored to achieve this task. A linear matched filter aims at detecting the smooth mass component of filaments and is optimized to perform a shear decomposition that follows the anisotropic component of the lensing signal. Filaments clearly inherit this property due to their morphology. At the same time, the contamination arising from the central massive cluster is controlled in a natural way. The filament 1σ detection limit is about κ ~ 0.005-0.01, depending on the filter's template width and length, enabling the detection of structures beyond the reach of other approaches. The second, complementary method seeks to detect the clumpy component of filaments. The detection is determined by the number density of subclump identifications in an area enclosing the potential filament, as found within the observed field with the filter approach. We tested both methods against mock observations based on realistic N-body simulations of filamentary structure and proved the feasibility of detecting filaments with ground-based data.

  12. Quantifying the effect of riming on snowfall using ground-based observations

    Science.gov (United States)

    Moisseev, Dmitri; von Lerber, Annakaisa; Tiira, Jussi

    2017-04-01

    Ground-based observations of ice particle size distribution and ensemble mean density are used to quantify the effect of riming on snowfall. The rime mass fraction is derived from these measurements by following the approach used in a single ice-phase category microphysical scheme proposed for use in numerical weather prediction models. One of the characteristics of the proposed scheme is that the prefactor of the power law relation that links the mass and size of ice particles is determined by the rime mass fraction, while the exponent does not change. To derive the rime mass fraction, a mass-dimensional relation representative of unrimed snow is also determined. To check the validity of the proposed retrieval method, the derived rime mass fraction is converted to an effective liquid water path that is compared to microwave radiometer observations. Since dual-polarization radar observations are often used to detect riming, the impact of riming on dual-polarization radar variables is studied for differential reflectivity measurements. It is shown that the relation between rime mass fraction and differential reflectivity is ambiguous; other factors, such as changes in the median volume diameter, also need to be considered. Given the current interest in the sensitivity of precipitation to aerosol pollution, which could inhibit riming, the importance of riming for surface snow accumulation is investigated. It is found that riming is responsible for 5% to 40% of snowfall mass. The study is based on data collected at the University of Helsinki field station in Hyytiälä during the U.S. Department of Energy Biogenic Aerosols Effects on Clouds and Climate (BAECC) field campaign and the winter of 2014/2015. In total 22 winter storms were analyzed, and a detailed analysis of two events is presented to illustrate the study.
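
    The prefactor-only parameterization described above can be made concrete with a small sketch. The convention below (rimed prefactor a = a_us / (1 − fr), as in single-category "P3"-type schemes) and all numbers are assumptions for illustration, not values from the paper:

    ```python
    # Sketch: mass-dimensional law m(D) = a * D**b with a fixed exponent b.
    # Riming raises only the prefactor a; with the assumed convention
    # a = a_us / (1 - fr), the rime mass fraction is fr = 1 - a_us / a.

    b = 2.1        # fixed exponent (illustrative)
    a_us = 0.005   # unrimed-snow prefactor, SI units (illustrative)

    def mass(D, a):
        return a * D ** b          # particle mass for maximum dimension D

    def rime_fraction(a_obs, a_unrimed=a_us):
        return 1.0 - a_unrimed / a_obs

    # An observed prefactor twice the unrimed value implies fr = 0.5,
    # i.e. half of each particle's mass was accreted as rime.
    fr = rime_fraction(0.01)
    m_ratio = mass(1e-3, 0.01) / mass(1e-3, a_us)  # rimed/unrimed mass ratio
    ```

    Because b is held fixed, riming shifts every particle's mass by the same factor regardless of size, which is what makes the prefactor alone a usable riming proxy.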

  13. Tentative detection of clear-air turbulence using a ground-based Rayleigh lidar.

    Science.gov (United States)

    Hauchecorne, Alain; Cot, Charles; Dalaudier, Francis; Porteneuve, Jacques; Gaudo, Thierry; Wilson, Richard; Cénac, Claire; Laqui, Christian; Keckhut, Philippe; Perrin, Jean-Marie; Dolfi, Agnès; Cézard, Nicolas; Lombard, Laurent; Besson, Claudine

    2016-05-01

    Atmospheric gravity waves and turbulence generate small-scale fluctuations of wind, pressure, density, and temperature in the atmosphere. These fluctuations represent a real hazard for commercial aircraft and are known by the generic name of clear-air turbulence (CAT). Numerical weather prediction models do not resolve CAT and therefore provide only a probability of occurrence. A ground-based Rayleigh lidar was designed and implemented to remotely detect and characterize the atmospheric variability induced by turbulence at vertical scales between 40 m and a few hundred meters. Field measurements were performed at Observatoire de Haute-Provence (OHP, France) on 8 December 2008 and 23 June 2009. The estimate of the mean squared amplitude of bidimensional fluctuations of the lidar signal showed an excess compared to the estimated contribution of the instrumental noise. This excess can be attributed to atmospheric turbulence with a 95% confidence level. During the first night, data from a collocated stratosphere-troposphere (ST) radar were available. Altitudes of the turbulent layers detected by the lidar were roughly consistent with those of layers with enhanced radar echo. The derived values of the turbulence parameters Cn² and CT² were in the range of those published in the literature using ST radar data. However, the detection was at the limit of the instrumental noise, and additional measurement campaigns are highly desirable to confirm these initial results. This is to our knowledge the first successful attempt to detect CAT in the free troposphere using an incoherent Rayleigh lidar system. The built lidar device may serve as a test bed for the definition of embarked CAT detection lidar systems aboard airliners.

  14. Ground-based solar radio observations of the August 1972 events

    International Nuclear Information System (INIS)

    Bhonsle, R.V.; Degaonkar, S.S.; Alurkar, S.K.

    1976-01-01

    Ground-based observations of the variable solar radio emission, ranging from a few millimetres to decametres, have been used here as a diagnostic tool to gain a coherent phenomenological understanding of the great 2, 4 and 7 August 1972 solar events in terms of dominant physical processes such as the generation and propagation of shock waves in the solar atmosphere, particle acceleration and trapping. Four major flares are selected for detailed analysis on the basis of their ability to produce energetic protons, shock waves, polar cap absorptions (PCA) and sudden commencement (SC) geomagnetic storms. A comparative study of their radio characteristics is made. Evidence is seen for pulsations during microwave bursts by a mechanism similar to that proposed by McLean et al. (1971) to explain the pulsations in the metre wavelength continuum radiation. It is suggested that the multiple peaks observed in some microwave bursts may be attributable to individual flares occurring sequentially due to a single initiating flare. Attempts have been made to establish the identification of Type II bursts with the interplanetary shock waves and SC geomagnetic storms. Furthermore, it is suggested that it is the mass behind the shock front which is the deciding factor for the detection of shock waves in interplanetary space. It appears that more work is necessary in order to identify which of the three moving Type IV bursts (Wild and Smerd, 1972), namely the advancing shock front, the expanding magnetic arch and the ejected plasma blob, serves as the piston-driver behind the interplanetary shocks. The existing criteria for proton flare prediction have been summarized and two new criteria have been proposed. (Auth.)

  15. Exploring the Diversity of Exoplanet Atmospheres Using Ground-Based Transit Spectroscopy

    Science.gov (United States)

    Bean, Jacob

    This is a proposal to fund an observational study of the atmospheres of exoplanets in order to improve our understanding of the nature and origins of these mysterious worlds. The observations will be performed using our new approach for ground-based transit spectroscopy measurements that yields space-telescope quality data. We will also carry out supporting theoretical calculations with new abundance retrieval codes to interpret the measurements. Our project includes a survey of giant exoplanets, and intensive study of especially compelling exoplanets. For the survey, optical and near-infrared transmission spectra, and near-infrared emission spectra will be measured for giant exoplanets with a wide range of estimated temperatures, heavy element abundance, and mass. This comprehensive characterization of a large sample of these planets is now crucial to investigate such issues for their atmospheres as the carbon-to-oxygen ratios and overall metallicities, cause of thermal inversions, and prevalence and nature of high-altitude hazes. The intensive study of compelling individual planets will focus on low-mass (M spectroscopy, and leveraging its particular sensitivity to the atmospheric scale height. Observations for the project will be carried out with Magellan, Keck, Gemini, and VLT. The team has institutional access to Magellan and Keck, and a demonstrated record of obtaining time on Gemini and VLT for these observations through public channels. This proposal is highly relevant for current and future NASA projects. We are seeking to understand the diversity of exoplanets revealed by planet searches like Kepler and the Eta-Earth survey. Our observations will complement, extend, and provide context for similar observations with HST and Spitzer. We will investigate the fundamental nature of the closest kin to Earth-size exoplanets, and this is an important foundation that must be laid down before studying habitable planets with JWST and a future TPF-like mission.

  16. Subtropical and Polar Cirrus Clouds Characterized by Ground-Based Lidars and CALIPSO/CALIOP Observations

    Directory of Open Access Journals (Sweden)

    Córdoba-Jabonero Carmen

    2016-01-01

    Cirrus clouds are products of weather processes, and thus their occurrence and macrophysical/optical properties can vary significantly over different regions of the world. Lidars can provide height-resolved measurements with relatively good vertical and temporal resolution, making them the most suitable instrumentation for high-cloud observations. The aim of this work is to show the potential of lidar observations for cirrus cloud detection in combination with a recently proposed methodology to retrieve the cirrus cloud macrophysical and optical features. In this sense, a few case studies of cirrus clouds observed at both subtropical and polar latitudes are examined and compared to CALIPSO/CALIOP observations. Lidar measurements are carried out at two stations: the metropolitan city of São Paulo (MSP, Brazil, 23.3°S 46.4°W), located at subtropical latitudes, and the Belgrano II base (BEL, Argentina, 78°S 35°W) on the Antarctic continent. Optical (COD, cloud optical depth, and LR, lidar ratio) and macrophysical (top/base heights and thickness) properties of both the subtropical and polar cirrus clouds are reported. In general, subtropical cirrus clouds present lower LR values and are found at higher altitudes than those detected at polar latitudes. Cirrus clouds are detected at similar altitudes by CALIOP. However, poor agreement is achieved in the LR retrieved between ground-based lidars and space-borne CALIOP measurements, likely due to the use of a fixed (or low-variability) LR value in the CALIOP inversion procedures.

  17. Ground based interferometric radar initial look at Longview, Blue Springs, Tuttle Creek, and Milford Dams

    Science.gov (United States)

    Deng, Huazeng

    Measuring millimeter and smaller deformation using radar has been demonstrated in the literature. To address in part the limitations of current commercial satellite-based SAR datasets, a University of Missouri (MU) team worked with GAMMA Remote Sensing to develop a specialized (dual-frequency, polarimetric, and interferometric) ground-based real-aperture radar (GBIR) instrument. The GBIR device is portable, with its tripod system and control electronics. It can be deployed to obtain data with high spatial resolution (on the order of 1 meter) and high temporal resolution (on the order of 1 minute). The high temporal resolution is well suited for measurements of rapid deformation. From the same geodetic position, the GBIR may collect dual-frequency data sets using C-band and Ku-band. The overall goal of this project is to measure deformation in various scenarios by applying the GBIR system. Initial efforts have focused on testing the system performance on different types of targets. This thesis details a number of my efforts on experimental and processing activities at the start of the MU GBIR imaging project. For improved close-range capability, a wideband dual-polarized antenna option was produced and tested. For GBIR calibration, several trihedral corner reflectors were designed and fabricated. In addition to experimental activities and site selection, I participated in advanced data processing activities. I processed GBIR data in several ways, including single-look-complex (SLC) image generation, imagery registration, and interferometric processing. A number of initially processed GBIR image products are presented from four dams: Longview, Blue Springs, Tuttle Creek, and Milford. Excellent imaging performance of the MU GBIR has been observed for various target types such as riprap, concrete, soil, rock, metal, and vegetation. Strong coherence of the test scene has been observed in the initial interferograms.
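
    The millimetre-level sensitivity of radar interferometry mentioned above comes from the phase-to-displacement relation. A minimal sketch; the sign convention and the nominal C-band wavelength are generic assumptions, not the MU GBIR's actual specifications:

    ```python
    import math

    # Sketch: a line-of-sight displacement d between two acquisitions produces
    # an interferometric phase change dphi = -4*pi*d/lambda (monostatic radar,
    # two-way path), so d = -dphi * lambda / (4*pi).

    def los_displacement_mm(dphi_rad, wavelength_m):
        return -dphi_rad * wavelength_m / (4.0 * math.pi) * 1000.0

    wavelength_c = 0.056   # nominal C-band wavelength, m (assumed)
    # A quarter-cycle phase decrease corresponds to 7 mm of motion:
    d = los_displacement_mm(-math.pi / 2, wavelength_c)
    ```

    Since one full fringe spans only half a wavelength (28 mm at C-band), small fractions of a fringe resolve millimetre motion, which is why strong interferogram coherence matters for the dam surveys.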

  18. Radiometric modeling and calibration of the Geostationary Imaging Fourier Transform Spectrometer (GIFTS) ground based measurement experiment

    Science.gov (United States)

    Tian, Jialin; Smith, William L.; Gazarik, Michael J.

    2008-12-01

    The ultimate remote sensing benefits of high-resolution infrared radiance spectrometers will be realized with their geostationary satellite implementation in the form of imaging spectrometers. This will enable dynamic features of the atmosphere's thermodynamic fields and pollutant and greenhouse gas constituents to be observed, for revolutionary improvements in weather forecasts and more accurate air quality and climate predictions. As an important step toward realizing this application objective, the Geostationary Imaging Fourier Transform Spectrometer (GIFTS) Engineering Demonstration Unit (EDU) was successfully developed under the NASA New Millennium Program, 2000-2006. The GIFTS-EDU instrument employs three focal plane arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The GIFTS calibration is achieved using internal blackbody calibration references at ambient (260 K) and hot (286 K) temperatures. In this paper, we introduce a refined calibration technique that utilizes Principal Component (PC) analysis to compensate for instrument distortions and artifacts, thereby enhancing the absolute calibration accuracy. This method is applied to data collected during the GIFTS Ground Based Measurement (GBM) experiment, together with simultaneous observations by the accurately calibrated AERI (Atmospheric Emitted Radiance Interferometer), both instruments zenith viewing the sky through the same external scene mirror at ten-minute intervals throughout a cloudless day at Logan, Utah, on September 13, 2006. The accurately calibrated GIFTS radiances are produced using the first four PC scores in the GIFTS-AERI regression model. Temperature and moisture profiles retrieved from the PC-calibrated GIFTS radiances are verified against radiosonde measurements collected throughout the GIFTS sky measurement period. Using the GIFTS GBM calibration model, we compute the calibrated radiances from data
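
    The PC-score regression calibration described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual processing chain: the low-rank synthetic spectra, gain/offset distortion, noise level, and channel counts are all invented stand-ins for the GIFTS and AERI data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented low-rank "spectra": real atmospheric radiance spectra are
# highly correlated across channels, which is what makes a few PC
# scores sufficient to describe each observation.
n_obs, n_chan, n_latent = 60, 200, 3
basis = rng.normal(size=(n_latent, n_chan))
truth = 100.0 + rng.normal(size=(n_obs, n_latent)) @ basis
gifts = 1.02 * truth + 3.0 + rng.normal(0.0, 0.05, (n_obs, n_chan))
aeri = truth  # stand-in for the accurately calibrated reference

# Project the GIFTS spectra onto their leading principal components.
mean = gifts.mean(axis=0)
anom = gifts - mean
_, _, vt = np.linalg.svd(anom, full_matrices=False)
scores = anom @ vt[:4].T  # first four PC scores, as in the abstract

# Regress the reference radiances on the PC scores; the fitted linear
# model absorbs the gain/offset distortions captured by the leading PCs.
design = np.column_stack([scores, np.ones(n_obs)])
coef, *_ = np.linalg.lstsq(design, aeri, rcond=None)
calibrated = design @ coef

rms = np.sqrt(np.mean((calibrated - aeri) ** 2))
print(f"RMS residual after PC regression: {rms:.3f}")
```

    With spectra that truly live on a low-dimensional manifold, the residual after the PC regression is limited mainly by the measurement noise rather than by the calibration distortion.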

  19. SIRTA, a ground-based atmospheric observatory for cloud and aerosol research

    Directory of Open Access Journals (Sweden)

    M. Haeffelin

    2005-02-01

    Full Text Available Ground-based remote sensing observatories have a crucial role to play in providing data to improve our understanding of atmospheric processes, to test the performance of atmospheric models, and to develop new methods for future space-borne observations. The Institut Pierre Simon Laplace, a French research institute in environmental sciences, created the Site Instrumental de Recherche par Télédétection Atmosphérique (SIRTA), an atmospheric observatory with these goals in mind. Today SIRTA, located 20 km south of Paris, operates a suite of state-of-the-art active and passive remote sensing instruments dedicated to routine monitoring of cloud and aerosol properties and key atmospheric parameters. A detailed description of the state of the atmospheric column is progressively archived and made accessible to the scientific community. This paper describes the SIRTA infrastructure and database, and provides an overview of the scientific research associated with the observatory. Researchers using SIRTA data conduct research on atmospheric processes involving complex interactions between clouds, aerosols, and radiative and dynamic processes in the atmospheric column. Atmospheric modellers working with SIRTA observations develop new methods to test their models and innovative analyses to improve parametric representations of sub-grid processes that must be accounted for in the model. SIRTA provides the means to develop data interpretation tools for future active remote sensing missions in space (e.g. CloudSat and CALIPSO). SIRTA observation and research activities take place in networks of atmospheric observatories that allow scientists to access consistent data sets from diverse regions on the globe.

  20. The thin border between cloud and aerosol: Sensitivity of several ground based observation techniques

    Science.gov (United States)

    Calbó, Josep; Long, Charles N.; González, Josep-Abel; Augustine, John; McComiskey, Allison

    2017-11-01

    Cloud and aerosol are two manifestations of what is essentially the same physical phenomenon: a suspension of particles in the air. The differences between the two come from the different composition (e.g., a much higher amount of condensed water in the particles constituting a cloud) and/or particle size, and also from the different number of such particles (10-10,000 particles per cubic centimeter depending on conditions). However, there exist situations in which the distinction is far from obvious, and even when broken or scattered clouds are present in the sky, the border between cloud and not-cloud is not always well defined, a transition area that has been coined the "twilight zone". The current paper presents a discussion on the definition of cloud and aerosol, and on the need for distinguishing between, or for considering the continuum between, the two; it also suggests a quantification of the importance and frequency of such ambiguous situations, founded on several ground-based observing techniques. Specifically, sensitivity analyses are applied to sky camera images and broadband and spectral radiometric measurements taken at Girona (Spain) and Boulder (CO, USA). Results indicate that, at these sites, in more than 5% of the daytime hours the sky may be considered cloudless (but containing aerosols) or cloudy (with some kind of optically thin clouds) depending on the observing system and the thresholds applied. Similarly, at least 10% of the time the extension of scattered or broken clouds into clear areas is problematic to establish, and depends on where the limit is put between cloud and aerosol. These findings are relevant both to technical approaches for cloud screening and sky cover categorization algorithms and to radiative transfer studies, given the different effects of clouds and aerosols (and their different treatment in models) on the Earth's radiation balance.
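
    The threshold-sensitivity idea can be illustrated with a toy calculation: count the samples whose cloud/aerosol classification flips when the screening threshold is moved. The optical-depth distribution and the two thresholds below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical column optical depths for "clear-ish" sky samples;
# a lognormal is a common rough shape for such distributions.
rng = np.random.default_rng(2)
tau = rng.lognormal(mean=-3.0, sigma=1.5, size=10_000)

# Two plausible cloud-screening thresholds (illustrative values):
# samples between them are the ambiguous "twilight zone" cases,
# classified as aerosol by one threshold and cloud by the other.
strict, loose = 0.03, 0.1
ambiguous = np.mean((tau > strict) & (tau <= loose))
print(f"fraction classified differently by the two thresholds: {ambiguous:.1%}")
```

    The fraction of flipped classifications is exactly the kind of sensitivity the abstract quantifies: the wider the gap between defensible thresholds, the larger the population of ambiguous skies.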

  1. Rates for parallax-shifted microlensing events from ground-based observations of the galactic bulge

    International Nuclear Information System (INIS)

    Buchalter, A.; Kamionkowski, M.

    1997-01-01

    The parallax effect in ground-based microlensing (ML) observations consists of a distortion to the standard ML light curve arising from the Earth's orbital motion. This can be used to partially remove the degeneracy among the system parameters in the event timescale, t_0. In most cases, the resolution in current ML surveys is not accurate enough to observe this effect, but parallax could conceivably be detected with frequent follow-up observations of ML events in progress, provided the photometric errors are small enough. We calculate the expected fraction of ML events where the shape distortions will be observable by such follow-up observations, adopting Galactic models for the lens and source distributions that are consistent with observed microlensing timescale distributions. We study the dependence of the rates for parallax-shifted events on the frequency of follow-up observations and on the precision of the photometry. For example, we find that for hourly observations with typical photometric errors of 0.01 mag, 6% of events where the lens is in the bulge, and 31% of events where the lens is in the disk (or ∼10% of events overall), will give rise to a measurable parallax shift at the 95% confidence level. These fractions may be increased by improved photometric accuracy and increased sampling frequency. While long-duration events are favored, the surveys would be effective in picking out such distortions in events with timescales as low as t_0 ∼ 20 days. We study the dependence of these fractions on the assumed disk mass function and find that a higher parallax incidence is favored by mass functions with higher mean masses. Parallax measurements yield the reduced transverse speed, v, which gives both the relative transverse speed and the lens mass as a function of distance. We give examples of the accuracies with which v may be measured in typical parallax events. (Abstract Truncated)
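
    As context for the "standard ML light curve" against which the parallax distortion is measured, the point-source point-lens model can be sketched as below. The event parameters are illustrative, not taken from the paper.

```python
import numpy as np

def magnification(u):
    """Point-source point-lens magnification A(u) = (u^2 + 2) / (u sqrt(u^2 + 4))."""
    return (u**2 + 2.0) / (u * np.sqrt(u**2 + 4.0))

def standard_light_curve(t, t0, u0, tE):
    """Rectilinear (parallax-free) trajectory: impact parameter u0 at peak
    time t0, Einstein-radius crossing time tE.  The parallax signal is the
    residual between observed photometry and this baseline model."""
    u = np.sqrt(u0**2 + ((t - t0) / tE)**2)
    return magnification(u)

# Illustrative event: u0 = 0.1, tE = 20 days (a timescale the abstract
# cites as still accessible to parallax follow-up observations).
t = np.linspace(-50.0, 50.0, 101)  # days relative to peak
A = standard_light_curve(t, t0=0.0, u0=0.1, tE=20.0)
print(f"peak magnification: {A.max():.2f}")  # ~1/u0 = 10 for small u0
```

    Earth's orbital motion makes u(t) deviate from this rectilinear form over months, which is why the effect is easiest to see in long-duration events with dense, precise photometry.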

  2. Simultaneous and synergistic profiling of cloud and drizzle properties using ground-based observations

    Science.gov (United States)

    Rusli, Stephanie P.; Donovan, David P.; Russchenberg, Herman W. J.

    2017-12-01

    Despite the importance of radar reflectivity (Z) measurements in the retrieval of liquid water cloud properties, it remains nontrivial to interpret Z due to the possible presence of drizzle droplets within the clouds. So far, there has been no published work that utilizes Z to identify the presence of drizzle above the cloud base in an optimized and physically consistent manner. In this work, we develop a retrieval technique that exploits the synergy of different remote sensing systems to carry out this task and to subsequently profile the microphysical properties of the cloud and drizzle in a unified framework. This is accomplished by using ground-based measurements of Z, lidar attenuated backscatter below as well as above the cloud base, and microwave brightness temperatures. Fast physical forward models coupled to cloud and drizzle structure parameterization are used in an optimal-estimation-type framework in order to retrieve the best estimate for the cloud and drizzle property profiles. The cloud retrieval is first evaluated using synthetic signals generated from large-eddy simulation (LES) output to verify the forward models used in the retrieval procedure and the vertical parameterization of the liquid water content (LWC). From this exercise it is found that, on average, the cloud properties can be retrieved within 5 % of the mean truth. The full cloud-drizzle retrieval method is then applied to a selected ACCEPT (Analysis of the Composition of Clouds with Extended Polarization Techniques) campaign dataset collected in Cabauw, the Netherlands. An assessment of the retrieval products is performed using three independent methods from the literature; each was specifically developed to retrieve only the cloud properties, the drizzle properties below the cloud base, or the drizzle fraction within the cloud. One-to-one comparisons, taking into account the uncertainties or limitations of each retrieval, show that our results are consistent with what is derived
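
    An optimal-estimation-type retrieval of the kind described above iterates Gauss-Newton updates of the state toward the maximum a posteriori solution, balancing the observation misfit against a prior. The sketch below uses a toy linear forward model with invented numbers; the paper's actual forward models (radar, lidar, microwave radiometer) are far more involved.

```python
import numpy as np

def oem_retrieval(y, forward, jacobian, x_a, S_a, S_e, n_iter=10):
    """Gauss-Newton optimal-estimation iteration: minimizes
    (y - F(x))^T S_e^-1 (y - F(x)) + (x - x_a)^T S_a^-1 (x - x_a)."""
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    x = x_a.copy()
    for _ in range(n_iter):
        K = jacobian(x)
        lhs = K.T @ Se_inv @ K + Sa_inv
        rhs = K.T @ Se_inv @ (y - forward(x) + K @ (x - x_a))
        x = x_a + np.linalg.solve(lhs, rhs)
    return x

# Toy linear "cloud property" forward model (all numbers illustrative).
K0 = np.array([[1.0, 0.5], [0.2, 1.5], [0.8, 0.3]])
x_true = np.array([2.0, 1.0])
y = K0 @ x_true                      # noise-free synthetic observation
x_a = np.array([1.0, 1.0])           # prior (a priori) state
S_a = np.eye(2) * 10.0               # loose prior covariance
S_e = np.eye(3) * 1e-4               # tight observation-error covariance

x_hat = oem_retrieval(y, lambda x: K0 @ x, lambda x: K0, x_a, S_a, S_e)
print(x_hat)  # close to x_true when the observations are informative
```

    With a linear forward model the iteration converges in one step; nonlinear forward models, as in the paper, require the repeated relinearization shown in the loop.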

  3. Confronting remote sensing products with ground-based measurements across time and scale

    Science.gov (United States)

    Pourmokhtarian, A.; Dietze, M.

    2015-12-01

    Ecosystem models are essential tools for forecasting ecosystem responses to global climate change. One of the most challenging issues in ecosystem modeling is scaling while preserving landscape characteristics and minimizing loss of information when moving from point observations to the regional scale. There is keen interest in providing accurate inputs for ecosystem models that represent ecosystem initial-state conditions. Remote sensing land cover products, such as Landsat NLCD and MODIS MCD12Q1, provide extensive spatio-temporal coverage but do not capture forest composition and structure. Lidar and hyperspectral data have the potential to meet this need but lack sufficient spatial and historical coverage. Forest inventory measurements provide detailed information on the landscape but over a very small footprint. Combining inventory and land cover could improve estimates of ecosystem state and characteristics across time and space. This study focuses on the challenges associated with fusing and scaling the US Forest Service FIA database and NLCD across regional scales to quantify ecosystem characteristics and reduce associated uncertainties. Across the southeastern U.S., 400 stratified random samples of 10×10 km² landscapes were selected. Data on plant density, species, age, and DBH of trees in FIA plots within each site were extracted. Using allometric equations, the canopy cover of different plant functional types (PFTs) was estimated using a PPA-style canopy model and used to assign each inventory plot to a land cover class. Inventory and land cover were fused in a Bayesian model that adjusts the fractional coverage of inventory plots while accounting for multiple sources of uncertainty. Results were compared to estimates derived from inventory alone, land cover alone, and model spin-up alone. Our findings create a data assimilation framework to better interpret remote sensing data using ground-based measurements.

  4. Validation of MOPITT carbon monoxide using ground-based Fourier transform infrared spectrometer data from NDACC

    Science.gov (United States)

    Buchholz, Rebecca R.; Deeter, Merritt N.; Worden, Helen M.; Gille, John; Edwards, David P.; Hannigan, James W.; Jones, Nicholas B.; Paton-Walsh, Clare; Griffith, David W. T.; Smale, Dan; Robinson, John; Strong, Kimberly; Conway, Stephanie; Sussmann, Ralf; Hase, Frank; Blumenstock, Thomas; Mahieu, Emmanuel; Langerock, Bavo

    2017-06-01

    The Measurements of Pollution in the Troposphere (MOPITT) satellite instrument provides the longest continuous dataset of carbon monoxide (CO) from space. We perform the first validation of MOPITT version 6 retrievals using total column CO measurements from ground-based remote-sensing Fourier transform infrared spectrometers (FTSs). Validation uses data recorded at 14 stations that span a wide range of latitudes (80° N to 78° S) in the Network for the Detection of Atmospheric Composition Change (NDACC). MOPITT measurements are spatially co-located with each station, and different vertical sensitivities between instruments are accounted for by using MOPITT averaging kernels (AKs). All three MOPITT retrieval types are analyzed: thermal infrared (TIR-only), joint thermal and near infrared (TIR-NIR), and near infrared (NIR-only). Generally, MOPITT measurements overestimate CO relative to FTS measurements, but the bias is typically less than 10 %. The mean bias is 2.4 % for TIR-only, 5.1 % for TIR-NIR, and 6.5 % for NIR-only. The TIR-NIR and NIR-only products consistently produce a larger bias and lower correlation than the TIR-only product. Validation performance of MOPITT for TIR-only and TIR-NIR retrievals over land and water scenes is equivalent. The four MOPITT detector element pixels are validated separately to account for their different uncertainty characteristics. Pixel 1 produces the highest standard deviation and lowest correlation for all three MOPITT products. However, for TIR-only and TIR-NIR, the error-weighted average that includes all four pixels often provides the best correlation, indicating compensating pixel biases and well-captured error characteristics. We find that MOPITT bias does not depend on latitude but rather is influenced by proximity to rapidly changing atmospheric CO. MOPITT bias drift has been bounded geographically to within ±0.5 % yr⁻¹ or lower at almost all locations.
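
    Accounting for "different vertical sensitivities between instruments" with averaging kernels typically means smoothing the high-resolution reference profile before comparison, so both instruments see the atmosphere with the same vertical sensitivity: x_smoothed = x_a + A (x_fts − x_a). The four-level profiles and kernel below are invented for illustration, not MOPITT's actual a priori or kernels.

```python
import numpy as np

# Hypothetical 4-level CO profiles (e.g. log VMR), illustrating the
# averaging-kernel smoothing used before satellite-vs-FTS comparison.
x_a = np.array([-6.9, -7.0, -7.1, -7.3])      # retrieval a priori profile
x_fts = np.array([-6.8, -6.95, -7.05, -7.2])  # high-resolution FTS "truth"
A = np.array([[0.6, 0.2, 0.1, 0.0],
              [0.2, 0.5, 0.2, 0.1],
              [0.1, 0.2, 0.5, 0.2],
              [0.0, 0.1, 0.2, 0.4]])          # illustrative averaging kernel

# Smooth the reference profile with the satellite's vertical sensitivity.
x_smoothed = x_a + A @ (x_fts - x_a)
print(x_smoothed)
```

    The retrieved satellite profile is then compared against x_smoothed rather than x_fts directly, so that differences reflect calibration bias rather than mismatched vertical resolution.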

  5. Geocenter variations derived from a combined processing of LEO- and ground-based GPS observations

    Science.gov (United States)

    Männel, Benjamin; Rothacher, Markus

    2017-08-01

    GNSS observations provided by the global tracking network of the International GNSS Service (IGS, Dow et al. in J Geod 83(3):191-198, 2009) play an important role in the realization of a unique terrestrial reference frame that is accurate enough to allow detailed monitoring of the Earth's system. Combining these ground-based data with GPS observations tracked by high-quality dual-frequency receivers on board low Earth orbiters (LEOs) is a promising way to further improve the realization of the terrestrial reference frame and the estimation of geocenter coordinates, GPS satellite orbits, and Earth rotation parameters. To assess the scope of the improvement in the geocenter coordinates, we processed a network of 53 globally distributed and stable IGS stations together with four LEOs (GRACE-A, GRACE-B, OSTM/Jason-2, and GOCE) over a time interval of 3 years (2010-2012). To ensure fully consistent solutions, the zero-difference phase observations of the ground stations and LEOs were processed in a common least-squares adjustment, estimating all the relevant parameters such as GPS and LEO orbits, station coordinates, Earth rotation parameters, and geocenter motion. We present the significant impact of the individual LEOs, and of a combination of all four, on the geocenter coordinates. The formal errors are reduced by around 20% by the inclusion of one LEO in the ground-only solution, while in a solution with four LEOs the LEO-specific characteristics are significantly reduced. We compare the derived geocenter coordinates with LAGEOS results and with external solutions based on GPS and SLR data. We find good agreement in the amplitudes of all components; however, the phases in the x- and z-directions do not agree well.
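
    The formal-error reduction reported above comes from stacking the LEO observations into the same normal-equation system as the ground network: adding a second observation group adds a positive-semidefinite term to the normal matrix, which can only tighten the parameter covariance. A toy illustration with invented design matrices:

```python
import numpy as np

# Toy linear model y = A x + e with two observation groups; all
# matrices are random stand-ins for the real ground/LEO geometry.
rng = np.random.default_rng(1)
A1 = rng.normal(size=(40, 3))   # "ground network" observations
A2 = rng.normal(size=(40, 3))   # "LEO" observations
W = np.eye(40)                  # unit weights for simplicity

N1 = A1.T @ W @ A1              # ground-only normal matrix
N12 = N1 + A2.T @ W @ A2        # combined normal matrix (stacked groups)

# Formal errors = square roots of the diagonal of the inverse normal matrix.
sigma_ground = np.sqrt(np.diag(np.linalg.inv(N1)))
sigma_combined = np.sqrt(np.diag(np.linalg.inv(N12)))
print(sigma_combined / sigma_ground)  # each ratio < 1: formal errors shrink
```

    The size of the reduction depends on how much independent geometry the second group contributes; in the paper, one LEO already yields around a 20% reduction for the geocenter coordinates.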

  6. Characterizing GEO Titan IIIC Transtage Fragmentations Using Ground-based and Telescopic Measurements

    Science.gov (United States)

    Cowardin, H.; Anz-Meador, P.; Reyes, J. A.

    In a continued effort to better characterize the geosynchronous orbit (GEO) environment, NASA's Orbital Debris Program Office (ODPO) utilizes various ground-based optical assets to acquire photometric and spectral data of known debris associated with fragmentations in or near GEO. The Titan IIIC Transtage upper stage is known to have fragmented four times. Two of the four fragmentations occurred in GEO, while the Transtage fragmented a third time in GEO transfer orbit; the fourth fragmentation occurred in low Earth orbit. To better assess and characterize these fragmentations, the NASA ODPO acquired a Titan Transtage test and display article previously in the custody of the 309th Aerospace Maintenance and Regeneration Group (AMARG) in Tucson, Arizona. After initial inspections at AMARG demonstrated that it was of sufficient fidelity to be of interest, the test article was brought to NASA Johnson Space Center (JSC) to continue material analysis and historical documentation. The Transtage has undergone two separate spectral measurement campaigns to characterize the reflectance spectroscopy of historical aerospace materials. These data have been incorporated into the NASA Spectral Database, with the goal of using telescopic data comparisons for potential material identification. A Light Detection and Ranging (LIDAR) system scan has also been completed, and a scale model has been created for use in the Optical Measurement Center (OMC) for photometric analysis of an intact Transtage, including bidirectional reflectance distribution function (BRDF) measurements. A historical overview of the Titan IIIC Transtage, the analysis completed to date, and the future work planned in support of characterizing the GEO and near-GEO orbital debris environment will be discussed.

  7. OGLE-2015-BLG-0196: GROUND-BASED GRAVITATIONAL MICROLENS PARALLAX CONFIRMED BY SPACE-BASED OBSERVATION

    Energy Technology Data Exchange (ETDEWEB)

    Han, C. [Department of Physics, Chungbuk National University, Cheongju 361-763 (Korea, Republic of); Udalski, A.; Szymański, M. K.; Soszyński, I.; Skowron, J.; Mróz, P.; Poleski, R.; Pietrukowicz, P.; Kozłowski, S.; Ulaczyk, K.; Pawlak, M. [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Gould, A.; Zhu, Wei; Fausnaugh, M.; Gaudi, B. S. [Department of Astronomy, Ohio State University, 140 W. 18th Ave., Columbus, OH 43210 (United States); Yee, J. C. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States); Beichman, C. [NASA Exoplanet Science Institute, MS 100-22, California Institute of Technology, Pasadena, CA 91125 (United States); Novati, S. Calchi [Dipartimento di Fisica “E. R. Caianiello,” Uńiversitá di Salerno, Via Giovanni Paolo II, I-84084 Fisciano (Italy); Carey, S. [Spitzer Science Center, MS 220-6, California Institute of Technology, Pasadena, CA (United States); Bryden, C. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Dr., Pasadena, CA 91109 (United States); Collaboration: OGLE Collaboration; Spitzer Microlensing Team; and others

    2017-01-01

    In this paper, we present an analysis of the binary gravitational microlensing event OGLE-2015-BLG-0196. The event lasted for almost a year, and the light curve exhibited significant deviations from the lensing model based on rectilinear lens-source relative motion, enabling us to measure the microlens parallax. The ground-based microlens parallax is confirmed by the data obtained from space-based microlens observations using the Spitzer telescope. By additionally measuring the angular Einstein radius from the analysis of the resolved caustic crossing, the physical parameters of the lens are determined up to the twofold degeneracy, the u₀ < 0 and u₀ > 0 solutions, caused by the well-known "ecliptic" degeneracy. It is found that the binary lens is composed of two M dwarf stars with similar masses, M₁ = 0.38 ± 0.04 M⊙ (0.50 ± 0.05 M⊙) and M₂ = 0.38 ± 0.04 M⊙ (0.55 ± 0.06 M⊙), and the distance to the lens is D_L = 2.77 ± 0.23 kpc (3.30 ± 0.29 kpc). Here the physical parameters outside and inside the parentheses are for the u₀ < 0 and u₀ > 0 solutions, respectively.

  8. Macrophysical and optical properties of midlatitude cirrus clouds from four ground-based lidars and collocated CALIOP observations

    Energy Technology Data Exchange (ETDEWEB)

    Dupont, Jean-Charles; Haeffelin, M.; Morille, Y.; Noel, V.; Keckhut, P.; Winker, D.; Comstock, Jennifer M.; Chervet, P.; Roblin, A.

    2010-05-27

    Ground-based lidar and CALIOP datasets gathered over four mid-latitude sites, two US and two French, are used to evaluate the consistency of cloud macrophysical and optical property climatologies that can be derived from such datasets. The consistency in average cloud height (both base and top height) between the CALIOP and ground datasets ranges from −0.4 km to +0.5 km. The cloud geometrical thickness distributions vary significantly between the different datasets, due in part to the original vertical resolutions of the lidar profiles. Average cloud geometrical thicknesses vary from 1.2 to 1.9 km, i.e. by more than 50%. Cloud optical thickness distributions in the subvisible, semi-transparent, and moderate intervals differ by more than 50% between ground- and space-based datasets. Cirrus clouds with optical thickness below 0.1 (not included in historical cloud climatologies) represent 30-50% of the non-opaque cirrus class. The differences in average cloud base altitude between ground and CALIOP datasets of 0.0-0.1 km, 0.0-0.2 km, and 0.0-0.2 km can be attributed to irregular sampling of seasonal variations in the ground-based data, to day-night differences in detection capabilities by CALIOP, and to the restriction to situations without low-level clouds in ground-based data, respectively. The cloud geometrical thicknesses are not affected by irregular sampling of seasonal variations in the ground-based data, while differences of up to 0.0-0.2 km and 0.1-0.3 km can be attributed to day-night differences in detection capabilities by CALIOP and to the restriction to situations without low-level clouds in ground-based data, respectively.

  9. Ground-based aerosol characterization during the South American Biomass Burning Analysis (SAMBBA) field experiment

    Directory of Open Access Journals (Sweden)

    J. Brito

    2014-11-01

    Full Text Available This paper investigates the physical and chemical characteristics of aerosols at ground level at a site heavily impacted by biomass burning. The site is located near Porto Velho, Rondônia, in the southwestern part of the Brazilian Amazon rainforest, and was selected for the deployment of a large suite of instruments, among them an Aerosol Chemical Speciation Monitor. Our measurements were made during the South American Biomass Burning Analysis (SAMBBA) field experiment, which consisted of a combination of aircraft and ground-based measurements over Brazil, aimed at investigating the impacts of biomass burning emissions on climate, air quality, and numerical weather prediction over South America. The campaign took place during the dry season and the transition to the wet season in September/October 2012. During most of the campaign, the site was impacted by regional biomass burning pollution (average CO mixing ratio of 0.6 ppm), occasionally superimposed by intense (up to 2 ppm of CO), freshly emitted biomass burning plumes. Aerosol number concentrations ranged from ~1000 cm−3 to peaks of up to 35 000 cm−3 during biomass burning (BB) events, corresponding to an average submicron mass concentration of 13.7 μg m−3 and peak concentrations close to 100 μg m−3. Organic aerosol strongly dominated the submicron non-refractory composition, with an average concentration of 11.4 μg m−3. The inorganic species NH4, SO4, NO3, and Cl were observed at average concentrations of 0.44, 0.34, 0.19, and 0.01 μg m−3, respectively. Equivalent black carbon (BCe) ranged from 0.2 to 5.5 μg m−3, with an average concentration of 1.3 μg m−3. During BB peaks, organics accounted for over 90% of total mass (submicron non-refractory plus BCe), among the highest values described in the literature. We examined the ageing of biomass burning organic aerosol (BBOA) using the changes in the H : C and O : C ratios, and found that throughout most of the

  10. Reevaluation of the Jezebel Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-03-10

    Every nuclear engineering student is familiar with Jezebel, the homogeneous bare sphere of plutonium first assembled at Los Alamos in 1954-1955. The actual Jezebel assembly was neither homogeneous, nor bare, nor spherical; nor was it singular – there were hundreds of Jezebel configurations assembled. The Jezebel benchmark has been reevaluated for the International Criticality Safety Benchmark Evaluation Project (ICSBEP) Handbook. Logbooks, original drawings, mass accountability statements, internal reports, and published reports have been used to model four actual three-dimensional Jezebel assemblies with high fidelity. Because the documentation available today is often inconsistent, three major assumptions were made regarding plutonium part masses and dimensions. The first was that the assembly masses given in Los Alamos report LA-4208 (1969) were correct, and the second was that the original drawing dimension for the polar height of a certain major part was correct. The third assumption was that a change notice indicated on the original drawing was not actually implemented. This talk will describe these assumptions, the alternatives, and the implications. Since the publication of the 2013 ICSBEP Handbook, the actual masses of the major components have turned up. Our assumption regarding the assembly masses was proven correct, but we had the mass distribution incorrect. Work to incorporate the new information is ongoing, and this talk will describe the latest assessment.

  11. SCWEB, Scientific Workstation Evaluation Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Raffenetti, R C [Computing Services-Support Services Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, Illinois 60439 (United States)

    1988-06-16

    1 - Description of program or function: The SCWEB (Scientific Workstation Evaluation Benchmark) software includes 16 programs which are executed in a well-defined scenario to measure the following performance capabilities of a scientific workstation: implementation of FORTRAN77, processor speed, memory management, disk I/O, monitor (or display) output, scheduling of processing (multiprocessing), and scheduling of print tasks (spooling). 2 - Method of solution: The benchmark programs are: DK1, DK2, and DK3, which do Fourier series fitting based on spline techniques; JC1, which checks the FORTRAN function routines which produce numerical results; JD1 and JD2, which solve dense systems of linear equations in double- and single-precision, respectively; JD3 and JD4, which perform matrix multiplication in single- and double-precision, respectively; RB1, RB2, and RB3, which perform substantial amounts of I/O processing on files other than the input and output files; RR1, which does intense single-precision floating-point multiplication in a tight loop; RR2, which initializes a 512x512 integer matrix in a manner which skips around in the address space rather than initializing each consecutive memory cell in turn; RR3, which writes alternating text buffers to the output file; RR4, which evaluates the timer routines and demonstrates that they conform to the specification; and RR5, which determines whether the workstation is capable of executing a 4-megabyte program.

  12. Pynamic: the Python Dynamic Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Lee, G L; Ahn, D H; de Supinksi, B R; Gyllenhaal, J C; Miller, P J

    2007-07-10

    Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide-range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.

  13. The Isprs Benchmark on Indoor Modelling

    Science.gov (United States)

    Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.

    2017-09-01

    Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in the literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset, comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.
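
    Evaluation against a manually created reference model typically reduces to detection-style quality criteria. A hypothetical example of two such criteria, completeness (how much of the reference is recovered) and correctness (how much of the detected model is right); the metric names, the wall-counting framing, and the counts below are illustrative, not the benchmark's official definitions:

```python
# Detection-style quality criteria computed from matched model elements
# (e.g. wall surfaces matched between a reconstructed model and the
# manually created reference).
def completeness(true_pos: int, false_neg: int) -> float:
    """Fraction of reference elements recovered by the method."""
    return true_pos / (true_pos + false_neg)

def correctness(true_pos: int, false_pos: int) -> float:
    """Fraction of detected elements that match the reference."""
    return true_pos / (true_pos + false_pos)

# Hypothetical evaluation: 45 reference walls matched, 5 missed, 3 spurious.
print(completeness(45, 5))   # 0.9
print(correctness(45, 3))    # 0.9375
```

    Reporting both numbers matters: a method can trivially maximize one at the expense of the other, which is why benchmark frameworks score them together.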

  14. Analysis of a molten salt reactor benchmark

    International Nuclear Information System (INIS)

    Ghosh, Biplab; Bajpai, Anil; Degweker, S.B.

    2013-01-01

    This paper discusses results of our studies of an IAEA molten salt reactor (MSR) benchmark. The benchmark, proposed by Japan, involves burnup calculations of a single lattice cell of a MSR for burning plutonium and other minor actinides. We have analyzed this cell with in-house developed burnup codes BURNTRAN and McBURN. This paper also presents a comparison of the results of our codes and those obtained by the proposers of the benchmark. (author)

  15. Benchmarking i eksternt regnskab og revision

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Kiertzner, Lars

    2001-01-01

    ... continuously in a benchmarking process. This chapter broadly examines the extent to which the benchmarking concept can justifiably be linked to external financial reporting and auditing. Section 7.1 deals with the external annual accounts, while section 7.2 takes up the field of auditing. The final section of the chapter summarises...... the considerations on benchmarking in connection with both areas....

  16. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  17. Aerodynamic Benchmarking of the Deepwind Design

    DEFF Research Database (Denmark)

    Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge

    2015-01-01

    The aerodynamic benchmarking for the DeepWind rotor is conducted by comparing different rotor geometries and solutions, keeping the comparison as fair as possible. The objective of the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize...... the blade loading and the cost of energy. Different parameters are considered for the benchmarking study. The DeepWind blade is characterized by a shape similar to the Troposkien geometry but asymmetric between the top and bottom parts: this shape is considered as a fixed parameter in the benchmarking...

  18. HPC Benchmark Suite NMx, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  19. High Energy Physics (HEP) benchmark program

    International Nuclear Information System (INIS)

    Yasu, Yoshiji; Ichii, Shingo; Yashiro, Shigeo; Hirayama, Hideo; Kokufuda, Akihiro; Suzuki, Eishin.

    1993-01-01

    High Energy Physics (HEP) benchmark programs are indispensable tools for selecting a suitable computer for an HEP application system. Industry-standard benchmark programs cannot be used for this particular kind of selection. The CERN and SSC benchmark suites are well-known HEP benchmark programs for this purpose. The CERN suite includes event reconstruction and event generator programs, while the SSC suite includes event generators. In this paper, we find that the results from these two suites are not consistent, and that the result from the industry benchmark does not agree with either of them. We also describe a comparison of benchmark results using the EGS4 Monte Carlo simulation program with those from the two HEP benchmark suites, and find that the EGS4 result is not consistent with either. The industry-standard SPECmark values on various computer systems are not consistent with the EGS4 results either. Because of these inconsistencies, we point out the necessity of a standardization of HEP benchmark suites. An EGS4 benchmark suite should also be developed for users of applications in fields such as medical science, nuclear power plants, nuclear physics and high energy physics. (author)

  20. Establishing benchmarks and metrics for utilization management.

    Science.gov (United States)

    Melanson, Stacy E F

    2014-01-01

    The changing environment of healthcare reimbursement is rapidly leading to a renewed appreciation of the importance of utilization management in the clinical laboratory. The process of benchmarking of laboratory operations is well established for comparing organizational performance to other hospitals (peers) and for trending data over time through internal benchmarks. However, there are relatively few resources available to assist organizations in benchmarking for laboratory utilization management. This article will review the topic of laboratory benchmarking with a focus on the available literature and services to assist in managing physician requests for laboratory testing. © 2013.

  1. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    Prior research documents positive effects of benchmarking information provision on performance and attributes this to social comparisons. However, the effects on professional recipients are unclear. Studies of professional control indicate that professional recipients often resist bureaucratic...... controls because of organizational-professional conflicts. We therefore analyze the association between bureaucratic benchmarking information provision and professional performance and suggest that the association is more positive if prior professional performance was low. We test our hypotheses based...... on archival, publicly disclosed, professional performance data for 191 German orthopedics departments, matched with survey data on bureaucratic benchmarking information given to chief orthopedists by the administration. We find a positive association between bureaucratic benchmarking information provision...

  2. Benchmarking of nuclear economics tools

    International Nuclear Information System (INIS)

    Moore, Megan; Korinny, Andriy; Shropshire, David; Sadhankar, Ramesh

    2017-01-01

    Highlights: • INPRO and GIF economic tools exhibited good alignment in total capital cost estimation. • Subtle discrepancies in the costs result from differences in financing and fuel cycle assumptions. • A common set of assumptions was found to reduce the discrepancies to 1% or less. • Opportunities for harmonisation of economic tools exist. - Abstract: Benchmarking of the economics methodologies developed by the Generation IV International Forum (GIF) and the International Atomic Energy Agency's International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) was performed for three Generation IV nuclear energy systems. The Economic Modeling Working Group of GIF developed an Excel-based spreadsheet package, G4ECONS (Generation 4 Excel-based Calculation Of Nuclear Systems), to calculate the total capital investment cost (TCIC) and the levelised unit energy cost (LUEC). G4ECONS is sufficiently generic in that it can accept the types of projected input, performance and cost data that are expected to become available for Generation IV systems through various development phases, and it can model both open and closed fuel cycles. The Nuclear Energy System Assessment (NESA) Economic Support Tool (NEST) was developed to enable an economic analysis using the INPRO methodology to easily calculate outputs including the TCIC, LUEC and other financial figures of merit, including internal rate of return, return on investment and net present value. NEST is also Excel-based and can be used to evaluate nuclear reactor systems using the open fuel cycle, MOX (mixed oxide) fuel recycling and closed cycles. A Super Critical Water-cooled Reactor system with an open fuel cycle and two Fast Reactor systems, one with a break-even fuel cycle and another with a burner fuel cycle, were selected for the benchmarking exercise. Published data on capital and operating costs were used for the economics analyses using the G4ECONS and NEST tools. Both G4ECONS and
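    The LUEC reported by both tools rests on the standard discounted-cost definition: discounted lifetime costs divided by discounted lifetime energy production. A generic sketch of that textbook formula follows (this is not the G4ECONS or NEST implementation, and all inputs are illustrative):

```python
def levelised_unit_energy_cost(capital, annual_om, annual_fuel,
                               annual_energy_mwh, lifetime_years,
                               discount_rate):
    """Generic LUEC: discounted lifetime costs divided by discounted
    lifetime energy production (currency units per MWh).

    Capital is treated as an overnight cost incurred in year 0;
    O&M, fuel and energy are assumed constant over the lifetime.
    """
    discounted_costs = float(capital)
    discounted_energy = 0.0
    for year in range(1, lifetime_years + 1):
        factor = 1.0 / (1.0 + discount_rate) ** year
        discounted_costs += (annual_om + annual_fuel) * factor
        discounted_energy += annual_energy_mwh * factor
    return discounted_costs / discounted_energy

# Illustrative inputs: a 4.0e9 capital cost, 8.0e7/yr O&M, 2.0e7/yr fuel,
# 8.0e6 MWh/yr over 60 years at a 5 % discount rate.
luec = levelised_unit_energy_cost(4.0e9, 8.0e7, 2.0e7, 8.0e6, 60, 0.05)
```

    With a zero discount rate and no capital the formula collapses to annual cost over annual energy, which is a convenient sanity check when comparing tool outputs.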

  3. FENDL neutronics benchmark: Specifications for the calculational neutronics and shielding benchmark

    International Nuclear Information System (INIS)

    Sawan, M.E.

    1994-12-01

    During the IAEA Advisory Group Meeting on ''Improved Evaluations and Integral Data Testing for FENDL'' held in Garching near Munich, Germany in the period 12-16 September 1994, the Working Group II on ''Experimental and Calculational Benchmarks on Fusion Neutronics for ITER'' recommended that a calculational benchmark representative of the ITER design should be developed. This report describes the neutronics and shielding calculational benchmark available for scientists interested in performing analysis for this benchmark. (author)

  4. Suborbital Reusable Launch Vehicles as an Opportunity to Consolidate and Calibrate Ground Based and Satellite Instruments

    Science.gov (United States)

    Papadopoulos, K.

    2014-12-01

    XCOR Aerospace, a commercial space company, is planning to provide frequent, low-cost access to near-Earth space on the Lynx suborbital Reusable Launch Vehicle (sRLV). Measurements can be made in the external vacuum environment, and Lynx can launch from most runways on short lead time. Lynx can operate as a platform to perform suborbital in situ measurements and remote sensing to supplement models and simulations with new data points. These measurements can serve as a quantitative link to existing instruments and be used as a basis to calibrate detectors on spacecraft. Easier access to suborbital data can improve the longevity and cohesiveness of spacecraft and ground-based resources. A study of how these measurements can be made on the Lynx sRLV will be presented. At the boundary between terrestrial and space weather, measurements from instruments on Lynx can help develop algorithms to optimize the consolidation of ground- and satellite-based data as well as assimilate global models with new data points. For example, current tides and the equatorial electrojet, essential to understanding the Thermosphere-Ionosphere system, can be measured in situ frequently and on short notice. Furthermore, a negative-ion spectrometer and a Faraday cup can take measurements of the D-region ion composition. A differential GPS receiver can infer the spatial gradient of ionospheric electron density. Instruments and optics on spacecraft degrade over time, leading to calibration drift. Lynx can be a cost-effective platform for deploying a reference instrument to calibrate satellites, with a frequent and fast turnaround and a successful return of the instrument. A calibrated reference instrument on Lynx can make observations collocated with those of another instrument, and corrections can then be made to the latter, ensuring data consistency and mission longevity. Aboard a sRLV, atmospheric conditions that distort remotely sensed data (ground- and spacecraft-based) can be measured in situ. Moreover, an

  5. Blowing snow detection in Antarctica, from space borne and ground-based remote sensing

    Science.gov (United States)

    Gossart, A.; Souverijns, N.; Lhermitte, S.; Lenaerts, J.; Gorodetskaya, I.; Schween, J. H.; Van Lipzig, N. P. M.

    2017-12-01

    Surface mass balance (SMB) strongly controls spatial and temporal variations in the Antarctic Ice Sheet (AIS) mass balance and its contribution to sea level rise. Currently, the scarcity of observational data and the challenges of climate modelling over the ice sheet limit our understanding of the processes controlling AIS SMB. In particular, the impact of blowing snow on local SMB is not yet constrained and is subject to large uncertainties. To assess this impact, we investigate the attenuated backscatter profiles from ceilometers at two East Antarctic locations in Dronning Maud Land. Ceilometers are robust ground-based remote sensing instruments that yield information on cloud base height and vertical structure, but also provide information on the particles present in the boundary layer. We developed a new algorithm to detect blowing snow (snow particles lifted by the wind from the surface to substantial height) from the ceilometer attenuated backscatter. The algorithm successfully detects strong blowing snow signals from layers thicker than 15 m at the Princess Elisabeth (PE; 72°S, 23°E) and Neumayer (70°S, 8°W) stations. Applying the algorithm at PE, we retrieve the frequency and annual cycle of blowing snow and discriminate between clear-sky and overcast conditions during blowing snow. We further apply the blowing snow algorithm at PE to evaluate the detection of blowing snow events by satellite imagery (Palm et al., 2011): the near-surface blowing snow layers are apparent in lidar backscatter profiles and enable detection of snowdrift events (spatial and temporal frequency, height and optical depth). These data are processed from CALIPSO at a high resolution (1x1 km digital elevation model). However, the remote sensing detection of blowing snow events by satellite is limited to layers with a minimum thickness of 20-30 m. In addition, thick clouds, mostly occurring during winter storms, can impede drifting snow

  6. Mesoscale ionospheric electrodynamics of omega bands determined from ground-based electromagnetic and satellite optical observations

    Directory of Open Access Journals (Sweden)

    O. Amm

    2005-02-01

    Full Text Available We present ground-based electromagnetic data from the MIRACLE and BEAR networks and satellite optical observations from the UVI and PIXIE instruments on the Polar satellite of an omega band event over Northern Scandinavia on 26 June 1998, which occurred close to the morning side edge of a substorm auroral bulge. Our analysis of the data concentrates on one omega band period from 03:18-03:27 UT, for which we use the method of characteristics combined with an analysis of the UVI and PIXIE data to derive a time series of instantaneous, solely data-based distributions of the mesoscale ionospheric electrodynamic parameters with a 1-min time resolution. In addition, the AMIE method is used to derive global Hall conductance patterns. Our results show that zonally alternating regions of enhanced ionospheric conductance ("tongues", up to ~60 S) and low-conductance regions are associated with the omega bands. The tongues have a poleward extension of ~400 km from their base and a zonal extension of ~380 km. While they are moving coherently eastward with a velocity of ~770 m s⁻¹, the structures are not strictly stationary. The current system of the omega band can be described as a superposition of two parts: one consists of anticlockwise rotating Hall currents around the tongues, along with Pedersen currents with a negative divergence in their centers. The sign of this system is reversed in the low-conductance areas. It causes the characteristic ground magnetic signature. The second part consists of zonally aligned current wedges of westward flowing Hall currents and is mostly magnetically invisible below the ionosphere. This system dominates the field-aligned current (FAC) pattern and causes alternating upward and downward FAC at the flanks of the tongues, with maximum upward FAC of ~25 µA m⁻². The total FAC of ~2 MA is comparable to that diverted inside a westward traveling surge. Throughout the event, the overwhelming part of the FAC is associated with

  8. Tropospheric nitrogen dioxide column retrieval based on ground-based zenith-sky DOAS observations

    Science.gov (United States)

    Tack, F. M.; Hendrick, F.; Pinardi, G.; Fayt, C.; Van Roozendael, M.

    2013-12-01

    A retrieval approach has been developed to derive tropospheric NO2 vertical column amounts from ground-based zenith-sky measurements of scattered sunlight. Zenith radiance spectra are observed in the visible range by the BIRA-IASB Multi-Axis Differential Optical Absorption Spectroscopy (MAX-DOAS) instrument and analyzed by the DOAS technique, based on a least-squares spectral fitting. In recent years, this technique has proven to be a well-suited remote sensing tool for monitoring atmospheric trace gases. The retrieval algorithm is developed and validated based on a two-month dataset acquired from June to July 2009 in the framework of the Cabauw (51.97° N, 4.93° E) Intercomparison campaign for Nitrogen Dioxide measuring Instruments (CINDI). Once fully operational, the retrieval approach can be applied to observations from stations of the Network for the Detection of Atmospheric Composition Change (NDACC). The obtained tropospheric vertical column amounts are compared with the multi-axis retrieval from the BIRA-IASB MAX-DOAS instrument and the retrieval from a zenith-viewing-only SAOZ instrument (Système d'Analyse par Observations Zénithales), owned by the Laboratoire Atmosphères, Milieux, Observations Spatiales (LATMOS). First results show a good agreement for the whole time series with the multi-axis retrieval (R = 0.82; y = 0.88x + 0.30) as well as with the SAOZ retrieval (R = 0.85; y = 0.76x + 0.28). The main error sources arise from the uncertainties in the determination of tropospheric and stratospheric air mass factors, the stratospheric NO2 abundances and the residual amount in the reference spectrum. While zenith-sky measurements have been commonly used over the last decades for stratospheric monitoring, this study also illustrates their suitability for the retrieval of tropospheric column amounts. As long time series of zenith-sky acquisitions are available, the developed approach offers new perspectives with regard to the use of observations from the NDACC
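    The least-squares DOAS fit mentioned above is, at its core, a linear regression: the measured differential optical depth is modelled as scaled absorber cross sections plus a low-order polynomial that absorbs broadband extinction. A generic sketch of that idea follows (not the BIRA-IASB implementation; the grid, cross section and scaling factor are synthetic):

```python
import numpy as np

def doas_fit(grid, optical_depth, cross_sections, poly_order=3):
    """Least-squares DOAS fit on a (normalised) wavelength grid.

    cross_sections maps absorber name -> cross-section array on `grid`;
    returns the fitted scaling factor (slant column) per absorber.
    """
    names = sorted(cross_sections)
    # Design matrix: one column per absorber, plus polynomial terms
    # representing broadband (Rayleigh/Mie) extinction.
    columns = [cross_sections[n] for n in names]
    columns += [grid ** p for p in range(poly_order + 1)]
    design = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(design, optical_depth, rcond=None)
    return dict(zip(names, coeffs[: len(names)]))

# Synthetic check: a wiggly "cross section" scaled by 2.5 plus a linear offset.
grid = np.linspace(-1.0, 1.0, 200)
sigma = np.exp(-10.0 * grid ** 2) * np.sin(20.0 * grid)
depth = 2.5 * sigma + 0.3 + 0.1 * grid
fitted = doas_fit(grid, depth, {"NO2": sigma})
```

    The fitted scaling factor is the slant column density; converting it to a vertical column then requires the air mass factors whose uncertainty the abstract identifies as a main error source.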

  9. Ground-based lidar and microwave radiometry synergy for high vertical resolution absolute humidity profiling

    Science.gov (United States)

    Barrera-Verdejo, María; Crewell, Susanne; Löhnert, Ulrich; Orlandi, Emiliano; Di Girolamo, Paolo

    2016-08-01

    Continuous monitoring of atmospheric humidity profiles is important for many applications, e.g., assessment of atmospheric stability and cloud formation. Nowadays there is a wide variety of ground-based sensors for atmospheric humidity profiling. Unfortunately there is no single instrument able to provide a measurement with complete vertical coverage, high vertical and temporal resolution and good performance under all weather conditions simultaneously. For example, Raman lidar (RL) measurements can provide water vapor with a high vertical resolution, albeit with limited vertical coverage, due to sunlight contamination and the presence of clouds. Microwave radiometers (MWRs) receive water vapor information throughout the troposphere, though their vertical resolution is poor. In this work, we present an MWR and RL system synergy, which aims to overcome the specific sensor limitations. The retrieval algorithm combining these two instruments is an optimal estimation method (OEM), which allows for an uncertainty analysis of the retrieved profiles. The OEM combines measurements and a priori information, taking the uncertainty of both into account. The measurement vector consists of a set of MWR brightness temperatures and RL water vapor profiles. The method is applied to a two-month field campaign around Jülich (Germany), focusing on clear-sky periods. Different experiments are performed to analyze the improvements achieved via the synergy compared to the individual retrievals. When applying the combined retrieval, on average the theoretically determined absolute humidity uncertainty is reduced above the last usable lidar range by a factor of ~2 with respect to the case where only RL measurements are used. The analysis in terms of degrees of freedom per signal reveals that most information is gained above the usable lidar range, which is especially important during daytime when the lidar vertical coverage is limited. The retrieved profiles are further evaluated using
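    In its linear form, the optimal estimation combination described above stacks both instruments' measurements into one vector and updates the a-priori profile, weighting prior and measurement by their covariances. A generic sketch under linear assumptions (not the operational retrieval; the scalar example is illustrative):

```python
import numpy as np

def oem_linear(y, K, x_a, S_a, S_e):
    """Linear optimal estimation: posterior mean and covariance.

    y   : stacked measurement vector (e.g. MWR brightness temperatures
          plus RL water-vapour points)
    K   : linear forward model mapping state to measurement space
    x_a : a-priori state; S_a, S_e : prior / measurement covariances
    """
    S_a_inv = np.linalg.inv(S_a)
    S_e_inv = np.linalg.inv(S_e)
    S_hat = np.linalg.inv(S_a_inv + K.T @ S_e_inv @ K)   # posterior covariance
    x_hat = x_a + S_hat @ K.T @ S_e_inv @ (y - K @ x_a)  # posterior mean
    return x_hat, S_hat

# Scalar sanity check: equally trusted prior (0) and measurement (2)
# meet in the middle, and the posterior variance is halved.
x_hat, S_hat = oem_linear(np.array([2.0]), np.eye(1),
                          np.array([0.0]), np.eye(1), np.eye(1))
```

    The diagonal of the posterior covariance is what yields the per-level uncertainty estimates, and the trace of the averaging kernel gives the degrees of freedom per signal discussed in the abstract.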

  10. Comparing distinct ground-based lightning location networks covering the Netherlands

    Science.gov (United States)

    de Vos, Lotte; Leijnse, Hidde; Schmeits, Maurice; Beekhuis, Hans; Poelman, Dieter; Evers, Läslo; Smets, Pieter

    2015-04-01

    Lightning can be detected using a ground-based sensor network. The Royal Netherlands Meteorological Institute (KNMI) monitors lightning activity in the Netherlands with the so-called FLITS system, a network combining SAFIR-type sensors that makes use of Very High Frequency (VHF) as well as Low Frequency (LF) sensors. KNMI has recently decided to replace FLITS by data from a sub-continental network operated by Météorage which makes use of LF sensors only (KNMI Lightning Detection Network, or KLDN). KLDN is compared to the FLITS system, as well as the Met Office's long-range Arrival Time Difference network (ATDnet), which measures in the Very Low Frequency (VLF) range. Special focus lies on the ability to detect cloud-to-ground (CG) and cloud-to-cloud (CC) lightning in the Netherlands. Relative detection efficiency of individual flashes and lightning activity in a more general sense are calculated over a period of almost 5 years. Additionally, the detection efficiency of each system is compared to a ground truth constructed from flashes that are detected by both of the other datasets. Finally, infrasound data is used as a fourth lightning data source for several case studies. Relative performance is found to vary strongly with location and time. As expected, FLITS detects significantly more CC lightning (because of the strong aptitude of VHF antennas to detect CC), though KLDN and ATDnet detect more CG lightning. We analyze statistics computed over the entire 5-year period, where we look at CG as well as total lightning (CC and CG combined). Statistics that are considered are the Probability of Detection (POD) and the so-called Lightning Activity Detection (LAD). POD is defined as the fraction of reference flashes that the system detects. LAD is defined as the fraction of predefined area boxes and time periods in which the system records one or more flashes, given that the reference detects at least one
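    Both scores defined above reduce to simple set ratios over matched flashes or active space-time boxes; a minimal sketch (the flash IDs and box tuples are illustrative, not from the study):

```python
def probability_of_detection(system_flashes, reference_flashes):
    """POD: fraction of reference flashes that the system also detected."""
    reference = set(reference_flashes)
    return len(reference & set(system_flashes)) / len(reference)

def lightning_activity_detection(system_active_boxes, reference_active_boxes):
    """LAD: of the area/time boxes where the reference saw at least one
    flash, the fraction in which the system also recorded at least one."""
    reference = set(reference_active_boxes)
    return len(reference & set(system_active_boxes)) / len(reference)

# Illustrative flash IDs and (lat-box, lon-box, hour) activity boxes.
pod = probability_of_detection({"f1", "f2", "f4"}, {"f1", "f2", "f3"})
lad = lightning_activity_detection({(0, 0, 12)}, {(0, 0, 12), (1, 0, 12)})
```

    LAD is less sensitive than POD to flash-matching ambiguities, since it only asks whether any activity was seen in a box, which is why both are reported.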

  11. Techniques For Near-Earth Interplanetary Matter Detection And Characterisation From Optical Ground-Based Observatories

    Science.gov (United States)

    Ocaña, Francisco

    2017-05-01

    PhD thesis defended on 5 June 2017, Universidad Complutense de Madrid. This dissertation undertakes the research of interplanetary matter near the Earth using two different observational approaches. The first one is based on the detection of the sunlight reflected by the bodies. The detection and characterisation of this nearby population require networks of medium-sized telescopes to survey and track them. We design a robotic system (the TBT telescopes) for the European Space Agency as a prototype for a future network. The first unit is already installed in Spain and we present the results of the commissioning. Additionally, we evaluate the expected performance of such an instrument using a simulation with a synthetic population. We consider the system designed to be a powerful instrument for nearby asteroid discovery and tracking. It is based on commercial components, and therefore ready for a scalable implementation in a global network. Meanwhile, bodies smaller than asteroids are observed using the atmosphere as a detector. When these particles collide with the atmospheric molecules they are heated, ablated and sublimated, and finally light is emitted by these hot vapours: what we call meteors. We conduct the investigation of these meteors to study the meteoroids. In particular, we address two different topics: on one hand, we explore the size/mass frequency distribution of meteoroids using flux determination when they collide with the atmosphere. We develop a method to determine this flux using video observations of meteors and analyse the properties of meteors as an optical proxy for meteoroids in order to maximise the detection. This yields three ground-based observational solutions that we transform into instrumental designs. First we design and develop a meteor all-sky detection station for Observatorio UCM and use the Draconids 2011 campaign as a showcase for the flux determination, with successful results.
Then we investigate the observation of meteors

  12. Characteristics of greenhouse gas concentrations derived from ground-based FTS spectra at Anmyeondo, South Korea

    Science.gov (United States)

    Oh, Young-Suk; Takele Kenea, S.; Goo, Tae-Young; Chung, Kyu-Sun; Rhee, Jae-Sang; Ou, Mi-Lim; Byun, Young-Hwa; Wennberg, Paul O.; Kiel, Matthäus; DiGangi, Joshua P.; Diskin, Glenn S.; Velazco, Voltaire A.; Griffith, David W. T.

    2018-04-01

    Since the late 1990s, the meteorological observatory established in Anmyeondo (36.5382° N, 126.3311° E, 30 m above mean sea level) has been monitoring several greenhouse gases such as CO2, CH4, N2O, CFCs, and SF6 as a part of the Global Atmosphere Watch (GAW) Program. A high-resolution ground-based (g-b) Fourier transform spectrometer (FTS) was installed at this observation site in 2013 and has been operated within the framework of the Total Carbon Column Observing Network (TCCON) since August 2014. The solar spectra recorded by the g-b FTS cover the spectral range 3800 to 16 000 cm⁻¹ at a resolution of 0.02 cm⁻¹. In this work, the GGG2014 version of the TCCON standard retrieval algorithm was used to retrieve the total column-averaged CO2 and CH4 dry-air mole fractions (XCO2 and XCH4) from the FTS spectra. Spectral bands of CO2 (centered at 6220.0 and 6339.5 cm⁻¹), CH4 (at 6002 cm⁻¹) and O2 (near 7880 cm⁻¹) were used to derive XCO2 and XCH4. In this paper, we provide comparisons of XCO2 and XCH4 between aircraft observations and the g-b FTS over Anmyeondo station. A comparison of 13 coincident observations of XCO2 between the g-b FTS and OCO-2 (Orbiting Carbon Observatory-2) satellite measurements is also presented for the measurement period between February 2014 and November 2017. OCO-2 observations are highly correlated with the g-b FTS measurements (r² = 0.884) and exhibit a small positive bias (0.189 ppm). Both datasets capture seasonal variations of the target species, with maximum and minimum values in spring and late summer, respectively. In the future, it is planned to further utilize the FTS measurements for the evaluation of satellite observations such as the Greenhouse Gases Observing Satellite (GOSAT, GOSAT-2). This is the first report of g-b FTS observations of XCO2 over Anmyeondo station.

  13. Ground-Based Observations of Terrestrial Gamma Ray Flashes Associated with Downward-Directed Lightning Leaders

    Science.gov (United States)

    Belz, J.; Abbasi, R.; Krehbiel, P. R.; LeVon, R.; Remington, J.; Rison, W.; Thomas, R. J.

    2017-12-01

    Terrestrial Gamma Flashes (TGFs) have been observed in satellite-borne gamma ray detectors for several decades, starting with the BATSE instrument on the Compton Gamma-Ray Observatory in 1994. TGFs consist of bursts of upwards of 10¹⁸ primary gamma rays, with a duration of up to a few milliseconds, originating in the Earth's atmosphere. More recent observations have shown that satellite-observed TGFs are generated in upward-propagating negative leaders of intracloud lightning, suggesting that they may be sensitive to the processes responsible for the initial lightning breakdown. Here, we present the first evidence that TGFs are also produced at the beginning of negative cloud-to-ground flashes, and that they may provide a new window through which ground-based observatories may contribute to understanding the breakdown process. The Telescope Array Surface Detector (TASD) is a 700-square-kilometer cosmic ray observatory, an array of 507 scintillators of 3 m² each on a 1.2 km grid. The array is triggered and read out when at least three adjacent detectors observe activity within an 8 μs window. Following the observation of bursts of anomalous TASD triggers, lasting a few hundred microseconds and correlated with local lightning activity, a Lightning Mapping Array (LMA) and slow electric field antenna were installed at the TASD site in order to study the effect. From data obtained between 2014 and 2016, correlated observations were obtained for ten negative cloud-to-ground (-CG) flashes. In 9 out of 10 cases, bursts of up to five anomalous triggers were detected during the first millisecond of the flash, as negative breakdown was descending into lower positive storm charge. The triggers occurred when the LMA-detected VHF radiation sources were at altitudes between 1.5 and 4.5 km AGL. The tenth flash was initiated by an unusually energetic leader that reached the ground in 2.5 ms and produced increasingly powerful triggers down to about 500 m AGL. While the TASD is not optimized for individual gamma ray detection

  14. Validation of the CrIS fast physical NH3 retrieval with ground-based FTIR

    Directory of Open Access Journals (Sweden)

    E. Dammers

    2017-07-01

    Full Text Available Presented here is the validation of the CrIS (Cross-track Infrared Sounder) fast physical NH3 retrieval (CFPR) column and profile measurements using ground-based Fourier transform infrared (FTIR) observations. We use the total columns and profiles from seven FTIR sites in the Network for the Detection of Atmospheric Composition Change (NDACC) to validate the satellite data products. The overall FTIR and CrIS total columns have a positive correlation of r = 0.77 (N = 218) with very little bias (a slope of 1.02). Binning the comparisons by total column amount, for concentrations larger than 1.0 × 10¹⁶ molecules cm⁻², i.e. ranging from moderate to polluted conditions, the relative difference is on average ~0-5 % with a standard deviation of 25-50 %, which is comparable to the estimated retrieval uncertainties in both CrIS and the FTIR. For the smallest total column range (< 1.0 × 10¹⁶ molecules cm⁻²), where there are a large number of observations at or near the CrIS noise level (detection limit), the absolute differences between CrIS and the FTIR total columns show a slight positive column bias. The CrIS and FTIR profile comparison differences are mostly within the range of the single-level retrieved profile values from estimated retrieval uncertainties, showing average differences in the range of ~20 to 40 %. The CrIS retrievals typically show good vertical sensitivity down into the boundary layer, which typically peaks at ~850 hPa (~1.5 km). At this level the median absolute difference is 0.87 (std = ±0.08) ppb, corresponding to a median relative difference of 39 % (std = ±2 %). Most of the absolute and relative profile comparison differences are in the range of the estimated retrieval uncertainties. At the surface, where CrIS typically has lower sensitivity, it tends to overestimate in low-concentration conditions and underestimate

  15. Reliability-centered maintenance for ground-based large optical telescopes and radio antenna arrays

    Science.gov (United States)

    Marchiori, G.; Formentin, F.; Rampini, F.

    2014-07-01

In recent years, EIE GROUP has been increasingly involved in projects for large optical telescopes and radio antenna arrays. In this frame, the paper describes a fundamental aspect of the Logistic Support Analysis (LSA) process: the application of the Reliability-Centered Maintenance (RCM) methodology to the generation of maintenance plans for ground-based large optical telescopes and radio antenna arrays. This helps maintenance engineers ensure that the telescopes continue to work properly, doing what their users require of them in their present operating conditions. The main objective of the RCM process is to establish the complete maintenance regime, with the safe minimum of required maintenance, carried out without any risk to personnel, telescope or subsystems. A correct application of RCM also increases cost-effectiveness, telescope uptime and item availability, and provides a greater understanding of the level of risk that the organization is managing. At the same time, engineers must work from the initial phase of the project to obtain a telescope requiring easy maintenance activities and simple replacement of the major assemblies, taking special care over access design and item location, and over the implementation and design of special lifting equipment and handling devices for the heavy items. This maintenance engineering framework is based on seven points, which lead to the main steps of the RCM program. The initial steps of the RCM process consist of: system selection and data collection (MTBF, MTTR, etc.), definition of system boundaries and operating context, telescope description with the use of functional block diagrams, and the running of a FMECA to address the dominant causes of equipment failure and to lay down the Critical Items List. In the second part of the process the RCM logic is applied, which helps to determine the appropriate maintenance tasks for each identified failure mode. Once

  16. Detection of greenbug infestation on wheat using ground-based radiometry

    Science.gov (United States)

    Yang, Zhiming

Scope of methods of study. The purpose of this greenhouse study was to characterize stress in wheat caused by greenbugs using ground-based radiometry. Experiments were conducted to (a) identify spectral bands and vegetation indices sensitive to greenbug infestation; (b) differentiate stress caused by greenbugs from water stress; (c) examine the impact of plant growth stage on detection of greenbug infestation; and (d) compare infestations due to greenbug and Russian wheat aphid. Wheat (variety TAM 107) was planted (seed spacing 1 in. x 3 in.) in plastic flats with dimensions 24 in. x 16 in. x 8.75 in. Fifteen days after sowing, wheat seedlings were infested with greenbugs (biotype E). Nadir measurement of canopy reflectance started the day after infestation and lasted until most infested plants were dead. Using a 16-band Cropscan radiometer, spectral reflectance data were collected daily (between 13:00-14:00 hours) and 128 vegetation indices were derived, in addition to greenbug counts per tiller. Using SAS PROC MIXED, the sensitivity of bands and vegetation indices was identified based on Threshold Day, the day after which there was a consistent, significant spectral difference between control and infested plants. Sensitivity was further examined using correlation and relative-sensitivity analyses. Findings and conclusions. Results show that it is possible to detect greenbug-induced stress on wheat using hand-held radiometers such as Cropscan. The 694 nm band and the ratio-based vegetation index (RVI) derived from the 694 nm and 800 nm bands were identified as most sensitive to greenbug infestation. Landsat TM bands and their derived vegetation indices also show potential for detecting wheat stress caused by greenbug infestation. RVIs derived from the 694 nm and 800 nm bands were also found useful in differentiating greenbug infestation from water stress. Furthermore, vegetation indices such as Normalized total
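The ratio-based index discussed in this record has a simple closed form. A minimal sketch of RVI and a normalized-difference variant for the 694/800 nm band pair; the reflectance values used below are illustrative, not measurements from the study:

```python
def rvi(r800, r694):
    """Ratio-based vegetation index for the 800/694 nm band pair."""
    return r800 / r694

def ndvi(r800, r694):
    """Normalized-difference form of the same band pair."""
    return (r800 - r694) / (r800 + r694)

# Stressed plants reflect more at 694 nm (weaker chlorophyll
# absorption), so both indices drop as infestation progresses.
healthy = rvi(0.50, 0.05)
stressed = rvi(0.45, 0.15)
```

Tracking such an index daily against a control plot is the core of the Threshold Day analysis described above.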

  17. Characteristics of greenhouse gas concentrations derived from ground-based FTS spectra at Anmyeondo, South Korea

    Directory of Open Access Journals (Sweden)

    Y.-S. Oh

    2018-04-01

Full Text Available Since the late 1990s, the meteorological observatory established in Anmyeondo (36.5382° N, 126.3311° E, 30 m above mean sea level) has been monitoring several greenhouse gases such as CO2, CH4, N2O, CFCs, and SF6 as part of the Global Atmosphere Watch (GAW) Program. A high-resolution ground-based (g-b) Fourier transform spectrometer (FTS) was installed at this observation site in 2013 and has been operated within the framework of the Total Carbon Column Observing Network (TCCON) since August 2014. The solar spectra recorded by the g-b FTS cover the spectral range 3800 to 16 000 cm−1 at a resolution of 0.02 cm−1. In this work, the GGG2014 version of the TCCON standard retrieval algorithm was used to retrieve column-averaged CO2 and CH4 dry-air mole fractions (XCO2 and XCH4) from the FTS spectra. Spectral bands of CO2 (centred at 6220.0 and 6339.5 cm−1), CH4 (at 6002 cm−1), and O2 (near 7880 cm−1) were used to derive XCO2 and XCH4. In this paper, we provide comparisons of XCO2 and XCH4 between aircraft observations and the g-b FTS over the Anmyeondo station. A comparison of 13 coincident observations of XCO2 between the g-b FTS and OCO-2 (Orbiting Carbon Observatory-2) satellite measurements is also presented for the measurement period between February 2014 and November 2017. OCO-2 observations are highly correlated with the g-b FTS measurements (r² = 0.884) and exhibit a small positive bias (0.189 ppm). Both data sets capture the seasonal variations of the target species, with maximum and minimum values in spring and late summer, respectively. In the future, it is planned to further utilize the FTS measurements for the evaluation of satellite observations such as the Greenhouse Gases Observing Satellite (GOSAT and GOSAT-2). This is the first report of g-b FTS observations of XCO2 over the Anmyeondo station.

  18. Characterization of large instabilities displacements using Ground-Based InSAR

    Science.gov (United States)

    Rouyet, L.; Kristensen, L.; Derron, M.-H.; Michoud, C.; Blikra, L. H.; Jaboyedoff, M.

    2012-04-01

A master's thesis in progress at the University of Lausanne (IGAR), in cooperation with the Åknes/Tafjord Early Warning Centre in Norway, aims to characterize the displacements of various instabilities using a Ground-Based Interferometric Synthetic Aperture Radar (GB-InSAR) system. The main goal is to evaluate the potential of GB-InSAR to determine the displacement velocities and mechanical behaviours of several large rock instabilities in Norway. GB-InSAR data are processed and interpreted for three case studies. The first test site is the unstable complex area of Mannen, located in the Romsdalen valley (Møre og Romsdal county), which threatens infrastructure and could potentially cause a debacle event downstream. Its total volume is estimated at 15-25 million m3. The Mannen instability has been monitored permanently with GB-InSAR since February 2010 and shows displacements towards the radar of up to -8 mm per month during the most sensitive period. The Børa area, located on the southwest side of the Mannen instability, also shows some signs of activity; it was monitored temporarily between August and October 2011 and could help in understanding the behaviour of the Mannen site. The second case, the Indre Nordnes rockslide in Lyngenfjord (Troms county), lies directly above an important fjord in northern Norway, posing a significant risk of tsunami. Its volume is estimated to be around 10-15 million m3. The site was monitored temporarily between June and August 2011; the data show displacements towards the radar of up to -12 mm in 2 weeks. The third case concerns rock falls along the road between Oppdølsstranda and Sunndalsøra (Møre og Romsdal county). Even if the rock volumes are smaller than in the first two cases, rock falls are a serious problem for road 70 underneath. Several campaigns were carried out between the beginning of 2010 and the end of 2011. In June 2011 a large rock fall occurred in an area where significant movements had previously been detected by GB-InSAR. In order to understand the behaviour of these

  19. Human factors reliability Benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-06-01

    The Joint Research Centre of the European Commission has organized a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organized around two study cases: (1) analysis of routine functional Test and Maintenance (T and M) procedures: with the aim of assessing the probability of test induced failures, the probability of failures to remain unrevealed and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient: with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report contains the final summary reports produced by the participants in the exercise

  20. Experimental and computational benchmark tests

    International Nuclear Information System (INIS)

    Gilliam, D.M.; Briesmeister, J.F.

    1994-01-01

A program involving principally NIST, LANL, and ORNL has been in progress for about four years now to establish a series of benchmark measurements and calculations related to the moderation and leakage of 252Cf neutrons from a source surrounded by spherical aqueous moderators of various thicknesses and compositions. The motivation for these studies comes from problems in criticality calculations concerning arrays of multiplying components, where the leakage from one component acts as a source for the other components. This talk compares experimental and calculated values for the fission rates of four nuclides - 235U, 239Pu, 238U, and 237Np - in the leakage spectrum from moderator spheres of diameters 76.2 mm, 101.6 mm, and 127.0 mm, with either pure water or enriched B-10 solutions as the moderator. Very detailed Monte Carlo calculations were done with the MCNP code, using a "light water" S(α,β) scattering kernel

  1. ENVIRONMENTAL BENCHMARKING FOR LOCAL AUTHORITIES

    Directory of Open Access Journals (Sweden)

    Marinela GHEREŞ

    2010-01-01

Full Text Available This paper is an attempt to clarify and present the many definitions of benchmarking. It also attempts to explain the basic steps of benchmarking, to show how this tool can be applied by local authorities, as well as to discuss its potential benefits and limitations. It is our strong belief that if cities use indicators and progressively introduce targets to improve management and related urban life quality, and to measure progress towards more sustainable development, we will also create a new type of competition among cities and foster innovation. This is seen to be important because local authorities' actions play a vital role in responding to the challenges of enhancing the state of the environment, not only in policy-making, but also in the provision of services and in the planning process. Local communities therefore need to be aware of their own sustainability performance levels and should be able to engage in exchange of best practices to respond effectively to the eco-economical challenges of the century.

  2. Benchmark results in radiative transfer

    International Nuclear Information System (INIS)

    Garcia, R.D.M.; Siewert, C.E.

    1986-02-01

Several aspects of the FN method are reported, and the method is used to solve accurately some benchmark problems in radiative transfer in the field of atmospheric physics. The method was modified to solve cases of pure scattering, and an improved process was developed for computing the radiation intensity. An algorithm for computing several quantities used in the FN method was developed. An improved scheme to evaluate certain integrals relevant to the method is given, along with a two-term recursion relation that has proved useful for the numerical evaluation of matrix elements basic to the method. The methods used to solve the resulting linear algebraic equations are discussed, and the numerical results are evaluated. (M.C.K.) [pt

  3. Validation of neutron-transport calculations in benchmark facilities for improved damage-fluence predictions

    International Nuclear Information System (INIS)

    Williams, M.L.; Stallmann, F.W.; Maerker, R.E.; Kam, F.B.K.

    1983-01-01

    An accurate determination of damage fluence accumulated by reactor pressure vessels (RPV) as a function of time is essential in order to evaluate the vessel integrity for both pressurized thermal shock (PTS) transients and end-of-life considerations. The desired accuracy for neutron exposure parameters such as displacements per atom or fluence (E > 1 MeV) is of the order of 20 to 30%. However, these types of accuracies can only be obtained realistically by validation of nuclear data and calculational methods in benchmark facilities. The purposes of this paper are to review the needs and requirements for benchmark experiments, to discuss the status of current benchmark experiments, to summarize results and conclusions obtained so far, and to suggest areas where further benchmarking is needed

  4. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: improve the management and technical development of software-intensive systems; have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the top NASA software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R&D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area, including: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with those of the external organizations in most benchmark areas, but in every topic there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  5. First retrievals of methane isotopologues from FTIR ground-based observations

    Science.gov (United States)

    Bader, Whitney; Strong, Kimberly; Walker, Kaley; Buzan, Eric

    2017-04-01

Atmospheric methane concentrations have reached a new high at 1845 ± 2 ppb, accounting for an increase of 256 % since pre-industrial times (WMO, 2016). In the last ten years, methane has been on the rise again at rates of ~0.3 %/year (e.g., Bader et al., 2016), after a stabilization period of about 5 years. This recent increase is not fully understood owing to remaining uncertainties in the methane budget, which is influenced by numerous anthropogenic and natural emission sources. In order to examine the cause(s) of this increase, we focus on the two main methane isotopologues, CH3D and 13CH4. Both CH3D and 13CH4 are emitted into the atmosphere in different ratios depending on the emission processes involved. As heavier isotopologues react more slowly than 12CH4, each isotopologue is depleted from the atmosphere at a specific rate depending on the removal process. Methane isotopologues are therefore good tracers of the methane budget. In this contribution, the first development and optimization of the retrieval strategy for CH3D, as well as preliminary tests for 13CH4, will be presented and discussed, using FTIR (Fourier transform infrared) solar spectra collected at the Eureka (80.05° N, −86.42° E, 610 m a.s.l.) and Toronto (43.66° N, −79.4° E, 174 m a.s.l.) ground-based sites. Mixing ratio vertical profiles from a Whole Atmosphere Community Climate Model (WACCM v.4, Marsh et al., 2013) simulation developed by Buzan et al. (2016) are used as a priori inputs. The type of regularization constraints used for the retrievals will be discussed, along with an evaluation of the available spectroscopy (primarily the different editions of the HITRAN database; see Rothman et al., 2013 and references therein). The uncertainties affecting the retrieved columns and an information content evaluation will be discussed in order to assess the best strategy, based on the altitude sensitivity range and the complete error budget. Acknowledgments
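Source attribution with isotopologues conventionally works through delta notation relative to a standard ratio. A hedged background sketch (not the record's retrieval strategy), assuming the commonly cited VPDB 13C/12C reference value:

```python
R_VPDB = 0.0112372  # commonly cited VPDB 13C/12C reference ratio

def delta13C(r_sample, r_std=R_VPDB):
    """Delta notation in per mil: biogenic CH4 sources are strongly
    depleted (more negative delta) relative to thermogenic sources."""
    return (r_sample / r_std - 1.0) * 1000.0
```

A sample whose 13C/12C ratio is 4.7 % below the standard, for example, has a delta of about −47 per mil, a value typical of atmospheric CH4.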

  6. Preliminary Assessment of Detection Efficiency for the Geostationary Lightning Mapper Using Intercomparisons with Ground-Based Systems

    Science.gov (United States)

    Bateman, Monte; Mach, Douglas; Blakeslee, Richard J.; Koshak, William

    2018-01-01

    As part of the calibration/validation (cal/val) effort for the Geostationary Lightning Mapper (GLM) on GOES-16, we need to assess instrument performance (detection efficiency and accuracy). One major effort is to calculate the detection efficiency of GLM by comparing to multiple ground-based systems. These comparisons will be done pair-wise between GLM and each other source. A complication in this process is that the ground-based systems sense different properties of the lightning signal than does GLM (e.g., RF vs. optical). Also, each system has a different time and space resolution and accuracy. Preliminary results indicate that GLM is performing at or above its specification.
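A pair-wise detection-efficiency comparison of the kind described reduces to matching events within time and space windows. A minimal sketch with hypothetical window sizes and (time s, lat deg, lon deg) event tuples; the actual cal/val matching criteria are not specified in the record:

```python
def detection_efficiency(glm_events, ref_events, dt=0.5, dxy=0.2):
    """Fraction of reference-network events matched by at least one GLM
    event within a time window dt (s) and a lat/lon box dxy (deg)."""
    matched = sum(
        1 for t, la, lo in ref_events
        if any(abs(t - tg) <= dt and abs(la - lg) <= dxy and abs(lo - lg2) <= dxy
               for tg, lg, lg2 in glm_events)
    )
    return matched / len(ref_events)
```

Because GLM and the ground networks sense different signals (optical vs. RF) with different resolutions, the window sizes themselves materially affect the resulting efficiency estimate.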

  7. Satellite and ground-based sensors for the Urban Heat Island analysis in the city of Rome

    DEFF Research Database (Denmark)

    Fabrizi, Roberto; Bonafoni, Stefania; Biondi, Riccardo

    2010-01-01

In this work, the trend of the Urban Heat Island (UHI) of Rome is analyzed by both ground-based weather stations and a satellite-based infrared sensor. First, we have developed a suitable algorithm employing satellite brightness temperatures for the estimation of the air temperature belonging...... and nighttime scenes taken between 2003 and 2006 have been processed. Analysis of the Canopy Layer Heat Island (CLHI) during summer months reveals a mean growth in magnitude of 3-4 K during nighttime and a negative or almost zero CLHI intensity during daytime, confirmed by the weather stations. © 2010...... by the authors; licensee MDPI, Basel, Switzerland. Keywords: Thermal pollution, Summer months, Advanced along-track scanning radiometers, Urban heat island, Remote sensing, Canopy layer, Atmospheric temperature, Ground-based sensors, Weather information services, Satellite remote sensing, Infra-red sensor, Weather stations...

  8. TANGOO: A ground-based tilting-filter spectrometer for deriving the temperature in the mesopause region

    Science.gov (United States)

    Wildner, S.; Bittner, M.

    2009-04-01

TANGOO (Tilting-filter spectrometer for Atmospheric Nocturnal Ground-based Oxygen & hydrOxyl emission measurements) is a passive, ground-based optical instrument for simultaneous, automated long-term monitoring of OH(6-2) and O2 atmospheric band (0-1) emissions (called "airglow"), yielding rotational temperatures at about 87 and 95 km, respectively. TANGOO, a transportable and comparatively easy-to-use instrument, is an enhancement of the Argentine Airglow Spectrometer (Scheer, 1987), with significant improvements in temporal resolution and throughput. It will be located at the German Environmental Research Station "Schneefernerhaus" on the Zugspitze (47.4° N, 11° E) and will start measurements in 2009. The objectives of TANGOO cover the analysis of dynamical processes such as gravity waves as well as the identification of climate signals. The observation method will be presented.
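Rotational temperatures are conventionally derived from band line intensities via a Boltzmann fit: ln(I/A) is linear in the upper-state energy with slope −1/(kT). A generic sketch of that fit, not TANGOO's actual processing; the line parameters are illustrative, and statistical weights are assumed folded into the transition probability A:

```python
import math

K_CM = 0.6950356  # Boltzmann constant in cm^-1 per kelvin

def rotational_temperature(lines):
    """Least-squares Boltzmann fit over lines given as tuples
    (E_upper in cm^-1, transition probability A, intensity I):
    ln(I/A) is linear in E_upper with slope -1/(k T)."""
    xs = [e for e, a, i in lines]
    ys = [math.log(i / a) for e, a, i in lines]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -1.0 / (K_CM * slope)
```

Fitting several rotational lines of one band this way is what turns a tilting-filter intensity scan into a temperature near the emission altitude.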

  9. The role of benchmarking for yardstick competition

    International Nuclear Information System (INIS)

    Burns, Phil; Jenkins, Cloda; Riechmann, Christoph

    2005-01-01

With the increasing interest in yardstick regulation, there is a need to understand the most appropriate method for realigning tariffs at the outset. Benchmarking is the tool used for such realignment and is therefore a necessary first step in the implementation of yardstick competition. A number of concerns have been raised about the application of benchmarking, making some practitioners reluctant to move towards yardstick-based regimes. We assess five of the key concerns often discussed and find that, in general, these are not as great as perceived. The assessment is based on economic principles and experience with applying benchmarking to regulated sectors, e.g. the electricity and water industries in the UK, the Netherlands, Austria and Germany in recent years. The aim is to demonstrate that clarity on the role of benchmarking reduces the concern about its application in different regulatory regimes. We find that benchmarking can be used in regulatory settlements, although the range of benchmarking approaches appropriate to any individual regulatory question will be small. Benchmarking is feasible, as total cost measures and environmental factors are better defined in practice than is commonly appreciated, and collusion is unlikely to occur in environments with more than 2 or 3 firms (where shareholders have a role in monitoring and rewarding performance). Furthermore, any concern about companies under-recovering costs is a matter to be determined through the regulatory settlement and does not affect the case for using benchmarking as part of that settlement. (author)

  10. Benchmarking set for domestic smart grid management

    NARCIS (Netherlands)

    Bosman, M.G.C.; Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria

    2010-01-01

    In this paper we propose a benchmark for domestic smart grid management. It consists of an in-depth description of a domestic smart grid, in which local energy consumers, producers and buffers can be controlled. First, from this description a general benchmark framework is derived, which can be used

  11. Medical school benchmarking - from tools to programmes.

    Science.gov (United States)

    Wilkinson, Tim J; Hudson, Judith N; Mccoll, Geoffrey J; Hu, Wendy C Y; Jolly, Brian C; Schuwirth, Lambert W T

    2015-02-01

    Benchmarking among medical schools is essential, but may result in unwanted effects. To apply a conceptual framework to selected benchmarking activities of medical schools. We present an analogy between the effects of assessment on student learning and the effects of benchmarking on medical school educational activities. A framework by which benchmarking can be evaluated was developed and applied to key current benchmarking activities in Australia and New Zealand. The analogy generated a conceptual framework that tested five questions to be considered in relation to benchmarking: what is the purpose? what are the attributes of value? what are the best tools to assess the attributes of value? what happens to the results? and, what is the likely "institutional impact" of the results? If the activities were compared against a blueprint of desirable medical graduate outcomes, notable omissions would emerge. Medical schools should benchmark their performance on a range of educational activities to ensure quality improvement and to assure stakeholders that standards are being met. Although benchmarking potentially has positive benefits, it could also result in perverse incentives with unforeseen and detrimental effects on learning if it is undertaken using only a few selected assessment tools.

  12. Benchmarking in digital circuit design automation

    NARCIS (Netherlands)

    Jozwiak, L.; Gawlowski, D.M.; Slusarczyk, A.S.

    2008-01-01

    This paper focuses on benchmarking, which is the main experimental approach to the design method and EDA-tool analysis, characterization and evaluation. We discuss the importance and difficulties of benchmarking, as well as the recent research effort related to it. To resolve several serious

  13. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price

  14. Repeated Results Analysis for Middleware Regression Benchmarking

    Czech Academy of Sciences Publication Activity Database

    Bulej, Lubomír; Kalibera, T.; Tůma, P.

    2005-01-01

    Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005

  15. Benchmarking the energy efficiency of commercial buildings

    International Nuclear Information System (INIS)

    Chung, William; Hui, Y.V.; Lam, Y. Miu

    2006-01-01

Benchmarking energy efficiency is an important tool to promote the efficient use of energy in commercial buildings. Benchmarking models are mostly constructed as a simple benchmark table (percentile table) of energy use, normalized by floor area and temperature. This paper describes a benchmarking process for energy efficiency by means of multiple regression analysis, in which the relationship between energy-use intensities (EUIs) and the explanatory factors (e.g., operating hours) is developed. Using the resulting regression model, the EUIs are then normalized by removing the effect of deviations in the significant explanatory factors. The empirical cumulative distribution of the normalized EUI gives a benchmark table (or percentile table of EUI) for benchmarking an observed EUI. The advantage of this approach is that the benchmark table represents a normalized distribution of EUI, taking into account all the significant explanatory factors that affect energy consumption. An application to supermarkets is presented to illustrate the development and use of the benchmarking method
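The normalization step described in this record can be sketched with a one-factor regression: fit EUI against an explanatory factor, remove the modelled effect of that factor, then rank against the empirical distribution. All names and numbers here are illustrative, not from the paper:

```python
import statistics

def fit_slope_intercept(x, y):
    """Ordinary least squares for one explanatory factor (e.g. weekly
    operating hours) against energy-use intensity (EUI)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b  # intercept, slope

def normalized_eui(eui, hours, slope, ref_hours):
    """Remove the modelled effect of the factor, re-expressing the EUI
    at a common reference level (the paper's normalization step)."""
    return eui - slope * (hours - ref_hours)

def percentile_rank(value, sample):
    """Position of a value in the empirical cumulative distribution,
    i.e. one row of the benchmark (percentile) table."""
    return 100.0 * sum(1 for s in sample if s <= value) / len(sample)
```

The paper uses multiple regression over several factors; this single-factor version shows the shape of the calculation.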

  16. Benchmarking, Total Quality Management, and Libraries.

    Science.gov (United States)

    Shaughnessy, Thomas W.

    1993-01-01

    Discussion of the use of Total Quality Management (TQM) in higher education and academic libraries focuses on the identification, collection, and use of reliable data. Methods for measuring quality, including benchmarking, are described; performance measures are considered; and benchmarking techniques are examined. (11 references) (MES)

  17. Ground-based simulation of telepresence for materials science experiments. [remote viewing and control of processes aboard Space Station

    Science.gov (United States)

Johnston, James C.; Rosenthal, Bruce N.; Bonner, Mary Jo; Hahn, Richard C.; Herbach, Bruce

    1989-01-01

A series of ground-based telepresence experiments has been performed to determine the minimum video frame rate and resolution required for the successive performance of materials science experiments in space. The approach used is to simulate transmission between Earth and the space station with transmission between laboratories on Earth. The experiments include isothermal dendrite growth, physical vapor transport, and glass melting. Modifications of existing apparatus, software developed, and the establishment of an in-house network are reviewed.

  18. Ground-Based Midcourse Defense (GMD) Initial Defensive Operations Capability (IDOC) at Vandenberg Air Force Base Environmental Assessment

    Science.gov (United States)

    2003-08-28

    Zielinski , EDAW, Inc., concerning utilities supply and demand for Vandenberg Air Force Base, 1 August. Rush, P., 2002. Personal communication between...Pernell W. Rush, Technical Sergeant, Water Utilities/Water Treatment NCO, USAF 30th CES/CEOIU, Vandenberg Air Force Base, and James E. Zielinski ... Dave Savinsky, Environmental Consultant, 30 CES/CEVC, Vandenberg Air Force Base, on the Preliminary Draft Ground-Based Midcourse Defense (GMD

  19. A comparison of ground-based hydroxyl airglow temperatures with SABER/TIMED measurements over 23° N, India

    Science.gov (United States)

    Parihar, Navin; Singh, Dupinder; Gurubaran, Subramanian

    2017-03-01

Ground-based observations of OH (6, 2) Meinel band nightglow were carried out at Ranchi (23.3° N, 85.3° E), India, during January-March 2011, December 2011-May 2012 and December 2012-March 2013 using an all-sky imaging system. Near the mesopause, OH temperatures were derived from the OH (6, 2) Meinel band intensity information. A limited comparison of OH temperatures (TOH) with SABER/TIMED measurements in 30 cases was performed by defining a near-coincidence criterion of ±1.5° latitude-longitude and ±3 min relative to the ground-based observations. Using the SABER OH 1.6 and 2.0 µm volume emission rate profiles as the weighting functions, two sets of OH-equivalent temperatures (T1.6 and T2.0, respectively) were estimated from its kinetic temperature profile for comparison with the OH nightglow measurements. Overall, fair agreement existed between the ground-based and SABER measurements in the majority of events within the limits of experimental error. The mean values of the OH-derived and SABER OH-equivalent temperatures were 197.3 ± 4.6, 192.0 ± 10.8 and 192.7 ± 10.3 K, respectively; the ground-based temperatures were 4-5 K warmer than the SABER values. A difference of 8 K or more is noted between the two measurements when the peak of the OH emission layer lies in the vicinity of large temperature inversions. A comparison of OH temperatures derived using different sets of Einstein transition probabilities with SABER measurements was also performed; OH temperatures derived using the Langhoff et al. (1986) transition probabilities were found to compare well.
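The OH-equivalent temperatures described in this record are weighted averages of the SABER kinetic-temperature profile, with the OH volume emission rate as the weighting function. A minimal sketch of that collapse; the profile values below are illustrative:

```python
def oh_equivalent_temperature(t_profile, ver_profile):
    """Collapse a kinetic-temperature profile to one OH-equivalent
    temperature, weighting each level by the OH volume emission rate."""
    return sum(t * v for t, v in zip(t_profile, ver_profile)) / sum(ver_profile)
```

Because the weighting concentrates near the OH layer peak (~87 km), a temperature inversion near that altitude can shift the equivalent temperature by several kelvin, consistent with the 8 K differences the record reports.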

  20. An In Depth Look at Lightning Trends in Hurricane Harvey using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Ringhausen, J.

    2017-12-01

    This research combines satellite measurements of lightning in Hurricane Harvey with ground-based lightning measurements to get a better sense of the total lightning occurring in the hurricane, both intra-cloud (IC) and cloud-to-ground (CG), and how it relates to the intensification and weakening of the tropical system. Past studies have looked at lightning trends in hurricanes using the space-based Lightning Imaging Sensor (LIS) or ground-based lightning detection networks. However, both of these methods have drawbacks. For instance, LIS was in low Earth orbit, which limited lightning observations to 90 seconds for a particular point on the ground; hence, continuous lightning coverage of a hurricane was not possible. Ground-based networks can have a decreased detection efficiency, particularly for ICs, over oceans where hurricanes generally intensify. With the launch of the Geostationary Lightning Mapper (GLM) on the GOES-16 satellite, researchers can study total lightning continuously over the lifetime of a tropical cyclone. This study utilizes GLM to investigate total lightning activity in Hurricane Harvey temporally; this is augmented with spatial analysis relative to hurricane structure, similar to previous studies. Further, GLM and ground-based network data are combined using Bayesian techniques in a new manner to leverage the strengths of each detection method. This methodology 1) provides a more complete estimate of lightning activity and 2) enables the derivation of the IC:CG ratio (Z-ratio) throughout the time period of the study. In particular, details of the evolution of the Z-ratio in time and space are presented. In addition, lightning stroke spatiotemporal trends are compared to lightning flash trends. This research represents a new application of lightning data that can be used in future studies of tropical cyclone intensification and weakening.
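    The Bayesian combination used in the study is not spelled out in the abstract. As a toy stand-in, once flashes are jointly classified, a Beta-binomial posterior on the IC fraction yields a Z-ratio estimate; the counts and the flat prior below are hypothetical, not Harvey data:

    ```python
    def z_ratio_posterior_mean(n_ic, n_cg, prior_a=1.0, prior_b=1.0):
        """Posterior-mean IC fraction under a Beta(prior_a, prior_b)
        prior with binomial counts, converted to the Z-ratio IC:CG."""
        p_ic = (prior_a + n_ic) / (prior_a + prior_b + n_ic + n_cg)
        return p_ic / (1.0 - p_ic)

    # 180 intra-cloud vs. 60 cloud-to-ground flashes (hypothetical counts
    # for one analysis window).
    z = z_ratio_posterior_mean(180, 60)
    ```

    With a flat prior the estimate is close to the raw count ratio 180/60 = 3; the prior matters more in sparse windows, which is one motivation for a Bayesian treatment.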

  1. Monitoring Strategies of Earth Dams by Ground-Based Radar Interferometry: How to Extract Useful Information for Seismic Risk Assessment.

    Science.gov (United States)

    Di Pasquale, Andrea; Nico, Giovanni; Pitullo, Alfredo; Prezioso, Giuseppina

    2018-01-16

    The aim of this paper is to describe how ground-based radar interferometry can provide displacement measurements of earth dam surfaces and of the vibration frequencies of their main concrete infrastructures. Many dams were built decades ago and were not equipped with in situ sensors embedded in the structure at construction time. Earth dams have scattering properties similar to landslides, for which the Ground-Based Synthetic Aperture Radar (GBSAR) technique has so far been extensively applied to study ground displacements. In this work, SAR and Real Aperture Radar (RAR) configurations are used for the measurement of earth dam surface displacements and of the vibration frequencies of concrete structures, respectively. A methodology for the acquisition of SAR data and the rendering of results is described. The geometrical correction factor, needed to transform the Line-of-Sight (LoS) displacement measurements of GBSAR into an estimate of the horizontal displacement vector of the dam surface, is derived. Furthermore, a methodology for the acquisition of RAR data and the representation of displacement temporal profiles and vibration frequency spectra of dam concrete structures is presented. For this study a Ku-band ground-based radar, equipped with horn antennas having different radiation patterns, has been used. Four case studies, using different radar acquisition strategies specifically developed for the monitoring of earth dams, are examined. The results of this work show the information that a Ku-band ground-based radar can provide to structural engineers for a non-destructive seismic assessment of earth dams.
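    The paper derives a specific geometrical correction factor; the generic projection it rests on, dividing the LoS displacement by the projection of the unit LoS vector onto the assumed horizontal motion direction, can be sketched as follows (the function name and angle conventions are assumptions, not the paper's exact derivation):

    ```python
    import math

    def los_to_horizontal(d_los_mm, incidence_deg, azimuth_offset_deg):
        """Convert a line-of-sight displacement into a horizontal
        displacement, assuming purely horizontal motion.
        incidence_deg: angle of the LoS from the vertical;
        azimuth_offset_deg: angle between the LoS ground projection and
        the assumed motion direction."""
        c = (math.sin(math.radians(incidence_deg))
             * math.cos(math.radians(azimuth_offset_deg)))
        if abs(c) < 0.1:
            # Near-perpendicular geometry: the radar is almost blind to
            # horizontal motion and the correction factor blows up.
            raise ValueError("geometry too insensitive to horizontal motion")
        return d_los_mm / c

    # 5 mm along the LoS at 70 deg incidence, 20 deg off the motion axis.
    d_h = los_to_horizontal(5.0, 70.0, 20.0)
    ```

    The guard against small projections reflects a real limitation: when the LoS is nearly orthogonal to the motion, no correction factor can recover the horizontal component reliably.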

  2. Spatio-temporal monitoring of cotton cultivation using ground-based and airborne multispectral sensors in GIS environment.

    Science.gov (United States)

    Papadopoulos, Antonis; Kalivas, Dionissios; Theocharopoulos, Sid

    2017-07-01

    Multispectral sensors' capability of capturing reflectance data at several spectral channels, together with the inherent reflectance responses of various soils and especially plant surfaces, has gained major interest in crop production. In the present study, two multispectral sensing systems, one ground-based and one aerial, were applied for the multispatial and temporal monitoring of two cotton fields in central Greece. The ground-based system was a Crop Circle ACS-430, while the aerial system consisted of a consumer-level quadcopter (Phantom 2) and a modified Hero3+ Black digital camera. The purpose of the research was to monitor crop growth with the two systems and investigate possible interrelations between the normalized difference vegetation index (NDVI) values derived from each. Five data collection campaigns were conducted during the cultivation period, which involved scanning soil and plants with the ground-based sensor and taking aerial photographs of the fields with the unmanned aerial system. According to the results, both systems successfully monitored the cotton growth stages in space and time. The mean NDVI values retrieved by the ground-based system changed through time following a second-order polynomial (R² = 0.96 in Field 1 and 0.99 in Field 2). Further, they were highly correlated (r = 0.90 in Field 1 and 0.74 in Field 2) with the corresponding values calculated via the aerial-based system. The unmanned aerial system (UAS) can potentially substitute crop scouting, as it offers a time-effective, non-destructive and reliable way of soil and plant monitoring.
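    The NDVI and the second-order polynomial fit of its temporal means can be sketched as follows; the campaign dates and NDVI values below are invented for illustration, not the study's measurements:

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized difference vegetation index from near-infrared
        and red reflectance."""
        nir, red = np.asarray(nir, float), np.asarray(red, float)
        return (nir - red) / (nir + red)

    # Hypothetical mean NDVI per campaign (day of season) for one field.
    days = np.array([20, 45, 70, 95, 120], float)
    mean_ndvi = np.array([0.22, 0.48, 0.71, 0.69, 0.45])

    # Second-order polynomial fit of mean NDVI against time, as in the
    # study, plus the coefficient of determination R^2.
    coeffs = np.polyfit(days, mean_ndvi, 2)
    fitted = np.polyval(coeffs, days)
    ss_res = np.sum((mean_ndvi - fitted) ** 2)
    ss_tot = np.sum((mean_ndvi - mean_ndvi.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    ```

    A concave quadratic (negative leading coefficient) captures the green-up, peak, and senescence phases of the crop in a single curve, which is why it models seasonal NDVI trajectories well.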

  3. A Seafloor Benchmark for 3-dimensional Geodesy

    Science.gov (United States)

    Chadwell, C. D.; Webb, S. C.; Nooner, S. L.

    2014-12-01

    We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone

  4. SP2Bench: A SPARQL Performance Benchmark

    Science.gov (United States)

    Schmidt, Michael; Hornung, Thomas; Meier, Michael; Pinkel, Christoph; Lausen, Georg

    A meaningful analysis and comparison of both existing storage schemes for RDF data and evaluation approaches for SPARQL queries necessitates a comprehensive and universal benchmark platform. We present SP2Bench, a publicly available, language-specific performance benchmark for the SPARQL query language. SP2Bench is settled in the DBLP scenario and comprises a data generator for creating arbitrarily large DBLP-like documents and a set of carefully designed benchmark queries. The generated documents mirror vital key characteristics and social-world distributions encountered in the original DBLP data set, while the queries implement meaningful requests on top of this data, covering a variety of SPARQL operator constellations and RDF access patterns. In this chapter, we discuss requirements and desiderata for SPARQL benchmarks and present the SP2Bench framework, including its data generator, benchmark queries and performance metrics.

  5. Benchmarking of refinery emissions performance : Executive summary

    International Nuclear Information System (INIS)

    2003-07-01

    This study was undertaken to collect emissions performance data for Canadian and comparable American refineries. The objective was to examine parameters that affect refinery air emissions performance and develop methods or correlations to normalize emissions performance. Another objective was to correlate and compare the performance of Canadian refineries to comparable American refineries. For the purpose of this study, benchmarking involved the determination of levels of emission performance that are being achieved for generic groups of facilities. A total of 20 facilities were included in the benchmarking analysis, and 74 American refinery emission correlations were developed. The recommended benchmarks, and the application of those correlations for comparison between Canadian and American refinery performance, were discussed. The benchmarks were: sulfur oxides, nitrogen oxides, carbon monoxide, particulate, volatile organic compounds, ammonia and benzene. For each refinery in Canada, benchmark emissions were developed. Several factors can explain differences in Canadian and American refinery emission performance. 4 tabs., 7 figs

  6. White Paper on the Status and Future of Ground-based Gamma-Ray Astronomy - Extragalactic Science Working Group

    Science.gov (United States)

    Krawczynski, H.; Coppi, P.; Dermer, C.; Dwek, E.; Georganopoulos, M.; Horan, D.; Jones, T.; Krennrich, F.; Mukherjee, R.; Perlman, E.; Vassiliev, V.

    2007-04-01

    In fall 2006, the Division of Astrophysics of the American Physical Society requested a white paper about the status and future of ground-based gamma-ray astronomy. The white paper will largely be written in the year 2007. Interested scientists are invited to join the science working groups. In this contribution, we will report on some preliminary results of the extragalactic science working group. We will discuss the potential of future ground-based gamma-ray experiments to elucidate how supermassive black holes accrete matter, form jets, and accelerate particles, and to study in detail the acceleration and propagation of cosmic rays in extragalactic systems like infrared galaxies and galaxy clusters. Furthermore, we discuss avenues to constrain the spectrum of the extragalactic infrared to optical background radiation, and to measure the extragalactic magnetic fields based on gamma-ray observations. Finally, we discuss the potential of ground-based experiments for conducting gamma-ray source surveys. More information about the white paper can be found at: http://cherenkov.physics.iastate.edu/wp/

  7. Preliminary Results from Powell Research Group on Integrating GRACE Satellite and Ground-based Estimates of Groundwater Storage Changes

    Science.gov (United States)

    Scanlon, B. R.; Zhang, Z.; Reitz, M.; Rodell, M.; Sanford, W. E.; Save, H.; Wiese, D. N.; Croteau, M. J.; McGuire, V. L.; Pool, D. R.; Faunt, C. C.; Zell, W.

    2017-12-01

    Groundwater storage depletion is a critical issue for many of the major aquifers in the U.S., particularly during intense droughts. GRACE (Gravity Recovery and Climate Experiment) satellite-based estimates of groundwater storage changes have attracted considerable media attention in the U.S. and globally, and interest in GRACE products continues to increase. For this reason, a Powell Research Group was formed to: (1) Assess variations in groundwater storage using a variety of GRACE products and other storage components (snow, surface water, and soil moisture) for major aquifers in the U.S., (2) Quantify long-term trends in groundwater storage from ground-based monitoring and regional and national modeling, and (3) Use ground-based monitoring and modeling to interpret GRACE water storage changes within the context of extreme droughts and over-exploitation of groundwater. The group now has preliminary estimates of long-term trends and seasonal fluctuations in water storage using different GRACE solutions, including CSR, JPL and GSFC. Approaches to quantifying uncertainties in GRACE data are included. This work also shows how GRACE sees groundwater depletion in unconfined versus confined aquifers; future work will link GRACE data to regional groundwater models. The wealth of ground-based observations for the U.S. provides a unique opportunity to assess the reliability of GRACE-based estimates of groundwater storage changes.
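    A common way to separate a long-term trend from the seasonal fluctuation in a storage-anomaly series is least-squares fitting of a line plus an annual harmonic. A sketch on synthetic data (not GRACE products; the model form is a standard choice, not necessarily the group's):

    ```python
    import numpy as np

    def trend_and_seasonal(t_years, storage_cm):
        """Least-squares fit of a linear trend plus an annual harmonic,
        S(t) = a + b*t + c*cos(2*pi*t) + d*sin(2*pi*t).
        Returns the trend b (cm/yr) and the seasonal amplitude
        sqrt(c^2 + d^2)."""
        t = np.asarray(t_years, float)
        y = np.asarray(storage_cm, float)
        X = np.column_stack([np.ones_like(t), t,
                             np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        a, b, c, d = coef
        return float(b), float(np.hypot(c, d))

    # Synthetic monthly series: 1.5 cm/yr depletion plus a 3 cm annual cycle.
    t = np.arange(0, 10, 1 / 12)
    y = 2.0 - 1.5 * t + 3.0 * np.cos(2 * np.pi * t + 0.4)
    trend_cm_per_yr, seasonal_amp_cm = trend_and_seasonal(t, y)
    ```

    Fitting trend and harmonic jointly matters: estimating a trend on a series with an incomplete final year and ignoring seasonality biases the slope.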

  8. Comparison of the characteristic energy of precipitating electrons derived from ground-based and DMSP satellite data

    Directory of Open Access Journals (Sweden)

    M. Ashrafi

    2005-01-01

    Energy maps are important for ionosphere-magnetosphere coupling studies, because quantitative determination of field-aligned currents requires knowledge of the conductances and their spatial gradients. By combining imaging riometer absorption and all-sky auroral optical data it is possible to produce high temporal and spatial resolution maps of the Maxwellian characteristic energy of precipitating electrons within a 240 × 240 km common field of view. These data have been calibrated by inverting EISCAT electron density profiles into equivalent energy spectra. In this paper energy maps produced by ground-based instruments (optical and riometer) are compared with DMSP satellite data during geomagnetic conjunctions. For the period 1995-2002, twelve satellite passes over the ground-based instruments' field of view under cloud-free conditions have been considered. Four of the satellite conjunctions occurred during moderate, steady-state geomagnetic conditions and without any ion precipitation. In these cases with Maxwellian satellite spectra, there is 71% agreement between the characteristic energies derived from the satellite and the ground-based energy map method.

  10. VVER-1000 MOX core computational benchmark

    International Nuclear Information System (INIS)

    2006-01-01

    The NEA Nuclear Science Committee has established an Expert Group that deals with the status and trends of reactor physics, fuel performance and fuel cycle issues related to disposing of weapons-grade plutonium in mixed-oxide fuel. The objectives of the group are to provide NEA member countries with up-to-date information on, and to develop consensus regarding, core and fuel cycle issues associated with burning weapons-grade plutonium in thermal water reactors (PWR, BWR, VVER-1000, CANDU) and fast reactors (BN-600). These issues concern core physics, fuel performance and reliability, and the capability and flexibility of thermal water reactors and fast reactors to dispose of weapons-grade plutonium in standard fuel cycles. The activities of the NEA Expert Group on Reactor-based Plutonium Disposition are carried out in close co-operation (jointly, in most cases) with the NEA Working Party on Scientific Issues in Reactor Systems (WPRS). A prominent part of these activities includes benchmark studies. At the time of preparation of this report, the following benchmarks were completed or in progress: VENUS-2 MOX Core Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); VVER-1000 LEU and MOX Benchmark (completed); KRITZ-2 Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); Hollow and Solid MOX Fuel Behaviour Benchmark (completed); PRIMO MOX Fuel Performance Benchmark (ongoing); VENUS-2 MOX-fuelled Reactor Dosimetry Calculation (ongoing); VVER-1000 In-core Self-powered Neutron Detector Calculational Benchmark (started); MOX Fuel Rod Behaviour in Fast Power Pulse Conditions (started); Benchmark on the VENUS Plutonium Recycling Experiments Configuration 7 (started). This report describes the detailed results of the benchmark investigating the physics of a whole VVER-1000 reactor core using two-thirds low-enriched uranium (LEU) and one-third MOX fuel. It contributes to the computer code certification process and to the

  11. Atmospheric greenhouse gases retrieved from SCIAMACHY: comparison to ground-based FTS measurements and model results

    Directory of Open Access Journals (Sweden)

    O. Schneising

    2012-02-01

    SCIAMACHY onboard ENVISAT (launched in 2002) enables the retrieval of global long-term column-averaged dry air mole fractions of the two most important anthropogenic greenhouse gases, carbon dioxide and methane (denoted XCO2 and XCH4). In order to assess the quality of the greenhouse gas data obtained with the recently introduced v2 of the scientific retrieval algorithm WFM-DOAS, we present validations with ground-based Fourier Transform Spectrometer (FTS) measurements and comparisons with model results at eight Total Carbon Column Observing Network (TCCON) sites, providing realistic error estimates of the satellite data. Such validation is a prerequisite to assess the suitability of data sets for their use in inverse modelling.

    It is shown that there are generally no significant differences between the carbon dioxide annual increases of SCIAMACHY and the assimilation system CarbonTracker (2.00 ± 0.16 ppm yr⁻¹ compared to 1.94 ± 0.03 ppm yr⁻¹ on global average). The XCO2 seasonal cycle amplitudes derived from SCIAMACHY are typically larger than those from TCCON, which are in turn larger than those from CarbonTracker. The absolute values of the northern hemispheric TCCON seasonal cycle amplitudes are closer to SCIAMACHY than to CarbonTracker, and the corresponding differences are not significant when compared with SCIAMACHY, whereas they can be significant for a subset of the analysed TCCON sites when compared with CarbonTracker. At Darwin we find discrepancies of the seasonal cycle derived from SCIAMACHY compared to the other data sets which can probably be ascribed to occurrences of undetected thin clouds. Based on the comparison with the reference data, we conclude that the carbon dioxide data set can be characterised by a regional relative precision (mean standard deviation of the differences) of about 2.2 ppm and a relative accuracy (standard deviation of the mean differences
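    The two error measures quoted (regional precision, the mean standard deviation of the differences; relative accuracy, the standard deviation of the site-mean differences) can be computed as follows. The per-site satellite-minus-TCCON differences below are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical per-site differences (satellite minus TCCON XCO2, ppm).
    diffs_by_site = {
        "site_a": np.array([1.8, -0.6, 2.5, 0.9]),
        "site_b": np.array([-2.1, 3.0, 1.2, -0.4]),
        "site_c": np.array([0.3, -1.7, 2.2, 1.1]),
    }

    # Regional relative precision: mean over sites of the standard
    # deviation of the differences (scatter of the satellite data).
    precision_ppm = float(np.mean(
        [d.std(ddof=1) for d in diffs_by_site.values()]))

    # Relative accuracy: standard deviation over sites of the mean
    # differences (site-to-site consistency of the bias).
    accuracy_ppm = float(np.std(
        [d.mean() for d in diffs_by_site.values()], ddof=1))
    ```

    The split matters for inverse modelling: a constant bias can often be absorbed, whereas site-to-site variations in the bias (poor relative accuracy) leak directly into inferred surface fluxes.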

  12. Tracking morphological changes and slope instability using spaceborne and ground-based SAR data

    Science.gov (United States)

    Di Traglia, Federico; Nolesini, Teresa; Ciampalini, Andrea; Solari, Lorenzo; Frodella, William; Bellotti, Fernando; Fumagalli, Alfio; De Rosa, Giuseppe; Casagli, Nicola

    2018-01-01

    Stromboli (Aeolian Archipelago, Italy) is an active volcano that is frequently affected by moderate to large mass wasting, which has occasionally triggered tsunamis. With the aim of understanding the relationship between the geomorphologic evolution and slope instability of Stromboli, remote sensing information from spaceborne Synthetic Aperture Radar (SAR) change detection and interferometry (InSAR) and Ground-Based InSAR (GBInSAR) was compared with field observations and morphological analyses. Ground reflectivity and SqueeSAR™ (an InSAR algorithm for surface deformation monitoring) displacement measurements from X-band COSMO-SkyMed satellites (CSK) were analysed together with displacement measurements from a permanent-sited, Ku-band GBInSAR system. Remote sensing results were compared with a preliminary morphological analysis of the Sciara del Fuoco (SdF) steep volcanic flank, which was carried out using a high-resolution Digital Elevation Model (DEM). Finally, field observations, supported by infrared thermographic surveys (IRT), allowed the interpretation and validation of remote sensing data. The analysis of the entire dataset (collected between January 2010 and December 2014) covers a period characterized by a low intensity of Strombolian activity. This period was punctuated by the occurrence of lava overflows, occurring from the crater terrace evolving downslope toward SdF, and flank eruptions, such as the 2014 event. The amplitude of the CSK images collected between February 22nd, 2010, and December 18th, 2014, highlights that during periods characterized by low-intensity Strombolian activity, the production of materials ejected from the crater terrace towards the SdF is generally low, and erosion is the prevailing process mainly affecting the central sector of the SdF. CSK-SqueeSAR™ and GBInSAR data allowed the identification of low displacements in the SdF, except for high displacement rates (up to 1.5 mm/h) that were measured following both lava

  13. Further Studies of Forest Structure Parameter Retrievals Using the Echidna® Ground-Based Lidar

    Science.gov (United States)

    Strahler, A. H.; Yao, T.; Zhao, F.; Yang, X.; Schaaf, C.; Wang, Z.; Li, Z.; Woodcock, C. E.; Culvenor, D.; Jupp, D.; Newnham, G.; Lovell, J.

    2012-12-01

    Ongoing work with the Echidna® Validation Instrument (EVI), a full-waveform, ground-based scanning lidar (1064 nm) developed by Australia's CSIRO and deployed by Boston University in California conifers (2008) and New England hardwood and softwood (conifer) stands (2007, 2009, 2010), confirms the importance of slope correction in forest structural parameter retrieval; detects growth and disturbance over periods of 2-3 years; provides a new way to measure the between-crown clumping factor in leaf area index retrieval using lidar range; and retrieves foliage profiles with more lower-canopy detail than a large-footprint aircraft scanner (LVIS), while simulating LVIS foliage profiles accurately from a nadir viewpoint using a 3-D point cloud. Slope correction is important for accurate retrieval of forest canopy structural parameters, such as mean diameter at breast height (DBH), stem count density, basal area, and above-ground biomass. Topographic slope can induce errors in parameter retrievals because the horizontal plane of the instrument scan, which is used to identify, measure, and count tree trunks, will intersect trunks below breast height in the uphill direction and above breast height in the downhill direction. A test of three methods at southern Sierra Nevada conifer sites improved the range of correlations of these EVI-retrieved parameters with field measurements from 0.53-0.68 to 0.85-0.93 for the best method. EVI scans can detect change, including both growth and disturbance, in periods of two to three years. We revisited three New England forest sites scanned in 2007-2009 or 2007-2010. A shelterwood stand at the Howland Experimental Forest, Howland, Maine, showed increased mean DBH, above-ground biomass and leaf area index between 2007 and 2009. Two stands at the Harvard Forest, Petersham, Massachusetts, suffered reduced leaf area index and reduced stem count density as the result of an ice storm that damaged the stands. At one stand, broken tops were
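    The slope-induced bias described above has a simple geometric core: on sloping ground, a horizontal scan plane intersects trunks at a height that varies with range and azimuth. A minimal sketch under a uniform-slope assumption (the conventions below are illustrative, not one of the paper's three correction methods):

    ```python
    import math

    def scan_plane_height(range_m, azimuth_deg, slope_deg,
                          sensor_height_m=1.3):
        """Height above the local ground at which a horizontal lidar
        scan plane intersects a trunk, for an instrument set at breast
        height (1.3 m) on uniformly sloping ground.
        Azimuth 0 deg points directly uphill."""
        rise = (range_m * math.tan(math.radians(slope_deg))
                * math.cos(math.radians(azimuth_deg)))
        # Uphill the ground rises, so the plane cuts the trunk lower;
        # downhill (cos < 0) it cuts the trunk higher.
        return sensor_height_m - rise

    # Trunks 10 m away on a 5 degree slope.
    uphill = scan_plane_height(10.0, 0.0, 5.0)      # below breast height
    downhill = scan_plane_height(10.0, 180.0, 5.0)  # above breast height
    ```

    Because trunk diameter generally decreases with height, sampling uphill trunks low and downhill trunks high biases DBH, basal area, and biomass retrievals, which is what the slope-correction methods compensate for.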

  14. Climatological lower thermosphere winds as seen by ground-based and space-based instruments

    Directory of Open Access Journals (Sweden)

    J. M. Forbes

    2004-06-01

    Comparisons are made between climatological dynamic fields obtained from ground-based (GB) and space-based (SB) instruments with a view towards identifying SB/GB intercalibration issues for TIMED and other future aeronomy satellite missions. SB measurements are made from the High Resolution Doppler Imager (HRDI) instrument on the Upper Atmosphere Research Satellite (UARS). The GB data originate from meteor radars at Obninsk (55° N, 37° E), Shigaraki (35° N, 136° E) and Jakarta (6° S, 107° E) and MF spaced-antenna radars at Hawaii (22° N, 160° W), Christmas I. (2° N, 158° W) and Adelaide (35° S, 138° E). We focus on monthly-mean prevailing, diurnal and semidiurnal wind components at 96 km, averaged over the 1991-1999 period. We perform space-based (SB) analyses for 90° longitude sectors including the GB sites, as well as for the zonal mean. Taking the monthly prevailing zonal winds from these stations as a whole, on average, SB zonal winds exceed GB determinations by ~63%, whereas meridional winds are in much better agreement. The origin of this discrepancy remains unknown, and should receive high priority in initial GB/SB comparisons during the TIMED mission. We perform detailed comparisons between monthly climatologies from Jakarta and the geographically conjugate sites of Shigaraki and Adelaide, including some analyses of interannual variations. SB prevailing, diurnal and semidiurnal tides exceed those measured over Jakarta by factors, on the average, of the order of 2.0, 1.6, 1.3, respectively, for the eastward wind, although much variability exists. For the meridional component, SB/GB ratios for the diurnal and semidiurnal tide are about 1.6 and 1.7. Prevailing and tidal amplitudes at Adelaide are significantly lower than SB values, whereas similar net differences do not occur at the conjugate Northern Hemisphere location of Shigaraki. Adelaide diurnal phases lag SB phases by several hours, but excellent agreement between the two data

  15. Thermal Performance Benchmarking: Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Xuhui [National Renewable Energy Laboratory (NREL), Golden, CO (United States). Transportation and Hydrogen Systems Center

    2017-10-19

    In FY16, the thermal performance of the 2014 Honda Accord Hybrid power electronics thermal management systems was benchmarked. Both experiments and numerical simulation were utilized to thoroughly study the thermal resistances and temperature distribution in the power module. Experimental results obtained from the water-ethylene glycol tests provided the junction-to-liquid thermal resistance. The finite element analysis (FEA) and computational fluid dynamics (CFD) models were found to yield a good match with experimental results. Both experimental and modeling results demonstrate that the passive stack is the dominant thermal resistance for both the motor and power electronics systems. The 2014 Accord power electronics systems yield steady-state thermal resistance values around 42-50 mm²·K/W, depending on the flow rates. At a typical flow rate of 10 liters per minute, the thermal resistance of the Accord system was found to be about 44 percent lower than that of the 2012 Nissan LEAF system that was benchmarked in FY15. The main reason for the difference is that the Accord power module used a metalized-ceramic substrate and eliminated the thermal interface material layers. FEA models were developed to study the transient performance of the 2012 Nissan LEAF, 2014 Accord, and two other systems that feature conventional power module designs. The simulation results indicate that the 2012 LEAF power module has the lowest thermal impedance at a time scale less than one second. This is probably due to moving low thermally conductive materials further away from the heat source and enhancing the heat spreading effect from the copper-molybdenum plate close to the insulated gate bipolar transistors. When approaching steady state, the Honda system shows lower thermal impedance. Measurement results of the thermal resistance of the 2015 BMW i3 power electronic system indicate that the i3 insulated gate bipolar transistor module has significantly lower junction
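    The junction-to-liquid thermal resistance quoted above is a temperature rise per unit heat, scaled by the active area to give mm²·K/W. A minimal sketch with hypothetical numbers chosen to fall in the reported range (not measured values from the benchmark):

    ```python
    def junction_to_liquid_resistance(t_junction_c, t_coolant_c,
                                      heat_w, area_mm2):
        """Area-specific junction-to-liquid thermal resistance in
        mm^2*K/W: steady-state temperature rise over dissipated heat,
        scaled by the active die area."""
        return (t_junction_c - t_coolant_c) / heat_w * area_mm2

    # Hypothetical operating point for a power module.
    r_specific = junction_to_liquid_resistance(
        t_junction_c=105.0,   # device junction temperature
        t_coolant_c=65.0,     # water-ethylene glycol inlet temperature
        heat_w=250.0,         # dissipated heat
        area_mm2=280.0,       # total active die area
    )
    ```

    Scaling by area makes modules of different sizes comparable, which is why the benchmark reports mm²·K/W rather than plain K/W.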

  16. Modeling of Rocket Fuel Heating and Cooling Processes in the Interior Receptacle Space of Ground-Based Systems

    Directory of Open Access Journals (Sweden)

    K. I. Denisova

    2016-01-01

    The propellant used to fill the fuel tanks of spacecraft, upper stages, and space rockets at technical and ground-based launch sites must be conditioned before fueling so that many of its parameters, including temperature, meet requirements. Propellant temperature preparation is arranged through heating and cooling of the rocket propellant (RP) in the tanks of the fueling equipment. The RP temperature preparation processes are the most energy-intensive and time-consuming ones, which requires that sustainable technologies and modes of cooling (heating) of RP provided by the ground-based equipment be chosen through modeling of the RP temperature preparation processes [1] at the design and operation stages of the ground-based fueling equipment. The RP temperature preparation in the tanks of ground-based systems can be provided through heat exchangers built into the internal space of the tank or external to it, in which antifreeze, air or liquid nitrogen may be used as the heat transfer medium. The papers [1-12], which note the promising use of liquid nitrogen to cool RP, present schematic diagrams and modeling systems for RP temperature preparation in the fueling equipment of ground-based systems. We consider RP temperature preparation using heat exchangers placed directly in the RP tanks. Feeding liquid nitrogen into the heat exchanger with the antifreeze provides the cooling mode of RP, while heated air fed there provides the heating mode. The paper gives the systems of equations and the results of modeling the RP temperature preparation processes, and their estimated efficiency. The systems of equations for cooling and heating RP are derived on the assumption that the heat exchange between the fuel and the antifreeze, as well as between the storage tank and the environment, is quasi-stationary. The paper presents calculation results for the fuel temperature in the tank, and the coolant temperature in the heat exchanger, as
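    The quasi-stationary heat-exchange assumption reduces the tank energy balance to a linear first-order ODE with a closed-form solution. A minimal lumped sketch (all parameter values are illustrative assumptions, not from the paper, and tank-to-environment losses are neglected):

    ```python
    import math

    def propellant_temperature(t0_c, t_hx_c, ua_w_per_k, mass_kg,
                               cp_j_per_kg_k, hours):
        """Lumped quasi-stationary model of propellant conditioning by
        an in-tank heat exchanger: dT/dt = -UA*(T - T_hx)/(m*cp).
        Returns the propellant temperature after the given time."""
        tau_s = mass_kg * cp_j_per_kg_k / ua_w_per_k   # time constant, s
        return t_hx_c + (t0_c - t_hx_c) * math.exp(-hours * 3600.0 / tau_s)

    # Cooling 20 t of propellant from 25 C toward a -60 C
    # nitrogen-chilled heat exchanger (hypothetical UA and cp).
    temp_after_6h = propellant_temperature(
        t0_c=25.0, t_hx_c=-60.0, ua_w_per_k=5000.0,
        mass_kg=20000.0, cp_j_per_kg_k=2000.0, hours=6.0)
    ```

    The same expression covers the heating mode by setting the heat-exchanger temperature above the propellant temperature; the exponential approach explains why temperature preparation dominates fueling timelines.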

  17. What Randomized Benchmarking Actually Measures

    International Nuclear Information System (INIS)

    Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; Sarovar, Mohan; Blume-Kohout, Robin

    2017-01-01

    Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. Here, these theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
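The single-exponential decay described above, P(m) = A*p^m + B, lets r be extracted directly from the fitted decay parameter p. The sketch below uses synthetic, noiseless survival probabilities (the parameter values are made up, not from the paper) and recovers r for a single qubit:

```python
# Sketch of extracting the RB error rate r from a single-exponential decay,
# assuming survival probabilities follow P(m) = A * p**m + B (the form shown
# to hold for all small process-matrix errors). Parameters are synthetic.
A, B, p = 0.49, 0.5, 0.995          # hypothetical decay parameters
lengths = [1, 2, 4, 8, 16, 32, 64]  # RB sequence lengths m
survival = [A * p**m + B for m in lengths]

# With the long-sequence asymptote B known, p follows from the ratio of
# decays at two lengths: (P(m2) - B) / (P(m1) - B) = p**(m2 - m1).
m1, m2 = lengths[0], lengths[-1]
p_est = ((survival[-1] - B) / (survival[0] - B)) ** (1.0 / (m2 - m1))

d = 2                               # Hilbert-space dimension for one qubit
r = (d - 1) * (1 - p_est) / d       # the RB number r
print(round(r, 6))                  # -> 0.0025
```

In a real experiment A, B, and p would come from a least-squares fit to noisy averaged survival data rather than from this exact ratio.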

  18. Benchmarking Commercial Conformer Ensemble Generators.

    Science.gov (United States)

    Friedrich, Nils-Ole; de Bruyn Kops, Christina; Flachsenberg, Florian; Sommer, Kai; Rarey, Matthias; Kirchmair, Johannes

    2017-11-27

    We assess and compare the performance of eight commercial conformer ensemble generators (ConfGen, ConfGenX, cxcalc, iCon, MOE LowModeMD, MOE Stochastic, MOE Conformation Import, and OMEGA) and one leading free algorithm, the distance geometry algorithm implemented in RDKit. The comparative study is based on a new version of the Platinum Diverse Dataset, a high-quality benchmarking dataset of 2859 protein-bound ligand conformations extracted from the PDB. Differences in the performance of commercial algorithms are much smaller than those observed for free algorithms in our previous study (J. Chem. Inf. Model. 2017, 57, 529-539). For commercial algorithms, the median minimum root-mean-square deviations measured between protein-bound ligand conformations and ensembles of a maximum of 250 conformers are between 0.46 and 0.61 Å. Commercial conformer ensemble generators are characterized by their high robustness, with at least 99% of all input molecules successfully processed and few or even no substantial geometrical errors detectable in their output conformations. The RDKit distance geometry algorithm (with minimization enabled) appears to be a good free alternative since its performance is comparable to that of the midranked commercial algorithms. Based on a statistical analysis, we elaborate on which algorithms to use and how to parametrize them for best performance in different application scenarios.
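The headline metric of this benchmark, the median over ligands of the minimum RMSD between the protein-bound pose and any conformer in the generated ensemble, is simple to compute once per-conformer RMSDs are available. The sketch below uses made-up RMSD values; a real evaluation would obtain them from a conformer generator and an RMSD routine:

```python
# Illustrative computation of the median minimum RMSD benchmark metric.
# The per-ligand RMSD lists (in Angstrom, each ensemble member vs. the
# protein-bound reference pose) are hypothetical placeholder data.
from statistics import median

ensemble_rmsds = {
    "lig1": [1.20, 0.55, 0.80],
    "lig2": [0.95, 0.40],
    "lig3": [2.10, 0.70, 0.65, 1.30],
}

# best (smallest) RMSD achieved by each ensemble, then the median over ligands
min_rmsds = [min(v) for v in ensemble_rmsds.values()]
med = round(median(min_rmsds), 2)
print(med)   # -> 0.55
```

A lower median minimum RMSD means the generator more often produces at least one conformer close to the bioactive conformation, which is what matters for downstream docking or pharmacophore applications.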

  19. Benchmark tests of JENDL-1

    International Nuclear Information System (INIS)

    Kikuchi, Yasuyuki; Hasegawa, Akira; Takano, Hideki; Kamei, Takanobu; Hojuyama, Takeshi; Sasaki, Makoto; Seki, Yuji; Zukeran, Atsushi; Otake, Iwao.

    1982-02-01

    Various benchmark tests were made on JENDL-1. At the first stage, various core center characteristics were tested for many critical assemblies with a one-dimensional model. At the second stage, the applicability of JENDL-1 was further tested on more sophisticated problems for the MOZART and ZPPR-3 assemblies with a two-dimensional model. It was proved that JENDL-1 predicted various quantities of fast reactors satisfactorily as a whole. However, the following problems were pointed out: 1) There exists a discrepancy of 0.9% in the k-eff values between the Pu- and U-cores. 2) The fission rate ratio of 239Pu to 235U is underestimated by 3%. 3) The Doppler reactivity coefficients are overestimated by about 10%. 4) The control rod worths are underestimated by 4%. 5) The fission rates of 235U and 239Pu are underestimated considerably in the outer core and radial blanket regions. 6) The negative sodium void reactivities are overestimated when the sodium is removed from the outer core. As a whole, most of the problems of JENDL-1 seem to be related to the neutron leakage and the neutron spectrum. It was found through further study that most of these problems came from too small diffusion coefficients and too large elastic removal cross sections above 100 keV, which were probably caused by overestimation of the total and elastic scattering cross sections for structural materials in the unresolved resonance region up to several MeV. (author)

  20. Human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-08-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organised around two study cases: (1) analysis of routine functional Test and Maintenance (T&M) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report summarises the contributions received from the participants and analyses these contributions on a comparative basis. The aim of this analysis was to compare the procedures, modelling techniques and quantification methods used, to obtain insight into the causes and magnitude of the variability observed in the results, to try to identify preferred human reliability assessment approaches, and to get an understanding of the current state of the art in the field, identifying the limitations that are still inherent to the different approaches.