Watling, James I; Bucklin, David N; Speroterra, Carolina; Brandt, Laura A; Mazzotti, Frank J; Romañach, Stephanie S
Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forest and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
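The sensitivity and specificity measures used in this evaluation reduce to simple confusion-matrix ratios. A minimal sketch, with invented counts rather than the paper's data:

```python
# Sensitivity and specificity of a presence/absence model, computed
# from confusion-matrix counts (tp = true presences, fn = missed
# presences, tn = true absences, fp = false presences).

def sensitivity(tp, fn):
    """Proportion of observed presences correctly predicted."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of observed absences correctly predicted."""
    return tn / (tn + fp)

# Hypothetical t2 evaluation counts for one species:
tp, fn, tn, fp = 80, 20, 60, 40
print(sensitivity(tp, fn))  # 0.8
print(specificity(tn, fp))  # 0.6
```

A maximum entropy model with high sensitivity but low specificity, as described above, would score well on the first ratio and poorly on the second.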
In the greenhouse model the instantaneous environmental crop growth factors are calculated as output, together with the physical behaviour of the crop. The boundary conditions for this model are the outside weather conditions; other inputs are the physical characteristics of the crop, the greenhouse and the control system. The greenhouse model is based on the energy, water vapour and CO2 balances of the crop-greenhouse system. Since the emphasis is on the dynamic behaviour of the greenhouse for implementation in continuous optimization, the state variables temperature, water vapour pressure and carbon dioxide concentration in the relevant greenhouse parts (crop, air, soil and cover) are calculated from the balances over these parts. To do this properly, the physical exchange processes between the system parts have to be quantified first. The greenhouse model is therefore constructed from submodels describing these processes: a. a radiation transmission model for the modification of the outside to the inside global radiation; b. a ventilation model to describe the ventilation exchange between greenhouse and outside air; c. the description of the exchange of energy and mass between the crop and the greenhouse air; d. calculation of the thermal radiation exchange between the various greenhouse parts; e. quantification of the convective exchange processes between the greenhouse air and, respectively, the cover, the heating pipes and the soil surface, and between the cover and the outside air; f. determination of the heat conduction in the soil. The various submodels are validated first and then the complete greenhouse model is verified.
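The balance approach described here can be illustrated with a toy lumped energy balance for the greenhouse air. All coefficients and names below are invented placeholders for illustration, not values from the thesis:

```python
# One explicit Euler step of a lumped greenhouse-air energy balance:
# transmitted solar gain minus ventilation and cover losses drives the
# air temperature. Coefficients are hypothetical round numbers.

def air_temp_step(t_air, t_out, t_cover, solar, dt=60.0):
    C = 3.0e4        # J m^-2 K^-1, heat capacity of the air layer
    h_vent = 5.0     # W m^-2 K^-1, ventilation exchange with outside
    h_cov = 8.0      # W m^-2 K^-1, convective exchange with the cover
    tau = 0.7        # shortwave transmission of the cover
    q = tau * solar - h_vent * (t_air - t_out) - h_cov * (t_air - t_cover)
    return t_air + dt * q / C  # temperature after dt seconds

t = 20.0  # degrees C
t = air_temp_step(t, t_out=10.0, t_cover=15.0, solar=400.0)
print(round(t, 2))  # 20.38
```

The full model adds water vapour and CO2 balances and the submodels a-f for each exchange term.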
Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone
Gates, W.L. [Lawrence Livermore National Lab., Livermore, CA (United States). Program for Climate Model Diagnosis and Intercomparison]
Climate modeling, whereby basic physical laws are used to integrate the physics and dynamics of climate into a consistent system, plays a key role in climate research. Depending upon the portion(s) of the climate system being considered, climate models range from those concerned only with the equilibrium globally-averaged surface temperature to those depicting the 3-dimensional time-dependent evolution of the coupled atmosphere, ocean, sea ice and land surface. Here only the latter class of models is considered; these are commonly known as general circulation models (or GCMs). (author)
Karmalkar, Ambarish V. [University of Oxford, School of Geography and the Environment, Oxford (United Kingdom); Bradley, Raymond S. [University of Massachusetts, Department of Geosciences, Amherst, MA (United States); Diaz, Henry F. [NOAA/ESRL/CIRES, Boulder, CO (United States)
Central America has high biodiversity and harbors high-value ecosystems, so it is important to provide regional climate change information to assist adaptation and mitigation work in the region. Here we study climate change projections for Central America and Mexico using a regional climate model. The model evaluation shows its success in simulating spatial and temporal variability of temperature and precipitation, and also in capturing regional climate features such as the bimodal annual cycle of precipitation and the Caribbean low-level jet. A variety of climate regimes within the model domain are also better identified in the regional model simulation due to improved resolution of topographic features. Although the model suffers from large precipitation biases, it shows improvements over the coarse-resolution driving model in simulating precipitation amounts. The model shows a dry bias in the wet season and a wet bias in the dry season, suggesting that it is unable to capture the full range of precipitation variability. Projected warming under the A2 scenario is higher in the wet season than in the dry season, with the Yucatan Peninsula experiencing the highest warming. A large reduction in precipitation in the wet season is projected for the region, whereas parts of Central America that receive a considerable amount of moisture in the form of orographic precipitation show significant decreases in precipitation in the dry season. Projected climatic changes can have detrimental impacts on biodiversity, as they are spatially similar to, but far greater in magnitude than, those observed during the El Nino events in recent decades that adversely affected species in the region. (orig.)
Madsen, Henrik; Refsgaard, Jens C.; Andréassian, Vazken;
Models used for projection of climate change and its impacts are usually not validated for simulation of future climate conditions. This is a serious deficiency that introduces an unknown level of uncertainty in the projections. A framework and guiding principles are presented for testing models using proxies of future conditions. In general, a model that has been set up for solving a specific problem at a particular site should be tested in order to document its predictive capability and credibility. In a climate change context such tests, often referred to as model validation tests, are particularly challenging since the model is used for an unknown future with a climate that is significantly different from current conditions. Most model studies reported on projections of climate change and its impacts have not included formal model validation tests that address this issue. A model validation...
Misra, Vasubandhu; Marx, Larry; Kinter, James L. III; Kirtman, Ben P.; Zhichang Guo; Dughong Min; Fennessy, Mike; Dirmeyer, Paul A.; Kallummal, Rameshan; Straus, David M. [Center for Ocean-Land-Atmosphere Studies, Institute of Global Environment and Society, Inc. 4041 Powder Mill Road, Suite 302, Calverton, MD 20705 (United States)]. E-mail: email@example.com
A newly developed Atmospheric General Circulation Model (AGCM) at T62 spectral truncation with 28 terrain-following levels coupled to the Modular Ocean Model version 3.0 (MOM3.0) is evaluated for its simulation of El Nino and the Southern Oscillation (ENSO). It is also compared with an older version of the AGCM coupled to the same ocean model. A dozen features of ENSO are validated. These characteristics of ENSO highlight its influence on global climate at seasonal to interannual scales. The major improvements of the ENSO simulation in this new coupled climate model are the seasonal phase locking of the ENSO variability to a realistic annual cycle of the eastern equatorial Pacific Ocean, and the duration and evolution of ENSO events, which are comparable to the ocean data assimilation. The two apparent drawbacks of this new model are its relatively weak ENSO variability and the presence of an erroneous split ITCZ. The improvement of the ENSO simulation in the new coupled model is attributed to realistic thermocline variability and wind stress simulation.
Yihdego, Yohannes; Webb, John
Forecast evaluation is an important topic that addresses the development of reliable hydrological probabilistic forecasts, mainly through the use of climate uncertainties. Validation is often neglected in hydrology, even though the parameters of a model are uncertain and the structure of the model can be incorrectly chosen. A calibrated and verified dynamic hydrologic water balance spreadsheet model has been used to assess the effect of climate variability on Lake Burrumbeet, southeastern Australia. The model has been verified against lake level, lake volume, lake surface area, surface outflow and lake salinity. The current study aims to increase confidence in the model's lake level predictions through historical validation for the years 2008-2013 under different climatic scenarios. The observed climatic conditions for 2008-2013 best match a hybrid of scenarios, because the interval includes both dry and wet climatic conditions. Besides uncertainty in the hydrologic stresses, uncertainty in the calibrated model is among the major drawbacks involved in making scenario simulations. Accordingly, the uncertainty in the calibrated model was tested using sensitivity analysis, which showed that errors in the model can largely be attributed to erroneous estimates of evaporation and rainfall, and to a lesser extent surface inflow. The study demonstrates that several climatic scenarios should be analysed, combining extreme climate, stream flow and climate change rather than one assumed climatic sequence, to improve prediction of climate variability in the future. Performing such scenario analysis is a valid exercise for understanding the uncertainty in the model structure and hydrology in a meaningful way; scenarios considered less probable may ultimately turn out to be crucial for decision making, and the analysis will increase confidence in model predictions for management of the water
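A lumped water-balance step of the kind such a spreadsheet model iterates can be sketched as follows; all terms and values are hypothetical illustrations, not Lake Burrumbeet data:

```python
# One time step of a lake water balance, dV = P + Qin - E - Qout
# (volumes in ML; names and numbers are invented placeholders).

def lake_step(volume, precip, inflow, evap, outflow):
    """Update lake storage with one period's fluxes."""
    return volume + precip + inflow - evap - outflow

v = 1000.0
# A wet step followed by a dry step (precip, inflow, evap, outflow):
for p, qin, e, qout in [(50, 20, 80, 10), (5, 2, 120, 0)]:
    v = lake_step(v, p, qin, e, qout)
print(v)  # 867.0
```

In a scenario analysis, the flux sequences would be replaced by dry, wet or climate-change series and the resulting storage trajectories compared.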
Dubois, F.; Randriambololona, H.; Petit, C.
This paper deals with the modeling of linear viscoelastic behavior and strain accumulation (accelerated creep) during moisture content changes in timber. A generalized Kelvin-Voigt model is used and associated in series with a shrinkage-swelling element depending on the mechanical and moisture content states of materials. The hygrothermal aging due to climatic variations implies an evolution of rheological parameters depending upon moisture content and temperature. Two distinct viscoelastic laws, one for drying and the other for moistening, are coupled according to thermodynamic principles when wood is subjected to nonmonotonous moisture variations. An incremental formulation of the behavior is established in the finite element program CAST3M (software developed by the C.E.A., Commissariat à l'Énergie Atomique), and an experimental validation from tension creep-recovery tests is presented.
Eyring, V.; Harris, N. R. P.; Rex, M.; Shepherd, T. G.; Fahey, D. W.; Amanatidis, G. T.; J. Austin; M. P. Chipperfield; Dameris, M.; P. M. De F. Forster; Gettelman, A.; Graf, H. F.; Nagashima, T.; Newman, P. A.; Pawson, S.
Accurate and reliable predictions and an understanding of future changes in the stratosphere are of major importance to our understanding of climate change. Simulating the interaction between chemistry and climate is of particular importance, because continued increases in greenhouse gases and a slow decrease in halogen loading are expected. These both influence the abundance of stratospheric ozone. In recent years a number of coupled chemistry climate models (CCMs) with different levels of c...
Heiri, Oliver; Brooks, Stephen J.; Renssen, Hans; Bedford, Alan; Hazekamp, Marjolein; Ilyashuk, Boris; Jeffers, Elizabeth S.; Lang, Barbara; Kirilova, Emiliya; Kuiper, Saskia; Millet, Laurent; Samartin, Stéphanie; Toth, Monika; Verbruggen, Frederike; Watson, Jenny E.; van Asch, Nelleke; Lammertsma, Emmy; Amon, Leeli; Birks, Hilary H.; Birks, H. John B.; Mortensen, Morten F.; Hoek, Wim Z.; Magyari, Enikö; Sobrino, Castor Muñoz; Seppä, Heikki; Tinner, Willy; Tonkov, Spassimir; Veski, Siim; Lotter, André F.
Comparisons of climate model hindcasts with independent proxy data are essential for assessing model performance in non-analogue situations. However, standardized paleoclimate datasets for assessing the spatial pattern of past climatic change across continents are lacking for some of the most dynamic episodes of Earth's recent past. Here we present a new chironomid-based paleotemperature dataset designed to assess climate model hindcasts of regional summer temperature change in Europe during the late-glacial and early Holocene. Latitudinal and longitudinal patterns of inferred temperature change are in excellent agreement with simulations by the ECHAM-4 model, implying that atmospheric general circulation models like ECHAM-4 can successfully predict regionally diverging temperature trends in Europe, even when conditions differ significantly from present. However, ECHAM-4 infers larger amplitudes of change and higher temperatures during warm phases than our paleotemperature estimates, suggesting that this and similar models may overestimate past and potentially also future summer temperature changes in Europe. PMID:25208610
Tinker, Jonathan; Lowe, Jason; Holt, Jason; Pardaens, Anne; Wiltshire, Andy
The aim of this study was to evaluate the performance of a modelling system used to represent the northwest European shelf seas. Variants of the coupled atmosphere-ocean global climate model HadCM3 were run under conditions of historically varying concentrations of greenhouse gases and other radiatively active constituents. The atmospheric simulation for the shelf sea region and its surrounds was downscaled to finer spatial scales using a regional climate model (HadRM3); these simulations were then used to drive a river routing scheme (TRIP). Together, these provide the atmospheric, oceanic and riverine boundary conditions to drive the shelf seas model POLCOMS. Additionally, a shelf seas simulation was driven by the ERA-40 reanalysis in place of HadCM3. We compared the modelling system's output against a sea surface temperature satellite analysis product, a quality-controlled ocean profile dataset and values of volume transport through particular ocean sections from the literature. In addition to assessing model drift with a pre-industrial control simulation, the modelling system was evaluated against observations and the reanalysis-driven simulation. We concluded that the modelling system provided an excellent (good) representation of the spatial patterns of temperature (salinity). It provided a good representation of the mean temperature climate, and a sufficient representation of the mean salinity and water column structure climate. The representation of the interannual variability was sufficient, while the overall shelf-wide circulation was qualitatively good. From this wide range of metrics we judged the modelling system fit for the purpose of providing centennial climate projections for the northwest European shelf seas.
Bracco, Annalisa [Georgia Inst. of Technology, Atlanta, GA (United States)
We developed a fast, robust and scalable methodology to examine, quantify, and visualize climate patterns and their relationships. It is based on a set of notions, algorithms and metrics used in the study of graphs, referred to as complex network analysis. This approach can be applied to explain known climate phenomena in terms of an underlying network structure and to uncover regional and global linkages in the climate system, while comparing general circulation model outputs with observations. The proposed method is based on a two-layer network representation, and is substantially new within the available network methodologies developed for climate studies. At the first layer, gridded climate data are used to identify "areas", i.e., geographical regions that are highly homogeneous in terms of the given climate variable. At the second layer, the identified areas are interconnected with links of varying strength, forming a global climate network. The robustness of the method (i.e. the ability to distinguish topologically distinct fields while correctly identifying similarities) has been extensively tested. It has been shown to provide a reliable, fast framework for comparing and ranking the ability of climate models to reproduce observed climate patterns and their connectivity. We further developed the methodology to account for lags in the connectivity between climate patterns and refined our area identification algorithm to account for autocorrelation in the data. The new methodology based on complex network analysis has been applied to state-of-the-art climate model simulations that participated in the last IPCC (Intergovernmental Panel on Climate Change) assessment to verify their performance, quantify uncertainties, and uncover changes in global linkages between past and future projections. Network properties of modeled sea surface temperature and rainfall over 1956-2005 have been constrained towards observations or reanalysis data sets
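The second-layer step, linking "areas" whose time series are strongly correlated, might look like the following minimal sketch. The series are synthetic, the 0.5 threshold is arbitrary, and the first-layer area identification is not reproduced:

```python
# Build a tiny correlation-based climate network: three "area" time
# series, a pairwise correlation matrix, and a boolean link matrix.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(240)                      # 20 years of monthly steps
base = np.sin(2 * np.pi * t / 12)       # shared seasonal signal
areas = np.stack([
    base + 0.1 * rng.standard_normal(t.size),   # area 0
    base + 0.1 * rng.standard_normal(t.size),   # area 1 (co-varies with 0)
    rng.standard_normal(t.size),                # area 2 (independent noise)
])

corr = np.corrcoef(areas)                       # pairwise correlations
links = (np.abs(corr) > 0.5) & ~np.eye(3, dtype=bool)  # drop self-links
print(links)
```

Comparing such link matrices between a model run and observations is the essence of the ranking described above.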
Baret, F.; Weiss, M.; Lacaze, R.; Camacho, F.; Smets, B.; Pacholczyk, P.; Makhmara, H.
LAI and fAPAR are recognized as Essential Climate Variables providing key information for the understanding and modeling of canopy functioning. Global remote sensing observations at medium resolution have been routinely acquired since the 1980s, mainly with the AVHRR, SEAWIFS, VEGETATION, MODIS and MERIS sensors. Several operational products have been derived and provide global maps of LAI and fAPAR at daily to monthly time steps. Inter-comparison between the MODIS, CYCLOPES, GLOBCARBON and JRC-FAPAR products showed generally consistent seasonality, while large differences in magnitude and smoothness may be observed. One of the objectives of the GEOLAND2 European project is to develop such core products to be used in a range of application services including carbon monitoring. Rather than generating an additional product from scratch, version 1 of the GEOLAND2 products capitalized on the existing products by combining them to retain their strengths and limit their weaknesses. For these reasons, the MODIS and CYCLOPES products were selected, since they both include LAI and fAPAR while having relatively close temporal sampling intervals (8 to 10 days). The GLOBCARBON products were not used here because their monthly time step is too long, inducing large uncertainties in the seasonality description. JRC-FAPAR was likewise not selected, to preserve better consistency between the LAI and fAPAR products. The MODIS and CYCLOPES products were then linearly combined to take advantage of the good performance of the CYCLOPES products for low to medium values of LAI and fAPAR while benefiting from the better MODIS performance for the highest LAI values. A training database representative of the global variability of vegetation types and conditions was thus built. A back-propagation neural network was then calibrated to estimate the new LAI and fAPAR products from VEGETATION preprocessed observations. Similarly, the vegetation cover fraction (fCover) was also derived by scaling the original CYCLOPES fCover products
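The idea of linearly combining the two products, weighting CYCLOPES at low-to-medium LAI and MODIS at high LAI, can be illustrated as follows. The weighting function is an invented example, not the calibrated GEOLAND2 combination:

```python
# Blend two LAI estimates with a weight that shifts from the CYCLOPES
# value at low LAI toward the MODIS value at high LAI (weights are a
# hypothetical linear ramp, saturating at LAI = 4).

def combined_lai(cyclopes, modis):
    w = min(max(cyclopes / 4.0, 0.0), 1.0)  # 0 at LAI 0, 1 at LAI >= 4
    return (1 - w) * cyclopes + w * modis

print(combined_lai(1.0, 1.4))  # low LAI: stays close to CYCLOPES
print(combined_lai(5.0, 6.0))  # high LAI: equals the MODIS value
```

In GEOLAND2 the combined values served as training targets for a back-propagation neural network applied to VEGETATION observations.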
Dumitrescu, Alexandru; Busuioc, Aristita
EURO-CORDEX is the European branch of the international CORDEX initiative that aims to provide improved regional climate change projections for Europe. The main objective of this paper is to document the performance of individual models in reproducing the variability of precipitation extremes in Romania. Here an ensemble of three EURO-CORDEX regional climate models (RCMs) run under scenario RCP4.5 is analysed and inter-compared: DMI-HIRHAM5, KNMI-RACMO2.2 and MPI-REMO. In contrast to previous studies, in which RCM validation for the Romanian climate has mainly addressed the mean state at station scale, a more quantitative approach to precipitation extremes is proposed. In this respect, to obtain a more reliable comparison with observations, a high-resolution daily precipitation gridded data set was used as the observational reference (CLIMHYDEX project). The comparison between the RCM outputs and observed grid point values has been made by calculating three extreme precipitation indices, recommended by the Expert Team on Climate Change Detection Indices (ETCCDI), for the 1976-2005 period: R10MM, the annual count of days when precipitation ≥ 10 mm; RX5DAY, the annual maximum 5-day precipitation; and R95P, the fraction of annual total precipitation due to daily precipitation above the 95th percentile. The RCMs' capability to reproduce the mean state of these variables, as well as the main modes of their spatial variability (given by the first three EOF patterns), is analysed. The investigation confirms the ability of the RCMs to simulate the main features of precipitation extreme variability over Romania, but some deficiencies in reproducing their regional characteristics were found (for example, overestimation of the mean state, especially over the extra-Carpathian regions). This work has been realised within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian
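The three ETCCDI indices have direct definitions; a sketch for one grid point with a short synthetic daily series (the 95th-percentile threshold would normally come from a reference period):

```python
# ETCCDI precipitation indices for one grid point (pr in mm/day).
import numpy as np

def r10mm(pr):
    """R10MM: annual count of days with precipitation >= 10 mm."""
    return int(np.sum(pr >= 10))

def rx5day(pr):
    """RX5DAY: annual maximum consecutive 5-day precipitation."""
    windows = np.convolve(pr, np.ones(5), mode="valid")
    return float(windows.max())

def r95p_frac(pr, p95):
    """R95P as a fraction: share of the wet-day total (wet day = >= 1 mm)
    falling on days above the reference 95th percentile p95."""
    wet = pr[pr >= 1.0]
    return float(wet[wet > p95].sum() / wet.sum())

pr = np.array([0, 12, 3, 25, 0, 0, 8, 40, 1, 0], dtype=float)
print(r10mm(pr), rx5day(pr))  # 3 73.0
```

Applying these functions to each RCM grid point and to the gridded observations yields the fields whose mean state and EOF patterns the paper compares.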
Benestad, Rasmus E
An old conceptual physics-based back-of-the-envelope model for the greenhouse effect is revisited and validated against state-of-the-art reanalyses. Untraditional diagnostics show a physically consistent picture, in which the state of earth's climate is constrained by well-known physical principles, such as energy balance, flow and conservation. Greenhouse gas concentrations affect the atmospheric optical depth for infrared radiation, and increased opacity implies a higher altitude from which earth's equivalent bulk heat loss takes place without being re-absorbed. Such an increase is seen in the reanalyses. There has also been a reduction in the correlation between the spatial structure of outgoing long-wave radiation and surface temperature, consistent with increasingly more processes interfering with the upwelling infrared light before it reaches the top of the atmosphere. State-of-the-art reanalyses further imply increases in the overturning in the troposphere, consistent with a constant and continuous vertical e...
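The back-of-the-envelope argument can be made concrete with standard textbook numbers (these values are common illustrations, not taken from the paper): the planet radiates from an effective emission level, and the surface sits below it by roughly the lapse rate times the emission altitude.

```python
# Effective emission temperature from global energy balance, and the
# implied emission altitude for a 288 K surface and a 6.5 K/km lapse
# rate. All constants are standard textbook values.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S0, ALBEDO = 1361.0, 0.30
LAPSE = 6.5e-3       # K per m

absorbed = S0 * (1 - ALBEDO) / 4            # global-mean absorbed flux
t_emission = (absorbed / SIGMA) ** 0.25     # effective emission temperature
z_emission = (288.0 - t_emission) / LAPSE   # altitude where T matches it
print(round(t_emission), round(z_emission))  # roughly 255 K and 5 km
```

Raising the optical depth lifts the emission level, and with a fixed lapse rate that warms the surface, which is the mechanism the abstract checks against reanalyses.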
Lauret, A J P; Boyer, H; Adelard, L; Garde, F
As part of our efforts to complete the software CODYRUN validation, we chose as test building a block of flats constructed in Reunion Island, which has a humid tropical climate. The sensitivity analysis allowed us to study the effects of both diffuse and direct solar radiation on our model of this building. With regard to the choice and location of sensors, this stage of the study also led us to measure the solar radiation falling on the windows. The comparison of measured and predicted radiation clearly showed that our predictions over-estimated the incoming solar radiation, and we were able to trace the problem to the algorithm which calculates diffuse solar radiation. By calculating view factors between the windows and the associated shading devices, changes to the original program allowed us to improve the predictions, and so this article shows the importance of sensitivity analysis in this area of research.
Highlights: • We focus on solar energy assessment where measurements are not available. • Seventeen broadband models have been reviewed, and then evaluated. • Predictions have been compared with measured data of Ghardaia, Algeria. • Simple models that require few inputs perform better than some complex models. • ASHRAE predicts the DNI with good accuracy, particularly in developing countries. - Abstract: The accurate prediction of direct solar irradiance is essential in many solar energy applications, particularly those relying on concentrating solar technologies. The present paper is aimed at a detailed assessment of a large range of clear-sky solar radiation models under the Algerian climate, to select the most accurate one for estimating the performance of solar power projects where meteorological and radiometric measurement stations are not available. To this end, seventeen models have been reviewed and their performance compared to measured irradiance of Ghardaia (southern Algeria). The validation methodology presented herein is very helpful for ranking the models. A new statistical accuracy indicator is introduced to find out the most accurate ones. A thorough analysis of the selected models has shown that the more complex models, which seem at first sight more sophisticated, are not necessarily the most accurate, while simpler models depending on a limited number of parameters are more suitable. In other words, the suitability and accuracy of a model do not necessarily improve with an increase in the number of its parameters. This important finding is in good agreement with previously published studies. It is important to take this fact into account, when measured data are not available, for the selection of the most suitable locations for the installation of future concentrating solar power plants in Algeria or even in other countries
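Ranking models against measurements with standard error statistics can be sketched as below. The model outputs are invented placeholders, not results from the seventeen models assessed, and the paper's new accuracy indicator is not reproduced:

```python
# Rank clear-sky DNI models by RMSE against measurements, with MBE as
# a bias diagnostic (all irradiance values are hypothetical, W/m^2).
import math

measured = [820, 760, 905, 640, 700]

def mbe(pred, obs):
    """Mean bias error: positive means systematic over-prediction."""
    return sum(p - o for p, o in zip(pred, obs)) / len(obs)

def rmse(pred, obs):
    """Root mean square error of predictions against observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

models = {
    "simple_2param": [830, 770, 900, 650, 690],
    "complex_9param": [880, 700, 960, 600, 760],
}
ranking = sorted(models, key=lambda m: rmse(models[m], measured))
print(ranking[0])  # the lower-RMSE model ranks first
```

In this synthetic example the simpler model happens to win, echoing (but not demonstrating) the paper's finding that more parameters do not guarantee better accuracy.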
The coupling of three model components, WRF/PCE (polar climate extension version of the Weather Research and Forecasting model (WRF)), ROMS (Regional Ocean Modeling System), and CICE (Community Ice Code), has been implemented, and the regional atmosphere-ocean-sea ice coupled model named WRF/PCE-ROMS-CICE has been validated against ERA-Interim reanalysis data sets for 1989. To better understand the reasons that generate model biases, the WRF/PCE-ROMS-CICE results were compared with those of its components, the WRF/PCE and the ROMS-CICE. There are cold biases in surface air temperature (SAT) over the Arctic Ocean, which contribute to the sea ice concentration (SIC) and sea surface temperature (SST) biases in the results of the WRF/PCE-ROMS-CICE. The cold SAT biases also appear in results of the atmospheric component, with milder temperatures in winter and similar temperatures in summer. Compared to results from the WRF/PCE, due to the influences of different distributions of the SIC and the SST and the inclusion of air-sea-sea ice interactions in the WRF/PCE-ROMS-CICE, the simulated SAT has new features. These influences also lead to apparent differences at higher levels of the atmosphere, which can be thought of as responses to biases in the SST and sea ice extent. There are similar spatial patterns of atmospheric response to sea ice biases at 700 and 500 hPa, and the strength of the responses weakens as the pressure decreases in January. The atmospheric responses in July reach up to 200 hPa. There are surplus sea ice extents in the Greenland Sea, the Barents Sea, the Davis Strait and the Chukchi Sea in winter, and in the Beaufort Sea, the Chukchi Sea, the East Siberian Sea and the Laptev Sea in summer in the ROMS-CICE. These differences in the SIC distribution can all be explained by those in the SST distributions. These features in the simulated SST and SIC from ROMS-CICE also appear in the WRF/PCE-ROMS-CICE. It is shown that the performance of the WRF/PCE-ROMS-CICE is
National Aeronautics and Space Administration — Claire Monteleoni, Gavin Schmidt, and Shailesh Saroha. Climate models are complex mathematical models designed by meteorologists, geophysicists, and climate...
Rehman, Muniza; Pedersen, Stig Andur
In philosophy of science, interest in computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but the views have not succeeded in capturing the diversity of validation methods. The wide variety of ... models has been somewhat narrow-minded, reducing the notion of validation to the establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation.
Wang, Guiling; Yu, Miao; Pal, Jeremy S.; Mei, Rui; Bonan, Gordon B.; Levis, Samuel; Thornton, Peter E.
This paper presents a regional climate system model, RCM-CLM-CN-DV, and its validation over Tropical Africa. The model development involves the initial coupling between the ICTP regional climate model RegCM4.3.4 (RCM) and the Community Land Model version 4 (CLM4), including models of carbon-nitrogen dynamics (CN) and vegetation dynamics (DV), and further improvements of the models. Model improvements derive from the new parameterization from CLM4.5 that addresses the well-documented overestimation of gross primary production (GPP), a refinement of the stress deciduous phenology scheme in CN that addresses a spurious LAI fluctuation for drought-deciduous plants, and the incorporation of a survival rule into the DV model to prevent tropical broadleaf evergreen trees from growing in areas with a prolonged drought season. The impact of the modifications on model results is documented based on numerical experiments using various subcomponents of the model. The performance of the coupled model is then validated against observational data based on three configurations with increasing capacity: RCM-CLM with prescribed leaf area index and fractional coverage of different plant functional types (PFTs); RCM-CLM-CN with prescribed PFT coverage but prognostic plant phenology; and RCM-CLM-CN-DV, in which both the plant phenology and PFT coverage are simulated by the model. Results from these three models are compared against the FLUXNET upscaled GPP and ET data, LAI and PFT coverage from remote sensing data including MODIS and GIMMS, University of Delaware precipitation and temperature data, and surface radiation data from MVIRI and SRB. Our results indicate that the models perform well in reproducing the physical climate and surface radiative budgets in the domain of interest. However, PFT coverage is significantly underestimated by the model over arid and semi-arid regions of Tropical Africa, caused by an underestimation of LAI in these regions by the CN model that gets exacerbated
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid
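The comparison logic the HEDR abstract describes, checking computational estimates against independent measurements and judging validity over a collection of tests rather than any single one, can be sketched as follows. The function name and the sample numbers are illustrative, not taken from the HEDR documentation:

```python
import math

def validation_metrics(predicted, observed):
    """Compare model estimates against independent measurements.

    Returns (mean bias, RMSE). No single comparison validates a model;
    confidence comes from the collection of such tests together.
    """
    if len(predicted) != len(observed):
        raise ValueError("series must have the same length")
    errors = [p - o for p, o in zip(predicted, observed)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

# Hypothetical dose estimates (model) vs. independent field measurements
bias, rmse = validation_metrics([1.2, 0.8, 1.5], [1.0, 1.0, 1.4])
```

Accumulating such metrics across many independent data sets, rather than passing one threshold, is what the abstract means by "a level of confidence".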
J. J. Gómez-Navarro
We present and analyse a high-resolution regional climate palaeosimulation encompassing the European region for the period 1500-1990. We use the regional model MM5 driven at the boundaries by the global model ECHO-G. Both models are forced by reconstructions of three external factors: greenhouse gas concentrations, total solar irradiance and volcanic activity. The simulation skill is assessed in a recent period by comparing the model results with the Climatic Research Unit (CRU) database. The results show that although the regional model is tightly driven by the boundary conditions, it is able to improve the reliability of the simulations, narrowing the differences to the observations, especially in areas of complex topography. Additionally, the evolution of the spatial distributions of temperature and precipitation through the last five centuries is analysed, showing that the mean values of temperature reflect the influence of the external forcings. However, contrary to the results obtained under climate change scenario conditions, higher-order moments of seasonal temperature and precipitation are hardly affected by changes in the external forcings.
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and
The paper points out the importance and usefulness of recognizing the separate roles of processes and geometric structures in predictive modeling of the performance of a nuclear waste repository or underground injection disposal of toxic wastes. Based on this a validation procedure is proposed. Furthermore, two stages and three elements of validation are described and discussed. Finally, comments are made on the choice of measurables to be used to compare modeling results and field data in the validation procedure. 8 refs
Under the NASA-KAIA-KARI ATM research collaboration agreement, the SOSS ICN Model has been developed for Incheon International Airport. This presentation describes the model validation work in the project and shows the results and analysis of the validation.
This report summarizes the activities of our group to meet our stated objectives. The report is divided into sections entitled: Radiation Model Testing Activities, General Circulation Model Testing Activities, Science Team Activities, and Publications, Presentations and Meetings. The section on Science Team Activities summarizes our participation with the science team to further advance the observation and modeling programs. Appendix A lists graduate students supported, and post-doctoral appointments during the project. Reports on the activities during each of the first two years are included as Appendix B. Significant progress has been made in: determining the ability of line-by-line radiation models to calculate the downward longwave flux at the surface; determining the uncertainties in calculating the downwelling radiance and flux at the surface associated with the use of different proposed profiling techniques; intercomparing clear-sky radiance and flux observations with calculations from radiation codes from different climate models; determining the uncertainties associated with estimating N* from surface longwave flux observations; and determining the sensitivity of model calculations to different formulations of the effects of finite-sized clouds
Molodtsova, T.; Kirilenko, A.; Stepchenkova, S.
Reporting observed and modeled changes in climate to the public requires measures understandable by a general audience. For example, the NASA GISS Common Sense Climate Index (Hansen et al., 1998) reports the change in climate based on six practically observable parameters, such as the air temperature exceeding the norm by one standard deviation. The utility of such indices for reporting climate change depends, however, on the assumption that the selected parameters are felt and connected with the changing climate by non-experts, which needs to be validated. Dynamic discussion of climate change issues in social media may provide data for this validation. We connected the intensity of public discussion of climate change in social networks with regional weather variations for the territory of the USA. We collected the entire 2012 population of Twitter microblogging activity on the climate change topic, accumulating over 1.8 million separate records (tweets) globally. We identified the geographic location of the tweets and associated the daily and weekly intensity of tweeting with the following parameters of weather for these locations: temperature anomalies, 'hot' temperature anomalies, 'cold' temperature anomalies, and heavy rain/snow events. To account for non-weather-related events we included the articles on climate change from the 'prestige press', a collection of major newspapers. We found that regional changes in parameters of weather significantly affect the number of tweets published on climate change. This effect, however, is short-lived and varies throughout the country. We found that in different locations different weather parameters had the most significant effect on climate change microblogging activity. Overall, 'hot' temperature anomalies had a significant influence on the intensity of climate change tweeting.
Fortelius, C.; Holopainen, E.; Kaurola, J.; Ruosteenoja, K.; Raeisaenen, J. [Helsinki Univ. (Finland). Dept. of Meteorology
In recent years we have studied the modelling of interannual climate variability, the atmospheric energy and water cycles, and climate simulations with the ECHAM3 model. In addition, the climate simulations of several models have been compared, with special emphasis on the area of northern Europe
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model
This letter assesses the quality of temperature and rainfall daily retrievals of the European Climate Assessment and Dataset (ECA&D) with respect to measurements collected locally in various parts of the Euro-Mediterranean region in the framework of the Hydrological Cycle in the Mediterranean Experiment (HyMeX), endorsed by the Global Energy and Water Cycle Experiment (GEWEX) of the World Climate Research Program (WCRP). The ECA&D, among other gridded datasets, is very often used as a reference for model calibration and evaluation. This is for instance the case in the context of the WCRP Coordinated Regional Downscaling Experiment (CORDEX) and its Mediterranean branch, Med-CORDEX. This letter quantifies ECA&D dataset uncertainties associated with temperature and precipitation intra-seasonal variability, seasonal distribution and extremes. Our motivation is to help the interpretation of results when validating or calibrating downscaling models against the ECA&D dataset in the context of regional climate research in the Euro-Mediterranean region. (letter)
Mayer, Stephanie; Maule, Cathrine Fox; Sobolowski, Stefan;
Before running climate projections with numerical models it is important to validate their performance under present climate conditions. Within the RiskChange project two high‐resolution regional climate models were run as a perfect boundary experiment over Scandinavia. The simulations are...... study is to analyse the properties of high‐resolution climate simulations over Scandinavia by testing a hypothesis that dynamic simulations are better at retaining the properties of precipitation, notably precipitation extremes than coarser simulations. When compared to statistical methods the dynamical...... downscaling has the advantage of retaining the full set of atmospheric variables as well as a physically more realistic description of e.g. complex terrain (e.g. mountain ranges and coastlines) and when the representation and behaviour of extremes are important to be captured in a realistic manner. Here, we...
A method of validating climate models in climate research with a view to extreme events; Eine Methode zur Validierung von Klimamodellen fuer die Klimawirkungsforschung hinsichtlich der Wiedergabe extremer Ereignisse
A method is presented to validate climate models with respect to extreme events which are suitable for risk assessment in impact modeling. The algorithm is intended to complement conventional techniques. These procedures mainly compare simulation results with reference data based on single or only a few climatic variables at the same time, under the aspect of how well a model performs in reproducing the known physical processes of the atmosphere. Such investigations are often based on seasonal or annual mean values. For impact research, however, extreme climatic conditions with shorter typical time scales are generally more interesting. Furthermore, such extreme events are frequently characterized by combinations of individual extremes, which require a multivariate approach. The validation method presented here basically consists of a combination of several well-known statistical techniques, completed by a newly developed diagnosis module to quantify model deficiencies. First of all, critical threshold values of key climatic variables for impact research have to be derived, serving as criteria to define extreme conditions for a specific activity. Unlike in other techniques, the simulation results to be validated are interpolated to the reference data sampling points in the initial step of this new technique. Besides the fact that both data sets are thereby given the same spatial representation for the subsequent diagnostic steps, this procedure also makes it possible to leave the reference basis unchanged for any type of model output and to perform the validation on a real orography. To simultaneously identify the spatial characteristics of a given situation regarding all considered extreme value criteria, a multivariate cluster analysis method for pattern recognition is applied separately to both simulation results and reference data. Afterwards, various distribution-free statistical tests are applied, depending on the specific situation, to detect statistically significant
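A minimal sketch of the threshold-based, multivariate core of such a procedure: extreme conditions are defined by critical thresholds on several variables at once, and model deficiency is quantified as disagreement between the joint extreme patterns of simulation and reference data at shared sampling points. The variables, thresholds, and scoring rule below are illustrative stand-ins for the paper's cluster-analysis and significance-testing machinery:

```python
def extreme_pattern(points, thresholds):
    """For each sampling point, flag which variables exceed their
    critical thresholds (a joint, multivariate notion of 'extreme')."""
    return [tuple(v >= t for v, t in zip(point, thresholds)) for point in points]

def pattern_disagreement(simulation, reference, thresholds):
    """Fraction of sampling points where the simulated joint extreme
    pattern differs from the reference pattern (a crude deficiency score)."""
    ps = extreme_pattern(simulation, thresholds)
    pr = extreme_pattern(reference, thresholds)
    return sum(a != b for a, b in zip(ps, pr)) / len(pr)

# (temperature degC, daily precipitation mm) at three shared sampling points;
# the threshold values are invented for illustration
reference = [(32.0, 1.0), (28.0, 22.0), (35.0, 0.5)]
simulation = [(31.0, 2.0), (29.0, 25.0), (36.0, 0.0)]
score = pattern_disagreement(simulation, reference, thresholds=(30.0, 20.0))
```

Here the interpolation of simulation output to the reference sampling points is assumed to have already happened, which is the step the method performs first.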
Pitman, A.J.; Arneth, A.; Ganzeveld, L.N.
Global climate models simulate the Earth's climate impressively at scales of continents and greater. At these scales, large-scale dynamics and physics largely define the climate. At spatial scales relevant to policy makers, and to impacts and adaptation, many other processes may affect regional and
To validate an estimated model and to have a good understanding of its reliability is a central aspect of System Identification. This contribution discusses these aspects in the light of model error models that are explicit descriptions of the model error. A model error model is implicitly present in most model validation methods, so the concept is more of a representation form than a set of new techniques. Traditional model validation is essentially a test of whether the confidence region of...
The purpose of this deliverable 2.5 is to use fresh experimental data for validation and selection of a flow model to be used for control design in WP3-4. Initially the idea was to investigate the models developed in WP2. However, in the project it was agreed to include and focus on an additive...... model turns out not to be useful for prediction of the flow. Moreover, standard Box-Jenkins model structures and multiple-output autoregressive models prove to be superior, as they can give useful predictions of the flow....
Ehrhart, Mark G; Torres, Elisa M; Wright, Lisa A; Martinez, Sandra Y; Aarons, Gregory A
There is increasing emphasis on the use of evidence-based practices (EBPs) in child welfare settings and growing recognition of the importance of the organizational environment, and the organization's climate in particular, for how employees perceive and support EBP implementation. Recently, Ehrhart, Aarons, and Farahnak (2014) reported on the development and validation of a measure of EBP implementation climate, the Implementation Climate Scale (ICS), in a sample of mental health clinicians. The ICS consists of 18 items and measures six critical dimensions of implementation climate: focus on EBP, educational support for EBP, recognition for EBP, rewards for EBP, selection for EBP, and selection for openness. The goal of the current study is to extend this work by providing evidence for the factor structure, reliability, and validity of the ICS in a sample of child welfare service providers. Survey data were collected from 215 child welfare providers across three states, 12 organizations, and 43 teams. Confirmatory factor analysis demonstrated good fit to the six-factor model, and the alpha reliabilities for the overall measure and its subscales were acceptable. In addition, there was general support for the invariance of the factor structure across the child welfare and mental health sectors. In conclusion, this study provides evidence for the factor structure, reliability, and validity of the ICS measure for use in child welfare service organizations. PMID:26563643
Pedersen, Rasmus Anker
Past warm climate states could potentially provide information on future global warming. The past warming was driven by changed insolation rather than an increased greenhouse effect, and thus the warm climate states are expected to be different. Nonetheless, the response of the climate system...... involves some of the same mechanisms in the two climate states. This thesis aims to investigate these mechanisms through climate model experiments. This two-part study has a special focus on the Arctic region, and the main paleoclimate experiments are supplemented by idealized experiments detailing the...... impact of a changing sea ice cover. The first part focusses on the last interglacial climate (125,000 years before present) which was characterized by substantial warming at high northern latitudes due to an increased insolation during summer. The simulations reveal that the oceanic changes dominate the...
Obiyemi, O. O.; Ibiyemi, T. S.; Ojo, J. S.
In this paper, validation of rain climatic zone classifications for Nigeria is presented based on global radio-climatic models by the International Telecommunication Union-Radiocommunication (ITU-R) and Crane. Rain rate estimates deduced from several ground-based measurements and those earlier estimated from the precipitation index of the Tropical Rainfall Measuring Mission (TRMM) were employed for the validation exercise. Although earlier classifications indicated that Nigeria falls into zones P, Q, N, and K for the ITU-R designations, and zones E and H for Crane's climatic zone designations, the results confirmed that the rain climatic zones across Nigeria can only be classified into four, namely P, Q, M, and N for the ITU-R designations, while the designations by Crane exhibited only three zones, namely E, G, and H. The ITU-R classification was found to be more suitable for planning microwave and millimeter wave links across Nigeria. The research outcomes are vital in boosting the confidence level of system designers in using the ITU-R designations as presented in the map developed for the rain zone designations for estimating the attenuation induced by rain along satellite and terrestrial microwave links over Nigeria.
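Rain-zone classifications of this kind feed link-budget calculations through the standard power-law relation between rain rate and specific attenuation, gamma = k * R**alpha, as in ITU-R Recommendation P.838. The sketch below uses placeholder coefficients and illustrative zone rain rates, not values from the ITU-R tables:

```python
def specific_attenuation(rain_rate, k, alpha):
    """Specific rain attenuation in dB/km from the power-law model
    gamma = k * R**alpha (R in mm/h). In practice k and alpha come from
    the frequency- and polarization-dependent ITU-R P.838 tables; the
    values used below are placeholders."""
    return k * rain_rate ** alpha

# Illustrative 0.01%-of-time rain rates (mm/h) for two of the zones named above
for zone, rate_mm_h in [("N", 95.0), ("P", 145.0)]:
    gamma = specific_attenuation(rate_mm_h, k=0.01, alpha=1.2)
    print(f"zone {zone}: {gamma:.2f} dB/km")
```

The higher exceedance rain rate of zone P directly yields the larger fade margin a link designer must budget for, which is why correct zone designation matters.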
Millner, Antony; McDermott, Thomas K J
Benefit-cost integrated assessment models (BC-IAMs) inform climate policy debates by quantifying the trade-offs between alternative greenhouse gas abatement options. They achieve this by coupling simplified models of the climate system to models of the global economy and the costs and benefits of climate policy. Although these models have provided valuable qualitative insights into the sensitivity of policy trade-offs to different ethical and empirical assumptions, they are increasingly being used to inform the selection of policies in the real world. To the extent that BC-IAMs are used as inputs to policy selection, our confidence in their quantitative outputs must depend on the empirical validity of their modeling assumptions. We have a degree of confidence in climate models both because they have been tested on historical data in hindcasting experiments and because the physical principles they are based on have been empirically confirmed in closely related applications. By contrast, the economic components of BC-IAMs often rely on untestable scenarios, or on structural models that are comparatively untested on relevant time scales. Where possible, an approach to model confirmation similar to that used in climate science could help to build confidence in the economic components of BC-IAMs, or focus attention on which components might need refinement for policy applications. We illustrate the potential benefits of model confirmation exercises by performing a long-run hindcasting experiment with one of the leading BC-IAMs. We show that its model of long-run economic growth, one of its most important economic components, had questionable predictive power over the 20th century. PMID:27432964
This paper gives an outline of climate modeling at Manitoba Hydro. Manitoba Hydro is studying climate change because it affects water supply and energy demand. Hence climate change must be addressed in planning and operation of hydropower projects as well as regulatory and compliance issues. The study has developed a series of climate change scenarios based on the Global Climate Models
Skalák, Petr; Déqué, M.; Belda, M.; Farda, Aleš; Halenka, T.; Csima, G.; Bartholy, J.; Caian, M.; Spiridonov, V.
Vol. 60, No. 1 (2014), p. 1-12. ISSN 0936-577X. R&D Projects: GA MŠk(CZ) ED1.1.00/02.0073; GA MŠk(CZ) EE2.4.31.0056. Institutional support: RVO:67179843. Keywords: RCM * Model performance * Validation * CECILIA * ALADIN-Climate * RegCM3. Subject RIV: EH - Ecology, Behaviour. Impact factor: 2.496, year: 2014
North, G. R.; Cahalan, R. F.; Coakley, J. A., Jr.
An introductory survey of the global energy balance climate models is presented with an emphasis on analytical results. A sequence of increasingly complicated models involving ice cap and radiative feedback processes are solved, and the solutions and parameter sensitivities are studied. The model parameterizations are examined critically in light of many current uncertainties. A simple seasonal model is used to study the effects of changes in orbital elements on the temperature field. A linear stability theorem and a complete nonlinear stability analysis for the models are developed. Analytical solutions are also obtained for the linearized models driven by stochastic forcing elements. In this context the relation between natural fluctuation statistics and climate sensitivity is stressed.
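The simplest member of the model hierarchy surveyed here, a zero-dimensional energy balance model with a crude ice-albedo feedback, can be written in a few lines. The coefficients below are illustrative Budyko-like values, not the ones used in the survey:

```python
def equilibrium_temperature(S=342.0, A=203.3, B=2.09):
    """Zero-dimensional energy balance model: absorbed solar radiation
    S * (1 - albedo) balances outgoing longwave A + B * T, with a
    step-function ice-albedo feedback. S, A, B (W/m^2 and W/m^2/degC)
    and the albedo values are illustrative Budyko-like numbers.
    Solved by fixed-point iteration."""
    T = 0.0  # deg C, initial guess
    for _ in range(100):
        albedo = 0.62 if T < -10.0 else 0.30  # ice-covered vs. largely ice-free planet
        T = (S * (1.0 - albedo) - A) / B
    return T

T_eq = equilibrium_temperature()  # roughly 17 deg C with these numbers
```

Lowering S in this toy model can flip the planet into the high-albedo branch, a caricature of the ice-cap instabilities and parameter sensitivities the survey treats analytically.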
Fillol, Erwann Joachim
indicators developed in this thesis, a North/South disparity over Canada. The region north of the 55th parallel indicated a warming trend (increase in the annual degree-days, reduction in the length of the snow cover period, increase in the length of the growing season, increase in the air temperature), while southern regions of Canada appeared to be cooling (based on the same indicators). Validation of the Canadian Regional Climate Model (towards the development of future climate prediction tools): the second part of this thesis focuses on the validation, using remotely sensed measurements, of a specific surface field of the CRCM (Canadian Regional Climate Model), namely the land-surface temperature. The comparison over a short time scale between the ground temperature values modeled by the CRCM and composited satellite temperatures shows the feasibility of validating climate models using remote sensing. The results show a slight under-estimation of the CRCM ground temperature during the summer. This is possibly due to an overestimation of the precipitation rate, which in turn generates excessive surface wetness and an excessive evaporation rate (thus cooling the surface). The agreement observed between model and measurements strongly suggests that climate models of the type used in this work should facilitate reliable predictions of future climate trends and help orient the decision-making process for the world community as we collectively face the prospect of climate imbalance.
The aim of the study was to examine the construct validity and internal consistency of the Motivational Climate in Physical Education Scale (MCPES). A key element of the development process of the scale was establishing a theoretical framework that integrated the dimensions of task- and ego-involving climates in conjunction with autonomy- and social-relatedness-supporting climates. These constructs were adopted from the self-determination and achievement goal theories. A sample of Finnish Grade 9 students, comprising 2,594 girls and 1,803 boys, completed the 18-item MCPES during one physical education class. The results of the study demonstrated that participants had the highest mean in the task-involving climate and the lowest in the autonomy climate and ego-involving climate. Additionally, the autonomy, social relatedness, and task-involving climates were significantly and strongly correlated with each other, whereas the ego-involving climate had low or negligible correlations with the other climate dimensions. The construct validity of the MCPES was analyzed using confirmatory factor analysis. The statistical fit of the four-factor model consisting of motivational climate factors supporting perceived autonomy, social relatedness, task-involvement, and ego-involvement was satisfactory. The results of the reliability analysis showed acceptable internal consistencies for all four dimensions. The Motivational Climate in Physical Education Scale can be considered a psychometrically valid tool to measure motivational climate among Finnish Grade 9 students.
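Internal consistency of the kind reported for the MCPES subscales is typically summarized with Cronbach's alpha, which can be computed directly from respondent-by-item scores. This is a generic sketch, not the authors' analysis code, and the sample responses are invented:

```python
def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(scores):
    """Cronbach's alpha for scores[respondent][item]:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(scores[0])
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1.0 - sum(item_vars) / total_var)

# Invented responses to a three-item subscale, one row per student
alpha = cronbach_alpha([[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 4, 3]])
```

Values near 1 indicate that the items of a subscale move together across respondents; "acceptable" in scale-validation work usually means alpha above roughly 0.7.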
Beaumet, Julien; Doutreloup, Sébastien; Fettweis, Xavier; Erpicum, Michel
Solar irradiance modelling is crucial for solar resource management, photovoltaic production forecasting, and better integration of solar energy in the electrical grid. For those reasons, an adapted version of the Modèle Atmosphérique Régional (MAR) is being developed at the Laboratory of Climatology of the University of Liège in order to provide high-quality modelling of solar radiation, wind and temperature over north-western Europe. In this new model version, the radiation scheme has been calibrated using solar irradiance in-situ measurements, and CORINE Land Cover data have been assimilated in order to improve the modelling of 10 m wind speed and near-surface temperature. In this study, MAR is forced at its boundaries by the ERA-40 reanalysis and its horizontal resolution is 10 kilometres. Diffuse radiation is estimated using global radiation from MAR outputs and a calibrated version of the Ruiz-Arias et al. (2010) sigmoid model. This study evaluates the performance of the method for global and diffuse radiation modelling at both the hourly and daily time scales using data from the European Solar Radiation Atlas database for the weather stations of Uccle (Belgium) and Braunschweig (Germany). After that, a 30-year climatology of global and diffuse irradiance for the 1981-2010 period over western Europe is built. The resulting data set is then analysed in order to highlight possible regional or seasonal trends. The validity of the results is evaluated by comparison with trends found in in-situ data or in different studies from the literature.
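The diffuse-radiation step can be illustrated with a generic sigmoid mapping from the clearness index kt to the diffuse fraction kd, in the spirit of the sigmoid family used by Ruiz-Arias et al. (2010); the steepness and midpoint here are placeholders, not the published or recalibrated fit:

```python
import math

def diffuse_fraction(kt, steepness=10.0, midpoint=0.5):
    """Sigmoid mapping from the clearness index kt (surface global
    irradiance relative to extraterrestrial) to the diffuse fraction kd.
    The functional form follows the sigmoid family of Ruiz-Arias et al.
    (2010); steepness and midpoint are illustrative placeholder values."""
    return 1.0 / (1.0 + math.exp(steepness * (kt - midpoint)))

# Overcast skies (low kt) are mostly diffuse; clear skies mostly direct
kd_overcast = diffuse_fraction(0.2)
kd_clear = diffuse_fraction(0.8)
```

Diffuse irradiance then follows as kd times the modelled global irradiance, which is how the study derives its diffuse climatology from MAR's global radiation output.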
How much can we trust model-based projections of future anthropogenic climate change? This review attempts to give an overview of this important but difficult topic by using three main lines of evidence: the skill of models in simulating present-day climate, intermodel agreement on future climate changes, and the ability of models to simulate climate changes that have already occurred. A comparison of simulated and observed present-day climates shows good agreement for many basic variables, p...
Cale, Jr, W. G.; Shugart, H. H.
Definitions of model realism and model validation are developed. Ecological and mathematical arguments are then presented to show that model equations which explicitly treat ecosystem processes can be systematically improved such that greater realism is attained and the condition of validity is approached. Several examples are presented.
School climate is recognized as a relevant factor for the improvement of educative processes, favoring administrative processes and optimum school performance. The present article is the result of quantitative research with the objective of psychometrically designing and validating a scale to diagnose the organizational climate of…
In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because the multiple experimental protocols employed in different laboratories, and the strengthening of their reliability, presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.
Frigg, Roman; Thompson, Erica; Werndl, Charlotte
This is the second of three parts of an introduction to the philosophy of climate science. In this second part about modelling climate change, the topics of climate modelling, confirmation of climate models, the limits of climate projections, uncertainty and finally model ensembles will be discussed.
National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets...
The primary objective of the study was to quantitatively test the DART model, which despite being one of the most popular representations of co-creation concept was so far studied almost solely with qualitative methods. To this end, the researchers developed a multiple measurement scale and employed it in interviewing managers. The statistical evidence for adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much of conceptual overlap to be an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model where co-creation is conceived as a third-level factor with two layers of intermediate latent variables.
Selected pigments, diatoms and diatom-inferred phosphorus (Di-TP) concentrations of a late glacial sediment core section of the meromictic Längsee, Austria, were compared with tephra- and varve-dated pollen stratigraphic and geochemical results. A conceptual model was adopted for Längsee and evaluated using multi-proxy data. During the unforested late Pleniglacial, a holomictic lake stage with low primary productivity prevailed. Subsequent to the Lateglacial Betula expansion, at about 14,300 cal. y BP, okenone and isorenieratene, pigments from purple and green sulphur bacteria, indicate the onset of anoxic conditions in the hypolimnion. The formation of laminae coincides with this anoxic, meromictic period with high, though fluctuating, amounts of okenone that persisted throughout the Lateglacial interstadial. The occurrence of unlaminated sediment sections of allochthonous origin, and concurrent low concentrations of okenone, were related to cool and wet climate fluctuations during this period, probably coupled with a complete mixing of the water column. Two of these oscillations of the Lateglacial interstadial have been correlated tentatively with the Aegelsee and Gerzensee oscillations in the Alps. The latter climate fluctuation divides a period of enhanced anoxia and primary productivity, correlated with the Alleröd chronozone. Continental climate conditions were assumed to be the main driving forces for meromictic stability during Alleröd times. In addition, calcite dissolution due to severe hypolimnetic anoxia appears to have supported meromictic stability. Increased pigment concentrations, which are in contrast to low diatom-inferred total phosphorus (Di-TP), indicate the formation of a productive metalimnion during this period, probably due to a clear-water phase (low catchment erosion), increased temperatures, and a steep gradient between the phosphorus-enriched hypolimnion and the oligotrophic epilimnion. Meltwater impacts from an
Franzke, Christian L E; Berner, Judith; Williams, Paul D; Lucarini, Valerio
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations as well as for model error representation, uncertainty quantification, data assimilation and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochast...
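The core idea above — representing unresolved fast scales by a stochastic term added to a deterministic reduced-order model — can be sketched with an Euler–Maruyama integration of a single damped mode (the damping rate and noise amplitude below are illustrative assumptions, not values from any particular climate model):

```python
import numpy as np

def euler_maruyama(f, sigma, x0, dt, n_steps, rng):
    """Integrate dX = f(X) dt + sigma dW: a deterministic reduced-order
    model plus an additive stochastic term standing in for unresolved,
    fast subgrid-scale processes."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        x[k + 1] = x[k] + f(x[k]) * dt + sigma * dw
    return x

rng = np.random.default_rng(0)
# Resolved dynamics: linear damping of a single large-scale mode.
traj = euler_maruyama(lambda x: -0.5 * x, sigma=0.2, x0=1.0,
                      dt=0.01, n_steps=20_000, rng=rng)
# The stationary variance should approach sigma**2 / (2 * damping) = 0.04.
print(traj[5000:].var())
```

In a real reduced-order climate model, `f` would encode the resolved large-scale dynamics and `sigma` would be fitted so the stochastic term reproduces the statistics of the unresolved degrees of freedom.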
Beckman, Jayson, E-mail: JBeckman@ers.usda.gov [Economic Research Service, USDA, Washington DC, 20036 (United States); Hertel, Thomas; Tyner, Wallace [Purdue University (United States)
Although CGE models have received heavy usage - particularly in the analysis of broad-based policies relating to energy, climate and trade, they are often criticized as being insufficiently validated. Key parameters are often not econometrically estimated, and the performance of the model as a whole is rarely checked against historical outcomes. As a consequence, questions frequently arise as to how much faith one can put in CGE results. In this paper, we employ a novel approach to the validation of a widely utilized global CGE model - GTAP-E. By comparing the variance of model-generated petroleum price distributions - driven by historical demand and supply shocks to the model - with observed five-year moving average price distributions, we conclude that energy demand in GTAP-E is far too price-elastic over this medium run time frame. After incorporating the latest econometric estimates of energy demand and supply elasticities, we revisit the validation question and find the model to perform more satisfactorily. As a further check, we compare a deterministic global general equilibrium simulation, based on historical realizations over the five year period: 2001-2006, during which petroleum prices rose sharply, along with growing global energy demands. As anticipated by the stochastic simulations, the revised model parameters perform much better than the original GTAP-E parameters in this global, general equilibrium context.
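The variance-comparison diagnostic described above can be sketched as follows; the two samples here are synthetic stand-ins (illustrative assumptions, not GTAP-E output or actual petroleum prices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins: model-generated petroleum price changes driven
# by resampled historical demand/supply shocks, versus observed five-year
# moving-average price changes. Real data would replace these draws.
model_prices = rng.normal(0.0, 0.30, size=500)   # over-responsive model
observed_ma = rng.normal(0.0, 0.10, size=40)

ratio = model_prices.var(ddof=1) / observed_ma.var(ddof=1)
# A variance ratio far above 1 suggests the model's demand response is
# too price-elastic, the diagnostic the abstract describes for GTAP-E.
print(ratio > 2.0)
```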
LIU Fei; MA Ping; YANG Ming; WANG Zi-cai
To provide a realistic simulation environment for users, intelligent models have become key components in military simulations. After an analysis of the modeling nature of intelligent models, validation criteria for defining the validation points, and validation metrics for measuring the agreement between human experts and intelligent models, were presented. Further, such methods as graphical comparison, feature analysis and face validation were discussed according to the characteristics of intelligent models. Based on these validation criteria, validation metrics and validation methods, intelligent models can be effectively validated, as has been demonstrated with currently developed intelligent models.
The evidence for the construct validity of classroom climate measures was examined within a theoretical model where several affective measures served as criterion variables. The patterns of social-psychological and management-organization structural dimensions differed when the unit of analysis varied from the individual to the class. The…
Larsen, Morten Andreas Dahl
global warming and increased frequency of extreme events. The skill in developing projections of both the present and future climate depends essentially on the ability to numerically simulate the processes of atmospheric circulation, hydrology, energy and ecology. Previous modelling efforts of climate...... existing climate and hydrology models to more directly include the interaction between the atmosphere and the land surface. The present PhD study is motivated by an ambition of developing and applying a modelling tool capable of including the interaction and feedback mechanisms between the atmosphere and...... the land surface. The modelling tool consists of a fully dynamic two-way coupling of the HIRHAM regional climate model and the MIKE SHE hydrological model. The expected gain is twofold. Firstly, HIRHAM utilizes the land surface component of the combined MIKE SHE/SWET hydrology and land surface model...
A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case, and it has a residual character: the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology helps to detect the failures of the simulation model, and it can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. Sensitivity analysis: this can be performed with DSA, differential sensitivity analysis, and with MCSA, Monte-Carlo sensitivity analysis. Finding the optimal domains of the input parameters: a procedure based on Monte-Carlo methods and cluster techniques has been developed for this purpose. Residual analysis: this analysis has been carried out in both the time domain and the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings, Esp., is presented, studying the behavior of building components in a Test Cell of LECE of CIEMAT. (Author) 17 refs
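The Monte-Carlo sensitivity analysis (MCSA) step can be illustrated with a deliberately crude stand-in for the thermal model; the model form, parameter ranges and units below are assumptions chosen only for illustration:

```python
import numpy as np

def thermal_model(u_value, solar_gain, t_out):
    """Hypothetical stand-in for a building thermal simulation:
    steady-state indoor temperature from a crude heat balance."""
    return t_out + solar_gain / u_value

rng = np.random.default_rng(2)
n = 10_000
# Sample uncertain inputs over plausible (assumed) ranges.
u = rng.uniform(0.5, 2.0, n)      # envelope U-value, W/m2K
g = rng.uniform(50.0, 150.0, n)   # solar gain, W/m2
t = rng.uniform(5.0, 15.0, n)     # outdoor temperature, degC
out = thermal_model(u, g, t)

# MCSA-style screening: rank inputs by absolute correlation with output.
for name, x in [("u_value", u), ("solar_gain", g), ("t_out", t)]:
    print(name, round(abs(np.corrcoef(x, out)[0, 1]), 2))
```

In a DSA one would instead perturb each parameter individually around a base case; the Monte-Carlo variant shown here samples all inputs jointly and screens them statistically.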
Rumsey, Christopher L.
Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important
Maraun, Douglas; Widmann, Martin
When using climate change scenarios - either from global climate models or further downscaled - to assess localised real world impacts, one has to ensure that the local simulation indeed correctly represents the real world local climate. Representativeness has so far mainly been discussed as a scale issue: simulated meteorological variables in general represent grid box averages, whereas real weather is often expressed by means of point values. As a result, in particular simulated extreme values are not directly comparable with observed local extreme values. Here we argue that the issue of representativeness is more general. To illustrate this point, assume the following situations: first, the (GCM or RCM) simulated large scale weather, e.g., the mid-latitude storm track, might be systematically distorted compared to observed weather. If such a distortion at the synoptic scale is strong, the simulated local climate might be completely different from the observed. Second, the orography even of high resolution RCMs is only a coarse model of true orography. In particular in mountain ranges the simulated mesoscale flow might therefore considerably deviate from the observed flow, leading to systematically displaced local weather. In both cases, the simulated local climate does not represent observed local climate. Thus, representativeness also encompasses representing a particular location. We propose to measure this aspect of representativeness for RCMs driven with perfect boundary conditions as the correlation between observations and simulations at the inter-annual scale. In doing so, random variability generated by the RCMs is largely averaged out. As an example, we assess how well KNMI's RACMO2 RCM at 25km horizontal resolution represents winter precipitation in the gridded E-OBS data set over the European domain. At a chosen grid box, RCM precipitation might not be representative of observed precipitation, in particular in the rain shadow of major mountain ranges
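The proposed representativeness measure — inter-annual correlation between simulation and observation at a grid box — is simple to compute; a minimal sketch with synthetic stand-ins for the two winter-mean precipitation series (the numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
years = 30

# Hypothetical winter-mean precipitation (mm/day) at one grid box:
# observations, plus an RCM run driven by perfect boundary conditions
# that tracks the observed inter-annual signal with some error.
obs = rng.normal(3.0, 0.8, years)
rcm = 0.9 * obs + rng.normal(0.0, 0.3, years)

# Representativeness score: inter-annual correlation between sim and obs.
r = np.corrcoef(obs, rcm)[0, 1]
print(r > 0.8)
```

A low correlation at a grid box would flag locations (e.g. in the rain shadow of mountain ranges) where the simulated series does not represent the observed local climate.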
Guo, Ying; Shen, Yanjun
We have developed an operational model to simulate water and energy fluxes in the Haihe River Basin (231,800 km2 in size) for the past 28 years. This model is capable of estimating water and energy fluxes of irrigated croplands and heterogeneous grids. The model was validated using actual evapotranspiration (ETa) measured by an eddy covariance system, measured soil moisture in croplands, groundwater level measurements over the piedmont plain and runoff observations in a mountainous catchment. A long-term time series of water and energy balance components were then simulated at a daily time step by integrating remotely sensed information and meteorological data to examine the spatial and temporal distribution and changes in water and energy fluxes in the basin over the past 28 years. The results show that net radiation (Rn) in the mountainous regions is generally higher than that in the plain regions. ETa in the plain regions is higher than that in the mountainous regions mostly because of higher air temperature and larger areas of irrigated farmland. Higher sensible heat flux (H) and lower ETa in the urban areas are possibly due to less vegetation cover, an impervious surface, rapid drainage, and the heat island effect of cities. During the study period, a water deficit continuously occurred in the plain regions because of extensive pumping of groundwater for irrigation to meet the crop water requirements. Irrigation has led to significant groundwater depletion, which poses a substantial challenge to the sustainability of water resources in this basin.
Pallant, Amy; Lee, Hee-Sun; Pryputniewicz, Sara
Systems thinking suggests that one can best understand a complex system by studying the interrelationships of its component parts rather than looking at the individual parts in isolation. With ongoing concern about the effects of climate change, using innovative materials to help students understand how Earth's systems connect with each other is…
Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering]; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems]
In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, that are investigated by direct infrared imaging, showing that even at low current operation such gradients are present in fuel cell operation, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited due to induced high pressure drops in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones with good qualitative and quantitative agreement. The combination of accuracy and low computational time allow for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)
Williamson, J. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States); Puttagunta, S. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)
This study was intended to validate actual performance of three ZERHs in the Northeast to energy models created in REM/Rate v14.5 (one of the certified software programs used to generate a HERS Index) and the National Renewable Energy Laboratory’s Building Energy Optimization (BEopt™) v2.3 E+ (a more sophisticated hourly energy simulation software). This report details the validation methods used to analyze energy consumption at each home.
Raghavan, S. V.; Vu, M. T.; Liong, S. Y.
We present an analysis of present-day (1961-1990) regional climate simulations over Vietnam. The regional climate model Weather Research and Forecasting (WRF) was driven by the global reanalysis ERA40. The performance of the regional climate model in simulating the observed climate is evaluated with a main focus on precipitation and temperature. The regional climate model was able to reproduce the observed spatial patterns of the climate, although with some biases. The model also performed well in reproducing extreme precipitation and the interannual variability. Overall, the WRF model was able to simulate the main regional signatures of climate variables, seasonal cycles, and frequency distributions. This study is an evaluation of the present-day climate simulations of a regional climate model at a resolution of 25 km. Given that dynamical downscaling has become common for studying climate change and its impacts, the study highlights that further improvements in modeling may be necessary to yield realistic simulations of climate at high resolutions before they can be used for impact studies at a local scale. The need for a dense network of observations is also evident, as observations at high resolutions are needed for the evaluation and validation of models at sub-regional and local scales.
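Evaluations like the one above typically start from simple skill scores comparing simulated and observed series; a minimal sketch with made-up station values (illustrative assumptions, not WRF output):

```python
import numpy as np

def bias_and_rmse(sim, obs):
    """Standard skill scores for comparing a regional climate model
    against station or gridded observations."""
    diff = np.asarray(sim, dtype=float) - np.asarray(obs, dtype=float)
    return diff.mean(), np.sqrt((diff ** 2).mean())

# Hypothetical monthly-mean temperatures (degC) at one station.
obs = [26.1, 27.3, 28.0, 28.4, 27.9, 26.5]
sim = [25.2, 26.8, 27.1, 27.9, 27.0, 25.8]

b, rmse = bias_and_rmse(sim, obs)
print(round(b, 2), round(rmse, 2))  # → -0.73 0.75 (a systematic cold bias)
```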
Katz, J. I.
A "toy" model, simple and elementary enough for an undergraduate class, of the temperature dependence of the greenhouse (mid-IR) absorption by atmospheric water vapor implies a bistable climate system. The stable states are glaciation and warm interglacials, while intermediate states are unstable. This is in qualitative accord with the paleoclimatic data. The present climate may be unstable, with or without anthropogenic interventions such as CO$_2$ emission, unless there is additional stabilizing feedback such as "geoengineering".
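A bistable energy-balance toy in the spirit of this abstract can be sketched by letting the effective emissivity fall with temperature, mimicking stronger water-vapour greenhouse absorption in a warmer atmosphere. All parameter values below are assumptions chosen only to produce three equilibria, not the paper's actual model:

```python
import numpy as np

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
ABSORBED = 342.0 * 0.69  # solar input with an assumed fixed albedo of 0.31

def emissivity(t):
    """Toy parameterization (assumed): effective emissivity falls with
    temperature as water vapour absorbs more mid-IR radiation."""
    return 0.65 - 0.12 * np.tanh((t - 283.0) / 10.0)

def net_flux(t):
    """Absorbed solar minus emitted thermal radiation, W/m2."""
    return ABSORBED - emissivity(t) * SIGMA * t ** 4

t = np.arange(250.0, 310.0, 0.1)
f = net_flux(t)
# Equilibria are sign changes of the net flux: cold stable, unstable
# intermediate, warm stable -- the bistable structure the abstract describes.
equilibria = t[:-1][np.sign(f[:-1]) != np.sign(f[1:])]
print(len(equilibria))
```

With these assumed parameters the scan finds three equilibria near 274 K, 284 K and 295 K; the middle one is unstable because emitted radiation there decreases with warming.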
ZENG Qingcun; WANG Huijun; LIN Zhaohui; ZHOU Guangqing; YU Yongqiang
The implementation of the project has lasted for more than 20 years. As a result, the following key innovative achievements have been obtained, ranging from the basic theory of climate dynamics, numerical model development and its related computational theory, to dynamical climate prediction using the climate system models:
Sliter, Katherine A
Due to the obesity epidemic, an increasing amount of research is being conducted to better understand the antecedents and consequences of excess employee weight. One construct often of interest to researchers in this area is organizational climate. Unfortunately, a viable measure of climate, as related to employee weight, does not exist. The purpose of this study was to remedy this by developing and validating a concise, psychometrically sound measure of climate for healthy weight. An item pool was developed based on surveys of full-time employees, and a sorting task was used to eliminate ambiguous items. Items were pilot tested by a sample of 338 full-time employees, and the item pool was reduced through item response theory (IRT) and reliability analyses. Finally, the retained 14 items, comprising 3 subscales, were completed by a sample of 360 full-time employees, representing 26 different organizations from across the United States. Multilevel modeling indicated that sufficient variance was explained by group membership to support aggregation, and confirmatory factor analysis (CFA) supported the hypothesized model of 3 subscale factors and an overall climate factor. Nine hypotheses specific to construct validation were tested. Scores on the new scale correlated significantly with individual-level reports of psychological constructs (e.g., health motivation, general leadership support for health) and physiological phenomena (e.g., body mass index [BMI], physical health problems) to which they should theoretically relate, supporting construct validity. Implications for the use of this scale in both applied and research settings are discussed. PMID:23834449
North, G. R.; Cahalan, R. F.; Coakley, J. A., Jr.
An introductory survey of the global energy balance climate models is presented with an emphasis on analytical results. A sequence of increasingly complicated models involving ice cap and radiative feedback processes are solved and the solutions and parameter sensitivities are studied. The model parameterizations are examined critically in light of many current uncertainties. A simple seasonal model is used to study the effects of changes in orbital elements on the temperature field. A linear stability theorem and a complete nonlinear stability analysis for the models are developed. Analytical solutions are also obtained for the linearized models driven by stochastic forcing elements. In this context the relation between natural fluctuation statistics and climate sensitivity is stressed.
This thesis develops a novel framework for model skill assessment and the generation of probabilistic future climate scenarios. Traditional approaches to model validation assume that skill in simulating the mean climate is a valid indicator of skill in modelling the climate system. However, without information about how errors arise, conclusions cannot be drawn about whether models are genuinely skilful. Initially, verification statistics are used to assess model skill in simul...
Glenis, Vassilis; Pinamonti, Valentina; Hall, Jim W.; Kilsby, Chris G.
Stochastic weather generators (WGs), which provide long synthetic time series of weather variables such as rainfall and potential evapotranspiration (PET), have found widespread use in water resources modelling. When conditioned upon the changes in climatic statistics (change factors, CFs) predicted by climate models, WGs provide a useful tool for climate impacts assessment and adaption planning. The latest climate modelling exercises have involved large numbers of global and regional climate models integrations, designed to explore the implications of uncertainties in the climate model formulation and parameter settings: so called 'perturbed physics ensembles' (PPEs). In this paper we show how these climate model uncertainties can be propagated through to impact studies by testing multiple vectors of CFs, each vector derived from a different sample from a PPE. We combine this with a new methodology to parameterise the projected time-evolution of CFs. We demonstrate how, when conditioned upon these time-dependent CFs, an existing, well validated and widely used WG can be used to generate non-stationary simulations of future climate that are consistent with probabilistic outputs from the Met Office Hadley Centre's Perturbed Physics Ensemble. The WG enables extensive sampling of natural variability and climate model uncertainty, providing the basis for development of robust water resources management strategies in the context of a non-stationary climate.
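Conditioning a weather generator on change factors amounts to perturbing the statistics of the synthetic series; a minimal sketch for mean and variance CFs (the CF values and rainfall distribution are illustrative assumptions, and a real WG would condition its parameters directly and preserve non-negativity of rainfall):

```python
import numpy as np

def apply_change_factors(baseline, cf_mean, cf_var):
    """Perturb a synthetic weather series so its mean and variance match
    the changes (CFs) projected by one climate model ensemble member."""
    b = np.asarray(baseline, dtype=float)
    anomalies = b - b.mean()
    return b.mean() * cf_mean + anomalies * np.sqrt(cf_var)

rng = np.random.default_rng(4)
rain = rng.gamma(shape=2.0, scale=2.0, size=1000)  # synthetic daily rainfall

# Hypothetical CFs, as if sampled from one member of a perturbed physics
# ensemble: +10% mean, +30% variance.
future = apply_change_factors(rain, cf_mean=1.10, cf_var=1.30)
print(round(future.mean() / rain.mean(), 2),
      round(future.var() / rain.var(), 2))  # → 1.1 1.3
```

Repeating this for a vector of CFs per ensemble member, and letting the CFs evolve in time, gives the non-stationary, uncertainty-sampling simulations the abstract describes.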
A dynamical downscaling is presented that allows an estimation of potential effects of climate change on the North Sea. To this end, the ocean general circulation model OPYC is adapted for application on a shelf by adding a lateral boundary formulation and a tide model. In this set-up the model is forced, first, with data from the ECMWF reanalysis for model validation and the study of the natural variability, and, second, with data from climate change experiments to estimate the effects of climate change on the North Sea. (orig.)
Roeckner, E. [Max Planck Institute for Meterology, Hamburg (Germany)
Clouds are a very important, yet poorly modeled element in the climate system. There are many potential cloud feedbacks, including those related to cloud cover, height, water content, phase change, and droplet concentration and size distribution. As a prerequisite to studying the cloud feedback issue, this research reports on the simulation and validation of cloud radiative forcing under present climate conditions using the ECHAM general circulation model and ERBE top-of-atmosphere radiative fluxes.
The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached WP. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation
Thanks to more than 20 years of work, a research team led by Prof. ZENG Qingcun and Prof. WANG Huijun from the CAS Institute of Atmospheric Physics (IAP) has scored innovative achievements in its studies of the basic theory of climate dynamics, numerical model development and its related computational theory, and dynamical climate prediction using climate system models. Their work received a second prize of the National Award for Natural Sciences in 2005.
Braverman, Amy; Cressie, Noel; Teixeira, Joao
Climate models are deterministic, mathematical descriptions of the physics of climate. Confidence in predictions of future climate is increased if the physics are verifiably correct. A necessary (but not sufficient) condition is that past and present climate be simulated well. We quantify the likelihood that a summary statistic computed from a set of observations arises from a physical system with the characteristics captured by a model-generated time series. Given a prior on models, we can go further and obtain the posterior distribution of a model given the observations.
Coddington, O.; Lean, J.; Pilewskie, P.; Snow, M. A.; Lindholm, D. M.
A new climate data record of Total Solar Irradiance (TSI) and Solar Spectral Irradiance (SSI), including source code and supporting documentation, is now publicly available as part of the National Oceanic and Atmospheric Administration's (NOAA) National Centers for Environmental Information (NCEI) Climate Data Record (CDR) Program. Daily and monthly averaged values of TSI and SSI, with associated time and wavelength dependent uncertainties, are estimated from 1882 to the present with yearly averaged values since 1610, updated quarterly for the foreseeable future. The new Solar Irradiance Climate Data Record, jointly developed by the University of Colorado at Boulder's Laboratory for Atmospheric and Space Physics (LASP) and the Naval Research Laboratory (NRL), is constructed from solar irradiance models that determine the changes from quiet Sun conditions when bright faculae and dark sunspots are present on the solar disk. The magnitudes of the irradiance changes that these features produce are determined from linear regression of the proxy Mg II index and sunspot area indices against the approximately decade-long solar irradiance measurements made by instruments on the SOlar Radiation and Climate Experiment (SORCE) spacecraft. We describe the model formulation, uncertainty estimates, operational implementation and validation approach. Future efforts to improve the uncertainty estimates of the Solar Irradiance CDR arising from model assumptions, and augmentation of the solar irradiance reconstructions with direct measurements from the Total and Spectral Solar Irradiance Sensor (TSIS: launch date, July 2017) are also discussed.
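The proxy-regression step can be sketched with synthetic stand-ins for the Mg II index, sunspot area and measured TSI; all numbers below are illustrative assumptions, not NRL model values:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 365  # one year of hypothetical daily values

# Synthetic stand-ins: facular brightening tracks an Mg II-like index,
# sunspot darkening tracks sunspot area (coefficients are made up).
mg_ii = rng.normal(0.27, 0.005, n)
spot_area = rng.uniform(0.0, 2000.0, n)
tsi = (1360.5 + 80.0 * (mg_ii - 0.27) - 2.5e-4 * spot_area
       + rng.normal(0.0, 0.05, n))

# Least-squares regression of irradiance against the two proxies, as in
# the regression-against-SORCE step the abstract describes.
X = np.column_stack([np.ones(n), mg_ii - 0.27, spot_area])
coef, *_ = np.linalg.lstsq(X, tsi, rcond=None)
print(round(coef[0], 1))  # recovers the assumed quiet-Sun level, 1360.5
```

Once the facular and sunspot coefficients are fitted over the measurement era, the same regression can be driven by the long proxy records to reconstruct irradiance back in time.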
Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper; Rasmussen, John
This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need […] practical steps for improvement of the validation of multibody musculoskeletal models are pointed out and directions for future research in the field are proposed. It is our hope that a more structured approach to model validation can help to improve the credibility of musculoskeletal models.
Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.
We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
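As a toy illustration of the Polynomial Chaos surrogate idea described above (not the CLM-specific machinery), a Hermite-polynomial expansion of a scalar output in a standard-normal input can be fit by least squares from a sparse set of "expensive" runs and then used to propagate input uncertainty cheaply. The model function and all numbers here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    """Stand-in for a costly simulation: scalar output of a standard-normal input."""
    return np.exp(0.3 * x) + 0.1 * x ** 2

# Sparse "training" runs, as if each evaluation were expensive.
x_train = rng.standard_normal(20)
y_train = expensive_model(x_train)

# Fit a degree-4 probabilists' Hermite (He_n) expansion by least squares.
degree = 4
V = np.polynomial.hermite_e.hermevander(x_train, degree)  # design matrix
coef, *_ = np.linalg.lstsq(V, y_train, rcond=None)

def surrogate(x):
    """Cheap spectral surrogate of the expensive model."""
    return np.polynomial.hermite_e.hermeval(x, coef)

# Propagate input uncertainty through the surrogate instead of the model.
x_mc = rng.standard_normal(100_000)
y_mc = surrogate(x_mc)
print("surrogate mean:", y_mc.mean(), "std:", y_mc.std())
```

Hermite polynomials are the natural basis for Gaussian inputs because they are orthogonal under the standard-normal weight; in a real application the input would be multidimensional and the coefficients estimated with sparsity-promoting or Bayesian regression, as the abstract suggests.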
WU Tongwen; WU Fanghua; LIU Yiming; ZHANG Fang; SHI Xueli; CHU Min; ZHANG Jie; FANG Yongjie; WANG Fang; LU Yixiong; LIU Xiangwen; SONG Lianchun; WEI Min; LIU Qianxia; ZHOU Wenyan; DONG Min; ZHAO Qigeng; JI Jinjun; Laurent LI; ZHOU Mingyu; LI Weiping; WANG Zaizhi; ZHANG Hua; XIN Xiaoge; ZHANG Yanwu; ZHANG Li; LI Jianglong
This paper reviews recent progress in the development of the Beijing Climate Center Climate System Model (BCC-CSM) and its four component models (atmosphere, land surface, ocean, and sea ice). Two recent versions are described: BCC-CSM1.1 with coarse resolution (approximately 2.8125°×2.8125°) and BCC-CSM1.1(m) with moderate resolution (approximately 1.125°×1.125°). Both versions are fully coupled climate-carbon cycle models that simulate the global terrestrial and oceanic carbon cycles and include dynamic vegetation. Both models simulate well the concentration and temporal evolution of atmospheric CO2 during the 20th century with anthropogenic CO2 emissions prescribed. Simulations using these two versions of the BCC-CSM model have been contributed to the Coupled Model Intercomparison Project phase five (CMIP5) in support of the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). These simulations are available for use by both national and international communities for investigating global climate change and for future climate projections. Simulations of the 20th century climate using BCC-CSM1.1 and BCC-CSM1.1(m) are presented and validated, with particular focus on the spatial pattern and seasonal evolution of precipitation and surface air temperature on global and continental scales. Simulations of climate during the last millennium and projections of climate change during the next century are also presented and discussed. Both BCC-CSM1.1 and BCC-CSM1.1(m) perform well when compared with other CMIP5 models. Preliminary analyses indicate that the higher resolution in BCC-CSM1.1(m) improves the simulation of mean climate relative to BCC-CSM1.1, particularly on regional scales.
Dunlea, Edward; Elfring, Chris
Climate models are the foundation for understanding and projecting climate and climate-related changes and are thus critical tools for supporting climate-related decision making. This study developed a holistic strategy for improving the nation's capability to accurately simulate climate and related Earth system changes on decadal to centennial timescales. The committee's report is a high level analysis, providing a strategic framework to guide progress in the nation's climate modeling enterprise over the next 10-20 years. This study was supported by DOE, NSF, NASA, NOAA, and the intelligence community.
of the ACIA (Arctic Climate Impact Assessment) because they account for many key processes in the Arctic and provide a reasonable fit to historical data. A permafrost model of intermediate complexity was validated against data from the Circumpolar Active Layer Monitoring (CALM) program and used to calculate active layer thickness (ALT), temperature, and the distribution of permafrost under GCM forcing. Results were used to construct predictive hemispheric-scale maps of ALT and a "permafrost hazards index" characterizing threats to constructions built on frozen ground for the 11-year time slices centered on 2030, 2050, and 2080. The major conclusions of this study are the following. Reduction of the total (continuous) permafrost area in the northern hemisphere by 2030, 2050, and 2080 is likely to be 10%-18% (15%-25%); 15%-30% (20%-40%); and 20%-35% (25%-50%), respectively. Changes in ALT are not uniform in space and time. In the next three decades they will be relatively small, typically within 10%-15%. By the middle of the century ALT may increase on average by 15%-25%, and by 50% or more in the northernmost locations. By 2080 the active layer will become markedly thicker (by 30%-50% and more) over the entire permafrost area. Deeper seasonal thawing and higher temperatures of the frozen ground will stimulate the development of destructive geocryological processes, particularly thermokarst, that may cause detrimental impacts on northern infrastructure.
Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield
As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.
Jesús M. Almendros-Jiménez
Model Driven Engineering (MDE) is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.
The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing the modeling process step by step and bringing out the need to validate every step of this process. This model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized.
Zeitlin, Wendy; Claiborne, Nancy; Lawrence, Catherine K.; Auerbach, Charles
Objective: Organizational climate has emerged as an important factor in understanding and addressing the complexities of providing services in child welfare. This research examines the psychometric properties of each of the dimensions of Parker and colleagues' Psychological Climate Survey in a sample of voluntary child welfare workers. Methods:…
MacNeice, P. J.; Takakishvili, Alexandre
This presentation is the first in a series which will provide independent validation of community models of the outer corona and inner heliosphere. In this work we establish a set of measures to be used in validating this group of models. We use these procedures to generate a comprehensive set of results from the Wang-Sheeley-Arge (WSA) model which will be used as a baseline, or reference, against which to compare all other models. We also run a test of the validation procedures by applying them to a small set of results produced by the ENLIL Magnetohydrodynamic (MHD) model. In future presentations we will validate other models currently hosted by the Community Coordinated Modeling Center (CCMC), including a comprehensive validation of the ENLIL model. The Wang-Sheeley-Arge (WSA) model is widely used to model the solar wind, and is used by a number of agencies to predict solar wind conditions at Earth as much as four days into the future. Because it is so important to both the research and space weather forecasting communities, it is essential that its performance be measured systematically and independently. In this paper we offer just such an independent and systematic validation. We report skill scores for the model's predictions of wind speed and IMF polarity for a large set of Carrington rotations. The model was run in all its routinely used configurations. It ingests line-of-sight magnetograms. For this study we generated model results for monthly magnetograms from the National Solar Observatory (SOLIS), Mount Wilson Observatory and the GONG network, spanning the Carrington rotation range from 1650 to 2068. We compare the influence of the different magnetogram sources, performance at quiet and active times, and estimate the effect of different empirical wind speed tunings. We also consider the ability of the WSA model to identify sharp transitions in wind speed from slow to fast wind. These results will serve as a baseline against which to compare future
Structural change in a two-sector model of the climate and the economy. This paper introduces issues concerning substitutability among goods in a two-sector economic growth model where emissions from fossil fuels give rise to a climate externality. Substitution is modeled using a CES production function where the intermediate inputs differ only in their technologies and in the way they are affected by the climate externality. I derive a simple formula for optimal taxes and resource allocation over time and ...
The performance assessment of a nuclear waste repository demands much more in comparison to the safety evaluation of civil constructions such as dams, or the resource evaluation of a petroleum or geothermal reservoir. It involves estimating low-probability (low-concentration) radionuclide transport extrapolated thousands of years into the future. Thus models used to make these estimates need to be carefully validated. A number of recent efforts have been devoted to the study of this problem. Some general comments on model validation were given by Tsang. The present paper discusses some issues of validation in regards to radionuclide transport.
National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of...
Lutz, Arthur F.; ter Maat, Herbert W.; Biemans, Hester; Shrestha, Arun B.; Wester, Philippus; Immerzeel, Walter W.
Climate change impact studies depend on projections of future climate provided by climate models. The number of climate models is large and increasing, yet limitations in computational capacity make it necessary to compromise on the number of climate models that can be included in a climate change impact assessment.
Sornette, D; Ide, K; Kamm, J R; Pisarenko, V; Vixie, K R
Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model/code as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. We thus replace static claims on the impossibility of validating a given model/code by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the new methodology first with the maturation of ...
National Aeronautics and Space Administration — Develop a packaged data management infrastructure for the comparison of generated climate model output to existing observational datasets that includes...
D J Ewins
In this paper, a review is presented of the various methods which are available for the purpose of performing a systematic comparison and correlation between two sets of vibration data. In the present case, the application of interest is in conducting this correlation process as a prelude to model correlation or updating activity.
Overgård, Christian Hansen; VUK, Goran
The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen […] matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade the […]
Ishizaki, Yasuhiro; Emori, Seita; Shiogama, Hideo; Takahashi, Kiyoshi; Yokohata, Tokuta; Yoshimori, Masakazu
We developed a simple climate model based on MAGICC6, and investigated the ability of the simple climate model to emulate global mean surface air temperature (SAT) changes of an atmosphere-ocean general circulation model (MIROC5) in the twenty-first century under representative concentration pathways (RCPs). Some previous research indicated that climate sensitivity, ocean vertical diffusion and forcing by anthropogenic aerosols (direct and indirect effects of sulfate aerosol, black carbon and organic carbon) are important factors in emulating global mean SAT changes of the CMIP3 atmosphere-ocean general circulation models. We therefore estimate these important parameters in the simple climate model using a Metropolis-Hastings Markov chain Monte Carlo (MCMC) approach. The values of the important parameters estimated by the MCMC are physically valid, and our simple climate model can successfully emulate global mean SAT changes of MIROC5 in the RCPs with the parameters estimated by the MCMC approach. In addition, we estimated the relative contributions of each important parameter in sensitivity experiments, in which we change the value of an important parameter from the one estimated by the MCMC to the default value of MAGICC6. As a result, we found that the estimation of climate sensitivity is the most important factor for the emulation of the AOGCM, and the estimation of ocean vertical diffusion is also an important factor. Although the estimation of anthropogenic aerosol forcing is very important for the emulation of the AOGCM in the twentieth century, its influence on the emulation of the AOGCM in the twenty-first century is very small. This is because emissions of anthropogenic aerosols are projected to decrease in the twenty-first century, and the relative contributions of anthropogenic aerosol forcing also decrease. Carbon cycle models are not incorporated into our simple climate model yet. A sophisticated carbon cycle model is required to be incorporated into
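A minimal sketch of the Metropolis-Hastings estimation step described above, using a deliberately simplified zero-dimensional emulator with a single unknown sensitivity parameter. The emulator, data, and prior below are invented for illustration and are not MAGICC6:

```python
import math
import random

random.seed(1)

def emulator(sensitivity, forcing):
    """Toy emulator: equilibrium warming = sensitivity * forcing / F_2xCO2 (3.7 W/m^2)."""
    return sensitivity * forcing / 3.7

# Synthetic "AOGCM" warming values for a set of forcings, with known truth 3.0.
forcings = [1.0, 2.0, 3.0, 3.7, 5.0]
truth = 3.0
data = [emulator(truth, f) + random.gauss(0, 0.1) for f in forcings]

def log_likelihood(sensitivity, sigma=0.1):
    """Gaussian misfit between emulator and the synthetic AOGCM output."""
    return sum(-0.5 * ((d - emulator(sensitivity, f)) / sigma) ** 2
               for d, f in zip(data, forcings))

# Metropolis-Hastings: symmetric Gaussian proposal, flat prior on (0, 10).
chain, current = [], 2.0
for _ in range(20000):
    proposal = current + random.gauss(0, 0.3)
    if 0.0 < proposal < 10.0 and \
       math.log(random.random()) < log_likelihood(proposal) - log_likelihood(current):
        current = proposal
    chain.append(current)

burned = chain[5000:]  # discard burn-in
posterior_mean = sum(burned) / len(burned)
print("posterior mean sensitivity:", round(posterior_mean, 2))
```

The same loop generalizes to the paper's three-parameter case (sensitivity, vertical diffusion, aerosol forcing) by proposing a vector update; the acceptance rule is unchanged.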
Soini, Markus; Liukkonen, Jarmo; Watt, Anthony P.; Yli-Piipari, Sami; Jaakkola, Timo
The aim of the study was to examine the construct validity and internal consistency of the Motivational Climate in Physical Education Scale (MCPES). A key element of the development process of the scale was establishing a theoretical framework that integrated the dimensions of task- and ego involving climates in conjunction with autonomy, and social relatedness supporting climates. These constructs were adopted from the self-determination and achievement goal theories. A ...
Background: This article describes the validation of an instrument to measure work group climate in public health organizations in developing countries. The instrument, the Work Group Climate Assessment Tool (WCA), was applied in Brazil, Mozambique, and Guinea to assess the intermediate outcomes of a program to develop leadership for performance improvement. Data were collected from 305 individuals in 42 work groups, who completed a self-administered questionnaire. Methods: The WCA was initially validated using Cronbach's alpha reliability coefficient and exploratory factor analysis. This article presents the results of a second validation study to refine the initial analyses to account for nested data, to provide item-level psychometrics, and to establish construct validity. Analyses included eigenvalue decomposition analysis, confirmatory factor analysis, and validity and reliability analyses. Results: This study confirmed the validity and reliability of the WCA across work groups with different demographic characteristics (gender, education, management level, and geographical location). The study showed that there is agreement between the theoretical construct of work climate and the items in the WCA tool across different populations. The WCA captures a single perception of climate rather than individual sub-scales of clarity, support, and challenge. Conclusion: The WCA is useful for comparing the climates of different work groups, tracking the changes in climate in a single work group over time, or examining differences among individuals' perceptions of their work group climate. Application of the WCA before and after a leadership development process can help work groups hold a discussion about current climate and select a target for improvement. The WCA provides work groups with a tool to take ownership of their own group climate through a process that is simple and objective and that protects individual confidentiality.
Mihailović, Dragutin T; Arsenić, Ilija
Some issues relevant to the current state of climate modeling are considered, and a detailed overview of the related literature is given. The concept of modeling climate, as a complex system, seen through Gödel's theorem and Rosen's definition of complexity and predictability, is discussed. The occurrence of chaos in computing the environmental interface temperature from the energy balance equation, given in difference form, is pointed out. A coupled system of equations often used in climate models is analyzed; it is shown that the Lyapunov exponent mostly takes positive values, allowing the presence of chaos in these systems. The horizontal energy exchange between environmental interfaces, described by the dynamics of driven coupled oscillators, is analyzed. Their behavior and synchronization when a perturbation is introduced into the system, as a function of the coupling parameters, the logistic parameter and the parameter of exchange, was studied by calculating the Lyapunov exponent…
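The Lyapunov-exponent diagnostic mentioned above can be illustrated on the logistic map (a simple stand-in for the authors' coupled environmental-interface system): the exponent is the orbit average of log |f'(x)|, negative in periodic regimes and positive in chaotic ones.

```python
import math

def lyapunov_logistic(r, x0=0.4, n_transient=1000, n_iter=10000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x)."""
    x = x0
    for _ in range(n_transient):          # discard transient behavior
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))  # log |f'(x)| along the orbit
    return total / n_iter

print(lyapunov_logistic(3.2))   # periodic regime: negative exponent
print(lyapunov_logistic(4.0))   # chaotic regime: positive exponent
```

For the coupled-oscillator system of the paper the derivative of the full map (a Jacobian) replaces the scalar f'(x), but the orbit-averaging idea is the same.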
Air-sea fluxes, open-sea deep convection and cyclogenesis are studied in the Mediterranean with the development of a regional coupled model (AORCM). It accurately simulates these processes, and their climate variabilities are quantified and studied. The regional coupling shows a significant impact on the number of winter intense cyclogenesis events as well as on associated air-sea fluxes and precipitation. A lower inter-annual variability than in non-coupled models is simulated for fluxes and deep convection. The feedbacks driving this variability are understood. The climate change response is then analysed for the 21st century with the non-coupled models: cyclogenesis decreases, associated precipitation increases in spring and autumn and decreases in summer. Moreover, a warming and salting of the Mediterranean as well as a strong weakening of its thermohaline circulation occur. This study also concludes with the necessity of using AORCMs to assess climate change impacts on the Mediterranean.
BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information.
Background: Very frequently the same biological system is described by several, sometimes competing mathematical models. This usually creates confusion around their validity, i.e., which one is correct. However, this is unnecessary, since the validity of a model cannot be established; model validation is actually a misnomer. In principle the only statement that one can make about a system model is that it is incorrect, i.e., invalid, a fact which can be established given appropriate experimental data. Nonlinear models of high dimension and with many parameters are impossible to invalidate through simulation, and as such the invalidation process is often overlooked or ignored. Results: We develop different approaches for showing how competing ordinary differential equation (ODE) based models of the same biological phenomenon containing nonlinearities and parametric uncertainty can be invalidated using experimental data. We first emphasize the strong interplay between system identification and model invalidation and we describe a method for obtaining a lower bound on the error between candidate model predictions and data. We then turn to model invalidation and formulate a methodology for discrete-time and continuous-time model invalidation. The methodology is algorithmic and uses Semidefinite Programming as the computational tool. It is emphasized that trying to invalidate complex nonlinear models through exhaustive simulation is not only computationally intractable but also inconclusive. Conclusion: Biological models derived from experimental data can never be validated. In fact, in order to understand biological function one should try to invalidate models that are incompatible with available data. This work describes a framework for invalidating both continuous and discrete-time ODE models based on convex optimization techniques. The methodology does not require any simulation of the candidate models; the algorithms presented in this paper have a
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.
Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
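Of the diagnostics listed, the time-lagged correlation map is the simplest to sketch. The example below is a generic illustration with synthetic series (not CMDA code): one series is a lagged, noisy copy of the other, and the lag that maximizes the Pearson correlation recovers the imposed lead-lag.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic monthly series: y lags x by 3 steps, plus noise, standing in
# for two climate variables with a lead-lag relationship.
n, true_lag = 240, 3
x = rng.standard_normal(n)
y = np.concatenate([rng.standard_normal(true_lag), x[:-true_lag]]) \
    + 0.2 * rng.standard_normal(n)

def lagged_correlation(a, b, lag):
    """Pearson correlation of a(t) with b(t + lag), for lag >= 0."""
    if lag == 0:
        return float(np.corrcoef(a, b)[0, 1])
    return float(np.corrcoef(a[:-lag], b[lag:])[0, 1])

corrs = {lag: lagged_correlation(x, y, lag) for lag in range(12)}
best = max(corrs, key=corrs.get)
print("lag with maximum correlation:", best)
```

In a full correlation map the same scan is repeated per grid cell, producing a spatial field of best lags and correlations.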
Knorek, John Kenneth
The concept "academic culture" has been used as a framework to understand faculty work in higher education. Academic culture research builds on organizational psychology concepts of culture and climate to better understand employee practices and work phenomena. Ample research has investigated faculty teaching at the disciplinary and…
Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.
Hemez, Francois [Los Alamos National Laboratory; Farrar, Charles [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Nishio, Mayuko [UNIV OF TOKYO; Worden, Keith [UNIV OF SHEFFIELD; Takeda, Nobuo [UNIV OF TOKYO
This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing those response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. Issues that must be considered were sensitivity, dimensionality, type of response, and presence or absence of measurement noise in the response. Furthermore, we illustrate a comparison method of multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can be used as an effective and quantifiable technique for selecting appropriate model parameters. However, in this process, one must not only consider the sensitivity of the features being used, but also the correlation of the parameters being compared.
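The Mahalanobis-distance outlier screening described above can be sketched as follows; the feature vectors below are synthetic stand-ins for features extracted from repeated experiments and a candidate simulation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Reference feature vectors (e.g., [peak frequency, RMS amplitude, damping proxy])
# extracted from repeated experiments; the distribution here is synthetic.
reference = rng.multivariate_normal(mean=[5.0, 1.0, 0.02],
                                    cov=np.diag([0.25, 0.04, 1e-5]),
                                    size=200)

mu = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def mahalanobis(x):
    """Distance of feature vector x from the reference cloud, in 'sigmas'."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

consistent = np.array([5.1, 0.95, 0.021])   # near the reference cloud
outlier = np.array([7.0, 2.0, 0.05])        # far from it

print("consistent:", round(mahalanobis(consistent), 2))
print("outlier:   ", round(mahalanobis(outlier), 2))
```

Unlike the Euclidean distance, this metric accounts for the scale and correlation of the features, which is why it works for feature vectors whose components have very different units.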
Cronin, T. M.; Walker, H. A.
Many coastal ecosystems are severely degraded due to a variety of human factors, requiring large and expensive monitoring and modeling efforts for restoration and management. Climate variability, including abrupt climate change, is seldom factored into coastal ecosystem management despite growing evidence for climate forcing of precipitation, river discharge, water quality, salinity, turbidity, faunal and phytoplankton dynamics, dissolved oxygen, and other ecosystem processes. We will review evidence from long-term monitoring records, multi-proxy paleoclimatic and paleoecological records, and climatic modeling that suggests that the effects of climate can override local and regional human activities and may potentially diminish the success of restoration efforts. Because ecosystem restoration often involves long-term objectives requiring decades to achieve, our focus will be on examples from sub-tropical and temperate estuaries in North America that show ecosystem response over decadal timescales to variability related to El Niño-Southern Oscillation, the Pacific Decadal Oscillation and the North Atlantic Oscillation. Climatic variability evident from paleo-records of the past few centuries exceeds that recorded in most 20th century monitoring records. This raises issues about the efficacy of local and regional ecosystem and hydrodynamic models designed to simulate ecosystem response to anthropogenic changes in sediment and nutrient input, fresh-water discharge, and land-use because such models, though tested with rigorous validation procedures, use calibration data sets limited to a few years. Thus, they might not be appropriate for simulating response to climatic extremes on the scale and duration of past events outside their calibration range. Understanding the complexities of ecosystem response to climatic forcing, especially in the context of local and regional ecosystem disturbance, raises formidable challenges, but attempts to integrate climate
Sperber, K; Gleckler, P; Covey, C; Taylor, K; Bader, D; Phillips, T; Fiorino, M; Achutarao, K
In 2002, the Program for Climate Model Diagnosis and Intercomparison (PCMDI) proposed the concept for a state-of-the-science appraisal of climate models to be performed approximately every two years. Motivation for this idea arose from the perceived needs of the international modeling groups and the broader climate research community to document progress more frequently than provided by the Intergovernmental Panel on Climate Change (IPCC) Assessment Reports. A committee of external reviewers, which included senior researchers from four leading international modeling centers, supported the concept by stating in its review: ''The panel enthusiastically endorses the suggestion that PCMDI develop an independent appraisal of coupled model performance every 2-3 years. This would provide a useful 'mid-course' evaluation of modeling progress in the context of larger IPCC and national assessment activities, and should include both coupled and single-component model evaluations.''
Rosa Arboretti Giancristofaro
In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.
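The quantitative performance measures this abstract refers to can be made concrete with a minimal sketch: a one-variable logistic model scored by sensitivity and specificity. The model weights (w=2, b=-1) and the toy data below are illustrative assumptions, not values from the study.

```python
import math

def predict(w, b, x):
    # One-variable logistic model: p = 1 / (1 + exp(-(w*x + b))).
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def sensitivity_specificity(y_true, y_prob, threshold=0.5):
    # Sensitivity = fraction of positives (1s) correctly classified;
    # specificity = fraction of negatives (0s) correctly classified.
    tp = fp = tn = fn = 0
    for y, p in zip(y_true, y_prob):
        pred = 1 if p >= threshold else 0
        if y == 1:
            tp, fn = tp + (pred == 1), fn + (pred == 0)
        else:
            tn, fp = tn + (pred == 0), fp + (pred == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hand-set model on six toy points; values are illustrative only.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0, 3.0]
ys = [0, 0, 0, 1, 1, 1]
sens, spec = sensitivity_specificity(ys, [predict(2.0, -1.0, x) for x in xs])
```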
In order to evaluate the spatial and seasonal variations of paleoclimate in the southwest US, a local climate model (LCM) is developed that computes modern and 18,000 yr B.P. (18 ka) monthly temperature and precipitation from a set of independent variables. Independent variables include: terrain elevation, insolation, CO2 concentration, January and July winds, and January and July sea-surface temperatures. Solutions are the product of a canonical regression function which is calibrated using climate data from 641 stations from AZ, CA, CO, NM, NV, UT in the National Weather Service Cooperative observer network. Validation of the LCM, using climate data at 98 climate stations from the period 1980--1984, indicates no significant departures of LCM solutions from climate data. LCM solutions of modern and 18 ka climate are computed at a 15 km spacing over a rectangular domain extending 810 km east, 360 km west, 225 km north and 330 km south of the approximate location of Yucca Mt., NV. Solutions indicate mean annual temperature was 5 degrees C cooler at 18 ka and mean annual precipitation increased 68%. The annual cycle of temperature and precipitation at 18 ka was amplified with summers about 1 degrees C cooler and 71% drier, and winters about 11 degrees C colder and 35% wetter than the modern. Model results compare quite reasonably with proxy paleoclimate estimates from glacial deposits, pluvial lake deposits, pollen records, ostracode records and packrat midden records from the southwest US. However, bias (+5 degrees C to +10 degrees C) is indicated for LCM solutions of summer temperatures at 18 ka
Rodriguez, Gil R. Jr.; Kunkel, David E.
This research demonstrates the need and the procedure for testing sector programming models. It compares the model estimates of endogenous variables to carefully selected base period parameters. It uses an operational, static, deterministic, and highly aggregated programming model of Philippine agriculture as the framework. Alternative formulations of the Philippine model are also examined for possible errors in the consumption, production, and objective function data sets
Zhang, Tianyi; Li, Tao; Yang, Xiaoguang; Simelton, Elisabeth
Model projections of climate-induced crop yields are constrained by the accuracy of the phenology simulation in crop models. Here, we use phenology observations from 775 trials with 19 rice cultivars in 5 Asian countries to compare the performance of four rice phenology models (growing-degree-day (GDD), exponential, beta and bilinear models) when applied to warmer climates. For a given cultivar, the difference in growing season temperature (GST) varied between 2.2 and 8.2 °C in different trials, which allowed us to calibrate the models for lower GST and validate under higher GST, with three calibration experiments. The results show that in warmer climates the bilinear and beta phenology models resulted in gradually increasing bias in phenology prediction, and double the yield bias per percent increase in phenology simulation bias, while the GDD and exponential models maintained a comparatively constant bias. The phenology biases were primarily attributable to the models' differing phenological responses to temperature, rather than to the size of the calibration dataset. Additionally, results suggest that model simulations based on multiple cultivars provide better predictability than using one cultivar. Therefore, to accurately capture climate change impacts on rice phenology, we recommend simulations based on multiple cultivars using the GDD and exponential phenology models.
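A growing-degree-day (GDD) phenology model of the kind compared here can be sketched as simple thermal-time accumulation; the base temperature (8 °C) and thermal requirement (600 GDD) below are hypothetical, not calibrated cultivar values from the trials.

```python
def growing_degree_days(daily_mean_temps, t_base=8.0):
    # Thermal time: sum of daily mean temperature excess over a base.
    return sum(max(0.0, t - t_base) for t in daily_mean_temps)

def days_to_stage(daily_mean_temps, gdd_required, t_base=8.0):
    # GDD phenology model: the stage is reached on the first day the
    # accumulated thermal time meets the cultivar's requirement.
    total = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        total += max(0.0, t - t_base)
        if total >= gdd_required:
            return day
    return None  # requirement not met within the record

# Hypothetical cultivar needing 600 GDD above an 8 degree C base: a 5 degree
# warmer season shortens the predicted duration, as a GDD model should.
d_warm = days_to_stage([28.0] * 100, 600.0)   # 20 GDD/day
d_cool = days_to_stage([23.0] * 100, 600.0)   # 15 GDD/day
```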
Petoukhov, V.; Ganopolski, A.; Brovkin, V.; Claussen, M.; Eliseev, A.; Kubatzki, C.; Rahmstorf, S.
A 2.5-dimensional climate system model of intermediate complexity, CLIMBER-2, and its performance for present climate conditions are presented. The model consists of modules describing atmosphere, ocean, sea ice, land surface processes, terrestrial vegetation cover, and the global carbon cycle. The modules interact (on-line) through the fluxes of momentum, energy, water and carbon. The model has a coarse spatial resolution, which nevertheless captures the major features of the Earth's geography. The model describes temporal variability of the system on seasonal and longer time scales. Because the model does not employ any type of flux adjustment and has a fast turnaround time, it can be used to study climates significantly different from the present one and allows long-term (multimillennia) simulations to be performed. The constraints for coupling the atmosphere and ocean without flux adjustment are discussed. The results of a model validation against present climate data show that the model successfully describes the seasonal variability of a large set of characteristics of the climate system, including radiative balance, temperature, precipitation, ocean circulation and cryosphere. (orig.) 62 refs.
Gompel, P.H.C. van; Koornneef, G.P.
Two major trends can be identified for powertrain control in the next decade. The legislation will more and more focus on in-use emissions. Together with the global trend to reduce CO2 emissions, this will lead to an integral drive train approach. To develop and validate this integral drive tra
Wu, Jing; Tang, Lichun; Mohamed, Rayman; Zhu, Qianting; Wang, Zheng
Climate financing is a key issue in current negotiations on climate protection. This study establishes a climate financing model based on a mechanism in which donor countries set up funds for climate financing and recipient countries use the funds exclusively for carbon emission reduction. The burden-sharing principles are based on GDP, historical emissions, and consumption-based emissions. Using this model, we develop and analyze a series of scenario simulations, including a financing program negotiated at the Cancun Climate Change Conference (2010) and several subsequent programs. Results show that sustained climate financing can help to combat global climate change. However, the Cancun Agreements are projected to result in a reduction of only 0.01°C in global warming by 2100 compared to the scenario without climate financing. Longer-term climate financing programs should be established to achieve more significant benefits. Our model and simulations also show that climate financing has economic benefits for developing countries. Developed countries will suffer a slight GDP loss in the early stages of climate financing, but the long-term economic growth and the eventual benefits of climate mitigation will compensate for this slight loss. Different burden-sharing principles have very similar effects on global temperature change and economic growth of recipient countries, but they do result in differences in GDP changes for Japan and the FSU. The GDP-based principle results in a larger share of financial burden for Japan, while the historical emissions-based principle results in a larger share of financial burden for the FSU. A larger burden share leads to a greater GDP loss.
Clark, James S.
Understanding how the biosphere may respond to increasing trace gas concentrations in the atmosphere requires models that contain vegetation responses to regional climate. Most of the processes ecologists study in forests, including trophic interactions, nutrient cycling, and disturbance regimes, and vital components of the world economy, such as forest products and agriculture, will be influenced in potentially unexpected ways by changing climate. These vegetation changes affect climate in the following ways: changing C, N, and S pools; trace gases; albedo; and water balance. The complexity of the indirect interactions among variables that depend on climate, together with the range of different space/time scales that best describe these processes, make the problems of modeling and prediction enormously difficult. These problems of predicting vegetation response to climate warming and potential ways of testing model predictions are the subjects of this chapter.
C. Z. Li
This paper investigates issues involved in calibrating hydrological models against observed data when the aim of the modelling is to predict future runoff under different climatic conditions. To achieve this objective, we tested two hydrological models, DWBM and SIMHYD, using data from 30 unimpaired catchments in Australia which had at least 60 yr of daily precipitation, potential evapotranspiration (PET), and streamflow data. Nash-Sutcliffe efficiency (NSE), modified index of agreement (d1) and water balance error (WBE) were used as performance criteria. We used a differential split-sample test to split up the data into 120 sub-periods and 4 different climatic sub-periods in order to assess how well the calibrated model could be transferred to different periods. For each catchment, the models were calibrated for one sub-period and validated on the other three. Monte Carlo simulation was used to explore parameter stability compared to historic climatic variability. The chi-square test was used to measure the relationship between the distribution of the parameters and hydroclimatic variability. The results showed that the performance of the two hydrological models differed and depended on the model calibration. We found that if a hydrological model is set up to simulate runoff for a wet climate scenario then it should be calibrated on a wet segment of the historic record, and similarly a dry segment should be used for a dry climate scenario. The Monte Carlo simulation provides an effective and pragmatic approach to explore uncertainty and equifinality in hydrological model parameters. Some parameters of the hydrological models are shown to be significantly more sensitive to the choice of calibration periods. Our findings support the idea that when using conceptual hydrological models to assess future climate change impacts, a differential split-sample test and Monte Carlo simulation should be used to quantify uncertainties due to
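The Nash-Sutcliffe efficiency and the differential split-sample idea used above can be sketched in a few lines; the five-point runoff series is a toy illustration, not data from the 30 catchments.

```python
def nse(observed, simulated):
    # Nash-Sutcliffe efficiency: 1 minus the ratio of squared model error
    # to the variance of the observations (1 = perfect, 0 = mean predictor).
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var

def differential_split_sample(record, wet_flags):
    # Split a record into wet and dry sub-periods by a climate flag, so a
    # model can be calibrated on one climate and validated on the other.
    wet = [r for r, w in zip(record, wet_flags) if w]
    dry = [r for r, w in zip(record, wet_flags) if not w]
    return wet, dry

obs = [1.0, 2.0, 3.0, 4.0, 5.0]
nse_perfect = nse(obs, list(obs))   # perfect simulation
nse_mean = nse(obs, [3.0] * 5)      # no better than predicting the mean
```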
O. Morgenstern; M. A. Giorgetta; Shibata, K; Eyring, V.; D. W. Waugh; T. G. Shepherd; H. Akiyoshi; Austin, J; Baumgaertner, A.J.G.; Bekki, S.; P. Braesicke; Brühl, C.; Chipperfield, M. P.; Cugnet, D.; M. Dameris
The goal of the Chemistry-Climate Model Validation (CCMVal) activity is to improve understanding of chemistry-climate models (CCMs) through process-oriented evaluation and to provide reliable projections of stratospheric ozone and its impact on climate. An appreciation of the details of model formulations is essential for understanding how models respond to the changing external forcings of greenhouse gases and ozone-depleting substances, and hence for understanding the ozone and climate fore...
Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet
Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions of climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e., they exchange information on a time-step basis). Thus, changes in meteorology and CO2 concentrations affect crop growth, and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. Computations were performed for regionalized 30-year climate change
Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model
Williamson, J. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States); Puttagunta, S. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)
This report details the validation methods used to analyze consumption at each of these homes. It includes a detailed end-use examination of consumption in the following categories: 1) Heating, 2) Cooling, 3) Lights, Appliances, and Miscellaneous Electric Loads (LAMELs) along with Domestic Hot Water Use, 4) Ventilation, and 5) PV generation. A utility bill disaggregation method, which allows a crude estimation of space conditioning loads based on outdoor air temperature, was also performed and the results compared to the actual measured data.
Penix, John; Pecheur, Charles; Havelund, Klaus
This report describes an investigation into using model checking to assist validation of domain models for the HSTS planner. The planner models are specified using a qualitative temporal interval logic with quantitative duration constraints. We conducted several experiments to translate the domain modeling language into the SMV, Spin and Murphi model checkers. This allowed a direct comparison of how the different systems would support specific types of validation tasks. The preliminary results indicate that model checking is useful for finding faults in models that may not be easily identified by generating test plans.
Tobis, M.; Jackson, C. S.
Meteorological, oceanographic, and climatological applications have been at the forefront of scientific computing since its inception. The trend toward ever larger and more capable computing installations is unabated. However, much of the increase in capacity is accompanied by an increase in parallelism and a concomitant increase in complexity. An increase of at least four additional orders of magnitude in the computational power of scientific platforms is anticipated. It is unclear how individual climate simulations can continue to make effective use of the largest platforms. Conversion of existing community codes to higher resolution, or to more complex phenomenology, or both, presents daunting design and validation challenges. Our alternative approach is to use the expected resources to run very large ensembles of simulations of modest size, rather than to await the emergence of very large simulations. We are already doing this in exploring the parameter space of existing models using the Multiple Very Fast Simulated Annealing algorithm, which was developed for seismic imaging. Our experiments have the dual intentions of tuning the model and identifying ranges of parameter uncertainty. Our approach is less strongly constrained by the dimensionality of the parameter space than are competing methods. Nevertheless, scaling up remains costly. Much could be achieved by increasing the dimensionality of the search and adding complexity to the search algorithms. Such ensemble approaches scale naturally to very large platforms. Extensions of the approach are anticipated. For example, structurally different models can be tuned to comparable effectiveness. This can provide an objective test for which there is no realistic precedent with smaller computations. We find ourselves inventing new code to
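A plain simulated-annealing parameter search (a simpler relative of the Multiple Very Fast Simulated Annealing algorithm named here, not that algorithm itself) can be sketched as below; the quadratic misfit surface and all tuning constants are purely illustrative.

```python
import math
import random

def anneal(cost, x0, steps=2000, t0=1.0, seed=0):
    # Plain simulated annealing: Gaussian perturbations, Metropolis
    # acceptance of uphill moves, geometric cooling schedule.
    rng = random.Random(seed)
    x, c = list(x0), cost(x0)
    best, c_best = list(x), c
    for k in range(steps):
        t = t0 * (0.995 ** k)
        cand = [xi + rng.gauss(0.0, 0.1) for xi in x]
        c_cand = cost(cand)
        # Always accept improvements; accept worsenings with probability
        # exp(-delta / t), which shrinks as the temperature cools.
        if c_cand < c or rng.random() < math.exp(-(c_cand - c) / max(t, 1e-12)):
            x, c = cand, c_cand
        if c < c_best:
            best, c_best = list(x), c
    return best, c_best

# Toy two-parameter misfit surface with its minimum at (1, -2).
misfit = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
best, c_best = anneal(misfit, [0.0, 0.0])
```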
This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs., 30 refs
This work aims to evaluate the performance of a hydrological balance model in a watershed located in northern Tunisia (wadi Sejnane, 378 km2) in present climate conditions using input variables provided by four regional climate models. A modified version (MBBH) of the lumped and single layer surface model BBH (Bucket with Bottom Hole model), in which pedo-transfer parameters estimated using watershed physiographic characteristics are introduced, is adopted to simulate the water balance components. Only two parameters, representing respectively the water retention capacity of the soil and the vegetation resistance to evapotranspiration, are calibrated using rainfall-runoff data. The evaluation criteria for the MBBH model calibration are: relative bias, mean square error and the ratio of mean actual evapotranspiration to mean potential evapotranspiration. Daily air temperature, rainfall and runoff observations are available from 1960 to 1984. The period 1960–1971 is selected for calibration while the period 1972–1984 is chosen for validation. Air temperature and precipitation series are provided by four regional climate models (DMI, ARP, SMH and ICT) from the European program ENSEMBLES, forced by two global climate models (GCMs): ECHAM and ARPEGE. The regional climate model outputs (precipitation and air temperature) are compared to the observations in terms of statistical distribution. The analysis was performed at the seasonal scale for precipitation. We found out that RCM precipitation must be corrected before being introduced as MBBH inputs. Thus, a non-parametric quantile-quantile bias correction method together with a dry day correction is employed. Finally, simulated runoff generated using corrected precipitation from the regional climate model SMH is found to be the most acceptable, by comparison with runoff simulated using observed precipitation data, at reproducing the temporal variability of mean monthly runoff. The SMH model is the most accurate to
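A non-parametric quantile-quantile bias correction of the kind described can be sketched by empirical quantile mapping: look up a raw model value's rank in the sorted model climatology and read off the observed value at the same rank. The five-point climatologies below are toy numbers, not the ENSEMBLES data.

```python
import bisect

def quantile_map(raw, model_sorted, obs_sorted):
    # Empirical quantile mapping: find raw's empirical quantile in the
    # sorted model climatology, return the observation at that quantile.
    # Both climatologies must be sorted and of equal length.
    rank = bisect.bisect_left(model_sorted, raw)
    rank = min(rank, len(obs_sorted) - 1)   # clamp values above the record
    return obs_sorted[rank]

# Toy climatologies in which the model under-predicts by a factor of two.
model_q = [0.0, 1.0, 2.0, 3.0, 4.0]
obs_q = [0.0, 2.0, 4.0, 6.0, 8.0]
corrected = quantile_map(2.0, model_q, obs_q)
```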
This paper contains a discussion of the work being performed in the UK to validate the CHYMES coarse mixing model. Attention is focussed on the MIXA experiments performed at Winfrith Technology Centre in which 3 kg of molten fuel simulant were released into water. The validation of CHYMES against one of these experiments (MIXA06) is discussed in detail. It is concluded that CHYMES can reproduce some features of the experiment (such as the existence of a steam chimney around the mixture and the steam production rate within a factor of two) but it does not predict the observed mixture development (the radial spreading and the deceleration of the first melt arriving at the surface) well. Additional model development and experimental analysis underway to resolve these differences is discussed
A climatic model based upon analytical expressions is presented. This model is capable of making long-range predictions of heat energy variations on regional or global scales. These variations can then be transformed into corresponding variations of some other key climatic parameters since weather and climatic changes are basically driven by differential heating and cooling around the earth. On the basis of the mathematical expressions upon which the model is based, it is shown that the global heat energy structure (and hence the associated climatic system) are characterized by zonally as well as latitudinally propagating fluctuations at frequencies downward of 0.5 day-1. We have calculated the propagation speeds for those particular frequencies that are well documented in the literature. The calculated speeds are in excellent agreement with the measured speeds. (author). 13 refs
Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)
A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for an extended period of time. This report describes the HCM model, its functional requirements, the model structure and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and that it is implemented as intended. The cavern BH101 long term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
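The core of a hydrostatic column model is a sum of fluid-column weights between cavern and wellhead; a minimal sketch follows. The densities, heights, and cavern pressure below are illustrative assumptions, not Big Hill values.

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(cavern_pressure_pa, columns):
    # Hydrostatic column: wellhead pressure equals cavern pressure minus
    # the weight of each fluid column (density kg/m^3, height m) above it.
    p = cavern_pressure_pa
    for density, height in columns:
        p -= density * G * height
    return p

# Illustrative column: 600 m of pressurized nitrogen over 400 m of crude
# oil; densities, heights, and cavern pressure are assumptions, not SPR data.
columns = [(200.0, 600.0), (850.0, 400.0)]
p_head = wellhead_pressure(10_000_000.0, columns)   # Pa
```

A small leak would show up as p_head drifting below the value this static balance predicts.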
Coram, Geoffrey J.
Noise has been a concern from the very beginning of signal processing and electrical engineering in general, although it was perhaps of less interest until vacuum-tube amplifiers made it audible just after 1900. Rigorous noise models for linear resistors were developed in 1927 by Nyquist and Johnson [1, 2]. However, the intervening years have not brought similarly well-established models for noise in nonlinear devices. This thesis proposes using thermodynamic principles to determine whether a given nonlinear device noise model is physically valid. These tests are applied to several models. One conclusion is that the standard Gaussian noise models for nonlinear devices predict thermodynamically impossible circuit behavior: these models should be abandoned. But the nonlinear shot-noise model predicts thermodynamically acceptable behavior under a constraint derived here. This thesis shows how the thermodynamic requirements can be reduced to concise mathematical tests, involving no approximations, for the Gaussian and shot-noise models. When the above-mentioned constraint is satisfied, the nonlinear shot-noise model specifies the current noise amplitude at each operating point from knowledge of the device v-i curve alone. This relation between the dissipative behavior and the noise fluctuations is called, naturally enough, a fluctuation-dissipation relation. This thesis further investigates such FDRs, including one for linear resistors in nonlinear circuits that was previously unexplored. The aim of this thesis is to provide thermodynamically solid foundations for noise models. It is hoped that hypothesized noise models developed to match experiment will be validated against the concise mathematical tests of this thesis. Finding a correct noise model will help circuit designers and physicists understand the actual processes causing the noise, and perhaps help them minimize the noise or its effect in the circuit. (Copies available exclusively from MIT Libraries, Rm
Liu, X.; Tang, Q.
The impacts of climate change on renewable water resources are usually assessed using hydrological models driven by downscaled climate outputs from global climate models. Most hydrological models do not have explicit parameterization of vegetation and thus are unable to assess the effects of elevated atmospheric CO2 on stomatal conductance and water loss of leaf. The response of vegetation to elevated atmospheric CO2 would reduce evaporation and affect runoff and renewable water resources. To date, the impacts of elevated CO2 on vegetation transpiration have not been well addressed in assessments of water resources under climate change. In this study, the distributed biosphere-hydrological (DBH) model, which incorporates a simple biosphere model into a distributed hydrological scheme, was used to assess the impacts of elevated CO2 on vegetation transpiration and consequent runoff. The DBH model was driven by five General Circulation Models (GCMs) under four Representative Concentration Pathways (RCPs). For each climate scenario, two model experiments were conducted. The atmospheric CO2 concentration in one experiment was assumed to remain at the level of 2000 and increased as described by the RCPs in the other experiment. The results showed that elevated CO2 would decrease evapotranspiration, increase runoff, and have considerable impacts on water resources. However, the CO2-induced runoff change is generally small in dry areas, likely because vegetation is usually sparse in arid areas.
Huth, Radan; Metelka, L.; Kliegerová, S.; Sedlák, Pavel; Kyselý, Jan; Mládek, R.; Halenka, T.; Kalvová, J.
Bratislava: Geophysical Institute of SAS, Slovak Hydrometeorological Institute, Slovak Mining Society, Slovak Meteorological Society, 2001 - (Matejka, F.; Ostrožlík, M.), s. - ISBN 80-85754-10-X. [150 years of the meteorological service in central Europe. Stará Lesná (SK), 09.10.2001-11.10.2001] R&D Projects: GA ČR GA205/01/0804 Institutional research plan: CEZ:AV0Z3042911 Keywords : Regional Climate Model * validation * Central Europe Subject RIV: DG - Athmosphere Sciences, Meteorology
In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)
Collins, W D; Blackmon, M; Bitz, C; Bonan, G; Bretherton, C S; Carton, J A; Chang, P; Doney, S; Hack, J J; Kiehl, J T; Henderson, T; Large, W G; McKenna, D; Santer, B D; Smith, R D
A new version of the Community Climate System Model (CCSM) has been developed and released to the climate community. CCSM3 is a coupled climate model with components representing the atmosphere, ocean, sea ice, and land surface connected by a flux coupler. CCSM3 is designed to produce realistic simulations over a wide range of spatial resolutions, enabling inexpensive simulations lasting several millennia or detailed studies of continental-scale climate change. This paper will show results from the configuration used for climate-change simulations with a T85 grid for atmosphere and land and a 1-degree grid for ocean and sea-ice. The new system incorporates several significant improvements in the scientific formulation. The enhancements in the model physics are designed to reduce or eliminate several systematic biases in the mean climate produced by previous editions of CCSM. These include new treatments of cloud processes, aerosol radiative forcing, land-atmosphere fluxes, ocean mixed-layer processes, and sea-ice dynamics. There are significant improvements in the sea-ice thickness, polar radiation budgets, equatorial sea-surface temperatures, ocean currents, cloud radiative effects, and ENSO teleconnections. CCSM3 can produce stable climate simulations of millennial duration without ad hoc adjustments to the fluxes exchanged among the component models. Nonetheless, there are still systematic biases in the ocean-atmosphere fluxes in western coastal regions, the spectrum of ENSO variability, the spatial distribution of precipitation in the Pacific and Indian Oceans, and the continental precipitation and surface air temperatures. We conclude with the prospects for extending CCSM to a more comprehensive model of the Earth's climate system.
Osinga, T. [ETH-Zuerich (Switzerland)]; Olalde, G. [CNRS Odeillo (France)]; Steinfeld, A. [PSI and ETHZ (Switzerland)]
A numerical model is formulated for the SOLZINC solar chemical reactor for the production of Zn by carbothermal reduction of ZnO. The model involves solving, by the finite-volume technique, a 1D unsteady state energy equation that couples heat transfer to the chemical kinetics for a shrinking packed bed exposed to thermal radiation. Validation is accomplished by comparison with experimentally measured temperature profiles and Zn production rates as a function of time, obtained for a 5-kW solar reactor tested at PSI's solar furnace. (author)
Mladinich, C. S.; Brunner, N. M.; Beal, Y. G.
The U.S. Geological Survey (USGS) is generating a suite of Essential Climate Variables (ECVs), as defined by the Global Climate Observing System program, from the Landsat data archive. The Landsat archive will provide high spatial resolution (30 m) and long-term (1972 to present) global land products, meeting the needs of climate and ecological studies at global, national, and regional scales. Validation protocols for these products are being established, paralleling the Committee on Earth Observing Satellites (CEOS) Calibration/Validation Working Groups' best practice guidelines, but also being modified to account for the unique characteristics of the Landsat data. The USGS validation plan is unique in that it incorporates protocols that span not only the breadth of ecoregions but the timespan of the ECV products and Landsat satellite sensors (MSS, TM, ETM+, and OLI). To achieve these goals, the incorporation of existing databases is essential. Protocols are being developed to perform a CEOS Working Group on Calibration/Validation Stage 2 validation, with plans to perform a full Stage 4 validation, ensuring the spatial and temporal consistency of the ECV products. A Stage 2 validation reports product accuracies over a large number of locations and time periods by comparison with in situ or other suitable reference data. The Stage 3 validation reports product uncertainties in a statistically robust way over multiple locations and time periods representing global conditions. Validation at this stage reports on the accuracies and confidence of products for the user communities as well as to the algorithm developers. The Stage 4 validation calls for continual assessments as new product versions of the algorithms are released. This presentation will report on the validation protocols used for the Burned Area ECV product. The burned area ECV product is unique from other ECV products such as land cover or LAI because of the transitory nature of fires. In the United
The present study analyzed the psychometric properties and the validity of the Spanish version of the Team Climate Inventory (TCI). The TCI is a measure of climate for innovation within groups at work and is based on the four-factor theory of climate for innovation (West, 1990). Cronbach's alpha and omega indexes revealed satisfactory reliabilities, and exploratory factor analysis extracted the four original factors plus the fifth factor reported in other studies. Confirmatory factor analysis confirmed that the five-factor solution presented the best fit to our data. Two samples (Spanish health care teams and Latin American software development teams), totaling 1099 participants, were compared, showing metric measurement invariance. Evidence for validity based on the prediction of team performance and team satisfaction is offered.
Rackow, T.; Goessling, H. F.; Jung, T.; Sidorenko, D.; Semmler, T.; Barbi, D.; Handorf, D.
This study forms part II of two papers describing ECHAM6-FESOM, a newly established global climate model with a unique multi-resolution sea ice-ocean component. While part I deals with the model description and the mean climate state, here we examine the internal climate variability of the model under constant present-day (1990) conditions. We (1) assess the internal variations in the model in terms of objective variability performance indices, (2) analyze variations in global mean surface temperature and put them in context to variations in the observed record, with particular emphasis on the recent warming slowdown, (3) analyze and validate the most common atmospheric and oceanic variability patterns, (4) diagnose the potential predictability of various climate indices, and (5) put the multi-resolution approach to the test by comparing two setups that differ only in oceanic resolution in the equatorial belt, where one ocean mesh keeps the coarse ~1° resolution applied in the adjacent open-ocean regions and the other mesh is gradually refined to ~0.25°. Objective variability performance indices show that, in the considered setups, ECHAM6-FESOM performs overall favourably compared to five well-established climate models. Internal variations of the global mean surface temperature in the model are consistent with observed fluctuations and suggest that the recent warming slowdown can be explained as a once-in-one-hundred-years event caused by internal climate variability; periods of strong cooling in the model (`hiatus' analogs) are mainly associated with ENSO-related variability and, to a lesser degree, with PDO shifts, with the AMO playing a minor role. Common atmospheric and oceanic variability patterns are simulated largely consistent with their real counterparts. Typical deficits also found in other models at similar resolutions remain, in particular too weak non-seasonal variability of SSTs over large parts of the ocean and episodic periods of almost absent
Nguyen, Viet Hung
Having a precise vulnerability discovery model (VDM) would provide a useful quantitative insight to assess software security. Thus far, several models have been proposed with some evidence supporting their goodness-of-fit. In this work we describe an independent validation of the applicability of six existing VDMs in seventeen releases of the three popular browsers Firefox, Google Chrome and Internet Explorer. We have collected five different kinds of data sets based on different definitions of a vulnerability. We introduce two quantitative metrics, goodness-of-fit entropy and goodness-of-fit quality, to analyze the impact of vulnerability data sets on the stability and quality of VDMs over the software life cycle. The experimental results show that the "confirmed-by-vendors' advisories" data sets yield more stable and better results for VDMs, and the performance of the s-shaped logistic model (AML) appears superior overall. Meanwhile, the Anderson thermodynamic model (AT) is ind...
Stagge, James; Tallaksen, Lena; Rizzi, Jonathan
In response to the major European drought events of the last decade, projecting future drought frequency and severity in a non-stationary climate is a major concern for Europe. Prior drought studies have identified regional hotspots in the Mediterranean and Eastern European regions, but have otherwise produced conflicting results with regard to future drought severity. Some of this disagreement is likely related to the relatively coarse resolution of Global Climate Models (GCMs) and regional averaging, which tends to smooth extremes. This study makes use of the most current Regional Climate Models (RCMs) forced with CMIP5 climate projections to quantify the projected change in meteorological drought for Europe during the next century at a fine, gridded scale. Meteorological drought is quantified using the Standardized Precipitation Index (SPI) and the Standardized Precipitation-Evapotranspiration Index (SPEI), which normalize accumulated precipitation and climatic water balance anomaly, respectively, for a specific location and time of year. By comparing projections for these two indices, the importance of precipitation deficits can be contrasted with the importance of evapotranspiration increases related to temperature changes. Climate projections are based on output from CORDEX (the Coordinated Regional Climate Downscaling Experiment), which provides high resolution regional downscaled climate scenarios that have been extensively tested for numerous regions around the globe, including Europe. SPI and SPEI are then calculated on a gridded scale at a spatial resolution of either 0.44 degrees (~50 km) or 0.11 degrees (~12.5 km) for the three projected emission pathways (RCP2.6, RCP4.5, RCP8.5). Analysis is divided into two major sections: first validating the models with respect to observed historical trends in meteorological drought from 1970-2005 and then comparing drought severity and frequency during three future time periods (2011-2040, 2041-2070, 2071-2100) to the
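The SPI described above maps an accumulated precipitation total onto a standard normal deviate relative to its climatology. Operational SPI fits a gamma distribution to the historical totals; the sketch below uses a simpler empirical (plotting-position) variant for brevity, and the climatology values are invented for illustration.

```python
# Empirical SPI sketch: rank the observed total against its climatology,
# convert the rank to a non-exceedance probability, and map that
# probability to a standard normal deviate.
from statistics import NormalDist

def spi_empirical(value, climatology):
    """Empirical SPI of `value` relative to a list of historical totals."""
    n = len(climatology)
    rank = sum(1 for x in climatology if x <= value)
    p = rank / (n + 1)                 # Weibull plotting position
    return NormalDist().inv_cdf(p)     # standard normal quantile

# Hypothetical monthly precipitation climatology (mm)
clim = [42.0, 55.0, 61.0, 48.0, 70.0, 39.0, 58.0, 66.0, 51.0, 45.0]
print(round(spi_empirical(39.0, clim), 2))  # strongly negative: dry anomaly
```

SPEI follows the same transformation applied to the climatic water balance (precipitation minus potential evapotranspiration) instead of precipitation alone.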
Bates, J. Ray
A theoretical investigation of climate stability and sensitivity is carried out using three simple linearized models based on the top-of-the-atmosphere energy budget. The simplest is the zero-dimensional model (ZDM) commonly used as a conceptual basis for climate sensitivity and feedback studies. The others are two-zone models with tropics and extratropics of equal area; in the first of these (Model A), the dynamical heat transport (DHT) between the zones is implicit, in the second (Model B) it is explicitly parameterized. It is found that the stability and sensitivity properties of the ZDM and Model A are very similar, both depending only on the global-mean radiative response coefficient and the global-mean forcing. The corresponding properties of Model B are more complex, depending asymmetrically on the separate tropical and extratropical values of these quantities, as well as on the DHT coefficient. Adopting Model B as a benchmark, conditions are found under which the validity of the ZDM and Model A as climate sensitivity models holds. It is shown that parameter ranges of physical interest exist for which such validity may not hold. The 2 × CO2 sensitivities of the simple models are studied and compared. Possible implications of the results for sensitivities derived from GCMs and palaeoclimate data are suggested. Sensitivities for more general scenarios that include negative forcing in the tropics (due to aerosols, inadvertent or geoengineered) are also studied. Some unexpected outcomes are found in this case. These include the possibility of a negative global-mean temperature response to a positive global-mean forcing, and vice versa.
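The zero-dimensional model (ZDM) discussed above reduces climate sensitivity to a balance between radiative forcing and the global-mean radiative response. A minimal sketch follows; the forcing and response-coefficient values are illustrative round numbers, not figures taken from the paper.

```python
# Zero-dimensional model (ZDM) sketch: at equilibrium the top-of-atmosphere
# budget gives dT = F / lambda, where F is the global-mean radiative
# forcing (W m^-2) and lambda the global-mean radiative response
# coefficient (W m^-2 K^-1).

def zdm_response(forcing, response_coeff):
    """Equilibrium global-mean temperature change (K) for the ZDM."""
    return forcing / response_coeff

# Illustrative 2xCO2 forcing of ~3.7 W m^-2 and lambda of 1.85 W m^-2 K^-1
# give a sensitivity of 2 K.
print(zdm_response(3.7, 1.85))  # 2.0
```

The two-zone models add separate tropical and extratropical budgets coupled by a dynamical heat transport term, which is why their sensitivity depends asymmetrically on the zonal forcings rather than only on the global means.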
Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.
Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenarios, climate models, downscaling and impact models. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.
Van Dijk, A; Den Outer, P.N.; Slaper, H.
The AMOUR2.0 (Assessment Model for Ultraviolet radiation and Risks) model is presented. With this model it is possible to relate ozone depletion scenarios to (changes in) skin cancer incidence. The estimation of UV maps is integrated in the model. The satellite-based method to estimate UV maps is validated for EPTOMS (Earth Probe - Total Ozone Mapping Spectrometer) data against ground measurements for 17 locations in Europe. For most ground stations the estimates for the annual dose agree within 5%. Deviations are related to high ground albedo. A suggestion is made for improving the albedo correction. The AMOUR2.0 UV estimate was found to correspond better with ground measurements than the models from NASA (National Aeronautics and Space Administration in the USA), TEMIS (Tropospheric Emission Monitoring Internet Service of the European Space Agency ESA) and FMI (Finnish Meteorological Institute). The EPTOMS-UV product and the FMI model overestimate the UV dose. The TEMIS model has a good clear-sky correspondence with ground measurements, but overestimates UV in clouded situations. Satellite measurements of ozone and historic chlorine levels have been used to make global estimates for future ozone levels for a collection of emission scenarios for ozone depleting substances. Analysis of the 'best guess' scenario shows that the minimum in ozone level will be reached within 15 years from now. In 2050 the UV dose for Europe will to a large extent have returned to the values observed in 1980 if there is no climate-change driven alteration in cloud patterns. Future incidence maps up to the year 2100 are estimated with the dose-effect relation presented in an earlier study. This is done for three UV related types of skin-cancer: Basal Cell Carcinoma (BCC), Squamous Cell Carcinoma (SCC) and Cutaneous Malignant Melanoma (CMM). For a stationary population, global incidences of BCC and CMM are expected to peak around the year 2065 and for SCC around 2040.
Precipitation, soil moisture, and runoff are vital to ecosystems and human activities. Predicting changes in the space-time characteristics of these water cycle processes has been a longstanding challenge in climate modeling. Different modeling approaches have been developed to allow high resolution to be achieved using available computing resources. Although high resolution is necessary to better resolve regional forcing and processes, improvements in simulating water cycle response are difficult to demonstrate and climate models have so far shown irreducible sensitivity to model resolution, dynamical framework, and physics parameterizations that confounds reliable predictions of regional climate change. Additionally, regional climate responds to both regional and global forcing but predicting changes in regional and global forcing such as related to land use/land cover and aerosol requires improved understanding and modeling of the dynamics of human-earth system interactions. Furthermore, regional response and regional forcing may be related through complex interactions that are dependent on the regional climate regimes, making decisions on regional mitigation and adaptation more challenging. Examples of the aforementioned challenges from on-going research and possible future directions will be discussed.
Simulations of the ionospheric model of Schunk et al. (1986) have been used for climatology and weather modeling. Steady state empirical models were used in the climatology model to provide plasma convection and particle precipitation patterns in the northern high-latitude region. The climatology model also depicts the ionospheric electron density and ion and electron temperatures for solar maximum, winter solstice, and strong geomagnetic activity conditions. The weather model describes the variations of ionospheric features during the solar cycle, seasonal changes, and geomagnetic activity. Prospects for future modeling are considered. 23 references
Yu, Winston; Yang, Yi-chen; Savitsky, Andre; Alford, Donald; Brown, Casey; Wescoat, James; Debowicz, Dario; Robinson, Sherman
Describes two models used in the integrated modeling framework designed to study water, climate, agriculture and the economy in Pakistan's Indus Basin: (1) the Indus Basin Model Revised (IBMR-1012), a hydro-economic optimization model that takes a variety of inputs (such as agronomic information, irrigation system data, and water inflows) to generate the optimal crop production across the provinces (subject to a variety of physical and political constraints) for every month of the year; and (...
Hertel, Thomas W.; Keeney, Roman; Valenzuela, Ernesto
This paper presents a validation experiment of a global CGE trade model widely used for analysis of trade liberalization. We focus on the ability of the model to reproduce price volatility in wheat markets. The literature on model validation is reviewed with an eye towards designing an appropriate methodology for validating large scale CGE models. The validation experiment results indicate that in its current form, the GTAP-AGR model is incapable of reproducing wheat market price volatility a...
Larsén, Xiaoli Guo; Mann, Jakob; Berg, Jacob
Selected outputs from simulations with the regional climate model REMO from the Max Planck Institute, Hamburg, Germany were studied in connection with wind energy resource assessment. It was found that the mean wind characteristics based on observations from six mid-latitude stations are well described by the standard winds derived from the REMO pressure data. The mean wind parameters include the directional wind distribution, directional and omni-directional mean values and Weibull fitting parameters, spectral analysis and interannual variability of the standard winds. It was also found that...
The main aims of the present study are: (1) to evaluate the performance of two well-known mesoscale NWP (numerical weather prediction) models coupled to a UCM (Urban Canopy Models), and (2) to develop a proper measurement strategy for obtaining meteorological data that can be used in model evaluation studies. We choose the mesoscale models WRF (Weather Research and Forecasting Model) and RAMS (Regional Atmospheric Modeling System), respectively, because the partners in the present project have a large expertise with respect to these models. In addition WRF and RAMS have been successfully used in the meteorology and climate research communities for various purposes, including weather prediction and land-atmosphere interaction research. Recently, state-of-the-art UCM's were embedded within the land surface scheme of the respective models, in order to better represent the exchange of heat, momentum, and water vapour in the urban environment. Key questions addressed here are: What is the general model performance with respect to the urban environment?; How can useful and observational data be obtained that allow sensible validation and further parameterization of the models?; and Can the models be easily modified to simulate the urban climate under Dutch climatic conditions, urban configuration and morphology? Chapter 2 reviews the available Urban Canopy Models; we discuss their theoretical basis, the different representations of the urban environment, the required input and the output. Much of the information was obtained from the Urban Surface Energy Balance: Land Surface Scheme Comparison project (PILPS URBAN, PILPS stands for Project for Inter-comparison of Land-Surface Parameterization Schemes). This project started in March 2008 and was coordinated by the Department of Geography, King's College London. In order to test the performance of our models we participated in this project. Chapter 3 discusses the main results of the first phase of PILPS URBAN. A first
Soliman, E.; Jeuland, M.
Although the Nile River Basin is rich in natural resources, it faces many challenges. Rainfall is highly variable across the region, on both seasonal and inter-annual scales. This variability makes the region vulnerable to droughts and floods. Many development projects involving Nile waters are currently underway, or being studied. These projects will lead to changes in land-use patterns and in water distribution and availability. It is thus important to assess the effects of a) these projects and b) evolving water resource management and policies, on regional hydrological processes. This paper seeks to establish a basis for evaluation of such impacts within the Blue Nile River sub-basin, using the RegCM3 Regional Climate Model to simulate interactions between the land surface and climatic processes. We first present results from application of this RCM model nested with downscaled outputs obtained from the ECHAM5/MPI-OM1 transient simulations for the 20th Century. We then investigate changes associated with mid-21st century emissions forcing of the SRES A1B scenario. The results obtained from the climate model are then fed as inputs to the Nile Forecast System (NFS), a hydrologic distributed rainfall runoff model of the Nile Basin. The interaction between climatic and hydrological processes on the land surface has been fully coupled. Rainfall patterns and evaporation rates have been generated using RegCM3, and the resulting runoff and Blue Nile streamflow patterns have been simulated using the NFS. This paper compares the results obtained from the RegCM3 climate model with observational datasets for precipitation and temperature from the Climate Research Unit (UK) and the NASA Goddard Space Flight Center GPCP (USA) for 1985-2000. The validity of the streamflow predictions from the NFS is assessed using historical gauge records. Finally, we present results from modeling of the A1B emissions scenario of the IPCC for the years 2034-2055. Our results indicate that future
Climate modelers are most confident about the radiative heating by greenhouse gases and cooling by industrial and other aerosols. We have considerable confidence about our ability to simulate the large scale circulation of the atmosphere and oceans but as yet have little prediction skill for near-surface continental climates, extreme events or knowledge of the sensitivity to socio-economic forcing functions which drive climate change. As yet no formalism exists for validation of coupled climate models but evaluation and confirmation can and must be attempted by examination of the results of fully coupled simulations; model component intercomparisons; and sensitivity studies of interactions between numerical model components. Modelling and observation communities must jointly strive for improved accuracy (determined by more careful validation against high quality data); enhanced regional to local specificity (gained by model improvements and enhanced validation); and by increasing the skill with which we can detect climate change (observationally driven, enhanced by model-based sampling strategies and scenarios). In this paper, specific examples of these three challenges to climate prediction improvement were discussed: better accuracy in terms of continental surface climate prediction; enhanced specificity in terms of tropical cyclone predictions; and improved detection in terms of increased understanding of the global carbon cycle.
It is well known that soil erosion leads to agricultural productivity decline and contributes to water quality decline. The current widely used models for determining soil erosion for management purposes in agriculture focus on long term (~20 years) average annual soil loss and are not well suited to determining variations that occur over short timespans and as a result of climate change. Soil loss resulting from rainfall erosion is directly dependent on the product of runoff and sediment concentration both of which are likely to be influenced by climate change. This presentation demonstrates the capacity of models like the USLE, USLE-M and WEPP to predict variations in runoff and erosion associated with rainfall events eroding bare fallow plots in the USA with a view to modelling rainfall erosion in areas subject to climate change.
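The USLE family of models mentioned above estimates average annual soil loss as a product of empirical factors; the event-based variants (USLE-M, WEPP) instead work from runoff and sediment concentration. A minimal sketch of the factor-product form follows; the factor values are hypothetical, chosen only to show the arithmetic.

```python
# USLE sketch: annual soil loss A = R * K * LS * C * P, the product of
# rainfall erosivity (R), soil erodibility (K), slope length/steepness
# (LS), cover-management (C) and support-practice (P) factors.

def usle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss (e.g. t/ha/yr) from USLE factors."""
    return R * K * LS * C * P

# Hypothetical factor values for a bare fallow plot (P = 1, no support
# practice; C < 1 once cover is present).
A = usle_soil_loss(R=1500.0, K=0.03, LS=1.2, C=0.2, P=1.0)
print(round(A, 2))  # 10.8
```

Because soil loss scales directly with R (and, in the event-based variants, with runoff), any climate-driven change in rainfall erosivity propagates linearly into the predicted loss, which is why short-timescale variation matters under climate change.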
Berhane, F. G.; Anyah, R. O.
The Soil and Water Assessment Tool (SWAT2009) model has been applied to the Blue Nile Basin to study the hydrological response to surrogate climate changes over the Blue Nile Basin (Ethiopia) by downscaling gridded weather data. The specific objectives of the study include (i) examining the performance of the SWAT model in simulating hydrology-climate interactions and feedbacks within the entire Blue Nile Basin, and (ii) investigating the response of hydrological variables to surrogate climate changes. Monthly weather data from the Climate Research Unit (CRU) are converted to daily values as input into the SWAT using the Monthly to Daily Weather Converter (MODAWEC). Using the program SUFI-2 (Sequential Uncertainty Fitting Algorithm), data from 1979 to 1983 are applied for sensitivity analysis and calibration (P-factor = 90%, R-factor = 0.7, R2 = 0.93 and NS = 0.93) and subsequently to validate hindcasts over the period 1984-1989 (R2 = 0.92 and NS = 0.92). The period from 1960-2000 was used as baseline and has been used to determine the changes and the effect of the surrogate climate changes over the Blue Nile Basin. Overall, our surrogate climate change based simulations indicate the hydrology of the Blue Nile catchment is very sensitive to potential climate change, with 100%, 34% and 51% increases in surface runoff, lateral flow and water yield, respectively, for the A2 scenario surrogate. Key Words: SWAT, MODAWEC, Blue Nile Basin, SUFI-2, climate change, hydrological modeling, CRU
Romanach, Stephanie; Watling, James I.; Fletcher, Robert J., Jr.; Speroterra, Carolina; Bucklin, David N.; Brandt, Laura A.; Pearlstine, Leonard G.; Escribano, Yesenia; Mazzotti, Frank J.
Climate change poses new challenges for natural resource managers. Predictive modeling of species–environment relationships using climate envelope models can enhance our understanding of climate change effects on biodiversity, assist in assessment of invasion risk by exotic organisms, and inform life-history understanding of individual species. While increasing interest has focused on the role of uncertainty in future conditions on model predictions, models also may be sensitive to the initial conditions on which they are trained. Although climate envelope models are usually trained using data on contemporary climate, we lack systematic comparisons of model performance and predictions across alternative climate data sets available for model training. Here, we seek to fill that gap by comparing variability in predictions between two contemporary climate data sets to variability in spatial predictions among three alternative projections of future climate. Overall, correlations between monthly temperature and precipitation variables were very high for both contemporary and future data. Model performance varied across algorithms, but not between two alternative contemporary climate data sets. Spatial predictions varied more among alternative general-circulation models describing future climate conditions than between contemporary climate data sets. However, we did find that climate envelope models with low Cohen's kappa scores made more discrepant spatial predictions between climate data sets for the contemporary period than did models with high Cohen's kappa scores. We suggest conservation planners evaluate multiple performance metrics and be aware of the importance of differences in initial conditions for spatial predictions from climate envelope models.
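Cohen's kappa, the performance metric referenced above, measures chance-corrected agreement between predicted and observed presence/absence. A short sketch follows; the confusion-matrix counts are hypothetical.

```python
# Cohen's kappa from a 2x2 confusion matrix of predicted vs observed
# presence/absence: kappa = (p_o - p_e) / (1 - p_e), where p_o is the
# observed agreement and p_e the agreement expected by chance.

def cohens_kappa(tp, fp, fn, tn):
    """Kappa for true/false positives (tp, fp) and false/true negatives (fn, tn)."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical model evaluation: 40 true presences, 10 false presences,
# 5 missed presences, 45 true absences.
print(round(cohens_kappa(40, 10, 5, 45), 2))  # 0.7
```

Low kappa in this study flagged models whose spatial predictions diverged most between the two contemporary climate data sets, which is why evaluating multiple metrics is recommended.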
The BIOME model of Prentice et al. (1992), which predicts global vegetation patterns in equilibrium with climate, is coupled with the ECHAM climate model of the Max-Planck-Institut fuer Meteorologie, Hamburg. It is found that incorporation of the BIOME model into ECHAM, regardless of the coupling frequency, does not enhance the simulated climate variability, expressed in terms of differences between global vegetation patterns. Strongest changes are seen only between the initial biome distribution and the biome distribution computed after the first simulation period, provided that the climate-biome model is started from a biome distribution that resembles the present-day distribution. After the first simulation period, there is no significant shrinking, expanding, or shifting of biomes. Likewise, no trend is seen in global averages of land-surface parameters and climate variables. (orig.)
An important source of uncertainty in climate models is linked to the calibration of model parameters. Interest in systematic and automated parameter optimization procedures stems from the desire to improve the model climatology and to quantify the average sensitivity associated with potential changes in the climate system. Building on the smoothness of the response of an atmospheric circulation model (AGCM) to changes of four adjustable parameters, Neelin et al. (2010) used a quadratic metamodel to objectively calibrate the AGCM. The metamodel accurately estimates global spatial averages of common fields of climatic interest, from precipitation, to low and high level winds, from temperature at various levels to sea level pressure and geopotential height, while providing a computationally cheap strategy to explore the influence of parameter settings. Here, guided by the metamodel, the ambiguities or dilemmas related to the decision making process in relation to model sensitivity and optimization are examined. Simulations of current climate are subject to considerable regional-scale biases. Those biases may vary substantially depending on the climate variable considered, and/or on the performance metric adopted. Common dilemmas are associated with model revisions yielding improvement in one field or regional pattern or season, but degradation in another, or improvement in the model climatology but degradation in the interannual variability representation. Challenges are posed to the modeler by the high dimensionality of the model output fields and by the large number of adjustable parameters. The use of the metamodel in the optimization strategy helps visualize trade-offs at a regional level, e.g., how mismatches between sensitivity and error spatial fields yield regional errors under minimization of global objective functions.
Tebaldi, Claudia; Arblaster, Julie M.; Knutti, Reto
Climate change projections are often based on simulations from multiple global climate models and are presented as maps with some form of stippling or measure of robustness to indicate where different models agree on the projected anthropogenically forced changes. The criteria used to determine model agreement, however, often ignore the presence of natural internal variability. We demonstrate that this leads to misleading presentations of the degree of model consensus on the sign and magnitude of the change if the ratio of the signal from the externally forced change to internal variability is low. We present a simple alternative method of depicting multimodel projections that clearly separates lack of climate change signal from lack of model agreement by assessing the degree of consensus on the significance of the change as well as the sign of the change. Our results demonstrate that the common interpretation of lack of model agreement in precipitation projections is largely an artifact of the large noise from climate variability masking the signal, an issue exacerbated by performing analyses at the grid point scale. We argue that separating more clearly the case of lack of agreement from the case of lack of signal will add valuable information for stakeholders' decision making, since adaptation measures required in the two cases are potentially very different.
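The separation of "lack of signal" from "lack of agreement" that this abstract proposes can be sketched as a three-way classification of each grid point. The specific thresholds below (a 1.64-sigma significance test, at least half the models significant, 80% sign agreement) are illustrative assumptions, not the paper's actual criteria.

```python
import numpy as np

def classify_projection(changes, sigma_internal, z=1.64):
    """Classify grid points of a multimodel projection.

    changes        : (n_models, n_points) projected change per model
    sigma_internal : (n_points,) estimated std dev of internal variability
    Returns one label per grid point:
      'no_signal' - most models' changes lie within internal variability
      'agree'     - significant change and models agree on the sign
      'disagree'  - significant change but models disagree on the sign
    """
    signif = np.abs(changes) > z * sigma_internal      # per model, per point
    frac_signif = signif.mean(axis=0)
    frac_pos = (changes > 0).mean(axis=0)
    labels = np.full(changes.shape[1], 'no_signal', dtype=object)
    strong = frac_signif >= 0.5
    agree = (frac_pos >= 0.8) | (frac_pos <= 0.2)
    labels[strong & agree] = 'agree'
    labels[strong & ~agree] = 'disagree'
    return labels
```

A map coloured by these three labels distinguishes regions where models genuinely conflict from regions where the forced signal is simply buried in noise.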
Kiehl, Jeffrey T.; Gent, Peter R.
The Community Climate System Model, version 2 (CCSM2) is briefly described. A 1000-yr control simulation of the present day climate has been completed without flux adjustments. Minor modifications were made at year 350, which included all five components using the same physical constants. There are very small trends in the upper-ocean, sea ice, atmosphere, and land fields after year 150 of the control simulation. The deep ocean has small but significant trends; however, these are not large enough that the control simulation could not be continued much further. The equilibrium climate sensitivity of CCSM2 is 2.2 K, which is slightly larger than the Climate System Model, version 1 (CSM1) value of 2.0 K. Several aspects of the control simulation's mean climate and interannual variability are described, and good and bad properties of the control simulation are documented. In particular, several aspects of the simulation, especially in the Arctic region, are much improved over those obtained in CSM1. Other aspects, such as the tropical Pacific region simulation, have not been improved much compared to those in CSM1. Priorities for further model development are discussed in the conclusions section.
Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts from a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
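Under a single-fault assumption, the kind of logical inference from analytical redundant relations that this abstract describes can be sketched as follows: a sensor implicated by every violated relation, and exonerated by no satisfied relation, is a fault candidate. The data structures, residual functions, and tolerance are hypothetical illustrations, not the authors' algorithm.

```python
def isolate_faulty_sensors(arrs, readings, tol=1e-6):
    """Single-fault isolation from analytical redundant relations (ARRs).

    arrs     : list of (sensor_names, residual_fn); residual_fn(readings)
               should be ~0 when all involved sensors are healthy
    readings : dict mapping sensor name -> observed value
    Returns the set of sensors logically consistent with the observed
    residual pattern under a single-fault assumption.
    """
    violated, satisfied = [], []
    for sensors, residual in arrs:
        (violated if abs(residual(readings)) > tol else satisfied).append(set(sensors))
    if not violated:
        return set()                              # no fault detected
    candidates = set.intersection(*violated)      # in every violated ARR
    for s in satisfied:
        candidates -= s                           # exonerated by a satisfied ARR
    return candidates
```

For three mutually redundant sensors a, b, c with relations a-b, b-c, a-c, a faulty b violates the first two relations while the third exonerates a and c, so the inference returns exactly {b}.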
Hashemi Farahani, H.; Ditmar, P.G.; Klees, R.; De Teixeira da Encarnacao, J.G.; Liu, X.; Zhao, Q.; Guo, J.
The ability of satellite gravimetry data to validate global static models of the Earth’s gravity field is studied. Two types of data are considered: K-band ranging (KBR) data from the Gravity Recovery and Climate Experiment (GRACE) mission and Satellite Gravity Gradiometry (SGG) data from the GOCE (Gravity field and steady-state Ocean Circulation Explorer) mission. The validation is based on analysis of misfits obtained as the differences between the data observed and those computed with a fo...
Li, Z.; Yang, C.; Liu, K.; Sun, M.; XIA, J.; Huang, Q.
Climate studies have become increasingly important due to global climate change, one of the biggest challenges for humanity in the 21st century. Climate data, not only observation data collected from various sensors but also simulated data generated from diverse climate models, are essential for scientists to explore potential climate change patterns and analyze the complex climate dynamics. Climate modeling and simulation, a critical methodology for simulating the past and predicting future climate conditions, can produce huge amounts of data that contain potentially valuable information for climate studies. However, using modeling methods in climate studies poses at least two challenges for scientists. First, running climate models is a computing-intensive process, which requires large amounts of computational resources. Second, running climate models is also a data-intensive process generating big geospatial data (model output), which demands large storage for managing the data and large computing power to process and analyze these data. This presentation introduces a novel framework to tackle the two challenges by 1) running climate models in a cloud environment in an automated fashion, and 2) managing and parallel processing big model output data by leveraging cloud computing technologies. A prototype system is developed based on the framework using ModelE as the climate model. Experimental results show that this framework can improve the climate modeling research cycle by accelerating big data generation (model simulation), big data management (storage and processing) and on-demand big data analytics.
The prime goal of model validation is to build confidence in the model concept and in the model being fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far-field code FARF31 is made relatively simple and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among its advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. As a component in the model chain PROPER, it makes it easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual-porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity
Alonso Tapia, Jesús; Fernández Heredia, Blanca
Research on classroom goal-structures (CGS) has shown the usefulness of assessing the classroom motivational climate to evaluate educational interventions and to promote changes in teachers' activity. So, the Classroom Motivational Climate Questionnaire for Secondary and High-School students was developed. To validate it, confirmatory factor analysis and correlation and regression analyses were performed. Results showed that the CMCQ is a highly reliable instrument that covers many of the types of teaching patterns that favour motivation to learn, correlates as expected with other measures of CGS, predicts satisfaction with teacher's work well, and allows detecting teachers who should revise their teaching. PMID:18940098
Yi Huang; Ramaswamy, V.
We examine global climate models by comparing the satellite-observed high-resolution global infrared spectra with the model-simulated counterpart. Because the top-of-the-atmosphere outgoing Earth thermal emission at different frequencies is sensitive to different geophysical variables (temperature, water vapor and other greenhouse gas concentrations, clouds, etc.) at various levels, a comparison of observed and simulated spectra is as challenging as examining a variety of model-simulated geoph...
As part of the development effort of a regional climate model (RCM) for the southern Great Basin, this paper presents a validation analysis of the climatology generated by a high-resolution RCM driven by observations. Two multiyear simulations were performed over the western United States with the RCM driven by European Centre for Medium-Range Weather Forecasts analyses of observations. This validation analysis is the first phase of a project to produce simulations of future climate scenarios over a region surrounding Yucca Mountain, Nevada, the only location currently being considered as a potential high-level nuclear-waste repository site. Model-produced surface air temperatures and precipitation were compared with observations from five southern Nevada stations located in the vicinity of Yucca Mountain. The seasonal cycles of temperature and precipitation were simulated well. Monthly and seasonal temperature biases were generally negative and largely explained by differences in elevation between the observing stations and the model topography. The model-simulated precipitation captured the extreme dryness of the Great Basin. Average yearly precipitation biases were mostly negative in the summer and positive in the winter. The number of simulated daily precipitation events for various precipitation intervals was within factors of 1.5-3.5 of observed. Overall, the model tended to overestimate the number of light precipitation events and underestimate the number of heavy precipitation events. At Yucca Mountain, simulated precipitation, soil moisture content, and water infiltration below the root zone (top 1 m) were maximized in the winter. Evaporation peaked in the spring after temperatures began to increase. The conclusion drawn from this validation analysis is that this high-resolution RCM simulates the regional surface climatology of the southern Great Basin reasonably well when driven by meteorological fields derived from observations. 26 refs., 9 figs., 4 tabs
In this paper we discuss two different methods for validation of regression models, applied to corrosion data. One of them is based on the correlation coefficient and the other is the statistical test of lack of fit. Both methods are used here to analyse the fit of the bilogarithmic model in order to predict corrosion for very low carbon steel substrates in rural and urban-industrial atmospheres in Uruguay. Results for parameters A and n of the bilogarithmic model are reported here. For this purpose, all repeated values were used instead of using average values as usual. Modelling is carried out using experimental data corresponding to steel substrates under the same initial meteorological conditions (in fact, they were placed in the rack at the same time). Results of the correlation coefficient are compared with the lack of fit tested at two different significance levels (α=0.01 and α=0.05). Unexpected differences between them are explained and, finally, it is possible to conclude, at least in the studied atmospheres, that the bilogarithmic model does not properly fit the experimental data. (Author) 18 refs
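The lack-of-fit test with replicated observations that this abstract relies on can be sketched for the bilogarithmic model C = A·tⁿ as below: fit the model in log-log space using all repeats, then split the residual sum of squares into pure error (scatter of replicates about their group means) and lack of fit. This is the standard pure-error decomposition, not necessarily the authors' exact implementation.

```python
import numpy as np

def lack_of_fit_F(t, C):
    """Lack-of-fit F statistic for the bilogarithmic model C = A * t**n.

    t, C : arrays of exposure times and corrosion losses, with replicated
           observations at each exposure time (all repeats, not averages).
    Returns (F, df_lof, df_pe); compare F with the critical value of the
    F(df_lof, df_pe) distribution at the chosen significance level.
    """
    x, y = np.log(np.asarray(t, float)), np.log(np.asarray(C, float))
    # Ordinary least squares fit of y = log A + n * x
    n_hat, logA = np.polyfit(x, y, 1)
    resid = y - (logA + n_hat * x)
    sse = np.sum(resid ** 2)
    # Pure-error SS: scatter of replicates about their group means
    sspe = 0.0
    levels = np.unique(x)
    for xv in levels:
        g = y[x == xv]
        sspe += np.sum((g - g.mean()) ** 2)
    sslof = sse - sspe                # lack-of-fit SS
    df_lof = len(levels) - 2          # groups minus fitted parameters
    df_pe = len(y) - len(levels)
    F = (sslof / df_lof) / (sspe / df_pe)
    return F, df_lof, df_pe
```

A large F relative to the critical value rejects the model, which is the kind of evidence the abstract reports against the bilogarithmic fit.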
The Earth is warming up, with potentially disastrous consequences. Computer climate models based on physics are our best hope of predicting and managing climate change, as Adam Scaife, Chris Folland and John Mitchell explain. This month scientists from over 60 nations on the Intergovernmental Panel on Climate Change (IPCC) released the first part of their latest report on global warming. In the report the panel concludes that it is very likely that most of the 0.5 deg. C increase in global temperature over the last 50 years is due to man-made emissions of greenhouse gases. And the science suggests that much greater changes are in store: by 2100 anthropogenic global warming could be comparable to the warming of about 6 deg. C since the last ice age. The consequences of global warming could be catastrophic. As the Earth continues to heat up, the frequency of floods and droughts is likely to increase, water supplies and ecosystems will be placed under threat, agricultural practices will have to be changed and millions of people may be displaced as the sea level rises. The global economy could also be severely affected. The scientific consensus is that the observed warming of the Earth during the past half-century is mostly due to human emissions of greenhouse gases. Predicting climate change depends on sophisticated computer models developed over the past 50 years. Climate models are based on the Navier-Stokes equations for fluid flow, which are solved numerically on a grid covering the globe. These models have been very successful in simulating the past climate, giving researchers confidence in their predictions. The most likely value for the global temperature increase by 2100 is in the range 1.4-5.8 deg. C, which could have catastrophic consequences. (U.K.)
Mendizabal, Maddalen; Moncho, Roberto; Chust, Guillem; Torp, Peter
basin (Basque Country, North of Spain), so that adaptation strategies can be defined. In order to fulfil this objective four sub-objectives are defined: (1) selection of the future climate projections for the case study area from a wide spectrum of possibilities; (2) modelling the hydrological processes of the basin with a physically based, distributed hydrological model; (3) validation of the hydrological model with observation data; and (4) runoff simulation introducing the regional climate model data selected. The analysis of climate models suggests that extreme precipitation in the Basque Country increases by about 10% during the twenty-first century. This increase in extreme precipitation raises discharge and water level in the Nerbioi river basin. That is why in the 21st century the flood-prone area is expected to expand for precipitation with a return period of 50 years. In this context, it is necessary to define and evaluate different adaptation options which are already in practice or conceivable according to the current scientific knowledge, as well as to evaluate the adaptation measures in terms of their ability to lower the vulnerability of water resources to climate change. For example, land use change could be a useful tool to adapt our basin systems. Land use plays an important role in the water balance of a river by varying the proportion of precipitation that runs off and the fraction that is lost by evapotranspiration. Therefore, both climate change and adaptation strategies will have an impact on the hydrodynamic conditions of rivers; particularly, the changes in flow conditions will have severe ecological, economical and social impacts. As future work, adaptation measures will be introduced into the future runoff simulations in order to evaluate their effectiveness and to serve as a decision-making tool for operational organisations.
Frank, H.P.; Landberg, L.
The wind climate of Ireland has been calculated using the Karlsruhe Atmospheric Mesoscale Model KAMM. The climatology is represented by 65 frequency classes of geostrophic wind that were selected as equiangular direction sectors and speed intervals with equal frequency in a sector. The results are...
The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.
Watts, Mark D.; Ghayoomie, S. Vahid; Larson, Stephen D.; Gerkin, Richard C.
The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software that expresses scientific models.
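The distinction the abstract draws between ordinary unit tests and model validation tests can be made concrete with a hypothetical sketch: a validation test checks the scientific output of a simulation against known biology rather than the correctness of the code alone. The bounds and function names below are illustrative, not OpenWorm code.

```python
def validate_resting_potential(simulate_neuron, lo=-90.0, hi=-40.0):
    """Model validation test: checks the *scientific* output of a
    simulation against known physiology (here, that a simulated resting
    membrane potential falls within a plausible range in mV)."""
    v = simulate_neuron()
    assert lo <= v <= hi, f"resting potential {v} mV outside [{lo}, {hi}] mV"

def test_resting_potential():
    # Hypothetical stand-in for a real simulation run.
    fake_model = lambda: -65.0
    validate_resting_potential(fake_model)
```

Run under a test framework such as pytest, a failure here signals a scientifically implausible model, even when every conventional unit test of the underlying code passes.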
Smith, D. A. J.; Martin, C. E.; Saunders, C. J.; Smith, D. A.; Stokes, P. H.
The Satellite Collision Assessment for the UK Licensing Process (SCALP) model was first introduced in a paper presented at IAC 2003. As a follow-on, this paper details the steps taken to validate the model and describes some of its applications. SCALP was developed for the British National Space Centre (BNSC) to support liability assessments as part of the UK's satellite license application process. Specifically, the model determines the collision risk that a satellite will pose to other orbiting objects during both its operational and post-mission phases. To date SCALP has been used to assess several LEO and GEO satellites for BNSC, and subsequently to provide the necessary technical basis for licenses to be issued. SCALP utilises the current population of operational satellites residing in LEO and GEO (extracted from ESA's DISCOS database) as a starting point. Realistic orbital dynamics, including the approximate simulation of generic GEO station-keeping strategies are used to propagate the objects over time. The method takes into account all of the appropriate orbit perturbations for LEO and GEO altitudes and allows rapid run times for multiple objects over time periods of many years. The orbit of a target satellite is also propagated in a similar fashion. During these orbital evolutions, a collision prediction and close approach algorithm assesses the collision risk posed to the satellite population. To validate SCALP, specific cases were set up to enable the comparison of collision risk results with other established models, such as the ESA MASTER model. Additionally, the propagation of operational GEO satellites within SCALP was compared with the expected behaviour of controlled GEO objects. The sensitivity of the model to changing the initial conditions of the target satellite such as semi-major axis and inclination has also been demonstrated. A further study shows the effect of including extra objects from the GTO population (which can pass through the LEO
Proemmel, K. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Kuestenforschung
To determine whether the increase in resolution of climate models improves the representation of climate is a crucial topic in regional climate modelling. An improvement over coarser-scale models is expected especially in areas with complex orography or along coastlines. However, some studies have shown no clear added value for regional climate models. In this study a high-resolution regional climate model simulation performed with REMO over the period 1958-1998 is analysed for 2m temperature over the orographically complex European Alps and their surroundings, called the Greater Alpine Region (GAR). The model setup is in hindcast mode, meaning that the simulation is driven with perfect boundary conditions by the ERA40 reanalysis through prescribing the values at the lateral boundaries and spectral nudging of the large-scale wind field inside the model domain. The added value is analysed between the regional climate simulation with a resolution of 1/6° and the driving reanalysis with a resolution of 1.125°. Before analysing the added value, both the REMO simulation and the ERA40 reanalysis are validated against different station datasets of monthly and daily mean 2m temperature. The largest dataset is the dense, homogenised and quality-controlled HISTALP dataset covering the whole GAR, which gave the opportunity for the validation undertaken in this study. The temporal variability of temperature, as quantified by correlation, is well represented by both REMO and ERA40. However, both show considerable biases. The REMO bias reaches 3 K in summer in regions known to experience a problem with summer drying in a number of regional models. In winter the bias is strongly influenced by the choice of the temperature lapse rate, which is applied to compare grid box and station data at different altitudes, and has the strongest influence on inner-Alpine subregions where the altitude differences are largest. By applying a constant lapse rate the REMO bias in winter in the high
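The constant-lapse-rate adjustment used to compare grid-box and station temperatures at different altitudes reduces to a one-line correction; a sketch follows. The default of 6.5 K/km is the standard-atmosphere rate, chosen here for illustration: the abstract notes that the bias is sensitive to this choice without stating the value used.

```python
def station_bias(t_model, z_model, t_obs, z_station, lapse=0.0065):
    """Model-minus-observation 2m temperature bias after moving the model
    grid-box value from its topographic height z_model (m) down or up to
    the station altitude z_station (m) with a constant lapse rate (K/m).

    The default 6.5 K/km is the standard-atmosphere value (an illustrative
    assumption, not necessarily the constant used in the study).
    """
    t_adjusted = t_model + lapse * (z_model - z_station)
    return t_adjusted - t_obs
```

For a grid box 1000 m above a valley station, the adjustment warms the model value by 6.5 K before differencing, which is why altitude mismatches dominate the winter bias in inner-Alpine subregions.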
Alexander, K. A.; Easterbrook, S. M.
It has become common to compare and contrast the output of multiple global climate models (GCMs), such as in the Climate Model Intercomparison Project Phase 5 (CMIP5). However, intercomparisons of the software architecture of GCMs are almost nonexistent. In this qualitative study of seven GCMs from Canada, the United States, and Europe, we attempt to fill this gap in research. We describe the various representations of the climate system as computer programs, and account for architectural differences between models. Most GCMs now practice component-based software engineering, where Earth system components (such as the atmosphere or land surface) are present as highly encapsulated sub-models. This architecture facilitates a mix-and-match approach to climate modelling that allows for convenient sharing of model components between institutions, but it also leads to difficulty when choosing where to draw the lines between systems that are not encapsulated in the real world, such as sea ice. We also examine different styles of couplers in GCMs, which manage interaction and data flow between components. Finally, we pay particular attention to the varying levels of complexity in GCMs, both between and within models. Many GCMs have some components that are significantly more complex than others, a phenomenon which can be explained by the respective institution's research goals as well as the origin of the model components. In conclusion, although some features of software architecture have been adopted by every GCM we examined, other features show a wide range of different design choices and strategies. These architectural differences may provide new insights into variability and spread between models.
Mohamed, Y. A.; van den Hurk, B. J. J. M.; Savenije, H. H. G.; Bastiaanssen, W. G. M.
This paper presents the results of the regional coupled climatic and hydrologic model of the Nile Basin. For the first time the interaction between the climatic processes and the hydrological processes on the land surface has been fully coupled. The hydrological model is driven by the rainfall and the energy available for evaporation generated in the climate model, and the runoff generated in the catchment is again routed over the wetlands of the Nile to supply moisture for atmospheric feedback. The results obtained are quite satisfactory given the extremely low runoff coefficients in the catchment. The paper presents the validation results over the sub-basins: Blue Nile, White Nile, Atbara river, the Sudd swamps, and the Main Nile for the period 1995 to 2000. Observational datasets were used to evaluate the model results including radiation, precipitation, runoff and evaporation data. The evaporation data were derived from satellite images over a major part of the Upper Nile. Limitations in both the observational data and the model are discussed. It is concluded that the model provides a sound representation of the regional water cycle over the Nile. The sources of atmospheric moisture to the basin, and location of convergence/divergence fields could be accurately illustrated. The model is used to describe the regional water cycle in the Nile basin in terms of atmospheric fluxes, land surface fluxes and land surface-climate feedbacks. The monthly moisture recycling ratio (i.e. locally generated/total precipitation) over the Nile varies between 8 and 14%, with an annual mean of 11%, which implies that 89% of the Nile water resources originates from outside the basin physical boundaries. The monthly precipitation efficiency varies between 12 and 53%, and the annual mean is 28%. The mean annual result of the Nile regional water cycle is compared to that of the Amazon and the Mississippi basins.
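The recycling diagnostics quoted in the abstract reduce to simple ratios of locally generated to total precipitation; a sketch follows (the variable names and the precipitation-weighted annual mean are illustrative assumptions):

```python
import numpy as np

def recycling_ratio(p_local, p_total):
    """Monthly moisture recycling ratio: locally generated precipitation
    divided by total precipitation, month by month."""
    return np.asarray(p_local, float) / np.asarray(p_total, float)

def external_fraction(p_local, p_total):
    """Fraction of precipitation whose moisture originates outside the
    basin: one minus the precipitation-weighted annual recycling ratio."""
    return 1.0 - np.sum(p_local) / np.sum(p_total)
```

With an annual-mean recycling ratio of 11%, the external fraction is 89%, matching the abstract's statement that most Nile moisture originates outside the basin's physical boundaries.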
Wagner, S.; Fast, I.; F. Kaspar
In this study, we assess how the anthropogenically induced increase in greenhouse gas concentrations affects the climate of central and southern South America. We utilise two regional climate simulations for present day (PD) and pre-industrial (PI) times. These simulations are compared to historical reconstructions in order to investigate the driving processes responsible for climatic changes between the different periods. The regional climate model is validated against observations for both ...
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.
The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
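The pattern of wrapping existing science code behind a web interface, as the CMDA abstract describes, can be sketched framework-free: the handler below parses a JSON request body, calls the science function, and returns a JSON response. In a real deployment a framework such as Flask (which the abstract names) would route HTTP requests to such a handler; the function names and error convention here are illustrative assumptions.

```python
import json

def make_web_service(science_fn):
    """Wrap an existing analysis function as a web-service handler: parse
    a JSON request body, call the science code with the parsed parameters,
    and serialize the result (or the failure) back to JSON."""
    def handler(request_body):
        params = json.loads(request_body)
        try:
            result = science_fn(**params)
            return json.dumps({"status": "ok", "result": result})
        except Exception as e:   # report science-code failures to the client
            return json.dumps({"status": "error", "message": str(e)})
    return handler

# Example science code: an annual mean of monthly values.
def annual_mean(values):
    return sum(values) / len(values)
```

Calling `make_web_service(annual_mean)` yields a handler that turns `'{"values": [1.0, 2.0, 3.0]}'` into a JSON "ok" response, keeping the science code untouched behind the service boundary.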
Bavel, van, M.A.H.J.; Takakura, T.; Bot, G.P.A.
Three dynamic simulation models for calculating the greenhouse climate and its energy requirements for both heating and cooling were compared by making detailed computations for each of seven sets of data. The data sets ranged from a cold winter day, requiring heating, to a hot summer day, requiring cooling. On the whole, the models agreed in regard to calculated air temperature, humidity, and heating requirements. Significant differences were found between the estimates of fan-and-pad (evapo...
ZHANG Wei; YAN Minhua; CHEN Panqin; XU Helan
Regional climate models have become powerful tools for simulating regional climate and its change process, and have been widely used in China. Using regional climate models, research results have been obtained on the following aspects: 1) the numerical simulation of the East Asian monsoon climate, including exceptional monsoon precipitation, summer precipitation distribution, East Asian circulation, multi-year climate average conditions, the summer rain belt, and so on; 2) the simulation of the arid climate of western China, including the thermal effect of the Qinghai-Tibet Plateau, plateau precipitation in the Qilian Mountains, and the impacts of greenhouse effects (CO2 doubling) on the climate of western China; and 3) the simulation of the climate effect of underlying surface changes, including the effect of soil on climate formation, the influence of terrain on precipitation, the effect of regional soil degradation on regional climate, the effect of various underlying surfaces on regional climate, the effect of land-sea contrast on climate formation, the influence of snow cover over plateau regions on regional climate, the effect of vegetation changes on regional climate, etc. In the process of applying regional climate models, the models themselves have been improved so that better simulation results are obtained. Finally, some suggestions are made about the application of regional climate models in future regional climate research.
Schmidt, Gavin A.; Sherwood, Steven
We give an overview of the practice of developing and using complex climate models, as seen from experiences in a major climate modelling center and through participation in the Coupled Model Intercomparison Project (CMIP). We discuss the construction and calibration of models; their evaluation, especially through use of out-of-sample tests; and their exploitation in multi-model ensembles to identify biases and make predictions. We stress that adequacy or utility of climate models is best assessed via their skill against more naive predictions. The framework we use for making inferences about reality using simulations is naturally Bayesian (in an informal sense), and has many points of contact with more familiar examples of scientific epistemology. While the use of complex simulations in science is a development that changes much in how science is done in practice, we argue that the concepts being applied fit very much into traditional practices of the scientific method, albeit those more often associated with laboratory work.
Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid.
BACKGROUND: Dengue dynamics are driven by complex interactions between human hosts, mosquito vectors and viruses that are influenced by environmental and climatic factors. The objectives of this study were to analyze and model the relationships between climate, Aedes aegypti vectors and dengue outbreaks in Noumea (New Caledonia), and to provide an early warning system. METHODOLOGY/PRINCIPAL FINDINGS: Epidemiological and meteorological data were analyzed from 1971 to 2010 in Noumea. Entomological surveillance indices were available from March 2000 to December 2009. During epidemic years, the distribution of dengue cases was highly seasonal. The epidemic peak (March-April) lagged the warmest temperature by 1-2 months and was in phase with maximum precipitation, relative humidity and entomological indices. Significant inter-annual correlations were observed between the risk of outbreak and summertime temperature, precipitation or relative humidity, but not ENSO. Climate-based multivariate non-linear models were developed to estimate the yearly risk of dengue outbreak in Noumea. The best explicative meteorological variables were the number of days with maximal temperature exceeding 32°C during January-February-March and the number of days with maximal relative humidity exceeding 95% during January. The best predictive variables were the maximal temperature in December and the maximal relative humidity during October-November-December of the previous year. For a probability of dengue outbreak above 65% in leave-one-out cross-validation, the explicative model predicted 94% of the epidemic years and 79% of the non-epidemic years, and the predictive model 79% and 65%, respectively. CONCLUSIONS/SIGNIFICANCE: The epidemic dynamics of dengue in Noumea were essentially driven by climate during the last forty years. Specific conditions based on maximal temperature and relative humidity thresholds were determinant in outbreak occurrence. Their persistence was
A deterministic model for the transport of radionuclides in rivers was used to predict radionuclide activity concentrations in scenarios such as the Clinch-Tennessee rivers and the Dnjepr river, for which experimental data were provided by a VAMP subgroup. Successive calculation runs with data fitting and parameter adaptation led to improved results. The model gives reasonable agreement with experimental data
Hagemann, S.; Chen, Cui; Clark, D.B.; S. Folwell; Gosling, S.; Haddeland, I.; Hanasaki, N.; J. Heinke; F. Ludwig
Climate change is expected to alter the hydrological cycle, resulting in large-scale impacts on water availability. However, future climate change impact assessments are highly uncertain. For the first time, multiple global climate models (three) and hydrological models (eight) were used to systematically assess the hydrological response to climate change and project the future state of global water resources. The results show a large spread in projected changes in water resources within the climat...
Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm
As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior
Erickson, T. A.; Koziol, B. W.; Rood, R. B.
The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF and GRIB), while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
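The clipping step OpenClimateGIS performs can be illustrated in miniature. The real system works with Shapely geometries and netCDF grids served over OPeNDAP; this sketch, with invented data and a rectangular area of interest, only shows the cell-selection idea:

```python
def clip_grid(lats, lons, data, bbox):
    """Return (lat, lon, value) triples whose cell centers fall inside
    bbox = (south, north, west, east)."""
    s, n, w, e = bbox
    out = []
    for i, lat in enumerate(lats):
        for j, lon in enumerate(lons):
            if s <= lat <= n and w <= lon <= e:
                out.append((lat, lon, data[i][j]))
    return out

# Toy 3x3 grid over North America (values are arbitrary).
lats = [40.0, 45.0, 50.0]
lons = [-100.0, -95.0, -90.0]
data = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
subset = clip_grid(lats, lons, data, bbox=(42.0, 51.0, -96.0, -89.0))
```

A real clip against an arbitrary polygon would replace the bounding-box test with a point-in-polygon predicate (e.g. Shapely's `contains`).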
Forest, Chris E.; Stone, Peter H; Sokolov, Andrei P.
We present revised probability density functions for climate model parameters (effective climate sensitivity, the rate of deep-ocean heat uptake, and the strength of the net aerosol forcing) that are based on climate change observations from the 20th century. First, we compare observed changes in surface, upper-air, and deep-ocean temperature changes against simulations of 20th century climate in which the climate model parameters were systematically varied. The estimated 90% range of effecti...
Eliseev, Alexey V.; Denisov, Sergey N.; Arzhanov, Maxim M.; Mokhov, Igor I.
The methane cycle module of the global climate model of intermediate complexity developed at the A.M. Obukhov Institute of Atmospheric Physics, Russian Academy of Sciences (IAP RAS CM), is extended by coupling with a detailed module for thermal and hydrological processes in soil (Deep Soil Simulator; Arzhanov et al., 2008). This is an important improvement with respect to the earlier IAP RAS CM version (Eliseev et al., 2008), which employed prescribed soil hydrology to simulate CH4 emissions from soil. The geographical distribution of water-inundated soil in the model was also improved by replacing the older Olson ecosystem database with data based on the SCIAMACHY retrievals (Bergamaschi et al., 2007). The new version of the IAP RAS CM module for methane emissions from soil is validated using the simulation protocol adopted in WETCHIMP (Wetland and Wetland CH4 Inter-comparison of Models Project). In addition, the atmospheric part of the IAP RAS CM methane cycle is extended with a temperature dependence of the methane lifetime in the atmosphere in order to mimic the respective dependence of atmospheric methane chemistry (Denisov et al., 2012). The IAP RAS CM simulations are performed for the 18th-21st centuries according to the CMIP5 protocol, taking into account natural and anthropogenic forcings. The new IAP RAS CM version realistically reproduces pre-industrial and present-day characteristics of the global methane cycle, including the CH4 concentration qCH4 in the atmosphere and CH4 emissions from soil. The latter amount to 150-160 Tg CH4/yr for the late 20th century and increase to 170-230 Tg CH4/yr in the late 21st century. Atmospheric methane concentration equals 3900 ppbv under the most aggressive anthropogenic scenario, RCP 8.5, and 1850-1980 ppbv under the more moderate scenarios RCP 6.0 and RCP 4.5. Under the least aggressive scenario, RCP 2.6, qCH4 reaches a maximum of 1730 ppbv in the 2020s and declines afterwards. Climate change impact on the methane emissions from
Holtanová, Eva; Mikšovský, Jiří; Kalvová, Jaroslava; Pišoft, Petr; Motl, Martin
We show the evaluation of ENSEMBLES regional climate models (RCMs) driven by reanalysis ERA40 over a region centered at the Czech Republic. Attention is paid especially to the model ALADIN-CLIMATE/CZ, being used as the basis of the new climate change scenarios simulation for the Czech Republic. The validation criteria used here are based on monthly or seasonal mean air temperature and precipitation. We concentrate not only on spatiotemporal mean values but also on temporal standard deviation, inter-annual variability, the mean annual cycle, and the skill of the models to represent the observed spatial patterns of these quantities. Model ALADIN-CLIMATE/CZ performs quite well in comparison to the other RCMs; we find its performance satisfactory for further use for impact studies. However, it is also shown that the results of evaluation of the RCMs' skill in simulating observed climate strongly depend on the criteria incorporated for the evaluation.
Smoothness is an important characteristic of a spatial process that measures local variability. If climate model outputs are realistic, then not only the values at each grid pixel but also the relative variation over nearby pixels should represent the true climate. We estimate the smoothness of long-term averages for land surface temperature anomalies in the Coupled Model Intercomparison Project Phase 5 (CMIP5), and compare them by climate regions and seasons. We also compare the estimated smoothness of the climate outputs in CMIP5 with those of reanalysis data. The estimation is done through the composite likelihood approach for locally self-similar processes. The composite likelihood that we consider is a product of conditional likelihoods of neighbouring observations. We find that the smoothness of the surface temperature anomalies in CMIP5 depends primarily on the modelling institution and on the climate region. The seasonal difference in the smoothness is generally small, except for some climate regions where the average temperature is extremely high or low.
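A toy version of smoothness estimation from local increments may help fix ideas. This is not the paper's composite-likelihood estimator; it is a cruder variogram-slope estimate of a self-similarity exponent H from the scaling E|x(t+h) − x(t)|² ~ h^{2H}, using two lags:

```python
import math

def smoothness_exponent(x):
    """Estimate H in E|x(t+h) - x(t)|^2 ~ h^(2H) from lags 1 and 2."""
    def msd(h):
        # Mean squared increment at lag h.
        diffs = [(x[i + h] - x[i]) ** 2 for i in range(len(x) - h)]
        return sum(diffs) / len(diffs)
    return 0.5 * math.log(msd(2) / msd(1), 2)

smooth = [t ** 2 for t in range(64)]                     # slowly varying path
rough = [(-1) ** t * (1 + 0.1 * t) for t in range(64)]   # flips every step
```

For the slowly varying series the estimate is close to H = 1 (a differentiable path); the oscillating series yields a far smaller exponent, reflecting high local variability.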
DeLong, Edward; Harwood, Caroline; Reid, Ann
This report explains the connection between microbes and climate, discusses in general terms what modeling is and how it is applied to climate, and discusses the need for knowledge of microbial physiology, evolution, and ecology to contribute to the determination of fluxes and rates in climate models. It recommends a multi-pronged approach to address the gaps.
The effects of climate change (for 2050 compared to ambient climate) and change in climatic variability on soya bean growth and production at 3 sites in the EU have been calculated. These calculations have been done with both a simple growth model, SOYBEANW, and a comprehensive model, CROPGRO. Compa
To validate the Community Radiative Transfer Model (CRTM) developed by the U.S. Joint Center for Satellite Data Assimilation (JCSDA), the discrete ordinate radiative transfer (DISORT) model and the line-by-line radiative transfer model (LBLRTM) are combined to provide a reference benchmark. Compared with the benchmark, the CRTM appears quite accurate for both clear-sky and ice cloud radiance simulations, with RMS errors below 0.2 K, except for clouds with small ice particles. In a computer CPU run-time comparison, the CRTM is faster than DISORT by approximately two orders of magnitude. Using the operational MODIS cloud products and the European Centre for Medium-Range Weather Forecasts (ECMWF) atmospheric profiles as input, the CRTM is employed to simulate the Atmospheric Infrared Sounder (AIRS) radiances. The CRTM simulations are shown to be in reasonably close agreement with the AIRS measurements (the discrepancies are within 2 K in terms of brightness temperature difference). Furthermore, the impact of uncertainties in the input cloud properties and atmospheric profiles on the CRTM simulations has been assessed. The CRTM-based brightness temperatures (BTs) at the top of the atmosphere (TOA), for both thin (τ < 1) and thick (τ > 30) clouds, are highly sensitive to uncertainties in atmospheric temperature and cloud-top pressure. However, for an optically thick cloud, the CRTM-based BTs are not sensitive to the uncertainties of cloud optical thickness, effective particle size, and atmospheric humidity profiles. In contrast, the uncertainties of the CRTM-based TOA BTs resulting from effective particle size and optical thickness are not negligible for an optically thin cloud.
A three-generation planning model incorporating uncertain climate change is developed. Each generation features a production activity based on capital and an exhaustible resource. An irreversible climate change may occur in period two or three, reducing productivity for that and the remaining generation. The model is solved by stochastic dynamic programming. If the climate impact and the probability of climate change are constant, the optimal period-one (and period-two) resource extraction is larger than for the reference case of climate stability. If, however, the climate impact and the probability of climate change increase with increased aggregate resource use, this result is reversed. 5 tabs., 1 appendix, 22 refs
Coron, L.; AndréAssian, V.; Perrin, C.; Lerat, J.; Vaze, J.; Bourqui, M.; Hendrickx, F.
This paper investigates the actual extrapolation capacity of three hydrological models under differing climate conditions. We propose a general testing framework in which we perform series of split-sample tests, testing all possible combinations of calibration-validation periods using a 10 year sliding window. This methodology, which we have called the generalized split-sample test (GSST), provides insights into a model's transposability over time under various climatic conditions. The three conceptual rainfall-runoff models yielded similar results over a set of 216 catchments in southeast Australia. First, we assessed the models' efficiency in validation using a criterion combining the root-mean-square error and bias. A relation was found between this efficiency and changes in mean rainfall (P), but not with changes in mean potential evapotranspiration (PE) or air temperature (T). Second, we focused on average runoff volumes and found that simulation biases are greatly affected by changes in P. Calibration over a wetter (drier) climate than the validation climate leads to an overestimation (underestimation) of the mean simulated runoff. We observed different magnitudes of these model deficiencies depending on the catchment considered. Results indicate that the transfer of model parameters in time may introduce a significant level of error in simulations, meaning increased uncertainty in the various practical applications of these models (flow simulation, forecasting, design, reservoir management, climate change impact assessments, etc.). Testing model robustness with respect to this issue should help better quantify these uncertainties.
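The GSST's exhaustive pairing of calibration and validation windows can be sketched as follows. The one-coefficient "model" is a placeholder assumption standing in for the conceptual rainfall-runoff models actually tested, and the bias metric is the relative runoff-volume bias mentioned above:

```python
def sliding_windows(n_years, width=10):
    """All 10-year windows (start, end) over an n_years record."""
    return [(start, start + width) for start in range(n_years - width + 1)]

def gsst(rain, runoff, width=10):
    """Calibrate on every window, validate on every other window."""
    results = {}
    windows = sliding_windows(len(rain), width)
    for cal in windows:
        # "Calibration": fit a single runoff coefficient on the window.
        coef = sum(runoff[cal[0]:cal[1]]) / sum(rain[cal[0]:cal[1]])
        for val in windows:
            if val == cal:
                continue
            sim = [coef * x for x in rain[val[0]:val[1]]]
            obs = runoff[val[0]:val[1]]
            results[(cal, val)] = sum(sim) / sum(obs) - 1.0  # volume bias
    return results

# Synthetic 20-year record where runoff is exactly half of rainfall,
# so every calibration transfers perfectly (all biases are zero).
rain = [float(10 + t) for t in range(20)]
runoff = [0.5 * x for x in rain]
biases = gsst(rain, runoff)
```

With a 20-year record and 10-year windows there are 11 windows and 11 × 10 = 110 calibration-validation pairs; a real study would inspect how bias varies with the rainfall difference between the two periods.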
Renold, M.; Beyerle, U.; Raible, C. C.; Knutti, R.; Stocker, T. F.; Craig, T.
Until recently, computationally intensive calculations in many scientific disciplines have been limited to institutions with access to supercomputing centers. Today, the computing power of PC processors permits the assembly of inexpensive PC clusters that approach the power of supercomputers. Moreover, the combination of inexpensive network cards and open-source software makes it easy to link standard computer equipment together to enlarge such clusters. Universities and other institutions have taken this opportunity and built their own mini-supercomputers on site. Computing power is a particular issue for the climate modeling and impacts community. The purpose of this article is to make available a Linux cluster version of the Community Climate System Model developed by the National Center for Atmospheric Research (NCAR; http://www.cgd.ucar.edu/csm).
Edwards, Neil R. [The Open University, Earth and Environmental Sciences, Milton Keynes (United Kingdom); Cameron, David [Centre for Ecology and Hydrology, Edinburgh (United Kingdom); Rougier, Jonathan [University of Bristol, Department of Mathematics, Bristol (United Kingdom)
Credible climate predictions require a rational quantification of uncertainty, but full Bayesian calibration requires detailed estimates of prior probability distributions and covariances, which are difficult to obtain in practice. We describe a simplified procedure, termed precalibration, which provides an approximate quantification of uncertainty in climate prediction, and requires only that uncontroversially implausible values of certain inputs and outputs are identified. The method is applied to intermediate-complexity model simulations of the Atlantic meridional overturning circulation (AMOC) and confirms the existence of a cliff-edge catastrophe in freshwater-forcing input space. When uncertainty in 14 further parameters is taken into account, an implausible, AMOC-off, region remains as a robust feature of the model dynamics, but its location is found to depend strongly on values of the other parameters. (orig.)
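The precalibration idea, discarding only parameter values whose outputs are uncontroversially implausible, reduces to a filter over candidate inputs. The toy model below, with its invented collapse threshold, merely mimics a cliff-edge response; it is not the intermediate-complexity AMOC simulator:

```python
def toy_model(freshwater_forcing):
    """Toy response: overturning strength collapses past a threshold."""
    collapsed = freshwater_forcing > 0.6
    return 20.0 - 15.0 * collapsed - 5.0 * freshwater_forcing

def precalibrate(candidates, plausible=(5.0, 25.0)):
    """Keep only inputs whose output lies in the plausible output range."""
    lo, hi = plausible
    return [p for p in candidates if lo <= toy_model(p) <= hi]

grid = [i / 10 for i in range(11)]   # candidate forcings from 0.0 to 1.0
kept = precalibrate(grid)            # forcings that survive precalibration
```

Note the sharp edge: every candidate up to the threshold survives, and every candidate beyond it is excluded at once, which is the code analogue of the cliff-edge catastrophe in forcing space.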
The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).
This paper gives an outline of three explanatory approaches to policymaking processes that allow the development of a rich set of non-trivial, probable assumptions. These assumptions provide a foundation for understanding climate policymaking behavior. First, the Unitary Rational Actor model provides a set of assumptions about the state’s interest in calculating costs and benefits as a basis for decision-making. By avoiding the inclusion of sub-actors in the analysis, it is possible to analyz...
We briefly review some of the scientific challenges and epistemological issues related to climate science. We discuss the formulation and testing of theories and numerical models, which, given the presence of unavoidable uncertainties in observational data, the non-repeatability of world-experiments, and the fact that relevant processes occur in a large variety of spatial and temporal scales, require a rather different approach than in other scientific contexts. A brief discussion of the intr...
Current regional and global climate models generally do not represent groundwater flow between grid cells as a component of the water budget. We estimate the magnitude of between-cell groundwater flow as a function of grid cell size by aggregating results from a numerical model of equilibrium groundwater flow, run and validated globally. We find that over a broad range of cell sizes spanning that of state-of-the-art regional and global climate models, mean between-cell groundwater flow magnitudes scale with the reciprocal of grid cell length. We also derive this scaling a priori from a simple statistical model of a flow network. We offer operational definitions of 'significant' groundwater flow contributions to the grid cell water budget in both relative and absolute terms (between-cell flow magnitude exceeding 10% of local recharge or 10 mm y−1, respectively). Groundwater flow is a significant part of the water budget, as measured by a combined test requiring both relative and absolute significance, over 42% of the land area at 0.1° grid cell size (typical of regional and mesoscale models), decreasing to 1.5% at 1° (typical of global models). Based on these findings, we suggest that between-cell groundwater flow should be represented in regional and mesoscale climate models to ensure realistic water budgets, but will have small effects on water exchanges in current global models. As well, parameterization of subgrid moisture heterogeneity should include the effects of within-cell groundwater flow. (paper)
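The combined significance test described above translates directly into code; the thresholds are the paper's (10% of local recharge, 10 mm per year), while the example inputs are invented:

```python
def flow_is_significant(flow_mm_yr, recharge_mm_yr,
                        rel_threshold=0.10, abs_threshold_mm_yr=10.0):
    """Combined test: between-cell groundwater flow is significant only
    when it exceeds BOTH 10% of local recharge (relative criterion)
    AND 10 mm/yr (absolute criterion)."""
    relative = flow_mm_yr > rel_threshold * recharge_mm_yr
    absolute = flow_mm_yr > abs_threshold_mm_yr
    return relative and absolute
```

A flow of 8 mm/yr against 50 mm/yr of recharge passes the relative criterion but fails the absolute one, so the combined test rejects it; 15 mm/yr against 100 mm/yr passes both.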
Kell, D. [TransGrid Solutions, Winnipeg, MB (Canada); Jacobson, D. [Manitoba Hydro, Winnipeg, MB (Canada); Oheidhin, G. [Areva T and D, Staffordshire (United Kingdom)
The paper described some of the main technical aspects of the commissioning of the Ponton static var compensator (SVC) to Manitoba Hydro's electrical system. Off-line simulations were performed to gain confidence in the system tests and to deliver a validated EMT-type model. The peak load of Manitoba Hydro is approximately 4200 MW with an installed generating capacity of 5700 MW. Approximately 70 per cent of the power is generated from 3 hydraulic stations on the Nelson River in northern Manitoba. This power is transmitted for 900 km via the Nelson River high voltage direct current (HVDC) transmission to the Dorsey switching station near the major load centre of Winnipeg. The AC system is interconnected with Saskatchewan to the west with four 230 kV AC lines, and with Ontario to the east by two 230 kV AC lines. There are three 230 kV and one 500 kV AC inter-ties between Manitoba Hydro and the power system in the United States. One 230 kV line travels south from the northern generators and inter-connects into the Ponton busbar where an SVC with a supplementary damping controller was recently installed to improve system stability. 7 figs.
This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector, corresponding to the identified principal directions, of the validation system is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.
Deke, Oliver; Hooss, Kurt Georg; Kasten, Christiane; Klepper, Gernot; Springer, Katrin
Climate change affects the physical and biological system in many regions of the world. The extent to which human systems will suffer economically from climate change depends on the adaptive capabilities within a region as well as across regions. We use an economic General-Equilibrium model and an Ocean-Atmosphere model in a regionally and sectorally disaggregated framework to analyze adaptation to climate change in different regions of the world. It turns out that vulnerability to climate im...
Demuzere, Matthias; Coutts, Andrew; Van Lipzig, Nicole
Urban climate models provide a useful tool for assessing the impacts of urban land surface modification on urban climates, and a mechanism for trialling different scenarios for urban heat island mitigation. Only recently have urban land surfaces been included in global and regional climate models. Often they represent a trade-off between the complexity of the biophysical processes of the urban canopy layer and the computational demands in order to be workable on regional climate time...
Seiler, C.; Zwiers, F. W.
Explosive cyclones are rapidly intensifying low-pressure systems with severe wind speeds and precipitation, affecting livelihoods and infrastructure primarily in coastal and marine environments. A better understanding of the potential impacts of climate change on these so-called meteorological bombs is therefore of great societal relevance. This study evaluates how well CMIP5 climate models reproduce explosive cyclones in the extratropics of the northern hemisphere, and how these bombs respond to global warming. For this purpose an objective feature-tracking algorithm was used to identify and track extratropical cyclones from 25 CMIP5 models and 3 reanalysis products for the periods 1980 to 2005 and 2070 to 2099. Cyclones were identified as maxima of the T42 vorticity of the 6-hourly wind field at 850 hPa. Explosive and non-explosive cyclones were separated based on the corresponding deepening rates of mean sea level pressure. Most models accurately reproduced the spatial distribution of bombs when compared to results from reanalysis data (R2 = 0.84, p-value = 0.00), with high frequencies along the Kuroshio Current and the Gulf Stream, as well as the exit regions of the polar jet streaks. Most models, however, significantly underestimated bomb frequencies, by a third on average and by 74% in the most extreme case. This negative frequency bias coincided with significant underestimations of either meridional sea surface temperature (SST) gradients or wind speeds of the polar jet streaks. Bomb frequency biases were significantly correlated with the number of vertical model levels (R2 = 0.36, p-value = 0.001), suggesting that the vertical atmospheric model resolution is crucial for simulating bomb frequencies accurately. The impacts of climate change on the location, frequency, and intensity of explosive cyclones were then explored for Representative Concentration Pathway 8.5. Projections were related to model bias, resolution, projected changes of SST gradients, and wind speeds
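The abstract separates explosive from non-explosive cyclones by their mean sea level pressure deepening rates. A common convention for that split is the Sanders and Gyakum "bergeron" criterion; the abstract does not name the exact threshold used in the study, so the sketch below is an assumption, not the paper's method:

```python
import math

def deepening_rate_bergeron(p_start_hpa, p_end_hpa, hours, lat_deg):
    """Normalized central-pressure deepening rate in bergerons.

    Sanders & Gyakum convention: a 24 hPa drop over 24 h at 60 deg
    latitude equals 1 bergeron; rates at other latitudes are scaled
    by sin(60 deg) / sin(lat).
    """
    drop_per_24h = (p_start_hpa - p_end_hpa) * (24.0 / hours)
    return (drop_per_24h / 24.0) * (math.sin(math.radians(60.0)) /
                                    math.sin(math.radians(abs(lat_deg))))

def is_explosive(p_start_hpa, p_end_hpa, hours, lat_deg):
    """A cyclone is classified as explosive (a 'bomb') at >= 1 bergeron."""
    return deepening_rate_bergeron(p_start_hpa, p_end_hpa, hours, lat_deg) >= 1.0
```

For example, a 30 hPa drop over 24 h at 60 degrees latitude is 1.25 bergerons and qualifies as explosive, while a 5 hPa drop does not.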
Giorgi, Filippo; Bates, Gary T.; Nieman, Steven J.
As part of the development effort of a regional climate model (RCM) for the southern Great Basin, this paper presents a validation analysis of the climatology generated by a high-resolution RCM driven by observations. The RCM is a version of the National Center for Atmospheric Research-Pennsylvania State University mesoscale model, version 4 (MM4), modified for application to regional climate simulation. Two multiyear simulations, for the periods 1 January 1982 to 31 December 1983 and 1 January 1988 to 25 April 1989, were performed over the western United States with the RCM driven by European Centre for Medium-Range Weather Forecasts analyses of observations. The model resolution is 60 km. This validation analysis is the first phase of a project to produce simulations of future climate scenarios over a region surrounding Yucca Mountain, Nevada, the only location currently being considered as a potential high-level nuclear-waste repository site. Model-produced surface air temperatures and precipitation were compared with observations from five southern Nevada stations located in the vicinity of Yucca Mountain. The seasonal cycles of temperature and precipitation were simulated well. Monthly and seasonal temperature biases were generally negative and largely explained by differences in elevation between the observing stations and the model topography. The model-simulated precipitation captured the extreme dryness of the Great Basin. Average yearly precipitation was generally within 30% of observed, and the range of monthly precipitation amounts was the same as in the observations. Precipitation biases were mostly negative in the summer and positive in the winter. The number of simulated daily precipitation events for various precipitation intervals was within factors of 1.5-3.5 of observed. Overall, the model tended to overestimate the number of light precipitation events and underestimate the number of heavy precipitation events. At Yucca Mountain, simulated
The aim of the present study was to uncover the important dimensions of organizational climate for project organizations, to explore which organizational levels are most important, and to use this as the basis for developing an elaborated model of organizational climate for project organizations. Interviews with employees in a project organization in the Norwegian oil sector were coded onto two models, the general and validated Organizational Climate Measure (OCM) and the best-practice proje...
This report contains papers from the International Workshop on Cloud-Radiation Interactions and Their Parameterization in Climate Models, held on 18-20 October 1993 in Camp Springs, Maryland, USA. It was organized by the Joint Working Group on Clouds and Radiation of the International Association of Meteorology and Atmospheric Sciences. Recommendations were grouped into three broad areas: (1) general circulation models (GCMs), (2) satellite studies, and (3) process studies. Each of the panels developed recommendations on the themes of the workshop. Explicitly or implicitly, each panel independently recommended observations of basic cloud microphysical properties (water content, phase, size) on the scales resolved by GCMs. Such observations are necessary to validate cloud parameterizations in GCMs, to use satellite data to infer radiative forcing in the atmosphere and at the earth's surface, and to refine the process models which are used to develop advanced cloud parameterizations.
An international consortium called MECCA (Model Evaluation Consortium for Climate Assessment) was created in 1991 by partners including electric utilities, government, and academic groups to make a supercomputer facility for climate evolution studies available to the international scientific community. The first phase of the program consists of assessing uncertainties in climate model simulations in the framework of global climate change studies. Fourteen scientific projects were accepted on an international basis in this first phase. The second phase of the program will consist of evaluating a set of long climate simulations performed with coupled ocean/atmosphere models, in order to study the transient aspects of climate changes and the associated uncertainties. Particular attention will be devoted to the consequences of these assessments for climate impact studies, and to the regional aspects of climate changes
Rigorously validating the accuracy of a metamodel is an important research area in metamodeling techniques. A leave-k-out cross-validation technique not only requires considerable computational cost but also cannot quantitatively measure the fidelity of the metamodel. Recently, the average validation technique has been proposed. However, the average validation criterion may stop a sampling process prematurely even if the kriging model is not yet accurate. In this research, we propose a new validation technique using an average and a variance of the response during a sequential sampling method, such as maximum entropy sampling. The proposed validation technique is more efficient and accurate than the cross-validation technique because it explicitly integrates the kriging model to achieve an accurate average and variance, rather than relying on numerical integration. The proposed validation technique shows a trend similar to the root mean squared error, such that it can be used as a stop criterion for sequential sampling
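The abstract's stop criterion combines an average and a variance of the kriging response. The paper's exact formulation is not given here, so the following is only an illustrative sketch of the idea: sampling stops when the grid-averaged predictive mean has stabilized between iterations and the averaged predictive variance is small, so that an inaccurate (high-variance) surrogate cannot trigger a premature stop the way a mean-only criterion can:

```python
import numpy as np

def stop_sampling(mu_prev, mu_curr, s2_curr, tol_mean=1e-2, tol_var=1e-2):
    """Illustrative average-and-variance stop criterion.

    mu_prev, mu_curr: kriging predictive means on a fixed evaluation
    grid at the previous and current sampling iteration.
    s2_curr: current predictive variances on the same grid.
    Returns True only if the averaged response has stabilized AND the
    mean predictive variance is below tolerance.
    """
    avg_prev, avg_curr = np.mean(mu_prev), np.mean(mu_curr)
    rel_change = abs(avg_curr - avg_prev) / (abs(avg_curr) + 1e-12)
    return rel_change < tol_mean and np.mean(s2_curr) < tol_var
```

A sequential sampler would call this after each new maximum-entropy sample; the variance term is what distinguishes it from the average-only criterion the abstract criticizes.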
This paper gives an overview of existing computable general equilibrium (CGE) models dealing with climate impacts, focusing on damage calculations and adaptation modelling. Empirical CGE models are used in a broad field of policy analysis. With respect to climate change, applications have focused on the calculation of climate damages and the mitigation of these damages. Facing the non-preventable damages from climate change that will already occur in the next decades, adaptation is becoming a...
Butts, M.; Rasmussen, S.H.; Ridler, M.; Larsen, Morten Andreas Dahl; Drews, Martin; Lerer, Sara Maria; Overgaard, J.; Grooss, J.; Rosbjerg, Dan; Christensen, J.H.; Refsgaard, J. C.
Motivated by the need to develop better tools to understand the impact of future management and climate change on water resources, we present a set of studies with the overall aim of developing a fully dynamic coupling between a comprehensive hydrological model, MIKE SHE, and a regional climate ... distributed parameters using satellite remote sensing. Secondly, field data are used to investigate the effects of model resolution and parameter scales for use in a coupled model. Finally, the development of the fully coupled climate-hydrology model is described, and some of the challenges associated with ... coupling models for hydrological processes on sub-grid scales of the regional climate model are presented...
Fully coupled climate carbon cycle models are sophisticated tools used to predict future climate change and its impact on the land and ocean carbon cycles. These models should be able to adequately represent natural variability, requiring model validation by observations. The present study focuses on the ocean carbon cycle component, in particular the spatial and temporal variability in net primary productivity (PP) and export production (EP) of particulate organic carbon (POC). Results from three coupled climate carbon cycle models (IPSL, MPIM, NCAR) are compared with observation-based estimates derived from satellite measurements of ocean colour and results from inverse modelling (data assimilation). Satellite observations of ocean colour have shown that temporal variability of PP on the global scale is largely dominated by the permanently stratified, low-latitude ocean (Behrenfeld et al., 2006), with stronger stratification (higher sea surface temperature, SST) being associated with negative PP anomalies. Results from all three coupled models confirm the role of the low-latitude, permanently stratified ocean for anomalies in globally integrated PP, but only one model (IPSL) also reproduces the inverse relationship between stratification (SST) and PP. An adequate representation of iron and macronutrient co-limitation of phytoplankton growth in the tropical ocean has been shown to be the crucial mechanism determining the capability of the models to reproduce observed interactions between climate and PP.
We briefly review some of the scientific challenges and epistemological issues related to climate science. We discuss the formulation and testing of theories and numerical models, which, given the presence of unavoidable uncertainties in observational data, the non-repeatability of world-experiments, and the fact that relevant processes occur across a large variety of spatial and temporal scales, require a rather different approach than in other scientific contexts. A brief discussion of the intrinsic limitations of geo-engineering solutions to global warming is presented, and a framework of investigation based upon non-equilibrium thermodynamics is proposed. We also critically discuss recently proposed perspectives for the development of climate science based purely upon the massive use of supercomputers and centralized planning of scientific priorities.
Seiler, C.; Hutjes, R. W. A.; Kruijt, B.; Quispe, J.; Añez, S.; Arora, V. K.; Melton, J. R.; Hickler, T.; Kabat, P.
Dynamic vegetation models have been used to assess the resilience of tropical forests to climate change, but the global application of these modeling experiments often misrepresents carbon dynamics at a regional level, limiting the validity of future projections. Here a dynamic vegetation model (Lund Potsdam Jena General Ecosystem Simulator) was adapted to simulate present-day potential vegetation as a baseline for climate change impact assessments in the evergreen and deciduous forests of Bolivia. Results were compared to biomass measurements (819 plots) and remote sensing data. Using regional parameter values for allometric relations, specific leaf area, wood density, and disturbance interval, a realistic transition from the evergreen Amazon to the deciduous dry forest was simulated. This transition coincided with threshold values for precipitation (1400 mm yr-1) and water deficit (i.e., potential evapotranspiration minus precipitation) (-830 mm yr-1), beyond which leaf abscission became a competitive advantage. Significant correlations were found between modeled and observed values of seasonal leaf abscission (R2 = 0.6, p < 0.001) and vegetation carbon (R2 = 0.31, p < 0.01). Modeled Gross Primary Productivity (GPP) and the remotely sensed normalized difference vegetation index showed that dry forests were more sensitive to rainfall anomalies than wet forests. GPP was positively correlated to the El Niño-Southern Oscillation index in the Amazon and negatively correlated to consecutive dry days. Decreasing rainfall trends were simulated to reduce GPP in the Amazon. The current model setup provides a baseline for assessing the potential impacts of climate change in the transition zone from wet to dry tropical forests in Bolivia.
Gao, Jiangbo; Hou, Wenjuan; Xue, Yongkang; Wu, Shaohong
To better understand the performance of regional climate models (RCMs) for the East Asian summer climate and the influencing factors, this study evaluated the dynamic downscaling ability of the Weather Research and Forecasting (WRF) RCM. Based on comprehensive comparison studies of different physical processes and experimental settings, the optimal combination of WRF model setups was obtained for East Asian precipitation and temperature simulations. Furthermore, using this optimal combination and comparing against climate observations, WRF shows high ability to downscale the NCEP-DOE Reanalysis-2, which provided initial and lateral boundary conditions for WRF, especially for the precipitation simulation due to the better-simulated low-level water vapor flux. However, the strengthened Western North Pacific Subtropical High (WPSH) in the WRF simulation results in a positive anomaly in summer rainfall.
Judd, K.; Brock, W. A.
Authors: Dr. Kenneth L. Judd, Hoover Institution, and Prof. William A. Brock, University of Wisconsin. Current climate models range from general circulation models (GCMs) with millions of degrees of freedom to models with few degrees of freedom. Simple energy balance climate models (EBCMs) help us understand the dynamics of GCMs. The same is true in economics with computable general equilibrium models (CGEs), where some models are infinite-dimensional systems of differential equations but some are simple models. Nordhaus (2007, 2010) couples a simple EBCM with a simple economic model. One- and two-dimensional EBCMs do better at approximating damages across the globe and the positive and negative feedbacks from anthropogenic forcing (North et al. (1981), Wu and North (2007)). A proper coupling of climate and economic systems is crucial for arriving at effective policies. Brock and Xepapadeas (2010) have used Fourier/Legendre-based expansions to study the shape of socially optimal carbon taxes over time at the planetary level in the face of damages caused by polar ice cap melt (as discussed by Oppenheimer, 2005), but only in a "one-dimensional" EBCM. Economists have used orthogonal polynomial expansions to solve dynamic, forward-looking economic models (Judd, 1992, 1998). This presentation will couple EBCM climate models with basic forward-looking economic models, and examine the effectiveness and scaling properties of alternative solution methods. We will use a two-dimensional EBCM model on the sphere (Wu and North, 2007) and a multicountry, multisector regional model of the economic system. Our aim will be to gain insights into the intertemporal shape of the optimal carbon tax schedule and its impact on global food production, as modeled by Golub and Hertel (2009). We will initially have limited computing resources and will need to focus on highly aggregated models. However, this will be more complex than existing models with forward
Soon, W.; Baliunas, S.; Idso, S.; Kondratyev, K. Ya.; Posmentier, E. S.
A likelihood of disastrous global environmental consequences has been surmised as a result of projected increases in anthropogenic greenhouse gas emissions. These estimates are based on computer climate modeling, a branch of science still in its infancy despite recent, substantial strides in knowledge. Because the expected anthropogenic climate forcings are relatively small compared to other background and forcing factors (internal and external), the credibility of the modeled global and regional responses rests on the validity of the models. We focus on this important question of climate model validation. Specifically, we review common deficiencies in general circulation model calculations of atmospheric temperature, surface temperature, precipitation and their spatial and temporal variability. These deficiencies arise from complex problems associated with parameterization of multiply-interacting climate components, forcings and feedbacks, involving especially clouds and oceans. We also review examples of expected climatic impacts from anthropogenic CO2 forcing. Given the host of uncertainties and unknowns in the difficult but important task of climate modeling, the unique attribution of observed current climate change to increased atmospheric CO2 concentration, including the relatively well-observed latest 20 years, is not possible. We further conclude that the incautious use of GCMs to make future climate projections from incomplete or unknown forcing scenarios is antithetical to the intrinsically heuristic value of models. Such uncritical application of climate models has led to the commonly-held but erroneous impression that modeling has proven or substantiated the hypothesis that CO2 added to the air has caused or will cause significant global warming. An assessment of the positive skills of GCMs and their use in suggesting a discernible human influence on global climate can be found in the joint World Meteorological Organisation and United Nations
Validating models with correlated multivariate outputs involves the comparison of multiple stochastic quantities. Considering both uncertainty and correlations among multiple responses from model and physical observations imposes challenges. Existing marginal comparison methods and the hypothesis testing-based methods either ignore correlations among responses or only reach Boolean conclusions (yes or no) without accounting for the amount of discrepancy between a model and the underlying reality. A new validation metric is needed to quantitatively characterize the overall agreement of multiple responses considering correlations among responses and uncertainty in both model predictions and physical observations. In this paper, by extending the concept of “area metric” and the “u-pooling method” developed for validating a single response, we propose new model validation metrics for validating correlated multiple responses using the multivariate probability integral transformation (PIT). One new metric is the PIT area metric for validating multi-responses at a single validation site. The other is the t-pooling metric that allows for pooling observations of multiple responses observed at multiple validation sites to assess the global predictive capability. The proposed metrics have many favorable properties that are well suited for validation assessment of models with correlated responses. The two metrics are examined and compared with the direct area metric and the marginal u-pooling method respectively through numerical case studies and an engineering example to illustrate their validity and potential benefits
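The u-pooling idea underlying the proposed metrics can be illustrated for a single response (the paper's multivariate PIT construction is more involved; this is only a one-dimensional sketch). Observations are transformed through the model's predictive CDF, and the pooled PIT values are compared against the uniform distribution with an area metric:

```python
import numpy as np

def u_pool_area_metric(pit_values):
    """Area metric on pooled probability integral transform (PIT) values.

    If the model's predictive distributions were perfect, the pooled
    PIT values u_i = F_model(y_obs_i) would be uniform on [0, 1]. The
    metric approximates the area between their empirical CDF and the
    uniform CDF on a dense grid; smaller is better, 0 means uniform.
    """
    u = np.sort(np.asarray(pit_values, dtype=float))
    n = len(u)
    grid = np.linspace(0.0, 1.0, 1001)
    ecdf = np.searchsorted(u, grid, side="right") / n
    # Riemann approximation of the integral of |ECDF(u) - u| over [0, 1]
    return float(np.mean(np.abs(ecdf - grid)))
```

PIT values spread evenly over [0, 1] give a metric near zero, while PIT values piled at one end (a strongly biased model) push the metric toward its maximum of 0.5.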
张贝克; 许欣; 马昕; 吴重光
Signed directed graph (SDG) theory provides algorithms and methods that can be applied directly to chemical process modeling and analysis to validate simulation models, and is a basis for the development of a software environment that can automate the validation activity. This paper concentrates on the pretreatment stage of model validation. We use the validation scenarios and standard sequences generated by a well-established SDG model to validate the trends fitted from the simulation model. The results are helpful for finding potential problems, assessing possible bugs in the simulation model, and solving them effectively. A case study on a simulation model of a boiler is presented to demonstrate the effectiveness of this method.
Dams, J.; Nossent, J.; Senbeta, T. B.; Willems, P.; Batelaan, O.
decrease of the lowest flows, except for the SWAT model with the mean hydrological impact climate change scenario. The results of this study indicate that, besides the uncertainty introduced by the climate change scenarios, the hydrological model structure uncertainty should also be taken into account in the assessment of climate change impacts on hydrology. To make it more straightforward and transparent to include model structural uncertainty in hydrological impact studies, there is a need for hydrological modelling tools that allow flexible structures and for methods to validate model structures in their ability to assess impacts under unobserved future climatic conditions.
Rind, D.; Rosenzweig, C.; Goldberg, R.
The predictions of climate change studies depend crucially on the hydrological cycles embedded in the different models used. It is shown here that uncertainties in hydrological processes and inconsistencies in both climate and impact models limit confidence in current assessments of climate change. A future course of action to remedy this problem is suggested.
Hagemann, S.; Chen, Cui; Clark, D.B.; Folwell, S.; Gosling, S.; Haddeland, I.; Hanasaki, N.; Heinke, J.; Ludwig, F.
Climate change is expected to alter the hydrological cycle, resulting in large-scale impacts on water availability. However, future climate change impact assessments are highly uncertain. For the first time, multiple global climate models (three) and hydrological models (eight) were used to systematically
Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.
The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
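The validation above compares annualized modeled generation to measured production. One plausible form of the annualized prediction error (an assumption for illustration; the report may define the statistic differently) is the difference between total modeled and total measured annual output, relative to the measured total:

```python
def annualized_error_pct(modeled_kwh, measured_kwh):
    """Annualized prediction error as a percentage of measured output.

    modeled_kwh, measured_kwh: sequences of per-period generation
    (e.g., monthly kWh) for the same year. The error aggregates the
    whole year before comparing, so seasonal biases can cancel.
    """
    total_model = sum(modeled_kwh)
    total_meas = sum(measured_kwh)
    return 100.0 * (total_model - total_meas) / total_meas
```

Under this definition, a fixed-tilt system whose model over-predicts every month by 3% would report a 3% annualized error, matching the scale of the thresholds quoted in the abstract.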
Weitzel, E. [Alliance for Residential Building Innovation, Davis, CA (United States); Hoeschele, E. [Alliance for Residential Building Innovation, Davis, CA (United States)
A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system model, developed in the Transient System Simulation Tool (TRNSYS), has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. In this study, the Building America team built upon previous modeling and analysis work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation-time-efficient) distribution models for annual whole-house simulation programs.
Lucas-Picher, P.; Wulff-Nielsen, M.; Christensen, J.H.;
This study presents two simulations of the climate over Greenland with the regional climate model (RCM) HIRHAM5 at 0.05° and 0.25° resolution, driven at the lateral boundaries by the ERA-Interim reanalysis for the period 1989–2009. These simulations are validated against observations from meteorological stations (Danish Meteorological Institute) at the coast and automatic weather stations on the ice sheet (Greenland Climate Network). Generally, the temperature and precipitation biases are small, indicating a realistic simulation of the climate over Greenland that is suitable to drive ice sheet models. However, the bias between the simulations and the few available observations does not reduce with higher resolution. This is partly explained by the lack of observations in regions where the higher resolution is expected to improve the simulated climate. The RCM simulations show that the...
CHOU JieMing; DONG WenJie; YE DuZheng
An attempt has been made to construct a novel economy-climate model by combining climate change research with agricultural economy research to evaluate the influence of global climate change on grain yields. Inserting a climate change factor into the economic Cobb-Douglas (C-D) production function yields a novel evaluation model, which connects the climate change factor to the economic variation factor; the performance and reasonableness of the novel evaluation model are also preliminarily simulated and verified.
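The abstract does not give the exact functional form of the climate-augmented production function, so the sketch below is purely hypothetical: a standard Cobb-Douglas function in capital K and labor L with a multiplicative climate index C raised to an illustrative exponent gamma:

```python
def grain_yield(A, K, L, C, alpha, beta, gamma):
    """Hypothetical climate-augmented Cobb-Douglas production function.

    Y = A * K**alpha * L**beta * C**gamma, where K and L are capital
    and labor inputs and C is a climate-change index. The exponent
    form of the climate term is an illustrative assumption; the
    paper's actual specification is not stated in the abstract.
    """
    return A * K**alpha * L**beta * C**gamma
```

With C = 1 (no climate deviation) the expression reduces to the ordinary Cobb-Douglas form, which is the property that lets the climate factor be "inserted" without disturbing the underlying economic model.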
Smallwood, R H; Holcombe, W.M.L.; Walker, D C
In this paper we take the view that computational models of biological systems should satisfy two conditions: they should be able to predict function at a systems-biology level, and robust techniques for validation against biological models must be available. A modelling paradigm for developing a predictive computational model of cellular interaction is described, and methods of providing robust validation against biological models are explored, followed by a consideration of soft...
Hagemann, S.; Chen, C.; Clark, D.B.; S. Folwell; Gosling, S.N.; Haddeland, I.; Hanasaki, N.; J. Heinke; F. Ludwig; Voß, F.; A. J. Wiltshire
Climate change is expected to alter the hydrological cycle resulting in large-scale impacts on water availability. However, future climate change impact assessments are highly uncertain. For the first time, multiple global climate (three) and hydrological models (eight) were used to systematically assess the hydrological response to climate change and project the future state of global water resources. The results show a large spread in projected changes in water resources within the climate–...
Xu Chengjian, E-mail: email@example.com [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)
Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
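The permutation-testing step recommended above can be sketched generically (this is not the paper's LASSO pipeline, just the null-hypothesis test it applies to model performance): outcome labels are shuffled repeatedly, the area under the ROC curve (AUC) is recomputed for each shuffle, and the p-value is the fraction of permuted AUCs at least as large as the observed one:

```python
import random

def auc(scores, labels):
    """AUC via pairwise comparison (Mann-Whitney U formulation)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def permutation_p_value(scores, labels, n_perm=1000, seed=0):
    """P-value for the observed AUC under the null of no association.

    Labels are shuffled n_perm times; the add-one correction keeps the
    p-value strictly positive even when no permutation beats the
    observed AUC.
    """
    rng = random.Random(seed)
    observed = auc(scores, labels)
    shuffled = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if auc(scores, shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```

A model whose predictions perfectly separate complication from no-complication cases yields AUC 1.0 and a small p-value, while a model scoring at chance yields a p-value near 1, which is how the abstract's "statistical significance of model performance" is obtained.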
J. Cermak; Furrer, R.; Knutti, R.; Meehl, G. A.; Tebaldi, C.
Recent coordinated efforts, in which numerous general circulation climate models have been run for a common set of experiments, have produced large datasets of projections of future climate for various scenarios. Those multimodel ensembles sample initial conditions, parameters, and structural uncertainties in the model design, and they have prompted a variety of approaches to quantifying uncertainty in future climate change. International climate change assessments also rely heavily on these ...