WorldWideScience

Sample records for model validation streamflow

  1. Comparison of cross-validation and bootstrap aggregating for building a seasonal streamflow forecast model

    Science.gov (United States)

    Schick, Simon; Rössler, Ole; Weingartner, Rolf

    2016-10-01

    Based on a hindcast experiment for the period 1982-2013 in 66 sub-catchments of the Swiss Rhine, the present study compares two approaches of building a regression model for seasonal streamflow forecasting. The first approach selects a single "best guess" model, which is tested by leave-one-out cross-validation. The second approach implements the idea of bootstrap aggregating, where bootstrap replicates are employed to select several models, and out-of-bag predictions provide model testing. The target value is mean streamflow for durations of 30, 60 and 90 days, starting with the 1st and 16th day of every month. Compared to the best guess model, bootstrap aggregating reduces the mean squared error of the streamflow forecast by seven percent on average. Thus, if resampling is anyway part of the model building procedure, bootstrap aggregating seems to be a useful strategy in statistical seasonal streamflow forecasting. Since the improved accuracy comes at the cost of a less interpretable model, the approach might be best suited for pure prediction tasks, e.g. as in operational applications.
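
    The contrast between the two approaches can be sketched with a toy regression example: a single model assessed by leave-one-out cross-validation versus bootstrap aggregating assessed by out-of-bag predictions. The predictors, sample size, and data below are illustrative assumptions, not the study's actual catchment predictors.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut

    rng = np.random.default_rng(42)

    # Illustrative predictors (e.g. antecedent flow and a wetness proxy) and a 30-day mean flow target
    n = 32
    X = rng.normal(size=(n, 2))
    y = 0.8 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.5, size=n)

    # Approach 1: a single "best guess" model tested by leave-one-out cross-validation
    loo_pred = np.empty(n)
    for train, test in LeaveOneOut().split(X):
        loo_pred[test] = LinearRegression().fit(X[train], y[train]).predict(X[test])
    mse_loo = np.mean((loo_pred - y) ** 2)

    # Approach 2: bootstrap aggregating, tested with out-of-bag (OOB) predictions
    n_boot = 500
    oob_sum, oob_count = np.zeros(n), np.zeros(n)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)            # bootstrap replicate
        oob = np.setdiff1d(np.arange(n), idx)       # samples left out of this replicate
        model = LinearRegression().fit(X[idx], y[idx])
        oob_sum[oob] += model.predict(X[oob])
        oob_count[oob] += 1
    mse_bag = np.mean((oob_sum / np.maximum(oob_count, 1) - y) ** 2)

    print(f"LOOCV MSE: {mse_loo:.3f}   OOB bagging MSE: {mse_bag:.3f}")
    ```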

  2. Optimization Framework for Stochastic Modeling of Annual Streamflows

    Science.gov (United States)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2008-12-01

    Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The various kinds of stochastic models adopted for streamflow generation in hydrology are: i) parametric models, which hypothesize the form of the dependence structure and the distributional form a priori (examples are AR, ARMA); ii) nonparametric models (examples are bootstrap/kernel-based methods), which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws; and iii) hybrid models, which blend parametric and nonparametric models advantageously to model streamflows effectively. Despite these developments in the stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and critical drought (water-use) characteristics has remained a persistent challenge for the stochastic modeler. This may be because the stochastic streamflow model parameters are usually estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares estimation), and the efficacy of the models is subsequently validated based on the accuracy with which the water-use characteristics are predicted. In this study a framework is proposed to find the optimal hybrid model (a blend of ARMA(1,1) and the moving block bootstrap (MBB)) based on the explicit objective of minimizing the relative bias in estimating the storage capacity of the reservoir. The optimal parameter set of the hybrid model is obtained by searching over a multi-dimensional parameter space involving simultaneous exploration of the parametric (ARMA(1,1)) and nonparametric (MBB) components. This is achieved using an efficient evolutionary search-based optimization tool, namely the non-dominated sorting genetic algorithm.
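
    The moving block bootstrap component of such a hybrid model can be sketched in a few lines; the block length and the synthetic annual series are arbitrary assumptions, and the surrounding evolutionary search over the parametric and nonparametric components is omitted.

    ```python
    import numpy as np

    def moving_block_bootstrap(series, block_length, rng):
        """Resample a series by concatenating randomly chosen overlapping blocks,
        preserving short-range dependence within each block."""
        n = len(series)
        n_blocks = int(np.ceil(n / block_length))
        starts = rng.integers(0, n - block_length + 1, size=n_blocks)
        return np.concatenate([series[s:s + block_length] for s in starts])[:n]

    rng = np.random.default_rng(0)
    annual_flow = rng.gamma(shape=4.0, scale=250.0, size=60)   # synthetic annual streamflow record
    replicate = moving_block_bootstrap(annual_flow, block_length=5, rng=rng)
    print(f"observed mean {annual_flow.mean():.0f}, replicate mean {replicate.mean():.0f}")
    ```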

  3. Evaluation of streamflow simulation results of land surface models in GLDAS on the Tibetan plateau

    Science.gov (United States)

    Bai, Peng; Liu, Xiaomang; Yang, Tiantian; Liang, Kang; Liu, Changming

    2016-10-01

    The Global Land Data Assimilation System (GLDAS) project estimates long-term runoff based on land surface models (LSMs) and provides a potential way to solve the issue of nonexistent streamflow data in gauge-sparse regions such as the Tibetan Plateau (TP). However, the reliability of GLDAS runoff data must be validated before being practically applied. In this study, the streamflows simulated by four LSMs (CLM, Noah, VIC, and Mosaic) in GLDAS coupled with a river routing model are evaluated against observed streamflows in five river basins on the TP. The evaluation criteria include four aspects: monthly streamflow value, seasonal cycle of streamflow, annual streamflow trend, and streamflow component partitioning. The four LSMs display varying degrees of biases in monthly streamflow simulations: systematic overestimations are found in the Noah (1.74 ≤ bias ≤ 2.75) and CLM (1.22 ≤ bias ≤ 2.53) models, whereas systematic underestimations are observed in the VIC (0.36 ≤ bias ≤ 0.85) and Mosaic (0.34 ≤ bias ≤ 0.66) models. The Noah model shows the best performance in capturing the temporal variation in monthly streamflow and the seasonal cycle of streamflow, while the VIC model performs the best in terms of bias statistics. The Mosaic model provides the best performance in modeling annual runoff trends and runoff component partitioning. The possible reasons for the different performances of the LSMs are discussed in detail. In order to achieve more accurate streamflow simulations from the LSMs in GLDAS, suggestions are made to further improve the accuracy of the forcing data and parameterization schemes in all models.
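
    The bias criterion quoted above is consistent with a ratio of simulated to observed mean flow; a minimal sketch of that ratio and the Nash-Sutcliffe efficiency, both assumed formulations rather than the paper's exact definitions, follows.

    ```python
    import numpy as np

    def bias_ratio(sim, obs):
        """Ratio of simulated to observed mean flow: >1 means overestimation, <1 underestimation."""
        return np.mean(sim) / np.mean(obs)

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observed mean."""
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

    # Synthetic monthly streamflow for one basin, with a systematically overestimating model
    rng = np.random.default_rng(1)
    obs = rng.gamma(3.0, 120.0, size=120)              # 10 years of monthly flows
    sim = 1.8 * obs + rng.normal(0.0, 40.0, size=120)
    print(f"bias = {bias_ratio(sim, obs):.2f}, NSE = {nse(sim, obs):.2f}")
    ```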

  4. Validation of SWAT simulated streamflow in the Eastern Nile and sensitivity to climate change

    Directory of Open Access Journals (Sweden)

    D. T. Mengistu

    2011-10-01

    The hydrological model SWAT was calibrated with daily station-based precipitation and temperature data for the whole Eastern Nile basin, including the three subbasins: the Blue Nile, Baro Akobo and Tekeze. Daily and monthly streamflow were calibrated and validated at six outlets in the three different subbasins. The model performed very well in simulating the monthly variability of the Eastern Nile streamflow, while comparison to daily data revealed a more diverse performance for the extreme events.

    Of the Eastern Nile average annual rainfall it was estimated that around 60% is lost through evaporation and estimated runoff coefficients were 0.24, 0.30 and 0.18 for Blue Nile, Baro Akobo and Tekeze subbasins, respectively. About half to two-thirds of the runoff could be attributed to surface runoff while the remaining contributions were from groundwater.

    The annual streamflow sensitivity to changes in precipitation and temperature differed among the basins, and the dependence of the response on the strength of the changes was not linear. On average, the annual streamflow responses to a change in precipitation with no temperature change were 19%, 17%, and 26% per 10% change in precipitation, while the average annual streamflow responses to a change in temperature with no precipitation change were −4.4% K⁻¹, −6.4% K⁻¹, and −1.3% K⁻¹ for the Blue Nile, Baro Akobo and Tekeze river basins, respectively.
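
    The reported sensitivities can be expressed as simple elasticities computed from paired baseline and perturbed simulations; the sketch below uses the Blue Nile figures quoted above as placeholder values.

    ```python
    def precipitation_elasticity(q_base, q_perturbed, dp_fraction):
        """Percent change in mean annual streamflow per 10% change in precipitation,
        estimated from a baseline run and a perturbed-precipitation run."""
        dq_fraction = (q_perturbed - q_base) / q_base
        return 100.0 * dq_fraction * (0.10 / dp_fraction)

    def temperature_sensitivity(q_base, q_perturbed, dt_kelvin):
        """Percent change in mean annual streamflow per kelvin of warming."""
        return 100.0 * (q_perturbed - q_base) / q_base / dt_kelvin

    # Placeholder mean annual flows (mm) chosen to reproduce the Blue Nile figures quoted above
    print(precipitation_elasticity(q_base=300.0, q_perturbed=357.0, dp_fraction=0.10))  # ~19% per 10%
    print(temperature_sensitivity(q_base=300.0, q_perturbed=286.8, dt_kelvin=1.0))      # ~-4.4% per K
    ```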

    While we show the Eastern Nile to be very sensitive to precipitation changes, using 47 temperature and precipitation scenarios from 19 AOGCMs participating in IPCC AR4 we estimated the future change in streamflow to be strongly dependent on the choice of climate model as the climate models disagree on both the strength and the direction of future precipitation changes. Thus, no clear conclusions can be made about the future changes in Eastern Nile streamflow.

  5. Free internet datasets for streamflow modelling using SWAT in the Johor river basin, Malaysia

    Science.gov (United States)

    Tan, M. L.

    2014-02-01

    Streamflow modelling is a mathematical and computational approach that represents the terrestrial hydrological cycle digitally and is used for water resources assessment. However, such modelling endeavours require a large amount of data. Generally, governmental departments produce and maintain these datasets, which can make them difficult to obtain due to bureaucratic constraints. In some countries, the availability and quality of geospatial and climate datasets remain a critical issue because of factors such as a lack of ground stations, expertise, technology, or financial support, and wartime conditions. To overcome this problem, this research used public domain datasets from the Internet as input to a streamflow model, with the intention of simulating daily and monthly streamflow of the Johor River Basin in Malaysia. The model used is the Soil and Water Assessment Tool (SWAT). Free input data including a digital elevation model (DEM), land use information, and soil and climate data were used. The model was validated against in-situ streamflow observations from the Rantau Panjang station for the year 2006. The coefficient of determination and Nash-Sutcliffe efficiency were 0.35/0.02 for daily simulated streamflow and 0.92/0.21 for monthly simulated streamflow, respectively. The results show that, in a tropical region, free data can provide a better simulation at a monthly scale than at a daily scale. A sensitivity analysis and calibration procedure should be conducted to maximize the goodness-of-fit between simulated and observed streamflow. The application of Internet datasets thus promises acceptable streamflow modelling performance, and this research demonstrates that public domain data are suitable for streamflow modelling in a tropical river basin within acceptable accuracy.
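
    The gap between daily and monthly skill is largely a matter of aggregating both series before scoring them; a short sketch with synthetic data (not the Rantau Panjang record) illustrates the idea.

    ```python
    import numpy as np
    import pandas as pd

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency."""
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

    # Synthetic daily observed and simulated flow for one year at a hypothetical station
    dates = pd.date_range("2006-01-01", "2006-12-31", freq="D")
    rng = np.random.default_rng(7)
    obs = pd.Series(rng.gamma(2.0, 50.0, len(dates)), index=dates)
    sim = obs.rolling(5, min_periods=1).mean() + rng.normal(0, 20, len(dates))  # smoothed, noisy model

    monthly_obs = obs.resample("MS").mean()
    monthly_sim = sim.resample("MS").mean()
    print(f"daily NSE = {nse(sim.values, obs.values):.2f}, "
          f"monthly NSE = {nse(monthly_sim.values, monthly_obs.values):.2f}")
    ```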

  6. Streamflow modelling by remote sensing: a contribution to digital earth

    NARCIS (Netherlands)

    Tan, M.L.; Latif, A.B.; Pohl, C.; Duan, Z.

    2014-01-01

    Remote sensing contributes valuable information to streamflow estimates. This paper discusses its relevance to the digital earth concept. The authors categorize the role of remote sensing in streamflow modelling and estimation. This paper emphasizes the applications and challenges of satellite-based

  7. An Hourly Streamflow Forecasting Model Coupled with an Enforced Learning Strategy

    Directory of Open Access Journals (Sweden)

    Ming-Chang Wu

    2015-10-01

    Floods, one of the most significant natural hazards, often result in loss of life and property. Accurate hourly streamflow forecasting is always a key issue in hydrology for flood hazard mitigation. To improve the performance of hourly streamflow forecasting, a methodology concerning the development of neural network (NN) based models with an enforced learning strategy is proposed in this paper. Firstly, four different NNs, namely back propagation network (BPN), radial basis function network (RBFN), self-organizing map (SOM), and support vector machine (SVM), are used to construct streamflow forecasting models. Through the cross-validation test, NN-based models with superior performance in streamflow forecasting are detected. Then, an enforced learning strategy is developed to further improve the performance of the superior NN-based models, i.e., SOM and SVM in this study. Finally, the proposed flow forecasting model is obtained. Actual applications are conducted to demonstrate the potential of the proposed model. Moreover, comparison between the NN-based models with and without the enforced learning strategy is performed to evaluate the effect of the enforced learning strategy on model performance. The results indicate that the NN-based models with the enforced learning strategy indeed improve the accuracy of hourly streamflow forecasting. Hence, the presented methodology is expected to be helpful for developing improved NN-based streamflow forecasting models.

  8. Application of ANN and fuzzy logic algorithms for streamflow modelling of Savitri catchment

    Indian Academy of Sciences (India)

    Mahesh Kothari; K D Gharde

    2015-07-01

    Streamflow prediction is an essential aspect of any watershed modelling. Black box models (soft computing techniques) have proven to be an efficient alternative to physical (traditional) methods for simulating streamflow and sediment yield of catchments. The present study focusses on the development of models using ANN and fuzzy logic (FL) algorithms for predicting streamflow in the catchment of the Savitri River Basin. The inputs to these models were daily rainfall, mean daily evaporation, mean daily temperature, and lagged streamflow. Twenty years (1992–2011) of rainfall and other hydrological data were considered, of which 13 years (1992–2004) were used for training and the remaining 7 years (2005–2011) for validation of the models. Model performance was evaluated with the R, RMSE, EV, CE, and MAD statistical parameters. It was found that ANN model performance improved with an increasing number of input variables, whereas the fuzzy logic models predicted streamflow better with rainfall as a single input than with multiple inputs. When comparing the ANN and FL algorithms for streamflow prediction, the ANN model performance was clearly superior.

  9. Multi-objective assessment of three remote sensing vegetation products for streamflow prediction in a conceptual ecohydrological model

    Science.gov (United States)

    Naseem, Bushra; Ajami, Hoori; Liu, Yi; Cordery, Ian; Sharma, Ashish

    2016-12-01

    This study assesses the implications of using three alternate remote sensing vegetation products in the simulation of streamflow using a conceptual ecohydrologic model. Vegetation is represented as a dynamic component in this model which simulates two response variables, streamflow and one of the following three vegetation attributes: Gross Primary Productivity (GPP), Leaf Area Index (LAI) or Vegetation Optical Depth (VOD). Model simulations are performed across 50 catchments with areas ranging between 50 and 1600 km2 in the Murray-Darling Basin in Australia. Moderate Resolution Imaging Spectroradiometer (MODIS) LAI and GPP products, passive microwave observations of VOD and streamflow are used for model calibration and/or validation. Single-objective model calibration based on one of the vegetation products (GPP, LAI and VOD) shows that GPP is the best vegetation simulating product. On the contrary, LAI produces the best streamflow during validation when the optimized parameters are applied for streamflow estimation. To obtain the best compromise solution for simultaneous simulation of streamflow and a vegetation product, a multi-objective optimization is applied on GPP and streamflow, VOD and streamflow and LAI and streamflow. Results show that LAI and then VOD are the two best products in simulating streamflow across these catchments. Improved simulation of VOD and LAI in a multi-objective setting is partly related to the higher temporal resolution of these datasets and inclusion of processes for converting GPP to net primary productivity and biomass. It is suggested that further development of these remote sensing products at finer spatial and temporal resolutions may lead to improved streamflow prediction, as well as a better simulation capability of the ecohydrological system being modeled.

  10. Simulation of daily streamflow for nine river basins in eastern Iowa using the Precipitation-Runoff Modeling System

    Science.gov (United States)

    Haj, Adel E.; Christiansen, Daniel E.; Hutchinson, Kasey J.

    2015-10-14

    The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, constructed Precipitation-Runoff Modeling System models to estimate daily streamflow for nine river basins in eastern Iowa that drain into the Mississippi River. The models are part of a suite of methods for estimating daily streamflow at ungaged sites. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of streamflow and general drainage basin hydrology to various combinations of climate and land use. Calibration and validation periods used in each basin mostly were October 1, 2002, through September 30, 2012, but differed depending on the period of record available for daily mean streamflow measurements at U.S. Geological Survey streamflow-gaging stations.

  11. Ensemble forecasting of sub-seasonal to seasonal streamflow by a Bayesian joint probability modelling approach

    Science.gov (United States)

    Zhao, Tongtiegang; Schepen, Andrew; Wang, Q. J.

    2016-10-01

    The Bayesian joint probability (BJP) modelling approach is used operationally to produce seasonal (three-month-total) ensemble streamflow forecasts in Australia. However, water resource managers are calling for more informative sub-seasonal forecasts. Taking advantage of BJP's capability of handling multiple predictands, ensemble forecasting of sub-seasonal to seasonal streamflows is investigated for 23 catchments around Australia. Using antecedent streamflow and climate indices as predictors, monthly forecasts are developed for the three-month period ahead. Forecast reliability and skill are evaluated for the period 1982-2011 using a rigorous leave-five-years-out cross validation strategy. BJP ensemble forecasts of monthly streamflow volumes are generally reliable in ensemble spread. Forecast skill, relative to climatology, is positive in 74% of cases in the first month, decreasing to 57% and 46% respectively for streamflow forecasts for the final two months of the season. As forecast skill diminishes with increasing lead time, the monthly forecasts approach climatology. Seasonal forecasts accumulated from monthly forecasts are found to be similarly skilful to forecasts from BJP models based on seasonal totals directly. The BJP modelling approach is demonstrated to be a viable option for producing ensemble time-series sub-seasonal to seasonal streamflow forecasts.
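
    The leave-five-years-out cross-validation can be set up by grouping forecast issue times by year and withholding consecutive five-year blocks; a minimal index-generation sketch for an assumed 1982-2011 monthly series follows.

    ```python
    import numpy as np

    def leave_five_years_out(years):
        """Yield (train_mask, test_mask) pairs, withholding consecutive five-year blocks."""
        years = np.asarray(years)
        unique_years = np.unique(years)
        for start in range(0, unique_years.size, 5):
            test_years = unique_years[start:start + 5]
            test_mask = np.isin(years, test_years)
            yield ~test_mask, test_mask

    # One entry per monthly forecast issue time over an assumed 1982-2011 record
    years = np.repeat(np.arange(1982, 2012), 12)
    for train_mask, test_mask in leave_five_years_out(years):
        print(f"train on {train_mask.sum():3d} months, test on {test_mask.sum():3d} months")
    ```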

  12. Climate information based streamflow and rainfall forecasts for Huai River Basin using Hierarchical Bayesian Modeling

    Directory of Open Access Journals (Sweden)

    X. Chen

    2013-09-01

    A hierarchical Bayesian model for season-ahead forecasting of regional summer rainfall and streamflow using exogenous climate variables for East Central China is presented. The model provides estimates of the posterior forecast probability distribution for 12 rainfall and 2 streamflow stations, considering parameter uncertainty and cross-site correlation. The model has a multilevel structure in which regression coefficients are drawn from a common multivariate normal distribution, resulting in partial pooling of information across multiple stations and a better representation of parameter and posterior distribution uncertainty. The covariance structure of the residuals across stations is explicitly modeled. Model performance is tested under leave-10-out cross-validation. Frequentist and Bayesian performance metrics used include the Receiver Operating Characteristic, Reduction of Error, Coefficient of Efficiency, Rank Probability Skill Score, and coverage by posterior credible intervals. The ability of the model to reliably forecast regional summer rainfall and streamflow season-ahead offers potential for developing adaptive water risk management strategies.

  13. Variational assimilation of streamflow into operational distributed hydrologic models: effect of spatiotemporal adjustment scale

    Directory of Open Access Journals (Sweden)

    H. Lee

    2012-01-01

    State updating of distributed rainfall-runoff models via streamflow assimilation is subject to overfitting because the large dimensionality of the state space of the model may render the assimilation problem seriously under-determined. To examine the issue in the context of operational hydrology, we carry out a set of real-world experiments in which streamflow data are assimilated into gridded Sacramento Soil Moisture Accounting (SAC-SMA) and kinematic-wave routing models of the US National Weather Service (NWS) Research Distributed Hydrologic Model (RDHM) with the variational data assimilation technique. Study basins include four basins in Oklahoma and five basins in Texas. To assess the sensitivity of data assimilation performance to dimensionality reduction in the control vector, we used nine different spatiotemporal adjustment scales, where state variables are adjusted in a lumped, semi-distributed, or distributed fashion and biases in precipitation and potential evaporation (PE) are adjusted hourly, 6-hourly, or kept time-invariant. For each adjustment scale, three different streamflow assimilation scenarios are explored, where streamflow observations at basin interior points, at the basin outlet, or at both interior points and the outlet are assimilated. The streamflow assimilation experiments with nine different basins show that the optimum spatiotemporal adjustment scale varies from one basin to another and may be different for streamflow analysis and prediction in all of the three streamflow assimilation scenarios. The most preferred adjustment scale for seven out of nine basins is found to be the distributed, hourly scale, despite the fact that several independent validation results at this adjustment scale indicated the occurrence of overfitting. Basins with highly correlated interior and outlet flows tend to be less sensitive to the adjustment scale and could benefit more from streamflow assimilation. In comparison to outlet flow assimilation

  14. Ranking streamflow model performance based on Information theory metrics

    Science.gov (United States)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to test whether information theory-based metrics can be used as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols correspond to different quantiles of the probability distribution of streamflow, which define the symbol alphabet. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series; streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency metric increased as model complexity increased, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for evaluating hydrologic model performance.
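
    The symbolization step and the mean information gain can be illustrated compactly: the series is mapped to quantile-based symbols and the gain is computed as the entropy increase from length-1 to length-2 symbol words. The alphabet size, word length, and synthetic series below are assumptions, since the abstract does not state them.

    ```python
    import numpy as np
    from collections import Counter

    def symbolize(series, n_symbols=4):
        """Map a series to symbols 0..n_symbols-1 using quantiles of its own distribution."""
        breaks = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
        return np.digitize(series, breaks)

    def shannon_entropy(words):
        counts = np.array(list(Counter(words).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def mean_information_gain(symbols, word_length=1):
        """Entropy gained when extending symbol words by one symbol: H(L+1) - H(L)."""
        n = len(symbols) - word_length
        words = [tuple(symbols[i:i + word_length]) for i in range(n)]
        longer = [tuple(symbols[i:i + word_length + 1]) for i in range(n)]
        return shannon_entropy(longer) - shannon_entropy(words)

    # Synthetic daily precipitation and a smoothed streamflow response (illustrative only)
    rng = np.random.default_rng(3)
    precip = rng.gamma(0.8, 10.0, size=3650)
    streamflow = np.convolve(precip, np.exp(-np.arange(30) / 7.0), mode="same")

    for name, series in [("precipitation", precip), ("streamflow", streamflow)]:
        print(f"{name}: mean information gain = {mean_information_gain(symbolize(series)):.3f} bits")
    ```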

  15. Simulating streamflow and water table depth with a coupled hydrological model

    Institute of Scientific and Technical Information of China (English)

    Alphonce Chenjerayi GUZHA; Thomas Byron HARDY

    2010-01-01

    A coupled model integrating MODFLOW and TOPNET, with the models interacting through the exchange of recharge and baseflow and river-aquifer interactions, was developed and applied to the Big Darby Watershed in Ohio, USA. Calibration and validation results show that there is generally good agreement between measured streamflow and simulated results from the coupled model. At two gauging stations, average goodness of fit (R2), percent bias (PB), and Nash-Sutcliffe efficiency (ENS) values of 0.83, 11.15%, and 0.83, respectively, were obtained for simulation of streamflow during calibration, and values of 0.84, 8.75%, and 0.85, respectively, were obtained for validation. The simulated water table depths yielded average R2 values of 0.77 and 0.76 for calibration and validation, respectively. The good match between measured and simulated streamflows and water table depths demonstrates that the model is capable of adequately simulating streamflows and water table depths in the watershed and also capturing the influence of spatial and temporal variation in recharge.

  16. Improving statistical forecasts of seasonal streamflows using hydrological model output

    Directory of Open Access Journals (Sweden)

    D. E. Robertson

    2013-02-01

    Statistical methods traditionally applied for seasonal streamflow forecasting use predictors that represent the initial catchment condition and future climate influences on future streamflows. Observations of antecedent streamflows or rainfall commonly used to represent the initial catchment conditions are surrogates for the true source of predictability and can potentially have limitations. This study investigates a hybrid seasonal forecasting system that uses the simulations from a dynamic hydrological model as a predictor to represent the initial catchment condition in a statistical seasonal forecasting method. We compare the skill and reliability of forecasts made using the hybrid forecasting approach to those made using the existing operational practice of the Australian Bureau of Meteorology for 21 catchments in eastern Australia. We investigate the reasons for differences. In general, the hybrid forecasting system produces forecasts that are more skilful than the existing operational practice and as reliable. The greatest increases in forecast skill tend to be (1) when the catchment is wetting up but antecedent streamflows have not responded to antecedent rainfall, (2) when the catchment is drying and the dominant source of antecedent streamflow is in transition between surface runoff and base flow, and (3) when the initial catchment condition is near saturation intermittently throughout the historical record.

  17. Improving statistical forecasts of seasonal streamflows using hydrological model output

    Science.gov (United States)

    Robertson, D. E.; Pokhrel, P.; Wang, Q. J.

    2013-02-01

    Statistical methods traditionally applied for seasonal streamflow forecasting use predictors that represent the initial catchment condition and future climate influences on future streamflows. Observations of antecedent streamflows or rainfall commonly used to represent the initial catchment conditions are surrogates for the true source of predictability and can potentially have limitations. This study investigates a hybrid seasonal forecasting system that uses the simulations from a dynamic hydrological model as a predictor to represent the initial catchment condition in a statistical seasonal forecasting method. We compare the skill and reliability of forecasts made using the hybrid forecasting approach to those made using the existing operational practice of the Australian Bureau of Meteorology for 21 catchments in eastern Australia. We investigate the reasons for differences. In general, the hybrid forecasting system produces forecasts that are more skilful than the existing operational practice and as reliable. The greatest increases in forecast skill tend to be (1) when the catchment is wetting up but antecedent streamflows have not responded to antecedent rainfall, (2) when the catchment is drying and the dominant source of antecedent streamflow is in transition between surface runoff and base flow, and (3) when the initial catchment condition is near saturation intermittently throughout the historical record.

  18. Reducing equifinality of hydrological models by integrating Functional Streamflow Disaggregation

    Science.gov (United States)

    Lüdtke, Stefan; Apel, Heiko; Nied, Manuela; Carl, Peter; Merz, Bruno

    2014-05-01

    A universal problem in the calibration of hydrological models is the equifinality of different parameter sets derived from calibrating models against total runoff values. This is an intrinsic problem stemming from the quality of the calibration data and the simplified process representation by the model. However, discharge data contain additional information which can be extracted by signal processing methods. An analysis specifically developed for the disaggregation of runoff time series into flow components is the Functional Streamflow Disaggregation (FSD; Carl & Behrendt, 2008). This method is used in the calibration of an implementation of the hydrological model SWIM in a medium-sized watershed in Thailand. FSD is applied to disaggregate the discharge time series into three flow components, which are interpreted as base flow, inter-flow, and surface runoff. In addition to total runoff, the model is calibrated against these three components in a modified GLUE analysis, with the aim of identifying structural model deficiencies, assessing the internal process representation, and tackling equifinality. We developed a model-dependent (MDA) approach, calibrating the model runoff components against the FSD components, and a model-independent (MIA) approach, comparing the FSD of the model results with the FSD of the calibration data. The results indicate that the decomposition provides valuable information for the calibration. In particular, MDA highlights and discards a number of standard GLUE behavioural models that underestimate the contribution of soil water to river discharge. Both MDA and MIA yield a reduction of the parameter ranges by a factor of up to 3 in comparison to standard GLUE. Based on these results, we conclude that the developed calibration approach is able to reduce the equifinality of hydrological model parameterizations. The effect on the uncertainty of the model predictions is strongest when applying MDA and shows only minor reductions for MIA. Besides

  19. Validation of streamflow measurements made with M9 and RiverRay acoustic Doppler current profilers

    Science.gov (United States)

    Boldt, Justin A.; Oberg, Kevin A.

    2015-01-01

    The U.S. Geological Survey (USGS) Office of Surface Water (OSW) previously validated the use of Teledyne RD Instruments (TRDI) Rio Grande (in 2007), StreamPro (in 2006), and Broadband (in 1996) acoustic Doppler current profilers (ADCPs) for streamflow (discharge) measurements made by the USGS. Two new ADCPs, the SonTek M9 and the TRDI RiverRay, were first used in the USGS Water Mission Area programs in 2009. Since 2009, the OSW and USGS Water Science Centers (WSCs) have been conducting field measurements as part of their stream-gaging program using these ADCPs. The purpose of this paper is to document the results of USGS OSW analyses for validation of M9 and RiverRay ADCP streamflow measurements. The OSW required each participating WSC to make comparison measurements over the range of operating conditions in which the instruments were used until sufficient measurements were available. The performance of these ADCPs was evaluated for validation and to identify any present and potential problems. Statistical analyses of streamflow measurements indicate that measurements made with the SonTek M9 ADCP using firmware 2.00–3.00 or the TRDI RiverRay ADCP using firmware 44.12–44.15 are unbiased, and therefore, can continue to be used to make streamflow measurements in the USGS stream-gaging program. However, for the M9 ADCP, there are some important issues to be considered in making future measurements. Possible future work may include additional validation of streamflow measurements made with these instruments from other locations in the United States and measurement validation using updated firmware and software.

  20. 78 FR 13874 - Watershed Modeling To Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to...

    Science.gov (United States)

    2013-03-01

    ... AGENCY Watershed Modeling To Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to... Streamflow, Nutrient, and Sediment Loads to Climate Change and Urban Development in 20 U.S. Watersheds (EPA... and Development and is intended to characterize the sensitivity of streamflow, nutrient (nitrogen and...

  1. The value of model averaging and dynamical climate model predictions for improving statistical seasonal streamflow forecasts over Australia

    Science.gov (United States)

    Pokhrel, Prafulla; Wang, Q. J.; Robertson, David E.

    2013-10-01

    Seasonal streamflow forecasts are valuable for planning and allocation of water resources. In Australia, the Bureau of Meteorology employs a statistical method to forecast seasonal streamflows. The method uses predictors that are related to catchment wetness at the start of a forecast period and to climate during the forecast period. For the latter, a predictor is selected among a number of lagged climate indices as candidates to give the "best" model in terms of model performance in cross validation. This study investigates two strategies for further improvement in seasonal streamflow forecasts. The first is to combine, through Bayesian model averaging, multiple candidate models with different lagged climate indices as predictors, to take advantage of different predictive strengths of the multiple models. The second strategy is to introduce additional candidate models, using rainfall and sea surface temperature predictions from a global climate model as predictors. This is to take advantage of the direct simulations of various dynamic processes. The results show that combining forecasts from multiple statistical models generally yields more skillful forecasts than using only the best model and appears to moderate the worst forecast errors. The use of rainfall predictions from the dynamical climate model marginally improves the streamflow forecasts when viewed over all the study catchments and seasons, but the use of sea surface temperature predictions provides little additional benefit.
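
    One simple way to realize Bayesian model averaging over candidate models with different lagged climate indices is to weight each model by its cross-validation predictive likelihood; the weighting below is a simplified assumption, not the Bureau's operational formulation.

    ```python
    import numpy as np
    from scipy.stats import norm

    def bma_weights(cv_means, cv_sigmas, observed):
        """Weight candidate models by their cross-validation predictive likelihood,
        normalised to sum to one (a simplified Bayesian model averaging scheme)."""
        log_liks = np.array([np.sum(norm.logpdf(observed, loc=m, scale=s))
                             for m, s in zip(cv_means, cv_sigmas)])
        w = np.exp(log_liks - log_liks.max())      # subtract the max for numerical stability
        return w / w.sum()

    # Three candidate models, each driven by a different lagged climate index (synthetic CV output)
    rng = np.random.default_rng(5)
    obs = rng.gamma(3.0, 100.0, size=30)                       # 30 years of seasonal flows
    sigmas = [40.0, 60.0, 90.0]
    preds = [obs + rng.normal(0, s, size=30) for s in sigmas]  # cross-validation predictions

    weights = bma_weights(preds, sigmas, obs)
    combined = np.tensordot(weights, np.array(preds), axes=1)  # BMA point forecast
    print("BMA weights:", np.round(weights, 2))
    ```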

  2. Simulation of daily streamflows at gaged and ungaged locations within the Cedar River Basin, Iowa, using a Precipitation-Runoff Modeling System model

    Science.gov (United States)

    Christiansen, Daniel E.

    2012-01-01

    The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, conducted a study to examine techniques for estimation of daily streamflows using hydrological models and statistical methods. This report focuses on the use of a hydrologic model, the U.S. Geological Survey's Precipitation-Runoff Modeling System, to estimate daily streamflows at gaged and ungaged locations. The Precipitation-Runoff Modeling System is a modular, physically based, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on surface-water runoff and general basin hydrology. The Cedar River Basin was selected to construct a Precipitation-Runoff Modeling System model that simulates the period from January 1, 2000, to December 31, 2010. The calibration period was from January 1, 2000, to December 31, 2004, and the validation periods were from January 1, 2005, to December 31, 2010, and from January 1, 2000, to December 31, 2010. A Geographic Information System tool was used to delineate the Cedar River Basin and subbasins for the Precipitation-Runoff Modeling System model and to derive parameters based on the physical geographical features. Calibration of the Precipitation-Runoff Modeling System model was completed using a U.S. Geological Survey calibration software tool. The main objective of the calibration was to match the daily streamflow simulated by the Precipitation-Runoff Modeling System model with streamflow measured at U.S. Geological Survey streamflow gages. The Cedar River Basin daily streamflow model performed with Nash-Sutcliffe efficiencies ranging from 0.33 to 0.82 during the calibration period and from -0.04 to 0.77 during the validation period. The Cedar River Basin model meets the criterion of a Nash-Sutcliffe efficiency greater than 0.50 and provides a good fit for streamflow conditions during the calibration period at all but one location, Austin, Minnesota

  3. Streamflow-gain- and streamflow-loss data for streamgages in the Central Valley Hydrologic Model

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This digital dataset contains 61 sets of annual streamflow gains and losses between 1961 and 1977 along the Central Valley surface-water network for the Central Valley...

  4. A Bayesian hierarchical nonhomogeneous hidden Markov model for multisite streamflow reconstructions

    Science.gov (United States)

    Bracken, C.; Rajagopalan, B.; Woodhouse, C.

    2016-10-01

    In many complex water supply systems, the next generation of water resources planning models will require simultaneous probabilistic streamflow inputs at multiple locations on an interconnected network. To make use of the valuable multicentury records provided by tree-ring data, reconstruction models must be able to produce appropriate multisite inputs. Existing streamflow reconstruction models typically focus on one site at a time, not addressing intersite dependencies and potentially misrepresenting uncertainty. To this end, we develop a model for multisite streamflow reconstruction with the ability to capture intersite correlations. The proposed model is a hierarchical Bayesian nonhomogeneous hidden Markov model (NHMM). A NHMM is fit to contemporary streamflow at each location using lognormal component distributions. Leading principal components of tree rings are used as covariates to model nonstationary transition probabilities and the parameters of the lognormal component distributions. Spatial dependence between sites is captured with a Gaussian elliptical copula. Parameters of the model are estimated in a fully Bayesian framework, in that marginal posterior distributions of all the parameters are obtained. The model is applied to reconstruct flows at 20 sites in the Upper Colorado River Basin (UCRB) from 1473 to 1906. Many previous reconstructions are available for this basin, making it ideal for testing this new method. The results show some improvements over regression-based methods in terms of validation statistics. Key advantages of the Bayesian NHMM over traditional approaches are a dynamic representation of uncertainty and the ability to make long multisite simulations that capture at-site statistics and spatial correlations between sites.
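
    The spatial-dependence component, a Gaussian (elliptical) copula over lognormal margins, can be sketched directly; the correlation matrix and lognormal parameters below are invented for illustration, and the tree-ring covariates and hidden states are omitted.

    ```python
    import numpy as np
    from scipy.stats import norm, lognorm

    rng = np.random.default_rng(11)

    # Assumed inter-site correlation matrix for three streamflow sites (illustrative only)
    corr = np.array([[1.0, 0.8, 0.6],
                     [0.8, 1.0, 0.7],
                     [0.6, 0.7, 1.0]])
    # Assumed lognormal marginal parameters (shape, scale) for each site
    shapes = [0.5, 0.6, 0.4]
    scales = [800.0, 500.0, 300.0]

    # 1. Draw correlated standard normal deviates (the Gaussian copula)
    z = rng.multivariate_normal(mean=np.zeros(3), cov=corr, size=1000)
    # 2. Map them to uniforms with the standard normal CDF
    u = norm.cdf(z)
    # 3. Apply each site's lognormal inverse CDF to obtain correlated flows with the desired margins
    flows = np.column_stack([lognorm.ppf(u[:, j], s=shapes[j], scale=scales[j]) for j in range(3)])
    print("sampled inter-site correlations:\n", np.corrcoef(flows, rowvar=False).round(2))
    ```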

  5. Genetic Algorithm Based Framework for Automation of Stochastic Modeling of Multi-Season Streamflows

    Science.gov (United States)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2009-05-01

    Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The various kinds of stochastic models adopted for multi-season streamflow generation in hydrology are: i) parametric models, which hypothesize the form of the periodic dependence structure and the distributional form a priori (examples are PAR, PARMA), and disaggregation models that aim to preserve the correlation structure at the periodic level and the aggregated annual level; ii) nonparametric models (examples are bootstrap/kernel-based methods such as k-nearest neighbor (k-NN) and matched block bootstrap (MABB), as well as nonparametric disaggregation models), which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws; and iii) hybrid models, which blend parametric and nonparametric models advantageously to model streamflows effectively. Despite these developments in the stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and critical drought characteristics has remained a persistent challenge for the stochastic modeler. This is partly because the stochastic streamflow model parameters are usually estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares (LS) estimation), and the efficacy of the models is subsequently validated based on the accuracy with which the water-use characteristics are predicted, which requires a large number of trial simulations and inspection of many plots and tables. Even then, accurate prediction of the storage and critical drought characteristics may not be ensured. In this study a multi-objective optimization framework is proposed to find the optimal hybrid model (a blend of a simple parametric model, the PAR(1) model, and the matched block bootstrap (MABB))

  6. Systematic evaluation of autoregressive error models as post-processors for a probabilistic streamflow forecast system

    Science.gov (United States)

    Morawietz, Martin; Xu, Chong-Yu; Gottschalk, Lars; Tallaksen, Lena

    2010-05-01

    A post-processor is necessary in a probabilistic streamflow forecast system to account for the hydrologic uncertainty introduced by the hydrological model. In this study, different variants of an autoregressive error model that can be used as a post-processor for short- to medium-range streamflow forecasts are evaluated. The deterministic HBV model is used to form the basis for the streamflow forecast. The general structure of the error models used as post-processors is a first-order autoregressive model of the form d_t = α d_{t-1} + σ ε_t, where d_t is the model error (observed minus simulated streamflow) at time t, α and σ are the parameters of the error model, and ε_t is the residual error described through a probability distribution. The following aspects are investigated: (1) use of constant parameters α and σ versus state-dependent parameters, where the state-dependent parameters vary with the states of temperature, precipitation, snow water equivalent, and simulated streamflow; (2) use of a standard normal distribution for ε_t versus an empirical distribution function constituted from the normalized residuals of the error model in the calibration period; and (3) comparison of two different transformations, logarithmic versus square root, applied to the streamflow data before the error model is applied. The reason for applying a transformation is to make the residuals of the error model homoscedastic over the range of streamflow magnitudes. Through combination of these three characteristics, eight variants of the autoregressive post-processor are generated. These are calibrated and validated in 55 catchments throughout Norway. The discrete ranked probability score with 99 flow percentiles as standardized thresholds is used for evaluation. In addition, a non-parametric bootstrap is used to construct confidence intervals and evaluate the significance of the results. The main
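
    A minimal sketch of the first-order autoregressive error model described above, using constant parameters, a standard normal residual, and a logarithmic transformation; the moment-based calibration and the synthetic HBV-like simulation are assumptions, not the study's exact procedure.

    ```python
    import numpy as np

    def fit_ar1_error_model(obs, sim, transform=np.log):
        """Estimate alpha and sigma of d_t = alpha * d_{t-1} + sigma * eps_t,
        where d_t is the transformed error (observed minus simulated streamflow)."""
        d = transform(obs) - transform(sim)
        alpha = np.corrcoef(d[1:], d[:-1])[0, 1]          # lag-1 autocorrelation
        sigma = np.std(d[1:] - alpha * d[:-1], ddof=1)    # standard deviation of the residual
        return alpha, sigma

    def postprocess(sim_forecast, last_error, alpha, sigma, n_ens=500,
                    transform=np.log, inverse=np.exp, rng=None):
        """Turn a deterministic forecast into an ensemble by propagating the AR(1) error model."""
        rng = rng or np.random.default_rng()
        ens = np.empty((n_ens, len(sim_forecast)))
        for m in range(n_ens):
            d = last_error
            for t, q in enumerate(sim_forecast):
                d = alpha * d + sigma * rng.standard_normal()
                ens[m, t] = inverse(transform(q) + d)
        return ens

    # Synthetic "HBV-like" simulation with an autocorrelated multiplicative error (illustrative only)
    rng = np.random.default_rng(8)
    obs = rng.gamma(3.0, 20.0, size=1000)
    e = np.zeros(1000)
    for t in range(1, 1000):
        e[t] = 0.7 * e[t - 1] + rng.normal(0, 0.1)
    sim = obs * np.exp(e)

    alpha, sigma = fit_ar1_error_model(obs, sim)
    ens = postprocess(sim[-5:], last_error=np.log(obs[-1]) - np.log(sim[-1]),
                      alpha=alpha, sigma=sigma, rng=rng)
    print(f"alpha = {alpha:.2f}, sigma = {sigma:.2f}, ensemble shape = {ens.shape}")
    ```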

  7. Statistical models for estimating daily streamflow in Michigan

    Science.gov (United States)

    Holtschlag, D.J.; Salehi, Habib

    1992-01-01

    Statistical models for estimating daily streamflow were analyzed for 25 pairs of streamflow-gaging stations in Michigan. Stations were paired by randomly choosing a station operated in 1989 at which 10 or more years of continuous flow data had been collected and at which flow is virtually unregulated; a nearby station was chosen where flow characteristics are similar. Streamflow data from the 25 randomly selected stations were used as the response variables; streamflow data at the nearby stations were used to generate a set of explanatory variables. Ordinary-least squares regression (OLSR) equations, autoregressive integrated moving-average (ARIMA) equations, and transfer function-noise (TFN) equations were developed to estimate the log transform of flow for the 25 randomly selected stations. The precision of each type of equation was evaluated on the basis of the standard deviation of the estimation errors. OLSR equations produce one set of estimation errors; ARIMA and TFN models each produce l sets of estimation errors corresponding to the forecast lead. The lead-l forecast is the estimate of flow l days ahead of the most recent streamflow used as a response variable in the estimation. In this analysis, the standard deviation of lead l ARIMA and TFN forecast errors were generally lower than the standard deviation of OLSR errors for l weighted average of forecasts based on TFN equations and backcasts (forecasts of the reverse-ordered series) based on ARIMA equations. The standard deviation of composite errors varied throughout the length of the estimation interval and generally was at maximum near the center of the interval. For comparison with OLSR errors, the mean standard deviation of composite errors were computed for intervals of length 1 to 40 days. The mean standard deviation of length-l composite errors were generally less than the standard deviation of the OLSR errors for l error magnitudes were compared by computing ratios of the mean standard deviation

  8. Streamflow Data Assimilation in SWAT Model Using Extended Kalman Filter

    Science.gov (United States)

    Sun, L.; Nistor, I.; Seidou, O.

    2014-12-01

    Although the Extended Kalman Filter (EKF) is regarded as the de facto method for applying the Kalman Filter to non-linear systems, its application to complex distributed hydrological models faces many challenges. The Ensemble Kalman Filter (EnKF) is often preferred because it avoids the calculation of the linearization Jacobian matrix and the propagation of the estimation error covariance. The EnKF is, however, difficult to apply to large models because of the huge computational demand needed for parallel propagation of ensemble members. This paper deals with the application of the EKF in streamflow prediction using the SWAT model in the watershed of the Senegal River, West Africa. In the Jacobian matrix calculation, SWAT is regarded as a black box model and the derivatives are calculated in the form of differential quotients. The state vector is the combination of runoff, soil, shallow aquifer and deep aquifer water contents. As an initial attempt, only streamflow observations are assimilated. Despite the fact that the EKF is a sub-optimal filter, the coupling of the EKF significantly improves the estimation of daily streamflow. The results of SWAT+EKF are also compared to those of a simpler quasi-linear streamflow prediction model where both state and parameters are updated with the EKF.
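
    Treating the hydrological model as a black box and obtaining Jacobians by finite differences, as described above, can be illustrated with a generic EKF forecast/update step; the four-store toy propagation function below merely stands in for SWAT and is purely an assumption.

    ```python
    import numpy as np

    def numerical_jacobian(f, x, eps=1e-5):
        """Finite-difference Jacobian of a black-box operator f evaluated at state x."""
        fx = np.atleast_1d(f(x))
        jac = np.zeros((fx.size, x.size))
        for i in range(x.size):
            xp = x.copy()
            xp[i] += eps
            jac[:, i] = (np.atleast_1d(f(xp)) - fx) / eps
        return jac

    def ekf_step(x, P, f, h, Q, R, z):
        """One EKF forecast/update cycle, with Jacobians obtained by finite differences."""
        F = numerical_jacobian(f, x)
        x_pred = f(x)
        P_pred = F @ P @ F.T + Q
        H = numerical_jacobian(h, x_pred)
        innovation = z - h(x_pred)                 # observed minus predicted streamflow
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
        x_new = x_pred + K @ innovation
        P_new = (np.eye(x.size) - K @ H) @ P_pred
        return x_new, P_new

    # Toy stand-in for the hydrological model: state = [fast runoff, soil, shallow aquifer, deep aquifer]
    def propagate(x):
        recession = np.array([0.5, 0.9, 0.97, 0.995])
        recharge = np.array([2.0, 1.0, 0.3, 0.1])
        return recession * x + recharge

    def observe(x):
        return np.array([0.5 * x[0] + 0.03 * x[2]])    # streamflow from the fast store plus baseflow

    x = np.array([10.0, 50.0, 100.0, 200.0])
    P = np.eye(4) * 5.0
    Q, R = np.eye(4) * 0.5, np.eye(1) * 1.0
    x, P = ekf_step(x, P, propagate, observe, Q, R, z=np.array([9.0]))
    print("updated state:", x.round(2))
    ```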

  9. Determining the importance of model calibration for forecasting absolute/relative changes in streamflow from LULC and climate changes

    Science.gov (United States)

    Niraula, Rewati; Meixner, Thomas; Norman, Laura M.

    2015-03-01

    Land use/land cover (LULC) and climate changes are important drivers of change in streamflow. Assessing the impact of LULC and climate changes on streamflow is typically done with a calibrated and validated watershed model. However, there is a debate on the degree of calibration required. The objective of this study was to quantify the variation in estimated relative and absolute changes in streamflow associated with LULC and climate changes with different calibration approaches. The Soil and Water Assessment Tool (SWAT) was applied in an uncalibrated (UC), single outlet calibrated (OC), and spatially-calibrated (SC) mode to compare the relative and absolute changes in streamflow at 14 gaging stations within the Santa Cruz River Watershed in southern Arizona, USA. For this purpose, the effects of 3 LULC, 3 precipitation (P), and 3 temperature (T) scenarios were tested individually. For the validation period, Percent Bias (PBIAS) values were >100% with the UC model for all gages, between 0% and 100% with the OC model, and within 20% with the SC model. Changes in streamflow predicted with the UC and OC models were compared with those of the SC model. This approach implicitly assumes that the SC model is "ideal". Results indicated that the magnitudes of both absolute and relative changes in streamflow due to LULC predicted with the UC and OC models were different from those of the SC model. The magnitudes of absolute changes predicted with the UC and SC models due to climate change (both P and T) were also significantly different, but were not different for the OC and SC models. Results clearly indicated that relative changes due to climate change predicted with the UC and OC models were not significantly different from those predicted with the SC model. This result suggests that it is important to calibrate the model spatially to analyze the effect of LULC change but not as important for analyzing the relative change in streamflow due to climate change. This study

  10. Determining the importance of model calibration for forecasting absolute/relative changes in streamflow from LULC and climate changes

    Science.gov (United States)

    Niraula, Rewati; Meixner, Thomas; Norman, Laura M.

    2015-01-01

    Land use/land cover (LULC) and climate changes are important drivers of change in streamflow. Assessing the impact of LULC and climate changes on streamflow is typically done with a calibrated and validated watershed model. However, there is a debate on the degree of calibration required. The objective of this study was to quantify the variation in estimated relative and absolute changes in streamflow associated with LULC and climate changes with different calibration approaches. The Soil and Water Assessment Tool (SWAT) was applied in an uncalibrated (UC), single outlet calibrated (OC), and spatially-calibrated (SC) mode to compare the relative and absolute changes in streamflow at 14 gaging stations within the Santa Cruz River Watershed in southern Arizona, USA. For this purpose, the effects of 3 LULC, 3 precipitation (P), and 3 temperature (T) scenarios were tested individually. For the validation period, Percent Bias (PBIAS) values were >100% with the UC model for all gages, between 0% and 100% with the OC model, and within 20% with the SC model. Changes in streamflow predicted with the UC and OC models were compared with those of the SC model. This approach implicitly assumes that the SC model is “ideal”. Results indicated that the magnitudes of both absolute and relative changes in streamflow due to LULC predicted with the UC and OC models were different from those of the SC model. The magnitudes of absolute changes predicted with the UC and SC models due to climate change (both P and T) were also significantly different, but were not different for the OC and SC models. Results clearly indicated that relative changes due to climate change predicted with the UC and OC models were not significantly different from those predicted with the SC model. This result suggests that it is important to calibrate the model spatially to analyze the effect of LULC change but not as important for analyzing the relative change in streamflow due to climate change. This

  11. Numerical model for learning concepts of streamflow simulation

    Science.gov (United States)

    DeLong, L.L.; ,

    1993-01-01

    Numerical models are useful for demonstrating principles of open-channel flow. Such models can allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. Four PT is a numerical model written primarily as a teaching supplement for a course in one-dimensional stream-flow modeling. Four PT options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of Four PT modules by other programs and programmers.

  12. Multivariate synthetic streamflow generation using a hybrid model based on artificial neural networks

    Directory of Open Access Journals (Sweden)

    J. C. Ochoa-Rivera

    2002-01-01

    A model for multivariate streamflow generation is presented, based on a multilayer feedforward neural network. The structure of the model results from two components, the neural network (NN) deterministic component and a random component which is assumed to be normally distributed. It is from this second component that the model achieves the ability to incorporate effectively the uncertainty associated with hydrological processes, making it valuable as a practical tool for synthetic generation of streamflow series. The NN topology and the corresponding analytical explicit formulation of the model are described in detail. The model is calibrated with a series of monthly inflows to two reservoir sites located in the Tagus River basin (Spain), while validation is performed through estimation of a set of statistics relevant for water resources systems planning and management. Among others, drought and storage statistics are computed and compared for both the synthetic and historical series. The performance of the NN-based model was compared to that of a standard autoregressive AR(2) model. Results show that the NN represents a promising modelling alternative for simulation purposes, with interesting potential in the context of water resources systems management and optimisation. Keywords: neural networks, multilayer perceptron, error backpropagation, hydrological scenario generation, multivariate time-series.
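
    The two-component structure of the generator, a deterministic feedforward network plus a normally distributed random term, can be sketched as follows; the network size, lag structure, and synthetic inflow series are assumptions for illustration, not the paper's calibrated setup.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(21)

    # Synthetic monthly inflow series standing in for the reservoir inflows (illustrative only)
    n = 480
    t = np.arange(n)
    flows = 100 + 40 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 15, n)

    # Lagged predictors: flows at t-1 and t-2 predict the flow at t
    X = np.column_stack([flows[1:-1], flows[:-2]])
    y = flows[2:]

    # Deterministic component: a small multilayer feedforward network
    nn = MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000, random_state=0).fit(X, y)
    residual_std = np.std(y - nn.predict(X), ddof=1)   # spread of the normal random component

    # Synthetic generation: network prediction plus a normally distributed innovation
    synthetic = list(flows[:2])
    for _ in range(n - 2):
        x_next = np.array([[synthetic[-1], synthetic[-2]]])
        synthetic.append(nn.predict(x_next)[0] + rng.normal(0, residual_std))
    synthetic = np.array(synthetic)
    print(f"historical mean {flows.mean():.1f}, synthetic mean {synthetic.mean():.1f}")
    ```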

  13. Evaluation of statistical and rainfall-runoff models for predicting historical daily streamflow time series in the Des Moines and Iowa River watersheds

    Science.gov (United States)

    Farmer, William H.; Knight, Rodney R.; Eash, David A.; Hutchinson, Kasey J.; Linhart, S. Mike; Christiansen, Daniel E.; Archfield, Stacey A.; Over, Thomas M.; Kiang, Julie E.

    2015-08-24

    Daily records of streamflow are essential to understanding hydrologic systems and managing the interactions between human and natural systems. Many watersheds and locations lack streamgages to provide accurate and reliable records of daily streamflow. In such ungaged watersheds, statistical tools and rainfall-runoff models are used to estimate daily streamflow. Previous work compared 19 different techniques for predicting daily streamflow records in the southeastern United States. Here, five of the better-performing methods are compared in a different hydroclimatic region of the United States, in Iowa. The methods fall into three classes: (1) drainage-area ratio methods, (2) nonlinear spatial interpolations using flow duration curves, and (3) mechanistic rainfall-runoff models. The first two classes are each applied with nearest-neighbor and map-correlated index streamgages. Using a threefold validation and robust rank-based evaluation, the methods are assessed for overall goodness of fit of the hydrograph of daily streamflow, the ability to reproduce a daily, no-fail storage-yield curve, and the ability to reproduce key streamflow statistics. As in the Southeast study, a nonlinear spatial interpolation of daily streamflow using flow duration curves is found to be a method with the best predictive accuracy. Comparisons with previous work in Iowa show that the accuracy of mechanistic models with at-site calibration is substantially degraded in the ungaged framework.
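
    The best-performing method, nonlinear spatial interpolation of daily streamflow using flow duration curves, transfers each day's exceedance probability at an index streamgage to the flow duration curve of the ungaged site; the sketch below uses synthetic data and an assumed drainage-area-scaled curve for the ungaged site.

    ```python
    import numpy as np

    def fdc_transfer(index_flows, ungaged_fdc_flows, ungaged_fdc_probs):
        """Nonlinear spatial interpolation via flow duration curves:
        convert each daily flow at the index gage to its nonexceedance probability,
        then read the ungaged site's flow duration curve at that probability."""
        ranks = np.argsort(np.argsort(index_flows))
        nonexceedance = (ranks + 0.5) / len(index_flows)     # empirical plotting position
        return np.interp(nonexceedance, ungaged_fdc_probs, ungaged_fdc_flows)

    rng = np.random.default_rng(13)
    index_flows = rng.lognormal(mean=3.0, sigma=0.8, size=365)   # daily flows at the index gage

    # Assumed flow duration curve for the ungaged site (here scaled by a drainage-area ratio of 0.6)
    probs = np.linspace(0.001, 0.999, 99)
    ungaged_fdc = 0.6 * np.quantile(index_flows, probs)

    estimated = fdc_transfer(index_flows, ungaged_fdc, probs)
    print(f"estimated mean daily flow at the ungaged site: {estimated.mean():.1f}")
    ```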

  14. Comparing large-scale hydrological model predictions with observed streamflow in the Pacific Northwest: effects of climate and groundwater

    Science.gov (United States)

    Mohammad Safeeq; Guillaume S. Mauger; Gordon E. Grant; Ivan Arismendi; Alan F. Hamlet; Se-Yeun Lee

    2014-01-01

    Assessing uncertainties in hydrologic models can improve accuracy in predicting future streamflow. Here, simulated streamflows using the Variable Infiltration Capacity (VIC) model at coarse (1/16°) and fine (1/120°) spatial resolutions were evaluated against observed streamflows from 217 watersheds. In...

  15. From spatially variable streamflow to distributed hydrological models: Analysis of key modeling decisions

    Science.gov (United States)

    Fenicia, Fabrizio; Kavetski, Dmitri; Savenije, Hubert H. G.; Pfister, Laurent

    2016-02-01

    This paper explores the development and application of distributed hydrological models, focusing on the key decisions of how to discretize the landscape, which model structures to use in each landscape element, and how to link model parameters across multiple landscape elements. The case study considers the Attert catchment in Luxembourg—a 300 km2 mesoscale catchment with 10 nested subcatchments that exhibit clearly different streamflow dynamics. The research questions are investigated using conceptual models applied at hydrologic response unit (HRU) scales (1-4 HRUs) on 6 hourly time steps. Multiple model structures are hypothesized and implemented using the SUPERFLEX framework. Following calibration, space/time model transferability is tested using a split-sample approach, with evaluation criteria including streamflow prediction error metrics and hydrological signatures. Our results suggest that: (1) models using geology-based HRUs are more robust and capture the spatial variability of streamflow time series and signatures better than models using topography-based HRUs; this finding supports the hypothesis that, in the Attert, geology exerts a stronger control than topography on streamflow generation, (2) streamflow dynamics of different HRUs can be represented using distinct and remarkably simple model structures, which can be interpreted in terms of the perceived dominant hydrologic processes in each geology type, and (3) the same maximum root zone storage can be used across the three dominant geological units with no loss in model transferability; this finding suggests that the partitioning of water between streamflow and evaporation in the study area is largely independent of geology and can be used to improve model parsimony. The modeling methodology introduced in this study is general and can be used to advance our broader understanding and prediction of hydrological behavior, including the landscape characteristics that control hydrologic response, the

  16. Modeling the effect of glacier recession on streamflow response using a coupled glacio-hydrological model

    Directory of Open Access Journals (Sweden)

    B. S. Naz

    2013-04-01

    We describe an integrated spatially distributed hydrologic and glacier dynamic model, and use it to investigate the effect of glacier recession on streamflow variations for the Upper Bow River basin, a tributary of the South Saskatchewan River. Several recent studies have suggested that observed decreases in summer flows in the South Saskatchewan River are partly due to the retreat of glaciers in the river's headwaters. Modeling the effect of glacier changes on streamflow response in river basins such as the South Saskatchewan is complicated by the inability of most existing physically-based distributed hydrologic models to represent glacier dynamics. We compare variations in glacier extent, snow water equivalent and streamflow discharge predicted with the integrated model against satellite estimates of glacier area and terminus position, observed streamflow, and snow water equivalent measurements over the period 1980–2007. Simulations with the coupled hydrology-glacier model reduce the uncertainty in streamflow predictions. Our results suggest that, on average, the glacier melt contribution to the Bow River flow upstream of Lake Louise is about 30% in summer. For warm and dry years, however, the glacier melt contribution can be as large as 50% in August, whereas for cold years it can be as small as 20% and the timing of the glacier melt signature can be delayed by a month.

  17. A metric for attributing variability in modelled streamflows

    Science.gov (United States)

    Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish

    2016-10-01

    Significant gaps in our present understanding of hydrological systems lead to enhanced uncertainty in key modelling decisions. This study proposes a method, namely "Quantile Flow Deviation (QFD)", for the attribution of forecast variability to different sources across different streamflow regimes. By using a quantile-based metric, we can assess the change in uncertainty across individual percentiles, thereby allowing uncertainty to be expressed as a function of magnitude and time. As a result, one can address selective sources of uncertainty depending on whether low or high flows (say) are of interest. By way of a case study, we demonstrate the usefulness of the approach for estimating the relative importance of model parameter identification, objective functions and model structures as sources of streamflow forecast uncertainty. We use FUSE (Framework for Understanding Structural Errors) to implement our methods, allowing selection of multiple different model structures. Cross-catchment comparison is done for two different catchments: Leaf River in Mississippi, USA and Bass River in Victoria, Australia. Two different approaches to parameter estimation are presented to demonstrate the statistic: one based on GLUE, the other based on optimization. The results presented in this study suggest that the determination of the model structure with the design catchment should be given priority, but that objective function selection with parameter identifiability can lead to significant variability in results. By examining the QFD across multiple flow quantiles, the ability of certain models and optimization routines to constrain variability for different flow conditions is demonstrated.
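
    The exact QFD formulation is not reproduced in this summary, but the idea of expressing ensemble spread per flow quantile can be sketched as follows (illustrative Python, using a simple range-based spread measure).

        import numpy as np

        def quantile_flow_deviation(ensemble, quantiles=np.linspace(0.05, 0.95, 19)):
            # ensemble: array of shape (n_members, n_timesteps) holding flows simulated
            # with different parameter sets, objective functions or model structures
            member_q = np.quantile(ensemble, quantiles, axis=1)   # (n_quantiles, n_members)
            spread = member_q.max(axis=1) - member_q.min(axis=1)  # deviation per quantile
            return dict(zip(quantiles, spread))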

  18. Bayesian Models for Streamflow and River Network Reconstruction using Tree Rings

    Science.gov (United States)

    Ravindranath, A.; Devineni, N.

    2016-12-01

    Water systems face non-stationary, dynamically shifting risks due to shifting societal conditions and systematic long-term variations in climate manifesting as quasi-periodic behavior on multi-decadal time scales. Water systems are thus vulnerable to long periods of wet or dry hydroclimatic conditions. Streamflow is a major component of water systems and a primary means by which water is transported to serve ecosystems' and human needs. Thus, our concern is in understanding streamflow variability. Climate variability and impacts on water resources are crucial factors affecting streamflow, and multi-scale variability increases risk to water sustainability and systems. Dam operations are necessary for collecting water brought by streamflow while maintaining downstream ecological health. Rules governing dam operations are based on streamflow records that are woefully short compared to periods of systematic variation present in the climatic factors driving streamflow variability and non-stationarity. We use hierarchical Bayesian regression methods in order to reconstruct paleo-streamflow records for dams within a basin using paleoclimate proxies (e.g. tree rings) to guide the reconstructions. The riverine flow network for the entire basin is subsequently modeled hierarchically using feeder stream and tributary flows. This is a starting point in analyzing streamflow variability and risks to water systems, and developing a scientifically-informed dynamic risk management framework for formulating dam operations and water policies to best hedge such risks. We will apply this work to the Missouri and Delaware River Basins (DRB). Preliminary results of streamflow reconstructions for eight dams in the upper DRB using standard Gaussian regression with regional tree ring chronologies give streamflow records that now span two to two and a half centuries, and modestly smoothed versions of these reconstructed flows indicate physically-justifiable trends in the time series.

  19. Simulation-optimization framework for multi-site multi-season hybrid stochastic streamflow modeling

    Science.gov (United States)

    Srivastav, Roshan; Srinivasan, K.; Sudheer, K. P.

    2016-11-01

    A simulation-optimization (S-O) framework is developed for the hybrid stochastic modeling of multi-site multi-season streamflows. The multi-objective optimization model formulated is the driver and the multi-site, multi-season hybrid matched block bootstrap model (MHMABB) is the simulation engine within this framework. The multi-site multi-season simulation model is the extension of the existing single-site multi-season simulation model. A robust and efficient evolutionary search based technique, namely, non-dominated sorting based genetic algorithm (NSGA - II) is employed as the solution technique for the multi-objective optimization within the S-O framework. The objective functions employed are related to the preservation of the multi-site critical deficit run sum and the constraints introduced are concerned with the hybrid model parameter space, and the preservation of certain statistics (such as inter-annual dependence and/or skewness of aggregated annual flows). The efficacy of the proposed S-O framework is brought out through a case example from the Colorado River basin. The proposed multi-site multi-season model AMHMABB (whose parameters are obtained from the proposed S-O framework) preserves the temporal as well as the spatial statistics of the historical flows. Also, the other multi-site deficit run characteristics namely, the number of runs, the maximum run length, the mean run sum and the mean run length are well preserved by the AMHMABB model. Overall, the proposed AMHMABB model is able to show better streamflow modeling performance when compared with the simulation based SMHMABB model, plausibly due to the significant role played by: (i) the objective functions related to the preservation of multi-site critical deficit run sum; (ii) the huge hybrid model parameter space available for the evolutionary search and (iii) the constraint on the preservation of the inter-annual dependence. Split-sample validation results indicate that the AMHMABB model is

  20. Retrospective evaluation of continental-scale streamflow nudging with WRF-Hydro National Water Model V1

    Science.gov (United States)

    McCreight, J. L.; Wu, Y.; Gochis, D.; Rafieeinasab, A.; Dugger, A. L.; Yu, W.; Cosgrove, B.; Cui, Z.; Oubeidillah, A.; Briar, D.

    2016-12-01

    The streamflow (discharge) data assimilation capability in version 1 of the National Water Model (NWM; a WRF-Hydro configuration) is applied and evaluated in a 5-year (2011-2015) retrospective study using NLDAS2 forcing data over CONUS. This talk will describe the NWM V1 operational nudging (continuous-time) streamflow data assimilation approach, its motivation, and its relationship to this retrospective evaluation. Results from this study will provide an analysis-based (not forecast-based) benchmark for streamflow DA in the NWM. The goal of the assimilation is to reduce discharge bias and improve channel initial conditions for discharge forecasting (though forecasts are not considered here). The nudging method assimilates discharge observations at nearly 7,000 USGS gages (at frequencies of up to one observation per 15 minutes) to produce a (univariate) discharge reanalysis (i.e. this is the only variable affected by the assimilation). By withholding 14% of nested gages throughout CONUS in a separate validation run, we evaluate the downstream impact of assimilation at upstream gages. Based on this sample, we estimate the skill of the streamflow reanalysis at ungaged locations and examine factors governing the skill of the assimilation. Comparison of assimilation and open-loop runs is presented. Performance of DA under both high and low flow regimes and selected flooding events is examined. Preliminary evaluation of nudging parameter sensitivity and its relationship to flow regime will be presented.
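
    The core of a nudging update is simple; the sketch below is a schematic, not the operational NWM code, and the gain value is purely illustrative. It relaxes the modeled discharge toward the gage observation by a fraction of the innovation.

        def nudge_discharge(q_model, q_obs, gain=0.5):
            # Relax modeled discharge toward the observation; the operational scheme
            # also spreads the correction in time and along the channel network.
            return q_model + gain * (q_obs - q_model)

        # example: model says 120 m3/s, the gage reports 150 m3/s
        q_analysis = nudge_discharge(120.0, 150.0)   # 135.0 with gain=0.5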

  1. Wavelet-linear genetic programming: A new approach for modeling monthly streamflow

    Science.gov (United States)

    Ravansalar, Masoud; Rajaee, Taher; Kisi, Ozgur

    2017-06-01

    Streamflow is an important factor in stream ecosystems, and its accurate prediction is an essential issue in water resources and environmental engineering systems. In this study, a hybrid wavelet-linear genetic programming (WLGP) model, which combines a discrete wavelet transform (DWT) and linear genetic programming (LGP), was used to predict the monthly streamflow (Q) at two gauging stations, Pataveh and Shahmokhtar, on the Beshar River near Yasuj, Iran. In the proposed WLGP model, the wavelet analysis was linked to the LGP model: the original streamflow time series were decomposed into sub-time series comprising wavelet coefficients. The results were compared with those of single LGP, artificial neural network (ANN), hybrid wavelet-ANN (WANN) and multiple linear regression (MLR) models, using several commonly utilized statistics. The Nash coefficients (E) of the WLGP model were found to be 0.877 and 0.817 for the Pataveh and Shahmokhtar stations, respectively. The comparison showed that the WLGP model could significantly increase streamflow prediction accuracy at both stations. Since the results demonstrate a closer approximation of peak streamflow values by the WLGP model, this model could be utilized for one-month-ahead prediction of cumulative streamflow data.
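
    A minimal sketch of the wavelet pre-processing step is given below, using the PyWavelets package; the wavelet family ("db4") and decomposition level are assumptions for illustration, not the settings reported by the authors.

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_subseries(q, wavelet="db4", level=3):
            # Decompose a monthly streamflow series into approximation and detail
            # coefficients; these sub-series feed the downstream regressor (LGP in
            # the paper, any data-driven model in general).
            return pywt.wavedec(np.asarray(q, dtype=float), wavelet, level=level)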

  2. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    Science.gov (United States)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient of the model tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP and, eventually, the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using daily streamflow records from a station on Senoz Stream, Turkey. Compared with the efficiency of stand-alone GP, MGGP, and conventional multiple linear regression benchmark models, the proposed Pareto-optimal MA-MGGP model put forward a parsimonious solution of noteworthy practical importance. In addition, the approach allows the user to bring human insight into the problem, examine evolved models and pick out the best performing programs for further analysis.
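
    The moving-average pre-processing ingredient can be illustrated with the short sketch below; the window length and number of lags are arbitrary choices for the example, not the study's settings.

        import numpy as np

        def moving_average_inputs(q, window=3, n_lags=2):
            # Smooth the daily flow series with a simple moving average, then build
            # lagged input/target pairs for model identification.
            q = np.asarray(q, dtype=float)
            smoothed = np.convolve(q, np.ones(window) / window, mode="valid")
            X = np.column_stack([smoothed[i:len(smoothed) - n_lags + i] for i in range(n_lags)])
            y = smoothed[n_lags:]
            return X, y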

  3. Advancing monthly streamflow prediction accuracy of CART models using ensemble learning paradigms

    Science.gov (United States)

    Erdal, Halil Ibrahim; Karakurt, Onur

    2013-01-01

    Streamflow forecasting is one of the most important steps in water resources planning and management. Ensemble techniques such as bagging, boosting and stacking have gained popularity in hydrological forecasting in recent years. The study investigates the potential of two ensemble learning paradigms (i.e., bagging and stochastic gradient boosting) in building ensembles of classification and regression trees (CART) to advance streamflow prediction accuracy. The study initially investigates the use of classification and regression trees for monthly streamflow forecasting and employs a support vector regression (SVR) model as the benchmark. The analytic results indicate that CART outperforms SVR in both training and testing phases. Although the results of the CART model in the training phase are considerable, this is not the case in the testing phase. Thus, to optimize the prediction accuracy of CART for monthly streamflow forecasting, we incorporate bagging and stochastic gradient boosting, which are rooted in the same philosophy of advancing the prediction accuracy of weak learners. Comparison of the results shows that the bagged regression trees (BRT) and stochastic gradient boosted regression trees (GBRT) models possess more satisfactory monthly streamflow forecasting performance than the CART and SVR models. Overall, it is found that ensemble learning paradigms can remarkably advance the prediction accuracy of CART models in monthly streamflow forecasting.
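
    Both ensemble paradigms are available in scikit-learn; the sketch below shows one way to set them up (hyperparameter values are illustrative, not those tuned in the study).

        from sklearn.ensemble import BaggingRegressor, GradientBoostingRegressor
        from sklearn.tree import DecisionTreeRegressor

        # bagged regression trees: one tree per bootstrap replicate, predictions averaged
        brt = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100, random_state=0)

        # stochastic gradient boosted regression trees: trees fitted sequentially to
        # residuals, each on a random subsample of the training data
        gbrt = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05,
                                         subsample=0.8, random_state=0)

        # X: lagged monthly flows and other predictors, y: observed monthly flow
        # brt.fit(X_train, y_train); gbrt.fit(X_train, y_train)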

  4. Post Processing Numerical Weather Prediction Model Rainfall Forecasts for Use in Ensemble Streamflow Forecasting in Australia

    Science.gov (United States)

    Shrestha, D. L.; Robertson, D.; Bennett, J.; Ward, P.; Wang, Q. J.

    2012-12-01

    Through the Water Information Research and Development Alliance (WIRADA) project, CSIRO is conducting research to improve flood and short-term streamflow forecasting services delivered by the Australian Bureau of Meteorology. WIRADA aims to build and test systems to generate ensemble flood and short-term streamflow forecasts with lead times of up to 10 days by integrating rainfall forecasts from Numerical Weather Prediction (NWP) models and hydrological modelling. Here we present an overview of the latest progress towards developing this system. Rainfall during the forecast period is a major source of uncertainty in streamflow forecasting. Ensemble rainfall forecasts are used in streamflow forecasting to characterise the rainfall uncertainty. In Australia, NWP models provide forecasts of rainfall and other weather conditions for lead times of up to 10 days. However, rainfall forecasts from Australian NWP models are deterministic and often contain systematic errors. We use a simplified Bayesian joint probability (BJP) method to post-process rainfall forecasts from the latest generation of Australian NWP models. The BJP method generates reliable and skilful ensemble rainfall forecasts. The post-processed rainfall ensembles are then used to force a semi-distributed conceptual rainfall-runoff model to produce ensemble streamflow forecasts. The performance of the ensemble streamflow forecasts is evaluated on a number of Australian catchments and the benefits of using post-processed rainfall forecasts are demonstrated.

  5. Modeling streamflow from snowmelt in the upper Rio Grande

    Science.gov (United States)

    Annual snowpack in the high elevation snowsheds of the Upper Rio Grande (URG) Basin is a vital source of surface water for irrigated agriculture in New Mexico. Maximum streamflow from the annual snowpack usually occurs in early May for the southernmost snowsheds (e.g., Ojo Caliente) and at the end o...

  6. On the Performance of Alternate Conceptual Ecohydrological Models for Streamflow Prediction

    Science.gov (United States)

    Naseem, Bushra; Ajami, Hoori; Cordery, Ian; Sharma, Ashish

    2016-04-01

    A merging of a lumped conceptual hydrological model with two conceptual dynamic vegetation models is presented to assess the performance of these models for simultaneous simulations of streamflow and leaf area index (LAI). Two conceptual dynamic vegetation models with differing representations of ecological processes are merged with a lumped conceptual hydrological model (HYMOD) to predict catchment-scale streamflow and LAI. The merged RR-LAI-I model computes relative leaf biomass based on transpiration rates, while the RR-LAI-II model computes above-ground green and dead biomass based on net primary productivity and water use efficiency in response to soil moisture dynamics. To assess the performance of these models, daily discharge and the 8-day MODIS LAI product for 27 catchments of 90-1600 km2 in size located in the Murray-Darling Basin in Australia are used. Our results illustrate that when single-objective optimisation was focussed on maximizing the objective function for streamflow or LAI, the other un-calibrated predicted outcome (LAI if streamflow is the focus) was consistently compromised. Thus, single-objective optimization cannot take into account the essence of all processes in the conceptual ecohydrological models. However, multi-objective optimisation showed great strength for streamflow and LAI predictions. Both response outputs were better simulated by RR-LAI-II than RR-LAI-I due to the better representation of physical processes such as net primary productivity (NPP) in RR-LAI-II. Our results highlight that simultaneous calibration of streamflow and LAI using a multi-objective algorithm proves to be an attractive tool for improved streamflow predictions.

  7. Modeled future peak streamflows in four coastal Maine rivers

    Science.gov (United States)

    Hodgkins, Glenn A.; Dudley, Robert W.

    2013-01-01

    To safely and economically design bridges and culverts, it is necessary to compute the magnitude of peak streamflows that have specified annual exceedance probabilities (AEPs). Annual precipitation and air temperature in the northeastern United States are, in general, projected to increase during the 21st century. It is therefore important for engineers and resource managers to understand how peak flows may change in the future. This report, prepared in cooperation with the Maine Department of Transportation (MaineDOT), presents modeled changes in peak flows at four basins in coastal Maine on the basis of projected changes in air temperature and precipitation. To estimate future peak streamflows at the four basins in this study, historical values for climate (temperature and precipitation) in the basins were adjusted by different amounts and input to a hydrologic model of each study basin. To encompass the projected changes in climate in coastal Maine by the end of the 21st century, air temperatures were adjusted by four different amounts, from -3.6 degrees Fahrenheit (°F) (-2 degrees Celsius (°C)) to +10.8 °F (+6 °C) of observed temperatures. Precipitation was adjusted by three different percentage values from -15 percent to +30 percent of observed precipitation. The resulting 20 combinations of temperature and precipitation changes (including the no-change scenarios) were input to Precipitation-Runoff Modeling System (PRMS) watershed models, and annual daily maximum peak flows were calculated for each combination. Modeled peak flows from the adjusted changes in temperature and precipitation were compared to unadjusted (historical) modeled peak flows. Annual daily maximum peak flows increase or decrease, depending on whether temperature or precipitation is adjusted; increases in air temperature (with no change in precipitation) lead to decreases in peak flows, whereas increases in precipitation (with no change in temperature) lead to increases in peak flows. As

  8. Diagnostic testing and evaluation of the community WRF-Hydro Modeling System for national streamflow prediction application

    Science.gov (United States)

    Rafieei Nasab, A.; Gochis, D.; Dugger, A. L.; Pan, L.; McCreight, J. L.; Yu, W.; Zhang, Y.; Yates, D. N.; Somos-Valenzuela, M. A.; Salas, F. R.; Maidment, D. R.

    2015-12-01

    A fully-distributed WRF-Hydro modeling system developed at the National Center for Atmospheric Research (NCAR) will serve the initial operational nationwide streamflow forecasting needs of the National Water Center (NWC). This paper presents a multi-faceted evaluation of the WRF-Hydro modeling system in preparation for operational national streamflow prediction. The testing period encompasses the 2015 warm season, which included the National Flood Interoperability Experiment (NFIE), where WRF-Hydro and the RAPID channel routing model were driven by the Multi-Radar Multi-Sensor (MRMS) estimates as the real-time precipitation estimate product and the High Resolution Rapid Refresh (HRRR) for the short-term forecast. Here, we validate the MRMS estimates and HRRR precipitation forecasts at national scale using daily precipitation observations from the Global Historical Climatology Network (GHCN). Because WRF-Hydro has several physics options, such as surface overland flow, saturated subsurface flow, channel routing, and conceptual deep groundwater base flow, we also conducted additional simulations to evaluate WRF-Hydro performance under different process configurations. Streamflow verification of model simulations and predictions was completed for a subset of GAGES-II reference basins. Multi-temporal and spatial scale verification is performed in order to test the robustness and skill improvement of WRF-Hydro streamflow simulations under different configurations over a wide range of basin sizes and from short-term (hourly) to longer-term (monthly) flow simulations. Evaluation is also carried out across various geographic regions to relate the skill improvement to dominant controls on flow based on the actual physical and climatic properties of the basins. The goal is to inform WRF-Hydro model configuration for the initial operating capabilities (IOC) project and target processes and parameter estimates for improvement.

  9. A regional GIS-based model for reconstructing natural monthly streamflow series at ungauged sites

    Science.gov (United States)

    Pumo, Dario; Lo Conti, Francesco; Viola, Francesco; Noto, Leonardo V.

    2016-04-01

    Several hydrologic applications require reliable estimates of monthly runoff in river basins to face the widespread lack of data, both in time and in space. The main aim of this work is to propose a regional model for the estimation of monthly natural runoff series at ungauged sites, analyzing its applicability, reliability and limitations. A GIS (Geographic Information System) based model is here developed and applied to the entire region of Sicily (Italy). The core of this tool is a regional model for the estimation of monthly natural runoff series, based on a simple modelling structure, consisting of a regression-based rainfall-runoff model with only four parameters. The monthly runoff is obtained as a function of precipitation and mean temperature in the same month and runoff in the previous month. For a given basin, the four model parameters are assessed by specific regional equations as a function of some easily measurable geomorphic and climatic basin descriptors. The model is calibrated by a "two-step" procedure applied to a number of gauged basins over the region. The first step is aimed at the identification of a set of parameters optimizing model performances at the level of the single basin. Such "optimal" parameter sets, derived for each calibration basin, are successively used inside a regional regression analysis, performed at the second step, by which the regional equations for model parameter assessment are defined and calibrated. All the gauged watersheds across Sicily have been analyzed, selecting 53 basins for model calibration and using another 6 basins exclusively for validation purposes. Model performances, quantitatively evaluated considering different statistical indexes, demonstrate a relevant model ability in capturing the observed hydrological response at both the monthly level and higher time scales (seasonal and annual). One of the key features related to the proposed methodology is its easy transferability to other arid and semiarid
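
    The structure of the regional rainfall-runoff relation lends itself to a compact sketch; the linear form below is an assumption used for illustration (the paper's exact functional form and the regional regressions for the four parameters are not reproduced here).

        def monthly_runoff(p, t, q_prev, a, b, c, d):
            # runoff from same-month precipitation p, same-month mean temperature t
            # and previous-month runoff q_prev, with four parameters a, b, c, d
            return max(a * p + b * t + c * q_prev + d, 0.0)

        def simulate_monthly_series(precip, temp, q0, params):
            q, series = q0, []
            for p, t in zip(precip, temp):
                q = monthly_runoff(p, t, q, *params)
                series.append(q)
            return series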

  10. Estimating streamflow in the Irrawaddy Basin, Myanmar by integrating hydrological model with remote sensing information

    Science.gov (United States)

    Sun, W.; Yu, J.; Wang, G.; Li, Z.

    2016-12-01

    In this study, a method of calibrating hydrological models using river width derived from remote sensing (synthetic aperture radar) is applied to the Irrawaddy Basin in Myanmar, for the purpose of estimating daily streamflow in this data-sparse basin. The at-a-station hydraulic geometry is implemented to facilitate shifting the calibration objective from river discharge to river width. Generalized likelihood uncertainty estimation (GLUE) is applied for model calibration and uncertainty analysis. Of 50,000 randomly generated parameter sets, 997 are identified as behavioral, based on comparing model simulations with satellite observations. The uncertainty band of the streamflow simulation spans most of the 10-year average monthly observed streamflow for moderate and high flow conditions. The posterior distributions of the at-a-station hydraulic geometry parameters show single peaks, indicating that they are strongly constrained by the calibration. The method is potentially valuable for water resource management in data-sparse regions.
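
    The GLUE step described above can be sketched in a few lines. In the sketch below the behavioral criterion is a Nash-Sutcliffe threshold on a generic simulate() function, whereas in the study the comparison is made against satellite-derived river width; the threshold value and function names are hypothetical.

        import numpy as np

        def glue_behavioral(simulate, observed, bounds, n_samples=50_000, threshold=0.5):
            # Sample parameter sets uniformly within bounds, score each simulation and
            # keep the sets whose skill exceeds the behavioral threshold.
            rng = np.random.default_rng(0)
            obs = np.asarray(observed, dtype=float)
            denom = np.sum((obs - obs.mean()) ** 2)
            behavioral = []
            for _ in range(n_samples):
                params = [rng.uniform(lo, hi) for lo, hi in bounds]
                sim = np.asarray(simulate(params), dtype=float)
                nse = 1.0 - np.sum((obs - sim) ** 2) / denom
                if nse >= threshold:
                    behavioral.append((params, nse))
            return behavioral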

  11. Coupled daily streamflow and water temperature modelling in large river basins

    Directory of Open Access Journals (Sweden)

    M. T. H. van Vliet

    2012-11-01

    Realistic estimates of daily streamflow and water temperature are required for effective management of water resources (e.g. for electricity and drinking water production) and freshwater ecosystems. Although hydrological and process-based water temperature modelling approaches have been successfully applied to small catchments and short time periods, much less work has been done at large spatial and temporal scales. We present a physically based modelling framework for daily river discharge and water temperature simulations applicable to large river systems on a global scale. Model performance was tested globally at 1/2° × 1/2° spatial resolution and a daily time step for the period 1971–2000. We made specific evaluations on large river basins situated in different hydro-climatic zones and characterized by different anthropogenic impacts. Effects of anthropogenic heat discharges on simulated water temperatures were incorporated by using global gridded thermoelectric water use datasets and representing thermal discharges as point sources into the heat advection equation. This resulted in a significant increase in the quality of the water temperature simulations for thermally polluted basins (Rhine, Meuse, Danube and Mississippi). Due to large reservoirs in the Columbia which affect streamflow and thermal regimes, a reservoir routing model was used. This resulted in a significant improvement in the performance of the river discharge and water temperature modelling. Overall, realistic estimates were obtained at a daily time step for both river discharge (median normalized BIAS = 0.3; normalized RMSE = 1.2; r = 0.76) and water temperature (median BIAS = −0.3 °C; RMSE = 2.8 °C; r = 0.91) for the entire validation period, with similar performance during warm, dry periods. Simulated water temperatures are sensitive to headwater temperature, depending on resolution and flow velocity. A high sensitivity of water temperature to river

  12. Coupled daily streamflow and water temperature modelling in large river basins

    Directory of Open Access Journals (Sweden)

    M. T. H. van Vliet

    2012-07-01

    Realistic estimates of daily streamflow and water temperature are required for effective management of water resources (e.g. electricity and drinking water production) and freshwater ecosystems. Although hydrological and process-based water temperature modelling approaches have been successfully applied to small catchments and short time periods, much less work has been done at large spatial and temporal scales. We present a physically-based modelling framework for daily river discharge and water temperature simulations applicable to large river systems on a global scale. Model performance was tested globally at 1/2° × 1/2° spatial resolution and a daily time step for the period 1971–2000. We made specific evaluations on large river basins situated in different hydro-climatic zones and characterized by different anthropogenic impacts. Effects of anthropogenic heat discharges on simulated water temperatures were incorporated by using global gridded thermoelectric water use data sets and representing thermal discharges as point sources into the heat-advection equation. This resulted in a significant increase in the quality of the water temperature simulations for thermally polluted basins (Rhine, Meuse, Danube and Mississippi). Due to large reservoirs in the Columbia which affect streamflow and thermal regimes, a reservoir routing model was used. This resulted in a significant improvement in the performance of the river discharge and water temperature modelling. Overall, realistic estimates were obtained at a daily time step for both river discharge (median normalized BIAS = 0.3; normalized RMSE = 1.2; r = 0.76) and water temperature (median BIAS = −0.3 °C; RMSE = 2.8 °C; r = 0.91) for the entire validation period, with similar performance during warm, dry periods. Simulated water temperatures are sensitive to headwater temperature, depending on resolution and flow velocity. A high sensitivity of water temperature to river

  13. A comparative analysis of 9 multi-model averaging approaches in hydrological continuous streamflow simulation

    Science.gov (United States)

    Arsenault, Richard; Gatien, Philippe; Renaud, Benoit; Brissette, François; Martel, Jean-Luc

    2015-10-01

    This study aims to test whether a weighted combination of several hydrological models can simulate flows more accurately than the models taken individually. In addition, the project attempts to identify the most efficient model averaging method and the optimal number of models to include in the weighting scheme. In order to address the first objective, streamflow was simulated using four lumped hydrological models (HSAMI, HMETS, MOHYSE and GR4J-6), each of which was calibrated with three different objective functions on 429 watersheds. The resulting 12 hydrographs (4 models × 3 metrics) were weighted and combined with the help of 9 averaging methods, which are the simple arithmetic mean (SAM), Akaike information criterion (AICA), Bates-Granger (BGA), Bayes information criterion (BICA), Bayesian model averaging (BMA), Granger-Ramanathan average variants A, B and C (GRA, GRB and GRC), and the average by SCE-UA optimization (SCA). The same weights were then applied to the hydrographs in validation mode, and the Nash-Sutcliffe efficiency metric was measured between the averaged and observed hydrographs. Statistical analyses were performed to compare the accuracy of weighted methods to that of individual models. A Kruskal-Wallis test and a multi-objective optimization algorithm were then used to identify the most efficient weighted method and the optimal number of models to integrate. Results suggest that the GRA, GRB, GRC and SCA weighted methods perform better than the individual members. Model averages from these four methods were superior to the best of the individual members in 76% of the cases. Optimal combinations on all watersheds included at least one of each of the four hydrological models. None of the optimal combinations included all members of the ensemble of 12 hydrographs. The Granger-Ramanathan average variant C (GRC) is recommended as the best compromise between accuracy, speed of execution, and simplicity.
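
    As an illustration of how such weights can be obtained, the sketch below computes least-squares combination weights in the spirit of the Granger-Ramanathan variant A (unconstrained, no intercept); it is a schematic, not the study's implementation.

        import numpy as np

        def granger_ramanathan_weights(simulations, observed):
            # Regress the observed hydrograph on the member hydrographs; the
            # regression coefficients serve as combination weights.
            X = np.column_stack(simulations)            # (n_timesteps, n_members)
            y = np.asarray(observed, dtype=float)
            w, *_ = np.linalg.lstsq(X, y, rcond=None)
            return w

        def weighted_hydrograph(simulations, weights):
            return np.column_stack(simulations) @ weights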

  14. Coupled daily streamflow and water temperature modelling in large river basins

    NARCIS (Netherlands)

    Vliet, van M.T.H.; Yearsley, J.R.; Franssen, W.H.P.; Ludwig, F.; Haddeland, I.; Kabat, P.

    2012-01-01

    Realistic estimates of daily streamflow and water temperature are required for effective management of water resources (e.g. for electricity and drinking water production) and freshwater ecosystems. Although hydrological and process-based water temperature modelling approaches have been successfully

  15. Stochastic model for simulating Souris River Basin precipitation, evapotranspiration, and natural streamflow

    Science.gov (United States)

    Kolars, Kelsey A.; Vecchia, Aldo V.; Ryberg, Karen R.

    2016-02-24

    The Souris River Basin is a 61,000-square-kilometer basin in the Provinces of Saskatchewan and Manitoba and the State of North Dakota. In May and June of 2011, record-setting rains were seen in the headwater areas of the basin. Emergency spillways of major reservoirs were discharging at full or nearly full capacity, and extensive flooding was seen in numerous downstream communities. To determine the probability of future extreme floods and droughts, the U.S. Geological Survey, in cooperation with the North Dakota State Water Commission, developed a stochastic model for simulating Souris River Basin precipitation, evapotranspiration, and natural (unregulated) streamflow. Simulations from the model can be used in future studies to simulate regulated streamflow, design levees and other structures, and complete economic cost/benefit analyses. Long-term climatic variability was analyzed using tree-ring chronologies to hindcast precipitation to the early 1700s and compare recent wet and dry conditions to earlier extreme conditions. The extended precipitation record was consistent with findings from the Devils Lake and Red River of the North Basins (southeast of the Souris River Basin), supporting the idea that regional climatic patterns for many centuries have consisted of alternating wet and dry climate states. A stochastic climate simulation model for precipitation, temperature, and potential evapotranspiration for the Souris River Basin was developed using recorded meteorological data and extended precipitation records provided through tree-ring analysis. A significant climate transition was seen around 1970, with 1912–69 representing a dry climate state and 1970–2011 representing a wet climate state. Although there were some distinct subpatterns within the basin, the predominant differences between the two states were higher spring through early fall precipitation and higher spring potential evapotranspiration for the wet compared to the dry state. A water

  16. Effects of modeling decisions on cold region hydrological model performance: snow, soil and streamflow

    Science.gov (United States)

    Musselman, Keith; Clark, Martyn; Endalamaw, Abraham; Bolton, W. Robert; Nijssen, Bart; Arnold, Jeffrey

    2017-04-01

    Cold regions are characterized by intense spatial gradients in climate, vegetation and soil properties that determine the complex spatiotemporal patterns of snowpack evolution, frozen soil dynamics, catchment connectivity, and streamflow. These spatial gradients pose unique challenges for hydrological models, including: 1) how the spatial variability of the physical processes are best represented across a hierarchy of scales, and 2) what algorithms and parameter sets best describe the biophysical and hydrological processes at the spatial scale of interest. To address these topics, we apply the Structure for Unifying Multiple Modeling Alternatives (SUMMA) to simulate hydrological processes at the Caribou - Poker Creeks Research Watershed in the Alaskan sub-arctic Boreal forest. The site is characterized by numerous gauged headwater catchments ranging in size from 5 sq. km to 106 sq. km with varying extents (3% to 53%) of discontinuous permafrost that permits a multi-scale paired watershed analysis of the hydrological impacts of frozen soils. We evaluate the effects of model decisions on the skill of SUMMA to simulate observed snow and soil dynamics, and the spatial integration of these processes as catchment streamflow. Decisions such as the number of soil layers, total soil column depth, and vertical soil discretization are shown to have profound impacts on the simulation of seasonal active layer dynamics. Decisions on the spatial organization (lateral connectivity, representation of riparian response units, and the spatial discretization of the hydrological landscape) are shown to be as important as accurate snowpack and soil process representation in the simulation of streamflow. The work serves to better inform hydrological model decisions for cold region hydrologic evaluation and to improve predictive capacity for water resource planning.

  17. Streamflow forecasting using the modular modeling system and an object-user interface

    Science.gov (United States)

    Jeton, A.E.

    2001-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Bureau of Reclamation (BOR), developed a computer program to provide a general framework needed to couple disparate environmental resource models and to manage the necessary data. The Object-User Interface (OUI) is a map-based interface for models and modeling data. It provides a common interface to run hydrologic models and acquire, browse, organize, and select spatial and temporal data. One application is to assist river managers in utilizing streamflow forecasts generated with the Precipitation-Runoff Modeling System running in the Modular Modeling System (MMS), a distributed-parameter watershed model, and the National Weather Service Extended Streamflow Prediction (ESP) methodology.

  18. Evaluation of model-based seasonal streamflow and water allocation forecasts for the Elqui Valley, Chile

    Science.gov (United States)

    Delorit, Justin; Cristian Gonzalez Ortuya, Edmundo; Block, Paul

    2017-09-01

    In many semi-arid regions, multisectoral demands often stress available water supplies. Such is the case in the Elqui River valley of northern Chile, which draws on a limited-capacity reservoir to allocate 25 000 water rights. Delayed infrastructure investment forces water managers to address demand-based allocation strategies, particularly in dry years, which are realized through reductions in the volume associated with each water right. Skillful season-ahead streamflow forecasts have the potential to inform managers with an indication of future conditions to guide reservoir allocations. This work evaluates season-ahead statistical prediction models of October-January (growing season) streamflow at multiple lead times associated with manager and user decision points, and links predictions with a reservoir allocation tool. Skillful results (streamflow forecasts outperform climatology) are produced for short lead times (1 September: ranked probability skill score (RPSS) of 0.31, categorical hit skill score of 61 %). At longer lead times, climatological skill exceeds forecast skill due to fewer observations of precipitation. However, coupling the 1 September statistical forecast model with a sea surface temperature phase and strength statistical model allows for equally skillful categorical streamflow forecasts to be produced for a 1 May lead, triggered for 60 % of years (1950-2015), suggesting forecasts need not be strictly deterministic to be useful for water rights holders. An early (1 May) categorical indication of expected conditions is reinforced with a deterministic forecast (1 September) as more observations of local variables become available. The reservoir allocation model is skillful at the 1 September lead (categorical hit skill score of 53 %); skill improves to 79 % when categorical allocation prediction certainty exceeds 80 %. This result implies that allocation efficiency may improve when forecasts are integrated into reservoir decision frameworks. The
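
    The ranked probability skill score used above is straightforward to compute; the sketch below is illustrative, assuming categorical (e.g. tercile) forecasts, and scores forecast probabilities against observed categories relative to a climatological reference.

        import numpy as np

        def rps(forecast_probs, obs_category):
            # squared distance between cumulative forecast probabilities and the
            # cumulative indicator of the observed category
            cdf_fcst = np.cumsum(forecast_probs)
            cdf_obs = np.zeros_like(cdf_fcst)
            cdf_obs[obs_category:] = 1.0
            return float(np.sum((cdf_fcst - cdf_obs) ** 2))

        def rpss(forecasts, observations, climatology):
            # RPSS > 0 means the forecasts outperform the climatological reference
            rps_fcst = np.mean([rps(f, o) for f, o in zip(forecasts, observations)])
            rps_clim = np.mean([rps(climatology, o) for o in observations])
            return 1.0 - rps_fcst / rps_clim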

  19. Quantifying uncertainties in streamflow predictions through signature based inference of hydrological model parameters

    Science.gov (United States)

    Fenicia, Fabrizio; Reichert, Peter; Kavetski, Dmitri; Albert, Carlo

    2016-04-01

    The calibration of hydrological models based on signatures (e.g. Flow Duration Curves - FDCs) is often advocated as an alternative to model calibration based on the full time series of system responses (e.g. hydrographs). Signature based calibration is motivated by various arguments. From a conceptual perspective, calibration on signatures is a way to filter out errors that are difficult to represent when calibrating on the full time series. Such errors may for example occur when observed and simulated hydrographs are shifted, either on the "time" axis (i.e. left or right), or on the "streamflow" axis (i.e. above or below). These shifts may be due to errors in the precipitation input (time or amount), and if not properly accounted in the likelihood function, may cause biased parameter estimates (e.g. estimated model parameters that do not reproduce the recession characteristics of a hydrograph). From a practical perspective, signature based calibration is seen as a possible solution for making predictions in ungauged basins. Where streamflow data are not available, it may in fact be possible to reliably estimate streamflow signatures. Previous research has for example shown how FDCs can be reliably estimated at ungauged locations based on climatic and physiographic influence factors. Typically, the goal of signature based calibration is not the prediction of the signatures themselves, but the prediction of the system responses. Ideally, the prediction of system responses should be accompanied by a reliable quantification of the associated uncertainties. Previous approaches for signature based calibration, however, do not allow reliable estimates of streamflow predictive distributions. Here, we illustrate how the Bayesian approach can be employed to obtain reliable streamflow predictive distributions based on signatures. A case study is presented, where a hydrological model is calibrated on FDCs and additional signatures. We propose an approach where the likelihood
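
    A flow duration curve, the signature most often used in this context, can be computed as in the short sketch below (Weibull plotting positions are an assumption; other plotting-position formulas are equally common).

        import numpy as np

        def flow_duration_curve(q):
            # flows sorted in decreasing order against their exceedance probabilities
            q_sorted = np.sort(np.asarray(q, dtype=float))[::-1]
            n = q_sorted.size
            exceedance = np.arange(1, n + 1) / (n + 1)   # Weibull plotting positions
            return exceedance, q_sorted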

  20. Watershed-scale modeling of streamflow change in incised montane meadows

    Science.gov (United States)

    Essaid, Hedeff I.; Hill, Barry R.

    2014-01-01

    Land use practices have caused stream channel incision and water table decline in many montane meadows of the Western United States. Incision changes the magnitude and timing of streamflow in water supply source watersheds, a concern to resource managers and downstream water users. The hydrology of montane meadows under natural and incised conditions was investigated using watershed simulation for a range of hydrologic conditions. The results illustrate the interdependence between: watershed and meadow hydrology; bedrock and meadow aquifers; and surface and groundwater flow through the meadow for the modeled scenarios. During the wet season, stream incision resulted in less overland flow and interflow and more meadow recharge causing a net decrease in streamflow and increase in groundwater storage relative to natural meadow conditions. During the dry season, incision resulted in less meadow evapotranspiration and more groundwater discharge to the stream causing a net increase in streamflow and a decrease in groundwater storage relative to natural meadow conditions. In general, for a given meadow setting, the magnitude of change in summer streamflow and long-term change in watershed groundwater storage due to incision will depend on the combined effect of: reduced evapotranspiration in the eroded meadow; induced groundwater recharge; replenishment of dry season groundwater storage depletion in meadow and bedrock aquifers by precipitation during wet years; and groundwater storage depletion that is not replenished by precipitation during wet years.

  1. Comparison of performance of statistical models in forecasting monthly streamflow of Kizil River,China

    Institute of Scientific and Technical Information of China (English)

    Shalamu ABUDU; Chun-liang CUI; James Phillip KING; Kaiser ABUDUKADEER

    2010-01-01

    This paper presents the application of autoregressive integrated moving average (ARIMA), seasonal ARIMA (SARIMA), and Jordan-Elman artificial neural network (ANN) models in forecasting the monthly streamflow of the Kizil River in Xinjiang, China. Two different types of monthly streamflow data (original and deseasonalized data) were used to develop time series and Jordan-Elman ANN models using previous flow conditions as predictors. The one-month-ahead forecasting performances of all models for the testing period (1998-2005) were compared using the average monthly flow data from the Kalabeili gaging station on the Kizil River. The Jordan-Elman ANN models, using previous flow conditions as inputs, resulted in no significant improvement over time series models in one-month-ahead forecasting. The results suggest that the simple time series models (ARIMA and SARIMA) can be used in one-month-ahead streamflow forecasting at the study site with a simple and explicit model structure and a model performance similar to the Jordan-Elman ANN models.
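
    One-month-ahead forecasts with ARIMA/SARIMA models of the kind compared above can be produced with statsmodels; the sketch below uses illustrative model orders rather than those identified for the Kizil River.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        def one_month_ahead(history, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12)):
            # fit a (S)ARIMA model to the monthly flows observed so far and forecast
            # the next month; the orders shown are placeholders
            model = ARIMA(np.asarray(history, dtype=float),
                          order=order, seasonal_order=seasonal_order)
            return float(model.fit().forecast(steps=1)[0])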

  2. Statistical-dynamical long-range seasonal forecasting of streamflow with the North-American Multi Model Ensemble (NMME)

    Science.gov (United States)

    Slater, Louise; Villarini, Gabriele

    2017-04-01

    There are two main approaches to long-range (monthly to seasonal) streamflow forecasting: statistical approaches that typically relate climate precursors directly to streamflow, and dynamical physically-based approaches in which spatially distributed models are forced with downscaled meteorological forecasts. While the former approach is potentially limited by a lack of physical causality, the latter tends to be complex and time-consuming to implement. In contrast, hybrid statistical-dynamical techniques that use global climate model (GCM) ensemble forecasts as inputs to statistical models are both physically-based and rapid to run, but are a relatively new field of research. Here, we conduct the first systematic multimodel statistical-dynamical forecasting of streamflow using NMME climate forecasts from eight GCMs (CCSM3, CCSM4, CanCM3, CanCM4, GFDL2.1, FLORb01, GEOS5, and CFSv2) across a broad region. At several hundred U.S. Midwest stream gauges with long (50+ continuous years) streamflow records, we fit probabilistic statistical models for seasonal streamflow percentiles ranging from minimum to maximum flows. As predictors, we use basin-averaged values of precipitation, antecedent wetness, temperature, agricultural row crop acreage, and population density. Using the observed data, we select the best-fitting probabilistic model for every site, season, and streamflow percentile (ranging from low to high flows). The best-fitting models are then used to obtain streamflow predictions by incorporating the NMME climate forecasts and the extrapolated agricultural and population time series as predictors. The forecasting skill of our models is assessed using both deterministic and probabilistic verification measures. The influence of the different predictors is evaluated for all streamflow percentiles and across the full range of lead times. Our findings reveal that statistical-dynamical streamflow forecasting produces promising results, which may enable water managers

  3. Improving urban streamflow forecasting using a high-resolution large scale modeling framework

    Science.gov (United States)

    Read, Laura; Hogue, Terri; Gochis, David; Salas, Fernando

    2016-04-01

    Urban flood forecasting is a critical component in effective water management, emergency response, regional planning, and disaster mitigation. As populations across the world continue to move to cities (~1.8% growth per year), and studies indicate that significant flood damages are occurring outside the floodplain in urban areas, the ability to model and forecast flow over the urban landscape becomes critical to maintaining infrastructure and society. In this work, we use the Weather Research and Forecasting- Hydrological (WRF-Hydro) modeling framework as a platform for testing improvements to representation of urban land cover, impervious surfaces, and urban infrastructure. The three improvements we evaluate include: updating the land cover to the latest 30-meter National Land Cover Dataset, routing flow over a high-resolution 30-meter grid, and testing a methodology for integrating an urban drainage network into the routing regime. We evaluate performance of these improvements in the WRF-Hydro model for specific flood events in the Denver-Metro Colorado domain, comparing to historic gaged streamflow for retrospective forecasts. Denver-Metro provides an interesting case study as it is a rapidly growing urban/peri-urban region with an active history of flooding events that have caused significant loss of life and property. Considering that the WRF-Hydro model will soon be implemented nationally in the U.S. to provide flow forecasts on the National Hydrography Dataset Plus river reaches - increasing capability from 3,600 forecast points to 2.7 million, we anticipate that this work will support validation of this service in urban areas for operational forecasting. Broadly, this research aims to provide guidance for integrating complex urban infrastructure with a large-scale, high resolution coupled land-surface and distributed hydrologic model.

  4. Modelling the effects of land use changes on the streamflow of a peri-urban catchment in central Portugal

    Science.gov (United States)

    Hävermark, Saga; Santos Ferreira, Carla Sofia; Kalantari, Zahra; Di Baldassarre, Giuliano

    2016-04-01

    was calibrated for the hydrological years 2008 to 2010 and validated for the three following years using streamflow data. The impact of future land use changes was analysed by investigating the impact of the size and location of the urban areas within the catchment. Modelling results are expected to support the decision making process in planning and developing new urban areas.

  5. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    Science.gov (United States)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameters sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork America River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and Quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.

  6. A multitemporal remote sensing approach to parsimonious streamflow modeling in a southcentral Texas watershed, USA

    Directory of Open Access Journals (Sweden)

    B. P. Weissling

    2007-01-01

    Soil moisture condition plays a vital role in a watershed's hydrologic response to a precipitation event and is thus parameterized in most, if not all, rainfall-runoff models. Yet the soil moisture condition antecedent to an event has proven difficult to quantify both spatially and temporally. This study assesses the potential to parameterize a parsimonious streamflow prediction model solely utilizing precipitation records and multi-temporal remotely sensed biophysical variables (i.e., from the Moderate Resolution Imaging Spectroradiometer (MODIS)/Terra satellite). This study is conducted on a 1420 km2 rural watershed in the Guadalupe River basin of southcentral Texas, a basin prone to catastrophic flooding from convective precipitation events. A multiple regression model, accounting for 78% of the variance of observed streamflow for calendar year 2004, was developed based on gauged precipitation, land surface temperature, and enhanced vegetation index (EVI), on an 8-day interval. These results compared favorably with streamflow estimations utilizing the Natural Resources Conservation Service (NRCS) curve number method and the 5-day antecedent moisture model. This approach has great potential for developing near real-time predictive models for flood forecasting and can be used as a tool for flood management in any region for which similar remotely sensed data are available.

  7. A multitemporal remote sensing approach to parsimonious streamflow modeling in a southcentral Texas watershed, USA

    Science.gov (United States)

    Weissling, B. P.; Xie, H.; Murray, K. E.

    2007-01-01

    Soil moisture condition plays a vital role in a watershed's hydrologic response to a precipitation event and is thus parameterized in most, if not all, rainfall-runoff models. Yet the soil moisture condition antecedent to an event has proven difficult to quantify both spatially and temporally. This study assesses the potential to parameterize a parsimonious streamflow prediction model solely utilizing precipitation records and multi-temporal remotely sensed biophysical variables (i.e., from the Moderate Resolution Imaging Spectroradiometer (MODIS)/Terra satellite). This study is conducted on a 1420 km2 rural watershed in the Guadalupe River basin of southcentral Texas, a basin prone to catastrophic flooding from convective precipitation events. A multiple regression model, accounting for 78% of the variance of observed streamflow for calendar year 2004, was developed based on gauged precipitation, land surface temperature, and Enhanced Vegetation Index (EVI), on an 8-day interval. These results compared favorably with streamflow estimations utilizing the Natural Resources Conservation Service (NRCS) curve number method and the 5-day antecedent moisture model. This approach has great potential for developing near real-time predictive models for flood forecasting and can be used as a tool for flood management in any region for which similar remotely sensed data are available.
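
    The kind of multiple regression described above can be sketched in a few lines: streamflow is regressed on gauged precipitation, land surface temperature and EVI aggregated to 8-day intervals. The synthetic data and coefficients below are placeholders, not the study's data or fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 46  # 8-day composites in one year

# Hypothetical 8-day predictors
precip = rng.gamma(2.0, 10.0, n)                                              # mm per period
lst = 290 + 10 * np.sin(np.linspace(0, 2 * np.pi, n)) + rng.normal(0, 1, n)  # K
evi = 0.3 + 0.2 * np.sin(np.linspace(0, 2 * np.pi, n) - 1.0) + rng.normal(0, 0.02, n)
flow = 0.4 * precip - 0.8 * (lst - 290) - 20 * evi + 15 + rng.normal(0, 2, n)  # m^3/s

# Ordinary least squares: flow ~ intercept + precip + LST + EVI
X = np.column_stack([np.ones(n), precip, lst, evi])
beta, *_ = np.linalg.lstsq(X, flow, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((flow - pred) ** 2) / np.sum((flow - flow.mean()) ** 2)
print("coefficients:", np.round(beta, 3))
print("R^2 = %.3f" % r2)
```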

  8. Artificial intelligence based models for stream-flow forecasting: 2000-2015

    Science.gov (United States)

    Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba

    2015-11-01

    The use of Artificial Intelligence (AI) has increased since the middle of the 20th century as seen in its application in a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on defining the data-driven nature of AI, the advantages of complementary models, as well as the literature and their possible future application in modeling and forecasting stream-flow. The review also identifies the major challenges and opportunities for prospective research, including a new scheme for modeling the inflow, a novel method for preprocessing time series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.

  9. Characterizing streamflow generation in Alpine catchments

    Science.gov (United States)

    Chiogna, Gabriele; Cano Paoli, Karina; Bellin, Alberto

    2016-04-01

    framework to develop and validate hydrological models in this rather complex context, where several sources of streamflow combine in a complex way depending on precipitation and air temperature as meteorological forcing.

  10. Quantification of Modeled Streamflow under Climate Change over the Flint River Watershed in Northern Alabama

    Science.gov (United States)

    Acharya, A.; Tadesse, W.; Lemke, D.; Subedi, S.

    2015-12-01

    This study is carried out to quantify the impacts of climate change and land use change on water availability over the Flint River watershed (FRW) located in the Wheeler Lake watershed in Northern Alabama. The FRW directly drains into the Tennessee River, which is a major source of water supply for the Southern States. The observed precipitation and temperature data are obtained from the Alabama Mesonet stations, which are also part of the Natural Resources Conservation Service (USDA NRCS) Soil Climate Analysis Network (SCAN). The GCM-simulated climate data are obtained from the WCRP CMIP5 ensemble that consists of 234 downscaled climate projections from four emission scenarios and 37 GCMs. The hydrologic model SWAT is calibrated and validated for a period of 2004 to 2014, based on daily meteorological forcing and monthly streamflow data for the FRW. A total of 15 parameters that directly influence the surface/base flow and basin response are selected and calibrated. The anticipated change in future climate (2030s, 2050s, 2070s, 2090s) with respect to the baseline period (2004-2014/2010s) for each emission scenario is introduced into the baseline climate to perturb it toward the future climate pattern. Various climate scenarios based on future climate are forced into the calibrated SWAT model to quantify future water availability over the basin and compared with the baseline period. This is a part of the Geospatial Education and Research Center (GERC) project and the major research findings from this project will help decision makers in evaluating the combined impacts of climate change and land use change on water availability, and developing strategies to sustain available natural resources.
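
    Perturbing a baseline climate toward a future pattern, as described above, is commonly done with delta-change factors; the sketch below applies a hypothetical temperature offset and precipitation scaling to a baseline series before it is passed to the calibrated model. The change factors are placeholders, not values derived from the CMIP5 ensemble.

```python
import numpy as np

def delta_change(precip_base, temp_base, precip_factor, temp_offset):
    """Perturb a baseline daily climate series toward a future climate pattern.

    precip_factor : multiplicative change in precipitation (e.g. 0.93 for -7 %)
    temp_offset   : additive change in temperature (degrees C)
    """
    return precip_base * precip_factor, temp_base + temp_offset

# Hypothetical baseline daily series standing in for the 2004-2014 record
rng = np.random.default_rng(1)
precip_base = rng.gamma(0.8, 6.0, 365)                                        # mm/day
temp_base = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, 365)) + rng.normal(0, 2, 365)

# Example changes for a mid-century scenario (placeholder values)
precip_2050s, temp_2050s = delta_change(precip_base, temp_base, 0.93, 2.1)
print("baseline mean P: %.2f mm/day, perturbed: %.2f mm/day"
      % (precip_base.mean(), precip_2050s.mean()))
print("baseline mean T: %.1f C, perturbed: %.1f C"
      % (temp_base.mean(), temp_2050s.mean()))
```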

  11. Streamflow simulation by a watershed model using stochastically generated weather in New York City watersheds

    Science.gov (United States)

    Mukundan, R.; Acharya, N.; Gelda, R.; Owens, E. M.; Frei, A.; Schneiderman, E. M.

    2016-12-01

    Recent studies have reported increasing trends in total precipitation, and in the frequency and magnitude of extreme precipitation events in the West of Hudson (WOH) watersheds of the New York City (NYC) water supply. The potential effects of these changes may pose challenges for both water quality (such as increased sediment and nutrient loading) and quantity (such as reservoir storage and management). The NYC Dept. of Environmental Protection Climate Change Integrated Modeling Project (CCIMP) is using "bottom-up" or vulnerability based methods to explore climate impacts on water resources. Stochastic weather generators (SWGs) are an integral component of the bottom-up approach. Previous work has identified and evaluated the skill of alternative stochastic weather generators of varying complexity for simulating the statistical characteristics of observed minimum and maximum daily air temperature and occurrence and amount of precipitation. This evaluation focused on the skill in representing extreme streamflow event probabilities across NYC West of Hudson (WOH) watersheds. Synthetic weather time series from the selected (skewed normal) SWG were used to drive the Generalized Watershed Loading Function (GWLF) watershed model for a 600 year long period to simulate daily streamflows for WOH watersheds under a wide range of hydrologic conditions. Long-term average daily streamflows generated using the synthetic weather time series were comparable to values generated using observed long-term (1950-2009) weather time series. This study demonstrates the ability of the selected weather generator to adequately represent the hydrologic response in WOH watersheds with respect to the total, peak, and seasonality in streamflows. Future application of SWGs in NYC watersheds will include generating multiple scenarios of changing climate to evaluate water supply system vulnerability and selection of appropriate adaptation measures.
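
    The skewed-normal weather generator selected in the study is not reproduced here; as a minimal illustration of how a stochastic weather generator produces long synthetic series, the sketch below combines a two-state Markov chain for precipitation occurrence with gamma-distributed wet-day amounts, the classic building blocks of many SWGs. All parameter values are illustrative.

```python
import numpy as np

def generate_precip(n_days, p_wet_given_dry=0.25, p_wet_given_wet=0.6,
                    shape=0.7, scale=8.0, seed=0):
    """Synthetic daily precipitation: Markov-chain occurrence, gamma amounts (mm)."""
    rng = np.random.default_rng(seed)
    precip = np.zeros(n_days)
    wet = False
    for t in range(n_days):
        p_wet = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p_wet
        if wet:
            precip[t] = rng.gamma(shape, scale)
    return precip

# A 600-year-long synthetic record, as in the study, is simply a long series
synthetic = generate_precip(600 * 365)
print("mean annual precipitation: %.0f mm" % (synthetic.sum() / 600))
print("wet-day fraction: %.2f" % (synthetic > 0).mean())
```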

  12. Impact of LUCC on streamflow based on the SWAT model over the Wei River basin on the Loess Plateau in China

    Science.gov (United States)

    Wang, Hong; Sun, Fubao; Xia, Jun; Liu, Wenbin

    2017-04-01

    Under the Grain for Green Project in China, vegetation recovery construction has been widely implemented on the Loess Plateau for the purpose of soil and water conservation. Now it is becoming controversial whether the recovery construction involving vegetation, particularly forest, is reducing the streamflow in the rivers of the Yellow River basin. In this study, we chose the Wei River basin, the largest tributary of the Yellow River and an area with extensive revegetation construction, as the study area. To do so, we apply the widely used Soil and Water Assessment Tool (SWAT) model for the upper and middle reaches of the Wei River basin. The SWAT model was forced with daily observed meteorological forcings (1960-2009), calibrated against daily streamflow for 1960-1969, validated for 1970-1979, and used for analysis for 1980-2009. To investigate the impact of LUCC (land use and land cover change) on the streamflow, we first use two observed land use maps from 1980 and 2005 that are based on national land survey statistics merged with satellite observations. We found that the mean streamflow generated by using the 2005 land use map decreased in comparison with that using the 1980 one, with the same meteorological forcings. Of particular interest here is that the streamflow decreased on agricultural land but increased in forest areas. More specifically, the surface runoff, soil flow, and baseflow all decreased on agricultural land, while the soil flow and baseflow of forest areas increased. To investigate that, we then designed five scenarios: (S1) the present land use (1980), and scenarios in which (S2) 10 %, (S3) 20 %, (S4) 40 %, and (S5) 100 % of agricultural land is converted into mixed forest. We found that the streamflow consistently increased with agricultural land converted into forest by about 7.4 mm per 10 %. Our modeling results suggest that forest recovery construction has a positive impact on both soil flow and baseflow by compensating for reduced surface runoff, which leads

  13. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged, but the views have not succeeded in capturing the diversity of validation methods. The wide variety of models with regard to their purpose, character, field of application and time dimension inherently calls for a similar diversity in validation approaches. The prevailing notion of validation has been somewhat narrow-minded, reducing validation to the establishment of truth. This article puts forward the diversity in applications of simulation models that demands a corresponding diversity in the notion of validation. A classification of models in terms of the mentioned elements is presented and used to shed light on possible types of validation.

  14. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  15. Simultaneous assimilation of in situ soil moisture and streamflow in the SWAT model using the Extended Kalman Filter

    Science.gov (United States)

    Sun, Leqiang; Seidou, Ousmane; Nistor, Ioan; Goïta, Kalifa; Magagi, Ramata

    2016-12-01

    The Extended Kalman Filter (EKF) is used to assimilate in situ surface soil moisture and streamflow observations at the outlet of an experimental watershed into a semi-distributed SWAT (Soil and Water Assessment Tool) model. Watershed-scale soil moisture, instead of HRU-scale soil moisture, was used in the state vector to reduce the computational burden. Numerical experiments were designed to select the best state vector, which consists of streamflow and soil moisture in all vertical soil layers. Compared to the open-loop model and the direct-insertion method, the estimates of both soil moisture and streamflow have been improved by EKF assimilation. The combined assimilation of surface soil moisture and streamflow outperforms the assimilation of only surface soil moisture or streamflow, especially in the estimate of full-profile soil moisture. The NSC has been improved to 0.63 from -4.45 and the RMSE has been reduced to 12.34 mm from 47.44 mm in the open-loop run. Such improvement is also reflected in the short-term forecast of soil moisture. The improvement of streamflow prediction is relatively moderate in both simulation and forecast mode compared to the quality of the soil moisture prediction. The quantification of the model error, especially the error covariance between different state variables, was found to be critical to the estimate of the state variable corresponding to the error covariance.
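
    A bare-bones sketch of the Kalman analysis step behind the assimilation described above, with a state vector of layer soil moistures plus streamflow and direct observations of surface soil moisture and outlet streamflow. The matrices are illustrative placeholders; the SWAT coupling, the linearisation of the model operator and the actual error covariances are not reproduced.

```python
import numpy as np

# State: [soil moisture layer 1, layer 2, layer 3 (mm), streamflow (m^3/s)]
x_b = np.array([25.0, 60.0, 110.0, 8.0])          # background (open-loop) state
P_b = np.diag([16.0, 36.0, 64.0, 4.0])            # background error covariance

# Observations: surface soil moisture and streamflow at the outlet
y = np.array([30.0, 9.5])
H = np.array([[1.0, 0.0, 0.0, 0.0],               # observe layer-1 soil moisture
              [0.0, 0.0, 0.0, 1.0]])              # observe streamflow
R = np.diag([4.0, 1.0])                           # observation error covariance

# Kalman gain and analysis update
K = P_b @ H.T @ np.linalg.inv(H @ P_b @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
P_a = (np.eye(4) - K @ H) @ P_b

print("analysis state:", np.round(x_a, 2))
```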

  16. Regression modeling of streamflow, baseflow, and runoff using geographic information systems.

    Science.gov (United States)

    Zhu, Yuanhong; Day, Rick L

    2009-02-01

    Regression models for predicting total streamflow (TSF), baseflow (TBF), and storm runoff (TRO) are needed for water resource planning and management. This study used 54 streams with >20 years of streamflow gaging station records during the period October 1971 to September 2001 in Pennsylvania and partitioned TSF into TBF and TRO. TBF was considered a surrogate of groundwater recharge for basins. Regression models for predicting basin-wide TSF, TBF, and TRO were developed under three scenarios that varied in regression variables used for model development. Regression variables representing basin geomorphological, geological, soil, and climatic characteristics were estimated using geographic information systems. All regression models for TSF, TBF, and TRO had R² values >0.94 and reasonable prediction errors. The two best TSF models developed under scenarios 1 and 2 had similar absolute prediction errors. The same was true for the two best TBF models. Therefore, any one of the two best TSF and TBF models could be used for respective flow prediction depending on variable availability. The TRO model developed under scenario 1 had smaller absolute prediction errors than that developed under scenario 2. Simplified area-alone models developed under scenario 3 might be used when variables for using the best models are not available, but had lower R² values and higher or more variable prediction errors than the best models.
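
    The abstract does not state which partitioning method was used to split TSF into TBF and TRO; as one common illustration of baseflow separation, the sketch below applies a one-parameter recursive digital filter of the Lyne-Hollick form, which is only one of several possible choices.

```python
import numpy as np

def baseflow_filter(q, alpha=0.925):
    """One-parameter recursive digital filter (Lyne-Hollick form), single pass.

    Returns (baseflow, stormflow); baseflow is constrained to 0 <= b <= q.
    """
    q = np.asarray(q, float)
    quick = np.zeros_like(q)
    for t in range(1, len(q)):
        quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = min(max(quick[t], 0.0), q[t])   # keep stormflow within bounds
    base = q - quick
    return base, quick

# Hypothetical daily streamflow with two storm peaks (m^3/s)
q = np.array([3, 3, 4, 12, 25, 14, 8, 6, 5, 4, 9, 18, 10, 7, 5, 4], float)
tbf, tro = baseflow_filter(q)
print("baseflow index (TBF/TSF): %.2f" % (tbf.sum() / q.sum()))
```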

  17. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  18. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2012-06-01

    Full Text Available This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this model, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The model includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used not only for simulating hydrological processes, but also for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity.

  19. An educational model for ensemble streamflow simulation and uncertainty analysis

    National Research Council Canada - National Science Library

    AghaKouchak, A; Nakhjiri, N; Habib, E

    2013-01-01

    ...) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity...

  20. Numerical Validation of a Diurnal Streamflow-Pattern- Based Evapotranspiration Estimation Method

    Directory of Open Access Journals (Sweden)

    GRIBOVSZKI, Zoltán

    2011-01-01

    Full Text Available The evapotranspiration (ET) estimation method by Gribovszki et al. (2010b) has so far been validated only at one catchment because good quality discharge time series with the required high enough temporal resolution can probably be found at only a handful of watersheds worldwide. To fill in the gap of measured data, synthetic groundwater discharge values were produced by a 2D finite element model representing a small catchment. Geometrical and soil physical parameters of the numerical model were changed systematically and it was checked how well the model reproduced the prescribed ET time series. The tests corroborated that the ET-estimation method is applicable for catchments underlain by a shallow aquifer. The slope of the riparian zone has a strong impact on the accuracy of the ET results when the slope is steep; however, the method proved to be reliable for gentle or horizontal riparian zone surfaces, which are more typical in reality. Likewise, errors slightly increase with the decrease of riparian zone width, and unless this width is comparable to the width of the stream (the case of a narrow riparian zone), the ET estimates stay fairly accurate. The steepness of the valley slope had no significant effect on the results but the increase of the stream width (over 4 m) strongly influences the ET estimation results, so this method can only be used for small headwater catchments. Finally, even a magnitude change in the prescribed ET rates had only a small effect on the estimation accuracy. The soil physical parameters, however, strongly influence the accuracy of the method. The model-prescribed ET values are recovered exactly only for the sandy-loam aquifer, because only in this case was the model groundwater flow system similar to the assumed, theoretical one. For a low hydraulic conductivity aquifer (e.g. clay, silt), root water uptake creates a considerably depressed water table under the riparian zone, therefore the method underestimates the ET. In a sandy

  1. A modified simple dynamic model: Derived from the information embedded in observed streamflows

    Science.gov (United States)

    Li, Wei; Nieber, John L.

    2017-09-01

    A zero-dimension hydrological model has been developed to simulate the discharge (Q) from watershed groundwater storage (S). The model is a modified version of the original model developed by Kirchner in 2009, which uses a unique sensitivity function, g(Q), to represent the relation between the rate of flow recession and the instantaneous flow rate. The modified dynamic model instead uses a normalized sensitivity function g(Qnorm), which provides the model the flexibility to encompass the hysteretic effect of initial water storage on flow during recession periods. The sensitivity function is normalized based on a correlation function F(Q) which implicitly quantifies the influence of initial storage conditions on recession flow dynamics. For periods of either positive or negative net recharge to groundwater, the model applies a term similar in form to an analytical solution of the linearized Boussinesq equation. The combination of these two streamflow components, the recession component and the net recharge response, provides the model with the flexibility to realistically mimic the hysteresis in the Q vs. S relations for a watershed. The model is applied to the Sagehen Creek watershed, a hilly watershed located in the Sierra Nevada of California. The results show that the modified model has improved performance in simulating the discharge dynamics encompassing a wide range of water storage (degree of wetness), representing an almost ten-fold variation in annual streamflow.
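
    The core idea behind a Kirchner-type model is the storage-discharge sensitivity function g(Q) = dQ/dS, which can be estimated from rain-free recession periods where dS/dt ≈ -Q, so that g(Q) ≈ -(dQ/dt)/Q. The sketch below illustrates that estimation on a synthetic recession; it does not reproduce the authors' normalized g(Qnorm) or the correlation function F(Q).

```python
import numpy as np

# Synthetic recession flows from a hypothetical nonlinear reservoir (daily, m^3/s)
t = np.arange(60.0)
q = 20.0 / (1.0 + 0.08 * t) ** 2          # an illustrative recession curve

# During rain-free recessions dS/dt = -Q, so g(Q) = dQ/dS = -(dQ/dt)/Q
dq_dt = np.gradient(q, t)
g = -dq_dt / q

# Fit ln g(Q) as a quadratic in ln Q, as in Kirchner (2009)
coeffs = np.polyfit(np.log(q), np.log(g), deg=2)
print("coefficients of ln g(Q) = c2*(ln Q)^2 + c1*ln Q + c0:", np.round(coeffs, 3))
```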

  2. A simple model for assessing utilisable streamflow allocations in the ...

    African Journals Online (AJOL)

    2006-07-03

    Jul 3, 2006 ... It should be noted, however, that any water resource systems model is faced with the ... mendations made to decision makers (including water resource managers and river basin ... but should rather be viewed as a preliminary analysis method that can be used ..... some decision support information (Fig. 4).

  3. HYPERstream: a multi-scale framework for streamflow routing in large-scale hydrological model

    Science.gov (United States)

    Piccolroaz, Sebastiano; Di Lazzaro, Michele; Zarlenga, Antonio; Majone, Bruno; Bellin, Alberto; Fiori, Aldo

    2016-05-01

    We present HYPERstream, an innovative streamflow routing scheme based on the width function instantaneous unit hydrograph (WFIUH) theory, which is specifically designed to facilitate coupling with weather forecasting and climate models. The proposed routing scheme preserves geomorphological dispersion of the river network when dealing with horizontal hydrological fluxes, irrespective of the computational grid size inherited from the overlying climate model providing the meteorological forcing. This is achieved by simulating routing within the river network through suitable transfer functions obtained by applying the WFIUH theory to the desired level of detail. The underlying principle is similar to the block-effective dispersion employed in groundwater hydrology, with the transfer functions used to represent the effect on streamflow of morphological heterogeneity at scales smaller than the computational grid. Transfer functions are constructed for each grid cell with respect to the nodes of the network where streamflow is simulated, by taking advantage of the detailed morphological information contained in the digital elevation model (DEM) of the zone of interest. These characteristics make HYPERstream well suited for multi-scale applications, ranging from catchment up to continental scale, and to investigate extreme events (e.g., floods) that require an accurate description of routing through the river network. The routing scheme enjoys parsimony in the adopted parametrization and computational efficiency, leading to a dramatic reduction of the computational effort with respect to fully gridded models at a comparable level of accuracy. HYPERstream is designed with a simple and flexible modular structure that allows for the selection of any rainfall-runoff model to be coupled with the routing scheme and the choice of different hillslope processes to be represented, and it makes the framework particularly suitable for massive parallelization, customization according to
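
    The essence of WFIUH-based routing is that runoff generated in each grid cell is convolved with a cell-specific transfer function derived from the distribution of flow distances (the width function) to the network node of interest. The sketch below shows that convolution for two hypothetical cells with made-up flow distances and a constant celerity; deriving width functions from a DEM is not shown.

```python
import numpy as np

def transfer_function(distances_m, celerity_m_s, dt_s, n_steps):
    """Discrete travel-time distribution for one cell: share of area arriving per step."""
    travel_steps = np.floor(distances_m / celerity_m_s / dt_s).astype(int)
    tf = np.bincount(travel_steps, minlength=n_steps)[:n_steps].astype(float)
    return tf / tf.sum()

dt = 3600.0            # hourly routing step
n = 48
celerity = 1.0         # m/s, assumed constant for the sketch

# Hypothetical flow-distance samples (m) from the DEM for two grid cells
cell_a = transfer_function(np.random.default_rng(0).uniform(5e3, 4e4, 500), celerity, dt, n)
cell_b = transfer_function(np.random.default_rng(1).uniform(2e4, 9e4, 500), celerity, dt, n)

# Runoff generated in each cell over the first routing steps (m^3/s equivalents)
runoff_a = np.array([5.0, 8.0, 3.0] + [0.0] * (n - 3))
runoff_b = np.array([2.0, 6.0, 6.0] + [0.0] * (n - 3))

# Streamflow at the node = sum of per-cell convolutions
q_node = np.convolve(runoff_a, cell_a)[:n] + np.convolve(runoff_b, cell_b)[:n]
print("peak routed flow: %.2f at step %d" % (q_node.max(), q_node.argmax()))
```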

  4. Modeled streamflow metrics on small, ungaged stream reaches in the Upper Colorado River Basin

    Science.gov (United States)

    Reynolds, Lindsay V.; Shafroth, Patrick B.

    2016-01-20

    Modeling streamflow is an important approach for understanding landscape-scale drivers of flow and estimating flows where there are no streamgage records. In this study conducted by the U.S. Geological Survey in cooperation with Colorado State University, the objectives were to model streamflow metrics on small, ungaged streams in the Upper Colorado River Basin and identify streams that are potentially threatened with becoming intermittent under drier climate conditions. The Upper Colorado River Basin is a region that is critical for water resources and also projected to experience large future climate shifts toward a drying climate. A random forest modeling approach was used to model the relationship between streamflow metrics and environmental variables. Flow metrics were then projected to ungaged reaches in the Upper Colorado River Basin using environmental variables for each stream, represented as raster cells, in the basin. Last, the projected random forest models of minimum flow coefficient of variation and specific mean daily flow were used to highlight streams that had greater than 61.84 percent minimum flow coefficient of variation and less than 0.096 specific mean daily flow; these streams are suggested to be the most threatened with shifting to intermittent flow regimes under drier climate conditions. Map projection products can help scientists, land managers, and policymakers understand current hydrology in the Upper Colorado River Basin and make informed decisions regarding water resources. With knowledge of which streams are likely to undergo significant drying in the future, managers and scientists can plan for stream-dependent ecosystems and human water users.
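
    A minimal sketch of the random forest approach described above: a flow metric is regressed on environmental predictors from gaged basins and then predicted for ungaged raster cells. The predictor names, synthetic data and values are placeholders, not the study's covariates or results.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

# Hypothetical training data from gaged basins: [drainage area, mean precip, elevation]
n_gaged = 200
X = np.column_stack([rng.uniform(1, 500, n_gaged),       # km^2
                     rng.uniform(200, 900, n_gaged),     # mm/yr
                     rng.uniform(1500, 3500, n_gaged)])  # m
# Synthetic target: specific mean daily flow (mm/day), a noisy function of predictors
y = (0.002 * X[:, 1] - 0.0001 * X[:, 0] + 0.0002 * (X[:, 2] - 1500)
     + rng.normal(0, 0.05, n_gaged))

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Predict the metric for ungaged raster cells described by the same predictors
ungaged = np.array([[12.0, 450.0, 2800.0],
                    [85.0, 300.0, 1900.0]])
print("predicted specific mean daily flow:", np.round(rf.predict(ungaged), 3))
print("predictor importances:", np.round(rf.feature_importances_, 2))
```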

  5. Multiobjective Optimal Algorithm for Automatic Calibration of Daily Streamflow Forecasting Model

    Directory of Open Access Journals (Sweden)

    Yi Liu

    2016-01-01

    Full Text Available A single objective function cannot describe the characteristics of the complicated hydrologic system. Consequently, it stands to reason that multiobjective functions are needed for calibration of hydrologic models. Multiobjective algorithms based on the theory of nondominance are employed to solve this multiobjective optimal problem. In this paper, a novel multiobjective optimization method based on differential evolution with adaptive Cauchy mutation and Chaos searching (MODE-CMCS) is proposed to optimize the daily streamflow forecasting model. Besides, to enhance the diversity performance of Pareto solutions, a more precise crowding distance assigner is presented in this paper. Furthermore, the traditional generalized spread metric (SP) is sensitive to the size of the Pareto set. A novel diversity performance metric, which is independent of Pareto set size, is put forward in this research. The efficacy of the new algorithm MODE-CMCS is compared with the nondominated sorting genetic algorithm II (NSGA-II) on a daily streamflow forecasting model based on support vector machine (SVM). The results verify that the performance of MODE-CMCS is superior to the NSGA-II for automatic calibration of the hydrologic model.
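
    For background on the crowding distance assigner mentioned above, the sketch below computes the standard NSGA-II crowding distance for a set of two-objective solutions; the paper's modified, more precise assigner and its new diversity metric are not reproduced here.

```python
import numpy as np

def crowding_distance(objs):
    """Standard NSGA-II crowding distance for a (n_solutions, n_objectives) array."""
    objs = np.asarray(objs, float)
    n, m = objs.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(objs[:, j])
        span = objs[order[-1], j] - objs[order[0], j]
        dist[order[0]] = dist[order[-1]] = np.inf      # boundary solutions are kept
        if span > 0:
            dist[order[1:-1]] += (objs[order[2:], j] - objs[order[:-2], j]) / span
    return dist

# Hypothetical Pareto front for two calibration objectives (both minimised)
front = np.array([[0.10, 0.90], [0.20, 0.55], [0.35, 0.40], [0.60, 0.25], [0.90, 0.10]])
print(np.round(crowding_distance(front), 3))
```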

  6. mRM - multiscale Routing Model for Scale-Independent Streamflow Simulations

    Science.gov (United States)

    Thober, Stephan; Kumar, Rohini; Samaniego, Luis; Mai, Juliane; Rakovec, Oldrich; Cuntz, Matthias

    2016-04-01

    Routing streamflow through a river network is a basic step within any distributed hydrologic model. It integrates the generated runoff and allows comparison with observed discharge at the outlet of a catchment. The Muskingum routing is a textbook river routing scheme that has been implemented in Earth System Models (e.g., WRF-HYDRO), stand-alone routing schemes (e.g., RAPID), and hydrologic models (e.g., the mesoscale Hydrologic Model - mHM). Two types of implementations are mostly used. In the first one, the spatial routing resolution is fixed to that of the elevation model irrespective of the hydrologic modeling resolution. This implementation suffers from a high computational demand. In the second one, the spatial resolution is always applied at the hydrologic modelling resolution. This approach requires a scale-independent model behaviour, which is often not evaluated. Here, we present the multiscale Routing Model (mRM) that provides a flexible choice of the routing resolution independent of the hydrologic modelling resolution. It incorporates a triangular unit hydrograph for overland flow routing and a Muskingum routing scheme for river routing. mRM provides a scale-independent model behaviour by exploiting the Multiscale Parameter Regionalisation (MPR) included in the open-source mHM (www.ufz.de/mhm). MPR reflects the structure of the landscape within the parametrisation of hydrologic processes. Effective model parameters are derived by upscaling of high-resolution (i.e., landscape resolution) parameters to the hydrologic modelling/routing resolution as proposed in Samaniego et al. 2010 and Kumar et al. 2013. mRM is coupled in this work to the state-of-the-art land surface model Noah-MP. Simulated streamflow is derived for the Ohio River (≈525 000 km²) during the period 1990-2000 at resolutions of 0.0625
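
    The Muskingum scheme referred to above updates the outflow of a reach from the current and previous inflows and the previous outflow through three coefficients derived from the travel time K and the weighting factor X. A textbook sketch follows; the parameter values are illustrative and not mRM's regionalised parameters.

```python
import numpy as np

def muskingum_route(inflow, k_hours=2.0, x=0.2, dt_hours=1.0, q0=None):
    """Route an inflow hydrograph through one reach with the Muskingum scheme."""
    denom = 2 * k_hours * (1 - x) + dt_hours
    c0 = (dt_hours - 2 * k_hours * x) / denom
    c1 = (dt_hours + 2 * k_hours * x) / denom
    c2 = (2 * k_hours * (1 - x) - dt_hours) / denom     # c0 + c1 + c2 = 1
    outflow = np.zeros_like(inflow, dtype=float)
    outflow[0] = inflow[0] if q0 is None else q0
    for t in range(1, len(inflow)):
        outflow[t] = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[t - 1]
    return outflow

# Hypothetical hourly inflow hydrograph (m^3/s)
inflow = np.array([10, 10, 30, 80, 120, 90, 60, 40, 25, 18, 14, 12, 11, 10], float)
print(np.round(muskingum_route(inflow), 1))
```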

  7. Effects of nonlinear model response on allocation of streamflow depletion: exemplified by the case of Beaver Creek, USA

    Science.gov (United States)

    Ahlfeld, David P.; Schneider, James C.; Spalding, Charles P.

    2016-06-01

    Anomalies found when apportioning responsibility for streamflow depletion are examined. The anomalies arise when responsibility is assigned to the two states that contribute to depletion of Beaver Creek in the Republican River Basin in the United States. The apportioning procedure for this basin presumes that the sum of streamflow depletions, computed by comparing simulation model runs with and without groundwater pumping from individual states, approximates the streamflow depletion when both states are pumping. In the case study presented here, this presumed superposition fails dramatically. The stream drying and aquifer-storage depletion, as represented in the simulation model used for allocation, are examined in detail to understand the hydrologic and numerical basis for the severe nonlinear response. Users of apportioning procedures that rely on superposition should be aware of the presence and likely magnitude of nonlinear responses in modeling tools.

  8. Three-parameter-based streamflow elasticity model: application to MOPEX basins in the USA at annual and seasonal scales

    Science.gov (United States)

    Konapala, Goutam; Mishra, Ashok K.

    2016-07-01

    We present a three-parameter streamflow elasticity model as a function of precipitation, potential evaporation, and change in groundwater storage, applicable at both seasonal and annual scales. The model was applied to 245 Model Parameter Estimation Experiment (MOPEX) basins spread across the continental USA. The analysis of the modified equation at annual and seasonal scales indicated that the groundwater and surface water storage change contributes significantly to the streamflow elasticity. Overall, in the case of annual as well as seasonal water balances, precipitation has higher elasticity values when compared to both potential evapotranspiration and storage changes. The streamflow elasticities show significant nonlinear associations with the climate conditions of the catchments, indicating a complex interplay between elasticities and climate variables with substantial seasonal variations.
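
    A common nonparametric estimator of the precipitation elasticity of streamflow is the median of the ratio of streamflow anomalies to precipitation anomalies; the sketch below applies it to synthetic annual series. The paper's three-parameter formulation additionally includes potential evaporation and storage-change terms, which are not reproduced here.

```python
import numpy as np

def precipitation_elasticity(q, p):
    """Nonparametric elasticity: median of (dQ/Q_mean) / (dP/P_mean)."""
    q, p = np.asarray(q, float), np.asarray(p, float)
    dq = (q - q.mean()) / q.mean()
    dp = (p - p.mean()) / p.mean()
    mask = dp != 0
    return np.median(dq[mask] / dp[mask])

# Hypothetical annual precipitation (mm) and streamflow (mm) for 30 years
rng = np.random.default_rng(7)
p = rng.normal(900, 150, 30)
q = 0.45 * p - 200 + rng.normal(0, 30, 30)     # synthetic water balance
print("precipitation elasticity of streamflow: %.2f" % precipitation_elasticity(q, p))
```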

  9. Effects of nonlinear model response on allocation of streamflow depletion: exemplified by the case of Beaver Creek, USA

    Science.gov (United States)

    Ahlfeld, David P.; Schneider, James C.; Spalding, Charles P.

    2016-11-01

    Anomalies found when apportioning responsibility for streamflow depletion are examined. The anomalies arise when responsibility is assigned to the two states that contribute to depletion of Beaver Creek in the Republican River Basin in the United States. The apportioning procedure for this basin presumes that the sum of streamflow depletions, computed by comparing simulation model runs with and without groundwater pumping from individual states, approximates the streamflow depletion when both states are pumping. In the case study presented here, this presumed superposition fails dramatically. The stream drying and aquifer-storage depletion, as represented in the simulation model used for allocation, are examined in detail to understand the hydrologic and numerical basis for the severe nonlinear response. Users of apportioning procedures that rely on superposition should be aware of the presence and likely magnitude of nonlinear responses in modeling tools.

  10. Simulation of Streamflow Using a Multidimensional Flow Model for White Sturgeon Habitat, Kootenai River near Bonners Ferry, Idaho - Supplement to Scientific Investigations Report 2005-5230

    Science.gov (United States)

    Barton, Gary J.; McDonald, Richard R.; Nelson, Jonathan M.

    2009-01-01

    During 2005, the U.S. Geological Survey (USGS) developed, calibrated, and validated a multidimensional flow model for simulating streamflow in the white sturgeon spawning habitat of the Kootenai River in Idaho. The model was developed as a tool to aid understanding of the physical factors affecting quality and quantity of spawning and rearing habitat used by the endangered white sturgeon (Acipenser transmontanus) and for assessing the feasibility of various habitat-enhancement scenarios to re-establish recruitment of white sturgeon. At the request of the Kootenai Tribe of Idaho, the USGS extended the two-dimensional flow model developed in 2005 into a braided reach upstream of the current white sturgeon spawning reach. Many scientists consider the braided reach a suitable substrate with adequate streamflow velocities for re-establishing recruitment of white sturgeon. The 2005 model was extended upstream to help assess the feasibility of various strategies to encourage white sturgeon to spawn in the reach. At the request of the Idaho Department of Fish and Game, the USGS also extended the two-dimensional flow model several kilometers downstream of the white sturgeon spawning reach. This modified model can quantify the physical characteristics of a reach that white sturgeon pass through as they swim upstream from Kootenay Lake to the spawning reach. The USGS Multi-Dimensional Surface-Water Modeling System was used for the 2005 modeling effort and for this subsequent modeling effort. This report describes the model applications and limitations, presents the results of a few simple simulations, and demonstrates how the model can be used to link physical characteristics of streamflow to the location of white sturgeon spawning events during 1994-2001. Model simulations also were used to report on the length and percentage of longitudinal profiles that met the minimum criteria during May and June 2006 and 2007 as stipulated in the U.S. Fish and Wildlife Biological Opinion.

  11. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    Science.gov (United States)

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

    The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both the observed inputs (precipitation and temperature) and the streamflow observations used in the calibration of the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation-distributed HBV model, operating on daily time steps, to a small high-elevation catchment in southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure where the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day a random sample of precipitation and temperature inputs was drawn to be applied as inputs to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability of rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to make one realisation of a whole time series of streamflow; thus, the rating curve errors lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM-based MCMC routine. Effects of having less information (e.g. missing one streamflow measurement for defining the rating curve or missing one precipitation station
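
    Rating curve uncertainty of the kind treated above is often represented with a power-law stage-discharge relation whose parameters are uncertain. The sketch below draws parameter sets and converts one observed stage series into several streamflow realisations, each realisation using a single drawn curve so that the error acts as a systematic bias, as noted in the abstract. The parameter distributions are placeholders, not the study's Bayesian posterior.

```python
import numpy as np

rng = np.random.default_rng(3)

# Observed daily water levels (m) at the gauge
stage = np.array([0.8, 0.9, 1.4, 2.1, 1.7, 1.2, 1.0, 0.9])

# Power-law rating curve Q = a * (h - h0)^b with uncertain parameters (placeholders)
n_realisations = 5
a = rng.normal(12.0, 1.0, n_realisations)
b = rng.normal(1.8, 0.1, n_realisations)
h0 = rng.normal(0.3, 0.05, n_realisations)

# One rating curve per realisation -> systematic bias within each streamflow series
for i in range(n_realisations):
    q = a[i] * np.clip(stage - h0[i], 0.0, None) ** b[i]
    print("realisation %d, mean flow %.1f m^3/s" % (i, q.mean()))
```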

  12. A past discharge assimilation system for ensemble streamflow forecasts over France – Part 2: Impact on the ensemble streamflow forecasts

    Directory of Open Access Journals (Sweden)

    G. Thirel

    2010-08-01

    Full Text Available The use of ensemble streamflow forecasts is developing among international flood forecasting services. Ensemble streamflow forecast systems can provide more accurate forecasts and useful information about the uncertainty of the forecasts, thus improving the assessment of risks. Nevertheless, these systems, like all hydrological forecasts, suffer from errors in initialization or in meteorological data, which lead to hydrological prediction errors. This article, the second part of a two-part article, concerns the impacts of initial states, improved by a streamflow assimilation system, on an ensemble streamflow prediction system over France. An assimilation system was implemented to improve the streamflow analysis of the SAFRAN-ISBA-MODCOU (SIM) hydro-meteorological suite, which initializes the ensemble streamflow forecasts at Météo-France. This assimilation system, using the Best Linear Unbiased Estimator (BLUE) and modifying the initial soil moisture states, showed an improvement of the streamflow analysis with low soil moisture increments. The final states of this suite were used to initialize the ensemble streamflow forecasts of Météo-France, which are based on the SIM model and use the European Centre for Medium-range Weather Forecasts (ECMWF) 10-day Ensemble Prediction System (EPS). Two different configurations of the assimilation system were used in this study: the first with the classical SIM model and the second using improved soil physics in ISBA. The effects of the assimilation system on the ensemble streamflow forecasts were assessed for these two configurations, and a comparison was made with the original (i.e. without data assimilation and without the improved physics) ensemble streamflow forecasts. It is shown that the assimilation system improved most of the statistical scores usually computed for the validation of ensemble predictions (RMSE, Brier Skill Score and its decomposition, Ranked Probability Skill Score, False Alarm

  13. Predicting streamflow response to fire-induced landcover change: implications of parameter uncertainty in the MIKE SHE model.

    Science.gov (United States)

    McMichael, Christine E; Hope, Allen S

    2007-08-01

    Fire is a primary agent of landcover transformation in California semi-arid shrubland watersheds; however, few studies have examined the impacts of fire and post-fire succession on streamflow dynamics in these basins. While it may seem intuitive that larger fires will have a greater impact on streamflow response than smaller fires in these watersheds, the nature of these relationships has not been determined. The effects of fire size on seasonal and annual streamflow responses were investigated for a medium-sized basin in central California using a modified version of the MIKE SHE model, which had been previously calibrated and tested for this watershed using the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. Model simulations were made for two contrasting periods, wet and dry, in order to assess whether fire size effects varied with weather regime. Results indicated that seasonal and annual streamflow response increased nearly linearly with fire size in a given year under both regimes. Annual flow response was generally higher in wetter years for both weather regimes; however, a clear trend was confounded by the effect of stand age. These results expand our understanding of the effects of fire size on hydrologic response in chaparral watersheds, but it is important to note that the majority of model predictions were largely indistinguishable from the predictive uncertainty associated with the calibrated model - a key finding that highlights the importance of analyzing hydrologic predictions for altered landcover conditions in the context of model uncertainty. Future work is needed to examine how alternative decisions (e.g., different likelihood measures) may influence GLUE-based MIKE SHE streamflow predictions following fires of different sizes, and how the effect of fire size on streamflow varies with other factors such as fire location.

  14. Enhanced Identification of hydrologic models using streamflow and satellite water storage data: a multi-objective calibration approach

    Science.gov (United States)

    Yassin, F. A.; Razavi, S.; Sapriza, G.; Wheater, H. S.

    2015-12-01

    The conventional procedure for parameter identification of hydrological processes through conditioning only to streamflow data is challenging in physically based distributed hydrologic modelling. The challenge increases for modeling landscapes where vertical processes dominate horizontal processes, leading to high uncertainties in modelled state variables, vertical fluxes and hence parameter estimates. Such behavior is common in modeling the prairie region of the Saskatchewan River Basin (SaskRB, our case study), Canada, where hydrologic connectivity and vertical fluxes are mainly controlled by surface and sub-surface water storage. To address this challenge, we developed a novel multi-criteria framework that utilizes total column water storage derived from the GRACE satellite, in addition to streamflows. We used a multi-objective optimization algorithm (Borg) and a recently-developed global sensitivity analysis approach (VARS) to effectively identify the model parameters and characterize their significance in model performance. We applied this framework in the calibration of a Land Surface Scheme-Hydrology model, MESH (Modélisation Environnementale Communautaire - Surface and Hydrology), to a sub-watershed of SaskRB. Results showed that the developed framework is superior to the conventional approach of calibration to streamflows. The new framework allowed us to find optimal solutions that effectively constrain the posterior parameter space and are representative of storage and streamflow performance criteria, yielding more credible predictions with reduced uncertainty of modeled storage and evaporation.

  15. Application of artificial neural network, fuzzy logic and decision tree algorithms for modelling of streamflow at Kasol in India.

    Science.gov (United States)

    Senthil Kumar, A R; Goyal, Manish Kumar; Ojha, C S P; Singh, R D; Swamee, P K

    2013-01-01

    The prediction of streamflow is required in many activities associated with the planning and operation of the components of a water resources system. Soft computing techniques have proven to be an efficient alternative to traditional methods for modelling qualitative and quantitative water resource variables such as streamflow. The focus of this paper is to present the development of models using multiple linear regression (MLR), artificial neural network (ANN), fuzzy logic and decision tree algorithms such as M5 and REPTree for predicting the streamflow at Kasol, located upstream of the Bhakra reservoir in the Sutlej basin in northern India. The input vector to the various models using different algorithms was derived considering statistical properties such as the auto-correlation function, partial auto-correlation function and cross-correlation function of the time series. It was found that the REPTree model performed well compared to other soft computing techniques such as MLR, ANN, fuzzy logic, and M5P investigated in this study, and the results of the REPTree model indicate that the entire range of streamflow values was simulated fairly well. The performance of the naïve persistence model was compared with the other models, and the need for developing the naïve persistence model was also analysed using the persistence index.

  16. Influence of hydro-meteorological data spatial aggregation on streamflow modelling

    Science.gov (United States)

    Girons Lopez, Marc; Seibert, Jan

    2016-10-01

    Data availability is important for virtually any purpose in hydrology. While some parts of the world continue to be under-monitored, other areas are experiencing an increased availability of high-resolution data. The use of the highest available resolution has always been preferred and many efforts have been made to maximize the information content of data and thus improve its predictive power and reduce the costs of maintenance of hydrometric sensor networks. In the light of ever-increasing data resolution, however, it is important to assess the added value of using the highest resolution available. In this study we present an assessment of the relative importance of hydro-meteorological data resolution for hydrological modelling. We used a case study with high-resolution data availability to investigate the influence of using models calibrated with different levels of spatially aggregated meteorological input data to estimate streamflow for different periods and at different locations. We found site-specific variations, but model parameterizations calibrated using sub-catchment-specific meteorological input data tended to produce better streamflow estimates, with model efficiency values being up to 0.35 efficiency units higher than those calibrated with catchment-averaged meteorological data. We also found that basin characteristics other than catchment area have little effect on the performance of model parameterizations applied at locations other than the calibration site. Finally, we found that using an increased number of discharge data locations has a larger impact on model calibration efficiency than using spatially specific meteorological data. The results of this study contribute to improving knowledge on assessing data needs for water management in terms of adequate data type and level of spatial aggregation.

  17. Evaluation of numerical weather prediction model precipitation forecasts for short-term streamflow forecasting purpose

    Directory of Open Access Journals (Sweden)

    D. L. Shrestha

    2013-05-01

    Full Text Available The quality of precipitation forecasts from four Numerical Weather Prediction (NWP) models is evaluated over the Ovens catchment in Southeast Australia. Precipitation forecasts are compared with observed precipitation at point and catchment scales and at different temporal resolutions. The four models evaluated are from the Australian Community Climate Earth-System Simulator (ACCESS), including ACCESS-G with an 80 km resolution, ACCESS-R at 37.5 km, ACCESS-A at 12 km, and ACCESS-VT at 5 km. The skill of the NWP precipitation forecasts varies considerably between rain gauging stations. In general, the high spatial resolution (ACCESS-A and ACCESS-VT) and regional (ACCESS-R) NWP models overestimate precipitation in dry, low elevation areas and underestimate it in wet, high elevation areas. The global model (ACCESS-G) consistently underestimates the precipitation at all stations and the bias increases with station elevation. The skill varies with forecast lead time and, in general, it decreases with increasing lead time. When evaluated at finer spatial and temporal resolution (e.g. 5 km, hourly), the precipitation forecasts appear to have very little skill. There is moderate skill at short lead times when the forecasts are averaged up to daily and/or catchment scale. The precipitation forecasts fail to reproduce the diurnal cycle seen in observed precipitation. Significant sampling uncertainty in the skill scores suggests that more data are required to get a reliable evaluation of the forecasts. The non-smooth decay of skill with forecast lead time can be attributed to the diurnal cycle in the observations and to sampling uncertainty. Future work is planned to assess the benefits of using the NWP rainfall forecasts for short-term streamflow forecasting. Our findings here suggest that it is necessary to remove the systematic biases in rainfall forecasts, particularly those from low resolution models, before the rainfall forecasts can be used for streamflow forecasting.

  18. A hybrid approach to monthly streamflow forecasting: Integrating hydrological model outputs into a Bayesian artificial neural network

    Science.gov (United States)

    Humphrey, Greer B.; Gibbs, Matthew S.; Dandy, Graeme C.; Maier, Holger R.

    2016-09-01

    Monthly streamflow forecasts are needed to support water resources decision making in the South East of South Australia, where baseflow represents a significant proportion of the total streamflow and soil moisture and groundwater are important predictors of runoff. To address this requirement, the utility of a hybrid monthly streamflow forecasting approach is explored, whereby simulated soil moisture from the GR4J conceptual rainfall-runoff model is used to represent initial catchment conditions in a Bayesian artificial neural network (ANN) statistical forecasting model. To assess the performance of this hybrid forecasting method, a comparison is undertaken of the relative performances of the Bayesian ANN, the GR4J conceptual model and the hybrid streamflow forecasting approach for producing 1-month ahead streamflow forecasts at three key locations in the South East of South Australia. Particular attention is paid to the quantification of uncertainty in each of the forecast models and the potential for reducing forecast uncertainty by using the hybrid approach is considered. Case study results suggest that the hybrid models developed in this study are able to take advantage of the complementary strengths of both the ANN models and the GR4J conceptual models. This was particularly the case when forecasting high flows, where the hybrid models were shown to outperform the two individual modelling approaches in terms of the accuracy of the median forecasts, as well as reliability and resolution of the forecast distributions. In addition, the forecast distributions generated by the hybrid models were up to 8 times more precise than those based on climatology; thus, providing a significant improvement on the information currently available to decision makers.

  19. Improved large-scale hydrological modelling through the assimilation of streamflow and downscaled satellite soil moisture observations

    Science.gov (United States)

    López López, Patricia; Wanders, Niko; Schellekens, Jaap; Renzullo, Luigi J.; Sutanudjaja, Edwin H.; Bierkens, Marc F. P.

    2016-07-01

    The coarse spatial resolution of global hydrological models (typically > 0.25°) limits their ability to resolve key water balance processes for many river basins and thus compromises their suitability for water resources management, especially when compared to locally tuned river models. A possible solution to the problem may be to drive the coarse-resolution models with locally available high-spatial-resolution meteorological data as well as to assimilate ground-based and remotely sensed observations of key water cycle variables. While this would improve the resolution of the global model, the impact on prediction accuracy remains largely an open question. In this study, we investigate the impact of assimilating streamflow and satellite soil moisture observations on the accuracy of global hydrological model estimations, when driven by either coarse- or high-resolution meteorological observations in the Murrumbidgee River basin in Australia. To this end, a 0.08° resolution version of the PCR-GLOBWB global hydrological model is forced with downscaled global meteorological data (downscaled from 0.5° to 0.08° resolution) obtained from the WATCH Forcing Data methodology applied to ERA-Interim (WFDEI) and a local high-resolution, gauging-station-based gridded data set (0.05°). Downscaled satellite-derived soil moisture (downscaled from ~0.5° to 0.08° resolution) from the remote observation system AMSR-E and streamflow observations collected from 23 gauging stations are assimilated using an ensemble Kalman filter. Several scenarios are analysed to explore the added value of data assimilation considering both local and global meteorological data. Results show that the assimilation of soil moisture observations results in the largest improvement of the model estimates of streamflow. The joint assimilation of both streamflow and downscaled soil moisture observations leads to further improvement in streamflow simulations (20 % reduction in RMSE). Furthermore

  20. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of this deliverable 2.5 is to use fresh experimental data for validation and selection of a flow model to be used for control design in WP3-4. Initially the idea was to investigate the models developed in WP2. However, in the project it was agreed to include and focus on an additive ... model, which turns out not to be useful for prediction of the flow. Moreover, standard Box-Jenkins model structures and multiple-output autoregressive models prove to be superior, as they can give useful predictions of the flow.

  1. Post processing rainfall forecasts from numerical weather prediction models for short term streamflow forecasting

    Directory of Open Access Journals (Sweden)

    D. E. Robertson

    2013-05-01

    Full Text Available Sub-daily ensemble rainfall forecasts that are bias free and reliably quantify forecast uncertainty are critical for flood and short-term ensemble streamflow forecasting. Post processing of rainfall predictions from numerical weather prediction models is typically required to provide rainfall forecasts with these properties. In this paper, a new approach to generate ensemble rainfall forecasts by post processing raw NWP rainfall predictions is introduced. The approach uses a simplified version of the Bayesian joint probability modelling approach to produce forecast probability distributions for individual locations and forecast periods. Ensemble forecasts with appropriate spatial and temporal correlations are then generated by linking samples from the forecast probability distributions using the Schaake shuffle. The new approach is evaluated by applying it to post process predictions from the ACCESS-R numerical weather prediction model at rain gauge locations in the Ovens catchment in southern Australia. The joint distribution of NWP predicted and observed rainfall is shown to be well described by the assumed log-sinh transformed multivariate normal distribution. Ensemble forecasts produced using the approach are shown to be more skilful than the raw NWP predictions both for individual forecast periods and for cumulative totals throughout the forecast periods. Skill increases result from the correction of not only the mean bias, but also biases conditional on the magnitude of the NWP rainfall prediction. The post processed forecast ensembles are demonstrated to successfully discriminate between events and non-events for both small and large rainfall occurrences, and reliably quantify the forecast uncertainty. Future work will assess the efficacy of the post processing method for a wider range of climatic conditions and also investigate the benefits of using post processed rainfall forecast for flood and short term streamflow forecasting.
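
    The Schaake shuffle referred to above reorders independently sampled ensemble members at each site and lead time so that they inherit the rank structure of a set of historical trajectories, restoring realistic space-time correlation. A minimal sketch for one site and several lead times follows, with synthetic samples standing in for the calibrated forecast distributions.

```python
import numpy as np

def schaake_shuffle(samples, historical):
    """Reorder ensemble samples so each lead time inherits the rank order of
    historical trajectories (one historical trajectory per ensemble member).

    samples, historical : arrays of shape (n_members, n_lead_times)
    """
    samples = np.sort(np.asarray(samples, float), axis=0)    # sorted per lead time
    shuffled = np.empty_like(samples)
    for j in range(samples.shape[1]):
        ranks = np.argsort(np.argsort(historical[:, j]))      # 0 = smallest
        shuffled[:, j] = samples[ranks, j]
    return shuffled

rng = np.random.default_rng(5)
n_members, n_leads = 10, 4

# Independent samples from per-lead-time forecast distributions (no correlation yet)
raw = rng.gamma(2.0, 5.0, size=(n_members, n_leads))

# Historical trajectories with temporal persistence, supplying the rank structure
wet_level = rng.gamma(2.0, 5.0, size=(n_members, 1))          # member-specific wetness
hist = wet_level + rng.gamma(1.0, 2.0, size=(n_members, n_leads))

ens = schaake_shuffle(raw, hist)
print("lead 1-2 correlation before: %.2f" % np.corrcoef(raw[:, 0], raw[:, 1])[0, 1])
print("lead 1-2 correlation after : %.2f" % np.corrcoef(ens[:, 0], ens[:, 1])[0, 1])
```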

  2. Post-processing rainfall forecasts from numerical weather prediction models for short-term streamflow forecasting

    Directory of Open Access Journals (Sweden)

    D. E. Robertson

    2013-09-01

    Full Text Available Sub-daily ensemble rainfall forecasts that are bias free and reliably quantify forecast uncertainty are critical for flood and short-term ensemble streamflow forecasting. Post-processing of rainfall predictions from numerical weather prediction models is typically required to provide rainfall forecasts with these properties. In this paper, a new approach to generate ensemble rainfall forecasts by post-processing raw numerical weather prediction (NWP) rainfall predictions is introduced. The approach uses a simplified version of the Bayesian joint probability modelling approach to produce forecast probability distributions for individual locations and forecast lead times. Ensemble forecasts with appropriate spatial and temporal correlations are then generated by linking samples from the forecast probability distributions using the Schaake shuffle. The new approach is evaluated by applying it to post-process predictions from the ACCESS-R numerical weather prediction model at rain gauge locations in the Ovens catchment in southern Australia. The joint distribution of NWP predicted and observed rainfall is shown to be well described by the assumed log-sinh transformed bivariate normal distribution. Ensemble forecasts produced using the approach are shown to be more skilful than the raw NWP predictions both for individual forecast lead times and for cumulative totals throughout all forecast lead times. Skill increases result from the correction of not only the mean bias, but also biases conditional on the magnitude of the NWP rainfall prediction. The post-processed forecast ensembles are demonstrated to successfully discriminate between events and non-events for both small and large rainfall occurrences, and reliably quantify the forecast uncertainty. Future work will assess the efficacy of the post-processing method for a wider range of climatic conditions and also investigate the benefits of using post-processed rainfall forecasts for flood and short

  3. Multi-variable calibration of a semi-distributed hydrological model using streamflow data and satellite-based evapotranspiration

    NARCIS (Netherlands)

    Rientjes, T.H.M.; Muthuwatta, L.P.; Bos, M.G.; Booij, M.J.; Bhatti, H.A.

    2013-01-01

    In this study, streamflow (Qs) and satellite-based actual evapotranspiration (ETa) are used in a multi-variable calibration framework to reproduce the catchment water balance. The application is for the HBV rainfall–runoff model at daily time-step for the Karkheh River Basin (51,000 km2) in Iran. Mo

  4. Coupling study of the Variable Infiltration Capacity (VIC) model with WRF model to simulate the streamflow in the Guadalquivir Basin

    Science.gov (United States)

    García-Valdecasas Ojeda, Matilde; De Franciscis, Sebastiano; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Esteban-Parra, María Jesus

    2016-04-01

    The Variable Infiltration Capacity (VIC) model is a large-scale, semi-distributed hydrologic model [1]. Its most important properties are related to the land surface, modeled as a grid of large and uniform cells with sub-grid heterogeneity (e.g. land cover), as well as to the local water influx (i.e. water can only enter a grid cell via the atmosphere and the channel flow between grid cells is ignored). The portions of surface and subsurface water runoff that reach the local channel network are assumed to stay in the channel and cannot flow back into the soil. In a second step, routing of streamflow is performed separately from the land surface simulation, using a separate model, the Routing Model, described in [2]. The final goal of our research is to set up an optimal hydrological and climate model to study the evolution of the streamflow of the Guadalquivir Basin under different future land use, land cover and climate scenarios. In this work we study the coupling between the VIC model, the Routing Model and the Weather Research and Forecasting (WRF) model in order to simulate the evolution of the streamflow for the Guadalquivir Basin (Spain). To this end, a calibration of the most relevant VIC model parameters was performed using real daily streamflow time series obtained from the CEDEX (Centro de Estudios y Experimentación de Obras Públicas, Spain) database [3]. In the time period under study, i.e. the decades 1988-1997 (calibration step) and 1998-2007 (verification step), the VIC model has been coupled with observational climate data obtained from the SPAIN02 database [4]. Additionally, we carried out a sensitivity analysis of the WRF model to different parameterizations using different cumulus, microphysics and surface/planetary boundary layer schemes for the period 1995-1996. WRF runs were carried out over a domain encompassing the Iberian Peninsula and nested in the coarser EURO-CORDEX domain [5]. The optimal parameter set resulting from this analysis has been used to obtain a

  5. Model-Based Attribution of High-Resolution Streamflow Trends in Two Alpine Basins of Western Austria

    Directory of Open Access Journals (Sweden)

    Christoph Kormann

    2016-02-01

    Full Text Available Several trend studies have shown that hydrological conditions are changing considerably in the Alpine region. However, the reasons for these changes are only partially understood and trend analyses alone are not able to shed much light. Hydrological modelling is one possible way to identify the trend drivers, i.e., to attribute the detected streamflow trends, given that the model captures all important processes causing the trends. We modelled the hydrological conditions for two alpine catchments in western Austria (a large, mostly lower-altitude catchment with wide valley plains and a nested high-altitude, glaciated headwater catchment) with the distributed, physically-oriented WaSiM-ETH model, which includes a dynamical glacier module. The model was calibrated in a transient mode, i.e., not only on several standard goodness measures and glacier extents, but also in such a way that the simulated streamflow trends fit with the observed ones during the investigation period 1980 to 2007. With this approach, it was possible to separate streamflow components, identify the trends of flow components, and study their relation to trends in atmospheric variables. In addition to trends in annual averages, highly resolved trends for each Julian day were derived, since they proved powerful in an earlier, data-based attribution study. We were able to show that annual and highly resolved trends can be modelled sufficiently well. The results provide a holistic, year-round picture of the drivers of alpine streamflow changes: Higher-altitude catchments are strongly affected by earlier firn melt and snowmelt in spring and increased ice melt throughout the ablation season. Changes in lower-altitude areas are mostly caused by earlier and lower snowmelt volumes. All highly resolved trends in streamflow and its components show an explicit similarity to the local temperature trends. Finally, results indicate that evapotranspiration has been increasing in the lower
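
    The study derives trends not only for annual averages but for each Julian day. As an illustration of that idea only (not the WaSiM-ETH workflow, which may use a different trend estimator), a per-day-of-year linear trend across a multi-year daily record could be computed as follows.

      import numpy as np

      def julian_day_trends(years, doy, q):
          """Linear streamflow trend for each Julian day across years.

          years, doy, q : 1-D arrays of equal length (calendar year, day of
                          year 1..366, daily streamflow). Returns slopes in
                          flow units per year, indexed by day of year.
          """
          slopes = np.full(366, np.nan)
          for d in range(1, 367):
              mask = doy == d
              if mask.sum() > 2:                       # need at least 3 years
                  slopes[d - 1] = np.polyfit(years[mask], q[mask], 1)[0]
          return slopes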

  6. A past discharge assimilation system for ensemble streamflow forecasts over France – Part 2: Impact on the ensemble streamflow forecasts

    Directory of Open Access Journals (Sweden)

    G. Thirel

    2010-04-01

    Full Text Available The use of ensemble streamflow forecasts is developing in the international flood forecasting services. Such systems can provide more accurate forecasts and useful information about the uncertainty of the forecasts, thus improving the assessment of risks. Nevertheless, these systems, like all hydrological forecasts, suffer from errors on initialization or on meteorological data, which lead to hydrological prediction errors. This article, which is the second part of a 2-part article, concerns the impacts of initial states, improved by a streamflow assimilation system, on an ensemble streamflow prediction system over France. An assimilation system was implemented to improve the streamflow analysis of the SAFRAN-ISBA-MODCOU (SIM) hydro-meteorological suite, which initializes the ensemble streamflow forecasts at Météo-France. This assimilation system, using the Best Linear Unbiased Estimator (BLUE) and modifying the initial soil moisture states, showed an improvement of the streamflow analysis with low soil moisture increments. The final states of this suite were used to initialize the ensemble streamflow forecasts of Météo-France, which are based on the SIM model and use the European Centre for Medium-range Weather Forecasts (ECMWF) 10-day Ensemble Prediction System (EPS). Two different configurations of the assimilation system were used in this study: the first with the classical SIM model and the second using improved soil physics in ISBA. The effects of the assimilation system on the ensemble streamflow forecasts were assessed for these two configurations, and a comparison was made with the original (i.e. without data assimilation and without the improved physics) ensemble streamflow forecasts. It is shown that the assimilation system improved most of the statistical scores usually computed for the validation of ensemble predictions (RMSE, Brier Skill Score and its decomposition, Ranked Probability Skill Score, False Alarm Rate, etc.), especially
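
    The assimilation system above rests on the Best Linear Unbiased Estimator, whose analysis equation is xa = xb + K(y - Hxb) with gain K = BH^T(HBH^T + R)^-1. A generic sketch of that update is given below; it is not the SIM/ISBA code, and the matrices and names are illustrative assumptions.

      import numpy as np

      def blue_analysis(xb, y, H, B, R):
          """Best Linear Unbiased Estimator (BLUE) analysis step.

          xb : background state vector (e.g. soil moisture), shape (n,)
          y  : observation vector (e.g. streamflow), shape (p,)
          H  : linearized observation operator, shape (p, n)
          B  : background error covariance, shape (n, n)
          R  : observation error covariance, shape (p, p)
          """
          K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
          xa = xb + K @ (y - H @ xb)                     # analysis (updated) state
          return xa, K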

  7. Assessing the Use of Remote Sensing and a Crop Growth Model to Improve Modeled Streamflow in Central Asia

    Science.gov (United States)

    Richey, A. S.; Richey, J. E.; Tan, A.; Liu, M.; Adam, J. C.; Sokolov, V.

    2015-12-01

    Central Asia presents a perfect case study to understand the dynamic, and often conflicting, linkages between food, energy, and water in natural systems. The destruction of the Aral Sea is a well-known environmental disaster, largely driven by increased irrigation demand on the rivers that feed the endorheic sea. Continued reliance on these rivers, the Amu Darya and Syr Darya, often places available water resources at odds between hydropower demands upstream and irrigation requirements downstream. A combination of tools is required to understand these linkages and how they may change in the future as a function of climate change and population growth. In addition, the region is geopolitically complex as the former Soviet basin states develop management strategies to sustainably manage shared resources. This complexity increases the importance of relying upon publicly available information sources and tools. Preliminary work has shown potential for the Variable Infiltration Capacity (VIC) model to recreate the natural water balance in the Amu Darya and Syr Darya basins by comparing results to total terrestrial water storage changes observed from NASA's Gravity Recovery and Climate Experiment (GRACE) satellite mission. Modeled streamflow is well correlated to observed streamflow at upstream gauges prior to the large-scale expansion of irrigation and hydropower. However, current modeled results are unable to capture the human influence of water use on downstream flow. This study examines the utility of a crop simulation model, CropSyst, to represent irrigation demand and GRACE to improve modeled streamflow estimates in the Amu Darya and Syr Darya basins. Specifically, we determine crop water demand with CropSyst utilizing available data on irrigation schemes and cropping patterns. We determine how this demand can be met either by surface water, modeled by VIC with a reservoir operation scheme, and/or by groundwater derived from GRACE. Finally, we assess how the

  8. Error reduction and representation in stages (ERRIS) in hydrological modelling for ensemble streamflow forecasting

    Science.gov (United States)

    Li, Ming; Wang, Q. J.; Bennett, James C.; Robertson, David E.

    2016-09-01

    This study develops a new error modelling method for ensemble short-term and real-time streamflow forecasting, called error reduction and representation in stages (ERRIS). The novelty of ERRIS is that it does not rely on a single complex error model but runs a sequence of simple error models through four stages. At each stage, an error model attempts to incrementally improve over the previous stage. Stage 1 establishes parameters of a hydrological model and parameters of a transformation function for data normalization, Stage 2 applies a bias correction, Stage 3 applies autoregressive (AR) updating, and Stage 4 applies a Gaussian mixture distribution to represent model residuals. In a case study, we apply ERRIS for one-step-ahead forecasting at a range of catchments. The forecasts at the end of Stage 4 are shown to be much more accurate than at Stage 1 and to be highly reliable in representing forecast uncertainty. Specifically, the forecasts become more accurate by applying the AR updating at Stage 3, and more reliable in uncertainty spread by using a mixture of two Gaussian distributions to represent the residuals at Stage 4. ERRIS can be applied to any existing calibrated hydrological models, including those calibrated to deterministic (e.g. least-squares) objectives.
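
    Stage 3 of ERRIS applies autoregressive updating of model errors in a transformed space. The snippet below is a highly simplified illustration of that idea, not the published ERRIS equations: an AR(1) carry-forward of the latest error under an assumed log-sinh transform with arbitrary parameters.

      import numpy as np

      A, B = 0.01, 1.0   # illustrative log-sinh transform parameters

      def to_norm(q):
          """Log-sinh transform used here as a stand-in normalizing transform."""
          return np.log(np.sinh(A + B * q)) / B

      def from_norm(z):
          """Inverse of the log-sinh transform."""
          return (np.arcsinh(np.exp(B * z)) - A) / B

      def ar1_updated_forecast(q_sim_next, q_sim_now, q_obs_now, rho=0.8):
          """One-step-ahead forecast with AR(1) error updating in transformed
          space: part of the error observed at the current time step is
          carried forward to correct the next simulated value."""
          err_now = to_norm(q_obs_now) - to_norm(q_sim_now)
          return from_norm(to_norm(q_sim_next) + rho * err_now)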

  9. Application of a land surface model for simulating river streamflow in high latitudes

    Science.gov (United States)

    Gusev, Yeugeniy; Nasonova, Olga; Dzhogan, Larissa

    2010-05-01

    Nowadays modelling runoff from the pan-Arctic river basins, which represents nearly 50% of water flow to the Arctic Ocean, is of great interest to the hydrological modelling community because these regions are very sensitive to natural and anthropogenic impacts. This motivates the need to increase the accuracy of hydrological estimations, runoff predictions, and water resources assessments in high latitudes. However, in these regions, observations required for model simulations (to specify model parameters and forcing inputs) are very scarce or even absent (this especially concerns land surface parameters). At the same time, river discharge measurements are usually available, which makes it possible to estimate model parameters by their calibration against measured discharge. Such a situation is typical of most of the northern basins of Russia. The major goal of the work is to reveal whether a physically-based land surface model (LSM) Soil Water - Atmosphere - Plants (SWAP) is able to reproduce snowmelt- and rain-driven daily streamflow in high latitudes (using poor input information) with an accuracy acceptable for hydrologic applications. Three river basins, located in the north of the European part of Russia, were chosen for investigation. They are the Mezen River basin (area: 78 000 km2), the Pechora River basin (area: 312 000 km2) and the Severnaya Dvina River basin (area: 348 000 km2). For modeling purposes the basins were represented, respectively, by 10, 57 and 62 one-degree computational grid boxes connected by the river network. A priori estimation of the land surface parameters for each grid box was based on the global one-degree datasets prepared within the framework of the International Satellite Land-Surface Climatology Project Initiative II (ISLSCP) / the Second Global Soil Wetness Project (GSWP-2). Three versions of atmospheric forcing data prepared for the basins were based on: (1) NCEP/DOE reanalysis dataset; (2) NCEP/DOE reanalysis product

  10. Ecohydrologic Response of a Wetland Indicator Species to Climate Change and Streamflow Regulation: A Conceptual Model

    Science.gov (United States)

    Ward, E. M.; Gorelick, S.

    2015-12-01

    The Peace-Athabasca Delta ("Delta") in northeastern Alberta, Canada, is a UNESCO World Heritage Site and a Ramsar Wetland of International Importance. Delta ecohydrology is expected to respond rapidly to upstream water demand and climate change, with earlier spring meltwater, decreased springtime peak flow, and a decline in springtime ice-jam flooding. We focus on changes in the population and distribution of muskrat (Ondatra zibethicus), an ecohydrologic indicator species. We present a conceptual model linking hydrology and muskrat ecology. Our conceptual model links seven modules representing (1) upstream water demand, (2) streamflow and snowmelt, (3) floods, (4) the water balance of floodplain lakes, (5) muskrat habitat suitability, (6) wetland vegetation, and (7) muskrat population dynamics predicted using an agent-based model. Our goal is to evaluate the effects of different climate change and upstream water demand scenarios on the abundance and distribution of Delta muskrat, from present-2100. Moving from the current conceptual model to a predictive quantitative model, we will rely on abundant existing data and Traditional Ecological Knowledge of muskrat and hydrology in the Delta.

  11. Multi-objective optimization of empirical hydrological model for streamflow prediction

    Science.gov (United States)

    Guo, Jun; Zhou, Jianzhong; Lu, Jiazheng; Zou, Qiang; Zhang, Huajie; Bi, Sheng

    2014-04-01

    Traditional calibration of hydrological models is performed with a single objective function. Practical experience with the calibration of hydrologic models reveals that single objective functions are often inadequate to properly measure all of the characteristics of the hydrologic system. To circumvent this problem, in recent years, many studies have investigated the automatic calibration of hydrological models with multi-objective functions. In this paper, the multi-objective evolution algorithm MODE-ACM is introduced to solve the multi-objective optimization of hydrologic models. Moreover, to improve the performance of the MODE-ACM, an Enhanced Pareto Multi-Objective Differential Evolution algorithm named EPMODE is proposed in this research. The efficacy of MODE-ACM and EPMODE is compared with that of two state-of-the-art algorithms, NSGA-II and SPEA2, on two case studies. Five test problems are used as the first case study to generate the true Pareto front. Then this approach is tested on a typical empirical hydrological model for monthly streamflow forecasting. The results of these case studies show that the EPMODE, as well as MODE-ACM, is effective in solving multi-objective problems and has great potential as an efficient and reliable algorithm for water resources applications.
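
    Multi-objective calibration algorithms such as those compared above all rest on Pareto dominance. As a small generic illustration (not MODE-ACM or EPMODE), the non-dominated members of a set of candidate parameter sets scored on several minimization objectives can be identified as follows.

      import numpy as np

      def non_dominated(objectives):
          """Boolean mask of Pareto-optimal rows.

          objectives : (n_solutions, n_objectives) array where every objective
                       is to be minimized (e.g. RMSE on high flows, RMSE on
                       low flows, volume bias).
          """
          n = objectives.shape[0]
          keep = np.ones(n, dtype=bool)
          for i in range(n):
              # row j dominates row i if it is no worse in all objectives
              # and strictly better in at least one
              dominates_i = (np.all(objectives <= objectives[i], axis=1) &
                             np.any(objectives < objectives[i], axis=1))
              if dominates_i.any():
                  keep[i] = False
          return keep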

  12. Modelling streamflow reductions resulting from commercial afforestation in South Africa: From research to application

    CSIR Research Space (South Africa)

    Gush, Mark B

    2006-08-01

    Full Text Available Numerous local and international studies have indicated conclusively that forest plantations consume more water than natural forests, grasslands or shrublands, and hence reduce water yield (streamflow) from afforested catchments. These water use...

  13. Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling

    Directory of Open Access Journals (Sweden)

    S. Galelli

    2013-02-01

    Full Text Available Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modeling. In this paper we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modeling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalization property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally very efficient; and, (iii) allows to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analyzed on two real-world case studies (Marina catchment (Singapore) and Canning River (Western Australia)) representing two different morphoclimatic contexts comparatively with other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparatively well to the best of the benchmarks (i.e. M5) in both the watersheds, while outperforming the other approaches in terms of computational requirement when adopted on large datasets. In addition, the ranking of the input variable provided can be given a physically meaningful interpretation.
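
    Extra-Trees regressors and their input importances are available in common libraries. The sketch below shows the kind of lagged-input setup described above using scikit-learn's ExtraTreesRegressor on synthetic data; the lag structure, series and hyperparameters are assumptions, not the study's configuration.

      import numpy as np
      from sklearn.ensemble import ExtraTreesRegressor

      def lagged_matrix(rain, flow, lags=3):
          """Build an input matrix of lagged rainfall and flow values."""
          X, y = [], []
          for t in range(lags, len(flow)):
              X.append(np.r_[rain[t-lags:t], flow[t-lags:t]])
              y.append(flow[t])
          return np.asarray(X), np.asarray(y)

      # Hypothetical daily series; replace with catchment data.
      rng = np.random.default_rng(1)
      rain = rng.gamma(2.0, 2.0, size=2000)
      flow = np.convolve(rain, [0.3, 0.4, 0.2, 0.1])[:2000]

      X, y = lagged_matrix(rain, flow)
      model = ExtraTreesRegressor(n_estimators=200, random_state=0).fit(X[:-365], y[:-365])
      print("test R^2:", model.score(X[-365:], y[-365:]))
      print("input importances:", model.feature_importances_.round(3))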

  14. Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling

    Science.gov (United States)

    Galelli, S.; Castelletti, A.

    2013-07-01

    Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modelling. In this paper, we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modelling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalisation property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally efficient; and, (iii) allows to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analysed on two real-world case studies - Marina catchment (Singapore) and Canning River (Western Australia) - representing two different morphoclimatic contexts. The evaluation is performed against other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparatively well to the best of the benchmarks (i.e. M5) in both the watersheds, while outperforming the other approaches in terms of computational requirement when adopted on large datasets. In addition, the ranking of the input variable provided can be given a physically meaningful interpretation.

  15. An initial abstraction and constant loss model, and methods for estimating unit hydrographs, peak streamflows, and flood volumes for urban basins in Missouri

    Science.gov (United States)

    Huizinga, Richard J.

    2014-01-01

    Streamflow data, basin characteristics, and rainfall data from 39 streamflow-gaging stations for urban areas in and adjacent to Missouri were used by the U.S. Geological Survey in cooperation with the Metropolitan Sewer District of St. Louis to develop an initial abstraction and constant loss model (a time-distributed basin-loss model) and a gamma unit hydrograph (GUH) for urban areas in Missouri. Study-specific methods to determine peak streamflow and flood volume for a given rainfall event also were developed.
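
    The record combines an initial abstraction and constant loss model with a gamma unit hydrograph. A compact sketch of how excess rainfall and a gamma unit hydrograph can be convolved into a runoff hydrograph is given below; the loss parameters, gamma shape and scale, and the storm series are illustrative assumptions, not values from the USGS study.

      import numpy as np
      from scipy.stats import gamma

      def excess_rainfall(rain, ia=10.0, cl=2.0):
          """Initial abstraction (ia, mm) followed by a constant loss rate (cl, mm per step)."""
          excess, remaining_ia = [], ia
          for p in rain:
              abstracted = min(p, remaining_ia)
              remaining_ia -= abstracted
              excess.append(max(p - abstracted - cl, 0.0))
          return np.asarray(excess)

      def gamma_unit_hydrograph(n_steps, shape=3.0, scale=2.0):
          """Discrete gamma unit hydrograph ordinates, normalized to unit volume."""
          t = np.arange(1, n_steps + 1, dtype=float)
          uh = gamma.pdf(t, a=shape, scale=scale)
          return uh / uh.sum()

      rain = np.array([0., 5., 20., 35., 15., 5., 0., 0.])   # hypothetical storm (mm)
      q = np.convolve(excess_rainfall(rain), gamma_unit_hydrograph(12))
      print("peak runoff ordinate:", q.max().round(2))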

  16. Regionalization of subsurface stormflow parameters of hydrologic models: Derivation from regional analysis of streamflow recession curves

    Energy Technology Data Exchange (ETDEWEB)

    Ye, Sheng; Li, Hongyi; Huang, Maoyi; Ali, Melkamu; Leng, Guoyong; Leung, Lai-Yung R.; Wang, Shaowen; Sivapalan, Murugesu

    2014-07-21

    Subsurface stormflow is an important component of the rainfall–runoff response, especially in steep terrain. Its contribution to total runoff is, however, poorly represented in the current generation of land surface models. The lack of physical basis of these common parameterizations precludes a priori estimation of the stormflow (i.e. without calibration), which is a major drawback for prediction in ungauged basins, or for use in global land surface models. This paper is aimed at deriving regionalized parameterizations of the storage–discharge relationship relating to subsurface stormflow from a top–down empirical data analysis of streamflow recession curves extracted from 50 eastern United States catchments. Detailed regression analyses were performed between parameters of the empirical storage–discharge relationships and the controlling climate, soil and topographic characteristics. The regression analyses performed on empirical recession curves at catchment scale indicated that the coefficient of the power-law form storage–discharge relationship is closely related to the catchment hydrologic characteristics, which is consistent with the hydraulic theory derived mainly at the hillslope scale. As for the exponent, besides the role of field scale soil hydraulic properties as suggested by hydraulic theory, it is found to be more strongly affected by climate (aridity) at the catchment scale. At a fundamental level these results point to the need for more detailed exploration of the co-dependence of soil, vegetation and topography with climate.
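
    The regression analyses above fit power-law storage-discharge (recession) relationships of the form -dQ/dt = a*Q^b. A minimal sketch of extracting such a fit from a daily flow series, using a generic log-log regression on strictly recessing steps rather than the authors' exact procedure, follows.

      import numpy as np

      def recession_power_law(q):
          """Fit -dQ/dt = a * Q^b to recession periods of a daily flow series.

          Returns (a, b) from an ordinary least-squares fit in log-log space,
          using only time steps where flow is strictly decreasing.
          """
          dq = np.diff(q)
          qm = 0.5 * (q[:-1] + q[1:])          # mid-interval flow
          mask = (dq < 0) & (qm > 0)
          x = np.log(qm[mask])
          y = np.log(-dq[mask])
          b, log_a = np.polyfit(x, y, 1)
          return np.exp(log_a), b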

  17. An Integrated Modeling System for Estimating Glacier and Snow Melt Driven Streamflow from Remote Sensing and Earth System Data Products in the Himalayas

    Science.gov (United States)

    Brown, M. E.; Racoviteanu, A. E.; Tarboton, D. G.; Sen Gupta, A.; Nigro, J.; Policelli, F.; Habib, S.; Tokay, M.; Shrestha, M. S.; Bajracharya, S.

    2014-01-01

    Quantification of the contribution of the hydrologic components (snow, ice and rain) to river discharge in the Hindu Kush Himalayan (HKH) region is important for decision-making in water sensitive sectors, and for water resources management and flood risk reduction. In this area, access to and monitoring of the glaciers and their melt outflow is challenging due to difficult access, thus modeling based on remote sensing offers the potential for providing information to improve water resources management and decision making. This paper describes an integrated modeling system developed using downscaled NASA satellite based and earth system data products coupled with in-situ hydrologic data to assess the contribution of snow and glaciers to the flows of the rivers in the HKH region. Snow and glacier melt was estimated using the Utah Energy Balance (UEB) model, further enhanced to accommodate glacier ice melt over clean and debris-covered tongues, then meltwater was input into the USGS Geospatial Stream Flow Model (GeoSFM). The two model components were integrated into the Better Assessment Science Integrating point and Nonpoint Sources modeling framework (BASINS) as a user-friendly open source system, which was made available to countries in high Asia. Here we present a case study from the Langtang Khola watershed in the monsoon-influenced Nepal Himalaya, used to validate our energy balance approach and to test the applicability of our modeling system. The snow and glacier melt model predicts that for the eight years used for model evaluation (October 2003-September 2010), the total surface water input over the basin was 9.43 m, originating as 62% from glacier melt, 30% from snowmelt and 8% from rainfall. Measured streamflow for those years was 5.02 m, reflecting a runoff coefficient of 0.53. GeoSFM simulated streamflow was 5.31 m, indicating reasonable correspondence between measured and modeled values and confirming the capability of the integrated system to provide a quantification

  18. An integrated modeling system for estimating glacier and snow melt driven streamflow from remote sensing and earth system data products in the Himalayas

    Science.gov (United States)

    Brown, M. E.; Racoviteanu, A. E.; Tarboton, D. G.; Gupta, A. Sen; Nigro, J.; Policelli, F.; Habib, S.; Tokay, M.; Shrestha, M. S.; Bajracharya, S.; Hummel, P.; Gray, M.; Duda, P.; Zaitchik, B.; Mahat, V.; Artan, G.; Tokar, S.

    2014-11-01

    Quantification of the contribution of the hydrologic components (snow, ice and rain) to river discharge in the Hindu Kush Himalayan (HKH) region is important for decision-making in water sensitive sectors, and for water resources management and flood risk reduction. In this area, access to and monitoring of the glaciers and their melt outflow is challenging due to difficult access, thus modeling based on remote sensing offers the potential for providing information to improve water resources management and decision making. This paper describes an integrated modeling system developed using downscaled NASA satellite based and earth system data products coupled with in-situ hydrologic data to assess the contribution of snow and glaciers to the flows of the rivers in the HKH region. Snow and glacier melt was estimated using the Utah Energy Balance (UEB) model, further enhanced to accommodate glacier ice melt over clean and debris-covered tongues, then meltwater was input into the USGS Geospatial Stream Flow Model (GeoSFM). The two model components were integrated into the Better Assessment Science Integrating point and Nonpoint Sources modeling framework (BASINS) as a user-friendly open source system, which was made available to countries in high Asia. Here we present a case study from the Langtang Khola watershed in the monsoon-influenced Nepal Himalaya, used to validate our energy balance approach and to test the applicability of our modeling system. The snow and glacier melt model predicts that for the eight years used for model evaluation (October 2003-September 2010), the total surface water input over the basin was 9.43 m, originating as 62% from glacier melt, 30% from snowmelt and 8% from rainfall. Measured streamflow for those years was 5.02 m, reflecting a runoff coefficient of 0.53. GeoSFM simulated streamflow was 5.31 m, indicating reasonable correspondence between measured and modeled values and confirming the capability of the integrated system to provide a quantification of

  19. Role of surface-water and groundwater interactions on projected summertime streamflow in snow dominated regions : An integrated modeling approach

    Science.gov (United States)

    Huntington, Justin L.; Niswonger, Richard G.

    2012-01-01

    Previous studies indicate predominantly increasing trends in precipitation across the Western United States, while at the same time, historical streamflow records indicate decreasing summertime streamflow and 25th percentile annual flows. These opposing trends could be viewed as paradoxical, given that several studies suggest that increased annual precipitation will equate to increased annual groundwater recharge, and therefore increased summertime flow. To gain insight on mechanisms behind these potential changes, we rely on a calibrated, integrated surface and groundwater model to simulate climate impacts on surface water/groundwater interactions using 12 general circulation model projections of temperature and precipitation from 2010 to 2100, and evaluate the interplay between snowmelt timing and other hydrologic variables, including streamflow, groundwater recharge, storage, groundwater discharge, and evapotranspiration. Hydrologic simulations show that the timing of peak groundwater discharge to the stream is inversely correlated to snowmelt runoff and groundwater recharge due to the bank storage effect and reversal of hydraulic gradients between the stream and underlying groundwater. That is, groundwater flow to streams peaks following the decrease in stream depth caused by snowmelt recession, and the shift in snowmelt causes a corresponding shift in groundwater discharge to streams. Our results show that groundwater discharge to streams is depleted during the summer due to earlier drainage of shallow aquifers adjacent to streams even if projected annual precipitation and groundwater recharge increases. These projected changes in surface water/groundwater interactions result in more than a 30% decrease in the projected ensemble summertime streamflow. Our findings clarify causality of observed decreasing summertime flow, highlight important aspects of potential climate change impacts on groundwater resources, and underscore the need for integrated hydrologic

  20. Model validation, science and application

    NARCIS (Netherlands)

    Builtjes, P.J.H.; Flossmann, A.

    1998-01-01

    Over the last years there has been a growing interest in trying to establish a proper validation of atmospheric chemistry-transport (ATC) models. Model validation deals with the comparison of model results with experimental data, and in this way addresses both model uncertainty and uncertainty in, and adequac

  1. CEREF: A hybrid data-driven model for forecasting annual streamflow from a socio-hydrological system

    Science.gov (United States)

    Zhang, Hongbo; Singh, Vijay P.; Wang, Bin; Yu, Yinghao

    2016-09-01

    Hydrological forecasting is complicated by flow regime alterations in a coupled socio-hydrologic system, encountering increasingly non-stationary, nonlinear and irregular changes, which make decision support difficult for future water resources management. Currently, many hybrid data-driven models, based on the decomposition-prediction-reconstruction principle, have been developed to improve the ability to make predictions of annual streamflow. However, there exist many problems that require further investigation, the chief among which is that the direction of the trend component decomposed from an annual streamflow series is always difficult to ascertain. In this paper, a hybrid data-driven model, called the CEREF model, was proposed to address this issue; it combined empirical mode decomposition (EMD), radial basis function neural networks (RBFNN) and an external forces (EF) variable. The hybrid model employed EMD for decomposition and RBFNN for intrinsic mode function (IMF) forecasting, and determined future trend component directions by regression with EF as basin water demand representing the social component in the socio-hydrologic system. The Wuding River basin was considered for the case study, and two standard statistical measures, root mean squared error (RMSE) and mean absolute error (MAE), were used to evaluate the performance of the CEREF model and compare it with other models: the autoregressive (AR), RBFNN and EMD-RBFNN. Results indicated that the CEREF model had RMSE and MAE statistics that were lower, by 42.8% and 7.6% respectively, than those of the other models, and provided a superior alternative for forecasting annual runoff in the Wuding River basin. Moreover, the CEREF model can enlarge the effective intervals of streamflow forecasting compared to the EMD-RBFNN model by introducing the water demand planned by the government department to improve long-term prediction accuracy. In addition, we considered the high-frequency component, a frequent subject of concern in EMD

  2. Modeling Potential Impacts of Climate Change on Streamflow Using Projections of the 5th Assessment Report for the Bernam River Basin, Malaysia

    Directory of Open Access Journals (Sweden)

    Nkululeko Simeon Dlamini

    2017-03-01

    Full Text Available Potential impacts of climate change on the streamflow of the Bernam River Basin in Malaysia are assessed using ten Global Climate Models (GCMs) under three Representative Concentration Pathways (RCP4.5, RCP6.0 and RCP8.5). A graphical user interface was developed that integrates all of the common procedures of assessing climate change impacts, to generate high resolution climate variables (e.g., rainfall, temperature, etc.) at the local scale from large-scale climate models. These are linked in one executable module to generate future climate sequences that can be used as inputs to various models, including hydrological and crop models. The generated outputs were used as inputs to the SWAT hydrological model to simulate the hydrological processes. The evaluation results indicated that the model performed well for the watershed with monthly R2, Nash–Sutcliffe Efficiency (NSE) and Percent Bias (PBIAS) values of 0.67, 0.62 and −9.4 and 0.62, 0.61 and −4.2 for the calibration and validation periods, respectively. The multi-model projections show an increase in future temperature (tmax and tmin) in all respective scenarios, up to an average of 2.5 °C under the worst-case scenario (RCP8.5). Rainfall is also predicted to change, with clear variations between the dry and wet season. Streamflow projections also followed the rainfall pattern to a great extent, with a distinct change between the dry and wet season, possibly due to the increase in evapotranspiration in the watershed. In principle, the interface can be customized for application to other watersheds by incorporating GCMs’ baseline data and their corresponding future data for those particular stations in the new watershed. Methodological limitations of the study are also discussed.
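
    The calibration and validation results above are reported as NSE and PBIAS. For reference, these two metrics can be computed as follows; this is a generic implementation, and under the common sign convention used here (observed minus simulated), negative PBIAS values such as those quoted indicate slight overestimation.

      import numpy as np

      def nse(obs, sim):
          """Nash-Sutcliffe efficiency (1 is perfect, values below 0 are worse than the mean)."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def pbias(obs, sim):
          """Percent bias: 100 * sum(obs - sim) / sum(obs)."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 100.0 * np.sum(obs - sim) / np.sum(obs)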

  3. Remote Sensing-based Methodologies for Snow Model Adjustments in Operational Streamflow Prediction

    Science.gov (United States)

    Bender, S.; Miller, W. P.; Bernard, B.; Stokes, M.; Oaida, C. M.; Painter, T. H.

    2015-12-01

    Water management agencies rely on hydrologic forecasts issued by operational agencies such as NOAA's Colorado Basin River Forecast Center (CBRFC). The CBRFC has partnered with the Jet Propulsion Laboratory (JPL) under funding from NASA to incorporate research-oriented, remotely-sensed snow data into CBRFC operations and to improve the accuracy of CBRFC forecasts. The partnership has yielded valuable analysis of snow surface albedo as represented in JPL's MODIS Dust Radiative Forcing in Snow (MODDRFS) data, across the CBRFC's area of responsibility. When dust layers within a snowpack emerge, reducing the snow surface albedo, the snowmelt rate may accelerate. The CBRFC operational snow model (SNOW17) is a temperature-index model that lacks explicit representation of snowpack surface albedo. CBRFC forecasters monitor MODDRFS data for emerging dust layers and may manually adjust SNOW17 melt rates. A technique was needed for efficient and objective incorporation of the MODDRFS data into SNOW17. Initial development focused in Colorado, where dust-on-snow events frequently occur. CBRFC forecasters used retrospective JPL-CBRFC analysis and developed a quantitative relationship between MODDRFS data and mean areal temperature (MAT) data. The relationship was used to generate adjusted, MODDRFS-informed input for SNOW17. Impacts of the MODDRFS-SNOW17 MAT adjustment method on snowmelt-driven streamflow prediction varied spatially and with characteristics of the dust deposition events. The largest improvements occurred in southwestern Colorado, in years with intense dust deposition events. Application of the method in other regions of Colorado and in "low dust" years resulted in minimal impact. The MODDRFS-SNOW17 MAT technique will be implemented in CBRFC operations in late 2015, prior to spring 2016 runoff. Collaborative investigation of remote sensing-based adjustment methods for the CBRFC operational hydrologic forecasting environment will continue over the next several years.

  4. Estimating daily time series of streamflow using hydrological model calibrated based on satellite observations of river water surface width: Toward real world applications.

    Science.gov (United States)

    Sun, Wenchao; Ishidaira, Hiroshi; Bastola, Satish; Yu, Jingshan

    2015-05-01

    Lacking observation data for calibration constrains applications of hydrological models to estimate daily time series of streamflow. Recent improvements in remote sensing enable detection of river water-surface width from satellite observations, making possible the tracking of streamflow from space. In this study, a method calibrating hydrological models using river width derived from remote sensing is demonstrated through application to the ungauged Irrawaddy Basin in Myanmar. Generalized likelihood uncertainty estimation (GLUE) is selected as a tool for automatic calibration and uncertainty analysis. Of 50,000 randomly generated parameter sets, 997 are identified as behavioral, based on comparing model simulation with satellite observations. The uncertainty band of streamflow simulation can span most of 10-year average monthly observed streamflow for moderate and high flow conditions. Nash-Sutcliffe efficiency is 95.7% for the simulated streamflow at the 50% quantile. These results indicate that application to the target basin is generally successful. Beyond evaluating the method in a basin lacking streamflow data, difficulties and possible solutions for applications in the real world are addressed to promote future use of the proposed method in more ungauged basins. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
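
    In GLUE, parameter sets whose likelihood measure exceeds a behavioural threshold are retained and their simulations are combined into weighted prediction bounds. A minimal sketch of that post-processing step is given below; it is generic, not the authors' river-width-based likelihood, and the threshold and quantiles are assumptions.

      import numpy as np

      def glue_prediction_bounds(likelihoods, simulations, threshold,
                                 quantiles=(0.05, 0.5, 0.95)):
          """Keep behavioural parameter sets and form likelihood-weighted
          prediction quantiles at every time step.

          likelihoods : (n_sets,) informal likelihood scores (higher is better)
          simulations : (n_sets, n_times) simulated series, one row per set
          threshold   : behavioural cut-off applied to the likelihood measure
          """
          behavioural = likelihoods >= threshold
          w = likelihoods[behavioural]
          w = w / w.sum()
          sims = simulations[behavioural]
          bounds = {}
          for q in quantiles:
              col = []
              for t in range(sims.shape[1]):
                  order = np.argsort(sims[:, t])
                  cdf = np.cumsum(w[order])                       # weighted CDF
                  idx = min(np.searchsorted(cdf, q), len(cdf) - 1)
                  col.append(sims[order, t][idx])
              bounds[q] = np.asarray(col)
          return behavioural, bounds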

  5. Streamflow hindcasting in European river basins via multi-parametric ensemble of the mesoscale hydrologic model (mHM)

    Science.gov (United States)

    Noh, Seong Jin; Rakovec, Oldrich; Kumar, Rohini; Samaniego, Luis

    2016-04-01

    There have been tremendous improvements in distributed hydrologic modeling (DHM), which have made process-based simulation with a high spatiotemporal resolution applicable on a large spatial scale. Despite increasing information on the heterogeneous properties of a catchment, DHM is still subject to uncertainties inherently coming from model structure, parameters and input forcing. Sequential data assimilation (DA) may facilitate improved streamflow prediction via DHM by using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is, however, often ignored, mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. If parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of DHM may be insufficient to capture the dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) could effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we present a global multi-parametric ensemble approach to incorporate the parametric uncertainty of DHM in DA to improve streamflow predictions. To effectively represent and control the uncertainty of high-dimensional parameters with a limited number of ensemble members, the MPR method is incorporated with DA. Lagged particle filtering is utilized to account for the response times and non-Gaussian characteristics of internal hydrologic processes. Hindcasting experiments are implemented to evaluate the impacts of the proposed DA method on streamflow predictions in multiple European river basins

  6. Parameterizing sub-surface drainage with geology to improve modeling streamflow responses to climate in data limited environments

    Directory of Open Access Journals (Sweden)

    C. L. Tague

    2012-07-01

    Full Text Available Hydrologic models are one of the core tools used to project how water resources may change under a warming climate. These models are typically applied over a range of scales, from headwater streams to higher order rivers, and for a variety of purposes, such as evaluating changes to aquatic habitat or reservoir operation. Most hydrologic models require streamflow data to calibrate subsurface drainage parameters. In many cases, long-term gage records may not be available for calibration, particularly when assessments are focused on low order stream reaches. Consequently, hydrologic modeling of climate change impacts is often performed in the absence of sufficient data to fully parameterize these hydrologic models. In this paper, we assess a geologic-based strategy for assigning drainage parameters. We examine the performance of this modeling strategy for the McKenzie River watershed in the US Oregon Cascades, a region where previous work has demonstrated sharp contrasts in hydrology based primarily on geological differences between the High and Western Cascades. Based on calibration and verification using existing streamflow data, we demonstrate that: (1) a set of streams ranging from 1st to 3rd order within the Western Cascade geologic region can share the same drainage parameter set, and (2) streams from the High Cascade geologic region, however, require a distinctive parameter set. Further, we show that a watershed comprised of a mixture of High and Western Cascade geology can be modeled without additional calibration by transferring parameters from these distinctive High and Western Cascade end-member parameter sets. Using this geologically-based parameter transfer scheme, our model predictions for all watersheds capture dominant historic streamflow patterns, and are sufficiently accurate to resolve geo-climatic differences in how these different watersheds are likely to respond to simple warming scenarios.

  7. Streamflow in the upper Mississippi river basin as simulated by SWAT driven by 20th century contemporary results of global climate models and NARCCAP regional climate models

    Energy Technology Data Exchange (ETDEWEB)

    Takle, Eugene S.; Jha, Manoj; Lu, Er; Arritt, Raymond W.; Gutowski, William J. [Iowa State Univ. Ames, IA (United States)

    2010-06-15

    We use the Soil and Water Assessment Tool (SWAT), driven by observations and by results of climate models, to evaluate hydrological quantities, including streamflow, in the Upper Mississippi River Basin (UMRB) for 1981-2003 in comparison to observed streamflow. Daily meteorological conditions used as input to SWAT are taken from (1) observations at weather stations in the basin, (2) daily meteorological conditions simulated by a collection of regional climate models (RCMs) driven by reanalysis boundary conditions, and (3) daily meteorological conditions simulated by a collection of global climate models (GCMs). Regional models used are those whose data are archived by the North American Regional Climate Change Assessment Program (NARCCAP). Results show that regional models correctly simulate the seasonal cycle of precipitation, temperature, and streamflow within the basin. Regional models also capture interannual extremes represented by the flood of 1993 and the dry conditions of 2000. The ensemble means of both the GCM-driven and RCM-driven simulations by SWAT capture both the timing and amplitude of the seasonal cycle of streamflow, with neither demonstrating significant superiority at the basin level. (orig.)

  8. Comparative analysis of various real-time data assimilation approaches for assimilating streamflow into a hydrologic routing model

    Science.gov (United States)

    Noh, Seong Jin; Mazzoleni, Maurizio; Lee, Haksu; Liu, Yuqiong; Seo, Dong Jun; Solomatine, Dimitri

    2016-04-01

    Reliable water depth estimation is an extremely important issue in operational early flood warning systems. Different water system models have been implemented in the last decades, and, in parallel, data assimilation approaches have been introduced in order to reduce the uncertainty of such models. The goal of this study is to compare the performances of a distributed hydrologic routing model with streamflow assimilation using six different data assimilation methods, including direct insertion, nudging, the Kalman filter, the Ensemble Kalman filter, the Asynchronous Ensemble Kalman filter and a variational method. The model used in this study is a 3-parameter Muskingum model (O'Donnell, 1985), which was implemented for the Trinity River, within the Dallas-Fort-Worth Metroplex area in Texas, USA. The first methodological step is to discretize the river reach into multiple 1-km sub-reaches in order to estimate water depth in a distributed fashion. Then, different data assimilation approaches were implemented using the state-space formulation of the Muskingum model proposed by Georgakakos (1990). Finally, streamflow observations were assimilated at two points where flow sensors are located. The results of this work showed that assimilation of streamflow observations can noticeably improve the hydrologic routing model prediction and that ensemble definition is particularly important for both the Ensemble Kalman filter and the Asynchronous Ensemble Kalman filter. This study is part of the FP7 European Project WeSenseIt Citizen Water Observatory (www.http://wesenseit.eu/) and NSF Project Integrated Sensing and Prediction of urban Water for Sustainable Cities (http://ispuw.uta.edu/nsf)
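
    The routing model being updated in this comparison is the classical Muskingum scheme, O[t+1] = C0*I[t+1] + C1*I[t] + C2*O[t]. A basic, non-assimilating implementation is sketched below for reference; K, x and the time step are illustrative values, not those calibrated for the Trinity River.

      import numpy as np

      def muskingum_route(inflow, K=12.0, x=0.2, dt=1.0, outflow0=None):
          """Route an inflow hydrograph through a reach with the Muskingum scheme.

          K : storage time constant (same units as dt)
          x : weighting factor (0 <= x <= 0.5)
          """
          denom = 2.0 * K * (1.0 - x) + dt
          c0 = (dt - 2.0 * K * x) / denom
          c1 = (dt + 2.0 * K * x) / denom
          c2 = (2.0 * K * (1.0 - x) - dt) / denom      # c0 + c1 + c2 == 1
          out = np.empty_like(np.asarray(inflow, float))
          out[0] = inflow[0] if outflow0 is None else outflow0
          for t in range(len(inflow) - 1):
              out[t + 1] = c0 * inflow[t + 1] + c1 * inflow[t] + c2 * out[t]
          return out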

  9. Monthly hydrometeorological ensemble prediction of streamflow droughts and corresponding drought indices

    Directory of Open Access Journals (Sweden)

    F. Fundel

    2013-01-01

    Full Text Available Streamflow droughts, characterized by low runoff as a consequence of a drought event, affect numerous aspects of life. Economic sectors that are impacted by low streamflow are, e.g., power production, agriculture, tourism, water quality management and shipping. Those sectors could potentially benefit from forecasts of streamflow drought events, even of short events on monthly time scales or below. Numerical hydrometeorological models have increasingly been used to forecast low streamflow and have become the focus of recent research. Here, we consider daily ensemble runoff forecasts for the river Thur, which has its source in the Swiss Alps. We focus on the evaluation of low streamflow and of derived indices such as duration, severity and magnitude, characterizing streamflow droughts up to a lead time of one month.

    The ECMWF VarEPS 5-member ensemble reforecast, which covers 18 yr, is used as forcing for the hydrological model PREVAH. A thorough verification reveals that, compared to probabilistic peak-flow forecasts, which show skill up to a lead time of two weeks, forecasts of streamflow droughts are skilful over the entire forecast range of one month. For forecasts at the lower end of the runoff regime, the quality of the initial state seems to be crucial to achieve a good forecast quality in the longer range. It is shown that the states used in this study to initialize forecasts satisfy this requirement. The produced forecasts of streamflow drought indices, derived from the ensemble forecasts, could be beneficially included in a decision-making process. This is valid for probabilistic forecasts of streamflow drought events falling below a daily varying threshold, based on a quantile derived from a runoff climatology. Although the forecasts have a tendency to overpredict streamflow droughts, it is shown that the relative economic value of the ensemble forecasts reaches up to 60%, in case a forecast user is able to take preventive
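
    The drought indices evaluated above, duration, severity and magnitude, are defined from periods when flow stays below a (possibly daily varying) threshold. The following is a generic sketch of extracting these event indices from a flow series and a threshold climatology; it is not the PREVAH/VarEPS processing chain, and the index definitions follow common usage.

      import numpy as np

      def drought_events(q, threshold):
          """Identify below-threshold events and report duration, severity, magnitude.

          duration  : number of consecutive time steps below the threshold
          severity  : accumulated deficit (threshold - flow) over the event
          magnitude : mean deficit per time step (severity / duration)
          """
          q = np.asarray(q, float)
          thr = np.broadcast_to(np.asarray(threshold, float), q.shape)
          below = q < thr
          events, start = [], None
          for t, flag in enumerate(np.append(below, False)):  # sentinel closes the last event
              if flag and start is None:
                  start = t
              elif not flag and start is not None:
                  deficit = thr[start:t] - q[start:t]
                  events.append({"duration": t - start,
                                 "severity": float(deficit.sum()),
                                 "magnitude": float(deficit.mean())})
                  start = None
          return events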

  10. Monthly hydrometeorological ensemble prediction of streamflow droughts and corresponding drought indices

    Science.gov (United States)

    Fundel, F.; Jörg-Hess, S.; Zappa, M.

    2013-01-01

    Streamflow droughts, characterized by low runoff as a consequence of a drought event, affect numerous aspects of life. Economic sectors that are impacted by low streamflow are, e.g., power production, agriculture, tourism, water quality management and shipping. Those sectors could potentially benefit from forecasts of streamflow drought events, even of short events on monthly time scales or below. Numerical hydrometeorological models have increasingly been used to forecast low streamflow and have become the focus of recent research. Here, we consider daily ensemble runoff forecasts for the river Thur, which has its source in the Swiss Alps. We focus on the evaluation of low streamflow and of derived indices such as duration, severity and magnitude, characterizing streamflow droughts up to a lead time of one month. The ECMWF VarEPS 5-member ensemble reforecast, which covers 18 yr, is used as forcing for the hydrological model PREVAH. A thorough verification reveals that, compared to probabilistic peak-flow forecasts, which show skill up to a lead time of two weeks, forecasts of streamflow droughts are skilful over the entire forecast range of one month. For forecasts at the lower end of the runoff regime, the quality of the initial state seems to be crucial to achieve a good forecast quality in the longer range. It is shown that the states used in this study to initialize forecasts satisfy this requirement. The produced forecasts of streamflow drought indices, derived from the ensemble forecasts, could be beneficially included in a decision-making process. This is valid for probabilistic forecasts of streamflow drought events falling below a daily varying threshold, based on a quantile derived from a runoff climatology. Although the forecasts have a tendency to overpredict streamflow droughts, it is shown that the relative economic value of the ensemble forecasts reaches up to 60%, in case a forecast user is able to take preventive action based on the forecast.

  11. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    Full Text Available In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because of the multiple experimental protocols employed in different laboratories and the strengthening of their reliability, which presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates the thriving rather than the crisis of experimental neurobiology.

  12. How does spatial variability of climate affect catchment streamflow predictions?

    Science.gov (United States)

    Spatial variability of climate can negatively affect catchment streamflow predictions if it is not explicitly accounted for in hydrologic models. In this paper, we examine the changes in streamflow predictability when a hydrologic model is run with spatially variable (distribute...

  13. Realtime Prediction in Disturbed Landscapes: Identifying Highest Priority Disturbance Characteristics Impacting Streamflow Response in a CONUS-Scale Operational Model

    Science.gov (United States)

    Dugger, A. L.; Gochis, D. J.; Yu, W.; McCreight, J. L.; Barlage, M. J.

    2015-12-01

    The "next generation" of hydrologic prediction systems - targeting unified, process-based, real-time prediction of the total water cycle - bring with them an increased need for real-time land surface characterization. Climatologically-derived estimates may perform well under stationary conditions, however disturbance can significantly alter hydrologic behavior and may be poorly represented by mean historical conditions. Fortunately, remote sensing and on-the-ground observation networks are collecting snapshots of these land characteristics over an increasing fraction of the globe. Given the computing constraints of operating a large-domain, real-time prediction system, to take advantage of these data streams we need a way to prioritize which landscape characteristics are most important to hydrologic prediction post-disturbance. To address this need, we setup a model experiment over the contiguous US using the community WRF-Hydro system with the NoahMP land surface model to assess the value of incorporating various aspects of disturbed landscapes into a real-time streamflow prediction model. WRF-Hydro will serve as the initial operational model for the US National Weather Service's new national water prediction effort, so use of WRF-Hydro allows us to leverage both an existing CONUS-scale model implementation and a short research-to-operations path. We first identify USGS GAGES-II basins that experienced more than 25% forest loss between 2000 and 2013. Based on basin disturbance type, geophysical setting, and climate regime, we formulate a conceptual model of which "disturbed" landscape characteristics we expect to dominate streamflow response. We test our conceptual model using WRF-Hydro by modeling a baseline (no disturbance) case, and then bringing in empirically-derived model state shifts representing key disturbance characteristics (e.g., leaf area index, rooting depth, overland roughness, surface detention). For each state update and each basin, we quantify

  14. Testing and validating environmental models

    Science.gov (United States)

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series

  15. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  16. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    The primary objective of the study was to quantitatively test the DART model, which, despite being one of the most popular representations of the co-creation concept, has so far been studied almost solely with qualitative methods. To this end, the researchers developed a multiple measurement scale and employed it in interviewing managers. The statistical evidence for adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to form an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model in which co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  17. StreamFlow 1.0: an extension to the spatially distributed snow model Alpine3D for hydrological modelling and deterministic stream temperature prediction

    Science.gov (United States)

    Gallice, Aurélien; Bavay, Mathias; Brauchli, Tristan; Comola, Francesco; Lehning, Michael; Huwald, Hendrik

    2016-12-01

    Climate change is expected to strongly impact the hydrological and thermal regimes of Alpine rivers within the coming decades. In this context, the development of hydrological models accounting for the specific dynamics of Alpine catchments appears to be one of the promising approaches to reduce our uncertainty about future mountain hydrology. This paper describes the improvements brought to StreamFlow, an existing model for hydrological and stream temperature prediction built as an external extension to the physically based snow model Alpine3D. StreamFlow's source code has been entirely written anew, taking advantage of object-oriented programming to significantly improve its structure and ease the implementation of future developments. The source code is now publicly available online, along with a complete documentation. A special emphasis has been put on modularity during the re-implementation of StreamFlow, so that many model aspects can be represented using different alternatives. For example, several options are now available to model the advection of water within the stream. This allows for an easy and fast comparison between different approaches and helps in defining more reliable uncertainty estimates of the model forecasts. In particular, a case study in a Swiss Alpine catchment reveals that the stream temperature predictions are particularly sensitive to the approach used to model the temperature of subsurface flow, a fact which has been poorly reported in the literature to date. Based on the case study, StreamFlow is shown to reproduce hourly mean discharge with a Nash-Sutcliffe efficiency (NSE) of 0.82 and hourly mean temperature with an NSE of 0.78.
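
    The NSE values quoted above follow the standard Nash-Sutcliffe definition. The minimal Python sketch below (not taken from the StreamFlow code base; variable names and the sample values are purely illustrative) shows how the metric is computed from paired observed and simulated series.

    ```python
    import numpy as np

    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 - (sum of squared errors) / (variance of observations).

        Returns 1.0 for a perfect fit, 0.0 for a model no better than the
        observed mean, and negative values for a model worse than the mean.
        """
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

    # Example: hourly mean discharge (m^3/s), purely illustrative numbers
    obs = [2.1, 2.4, 3.0, 5.6, 4.2, 3.1]
    sim = [2.0, 2.5, 3.2, 5.1, 4.4, 3.0]
    print(round(nash_sutcliffe(obs, sim), 2))
    ```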

  18. Parameterizing sub-surface drainage with geology to improve modeling streamflow responses to climate in data limited environments

    Directory of Open Access Journals (Sweden)

    C. L. Tague

    2013-01-01

    Hydrologic models are one of the core tools used to project how water resources may change under a warming climate. These models are typically applied over a range of scales, from headwater streams to higher order rivers, and for a variety of purposes, such as evaluating changes to aquatic habitat or reservoir operation. Most hydrologic models require streamflow data to calibrate subsurface drainage parameters. In many cases, long-term gage records may not be available for calibration, particularly when assessments are focused on low-order stream reaches. Consequently, hydrologic modeling of climate change impacts is often performed in the absence of sufficient data to fully parameterize these hydrologic models. In this paper, we assess a geology-based strategy for assigning drainage parameters. We examine the performance of this modeling strategy for the McKenzie River watershed in the Oregon Cascades (USA), a region where previous work has demonstrated sharp contrasts in hydrology based primarily on geological differences between the High and Western Cascades. Based on calibration and verification using existing streamflow data, we demonstrate that (1) a set of streams ranging from 1st to 3rd order within the Western Cascade geologic region can share the same drainage parameter set, while (2) streams from the High Cascade geologic region require a different parameter set. Further, we show that a watershed comprised of a mixture of High and Western Cascade geologies can be modeled without additional calibration by transferring parameters from these distinctive High and Western Cascade end-member parameter sets. More generally, we show that by defining a set of end-member parameters that reflect different geologic classes, we can more efficiently apply a hydrologic model over a geologically complex landscape and resolve geo-climatic differences in how different watersheds are likely to respond to simple warming scenarios.

  19. Quantifying Streamflow Variations in Ungauged Lake Basins by Integrating Remote Sensing and Water Balance Modelling: A Case Study of the Erdos Larus relictus National Nature Reserve, China

    Directory of Open Access Journals (Sweden)

    Kang Liang

    2017-06-01

    Hydrological predictions in ungauged lakes are one of the most important issues in hydrological sciences. The habitat of the Relict Gull (Larus relictus) in the Erdos Larus relictus National Nature Reserve (ELRNNR) has been seriously endangered by lake shrinkage, yet the hydrological processes in the catchment are poorly understood due to the lack of in-situ observations. Therefore, it is necessary to assess the variation in lake streamflow and its drivers. In this study, we employed a remote sensing technique and an empirical equation to quantify the time series of lake water budgets, and integrated a water balance model and the climate elasticity method to further examine ELRNNR basin streamflow variations from 1974 to 2013. The results show that lake variations went through three phases with significant differences: the rapidly expanding sub-period (1974–1979), the relatively stable sub-period (1980–1999), and the dramatically shrinking sub-period (2000–2013). Both climate variation (expressed by precipitation and evapotranspiration) and human activities were quantified as drivers of streamflow variation, and the driving forces in the three phases had different contributions. As human activities gradually intensified, their contribution to streamflow variation clearly increased, accounting for 22.3% during 1980–1999 and up to 59.2% during 2000–2013. Intensified human interference and climate warming have jointly led to the lake shrinkage since 1999. This study provides a useful reference for quantifying lake streamflow and its drivers in ungauged basins.
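
    The climate elasticity method mentioned above is commonly implemented as a first-order split of the observed streamflow change between two periods. The sketch below is a generic illustration of that bookkeeping, assuming precipitation and evapotranspiration elasticities estimated elsewhere (e.g., from a Budyko-type analysis); none of the variable names or numbers come from the ELRNNR study itself.

    ```python
    def attribute_streamflow_change(Q1, Q2, P1, P2, E1, E2, eps_P, eps_E):
        """Split the streamflow change between a baseline period (1) and an
        impacted period (2) into a climate-driven part and a residual
        attributed to human activities.

        eps_P and eps_E are streamflow elasticities to precipitation and
        potential evapotranspiration (assumed known); a sketch, not the
        study's own formulation.
        """
        dQ_total = Q2 - Q1
        dQ_climate = (eps_P * (P2 - P1) / P1 + eps_E * (E2 - E1) / E1) * Q1
        dQ_human = dQ_total - dQ_climate                 # residual term
        pct_human = 100.0 * dQ_human / dQ_total          # share attributed to humans
        return dQ_climate, dQ_human, pct_human
    ```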

  20. Monthly streamflow forecasting in the Rhine basin

    Science.gov (United States)

    Schick, Simon; Rössler, Ole; Weingartner, Rolf

    2017-04-01

    Forecasting seasonal streamflow of the Rhine river is of societal relevance, as the Rhine is an important waterway and water resource in Western Europe. The present study investigates the predictability of monthly mean streamflow at lead times of zero, one, and two months, with a focus on the potential benefits of integrating seasonal climate predictions. Specifically, we use seasonal predictions of precipitation and surface air temperature released by the European Centre for Medium-Range Weather Forecasts (ECMWF) for a regression analysis. In order to disentangle forecast uncertainty, the 'Reverse Ensemble Streamflow Prediction' framework is adapted here to the context of regression: by using appropriate subsets of predictors, the regression model is constrained to either the initial conditions, the meteorological forcing, or both. An operational application is mimicked by equipping the model with the seasonal climate predictions provided by ECMWF. Finally, to mitigate the spatial aggregation of the meteorological fields, the model is also applied at the subcatchment scale, and the resulting predictions are combined afterwards. The hindcast experiment is carried out for the period 1982-2011 in cross-validation mode at two gauging stations, namely the Rhine at Lobith and Basel. The results show that monthly forecasts are skillful with respect to climatology only at zero lead time. In addition, at zero lead time the integration of seasonal climate predictions decreases the mean absolute error by 5 to 10 percent compared to forecasts which are solely based on initial conditions. This reduction is most likely induced by the seasonal prediction of precipitation rather than air temperature. The study is completed by benchmarking the regression model against runoff simulations from ECMWF's seasonal forecast system. By simply using basin averages followed by a linear bias correction, these runoff simulations translate well to monthly streamflow. Though the regression model

  1. Coupling hydraulic and hydrological models to simulate the streamflow of a large arctic river: The case of the Mackenzie River

    Science.gov (United States)

    Elshamy, M.; Pietroniro, A.; Wheater, H. S.

    2016-12-01

    Accurate simulation of river streamflow is essential for water resources management and climate change impact studies. Hydrological models often route the streamflow using simple hydrological routing techniques that do not consider the characteristics of river channels or the complex morphology present in certain rivers. Yet, for large river systems, as well as for regional and global modelling, routing effects can have a very significant impact on the magnitude of flood peaks and the timing of flows to seas and oceans. In this study, an approach to couple the MESH (Modélisation Environmentale Communautaire-Surface and Hydrology) model, which embeds the Canadian land surface scheme (CLASS), with a one-dimensional river hydraulic model (River-1D) of the main Mackenzie River and three of its main tributaries (Peace, Athabasca, and Slave) is reported. Of particular interest is the complexity of dealing with the large delta environment, where flow reversal and overbank storage are possible and can be a significant part of the water budget. Inflows at designated locations on those rivers are generated by the MESH hydrologic model run at 0.125° spatial resolution and a 30-minute temporal resolution. The one-dimensional hydraulic model simulates the routing along the river in a one-way coupling mode with due consideration to river ice processes, including freeze-up and break-up. This approach improves the accuracy of river flow simulations along the main stem of the Mackenzie and its main tributaries and allows for studying sediment transport and dynamic events, such as dam breaches or ice jam release and formation events.

  2. Complex networks for streamflow dynamics

    Directory of Open Access Journals (Sweden)

    B. Sivakumar

    2014-07-01

    Streamflow modeling is an enormously challenging problem, due to the complex and nonlinear interactions between climate inputs and landscape characteristics over a wide range of spatial and temporal scales. A basic idea in streamflow studies is to establish connections that generally exist, but attempts to identify such connections are largely dictated by the problem at hand and the system components in place. While numerous approaches have been proposed in the literature, our understanding of these connections remains far from adequate. The present study introduces the theory of networks, and in particular complex networks, to examine the connections in streamflow dynamics, with a particular focus on spatial connections. Monthly streamflow data observed over a period of 52 years from a large network of 639 monitoring stations in the contiguous United States are studied. The connections in this streamflow network are examined using the concept of clustering coefficient, which is a measure of local density and quantifies the network's tendency to cluster. The clustering coefficient analysis is performed with several different threshold levels, which are based on correlations in streamflow data between the stations. The clustering coefficient values of the 639 stations are used to obtain important information about the connections in the network and their extent, similarity and differences between stations/regions, and the influence of thresholds. The relationship of the clustering coefficient with the number of links/actual links in the network and the number of neighbors is also addressed. The results clearly indicate the usefulness of the network-based approach for examining connections in streamflow, with important implications for interpolation and extrapolation, classification of catchments, and predictions in ungaged basins.
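
    The threshold-and-cluster construction described above can be illustrated compactly with NetworkX: stations become nodes, an edge is added wherever the inter-station correlation exceeds a chosen threshold, and the clustering coefficient is read off each node. Everything below (array shapes, the 0.5 threshold, the random data) is illustrative rather than drawn from the study.

    ```python
    import numpy as np
    import networkx as nx

    def streamflow_network_clustering(flows, threshold):
        """Build an undirected network of stations whose streamflow series
        correlate above `threshold`, and return each node's clustering
        coefficient (local density of links among its neighbours).

        `flows` is an (n_months, n_stations) array; a sketch of the
        correlation-threshold idea, not the study's actual code.
        """
        corr = np.corrcoef(flows, rowvar=False)        # station-by-station correlations
        n = corr.shape[0]
        graph = nx.Graph()
        graph.add_nodes_from(range(n))
        for i in range(n):
            for j in range(i + 1, n):
                if corr[i, j] >= threshold:
                    graph.add_edge(i, j)
        return nx.clustering(graph)

    # Illustrative use with random data and a 0.5 correlation threshold
    rng = np.random.default_rng(0)
    coeffs = streamflow_network_clustering(rng.normal(size=(624, 20)), threshold=0.5)
    ```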

  3. Evaluating Impacts of climate and land use changes on streamflow using SWAT and land use models based CESM1-CAM5 Climate scenarios

    Science.gov (United States)

    Lin, Tzu Ping; Lin, Yu Pin; Lien, Wan Yu

    2015-04-01

    Climate change is projected to have various levels of impact on hydrological cycles around the world. This study considers these impacts together with the uncertainty of climate projections from the general circulation models (GCMs) of the Coupled Model Intercomparison Project (CMIP5), which had just been released for Taiwan in 2014. Because streamflow runs directly into the ocean owing to the steep terrain, and the difference in rainfall between the wet and dry seasons is pronounced, allocating water resources reasonably is very challenging in Taiwan, particularly under climate change. The purpose of this study was to evaluate the impacts of climate and land use changes on a small watershed in Taiwan. The AR5 GCM output data were adopted in this study and downscaled from monthly to daily weather data to serve as input to the hydrological model, the Soil and Water Assessment Tool (SWAT). The spatially explicit land use change model, the Conversion of Land Use and its Effects at Small regional extent (CLUE-s), was applied to simulate land use scenarios for 2020-2039. Combined climate and land use change scenarios were then used as input to the SWAT model to estimate future streamflows. With increasing precipitation, increasing urban area, and decreasing agricultural and grass land, the annual streamflow in most of the twenty-three subbasins also increased. In addition, owing to increasing rainfall in the wet season and decreasing rainfall in the dry season, the difference in streamflow between the wet and dry seasons also increased. This result indicates a more stringent challenge for water resource management in the future. Therefore, impacts on water resources caused by climate change and land use change should be considered in water resource planning for the Datuan river watershed. Keywords: SWAT, GCM, CLUE-s, streamflow, climate change, land use change

  4. Seasonal forecasts of the SINTEX-F coupled model applied to maize yield and streamflow estimates over north-eastern South Africa

    CSIR Research Space (South Africa)

    Malherbe, J

    2014-07-01

    Forecasts of a Global Coupled Model for austral summer with a 1-month lead are downscaled to end-of-season maize yields and accumulated streamflow over the Limpopo Province and adjacent districts in northeastern South Africa through application...

  5. NEXRAD quantitative precipitation estimates, data acquisition, and processing for the DuPage County, Illinois, streamflow-simulation modeling system

    Science.gov (United States)

    Ortel, Terry W.; Spies, Ryan R.

    2015-11-19

    Next-Generation Radar (NEXRAD) has become an integral component in the estimation of precipitation (Kitzmiller and others, 2013). The high spatial and temporal resolution of NEXRAD has revolutionized the ability to estimate precipitation across vast regions, which is especially beneficial in areas without a dense rain-gage network. With the improved precipitation estimates, hydrologic models can produce reliable streamflow forecasts for areas across the United States. NEXRAD data from the National Weather Service (NWS) has been an invaluable tool used by the U.S. Geological Survey (USGS) for numerous projects and studies; NEXRAD data processing techniques similar to those discussed in this Fact Sheet have been developed within the USGS, including the NWS Quantitative Precipitation Estimates archive developed by Blodgett (2013).

  6. An extreme learning machine model for the simulation of monthly mean streamflow water level in eastern Queensland.

    Science.gov (United States)

    Deo, Ravinesh C; Şahin, Mehmet

    2016-02-01

    A predictive model for streamflow has practical implications for understanding drought hydrology, environmental monitoring and agriculture, ecosystems and resource management. In this study, the state-of-the-art extreme learning machine (ELM) model was utilized to simulate the mean streamflow water level (QWL) for three hydrological sites in eastern Queensland (Gowrie Creek, Albert, and Mary River). The performance of the ELM model was benchmarked against the artificial neural network (ANN) model. The ELM model is a fast computational method using single-layer feedforward neural networks with randomly determined hidden neurons that learn the historical patterns embedded in the input variables. A set of nine predictors was utilized: the month (to consider the seasonality of QWL); rainfall; Southern Oscillation Index; Pacific Decadal Oscillation Index; ENSO Modoki Index; Indian Ocean Dipole Index; and Nino 3.0, Nino 3.4, and Nino 4.0 sea surface temperatures (SSTs). Variable selection was performed using cross-correlation with QWL, yielding the best inputs defined by (month; P; Nino 3.0 SST; Nino 4.0 SST; Southern Oscillation Index (SOI); ENSO Modoki Index (EMI)) for Gowrie Creek, (month; P; SOI; Pacific Decadal Oscillation (PDO); Indian Ocean Dipole (IOD); EMI) for Albert River, and (month; P; Nino 3.4 SST; Nino 4.0 SST; SOI; EMI) for the Mary River site. A three-layer neuronal structure trialed with activation equations defined by sigmoid, logarithmic, tangent sigmoid, sine, hardlim, triangular, and radial basis functions was utilized, resulting in optimum ELM models with the hard-limit function and architectures 6-106-1 (Gowrie Creek), 6-74-1 (Albert River), and 6-146-1 (Mary River). Alternative ELM and ANN models with two inputs (month and rainfall) and an ELM model with all nine inputs were also developed. The performance was evaluated using the mean absolute error (MAE), coefficient of determination (r2), Willmott's Index (d), peak deviation (Pdv), and Nash
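
    For readers unfamiliar with the ELM architecture described above, the sketch below illustrates the core idea: input-to-hidden weights are drawn at random and kept fixed, and only the hidden-to-output weights are solved for by least squares. It is a generic illustration assuming standardized inputs; the function names, the hard-limit threshold at zero, and the default of 100 hidden neurons are placeholders, not the study's 6-106-1/6-74-1/6-146-1 configurations.

    ```python
    import numpy as np

    def elm_fit_predict(X_train, y_train, X_test, n_hidden=100, seed=0):
        """Single-hidden-layer extreme learning machine: hidden weights are
        random and fixed, only the output weights are estimated, by least
        squares. Uses a hard-limit activation; a sketch, not the authors'
        implementation.
        """
        rng = np.random.default_rng(seed)
        n_inputs = X_train.shape[1]                 # X is assumed standardized
        W = rng.uniform(-1.0, 1.0, size=(n_inputs, n_hidden))   # random input weights
        b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random biases

        def hidden(X):
            return (X @ W + b > 0.0).astype(float)               # hardlim activation

        H = hidden(X_train)
        beta, *_ = np.linalg.lstsq(H, y_train, rcond=None)       # output weights
        return hidden(X_test) @ beta
    ```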

  7. Validating the Runoff from the PRECIS Model Using a Large-Scale Routing Model

    Institute of Scientific and Technical Information of China (English)

    CAO Lijuan; DONG Wenjie; XU Yinlong; ZHANG Yong; Michael SPARROW

    2007-01-01

    The streamflow over the Yellow River basin is simulated using the PRECIS (Providing REgional Climates for Impacts Studies) regional climate model driven by 15-year (1979-1993) ECMWF reanalysis data as the initial and lateral boundary conditions and an off-line large-scale routing model (LRM). The LRM uses physical catchment and river channel information and allows streamflow to be predicted for large continental rivers with a 1° × 1° spatial resolution. The results show that the PRECIS model can reproduce the general southeast to northwest gradient distribution of the precipitation over the Yellow River basin. The PRECIS-LRM model combination has the capability to simulate the seasonal and annual streamflow over the Yellow River basin. The simulated streamflow is generally coincident with the naturalized streamflow both in timing and in magnitude.

  8. Streamflow generation in humid West Africa: the role of Bas-fonds investigated with a physically based model of the Critical Zone

    Science.gov (United States)

    Hector, B.; Cohard, J. M.; Séguis, L.

    2015-12-01

    In West Africa, the drought that began in the 1970s-80s, together with intense land-use change due to increasing food demand, produced very contrasting responses in the water budgets of the critical zone (CZ), depending on the lithological and pedological contexts. In the Sahel, streamflow increased, mostly due to increasing Hortonian runoff from soil crusting, and so did groundwater storage. On the contrary, in the more humid southern Sudanian area, streamflow decreased and no clear signal has been observed concerning water storage in this hard-rock basement area. There, bas-fonds are fundamental landscape features. They are seasonally water-logged valley bottoms from which first-order streams originate, composed mostly of baseflow. They are a key feature for understanding streamflow generation processes. They also carry an important agronomic potential due to their moisture and nutrient availability. The role of bas-fonds in streamflow generation processes is investigated using a physically-based coupled model of the CZ, ParFlow-CLM, at catchment scale (10 km²). The model is evaluated against classical hydrological measurements (water table, soil moisture, streamflow, fluxes) acquired in the AMMA-CATCH observing system for the West African monsoon, but also against hybrid gravity data which measure integrated water storage changes. The bas-fond system is shown to be composed of two components with different time scales. The slow component is characterized by the seasonal and interannual amplitude of the permanent water table, which is disconnected from streams, fed by direct recharge and lowered by evapotranspiration, mostly from riparian areas. The fast component is characterized by thresholds in storage and by perched and permanent water tables surrounding the bas-fond during the wet season, which are linked with baseflow generation. This is a first step toward integrating these features into larger scale modeling of the critical zone for evaluating the effect of precipitation

  9. Improving daily streamflow forecasts in mountainous Upper Euphrates basin by multi-layer perceptron model with satellite snow products

    Science.gov (United States)

    Uysal, Gökçen; Şensoy, Aynur; Şorman, A. Arda

    2016-12-01

    This paper investigates the contribution of the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite Snow Cover Area (SCA) product and in-situ snow depth measurements to Artificial Neural Network (ANN) based daily streamflow forecasting in a mountainous river basin. In order to represent the non-linear structure of the snowmelt process, a Multi-Layer Perceptron (MLP) Feed-Forward Backpropagation (FFBP) architecture is developed and applied in the Upper Euphrates River Basin (10,275 km2) of Turkey, where snowmelt constitutes approximately 2/3 of the total annual volume of runoff during spring and early summer months. The snowmelt season is evaluated between March and July; seven years (2002-2008) of seasonal daily data are used for training, while three years (2009-2011) of seasonal daily data are reserved for forecasting. One of the fastest ANN training algorithms, Levenberg-Marquardt, is used for optimization of the network weights and biases. The consistency of the network is checked with four performance criteria: coefficient of determination (R2), Nash-Sutcliffe model efficiency (ME), root mean square error (RMSE) and mean absolute error (MAE). According to the results, SCA observations provide useful information for developing a neural network model to predict snowmelt runoff, whereas snow depth data alone are not sufficient. The highest performance is achieved when total daily precipitation and average air temperature data are combined with satellite snow cover data. The data preprocessing technique of Discrete Wavelet Analysis (DWA) is coupled with MLP modeling to further improve the runoff peak estimates. As a result, the Nash-Sutcliffe model efficiency is increased from 0.52 to 0.81 for training and from 0.51 to 0.75 for forecasting. Moreover, the results are compared with those of a conceptual model, the Snowmelt Runoff Model (SRM), applied using SCA as an input. The importance and main contribution of this study lie in the use of satellite snow products and data
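
    As an illustration of the kind of feed-forward network the study describes, the sketch below trains a small MLP on a predictor matrix whose columns are assumed to hold precipitation, air temperature and MODIS snow-cover fraction. It uses scikit-learn, whose MLPRegressor offers gradient-based solvers rather than the Levenberg-Marquardt algorithm used in the paper, so it is a stand-in, not a reproduction of the authors' setup.

    ```python
    from sklearn.neural_network import MLPRegressor

    def train_snowmelt_mlp(predictors_train, flow_train, predictors_test):
        """Feed-forward MLP for daily streamflow from precipitation, air
        temperature and satellite snow-cover fraction (assumed column
        layout of the predictor arrays).

        The 'adam' solver is scikit-learn's default and stands in for the
        Levenberg-Marquardt training used in the cited study.
        """
        mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
        mlp.fit(predictors_train, flow_train)
        return mlp.predict(predictors_test)
    ```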

  10. Adjustment of Peak Streamflows of a Tropical River for Urbanization

    Directory of Open Access Journals (Sweden)

    Ata Amini

    2009-01-01

    Peak runoff from a catchment is influenced by many factors, such as the intensity and duration of rainfall, catchment topography, catchment shape, land use and other variables. For a particular catchment, land use change and other human activities will alter the characteristics of the catchment hydrograph. Problem statement: As a result of urbanization, the magnitude of floods occurring in a catchment increases. It was found that land use change in the Langat River catchment has a clear impact on the annual peak streamflow record, particularly from 1983-2003, whereas it has no significant impact on the record from 1960-1982. Spatial data confirm the heavy development that occurred in the river basin from 1983-2003. Thus, urbanization makes the historical record of the Langat River non-homogeneous, which renders mathematical simulation of the unadjusted record inappropriate because of the poor output to be expected. Approach: In this study, the historical record of the Langat River, Selangor, Malaysia from 1960-2003 was used to study the impact of urbanization on streamflow. The annual peak streamflow was selected for this purpose. The peak streamflow record was divided into two sets: set one from 1960-1982 and set two from 1983-2003, representing the periods before and after urbanization, respectively. To adjust the set one data for urbanization, different adjustment factors were used to make the data homogeneous. A Normal model was applied to find the best factor for model fitness. Results: The best adjustment factors were selected by a trial-and-error technique based on a 95% confidence level. To determine the optimum adjustment factor from the best ones, the point of intersection between the homogeneity and Normal model evaluation curves was located. This point represented the optimum adjustment factor, and its value was found to be 1.9. An Autorun model was used to validate this finding, and the model prediction was found to be acceptable with reasonable accuracy. Conclusion

  11. Machine learning methods for empirical streamflow simulation: a comparison of model accuracy, interpretability, and uncertainty in seasonal watersheds

    Science.gov (United States)

    Shortridge, Julie E.; Guikema, Seth D.; Zaitchik, Benjamin F.

    2016-07-01

    In the past decade, machine learning methods for empirical rainfall-runoff modeling have seen extensive development and been proposed as a useful complement to physical hydrologic models, particularly in basins where data to support process-based models are limited. However, the majority of research has focused on a small number of methods, such as artificial neural networks, despite the development of multiple other approaches for non-parametric regression in recent years. Furthermore, this work has often evaluated model performance based on predictive accuracy alone, while not considering broader objectives, such as model interpretability and uncertainty, that are important if such methods are to be used for planning and management decisions. In this paper, we use multiple regression and machine learning approaches (including generalized additive models, multivariate adaptive regression splines, artificial neural networks, random forests, and M5 cubist models) to simulate monthly streamflow in five highly seasonal rivers in the highlands of Ethiopia and compare their performance in terms of predictive accuracy, error structure and bias, model interpretability, and uncertainty when faced with extreme climate conditions. While the relative predictive performance of models differed across basins, data-driven approaches were able to achieve reduced errors when compared to physical models developed for the region. Methods such as random forests and generalized additive models may have advantages in terms of visualization and interpretation of model structure, which can be useful in providing insights into physical watershed function. However, the uncertainty associated with model predictions under extreme climate conditions should be carefully evaluated, since certain models (especially generalized additive models and multivariate adaptive regression splines) become highly variable when faced with high temperatures.

  12. Impacts of land use change on watershed streamflow and sediment yield: An assessment using hydrologic modelling and partial least squares regression

    Science.gov (United States)

    Yan, B.; Fang, N. F.; Zhang, P. C.; Shi, Z. H.

    2013-03-01

    Summary: Understanding how changes in individual land use types influence the dynamics of streamflow and sediment yield would greatly improve the predictability of the hydrological consequences of land use changes and could thus help stakeholders to make better decisions. Multivariate statistics are commonly used to compare how individual land use types control the dynamics of streamflow or sediment yields. However, one issue with the use of conventional statistical methods to address relationships between land use types and streamflow or sediment yield is multicollinearity. In this study, an integrated approach involving hydrological modelling and partial least squares regression (PLSR) was used to quantify the contributions of changes in individual land use types to changes in streamflow and sediment yield. In a case study, hydrological modelling was conducted using land use maps from four time periods (1978, 1987, 1999, and 2007) for the Upper Du watershed (8973 km2) in China using the Soil and Water Assessment Tool (SWAT). Changes in streamflow and sediment yield between the two simulations conducted using the land use maps from 2007 and 1978 were found to be related to land use changes according to a PLSR, which was used to quantify the effect of this influence at the sub-basin scale. The major land use changes that affected streamflow in the studied catchment areas were related to changes in the farmland, forest and urban areas between 1978 and 2007; the corresponding regression coefficients were 0.232, -0.147 and 1.256, respectively, and the Variable Influence on Projection (VIP) was greater than 1. The dominant first-order factors affecting the changes in sediment yield in our study were farmland (VIP and regression coefficient of 1.762 and 14.343, respectively) and forest (VIP and regression coefficient of 1.517 and -7.746, respectively). The PLSR methodology presented in this paper is beneficial and novel, as it partially eliminates the multicollinearity
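
    A sketch of the PLSR-with-VIP screening described above, written with scikit-learn, is given below. X would hold per-sub-basin land use changes and y the change in streamflow or sediment yield; the VIP formula is the standard one, and nothing here is taken from the study's own code.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def pls_with_vip(X, y, n_components=2):
        """Fit a PLS regression and compute Variable Influence on Projection
        (VIP) scores; predictors with VIP > 1 are usually treated as
        influential, as in the abstract above. A generic sketch.
        """
        pls = PLSRegression(n_components=n_components).fit(X, y)
        T = pls.x_scores_                    # latent scores, (n_samples, n_components)
        W = pls.x_weights_                   # predictor weights, (n_features, n_components)
        Q = pls.y_loadings_                  # response loadings, (n_targets, n_components)
        p = X.shape[1]
        # Variance of y explained by each latent component
        ss = np.sum(T ** 2, axis=0) * np.sum(Q ** 2, axis=0)
        w_norm = (W / np.linalg.norm(W, axis=0)) ** 2
        vip = np.sqrt(p * (w_norm @ ss) / ss.sum())
        return pls.coef_, vip
    ```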

  13. Application of a Distributed, Physically Based, Hydrologic Model to Improve Streamflow Forecasts in the Upper Rio Grande Basin

    Science.gov (United States)

    Gorham, T. A.; Boyle, D. P.; McConnell, J. R.; Lamorey, G. W.; Markstrom, S.; Viger, R.; Leavesley, G.

    2001-12-01

    Approximately two-thirds of the runoff in the Rio Grande begins as seasonal snowpack in the headwaters above the USGS stream gaging stations at several points (nodes) above Albuquerque, New Mexico. Resource managers in the Rio Grande Basin rely on accurate short and long term forecasts of water availability and flow at these nodes to make important decisions aimed at achieving a balance among many different and competing water uses such as municipal, fish and wildlife, agricultural, and water quality. In this study, a distributed, physically based hydrologic model is used to investigate the degree of spatial and temporal distribution of snow and the processes that control snowmelt necessary to accurately simulate streamflow at seven of these nodes. Specifically, snow distribution and surface runoff are estimated using a combination of the USGS Modular Modeling System (MMS), GIS Weasel, Precipitation-Runoff Modeling System (PRMS), and XYZ snow distribution model. This highly collaborative work between researchers at the Desert Research Institute and the USGS is an important part of SAHRA (Sustainability of semi-Arid Hydrology and Riparian Areas) efforts aimed at improving models of snow distribution and snowmelt processes.

  14. Analysis of the hydrological response of a distributed physically-based model using post-assimilation (EnKF) diagnostics of streamflow and in situ soil moisture observations

    Science.gov (United States)

    Trudel, Mélanie; Leconte, Robert; Paniconi, Claudio

    2014-06-01

    Data assimilation techniques not only enhance model simulations and forecasts, they also provide the opportunity to obtain a diagnostic of both the model and the observations used in the assimilation process. In this research, an ensemble Kalman filter was used to assimilate streamflow observations at a basin outlet and at interior locations, as well as soil moisture at two different depths (15 and 45 cm). The simulation model is the distributed physically-based hydrological model CATHY (CATchment HYdrology), and the study site is the Des Anglais watershed, a 690 km2 river basin located in southern Quebec, Canada. Use of Latin hypercube sampling instead of a conventional Monte Carlo method to generate the ensemble reduced the size of the ensemble, and therefore the calculation time. Different post-assimilation diagnostics, based on innovations (observation minus background), analysis residuals (observation minus analysis), and analysis increments (analysis minus background), were used to evaluate assimilation optimality. An important issue in data assimilation is the estimation of error covariance matrices. These diagnostics were also used in a calibration exercise to determine the standard deviations of model parameters, forcing data, and observations that led to optimal assimilations. The analysis of innovations showed a lag between the model forecast and the observation during rainfall events. Assimilation of streamflow observations corrected this discrepancy. Assimilation of outlet streamflow observations improved the Nash-Sutcliffe efficiencies (NSE) between the model forecast (one day) and the observation at both the outlet and interior point locations, owing to the structure of the state vector used. However, assimilation of streamflow observations systematically increased the simulated soil moisture values.
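
    The three diagnostic quantities named above are simple differences between the observation, background (forecast), and analysis values. The short sketch below summarizes them by their means; the array names and the use of ensemble-mean values are illustrative assumptions, not the study's CATHY/EnKF implementation.

    ```python
    import numpy as np

    def assimilation_diagnostics(obs, background, analysis):
        """Post-assimilation diagnostics: innovations (obs - background),
        residuals (obs - analysis) and increments (analysis - background),
        each summarised by its mean over the assimilation period.
        """
        innovations = obs - background
        residuals = obs - analysis
        increments = analysis - background
        return {
            "mean_innovation": float(np.mean(innovations)),   # bias of the forecast
            "mean_residual": float(np.mean(residuals)),       # bias left after analysis
            "mean_increment": float(np.mean(increments)),     # average correction applied
        }
    ```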

  15. Streamflow alteration at selected sites in Kansas

    Science.gov (United States)

    Juracek, Kyle E.; Eng, Ken

    2017-06-26

    An understanding of streamflow alteration in response to various disturbances is necessary for the effective management of stream habitat for a variety of species in Kansas. Streamflow alteration can have negative ecological effects. Using a modeling approach, streamflow alteration was assessed for 129 selected U.S. Geological Survey streamgages in the State for which requisite streamflow and basin-characteristic information was available. The assessment involved a comparison of the observed condition from 1980 to 2015 with the predicted expected (least-disturbed) condition for 29 streamflow metrics. The metrics represent various characteristics of streamflow including average flow (annual, monthly) and low and high flow (frequency, duration, magnitude). Streamflow alteration in Kansas was indicated locally, regionally, and statewide. Given the absence of a pronounced trend in annual precipitation in Kansas, a precipitation-related explanation for streamflow alteration was not supported. Thus, the likely explanation for streamflow alteration was human activity. Locally, a flashier flow regime (typified by shorter lag times and more frequent and higher peak discharges) was indicated for three streamgages with urbanized basins that had higher percentages of impervious surfaces than other basins in the State. The combination of localized reservoir effects and regional groundwater pumping from the High Plains aquifer likely was responsible, in part, for diminished conditions indicated for multiple streamflow metrics in western and central Kansas. Statewide, the implementation of agricultural land-management practices to reduce runoff may have been responsible, in part, for a diminished duration and magnitude of high flows. In central and eastern Kansas, implemented agricultural land-management practices may have been partly responsible for an inflated magnitude of low flows at several sites.

  16. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, that are investigated by direct infrared imaging, showing that even at low current operation such gradients are present in fuel cell operation, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited due to induced high pressure drops in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones with good qualitative and quantitative agreement. The combination of accuracy and low computational time allow for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)

  17. Disentangling the response of streamflow to forest management and climate

    Science.gov (United States)

    Dymond, S.; Miniat, C.; Bladon, K. D.; Keppeler, E.; Caldwell, P. V.

    2016-12-01

    Paired watershed studies have showcased the relationships between forests, management, and streamflow. However, classical analyses of paired-watershed studies have done little to disentangle the effects of management from overarching climatic signals, potentially masking the interaction between management and climate. Such approaches may confound our understanding of how forest management impacts streamflow. Here we use a 50-year record of streamflow and climate data from the Caspar Creek Experimental Watersheds (CCEW), California, USA to separate the effects of forest management and climate on streamflow. CCEW has two treatment watersheds that have been harvested in the past 50 years. We used a nonlinear mixed model to combine the pre-treatment relationship between streamflow and climate and the post-treatment relationship via an interaction between climate and management into one equation. Our results show that precipitation and potential evapotranspiration alone can account for >95% of the variability in pre-treatment streamflow. Incorporating management scenarios into the model explained most of the variability in streamflow (R2 > 0.98). While forest harvesting altered streamflow in both of our modeled watersheds, removing 66% of the vegetation via selection logging using a tractor yarding system over the entire watershed had a more substantial impact on streamflow than clearcutting small portions of a watershed using cable-yarding. These results suggest that forest harvesting may result in differing impacts on streamflow and highlight the need to incorporate climate into streamflow analyses of paired-watershed studies.

  18. Long-term forest management effects on streamflow and evapotranspiration: modeling the interaction of vegetation and climate at the catchment scale

    Science.gov (United States)

    Ford, C. R.; Vose, J.

    2011-12-01

    Forested watersheds, an important provider of ecosystem services related to water supply, can have their structure, function, and resulting streamflow substantially altered by land use and land cover. Using a retrospective analysis and synthesis of long-term climate and streamflow data (75 years) from six watersheds differing in management histories, we explored whether streamflow, and thus evapotranspiration, responded differently to variation in annual temperature and extreme precipitation in managed than in unmanaged watersheds. We used a hybrid modeling approach that incorporated terms for the classic paired-watershed regression, the response of the vegetation regrowth, and the interaction of vegetation regrowth and precipitation. We show significant increases in temperature and in the frequency of extreme wet and dry years since the 1980s. Response models explained almost all streamflow variability (R2adj > 0.99). In all cases, changing land use altered streamflow. Observed watershed responses differed significantly in wet and dry extreme years in all but a stand managed as a coppice forest. Converting deciduous stands to pine altered the streamflow response to extreme annual precipitation the most; the apparent frequency of observed extreme wet years decreased on average by 7-fold. This effect was attributable partially to increased interception, but also to increased transpiration in the pine stand compared to the unmanaged, deciduous hardwood stand, as indicated by sap flow studies on individual species. This increased soil water storage may reduce flood risk in wet years, but create conditions that could exacerbate drought. Forest management can potentially mitigate extreme annual precipitation associated with climate change; however, offsetting effects suggest the need for spatially-explicit analyses of risk and vulnerability, as well as an increased understanding of the relative contributions of interception and transpiration across species and community types. To address

  19. Simulation of Streamflow in a Discontinuous Permafrost Environment Using a Modified First-order, Nonlinear Rainfall-runoff Model

    Science.gov (United States)

    Bolton, W. R.; Hinzman, L. D.

    2009-12-01

    The sub-arctic environment can be characterized by its location in the zone of discontinuous permafrost. Although the distribution of permafrost in this region is site-specific, it dominates the response of many of the hydrologic processes, including streamflow, soil moisture dynamics, and water storage processes. In areas underlain by permafrost, ice-rich conditions at the permafrost table inhibit surface water percolation to the deep subsurface soils, resulting in increased runoff generation during precipitation events, decreased baseflow between precipitation events, and relatively wetter soils compared to permafrost-free areas. Over the course of a summer season, the thawing of the active layer (the thin soil layer above the permafrost that seasonally freezes and thaws) increases the potential water holding capacity of the soil, resulting in a decreasing surface water contribution during precipitation events and a steadily increasing baseflow contribution between precipitation events. Simulation of streamflow in this region is challenging due to the rapidly changing thermal (permafrost versus non-permafrost, active layer development) and hydraulic (hydraulic conductivity and soil storage capacity) conditions in both time and space (x, y, and z-dimensions). Many of the factors that control both permafrost distribution and the thawing/freezing of the active layer (such as soil material, soil moisture, and ice content) are not easily quantified at scales beyond the point measurement. In this study, these issues are addressed through streamflow analysis - the only hydrologic process that is easily measured at the basin scale. Following the general procedure outlined in Kirchner (2008), a simple rainfall-runoff model was applied to three small headwater basins of varying permafrost coverage. A simple, first-order, non-linear differential equation that describes the storage-discharge relationship was derived from three years of streamflow data
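
    The first-order model referenced above treats the catchment as a single nonlinear reservoir, so that dQ/dt = g(Q)(P - E - Q), where g(Q) is a discharge sensitivity function inferred from recession analysis. The sketch below integrates that equation with a simple explicit-Euler step; the sensitivity function, time step and forcing arrays are placeholders rather than the calibrated model from the study.

    ```python
    import numpy as np

    def simulate_streamflow(precip, et, q0, g, dt=1.0):
        """First-order nonlinear storage-discharge model in the spirit of the
        Kirchner approach cited above: dQ/dt = g(Q) * (P - E - Q).

        g is a user-supplied discharge sensitivity function; explicit-Euler
        integration and non-negativity clipping are simplifications.
        """
        q = np.empty(len(precip))
        q_prev = q0
        for t, (p, e) in enumerate(zip(precip, et)):
            q_prev = max(q_prev + dt * g(q_prev) * (p - e - q_prev), 0.0)
            q[t] = q_prev
        return q
    ```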

  20. Measuring and modeling spatio-temporal patterns of groundwater storage dynamics to better understand nonlinear streamflow response

    Science.gov (United States)

    Rinderer, Michael; van Meerveld, Ilja; McGlynn, Brian

    2017-04-01

    Information about the spatial and temporal variability in catchment-scale groundwater storage is needed to identify runoff source area dynamics and better understand variability in streamflow. However, information on groundwater levels is typically only available at a limited number of monitoring sites, and interpolation or upscaling is necessary to obtain information on catchment-scale groundwater dynamics. Here we used data from 51 spatially distributed groundwater monitoring sites in a Swiss pre-alpine catchment and time series clustering to define six groundwater response clusters. Each of the clusters was distinct in terms of groundwater rise and recession but also had distinctly different topographic site characteristics, which allowed us to assign a groundwater response cluster to all non-monitored locations. Each of them was then assigned the mean groundwater response of the monitored cluster members. A site was considered active (i.e., enabling lateral subsurface flow) when the groundwater levels rose above the groundwater response threshold, which was defined based on the depth of the more transmissive soil layers (typically between 10 cm and 30 cm below the soil surface). This allowed us to create maps of the active areas across the catchment at 15 min time intervals. The mean fraction of agreement between modeled groundwater activation (based on the mean cluster member time series) and measured groundwater activation (based on the measured groundwater level time series at a monitoring site) was 0.91 (25th percentile: 0.88, median: 0.92, 75th percentile: 0.95). The fraction of agreement dropped by 10 to 15% at the beginning of events but was never lower than 0.4. Connectivity between all active areas and the stream network was determined using a graph theory approach. During rainfall events, the simulated active and connected area extended mainly laterally and longitudinally along the channel network, which is in agreement with the variable source
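
    The fraction-of-agreement statistic quoted above is simply the share of time steps at which the modeled and measured series agree on whether a site is active. A minimal sketch follows; it assumes levels are expressed so that values at or above the threshold mean the site is active, and none of the names come from the study.

    ```python
    import numpy as np

    def fraction_of_agreement(measured_levels, modelled_levels, threshold):
        """Fraction of time steps at which the modelled and measured series
        agree on whether a site is 'active' (groundwater level at or above
        the response threshold). Sign convention of the levels is assumed."""
        measured_active = np.asarray(measured_levels) >= threshold
        modelled_active = np.asarray(modelled_levels) >= threshold
        return float(np.mean(measured_active == modelled_active))
    ```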

  1. Validation for a recirculation model.

    Science.gov (United States)

    LaPuma, P T

    2001-04-01

    Recent Clean Air Act regulations designed to reduce volatile organic compound (VOC) emissions have placed new restrictions on painting operations. Treating large volumes of air which contain dilute quantities of VOCs can be expensive. Recirculating some fraction of the air allows an operator to comply with environmental regulations at reduced cost. However, there is a potential impact on employee safety because indoor pollutants will inevitably increase when air is recirculated. A computer model was developed, written in Microsoft Excel 97, to predict compliance costs and indoor air concentration changes with respect to changes in the level of recirculation for a given facility. The model predicts indoor air concentrations based on product usage and mass balance equations. This article validates the recirculation model using data collected from a C-130 aircraft painting facility at Hill Air Force Base, Utah. Air sampling data and air control cost quotes from vendors were collected for the Hill AFB painting facility and compared to the model's predictions. The model's predictions for strontium chromate and isocyanate air concentrations were generally between the maximum and minimum air sampling points with a tendency to predict near the maximum sampling points. The model's capital cost predictions for a thermal VOC control device ranged from a 14 percent underestimate to a 50 percent overestimate of the average cost quotes. A sensitivity analysis of the variables is also included. The model is demonstrated to be a good evaluation tool in understanding the impact of recirculation.
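
    The indoor-concentration side of such a model follows from a well-mixed mass balance on the painting enclosure. The sketch below shows the steady-state form for a single zone with part of the exhaust recirculated through a control device; it is a generic illustration under stated assumptions, not the cited Excel model's equations.

    ```python
    def steady_state_concentration(emission_rate, supply_flow, recirc_fraction,
                                   removal_efficiency=0.0):
        """Steady-state concentration in a well-mixed booth when a fraction of
        the exhaust is recirculated through a control device.

        Mass balance gives C = E / (Q_fresh + Q_recirc * eta): emission rate
        divided by the effective removal flow. Variable names and the
        single-zone assumption are illustrative.
        """
        fresh_flow = supply_flow * (1.0 - recirc_fraction)
        recirc_flow = supply_flow * recirc_fraction
        return emission_rate / (fresh_flow + recirc_flow * removal_efficiency)
    ```

    Raising the recirculation fraction lowers the fresh-air term in the denominator, which is why indoor concentrations inevitably rise unless the control device's removal efficiency compensates.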

  2. A weakly-constrained data assimilation approach to address rainfall-runoff model structural inadequacy in streamflow prediction

    Science.gov (United States)

    Lee, Haksu; Seo, Dong-Jun; Noh, Seong Jin

    2016-11-01

    This paper presents a simple yet effective weakly-constrained (WC) data assimilation (DA) approach for hydrologic models which accounts for model structural inadequacies associated with rainfall-runoff transformation processes. Compared to strongly-constrained (SC) DA, WC DA adjusts the control variables less while producing a similarly or more accurate analysis. Hence the adjusted model states are dynamically more consistent with those of the base model. The inadequacy of a rainfall-runoff model was modeled as an additive error to the runoff components prior to routing and penalized in the objective function. Two example modeling applications, distributed and lumped, were carried out to investigate the effects of the WC DA approach on DA results. For distributed modeling, the distributed Sacramento Soil Moisture Accounting (SAC-SMA) model was applied to the TIFM7 Basin in Missouri, USA. For lumped modeling, the lumped SAC-SMA model was applied to nineteen basins in Texas. In both cases, the variational DA (VAR) technique was used to assimilate discharge data at the basin outlet. For distributed SAC-SMA, spatially homogeneous error modeling yielded updated states that are spatially much more similar to the a priori states, as quantified by the Earth Mover's Distance (EMD), than spatially heterogeneous error modeling, by up to ∼10 times. DA experiments using both lumped and distributed SAC-SMA modeling indicated that assimilating outlet flow using the WC approach generally produces a smaller mean absolute difference as well as a higher correlation between the a priori and the updated states than the SC approach, while producing similar or smaller root mean square errors of the streamflow analysis and prediction. Large differences were found in both lumped and distributed modeling cases between the updated and the a priori lower zone tension and primary free water contents for both WC and SC approaches, indicating possible model structural deficiency in describing low flows or
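
    A weakly-constrained variational cost of the kind described above augments the usual background and observation terms with a penalty on the additive structural-error term. The sketch below writes that cost generically; the quadratic form, the covariance matrices, and the model operator signature are assumptions for illustration and do not reproduce the authors' SAC-SMA/VAR formulation.

    ```python
    import numpy as np

    def wc_var_objective(x, q, x_background, y_obs, model_operator,
                         B_inv, R_inv, Q_inv):
        """Weakly-constrained variational cost: background term, observation
        misfit, plus a penalty on the additive runoff-component error q
        representing model structural inadequacy (all matrices and the
        operator are placeholders).
        """
        dx = x - x_background
        dy = y_obs - model_operator(x, q)   # simulated discharge with the error term added
        return 0.5 * (dx @ B_inv @ dx) + 0.5 * (dy @ R_inv @ dy) + 0.5 * (q @ Q_inv @ q)
    ```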

  3. Validation of Magnetospheric Magnetohydrodynamic Models

    Science.gov (United States)

    Curtis, Brian

    Magnetospheric magnetohydrodynamic (MHD) models are commonly used for both prediction and modeling of Earth's magnetosphere. To date, very little validation has been performed to determine their limits, uncertainties, and differences. In this work, we performed a comprehensive analysis applying several validation techniques commonly used in the atmospheric sciences to MHD-based models of Earth's magnetosphere for the first time. The validation techniques of parameter variability/sensitivity analysis and comparison to other models were used on the OpenGGCM, BATS-R-US, and SWMF magnetospheric MHD models to answer several questions about how these models compare. The questions include: (1) the difference between the models' predictions prior to and following a reversal of Bz in the upstream interplanetary magnetic field (IMF) from positive to negative, (2) the influence of the preconditioning duration, and (3) the differences between models under extreme solar wind conditions. A differencing visualization tool was developed and used to address these three questions. We find: (1) For a reversal in IMF Bz from positive to negative, the OpenGGCM magnetopause is closest to Earth, as it has the weakest magnetic pressure near Earth. The differences in magnetopause positions between BATS-R-US and SWMF are explained by the influence of the ring current, which is included in SWMF. Densities are highest for SWMF and lowest for OpenGGCM. The OpenGGCM tail currents differ significantly from BATS-R-US and SWMF; (2) A longer preconditioning time allowed the magnetosphere to relax more, giving different positions for the magnetopause in all three models before the IMF Bz reversal. There were differences greater than 100% for all three models before the IMF Bz reversal. The differences in the current sheet region for the OpenGGCM were small after the IMF Bz reversal. The BATS-R-US and SWMF differences decreased to near zero after the IMF Bz reversal; (3) For extreme conditions in the solar

  4. Software Validation via Model Animation

    Science.gov (United States)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.

  5. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  6. Obstructive lung disease models: what is valid?

    Science.gov (United States)

    Ferdinands, Jill M; Mannino, David M

    2008-12-01

    Use of disease simulation models has led to scrutiny of model methods and demand for evidence that models credibly simulate health outcomes. We sought to describe recent obstructive lung disease simulation models and their validation. Medline and EMBASE were used to identify obstructive lung disease simulation models published from January 2000 to June 2006. Publications were reviewed to assess model attributes and four types of validation: first-order (verification/debugging), second-order (comparison with studies used in model development), third-order (comparison with studies not used in model development), and predictive validity. Six asthma and seven chronic obstructive pulmonary disease models were identified. Seven (54%) models included second-order validation, typically by comparing observed outcomes to simulations of source study cohorts. Seven (54%) models included third-order validation, in which modeled outcomes were usually compared qualitatively for agreement with studies independent of the model. Validation endpoints included disease prevalence, exacerbation, and all-cause mortality. Validation was typically described as acceptable, despite near-universal absence of criteria for judging adequacy of validation. Although over half of recent obstructive lung disease simulation models report validation, inconsistencies in validation methods and lack of detailed reporting make assessing adequacy of validation difficult. For simulation modeling to be accepted as a tool for evaluating clinical and public health programs, models must be validated to credibly simulate health outcomes of interest. Defining the required level of validation and providing guidance for quantitative assessment and reporting of validation are important future steps in promoting simulation models as practical decision tools.

  7. Developing and testing a global-scale regression model to quantify mean annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark A. J.; Hendriks, A. Jan; Beusen, Arthur H. W.; Clavreul, Julie; King, Henry; Schipper, Aafke M.

    2017-01-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF based on a dataset unprecedented in size, using observations of discharge and catchment characteristics from 1885 catchments worldwide, measuring between 2 and 10^6 km2. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area and catchment averaged mean annual precipitation and air temperature, slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error (RMSE) values were lower (0.29-0.38 compared to 0.49-0.57) and the modified index of agreement (d) was higher (0.80-0.83 compared to 0.72-0.75). Our regression model can be applied globally to estimate MAF at any point of the river network, thus providing a feasible alternative to spatially explicit process-based global hydrological models.
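
    As a hedged illustration of the two evaluation metrics named above, the sketch below computes RMSE and a modified index of agreement; Willmott's absolute-error form (d1) is used here, which may differ in detail from the exact formulation in the paper, and the observed/simulated values are made up.

```python
# Sketch of the two evaluation metrics named above, RMSE and a modified index of
# agreement; Willmott's absolute-error form (d1) is used here, which may differ in
# detail from the formulation in the paper. Observed/simulated values are made up.
import numpy as np

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((sim - obs) ** 2))

def modified_index_of_agreement(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    num = np.sum(np.abs(sim - obs))
    den = np.sum(np.abs(sim - obs.mean()) + np.abs(obs - obs.mean()))
    return 1.0 - num / den

obs = np.array([1.2, 2.5, 0.8, 3.1, 1.9])   # e.g. log10 of observed MAF
sim = np.array([1.1, 2.7, 0.9, 2.8, 2.0])   # regression predictions
print("RMSE:", rmse(obs, sim), " d:", modified_index_of_agreement(obs, sim))
```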

  8. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Science.gov (United States)

    Farmer, William H.; Koltun, Greg

    2017-01-01

    Study region: The state of Ohio in the United States, a humid, continental climate. Study focus: The estimation of nonexceedance probabilities of daily streamflows as an alternative means of establishing the relative magnitudes of streamflows associated with hydrologic and water-quality observations. New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighboring index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrent with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. In consideration of application, the degree of regulation in a set of sample sites helped to specify the streamgages required to implement kriging approaches successfully.
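
    The sketch below shows the shape of a leave-one-out cross-validation scored with Nash-Sutcliffe efficiency, as used in the evaluation above; a simple inverse-distance interpolator stands in for the pooled ordinary kriging model actually used, and the site coordinates and values are synthetic.

```python
# Sketch of a leave-one-out cross-validation loop scored with Nash-Sutcliffe
# efficiency. A simple inverse-distance interpolator stands in for the pooled
# ordinary kriging model used in the study; coordinates and values are synthetic.
import numpy as np

def idw_predict(x_train, y_train, x_target, power=2.0):
    d = np.linalg.norm(x_train - x_target, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * y_train) / np.sum(w)

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(30, 2))   # hypothetical streamgage locations
values = rng.uniform(0, 1, size=30)          # nonexceedance probabilities

loo_pred = np.array([
    idw_predict(np.delete(coords, i, axis=0), np.delete(values, i), coords[i])
    for i in range(len(values))
])
print("leave-one-out Nash-Sutcliffe:", nash_sutcliffe(values, loo_pred))
```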

  9. Testing the usefulness of hydrological models in simulating extreme streamflows for frequency analysis purpose

    Science.gov (United States)

    Chen, H.; Li, L.; Wang, J.; Xu, C.-Y.; Guo, S.

    2012-04-01

    Extreme flood events are becoming more uncertain and pose a growing challenge worldwide. Flood frequency analysis is a powerful tool to study and evaluate extreme flood events, and also a key step in the design of water resources projects. Hydrological models have been used as an important tool for forecasting extreme flood events and for design flood calculation. However, there are few studies evaluating the reasonability of flood frequency values obtained from runoff simulations of watershed hydrological models. In this study, the reasonability of the flood frequency analysis obtained from runoff simulations of different hydrological models is evaluated and analyzed by comparison with that from historical runoff observations. The Xiangjiang basin, one of the most important economic belts in Hunan Province, is selected as the study region. The basin faces a severe flood-control situation in summer and also has a great influence on Dongting Lake's flood storage capacity. In this study the Xiangjiang basin was divided into 3 sub-basins and 1 downstream section, each with its own outflow station. Each region has integrated, long observed historical runoff and rainfall series from 1961 to 2005. Three conceptual hydrological models, i.e., Xinanjiang, HBV and WASMOD, were established to simulate runoff in each sub-basin of the Xiangjiang basin. To utilize the simulations from the three hydrological models for frequency analysis, a transformation from deterministic rainfall-runoff models to stochastic models is needed; this is achieved by adding the model residuals to the simulated discharges using a Monte Carlo method. The Pearson type III distribution, commonly used in China, and L-moments were used to calculate the frequency. All three hydrological models perform well according to commonly used model evaluation criteria, i.e., the Nash-Sutcliffe model efficiency coefficient and water balance error, etc. However, the frequency analysis results of annual maximum flow simulated
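
    The hedged sketch below illustrates the stochastic step described above: model residuals are added to simulated annual maxima by Monte Carlo before a Pearson type III distribution is fitted. scipy's maximum-likelihood fit is used instead of the L-moment estimation of the study, and all numbers are invented.

```python
# Hedged illustration of the stochastic step described above: model residuals are
# added to simulated annual maxima by Monte Carlo before a Pearson type III
# distribution is fitted. scipy's maximum-likelihood fit is used here instead of
# the L-moment estimation of the study, and all numbers are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
simulated_annual_max = rng.gamma(shape=5.0, scale=400.0, size=45)   # m3/s, synthetic
residual_std = 150.0                                                # assumed residual scale

n_realisations = 200
perturbed = simulated_annual_max + rng.normal(
    0.0, residual_std, size=(n_realisations, simulated_annual_max.size))

return_periods = np.array([10, 50, 100])
prob = 1.0 - 1.0 / return_periods
quantiles = []
for sample in perturbed:
    skew, loc, scale = stats.pearson3.fit(sample)
    quantiles.append(stats.pearson3.ppf(prob, skew, loc=loc, scale=scale))
quantiles = np.array(quantiles)

for T, q in zip(return_periods, quantiles.mean(axis=0)):
    print(f"{T:>4d}-year flood estimate: {q:8.1f} m3/s")
```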

  10. Reliable long-range ensemble streamflow forecasts: Combining calibrated climate forecasts with a conceptual runoff model and a staged error model

    Science.gov (United States)

    Bennett, James C.; Wang, Q. J.; Li, Ming; Robertson, David E.; Schepen, Andrew

    2016-10-01

    We present a new streamflow forecasting system called forecast guided stochastic scenarios (FoGSS). FoGSS makes use of ensemble seasonal precipitation forecasts from a coupled ocean-atmosphere general circulation model (CGCM). The CGCM forecasts are post-processed with the method of calibration, bridging and merging (CBaM) to produce ensemble precipitation forecasts over river catchments. CBaM corrects biases and removes noise from the CGCM forecasts, and produces highly reliable ensemble precipitation forecasts. The post-processed CGCM forecasts are used to force the Wapaba monthly rainfall-runoff model. Uncertainty in the hydrological modeling is accounted for with a three-stage error model. Stage 1 applies the log-sinh transformation to normalize residuals and homogenize their variance; Stage 2 applies a conditional bias-correction to correct biases and help remove negative forecast skill; Stage 3 applies an autoregressive model to improve forecast accuracy at short lead-times and propagate uncertainty through the forecast. FoGSS generates ensemble forecasts in the form of time series for the coming 12 months. In a case study of two catchments, FoGSS produces reliable forecasts at all lead-times. Forecast skill with respect to climatology is evident to lead-times of about 3 months. At longer lead-times, forecast skill approximates that of climatology forecasts; that is, forecasts become like stochastic scenarios. Because forecast skill is virtually never negative at long lead-times, forecasts of accumulated volumes can be skillful. Forecasts of accumulated 12 month streamflow volumes are significantly skillful in several instances, and ensembles of accumulated volumes are reliable. We conclude that FoGSS forecasts could be highly useful to water managers.
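
    As a hedged illustration of Stage 1 of the error model described above, the sketch below applies a log-sinh transformation and its inverse; the parameter values are hypothetical, whereas in FoGSS they are estimated during calibration and the transform is embedded in a larger error model.

```python
# Sketch of the Stage 1 log-sinh transformation mentioned above, with hypothetical
# parameter values; in FoGSS the parameters are estimated during calibration and
# the transform is embedded in a larger three-stage error model.
import numpy as np

def log_sinh(y, a, b):
    """Forward transform z = (1/b) * ln(sinh(a + b*y))."""
    return np.log(np.sinh(a + b * y)) / b

def inverse_log_sinh(z, a, b):
    """Back-transform to the original streamflow space."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

a, b = 0.5, 0.01                      # assumed transform parameters
flow = np.array([5.0, 50.0, 500.0])   # monthly streamflow, arbitrary units
z = log_sinh(flow, a, b)
print(z)
print(inverse_log_sinh(z, a, b))      # recovers the original flows
```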

  11. Comparing trends in European streamflow records to hydrological change in a large-scale model intercomparison experiment

    NARCIS (Netherlands)

    Stahl, K.; Tallaksen, L.M.; Lanen, van H.A.J.

    2011-01-01

    In Europe, an overall appraisal of runoff changes at a continental scale has long been hindered by the paucity of readily-available runoff data. Recently, a coherent picture of hydrological trends in the most recent decades (1960-2000) has emerged from regional analyses of streamflow observations. T

  12. Application of the Water Erosion Prediction Project (WEPP) Model to simulate streamflow in a PNW forest watershed

    Science.gov (United States)

    A. Srivastava; M. Dobre; E. Bruner; W. J. Elliot; I. S. Miller; J. Q. Wu

    2011-01-01

    Assessment of water yields from watersheds into streams and rivers is critical to managing water supply and supporting aquatic life. Surface runoff typically contributes the most to peak discharge of a hydrograph while subsurface flow dominates the falling limb of hydrograph and baseflow contributes to streamflow from shallow unconfined aquifers primarily during the...

  13. Multivariate power-law models for streamflow prediction in the Mekong Basin

    Directory of Open Access Journals (Sweden)

    Guillaume Lacombe

    2014-11-01

    New hydrological insights for the region: A combination of 3–6 explanatory variables – chosen among annual rainfall, drainage area, perimeter, elevation, slope, drainage density and latitude – is sufficient to predict a range of flow metrics with a prediction R-squared ranging from 84 to 95%. The inclusion of forest or paddy percentage coverage as an additional explanatory variable led to slight improvements in the predictive power of some of the low-flow models (lowest prediction R-squared = 89%). A physical interpretation of the model structure was possible for most of the resulting relationships. Compared to regional regression models developed in other parts of the world, this new set of equations performs reasonably well.
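
    A minimal sketch of the multivariate power-law idea described above is given below: a model of the form Q = a * A^b1 * P^b2 fitted by ordinary least squares in log space. The predictors, coefficients and data are synthetic, and only two of the explanatory variables listed above are used.

```python
# Minimal sketch of a multivariate power-law model, Q = a * A^b1 * P^b2, fitted by
# ordinary least squares in log space; predictors and data are synthetic and only
# two of the explanatory variables listed above are used.
import numpy as np

rng = np.random.default_rng(2)
n = 40
area = rng.uniform(100, 50_000, n)        # drainage area, km2
rain = rng.uniform(1000, 2500, n)         # annual rainfall, mm
flow = 1e-4 * area**0.95 * rain**1.3 * rng.lognormal(0.0, 0.1, n)   # synthetic metric

X = np.column_stack([np.ones(n), np.log(area), np.log(rain)])
coef, *_ = np.linalg.lstsq(X, np.log(flow), rcond=None)
a, b1, b2 = np.exp(coef[0]), coef[1], coef[2]
print(f"Q ~ {a:.3g} * A^{b1:.2f} * P^{b2:.2f}")
```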

  14. Quantifying Uncertainty in Flood Inundation Mapping Using Streamflow Ensembles and Multiple Hydraulic Modeling Techniques

    Science.gov (United States)

    Hosseiny, S. M. H.; Zarzar, C.; Gomez, M.; Siddique, R.; Smith, V.; Mejia, A.; Demir, I.

    2016-12-01

    The National Water Model (NWM) provides a platform for operationalizing nationwide flood inundation forecasting and mapping. The ability to model flood inundation on a national scale will provide invaluable information to decision makers and local emergency officials. Often, forecast products use deterministic model output to provide a visual representation of a single inundation scenario, which is subject to uncertainty from various sources. While this provides a straightforward representation of the potential inundation, the inherent uncertainty associated with the model output should be considered to optimize this tool for decision-making support. The goal of this study is to produce ensembles of future flood inundation conditions (i.e. extent, depth, and velocity) to spatially quantify and visually assess uncertainties associated with the predicted flood inundation maps. The setting for this study is a highly urbanized watershed along Darby Creek in Pennsylvania. A forecasting framework coupling the NWM with multiple hydraulic models was developed to produce a suite of ensembles of future flood inundation predictions. Time-lagged ensembles from the NWM short-range forecasts were used to account for uncertainty associated with the hydrologic forecasts. The forecasts from the NWM were input to the iRIC and HEC-RAS two-dimensional software packages, from which water extent, depth, and flow velocity were output. Quantifying the agreement between output ensembles for each forecast grid provided the uncertainty metrics for predicted flood water inundation extent, depth, and flow velocity. For visualization, a series of flood maps that display flood extent, water depth, and flow velocity along with the underlying uncertainty associated with each of the forecasted variables were produced. The results from this study demonstrate the potential to incorporate and visualize model uncertainties in flood inundation maps in order to identify high flood-risk zones.
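
    As a hedged sketch of the per-grid agreement idea described above, the example below computes, for each cell, the fraction of ensemble members that flood it and the spread of predicted depths; the grids are random placeholders for the iRIC/HEC-RAS outputs.

```python
# Sketch of one way to quantify per-cell agreement across an ensemble of inundation
# grids: the fraction of members that flood a cell and the spread of predicted
# depths. The grids are random placeholders for the iRIC/HEC-RAS outputs.
import numpy as np

rng = np.random.default_rng(3)
n_members, ny, nx = 20, 50, 80
depth = rng.gamma(2.0, 0.3, size=(n_members, ny, nx))   # hypothetical depth grids (m)
depth[rng.random(depth.shape) < 0.4] = 0.0              # many cells stay dry

inundation_probability = (depth > 0.0).mean(axis=0)     # agreement on flood extent
depth_spread = depth.std(axis=0)                        # uncertainty in depth

print("cells flooded in >= 90% of members:", int((inundation_probability >= 0.9).sum()))
print("maximum depth standard deviation (m):", float(depth_spread.max()))
```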

  15. Realtime USGS Streamflow Stations

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Approximately 5,000 of the 6,900 U.S. Geological Survey sampling stations are equipped with telemetry to transmit data on streamflow, temperature, and other...

  16. Streamflow Gaging Stations

    Data.gov (United States)

    Department of Homeland Security — This map layer shows selected streamflow gaging stations of the United States, Puerto Rico, and the U.S. Virgin Islands, in 2013. Gaging stations, or gages, measure...

  17. Military Hydrology: Report 21, Regulation of Streamflow by Dams and Associated Modeling Capabilities

    Science.gov (United States)

    1992-10-01

    Laboratory (EL), and Dr. V. E. LaGarde III, Chief of the Environmental Systems Division (ESD), EL, and under the direct supervision of Mr. M. P. Keown ...optimize the operation of an interconnected system of reservoirs, hydroelectric power plants, pump canals, pipelines, and river reaches (Martin 1981) ...Water Resource Systems, Harvard University Press, Cambridge, Mass. Martin, Quentin W., 1981. "Surface Water Resources Allocation Model (AL-V), Program

  18. Diagnostic analysis of distributed input and parameter datasets in Mediterranean basin streamflow modeling

    Science.gov (United States)

    Milella, Pamela; Bisantino, Tiziana; Gentile, Francesco; Iacobellis, Vito; Trisorio Liuzzi, Giuliana

    2012-11-01

    Summary: The paper suggests a methodology, based on performance metrics, to select the optimal set of input and parameters to be used for the simulation of river flow discharges with a semi-distributed hydrologic model. The model is applied at daily scale in a semi-arid basin of Southern Italy (Carapelle river, basin area: 506 km2) for which rainfall and discharge series for the period 2006-2009 are available. Inputs and parameters were classified into two subsets: the former - spatially distributed - to be selected among different options, the latter - lumped - to be calibrated. Different data sources of (or methodologies to obtain) spatially distributed data were explored for the first subset. In particular, the FAO Penman-Monteith, Hargreaves and Thornthwaite equations were tested for the evaluation of reference evapotranspiration, which in semi-arid areas plays a key role in hydrological modeling. The availability of LAI maps from different remote sensing sources was exploited in order to enhance the characterization of the vegetation state and consequently of the spatio-temporal variation in actual evapotranspiration. Different types of pedotransfer functions were used to derive the soil hydraulic parameters of the area. For each configuration of the first subset of data, a manual calibration of the second subset of parameters was carried out. Both the manual calibration of the lumped parameters and the selection of the optimal distributed dataset were based on the calculation and the comparison of different performance metrics measuring the distance between observed and simulated discharge data series. Results not only show the best options for estimating reference evapotranspiration, crop coefficients, LAI values and hydraulic properties of soil, but also provide significant insights regarding the use of different performance metrics including traditional indexes such as RMSE, NSE, index of agreement, with the more recent Benchmark
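
    The sketch below gives a hedged implementation of the Hargreaves equation, one of the three reference-ET options compared above; extraterrestrial radiation Ra is supplied directly rather than computed from latitude and day of year, and the input values are arbitrary.

```python
# Hedged sketch of the Hargreaves equation, one of the three reference-ET options
# compared above; extraterrestrial radiation Ra is supplied directly here rather
# than computed from latitude and day of year, and the inputs are arbitrary.
import numpy as np

def hargreaves_et0(tmin, tmax, ra_mj):
    """Reference ET (mm/day) from daily Tmin/Tmax (degC) and Ra (MJ m-2 day-1)."""
    tmean = (tmin + tmax) / 2.0
    return 0.0023 * 0.408 * ra_mj * (tmean + 17.8) * np.sqrt(np.maximum(tmax - tmin, 0.0))

print(hargreaves_et0(tmin=12.0, tmax=28.0, ra_mj=30.0))   # roughly 4-5 mm/day
```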

  19. Modeling of hydrological drought durations and magnitudes: Experiences on Canadian streamflows

    Directory of Open Access Journals (Sweden)

    Tribeni C. Sharma

    2014-07-01

    New hydrological insights for the region: The approach based on the extreme number theorem satisfactorily predicted drought durations at monthly and annual time scales and was found comparable to a Markov chain of order one for predicting monthly drought durations. The approach was less satisfactory for predicting drought durations at the weekly time scale, but performance improved with the use of a Markov chain of order two. At annual, monthly, and weekly time scales, the relationship (magnitude = intensity × duration) proved satisfactory for predicting drought magnitudes under the assumption that a truncated normal distribution performs well for modeling the drought intensity. For predicting drought magnitudes at monthly and weekly time scales, the Markov chain proved more satisfactory with one order lower than the order used for predicting drought durations. A Markov chain of order one modeled durations satisfactorily at the weekly time scale with uniform truncation levels corresponding to flows equivalent to 90% and 95%.
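
    As a hedged illustration of the Markov chain idea used above, the sketch below simulates an order-one chain of below-truncation (drought) weeks and extracts the resulting run lengths; the transition probabilities are invented, not estimated from the Canadian streamflow records.

```python
# Illustrative order-one Markov chain for below-truncation (drought) weeks and the
# run lengths (durations) it produces; the transition probabilities are invented,
# not estimated from the Canadian streamflow records discussed above.
import numpy as np

rng = np.random.default_rng(4)
p_stay_drought = 0.70     # P(drought next week | drought this week), assumed
p_enter_drought = 0.15    # P(drought next week | no drought this week), assumed

n_weeks = 5000
state = np.zeros(n_weeks, dtype=int)   # 1 = flow below the truncation level
for t in range(1, n_weeks):
    p = p_stay_drought if state[t - 1] == 1 else p_enter_drought
    state[t] = rng.random() < p

# Run lengths of consecutive drought weeks are the drought durations.
changes = np.diff(np.concatenate(([0], state, [0])))
durations = np.flatnonzero(changes == -1) - np.flatnonzero(changes == 1)
print("mean drought duration (weeks):", durations.mean())
print("longest drought (weeks):", durations.max())
```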

  20. Development and application of a comprehensive simulation model to evaluate impacts of watershed structures and irrigation water use on streamflow and groundwater: The case of Wet Walnut Creek Watershed, Kansas, USA

    Science.gov (United States)

    Ramireddygari, S.R.; Sophocleous, M.A.; Koelliker, J.K.; Perkins, S.P.; Govindaraju, R.S.

    2000-01-01

    This paper presents the results of a comprehensive modeling study of surface and groundwater systems, including stream-aquifer interactions, for the Wet Walnut Creek Watershed in west-central Kansas. The main objective of this study was to assess the impacts of watershed structures and irrigation water use on streamflow and groundwater levels, which in turn affect availability of water for the Cheyenne Bottoms Wildlife Refuge Management area. The surface-water flow model, POTYLDR, and the groundwater flow model, MODFLOW, were combined into an integrated, watershed-scale, continuous simulation model. Major revisions and enhancements were made to the POTYLDR and MODFLOW models for simulating the detailed hydrologic budget for the Wet Walnut Creek Watershed. The computer simulation model was calibrated and verified using historical streamflow records (at Albert and Nekoma gaging stations), reported irrigation water use, observed water-level elevations in watershed structure pools, and groundwater levels in the alluvial aquifer system. To assess the impact of watershed structures and irrigation water use on streamflow and groundwater levels, a number of hypothetical management scenarios were simulated under various operational criteria for watershed structures and different annual limits on water use for irrigation. A standard 'base case' was defined to allow comparative analysis of the results of different scenarios. The simulated streamflows showed that watershed structures decrease both streamflows and groundwater levels in the watershed. The amount of water used for irrigation has a substantial effect on the total simulated streamflow and groundwater levels, indicating that irrigation is a major budget item for managing water resources in the watershed. (C) 2000 Elsevier Science B.V.

  1. Sensitivity of SWAT simulated streamflow to climatic changes within the Eastern Nile River basin

    Directory of Open Access Journals (Sweden)

    D. T. Mengistu

    2012-02-01

    The hydrological model SWAT was run with daily station-based precipitation and temperature data for the whole Eastern Nile basin, including the three subbasins: the Abbay (Blue Nile), BaroAkobo and Tekeze. The daily and monthly streamflows were calibrated and validated at six outlets with station-based streamflow data in the three different subbasins. The model performed very well in simulating the monthly variability, while the validation against daily data revealed a more diverse performance. The simulations indicated that around 60% of the average annual rainfall of the subbasins was lost through evaporation, while the estimated runoff coefficients were 0.24, 0.30 and 0.18 for the Abbay, BaroAkobo and Tekeze subbasins, respectively. About half to two-thirds of the runoff could be attributed to surface runoff while the other contributions came from groundwater.

    Twenty hypothetical climate change scenarios (perturbed temperatures and precipitation) were conducted to test the sensitivity of SWAT simulated annual streamflow. The result revealed that the annual streamflow sensitivity to changes in precipitation and temperature differed among the basins and that the dependence of the response on the strength of the changes was not linear. On average the annual streamflow responses to a change in precipitation with no temperature change were 19%, 17%, and 26% per 10% change in precipitation, while the average annual streamflow responses to a change in temperature and no precipitation change were −4.4% K−1, −6.4% K−1, and −1.3% K−1 for the Abbay, BaroAkobo and Tekeze river basins, respectively.

    47 temperature and precipitation scenarios from 19 AOGCMs participating in CMIP3 were used to estimate future changes in streamflow due to climate changes. The climate models disagreed on both the strength and the direction of future precipitation changes. Thus, no clear conclusions could be made about future

  2. Impacts of Forecasted Climate Change on Snowpack, Glacier Recession, and Streamflow in the Nooksack River Basin

    Science.gov (United States)

    Murphy, R. D.; Mitchell, R. J.; Bandaragoda, C.; Grah, O. J.

    2015-12-01

    Like many watersheds in the North Cascades mountain range, the Nooksack River basin has streamflow that is strongly influenced by precipitation and snowmelt in the spring and glacial melt in the warmer summer months. With a maritime climate and a high-relief basin containing glacial ice (3400 hectares), the streamflow response in the Nooksack is sensitive to increases in temperature, so forecasting the basin's response to future climate is of vital importance for water resources planning purposes. The watershed (2000 km2), in the northwest of Washington, USA, is a valuable freshwater resource for regional municipalities, industry, and agriculture, and provides critical habitat for endangered salmon species. Due to a lack of spatially distributed long-term historical weather observations in the basin for downscaling purposes, we apply publicly available statistically derived 1/16 degree gridded surface data along with the Distributed Hydrology Soil Vegetation Model (DHSVM; Wigmosta et al., 1992) with a newly developed coupled dynamic glacier model (Clarke et al., 2015) to simulate hydrologic processes in the Nooksack River basin. We calibrate and validate the DHSVM to observed glacial mass balance and glacial ice extent as well as to observed daily streamflow and SNOTEL data in the Nooksack basin. For the historical period, we model using a gridded meteorological forcing data set (1950-2010; Livneh et al., 2013). We simulate forecasted climate change impacts, including glacial recession, on streamflow using gridded daily statistically downscaled data from global climate models of the CMIP5 with RCP4.5 and RCP8.5 forcing scenarios developed using the multivariate adaptive constructed analogs method (Abatzoglou and Brown, 2011). Simulation results project an increase in winter streamflows due to more rainfall rather than snow, and a decrease in summer flows with a general shift in peak spring flows toward earlier in the spring. Glacier melt contribution to streamflow initially increases

  3. Long Term Quantification of Climate and Land Cover Change Impacts on Streamflow in an Alpine River Catchment, Northwestern China

    Directory of Open Access Journals (Sweden)

    Zhenliang Yin

    2017-07-01

    Quantifying the long-term impacts of climate and land cover change on streamflow is of great importance for sustainable water resources management in inland river basins. The Soil and Water Assessment Tool (SWAT) model was employed to simulate the streamflow in the upper reaches of the Heihe River Basin, northwestern China, over the last half century. The Sequential Uncertainty Fitting algorithm (SUFI-2) was selected to calibrate and validate the SWAT model. The results showed that both Nash-Sutcliffe efficiency (NSE) and the determination coefficient (R2) were over 0.93 for the calibration and validation periods, and the percent bias (PBIAS) of the two periods was −3.47% and 1.81%, respectively. Precipitation and the average, maximum, and minimum air temperature all showed increasing trends, of 14.87 mm/10 years, 0.30 °C/10 years, 0.27 °C/10 years, and 0.37 °C/10 years, respectively. The runoff coefficient increased from 0.36 (averaged during 1964 to 1988) to 0.39 (averaged during 1989 to 2013). Based on the SWAT simulation, we quantified the contribution of climate and land cover change to streamflow change, indicating that land cover change had a positive impact on river discharge, increasing the streamflow by 7.12% during 1964 to 1988, while climate change contributed 14.08% to the streamflow increase over the last 50 years. Meanwhile, the climate change impact intensified after the 2000s. The cold season (November to the following March) contributed 64.1% of the increase in total streamflow and the warm season (April to October) contributed 35.9%. The results provide some references for dealing with climate and land cover change in an inland river basin for water resource management and planning.

  4. Analysing the Effects of Forest Cover and Irrigation Farm Dams on Streamflows of Water-Scarce Catchments in South Australia through the SWAT Model

    Directory of Open Access Journals (Sweden)

    Hong Hanh Nguyen

    2017-01-01

    To assist water resource managers with future land use planning efforts, the eco-hydrological model Soil and Water Assessment Tool (SWAT) was applied to three catchments in South Australia that experience extreme low-flow conditions. Particular land uses and management issues of interest included forest covers, known to affect water yields, and farm dams, known to intercept and change the hydrological dynamics in a catchment. The study achieved a satisfactory daily calibration when irrigation farm dams were incorporated in the model. For the catchment dominated by extreme low flows, a better daily simulation across a range of qualitative and quantitative metrics was gained using the base-flow static threshold optimization technique. Scenario analysis of the effects of forest cover indicated an increase of surface flow and a reduction of base-flow when native eucalyptus lands were replaced by pastures, and vice versa. A decreasing trend was observed for the overall water yield of catchments with more forest plantation due to the higher evapotranspiration (ET) rate and the decline in surface flow. With regard to the effects of irrigation farm dams, assessment on a daily time step suggested that a significant volume of water is stored in these systems, with the water loss rate highest in June and July. On an annual basis, the model indicated that approximately 13.1% to 22.0% of water has been captured by farm dams for irrigation. However, the scenario analysis revealed that the purposes of use of farm dams, rather than their volumetric capacities in the catchment, determined the magnitude of effects on streamflows. Water extracted from farm dams for irrigation of orchards and vineyards is more likely to diminish streamflows than other land uses. Outputs from this study suggest that the water use restrictions on farm dams during recent drought periods were an effective tool to minimize impacts on streamflows.

  5. Drivers influencing streamflow changes in the Upper Turia basin, Spain.

    Science.gov (United States)

    Salmoral, Gloria; Willaarts, Bárbara A; Troch, Peter A; Garrido, Alberto

    2015-01-15

    Many rivers across the world have experienced a significant streamflow reduction over the last decades. Drivers of the observed streamflow changes are multiple, including climate change (CC), land use and land cover changes (LULCC), water transfers and river impoundment. Many of these drivers interact simultaneously, making it difficult to discern the impact of each driver individually. In this study we isolate the effects of LULCC on the observed streamflow reduction in the Upper Turia basin (east Spain) during the period 1973-2008. Regression models of annual streamflow are fitted with climatic variables and also additional time-variant drivers like LULCC. The ecohydrological model SWAT is used to study the magnitude and sign of streamflow change when LULCC occurs. Our results show that LULCC does play a significant role in the water balance, but it is not the main driver underpinning the observed reduction in the Turia's streamflow. Increasing mean temperature is the main factor behind increasing evapotranspiration and streamflow reduction. In fact, LULCC and CC have had an offsetting effect on streamflow generation during the study period. While streamflow has been negatively affected by increasing temperature, ongoing LULCC has compensated with reduced evapotranspiration rates, thanks mainly to shrubland clearing and forest degradation processes. These findings are valuable for the management of the Turia river basin, as well as a useful approach for determining the weight of LULCC on the hydrological response in other regions.

  6. On validation of multibody musculoskeletal models

    DEFF Research Database (Denmark)

    Lund, Morten Enemark; de Zee, Mark; Andersen, Michael Skipper;

    2012-01-01

    This paper reviews the opportunities to validate multibody musculoskeletal models in view of the current transition of musculoskeletal modelling from a research topic to a practical simulation tool in product design, healthcare and other important applications. This transition creates a new need...... for improvement of the validation of multibody musculoskeletal models are pointed out and directions for future research in the field are proposed. It is our hope that a more structured approach to model validation can help to improve the credibility of musculoskeletal models....

  7. Streamflow forecasting using functional regression

    Science.gov (United States)

    Masselot, Pierre; Dabo-Niang, Sophie; Chebana, Fateh; Ouarda, Taha B. M. J.

    2016-07-01

    Streamflow, as a natural phenomenon, is continuous in time, and so are the meteorological variables which influence its variability. In practice, it can be of interest to forecast the whole flow curve instead of points (daily or hourly). To this end, this paper introduces functional linear models and adapts them to hydrological forecasting. More precisely, functional linear models are regression models based on curves instead of single values. They make it possible to consider the whole process instead of a limited number of time points or features. We apply these models to analyse the flow volume and the whole streamflow curve during a given period by using precipitation curves. The functional model is shown to lead to encouraging results. The potential of functional linear models to detect special features that would have been hard to see otherwise is pointed out. The functional model is also compared to the artificial neural network approach, and the advantages and disadvantages of both models are discussed. Finally, future research directions involving the functional model in hydrology are presented.

  8. Optimal Data Split Methodology for Model Validation

    CERN Document Server

    Morrison, Rebecca; Terejanu, Gabriel; Miki, Kenji; Prudhomme, Serge

    2011-01-01

    The decision to incorporate cross-validation into validation processes of mathematical models raises an immediate question - how should one partition the data into calibration and validation sets? We answer this question systematically: we present an algorithm to find the optimal partition of the data subject to certain constraints. While doing this, we address two critical issues: 1) that the model be evaluated with respect to predictions of a given quantity of interest and its ability to reproduce the data, and 2) that the model be highly challenged by the validation set, assuming it is properly informed by the calibration set. This framework also relies on the interaction between the experimentalist and/or modeler, who understand the physical system and the limitations of the model; the decision-maker, who understands and can quantify the cost of model failure; and the computational scientists, who strive to determine if the model satisfies both the modeler's and decision maker's requirements. We also note...

  9. An evaluation of the accuracy of modeled and computed streamflow time-series data for the Ohio River at Hannibal Lock and Dam and at a location upstream from Sardis, Ohio

    Science.gov (United States)

    Koltun, G.F.

    2015-01-01

    Between July 2013 and June 2014, the U.S. Geological Survey (USGS) made 10 streamflow measurements on the Ohio River about 1.5 miles (mi) downstream from the Hannibal Lock and Dam (near Hannibal, Ohio) and 11 streamflow measurements near the USGS Sardis gage (station number 03114306) located approximately 2.4 mi upstream from Sardis, Ohio. The measurement results were used to assess the accuracy of modeled or computed instantaneous streamflow time series created and supplied by the USGS, U.S. Army Corps of Engineers (USACE), and National Weather Service (NWS) for the Ohio River at Hannibal Lock and Dam and (or) at the USGS streamgage. Hydraulic or hydrologic models were used to create the modeled time series; index-velocity methods or gate-opening ratings coupled with hydropower operation data were used to create the computed time series. The time step of the various instantaneous streamflow time series ranged from 15 minutes to 24 hours (once-daily values at 12:00 Coordinated Universal Time [UTC]). The 15-minute time-series data, computed by the USGS for the Sardis gage, also were downsampled to 1-hour and 24-hour time steps to permit more direct comparisons with other streamflow time series.
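
    The sketch below shows the kind of downsampling step described above: a 15-minute discharge series reduced to hourly and daily means with pandas. The series itself is synthetic and is not the Sardis gage record.

```python
# Sketch of the downsampling step described above: a 15-minute discharge series
# reduced to hourly and daily means with pandas; the series itself is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
index = pd.date_range("2013-07-01", periods=4 * 24 * 30, freq="15min", tz="UTC")
q15 = pd.Series(2000 + 500 * np.sin(np.arange(index.size) / 200.0)
                + rng.normal(0, 50, index.size), index=index, name="discharge_cfs")

q_hourly = q15.resample("1h").mean()
q_daily = q15.resample("1D").mean()
print(q_hourly.head(3))
print(q_daily.head(3))
```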

  10. A regional estimate of postfire streamflow change in California

    Science.gov (United States)

    Bart, Ryan R.

    2016-02-01

    The effect of fire on annual streamflow has been examined in numerous watershed studies, with some studies observing postfire increases in streamflow while others have observed no conclusive change. Despite this inherent variability in streamflow response, the management of water resources for flood protection, water supply, water quality, and the environment necessitates an understanding of postfire effects on streamflow at regional scales. In this study, the regional effect of wildfire on annual streamflow was investigated using 12 paired watersheds in central and southern California. A mixed model was used to pool and statistically examine the combined paired-watershed data, with emphasis on the effects of percentage area burned, postfire recovery of vegetation, and postfire wetness conditions on postfire streamflow change. At a regional scale, postfire annual streamflow increased 134% (82%-200%) during the first postfire year assuming 100% area burned and average annual wetness conditions. Postfire response decreased with lower percentages of area burned and during subsequent years as vegetation recovered following fire. Annual streamflow response to fire was found to be sensitive to annual wetness conditions, with postfire response being smallest during dry years, greatest during wet years, and slowly decreasing during very wet years. These findings provide watershed managers with a first-order estimate for predicting postfire streamflow response in both gauged and ungauged watersheds.
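
    As a hedged sketch of pooling paired-watershed records in a mixed model, the example below fits a random-intercept-per-watershed model with statsmodels; the variable names, data and model form are all invented for illustration and are not the authors' specification.

```python
# Hedged sketch of pooling paired-watershed records in a mixed model with a random
# intercept per watershed, in the spirit of the analysis above; variable names,
# data and model form are all invented for the example.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n_sites, n_years = 12, 8
df = pd.DataFrame({
    "watershed": np.repeat(np.arange(n_sites), n_years),
    "burned_fraction": np.repeat(rng.uniform(0.2, 1.0, n_sites), n_years),
    "wetness": rng.normal(0, 1, n_sites * n_years),
    "years_since_fire": np.tile(np.arange(n_years), n_sites),
})
df["log_flow_ratio"] = (0.8 * df["burned_fraction"] - 0.1 * df["years_since_fire"]
                        + 0.3 * df["wetness"] + rng.normal(0, 0.2, len(df)))

model = smf.mixedlm("log_flow_ratio ~ burned_fraction + years_since_fire + wetness",
                    df, groups=df["watershed"])
print(model.fit().summary())
```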

  11. Assessing the Snow Advance Index as potential predictor of winter streamflow of the Iberian Peninsula Rivers

    Science.gov (United States)

    Hidalgo-Muñoz, José Manuel; García-Valdecasas-Ojeda, Matilde; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Jesús Esteban-Parra, María

    2015-04-01

    This study examines the ability of the Eurasian snow cover increase during the previous October to serve as a potential predictor of winter streamflow in Iberian Peninsula rivers. The streamflow database used was provided by the Center for Studies and Experimentation of Public Works, CEDEX. Series from gauging stations and reservoirs with less than 10% of missing data (filled by regression with well correlated neighboring stations) have been considered. The homogeneity of these series was evaluated through the Pettitt test and the degree of human alteration by the Common Area Index. The application of these criteria led to the selection of 382 streamflow time series homogeneously distributed over the Iberian Peninsula, covering the period 1975-2008. For these streamflow data, winter seasonal values were obtained by averaging the monthly values from January to March. The recently proposed Snow Advance Index (SAI) was employed to monitor the snow cover increase during the previous October. The stability of the correlations was the criterion followed to establish whether SAI could be considered a potential predictor of winter streamflow at each gauging station. Winter streamflow is predicted using a linear regression model. A leave-one-out cross-validation approach was adopted to create calibration and validation subsets. The correlation coefficient (RHO), Root Mean Square Error Skill Score (RMSESS) and the Gerrity Skill Score (GSS) were used to evaluate the forecasting skill. Of the 382 stations evaluated, significant and stable correlations with SAI were found at 238 stations, covering most of the IP (except for the Cantabrian and Mediterranean slopes). Some forecasting skill was found in 223 of them, with this skill being moderate (RHO>0.44, RMSESS>10%, GSS>0.2) in 141 of them, and particularly good (RHO>0.5, RMSESS>20%, GSS>0.4) in 23. This study shows that the SAI of the previous October is a reliable predictor of the following winter streamflow for Iberian Peninsula rivers.
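
    A hedged sketch of the leave-one-out evaluation described above is given below: a linear regression of winter streamflow on an October snow-advance predictor, scored with an RMSE skill score against a climatology forecast. All values are synthetic stand-ins, not the CEDEX records or the actual SAI.

```python
# Sketch of the leave-one-out evaluation described above: a linear regression of
# winter streamflow on an October snow-advance predictor, scored with an RMSE skill
# score against a climatology forecast. All data are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(7)
n = 34                                        # e.g. winters in 1975-2008
sai = rng.normal(0, 1, n)                     # stand-in for the Snow Advance Index
flow = 50 + 12 * sai + rng.normal(0, 8, n)    # winter streamflow, arbitrary units

pred, clim = np.empty(n), np.empty(n)
for i in range(n):
    keep = np.arange(n) != i
    slope, intercept = np.polyfit(sai[keep], flow[keep], deg=1)
    pred[i] = intercept + slope * sai[i]
    clim[i] = flow[keep].mean()               # climatology benchmark

rmse = lambda e: np.sqrt(np.mean(e ** 2))
rmsess = 1.0 - rmse(flow - pred) / rmse(flow - clim)
print("correlation (RHO):", np.corrcoef(flow, pred)[0, 1])
print("RMSE skill score vs climatology:", rmsess)
```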

  12. Validation of systems biology models

    NARCIS (Netherlands)

    Hasdemir, D.

    2015-01-01

    The paradigm shift from qualitative to quantitative analysis of biological systems brought a substantial number of modeling approaches to the stage of molecular biology research. These include but certainly are not limited to nonlinear kinetic models, static network models and models obtained by the

  13. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  14. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Model Driven Engineering (MDE) is an emerging approach in software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic programming based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  15. Quantitative model validation techniques: new insights

    CERN Document Server

    Ling, You

    2012-01-01

    This paper develops new insights into quantitative methods for the validation of computational model prediction. Four types of methods are investigated, namely classical and Bayesian hypothesis testing, a reliability-based method, and an area metric-based method. Traditional Bayesian hypothesis testing is extended based on interval hypotheses on distribution parameters and equality hypotheses on probability distributions, in order to validate models with deterministic/stochastic output for given inputs. Two types of validation experiments are considered - fully characterized (all the model/experimental inputs are measured and reported as point values) and partially characterized (some of the model/experimental inputs are not measured or are reported as intervals). Bayesian hypothesis testing can minimize the risk in model selection by properly choosing the model acceptance threshold, and its results can be used in model averaging to avoid Type I/II errors. It is shown that Bayesian interval hypothesis testing...

  16. Quantifying the impacts of climate change and ecological restoration on streamflow changes based on a Budyko hydrological model in China's Loess Plateau

    National Research Council Canada - National Science Library

    Liang, Wei; Bai, Dan; Wang, Feiyu; Fu, Bojie; Yan, Junping; Wang, Shuai; Yang, Yuting; Long, Di; Feng, Minquan

    2015-01-01

    Understanding hydrological effects of ecological restoration (ER) is fundamental to develop effective measures guiding future ER and to adapt climate change in China's Loess Plateau (LP). Streamflow ( Q...

  17. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  18. Land cover and climate change effects on streamflow and sediment yield: a case study of Tapacurá River basin, Brazil

    Science.gov (United States)

    Santos, J. Y. G.; Silva, R. M.; Carvalho Neto, J. G.; Montenegro, S. M. G. L.; Santos, C. A. G.; Silva, A. M.

    2015-06-01

    This study assesses the impact of land use and climate changes between 1967 and 2008 on the streamflow and sediment yield in the Tapacurá River basin (Brazil) using the Soil and Water Assessment Tool (SWAT) model. The model was calibrated and validated by comparing simulated mean monthly streamflow with observed long-term mean monthly streamflow. The obtained R2 and Nash-Sutcliffe efficiency values for streamflow were respectively 0.82 and 0.71 for 1967-1974, and 0.84 and 0.82 for 1995-2008. The results show that land cover and climate change affected the basin hydrology, decreasing the streamflow and sediment yield (227.39 mm and 18.21 t ha-1 yr-1 for 1967-1974, and 182.86 mm and 7.67 t ha-1 yr-1 for 1995-2008). The changes arise partly from land cover/use variability but mainly from the decrease in rainfall during 1995-2008 compared with the first period analysed, which in turn decreased streamflow and sediment yield during the wet seasons and reduced base flow during the dry seasons.

  19. A GRACE-Streamflow Land Surface Model Calibration Approach for Improved Baseflow and Water Table Simulations over the Highly Managed Upper-Nile Basin of East Africa

    Science.gov (United States)

    Nanteza, J.; Lo, M. H.; Wu, R. J.; Thomas, B. F.; Famiglietti, J. S.

    2015-12-01

    Land surface models (LSMs) are useful tools for understanding the behavior of land hydrologic variables at different time and spatial scales. LSM outputs, however, are marked with great uncertainties resulting from simplified assumptions about the parameterization and processes of the land surface and a poor representation of both the natural and anthropogenic controls on the system. The Upper-Nile basin, over Uganda, Kenya and Tanzania, is one region characterized by significant human controls on streamflow, including Lake Victoria releases. The river Nile flow from Lake Victoria follows a priori rating curves that are not simulated by LSMs. Apart from management practices, the huge storage volume of Lake Victoria also modifies the seasonal characteristics of the Upper-Nile discharge, creating small seasonal variations in streamflow. In this study we calibrate several critical parameters in the Community Land Model (CLM.v4) in a multiobjective framework using total water storage anomalies (∆TWS) from GRACE, observed total runoff (Q) and estimated baseflow (BF) over the Upper-Nile basin. The goal is to improve the CLM parameters so that the model simulates the agreed-curve (a priori) streamflow and baseflow with better accuracy. We demonstrate the significance of the improved parametrization by comparing model results of ∆TWS, Q and BF with a combination of in situ and estimated observations. Preliminary results based on RMSE statistics show that with calibration, simulations of ∆TWS, Q and BF achieve higher performance. Further, an improvement in the model's capacity to simulate the water table depth is also evident with the calibration. Such results provide a basis for using CLM for other hydrologic experiments that could guide water resources management in this highly managed basin.

  20. Long-range forecasting of intermittent streamflow

    Directory of Open Access Journals (Sweden)

    F. F. van Ogtrop

    2011-01-01

    Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a probabilistic statistical model to forecast streamflow 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probability of flow occurring in any of the systems. We then use the same regression framework in combination with a right-skewed distribution, the Box-Cox t distribution, to model the intensity (depth) of the non-zero streamflows. Time, seasonality and climate indices, describing the Pacific and Indian Ocean sea surface temperatures, are tested as covariates in the GAMLSS model to make probabilistic 12-month forecasts of the occurrence and intensity of streamflow. The output reveals that in the study region the occurrence and variability of flow is driven by sea surface temperatures and therefore forecasts can be made with some skill.
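
    The hedged sketch below shows the two-part structure described above: a logistic regression for the probability that any flow occurs, plus a separate model for the intensity of non-zero flows. A log-linear regression stands in for the Box-Cox t / GAMLSS intensity model, and the covariates and data are synthetic.

```python
# Hedged two-part sketch in the spirit of the model above: a logistic regression for
# the probability that any flow occurs, plus a separate model for the intensity of
# non-zero flows. A log-linear regression stands in for the Box-Cox t / GAMLSS
# intensity model, and the covariates and data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 240                                        # e.g. 20 years of monthly records
sst_index = rng.normal(0, 1, n)                # climate covariate (e.g. an SST index)
month_sin = np.sin(2 * np.pi * (np.arange(n) % 12) / 12)
X = sm.add_constant(np.column_stack([sst_index, month_sin]))

occurs = rng.random(n) < 1 / (1 + np.exp(-(0.2 + 1.0 * sst_index)))   # synthetic truth
flow = np.where(occurs, np.exp(1.0 + 0.5 * sst_index + rng.normal(0, 0.3, n)), 0.0)

occurrence = sm.Logit(occurs.astype(float), X).fit(disp=False)   # part 1: occurrence
intensity = sm.OLS(np.log(flow[occurs]), X[occurs]).fit()        # part 2: intensity

p_flow = occurrence.predict(X)
typical_intensity = np.exp(intensity.predict(X))
print((p_flow * typical_intensity)[:5])   # crude unconditional forecast
```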

  1. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    D J Ewins

    2000-06-01

    In this paper, a review is presented of the various methods which are available for the purpose of performing a systematic comparison and correlation between two sets of vibration data. In the present case, the application of interest is in conducting this correlation process as a prelude to model correlation or updating activity.

  2. Quantitative precipitation and streamflow forecast for two recent extreme hydro-meteorological events in Southern Italy with a fully-coupled model system

    Science.gov (United States)

    Mendicino, Giuseppe; Senatore, Alfonso

    2016-04-01

    Two severe hydro-meteorological events affected the Calabria Region (Southern Italy) in the second half of 2015. The first event, on August 12th, focused on a relatively small area near the northern Ionian coast and produced rainfall of about 230 mm in 24 hours, causing flash flooding with several million euros of damage. The second event mainly affected the southern Ionian coast, was more persistent (it lasted from October 30th to November 2nd), covered a wider area and led to recorded rainfall values up to 400 mm in 24 hours and 700 mm in 48 hours, resulting in severe flooding, landslides and one fatality. The fully two-way dynamically coupled atmosphere-hydrology modeling system WRF-Hydro is used to reproduce both events, in order to assess its skill in forecasting both quantitative precipitation and streamflow with initial and lateral atmospheric boundary conditions given by the recently available 0.25° resolution GFS grid dataset. Precipitation estimates provided by the 2 km-resolution atmospheric model are compared with both ground-based data and observations from a National Civil Protection Department single-polarization Doppler radar. Discharge data from the rivers and creeks affected by heavy precipitation are not available, so streamflow results are compared with either official discharge estimates provided by the authorities (first event) or recorded river stages (second event). Results show good performance of the fully-coupled hydrometeorological prediction system, which allows an improved representation of the coupled atmospheric and terrestrial processes and provides an integrated solution for regional water cycle modeling, from atmospheric processes to river outlets.

  3. Streamflows at record highs

    Science.gov (United States)

    Streamflow was reported well above average in more than half the country during May, with flows at or near record levels for the month in 22 states, according to the U.S. Geological Survey (USGS), Department of the Interior.USGS hydrologists said that above average flow was reported at 98 of the 173 USGS key index gauging stations used in their monthly check on surface- and ground-water conditions. High flows were most prevalent in the Mississippi River basin states and in the east, with the exception of Maine, South Carolina, and Georgia. Below-average streamflow occurred in the Pacific northwest and in small scattered areas in Colorado, Kansas, Texas, and Minnesota.

  4. Impacts of Climate and Land Use/Cover Change on Streamflow Using SWAT and a Separation Method for the Xiying River Basin in Northwestern China

    Directory of Open Access Journals (Sweden)

    Jing Guo

    2016-05-01

    A better understanding of the effects of climate change and land use/cover change (LUCC) on streamflow supports long-term water planning and management in the arid regions of northwestern China. In this paper, the Soil and Water Assessment Tool (SWAT) and a separation approach were used to evaluate and separate the effects of climate change and LUCC on streamflow in the Xiying River basin. The SWAT model was calibrated with hydro-meteorological data from 1980–1989 to obtain the optimum parameters, which were validated by the subsequent application to the period 1990–2008. Moreover, streamflow under several scenarios with different climate change and land use conditions in 1990–2008 and 2010–2069 was further investigated. Results indicate that, in the period 1990–2008, the streamflow was dominated by climate change (i.e., changes in precipitation and temperature), which led to a 102.8% increase in the mean annual streamflow, whereas LUCC produced a decrease of 2.8%. Furthermore, in the future period 2010–2039, the mean annual streamflow will decrease by 5.4% and 4.5% compared with the data of 1961–1990 under scenarios A2 and B2, respectively, while it will decrease by 21.2% and 16.9% in the period 2040–2069, respectively.
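
    The sketch below illustrates the kind of separation arithmetic implied above: with a baseline simulation and single-factor scenario runs, the streamflow change is attributed to climate change and to LUCC by differencing. The numbers are placeholders, not results from the Xiying River study.

```python
# Sketch of the separation arithmetic implied above: with a baseline run and
# single-factor scenario runs, the streamflow change is attributed to climate
# change and to LUCC by differencing. The numbers are placeholders, not results
# from the Xiying River study.
q_baseline     = 100.0   # baseline climate + baseline land use
q_climate_only = 202.8   # changed climate, baseline land use
q_lucc_only    = 97.2    # baseline climate, changed land use
q_both         = 200.0   # changed climate and changed land use

total_change   = q_both - q_baseline
climate_effect = q_climate_only - q_baseline
lucc_effect    = q_lucc_only - q_baseline
interaction    = total_change - climate_effect - lucc_effect

for name, val in [("climate", climate_effect), ("LUCC", lucc_effect),
                  ("interaction/residual", interaction)]:
    print(f"{name:>22s}: {val:+6.1f} ({100 * val / q_baseline:+.1f}% of baseline)")
```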

  5. Validation of the Hot Strip Mill Model

    Energy Technology Data Exchange (ETDEWEB)

    Richard Shulkosky; David Rosberg; Jerrud Chapman

    2005-03-30

    The Hot Strip Mill Model (HSMM) is an off-line, PC-based software package originally developed by the University of British Columbia (UBC) and the National Institute of Standards and Technology (NIST) under the AISI/DOE Advanced Process Control Program. The HSMM was developed to predict the temperatures, deformations, microstructure evolution and mechanical properties of steel strip or plate rolled in a hot mill. INTEG process group inc. undertook the current task of enhancing and validating the technology. With the support of 5 North American steel producers, INTEG process group tested and validated the model using actual operating data from the steel plants and enhanced the model to improve prediction results.

  6. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, L.F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  7. Bayesian Regression and Neuro-Fuzzy Methods Reliability Assessment for Estimating Streamflow

    Directory of Open Access Journals (Sweden)

    Yaseen A. Hamaamin

    2016-07-01

    Accurate and efficient estimation of streamflow in a watershed’s tributaries is a prerequisite for viable water resources management. This study couples process-driven and data-driven methods of streamflow forecasting as a more efficient and cost-effective approach to water resources planning and management. Two data-driven methods, Bayesian regression and the adaptive neuro-fuzzy inference system (ANFIS), were tested separately as a faster alternative to a calibrated and validated Soil and Water Assessment Tool (SWAT) model to predict streamflow in the Saginaw River Watershed of Michigan. For the data-driven modeling process, four structures were assumed and tested: general, temporal, spatial, and spatiotemporal. Results showed that both Bayesian regression and ANFIS can replicate global (watershed) and local (subbasin) results similar to a calibrated SWAT model. At the global level, Bayesian regression and ANFIS model performance were satisfactory based on Nash-Sutcliffe efficiencies of 0.99 and 0.97, respectively. At the subbasin level, Bayesian regression and ANFIS models were satisfactory for 155 and 151 subbasins out of 155 subbasins, respectively. Overall, the most accurate method was a spatiotemporal Bayesian regression model that outperformed other models at global and local scales. However, all ANFIS models performed satisfactorily at both scales.
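
    The hedged sketch below shows the data-driven idea in miniature: a Bayesian linear regression (scikit-learn's BayesianRidge) trained to reproduce streamflow from a few simple inputs and scored with Nash-Sutcliffe efficiency. This is not the authors' exact Bayesian regression or ANFIS formulation, and the data are synthetic.

```python
# Hedged sketch of the data-driven idea above: a Bayesian linear regression
# (scikit-learn's BayesianRidge) trained to reproduce streamflow from a few simple
# inputs and scored with Nash-Sutcliffe efficiency. This is not the authors' exact
# Bayesian regression or ANFIS formulation, and the data are synthetic.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(9)
n = 365
precip = rng.gamma(1.5, 4.0, n)
precip_lag = np.roll(precip, 1)
temp = 10 + 8 * np.sin(2 * np.pi * np.arange(n) / 365)
flow = 5 + 0.6 * precip + 0.9 * precip_lag - 0.2 * temp + rng.normal(0, 1.5, n)

X = np.column_stack([precip, precip_lag, temp])
train, test = slice(0, 300), slice(300, n)

model = BayesianRidge().fit(X[train], flow[train])
pred, pred_std = model.predict(X[test], return_std=True)   # mean and predictive spread

nse = 1 - np.sum((pred - flow[test]) ** 2) / np.sum((flow[test] - flow[test].mean()) ** 2)
print("test NSE:", round(nse, 3))
```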

  8. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... Metro became operational in autumn 2002. We observed that forecasts from the demand sub-models agree well with the data from the 2000 national travel survey, with the mode choice forecasts in particular being a good match with the observed modal split. The results of the 2000 car assignment model...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  9. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  10. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. First, we present a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for assessing the performance of a given model using an example taken from a management study.
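
    The abstract does not spell out the authors' procedure, but the kind of quantitative performance measures it refers to can be sketched generically: fit a logistic regression on one part of the data and score it on a held-out part. The scikit-learn calls and synthetic data below are our own assumptions, not the management-study data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic data standing in for the management-study dataset.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Performance measures computed on data not used for fitting.
pred = model.predict(X_test)
prob = model.predict_proba(X_test)[:, 1]
print("accuracy:", round(accuracy_score(y_test, pred), 3))
print("ROC AUC :", round(roc_auc_score(y_test, prob), 3))
```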

  11. Feature extraction for structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois [Los Alamos National Laboratory; Farrar, Charles [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Nishio, Mayuko [UNIV OF TOKYO; Worden, Keith [UNIV OF SHEFFIELD; Takeda, Nobuo [UNIV OF TOKYO

    2010-11-08

    This study focuses on defining and comparing response features that can be used for structural dynamics model validation studies. Features extracted from dynamic responses obtained analytically or experimentally, such as basic signal statistics, frequency spectra, and estimated time-series models, can be used to compare characteristics of structural system dynamics. By comparing those response features extracted from experimental data and numerical outputs, validation and uncertainty quantification of a numerical model containing uncertain parameters can be realized. In this study, the applicability of some response features to model validation is first discussed using measured data from a simple test-bed structure and the associated numerical simulations of these experiments. The issues that must be considered are sensitivity, dimensionality, type of response, and the presence or absence of measurement noise in the response. Furthermore, we illustrate a method for comparing multivariate feature vectors for statistical model validation. Results show that the outlier detection technique using the Mahalanobis distance metric can be used as an effective and quantifiable technique for selecting appropriate model parameters. However, in this process, one must consider not only the sensitivity of the features being used, but also the correlation of the parameters being compared.
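
    As a rough sketch of the outlier-detection idea described above (not the authors' code), the Mahalanobis distance of a feature vector from a candidate simulation can be measured against the scatter of feature vectors extracted from repeated experiments; all names and values below are hypothetical.

```python
import numpy as np

def mahalanobis_distance(x, samples):
    """Distance of vector x from the mean of `samples`, scaled by their covariance."""
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    diff = np.asarray(x, dtype=float) - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# Hypothetical feature vectors (e.g., peak frequency, damping, RMS) from 30 experiments.
rng = np.random.default_rng(1)
experimental_features = rng.normal([10.0, 2.5, 0.8], [0.5, 0.1, 0.05], size=(30, 3))

# Feature vector produced by one candidate set of model parameters.
model_features = [10.4, 2.6, 0.95]
d = mahalanobis_distance(model_features, experimental_features)
print(f"Mahalanobis distance = {d:.2f}")  # large values flag an inconsistent parameter set
```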

  12. Modeling the distributed effects of forest thinning on the long-term water balance and streamflow extremes for a semi-arid basin in the southwestern US

    Science.gov (United States)

    Moreno, Hernan A.; Gupta, Hoshin V.; White, Dave D.; Sampson, David A.

    2016-03-01

    To achieve water resource sustainability in the water-limited southwestern US, it is critical to understand the potential effects of proposed forest thinning on the hydrology of semi-arid basins, where disturbances to headwater catchments can cause significant changes in the local water balance components and basin-wide streamflow. In Arizona, the Four Forest Restoration Initiative (4FRI) is being developed with the goal of restoring 2.4 million acres of ponderosa pine along the Mogollon Rim. Using the physically based, spatially distributed triangulated irregular network (TIN)-based Real-time Integrated Basin Simulator (tRIBS) model, we examine the potential impacts of the 4FRI on the hydrology of Tonto Creek, a basin in the Verde-Tonto-Salt (VTS) system, which provides much of the water supply for the Phoenix metropolitan area. Long-term (20-year) simulations indicate that forest removal can trigger significant shifts in the spatiotemporal patterns of various hydrological components, causing increases in net radiation, surface temperature, wind speed, soil evaporation, groundwater recharge and runoff, at the expense of reductions in interception and shading, transpiration, vadose zone moisture and snow water equivalent, with south-facing slopes being more susceptible to enhanced atmospheric losses. The net effect will likely be increases in mean and maximum streamflow, particularly during El Niño events and the winter months, and chiefly for those scenarios in which soil hydraulic conductivity has been significantly reduced by thinning operations. In this particular climate, forest thinning can lead to a net loss of surface water storage by vegetation and snowpack, increasing the vulnerability of ecosystems and populations to larger and more frequent hydrologic extremes in these semi-arid systems.

  13. Regimes of validity for balanced models

    Science.gov (United States)

    Gent, Peter R.; McWilliams, James C.

    1983-07-01

    Scaling analyses are presented which delineate the atmospheric and oceanic regimes of validity for the family of balanced models described in Gent and McWilliams (1983a). The analyses follow and extend the classical work of Charney (1948) and others. The analyses use three non-dimensional parameters which represent the flow scale relative to the Earth's radius, the dominance of turbulent or wave-like processes, and the dominant component of the potential vorticity. For each regime, the models that are accurate both at leading order and through at least one higher order of accuracy in the appropriate small parameter are then identified. In particular, it is found that members of the balanced family are the appropriate models of higher-order accuracy over a broad range of parameter regimes. Examples are also given of particular atmospheric and oceanic phenomena which are in the regimes of validity for the different balanced models.

  14. Validation of Hadronic Models in GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Koi, Tatsumi; Wright, Dennis H.; /SLAC; Folger, Gunter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; /CERN; Heikkinen, Aatos; /Helsinki Inst. of Phys.; Truscott,; Lei, Fan; /QinetiQ; Wellisch, Hans-Peter

    2007-09-26

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra-relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  15. Validation of hadronic models in GEANT4

    CERN Document Server

    Koi, Tatsumi; Folger, Günter; Ivanchenko, Vladimir; Kossov, Mikhail; Starkov, Nikolai; Heikkinen, Aatos; Truscott, Pete; Lei, Fan; Wellisch, Hans-Peter

    2007-01-01

    Geant4 is a software toolkit for the simulation of the passage of particles through matter. It has abundant hadronic models from thermal neutron interactions to ultra-relativistic hadrons. An overview of validations in Geant4 hadronic physics is presented based on thin-target measurements. In most cases, good agreement is available between Monte Carlo prediction and experimental data; however, several problems have been detected which require some improvement in the models.

  16. A Streamflow Statistics (StreamStats) Web Application for Ohio

    Science.gov (United States)

    Koltun, G.F.; Kula, Stephanie P.; Puskas, Barry M.

    2006-01-01

    A StreamStats Web application was developed for Ohio that implements equations for estimating a variety of streamflow statistics including the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year peak streamflows, mean annual streamflow, mean monthly streamflows, harmonic mean streamflow, and 25th-, 50th-, and 75th-percentile streamflows. StreamStats is a Web-based geographic information system application designed to facilitate the estimation of streamflow statistics at ungaged locations on streams. StreamStats can also serve precomputed streamflow statistics determined from streamflow-gaging station data. The basic structure, use, and limitations of StreamStats are described in this report. To facilitate the level of automation required for Ohio's StreamStats application, the technique used by Koltun (2003) for computing main-channel slope was replaced with a new computationally robust technique. The new channel-slope characteristic, referred to as SL10-85, differed from the National Hydrography Data-based channel slope values (SL) reported by Koltun (2003) by an average of -28.3 percent, with the median change being -13.2 percent. In spite of the differences, the two slope measures are strongly correlated. The change in channel slope values resulting from the change in computational method necessitated revision of the full-model equations for flood-peak discharges originally presented by Koltun (2003). Average standard errors of prediction for the revised full-model equations presented in this report increased by a small amount over those reported by Koltun (2003), with increases ranging from 0.7 to 0.9 percent. Mean percentage changes in the revised regression and weighted flood-frequency estimates relative to regression and weighted estimates reported by Koltun (2003) were small, ranging from -0.72 to -0.25 percent and -0.22 to 0.07 percent, respectively.

  17. Computing daily mean streamflow at ungaged locations in Iowa by using the Flow Anywhere and Flow Duration Curve Transfer statistical methods

    Science.gov (United States)

    Linhart, S. Mike; Nania, Jon F.; Sanders, Curtis L.; Archfield, Stacey A.

    2012-01-01

    linear regression method and the daily mean streamflow for the 15th day of every other month. The Flow Duration Curve Transfer method was used to estimate unregulated daily mean streamflow from the physical and climatic characteristics of gaged basins. For the Flow Duration Curve Transfer method, daily mean streamflow quantiles at the ungaged site were estimated with the parameter-based regression model, which results in a continuous daily flow-duration curve (the relation between exceedance probability and streamflow for each day of observed streamflow) at the ungaged site. By the use of a reference streamgage, the Flow Duration Curve Transfer is converted to a time series. Data used in the Flow Duration Curve Transfer method were retrieved for 113 continuous-record streamgages in Iowa and within a 50-mile buffer of Iowa. The final statewide regression equations for Iowa were computed by using a weighted-least-squares multiple linear regression method and were computed for the 0.01-, 0.05-, 0.10-, 0.15-, 0.20-, 0.30-, 0.40-, 0.50-, 0.60-, 0.70-, 0.80-, 0.85-, 0.90-, and 0.95-exceedance probability statistics determined from the daily mean streamflow with a reporting limit set at 0.1 ft3/s. The final statewide regression equation for Iowa computed by using left-censored regression techniques was computed for the 0.99-exceedance probability statistic determined from the daily mean streamflow with a low limit threshold and a reporting limit set at 0.1 ft3/s. For the Flow Anywhere method, results of the validation study conducted by using six streamgages show that differences between the root-mean-square error and the mean absolute error ranged from 1,016 to 138 ft3/s, with the larger value signifying a greater occurrence of outliers between observed and estimated streamflows. Root-mean-square-error values ranged from 1,690 to 237 ft3/s. Values of the percent root-mean-square error ranged from 115 percent to 26.2 percent. The logarithm (base 10) streamflow percent root
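
    The core of the Flow Duration Curve Transfer step, mapping the exceedance probability observed on each day at a reference streamgage onto the flow-duration curve estimated for the ungaged site, can be sketched as below. The regression-estimated quantiles and the reference-gage record are invented for illustration and are not the Iowa values.

```python
import numpy as np

# Exceedance probabilities for which regional regressions give quantiles at the ungaged site.
probs = np.array([0.01, 0.05, 0.10, 0.20, 0.30, 0.50, 0.70, 0.85, 0.95, 0.99])
ungaged_quantiles = np.array([900, 520, 380, 240, 170, 95, 55, 30, 12, 4.0])  # ft3/s, hypothetical

# Daily mean flows at a nearby reference streamgage (hypothetical one-year record).
rng = np.random.default_rng(0)
reference_flows = rng.lognormal(mean=4.5, sigma=1.0, size=365)

# Rank-based exceedance probability of each daily flow at the reference gage.
ranks = reference_flows.argsort()[::-1].argsort() + 1   # rank 1 = largest flow
exceedance = ranks / (len(reference_flows) + 1)

# Transfer: read the same-day exceedance probability off the ungaged site's duration curve.
ungaged_daily = np.interp(exceedance, probs, ungaged_quantiles)
print(ungaged_daily[:5].round(1))
```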

  18. Seasonal streamflow forecasting by conditioning climatology with precipitation indices

    Science.gov (United States)

    Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian; Perrin, Charles

    2017-03-01

    Many fields, such as drought-risk assessment or reservoir management, can benefit from long-range streamflow forecasts. Climatology has long been used in long-range streamflow forecasting. Conditioning methods have been proposed to select or weight relevant historical time series from climatology. They are often based on general circulation model (GCM) outputs that are specific to the forecast date due to the initialisation of GCMs on current conditions. This study investigates the impact of conditioning methods on the performance of seasonal streamflow forecasts. Four conditioning statistics based on seasonal forecasts of cumulative precipitation and the standardised precipitation index were used to select relevant traces within historical streamflows and precipitation respectively. This resulted in eight conditioned streamflow forecast scenarios. These scenarios were compared to the climatology of historical streamflows, the ensemble streamflow prediction approach and the streamflow forecasts obtained from ECMWF System 4 precipitation forecasts. The impact of conditioning was assessed in terms of forecast sharpness (spread), reliability, overall performance and low-flow event detection. Results showed that conditioning past observations on seasonal precipitation indices generally improves forecast sharpness, but may reduce reliability, with respect to climatology. Conversely, conditioned ensembles were more reliable but less sharp than streamflow forecasts derived from System 4 precipitation. Forecast attributes from conditioned and unconditioned ensembles are illustrated for a case of drought-risk forecasting: the 2003 drought in France. In the case of low-flow forecasting, conditioning results in ensembles that can better assess weekly deficit volumes and durations over a wider range of lead times.
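
    As a loose illustration of the conditioning idea (not the authors' implementation), historical streamflow traces can be subset according to how close their seasonal precipitation index is to the forecast value; all numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# 30 historical years: seasonal cumulative precipitation (mm) and the matching
# seasonal mean streamflow (m^3/s), both hypothetical.
hist_precip = rng.normal(300.0, 60.0, size=30)
hist_flow = 0.4 * hist_precip + rng.normal(0.0, 15.0, size=30)

# Seasonal precipitation forecast for the coming season (hypothetical value).
forecast_precip = 250.0

# Keep the k historical traces whose precipitation is closest to the forecast.
k = 10
closest = np.argsort(np.abs(hist_precip - forecast_precip))[:k]
conditioned_ensemble = hist_flow[closest]

print("climatology mean flow :", round(float(hist_flow.mean()), 1))
print("conditioned mean flow :", round(float(conditioned_ensemble.mean()), 1))
```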

  19. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.

  20. Model validation in soft systems practice

    Energy Technology Data Exchange (ETDEWEB)

    Checkland, P. [Univ. of Lancaster (United Kingdom)

    1995-03-01

    The concept of 'a model' usually evokes the connotation 'model of part of the real world'. That is an almost automatic response. It makes sense especially in relation to the way the concept has been developed and used in natural science. Classical operational research (OR), with its scientific aspirations, and systems engineering, use the concept in the same way and in addition use models as surrogates for the real world, on which experimentation is cheap. In these fields the key feature of a model is representativeness. In soft systems methodology (SSM) models are not of part of the world; they are only relevant to debate about the real world and are used in a cyclic learning process. The paper shows how the different concepts of validation in classical OR and SSM lead to a way of sharply defining the nature of 'soft OR'. 21 refs.

  1. Isolating the impacts of land use and climate change on streamflow

    Science.gov (United States)

    Chawla, I.; Mujumdar, P. P.

    2015-08-01

    Quantifying the isolated and integrated impacts of land use (LU) and climate change on streamflow is challenging as well as crucial to optimally manage water resources in river basins. This paper presents a simple hydrologic modeling-based approach to segregate the impacts of land use and climate change on the streamflow of a river basin. The upper Ganga basin (UGB) in India is selected as the case study to carry out the analysis. Streamflow in the river basin is modeled using a calibrated variable infiltration capacity (VIC) hydrologic model. The approach involves development of three scenarios to understand the influence of land use and climate on streamflow. The first scenario assesses the sensitivity of streamflow to land use changes under invariant climate. The second scenario determines the change in streamflow due to change in climate assuming constant land use. The third scenario estimates the combined effect of changing land use and climate over the streamflow of the basin. Based on the results obtained from the three scenarios, quantification of isolated impacts of land use and climate change on streamflow is addressed. Future projections of climate are obtained from dynamically downscaled simulations of six general circulation models (GCMs) available from the Coordinated Regional Downscaling Experiment (CORDEX) project. Uncertainties associated with the GCMs and emission scenarios are quantified in the analysis. Results for the case study indicate that streamflow is highly sensitive to change in urban areas and moderately sensitive to change in cropland areas. However, variations in streamflow generally reproduce the variations in precipitation. The combined effect of land use and climate on streamflow is observed to be more pronounced compared to their individual impacts in the basin. It is observed from the isolated effects of land use and climate change that climate has a more dominant impact on streamflow in the region. The approach proposed in this
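
    The three-scenario segregation described above reduces to simple differencing of simulated streamflow; a hedged sketch follows (the function, argument names and flow values are ours, not outputs of the upper Ganga VIC model).

```python
def attribute_streamflow_change(q_baseline, q_lu_only, q_climate_only, q_combined):
    """Segregate a streamflow change into land-use and climate components.

    q_baseline     : flow simulated with historical land use and historical climate
    q_lu_only      : flow with changed land use and historical climate (scenario 1)
    q_climate_only : flow with historical land use and changed climate (scenario 2)
    q_combined     : flow with changed land use and changed climate (scenario 3)
    """
    delta_lu = q_lu_only - q_baseline
    delta_climate = q_climate_only - q_baseline
    delta_combined = q_combined - q_baseline
    interaction = delta_combined - (delta_lu + delta_climate)
    return {"land_use": delta_lu, "climate": delta_climate,
            "combined": delta_combined, "interaction": interaction}

# Hypothetical mean annual flows (m^3/s), for illustration only.
print(attribute_streamflow_change(q_baseline=950.0, q_lu_only=930.0,
                                  q_climate_only=1010.0, q_combined=995.0))
```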

  2. Streamflow response to increasing precipitation extremes altered by forest management

    Science.gov (United States)

    Kelly, Charlene N.; McGuire, Kevin J.; Miniat, Chelcy Ford; Vose, James M.

    2016-04-01

    Increases in extreme precipitation events of floods and droughts are expected to occur worldwide. The increase in extreme events will result in changes in streamflow that are expected to affect water availability for human consumption and aquatic ecosystem function. We present an analysis that may greatly improve current streamflow models by quantifying the impact of the interaction between forest management and precipitation. We use daily long-term data from paired watersheds that have undergone forest harvest or species conversion. We find that interactive effects of climate change, represented by changes in observed precipitation trends, and forest management regime, significantly alter expected streamflow most often during extreme events, ranging from a decrease of 59% to an increase of 40% in streamflow, depending upon management. Our results suggest that vegetation might be managed to compensate for hydrologic responses due to climate change to help mitigate effects of extreme changes in precipitation.

  3. Influence of groundwater pumping on streamflow restoration following upstream dam removal

    Science.gov (United States)

    Constantz, J.; Essaid, H.

    2007-01-01

    We compared streamflow in basins under the combined impacts of an upland dam and groundwater pumping withdrawals, by examining streamflow in the presence and absence of each impact. As a qualitative analysis, inter-watershed streamflow comparisons were performed for several rivers flowing into the east side of the Central Valley, CA. Results suggest that, in the absence of upland dams supporting large reservoirs, some reaches of these rivers might develop ephemeral streamflow in late summer. As a quantitative analysis, we conducted a series of streamflow/groundwater simulations (using MODFLOW-2000 plus the streamflow routing package, SFR1) for a representative hypothetical watershed, with an upland dam and groundwater pumping in the downstream basin, under humid, semi-arid, and arid conditions. As a result of including the impact of groundwater pumping, post-dam-removal simulated streamflow was significantly less than natural streamflow. The model predicts extensive ephemeral conditions in the basin during September for both the arid and semi-arid cases. The model predicts continued perennial conditions in the humid case, but a spatially weighted average streamflow of only 71% of natural September streamflow, as a result of continued pumping after dam removal.

  4. Information systems validation using formal models

    Directory of Open Access Journals (Sweden)

    Azadeh Sarram

    2014-03-01

    Full Text Available During the past few years, there has been growing interest in using the unified modeling language (UML) to capture functional requirements. However, the lack of a tool for checking the accuracy and logic of diagrams in this language makes a formal model indispensable. In this study, the primary UML model of a system is converted to a colored Petri net in order to examine the precision of the model. For this purpose, first, priority and implementation tags are defined for the UML activity diagram, which is then turned into a colored Petri net. Second, the proposed model translates the tags into net transitions, and monitors are used to check system characteristics. Finally, an executable model of the UML activity diagram is provided so that the designer can simulate the model and use the simulation results to detect and refine problems in the model. Checking the results shows that the proposed method enhances the authenticity and accuracy of early models and increases the degree of system validation compared with previous methods.

  5. Bayesian structural equation modeling method for hierarchical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Jiang Xiaomo [Department of Civil and Environmental Engineering, Vanderbilt University, Box 1831-B, Nashville, TN 37235 (United States)], E-mail: xiaomo.jiang@vanderbilt.edu; Mahadevan, Sankaran [Department of Civil and Environmental Engineering, Vanderbilt University, Box 1831-B, Nashville, TN 37235 (United States)], E-mail: sankaran.mahadevan@vanderbilt.edu

    2009-04-15

    A building block approach to model validation may proceed through various levels, such as material to component to subsystem to system, comparing model predictions with experimental observations at each level. Usually, experimental data becomes scarce as one proceeds from lower to higher levels. This paper presents a structural equation modeling approach to make use of the lower-level data for higher-level model validation under uncertainty, integrating several components: lower-level data, higher-level data, computational model, and latent variables. The method proposed in this paper uses latent variables to model two sets of relationships, namely, the computational model to system-level data, and lower-level data to system-level data. A Bayesian network with Markov chain Monte Carlo simulation is applied to represent the two relationships and to estimate the influencing factors between them. Bayesian hypothesis testing is employed to quantify the confidence in the predictive model at the system level, and the role of lower-level data in the model validation assessment at the system level. The proposed methodology is implemented for hierarchical assessment of three validation problems, using discrete observations and time-series data.

  6. Towards a tracer-based conceptualization of meltwater dynamics and streamflow response in a glacierized catchment

    Science.gov (United States)

    Penna, Daniele; Engel, Michael; Bertoldi, Giacomo; Comiti, Francesco

    2017-01-01

    Multiple water sources and the physiographic heterogeneity of glacierized catchments hamper a complete conceptualization of runoff response to meltwater dynamics. In this study, we used environmental tracers (stable isotopes of water and electrical conductivity) to obtain new insight into the hydrology of glacierized catchments, using the Saldur River catchment, Italian Alps, as a pilot site. We analysed the controls on the spatial and temporal patterns of the tracer signature in the main stream, its selected tributaries, shallow groundwater, snowmelt and glacier melt over a 3-year period. We found that stream water electrical conductivity and isotopic composition showed consistent patterns in snowmelt-dominated periods, whereas the streamflow contribution of glacier melt altered the correlations between the two tracers. By applying two- and three-component mixing models, we quantified the seasonally variable proportion of groundwater, snowmelt and glacier melt at different locations along the stream. We provided four model scenarios based on different tracer signatures of the end-members; the highest contributions of snowmelt to streamflow occurred in late spring-early summer and ranged between 70 and 79 %, according to different scenarios, whereas the largest inputs by glacier melt were observed in mid-summer, and ranged between 57 and 69 %. In addition to the identification of the main sources of uncertainty, we demonstrated how a careful sampling design is critical in order to avoid underestimation of the meltwater component in streamflow. The results of this study supported the development of a conceptual model of streamflow response to meltwater dynamics in the Saldur catchment, which is likely valid for other glacierized catchments worldwide.
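
    The two-component case of the mixing models mentioned above can be written in a few lines of Python: a conservative tracer mass balance gives the fraction of each end-member in streamflow. The tracer values below are purely illustrative and are not measurements from the Saldur catchment.

```python
def two_component_fractions(c_stream, c_source1, c_source2):
    """Fractions of streamflow from two sources, from a conservative tracer.

    Mass balance: c_stream = f1 * c_source1 + (1 - f1) * c_source2
    """
    f1 = (c_stream - c_source2) / (c_source1 - c_source2)
    return f1, 1.0 - f1

# Hypothetical electrical conductivity values (uS/cm): stream, snowmelt, groundwater.
f_snowmelt, f_groundwater = two_component_fractions(c_stream=120.0,
                                                    c_source1=20.0,   # snowmelt
                                                    c_source2=250.0)  # groundwater
print(f"snowmelt fraction    = {f_snowmelt:.2f}")
print(f"groundwater fraction = {f_groundwater:.2f}")
```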

  7. Model validation of channel zapping quality

    OpenAIRE

    Kooij, R.; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed that the perceived quality of channel zapping is related to the perceived quality of download time in web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests, resulting in an excellent fit with a 0.99 correlation. This was what we call a lean-forward experiment and gave the rule-of-thumb result that the zapping time must be less than 0.43 sec to be good (> 3.5 on the MOS scale). To validate the model we have done new subjective ...

  8. Natural streamflow simulation for two largest river basins in Poland: a baseline for identification of flow alterations

    Science.gov (United States)

    Piniewski, Mikołaj

    2016-05-01

    The objective of this study was to apply a previously developed large-scale and high-resolution SWAT model of the Vistula and Odra basins, calibrated with a focus on natural flow simulation, in order to assess the impact of three different dam reservoirs on streamflow using the Indicators of Hydrologic Alteration (IHA). A tailored spatial calibration approach was designed, in which calibration was focused on a large set of relatively small, non-nested sub-catchments with a semi-natural flow regime. These were classified into calibration clusters based on flow-statistic similarity. After calibration and validation gave overall positive results, the calibrated parameter values were transferred to the remaining part of the basins using an approach based on hydrological similarity of donor and target catchments. The calibrated model was applied in three case studies with the purpose of assessing the effect of dam reservoirs (the Włocławek, Siemianówka and Czorsztyn Reservoirs) on streamflow alteration. Both the assessment based on gauged streamflow (a Before-After design) and the one based on simulated natural streamflow showed large alterations in selected flow statistics related to magnitude, duration, high and low flow pulses, and rate of change. The benefits of using a large-scale and high-resolution hydrological model for the assessment of streamflow alteration include: (1) providing an alternative or complementary approach to the classical Before-After designs, (2) isolating the climate variability effect from the dam (or any other source of alteration) effect, and (3) providing a practical tool that can be applied at a range of spatial scales over a large area, such as a country, in a uniform way. Thus, the presented approach can be applied for designing more natural flow regimes, which is crucial for river and floodplain ecosystem restoration in the context of the European Union's policy on environmental flows.
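
    A rough sketch of one of the IHA "magnitude" indicators follows: monthly medians of simulated natural flow are compared with monthly medians of gauged, regulated flow. The daily series and the crude reservoir-effect factor below are invented and stand in for SWAT output and gauge data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
dates = pd.date_range("1990-01-01", "1999-12-31", freq="D")

# Hypothetical daily flows (m^3/s): simulated natural flow vs. gauged regulated flow.
natural = pd.Series(50 + 40 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
                    + rng.normal(0, 5, len(dates)), index=dates)
regulated = natural * 0.7 + 15  # crude stand-in for reservoir operation effects

# Monthly medians and their relative alteration (percent change per calendar month).
nat_monthly = natural.groupby(natural.index.month).median()
reg_monthly = regulated.groupby(regulated.index.month).median()
alteration = (reg_monthly - nat_monthly) / nat_monthly * 100
print(alteration.round(1))
```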

  9. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the issue of applying security policy models, we are concerned with the implementation of a certain policy, taking into consideration the standards required, through the attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. Regarding this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer. The Chief Security Officer, in doing so, strives to keep technologies up to date in order to meet all-inclusive business continuity planning policies. Therefore, for such a policy to be effective, it has to be entirely embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo City. This sampling was based on the representativeness of such industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of the security policy, namely: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. Analysis of the Cronbach's alpha structure of these security items shows not only that the capacity of those industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  10. Simulating spatially distributed catchment response using a fully-integrated surface-subsurface model based on dual calibration with streamflow and evapotranspiration

    Science.gov (United States)

    Ala-aho, Pertti; Soulsby, Chris; Wang, Hailong; Tetzlaff, Doerthe

    2016-04-01

    We use above-ground hydrological fluxes (streamflow and evapotranspiration (ET)) to calibrate an integrated hydrological simulator for a headwater catchment located in the Scottish Highlands. Our study explores the feasibility of simulating spatially distributed catchment response in a physically based framework whilst having only preliminary data about the subsurface hydrological parameters. Furthermore, we investigate the added value of in situ ET data in the calibration process. Transient simulations are performed with the fully integrated surface-subsurface hydrological model HydroGeoSphere, and calibration of model parameters is done in the PEST framework. In the first calibration step only the stream hydrograph is included, using the original time series alongside the log-transformed hydrograph and weekly flow volumes in the objective function. ET is estimated with an energy balance technique using above-canopy temperature, humidity and net radiation measured within the catchment. In the second calibration step, the ET time series are introduced into the calibration objective function. Parameter identifiability, along with uncertainty in the model output, is examined as part of the model calibration for both calibration steps. Furthermore, the post-calibration model allows us to simulate spatially distributed hydrological fluxes and to distinguish between different water sources that make up the stream hydrograph using the hydraulic mixing-cell method. Preliminary simulations have shown that transient and spatially distributed surface water, subsurface water and evaporative fluxes of a headwater catchment can be reproduced in an integrated modelling framework using only above-ground hydrological data in model calibration. We hypothesize that the evapotranspiration dataset informs the catchment water budget and water transmission rates and is therefore useful in constraining subsurface hydraulic parameters, such as hydraulic conductivities, which are typically
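
    Conceptually, the dual-calibration objective combines streamflow and ET misfits into a single value for the optimizer to minimise; PEST builds its objective from weighted residuals, so the sketch below is only a simplified stand-in under our own assumptions about weighting and normalisation.

```python
import numpy as np

def dual_objective(q_obs, q_sim, et_obs, et_sim, w_q=1.0, w_et=1.0):
    """Weighted sum-of-squares objective combining streamflow and ET residuals.

    Residuals are normalised by the observed means so that the two data types
    contribute on comparable scales before weighting.
    """
    q_obs, q_sim = np.asarray(q_obs, float), np.asarray(q_sim, float)
    et_obs, et_sim = np.asarray(et_obs, float), np.asarray(et_sim, float)
    q_term = np.sum(((q_obs - q_sim) / q_obs.mean()) ** 2)
    et_term = np.sum(((et_obs - et_sim) / et_obs.mean()) ** 2)
    return w_q * q_term + w_et * et_term

# Hypothetical daily values: streamflow (m^3/s) and ET (mm/day).
phi = dual_objective(q_obs=[2.1, 3.4, 5.0, 2.8], q_sim=[2.0, 3.9, 4.6, 2.9],
                     et_obs=[1.2, 1.8, 2.4, 1.5], et_sim=[1.0, 1.9, 2.6, 1.4])
print(f"objective value = {phi:.4f}")
```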

  11. A retrospective streamflow ensemble forecast for an extreme hydrologic event: a case study of Hurricane Irene on the Hudson River basin

    Science.gov (United States)

    Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie

    2016-07-01

    This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional-scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin ( ˜ 36 000 km2) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were forced into HEC-HMS to generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. The uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts at short-term streamflow forecasting at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
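
    The ensemble median, spread, and threshold-exceedance diagnostics discussed above amount to simple operations across the member forecasts; the sketch below uses made-up discharge values and a made-up flood threshold, not the Hurricane Irene results.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical discharge forecasts (m^3/s): 21 ensemble members over a 48-hour window.
members = rng.lognormal(mean=6.0, sigma=0.3, size=(21, 48))
flood_threshold = 600.0  # hypothetical flood-stage discharge

ens_median = np.median(members, axis=0)                                   # central forecast
ens_spread = np.percentile(members, 90, axis=0) - np.percentile(members, 10, axis=0)
prob_exceed = (members > flood_threshold).mean(axis=0)                    # member fraction above threshold

print("peak of ensemble median   :", round(float(ens_median.max()), 0))
print("max 10-90% spread         :", round(float(ens_spread.max()), 0))
print("max exceedance probability:", round(float(prob_exceed.max()), 2))
```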

  12. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at + or - 45 deg angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  13. Assessment model validity document FARF31

    Energy Technology Data Exchange (ETDEWEB)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria [Kemakta Konsult AB, Stockholm (Sweden)

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport is made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far-field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with a wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity

  14. [Catalonia's primary healthcare accreditation model: a valid model].

    Science.gov (United States)

    Davins, Josep; Gens, Montserrat; Pareja, Clara; Guzmán, Ramón; Marquet, Roser; Vallès, Roser

    2014-07-01

    There are few experiences of accreditation models validated by primary care teams (EAPs). The aim of this study was to detail the process of design, development, and subsequent validation of the consensus EAP accreditation model of Catalonia. An operating committee of the Health Department of Catalonia revised the models proposed by the European Foundation for Quality Management, the Joint Commission International and the Institut Català de la Salut, and proposed 628 essential standards to the technical group (25 experts in primary care and quality of care) in order to establish consensus standards. The consensus document was piloted in 30 EAPs for the purpose of validating the contents, testing the standards, and identifying evidence. Finally, we carried out a survey to assess acceptance and validation of the document. The technical group agreed on a total of 414 essential standards. The pilot selected a total of 379. Mean compliance with the standards of the final document in the 30 EAPs was 70.4%. The results standards had the lowest fulfilment percentage. The survey showed that 83% of the EAPs found the document useful and 78% found the content of the accreditation manual suitable as a tool to assess the quality of the EAP and identify opportunities for improvement. On the downside, they highlighted its complexity and laboriousness. We have a model that fits the reality of the EAPs and covers all issues relevant to the functioning of an excellent EAP. The model developed in Catalonia is easy to understand.

  15. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness: validation is carried out at a single scale and depends on human experience. A multiple-scale validation method based on SDG (Signed Directed Graph) models and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it; then, complete testing scenarios are produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  16. Impact of LUCC on streamflow based on the SWAT model over the Wei River basin on the Loess Plateau in China

    Directory of Open Access Journals (Sweden)

    H. Wang

    2017-04-01

    impact on both soil flow and baseflow by compensating for reduced surface runoff, which leads to a slight increase in the streamflow in the Wei River with the mixed landscapes on the Loess Plateau that include earth–rock mountain area.

  17. Unit testing, model validation, and biological simulation

    Science.gov (United States)

    Watts, Mark D.; Ghayoomie, S. Vahid; Larson, Stephen D.; Gerkin, Richard C.

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models. PMID:27635225
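
    As a small, generic illustration (not code from OpenWorm) of what unit tests and model validation tests can look like in a pytest-style suite, the toy model below must both reproduce reference data within a tolerance and satisfy a structural property.

```python
import numpy as np

def simulate_decay(y0, rate, t):
    """Toy model under test: exponential decay y(t) = y0 * exp(-rate * t)."""
    return y0 * np.exp(-rate * np.asarray(t, dtype=float))

def test_decay_matches_reference():
    """Model validation test: simulation must reproduce reference data within 5%."""
    t = [0.0, 1.0, 2.0, 3.0]
    reference = [10.0, 6.07, 3.68, 2.23]            # hypothetical measured values
    simulated = simulate_decay(y0=10.0, rate=0.5, t=t)
    assert np.allclose(simulated, reference, rtol=0.05)

def test_decay_is_monotonic():
    """Unit test of a structural property rather than a data comparison."""
    y = simulate_decay(y0=1.0, rate=0.2, t=np.linspace(0, 10, 50))
    assert np.all(np.diff(y) < 0)
```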

  18. Sensitivity of streamflow to climate change in California

    Science.gov (United States)

    Grantham, T.; Carlisle, D.; Wolock, D.; McCabe, G. J.; Wieczorek, M.; Howard, J.

    2015-12-01

    Trends of decreasing snowpack and increasing risk of drought are looming challenges for California water resource management. Increasing vulnerability of the state's natural water supplies threatens California's social-economic vitality and the health of its freshwater ecosystems. Despite growing awareness of potential climate change impacts, robust management adaptation has been hindered by substantial uncertainty in future climate predictions for the region. Down-scaled global climate model (GCM) projections uniformly suggest future warming of the region, but projections are highly variable with respect to the direction and magnitude of change in regional precipitation. Here we examine the sensitivity of California surface water supplies to climate variation independently of GCMs. We use a statistical approach to construct predictive models of monthly streamflow based on historical climate and river basin features. We then propagate an ensemble of synthetic climate simulations through the models to assess potential streamflow responses to changes in temperature and precipitation in different months and regions of the state. We also consider the range of streamflow change predicted by bias-corrected downscaled GCMs. Our results indicate that the streamflow in the xeric and coastal mountain regions of California is more sensitive to changes in precipitation than temperature, whereas streamflow in the interior mountain region responds strongly to changes in both temperature and precipitation. Mean climate projections for 2025-2075 from GCM ensembles are highly variable, indicating streamflow changes of -50% to +150% relative to baseline (1980-2010) for most months and regions. By quantifying the sensitivity of streamflow to climate change, rather than attempting to predict future hydrologic conditions based on uncertain GCM projections, these results should be more informative to water managers seeking to assess, and potentially reduce, the vulnerability of surface

  19. An Assessment of Melting Season Streamflow Forecasts using EPS for a Snow Dominated Basin in Turkey

    Science.gov (United States)

    Ertaş, Cansaran; Şensoy, Aynur; Akkol, Bulut; Şorman, Arda; Uysal, Gökçen; Çoşkun, Cihan

    2016-04-01

    In many mountainous regions, snowmelt makes a significant contribution to streamflow, particularly during the spring and summer months. Therefore, runoff modeling and forecasting during spring and early summer is important in terms of energy and water resources management. In this study, the Upper Euphrates Basin (10,275 km2 in area, with an elevation range of 1125-3500 m), located at the headwaters of the Euphrates River, one of Turkey's most important rivers, is selected as the application area. In this region, snowmelt runoff constitutes approximately 2/3 of the total yearly runoff volume. The aim of the study is to produce forward-looking, medium-range flow forecasts using the Ensemble Prediction System (EPS), a pioneering study for Turkey. The conceptual hydrological model HBV, which is widely used in the literature, is chosen to predict streamflow. According to preliminary results, Nash-Sutcliffe model efficiencies are 0.85 for calibration (2001-2008) and 0.71 for validation (2009-2014), respectively. After calibrating and validating the hydrologic model, EPS data comprising 51 ensemble members produced by ECMWF are used as probability-based weather forecasts. The melting period during March-June of 2009-2015 is chosen as the forecast period. The probabilistic skill of the EPS-based hydrological model results is analyzed to verify the ensemble forecasts.

  20. Full-Scale Cookoff Model Validation Experiments

    Energy Technology Data Exchange (ETDEWEB)

    McClelland, M A; Rattanapote, M K; Heimdahl, E R; Erikson, W E; Curran, P O; Atwood, A I

    2003-11-25

    This paper presents the experimental results of the third and final phase of a cookoff model validation effort. In this phase of the work, two generic Heavy Wall Penetrators (HWP) were tested in two heating orientations. Temperature and strain gage data were collected over the entire test period. Predictions for time and temperature of reaction were made prior to release of the live data. Predictions were comparable to the measured values and were highly dependent on the established boundary conditions. Both HWP tests failed at a weld located near the aft closure of the device. More than 90 percent of unreacted explosive was recovered in the end heated experiment and less than 30 percent recovered in the side heated test.

  1. Are streamflow recession characteristics really characteristic?

    Directory of Open Access Journals (Sweden)

    M. Stoelzle

    2012-09-01

    Full Text Available Streamflow recession has been investigated by a variety of methods, often involving the fit of a model to empirical recession plots to parameterize a non-linear storage-outflow relationship. Such recession analysis methods (RAMs) are used to estimate hydraulic conductivity, storage capacity, or aquifer thickness and to model streamflow recession curves for regionalization and prediction at the catchment scale. Numerous RAMs have been published, but little is known about how characteristic the resulting recession models are for distinguishing characteristic catchment behavior. In this study we combined three established recession extraction methods with three different parameter-fitting methods for the power-law storage-outflow model to compare the range of recession characteristics that result from the application of these different RAMs. The resulting recession characteristics, including recession time and corresponding storage depletion, were evaluated for 20 meso-scale catchments in Germany. We found plausible ranges for model parameterization; however, the calculated recession characteristics varied over two orders of magnitude. While the recession characteristics of the 20 catchments derived with the different methods correlate strongly, particularly for RAMs that use the same extraction method, and while they rank the catchments relatively consistently, there are still considerable differences among the methods. To elucidate this variability we discuss the ambiguous roles of the recession extraction procedures and the parameterization of the storage-outflow model, and the limitations of the presented recession plots. The results suggest strong limitations to the comparability of recession characteristics derived with different methods, not only in the model parameters but also in the relative characterization of different catchments. A multiple-methods approach to investigating streamflow recession characteristics should be considered for applications whenever possible.
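
    One common parameterisation fits the power-law storage-outflow relation through -dQ/dt = aQ^b on a log-log recession plot. The sketch below fits a and b to a synthetic recession limb; the extraction of genuine recession periods, which the study shows matters greatly, is deliberately skipped, and the numbers are not from the German catchments.

```python
import numpy as np

# Synthetic recession limb generated from a known nonlinear reservoir (a_true, b_true).
a_true, b_true = 0.05, 1.5
q = [20.0]
for _ in range(60):                       # daily time step
    q.append(q[-1] - a_true * q[-1] ** b_true)
q = np.array(q)

# Recession plot quantities: -dQ/dt versus Q (both positive during recession).
dq_dt = -np.diff(q)
q_mid = (q[:-1] + q[1:]) / 2.0

# Fit log(-dQ/dt) = log(a) + b * log(Q) by ordinary least squares.
b_fit, log_a_fit = np.polyfit(np.log(q_mid), np.log(dq_dt), 1)
print(f"fitted a = {np.exp(log_a_fit):.3f}, fitted b = {b_fit:.2f}")
```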

  2. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid.

  3. Streamflow impacts of biofuel policy-driven landscape change.

    Directory of Open Access Journals (Sweden)

    Sami Khanal

    Full Text Available Likely changes in precipitation (P) and potential evapotranspiration (PET) resulting from policy-driven expansion of bioenergy crops in the United States are shown to create significant changes in streamflow volumes and increase water stress in the High Plains. Regional climate simulations for current and biofuel cropping system scenarios are evaluated using the same atmospheric forcing data over the period 1979-2004 using the Weather Research Forecast (WRF) model coupled to the NOAH land surface model. PET is projected to increase under the biofuel crop production scenario. The magnitude of the mean annual increase in PET is larger than the inter-annual variability of change in PET, indicating that PET increase is a forced response to the biofuel cropping system land use. Across the conterminous U.S., the change in mean streamflow volume under the biofuel scenario is estimated to range from negative 56% to positive 20% relative to a business-as-usual baseline scenario. In Kansas and Oklahoma, annual streamflow volume is reduced by an average of 20%, and this reduction in streamflow volume is due primarily to increased PET. Predicted increase in mean annual P under the biofuel crop production scenario is lower than its inter-annual variability, indicating that additional simulations would be necessary to determine conclusively whether predicted change in P is a response to biofuel crop production. Although estimated changes in streamflow volume include the influence of P change, sensitivity results show that PET change is the significantly dominant factor causing streamflow change. Higher PET and lower streamflow due to biofuel feedstock production are likely to increase water stress in the High Plains. When pursuing sustainable biofuels policy, decision-makers should consider the impacts of feedstock production on water scarcity.

  4. Streamflow impacts of biofuel policy-driven landscape change.

    Science.gov (United States)

    Khanal, Sami; Anex, Robert P; Anderson, Christopher J; Herzmann, Daryl E

    2014-01-01

    Likely changes in precipitation (P) and potential evapotranspiration (PET) resulting from policy-driven expansion of bioenergy crops in the United States are shown to create significant changes in streamflow volumes and increase water stress in the High Plains. Regional climate simulations for current and biofuel cropping system scenarios are evaluated using the same atmospheric forcing data over the period 1979-2004 using the Weather Research Forecast (WRF) model coupled to the NOAH land surface model. PET is projected to increase under the biofuel crop production scenario. The magnitude of the mean annual increase in PET is larger than the inter-annual variability of change in PET, indicating that PET increase is a forced response to the biofuel cropping system land use. Across the conterminous U.S., the change in mean streamflow volume under the biofuel scenario is estimated to range from negative 56% to positive 20% relative to a business-as-usual baseline scenario. In Kansas and Oklahoma, annual streamflow volume is reduced by an average of 20%, and this reduction in streamflow volume is due primarily to increased PET. Predicted increase in mean annual P under the biofuel crop production scenario is lower than its inter-annual variability, indicating that additional simulations would be necessary to determine conclusively whether predicted change in P is a response to biofuel crop production. Although estimated changes in streamflow volume include the influence of P change, sensitivity results show that PET change is the significantly dominant factor causing streamflow change. Higher PET and lower streamflow due to biofuel feedstock production are likely to increase water stress in the High Plains. When pursuing sustainable biofuels policy, decision-makers should consider the impacts of feedstock production on water scarcity.

  5. Grid of streamflow variability index for Ohio

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A generalized streamflow-variability index coverage was created by interpolating a grid (with 6,066-ft^2 cells) from at-site values of the streamflow-variability...

  6. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage
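
    The permutation-test idea for checking whether a fitted model's performance reflects real signal can be sketched as follows: refit and score the model many times with permuted outcome labels to build a null distribution for the performance measure. The synthetic data, scikit-learn calls and the particular permutation scheme (permuting training labels only) are our own assumptions, not the NTCP study's procedure.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def fit_and_score(y_train):
    """Fit on (possibly permuted) training labels, score AUC on untouched test data."""
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_train)
    return roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

observed_auc = fit_and_score(y_tr)

# Null distribution: performance when the training labels carry no real signal.
rng = np.random.default_rng(0)
null_aucs = np.array([fit_and_score(rng.permutation(y_tr)) for _ in range(200)])
p_value = float(np.mean(null_aucs >= observed_auc))
print(f"observed AUC = {observed_auc:.3f}, permutation p-value = {p_value:.3f}")
```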

  7. Contribution of multiple climatic variables and human activities to streamflow changes across China

    Science.gov (United States)

    Liu, Jianyu; Zhang, Qiang; Singh, Vijay P.; Shi, Peijun

    2017-02-01

    Using monthly streamflow data from the 1960-2000 period and annual streamflow data from the 2001-2014 period, together with meteorological data from 1960 to 2014 from 815 meteorological stations across China, the Budyko-based hydrothermal balance model was used to quantitatively evaluate the fractional contributions of climate change and human activities to streamflow changes in ten river basins across China. Particular importance was attached to human activities, such as population density and Gross Domestic Product (GDP), and also to water reservoirs, in terms of their relationship with streamflow changes. Results indicated that: (1) streamflow changes in river basins of northern China were more sensitive to climate change than those in river basins of southern China. Ranked by degree of sensitivity, the factors to which streamflow changes are sensitive were: precipitation > human activities > relative humidity > solar radiation > maximum temperature > wind speed > minimum temperature. Hence, hydrological systems in northern China are more fragile and more sensitive to the changing environment than those in southern China, and water resources management in northern China is therefore more challenging; (2) during 1980-2000, climate change tended to increase streamflow across China and had a dominant role in streamflow variation, although it tended to decrease streamflow in river basins of northern China, while human activities generally caused a decrease in streamflow across China; (3) in recent years (2001-2014), human activities have had an increasing impact on streamflow changes, and the fractional contributions of climate change and human activities to streamflow changes are 53.5% and 46.5%, respectively. The increasing human-induced impacts on streamflow changes have the potential to add more uncertainty to the management of water resources at different spatial and temporal scales.
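
    A hedged sketch of the Budyko-type attribution: the Choudhury form of the Budyko curve gives mean annual evaporation from precipitation P and potential evaporation E0 through a catchment parameter n, which is often treated as absorbing human and landscape influence. Changing the climate inputs and the parameter separately then splits the streamflow change. The formulation and numbers below are generic illustrations, not the authors' calibration for any Chinese basin.

```python
def budyko_runoff(p, e0, n):
    """Mean annual runoff Q = P - E, with E from the Choudhury form of the Budyko curve."""
    e = p * e0 / (p ** n + e0 ** n) ** (1.0 / n)
    return p - e

# Hypothetical mean annual values (mm): baseline period vs. changed period.
p1, e0_1, n1 = 650.0, 1000.0, 2.0     # period 1 climate and catchment parameter
p2, e0_2, n2 = 600.0, 1050.0, 2.4     # period 2: drier, warmer, more human influence

q1 = budyko_runoff(p1, e0_1, n1)
q2 = budyko_runoff(p2, e0_2, n2)

# Climate contribution: change the climate inputs while keeping the catchment parameter fixed.
dq_climate = budyko_runoff(p2, e0_2, n1) - q1
# Human/landscape contribution: the remainder, attributed to the change in n.
dq_human = (q2 - q1) - dq_climate

print(f"total change  = {q2 - q1:6.1f} mm")
print(f"climate share = {dq_climate:6.1f} mm")
print(f"human share   = {dq_human:6.1f} mm")
```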

  8. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  9. Real-time streamflow conditions

    Science.gov (United States)

    Graczyk, David J.; Gebert, Warren A.

    1996-01-01

    Would you like to know streamflow conditions before you go fishing in Wisconsin or in more distant locations? Real-time streamflow data throughout Wisconsin and the United States are available on the Internet from the U.S. Geological Survey. You can see if the stream you are interested in fishing is high due to recent rain or low because of an extended dry spell. Flow conditions at more than 100 stream-gaging stations located throughout Wisconsin can be viewed by accessing the Wisconsin District Home Page at: http://wwwdwimdn.er.usgs.gov
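
    Retrieving real-time discharge of the kind described in this record is now typically done through the USGS water services API rather than the 1996-era district web page cited above. The sketch below uses the waterservices.usgs.gov instantaneous-values endpoint with the standard discharge parameter code (00060); the site number is illustrative, and the endpoint details and JSON layout should be checked against current USGS documentation.

```python
"""Minimal sketch of pulling real-time discharge from the USGS water services
API; endpoint, parameter code, and JSON layout should be verified against
current USGS documentation before use."""
import requests

SITE = "05407000"   # illustrative Wisconsin streamgage number
URL = "https://waterservices.usgs.gov/nwis/iv/"

params = {
    "format": "json",
    "sites": SITE,
    "parameterCd": "00060",   # 00060 = discharge, cubic feet per second
    "siteStatus": "active",
}
resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()

# assumed JSON structure of the instantaneous-values service
series = resp.json()["value"]["timeSeries"][0]
latest = series["values"][0]["value"][-1]
print(series["sourceInfo"]["siteName"])
print(f'{latest["dateTime"]}: {latest["value"]} cfs')
```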

  10. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to "Technical Work Plan for: Waste Package Design Description for LA" (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, "Analyses and Models" (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  11. Validation of Biomarker-based risk prediction models

    OpenAIRE

    Taylor, Jeremy M.G.; Ankerst, Donna P.; Andridge, Rebecca R.

    2008-01-01

    The increasing availability and use of predictive models to facilitate informed decision making highlights the need for careful assessment of the validity of these models. In particular, models involving biomarkers require careful validation for two reasons: issues with overfitting when complex models involve a large number of biomarkers, and inter-laboratory variation in assays used to measure biomarkers. In this paper we distinguish between internal and external statistical validation. Inte...

  12. Empirical data validation for model building

    Science.gov (United States)

    Kazarian, Aram

    2008-03-01

    Optical Proximity Correction (OPC) has become an integral and critical part of process development for advanced technologies with challenging k1 requirements. OPC solutions in turn require stable, predictive models to be built that can project the behavior of all structures. These structures must comprehend all geometries that can occur in the layout in order to define the optimal corrections by feature, and thus enable a manufacturing process with acceptable margin. The model is built upon two main component blocks. First is knowledge of the process conditions, which includes the optical parameters (e.g. illumination source, wavelength, lens characteristics, etc.) as well as mask definition, resist parameters and process film stack information. Second is the empirical critical dimension (CD) data collected using this process on specific test features, the results of which are used to fit and validate the model and to project resist contours for all allowable feature layouts. The quality of the model therefore is highly dependent on the integrity of the process data collected for this purpose. Since the test pattern suite generally extends to below the resolution limit that the process can support with adequate latitude, the CD measurements collected can often be quite noisy with marginal signal-to-noise ratios. In order for the model to be reliable and a best representation of the process behavior, it is necessary to scrutinize empirical data to ensure that it is not dominated by measurement noise or flyer/outlier points. The primary approach for generating a clean, smooth and dependable empirical data set should be a replicated measurement sampling that can help to statistically reduce measurement noise by averaging. However, it can often be impractical to collect the amount of data needed to ensure a clean data set by this method. An alternate approach is studied in this paper to further smooth the measured data by means of curve fitting to identify remaining
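
    The data-cleaning step described in this record, replicate averaging plus removal of flyer/outlier points, can be sketched as follows; the median-absolute-deviation screen and the sample readings are illustrative choices, not the procedure from the paper.

```python
"""Sketch of cleaning replicated CD measurements before model fitting: average
replicates after dropping flyers with a median-absolute-deviation screen."""
import numpy as np

def clean_replicates(cd_replicates, k=3.5):
    """Return the mean of replicate CD readings after removing MAD outliers."""
    x = np.asarray(cd_replicates, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    if mad == 0:
        return x.mean()
    keep = np.abs(x - med) <= k * 1.4826 * mad   # 1.4826 ~ Gaussian consistency factor
    return x[keep].mean()

# five repeated measurements of the same test feature (nm), one flyer included
readings = [45.1, 44.8, 45.3, 52.9, 45.0]
print(f"cleaned CD: {clean_replicates(readings):.2f} nm")
```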

  13. A Self-Calibrating Runoff and Streamflow Remote Sensing Model for Ungauged Basins Using Open-Access Earth Observation Data

    Directory of Open Access Journals (Sweden)

    Ate Poortinga

    2017-01-01

    Full Text Available Due to increasing pressures on water resources, there is a need to monitor regional water resource availability in a spatially and temporally explicit manner. However, for many parts of the world, there is insufficient data to quantify streamflow or groundwater infiltration rates. We present the results of a pixel-based water balance formulation to partition rainfall into evapotranspiration, surface water runoff and potential groundwater infiltration. The method leverages remote sensing derived estimates of precipitation, evapotranspiration, soil moisture, Leaf Area Index, and a single F coefficient to distinguish between runoff and storage changes. The study produced significant correlations between the remote sensing method and field-based measurements of river flow in two Vietnamese river basins. For the Ca basin, we found R2 values ranging from 0.88–0.97 and Nash–Sutcliffe efficiency (NSE) values varying between 0.44–0.88. The R2 for the Red River varied between 0.87–0.93 and NSE values between 0.61 and 0.79. Based on these findings, we conclude that the method allows for a fast and cost-effective way to map water resource availability in basins with no gauges or monitoring infrastructure, without the need for application of sophisticated hydrological models or resource-intensive data.
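
    The two skill scores reported in this record, R2 and Nash-Sutcliffe efficiency, can be computed as in the following sketch; the observed and simulated flow series are made-up numbers for illustration only.

```python
"""Sketch of the R^2 and Nash-Sutcliffe efficiency scores used to compare
remote-sensing streamflow estimates with gauge data; series are illustrative."""
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    return np.corrcoef(obs, sim)[0, 1] ** 2

obs = np.array([120.0, 340.0, 560.0, 410.0, 230.0, 150.0])   # gauged flow, m3/s
sim = np.array([100.0, 310.0, 600.0, 430.0, 200.0, 170.0])   # remote-sensing estimate

print(f"NSE = {nash_sutcliffe(obs, sim):.2f}, R2 = {r_squared(obs, sim):.2f}")
```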

  14. An integrated hydrological, ecological, and economical (HEE) modeling system for assessing water resources and ecosystem production: calibration and validation in the upper and middle parts of the Yellow River Basin, China

    Science.gov (United States)

    Li, Xianglian; Yang, Xiusheng; Gao, Wei

    2006-08-01

    Effective management of water resources in arid and semi-arid areas demands studies that cross the disciplinary boundaries of the natural and social sciences. An integrated Hydrological, Ecological and Economical (HEE) modeling system at regional scale has been developed to assess water resources use and ecosystem production in arid and semi-arid areas. As a physically-based distributed modeling system, the HEE modeling system requires various input parameters including those for soil, vegetation, topography, groundwater, and water and agricultural management at different spatial levels. A successful implementation of the modeling system highly depends on how well it is calibrated. This paper presented an automatic calibration procedure for the HEE modeling system and its test in the upper and middle parts of the Yellow River basin. Prior to calibration, a comprehensive literature investigation and sensitivity analysis were performed to identify important parameters for calibration. The automatic calibration procedure was based on a conventional Monte Carlo sampling method together with a multi-objective criterion for calibration over multiple sites and multiple outputs. The multi-objective function consisted of optimizing the statistics of mean absolute relative error (MARE), Nash-Sutcliffe model efficiency coefficient (ENS), and coefficient of determination (R2). The modeling system was calibrated against streamflow and harvest yield data from multiple sites/provinces within the basin over 2001 using the proposed automatic procedure, and validated over 1993-1995. Over the calibration period, the mean absolute relative error of simulated daily streamflow was within 7% while the statistics R2 and ENS of daily streamflow were 0.61 and 0.49 respectively. Average simulated harvest yield over the calibration period was about 9.2% less than that of observations. Overall calibration results have indicated that the calibration procedures developed in this study can efficiently calibrate
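
    A minimal sketch of the Monte Carlo, multi-objective calibration idea follows: sample parameter sets, run a model, and score each set against observations with a combined MARE/NSE/R2 objective. The toy run_model function stands in for the HEE modeling system, and the equal weighting in the objective is an assumption, not the paper's formulation.

```python
"""Sketch of Monte Carlo multi-objective calibration with MARE, NSE and R^2;
the model and data are toy stand-ins, not the HEE modeling system."""
import numpy as np

rng = np.random.default_rng(42)

def objective(obs, sim):
    """Combined calibration score: smaller is better (weights are an assumption)."""
    mare = np.mean(np.abs((sim - obs) / obs))
    nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2
    return mare + (1.0 - nse) + (1.0 - r2)

def run_model(params, rain):
    """Placeholder for the distributed model: a toy linear-reservoir runoff."""
    k, c = params
    return c * rain * (1.0 - np.exp(-1.0 / k))

rain = rng.gamma(2.0, 5.0, size=365)   # synthetic daily rainfall forcing
obs_flow = np.clip(run_model((3.0, 0.4), rain) + rng.normal(0, 0.2, 365), 0.05, None)

candidates = zip(rng.uniform(0.5, 10.0, 2000), rng.uniform(0.1, 0.9, 2000))
best_score, best_params = min(
    ((objective(obs_flow, run_model(p, rain)), p) for p in candidates),
    key=lambda t: t[0],
)
print(f"best score {best_score:.3f} with (k, c) = {best_params}")
```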

  15. Model validation of channel zapping quality

    Science.gov (United States)

    Kooij, Robert; Nicolai, Floris; Ahmed, Kamal; Brunnström, Kjell

    2009-02-01

    In an earlier paper we showed that perceived quality of channel zapping is related to the perceived quality of download time of web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call a lean-forward experiment and gave the rule-of-thumb result that the zapping time must be less than 0.43 sec to be good (> 3.5 on the MOS scale). To validate the model we have done new subjective experiments. These experiments included lean-backward zapping, i.e. sitting on a sofa with a remote control. The subjects are more forgiving in this case and the requirement could be relaxed to 0.67 sec. We also conducted subjective experiments where the zapping times are varying. We found that the MOS rating decreases if zapping delay times are varying. In our experiments we assumed uniformly distributed delays, where the variance cannot be larger than the mean delay. We found that, in order to obtain a MOS rating of at least 3.5, the maximum allowed variance, and thus also the maximum allowed mean zapping delay, is 0.46 sec.

  16. Deep groundwater mediates streamflow response to climate warming in the Oregon Cascades

    Science.gov (United States)

    Christina Tague; Gordon Grant; Mike Farrell; Janet Choate; Anne Jefferson

    2008-01-01

    Recent studies predict that projected climate change will lead to significant reductions in summer streamflow in the mountainous regions of the Western United States. Hydrologic modeling directed at quantifying these potential changes has focused on the magnitude and timing of spring snowmelt as the key control on the spatial temporal pattern of summer streamflow. We...

  17. Streamflow trends in Europe: evidence from a dataset of near-natural catchments

    NARCIS (Netherlands)

    Stahl, K.; Hisdal, H.; Hannaford, J.; Tallaksen, L.; Lanen, van H.A.J.; Sauquet, E.; Demuth, S.; Fendeková, M.; Jódar, J.

    2010-01-01

    Streamflow observations from near-natural catchments are of paramount importance for detection and attribution studies, evaluation of large-scale model simulations, and assessment of water management, adaptation and policy options. This study investigates streamflow trends in a newly-assembled, cons

  18. The effects of changing land cover on streamflow simulation in Puerto Rico

    Science.gov (United States)

    A.E. Van Beusekom; L.E. Hay; R.J. Viger; W.A. Gould; J.A. Collazo; A. Henareh Khalyani

    2014-01-01

    This study quantitatively explores whether land cover changes have a substantive impact on simulated streamflow within the tropical island setting of Puerto Rico. The Precipitation Runoff Modeling System (PRMS) was used to compare streamflow simulations based on five static parameterizations of land cover with those based on dynamically varying parameters derived from...

  19. Unorganized machines for seasonal streamflow series forecasting.

    Science.gov (United States)

    Siqueira, Hugo; Boccato, Levy; Attux, Romis; Lyra, Christiano

    2014-05-01

    Modern unorganized machines--extreme learning machines and echo state networks--provide an elegant balance between processing capability and mathematical simplicity, circumventing the difficulties associated with the conventional training approaches of feedforward/recurrent neural networks (FNNs/RNNs). This work performs a detailed investigation of the applicability of unorganized architectures to the problem of seasonal streamflow series forecasting, considering scenarios associated with four Brazilian hydroelectric plants and four distinct prediction horizons. Experimental results indicate the pertinence of these models to the focused task.
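
    An extreme learning machine, one of the unorganized machines mentioned in this record, can be sketched in a few lines: a random hidden layer followed by a least-squares fit of the output weights. The synthetic seasonal series, lag count and hidden-layer size below are illustrative, not the Brazilian plant data.

```python
"""Minimal extreme learning machine sketch for one-step-ahead seasonal
streamflow forecasting; data and hyperparameters are illustrative."""
import numpy as np

rng = np.random.default_rng(0)

def make_lagged(series, n_lags):
    """Build a lagged design matrix X and one-step-ahead targets y."""
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    return X, series[n_lags:]

class ELM:
    def __init__(self, n_hidden=40, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y):
        # random, untrained hidden layer; only output weights are solved
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# synthetic monthly streamflow with a seasonal cycle plus noise
t = np.arange(480)
flow = 100 + 60 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 10, t.size)

X, y = make_lagged(flow, n_lags=12)
split = 360
model = ELM().fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2))
print(f"test RMSE: {rmse:.1f}")
```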

  20. Validation technique using mean and variance of kriging model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ho Sung; Jung, Jae Jun; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of)

    2007-07-01

    To validate the accuracy of a metamodel rigorously is an important research area in metamodeling techniques. A leave-k-out cross-validation technique not only requires considerable computational cost but also cannot measure the fidelity of the metamodel quantitatively. Recently, the average validation technique has been proposed. However, the average validation criterion may stop a sampling process prematurely even if the kriging model is still inaccurate. In this research, we propose a new validation technique using an average and a variance of the response during a sequential sampling method, such as maximum entropy sampling. The proposed validation technique is more efficient and accurate than the cross-validation technique, because it explicitly integrates the kriging model to achieve an accurate average and variance, rather than relying on numerical integration. The proposed validation technique shows a similar trend to the root mean squared error, so it can be used as a stopping criterion for sequential sampling.

  1. Validity of covariance models for the analysis of geographical variation

    DEFF Research Database (Denmark)

    Guillot, Gilles; Schilling, Rene L.; Porcu, Emilio

    2014-01-01

    attention lately and show that the conditions under which they are valid mathematical models have been overlooked so far. 3. We provide rigorous results for the construction of valid covariance models in this family. 4. We also outline how to construct alternative covariance models for the analysis...

  2. Estimation of baseline daily mean streamflows for ungaged locations on Pennsylvania streams, water years 1960-2008

    Science.gov (United States)

    Stuckey, Marla H.; Koerkle, Edward H.; Ulrich, James E.

    2012-01-01

    Water-resource managers use daily mean streamflows to generate streamflow statistics and analyze streamflow conditions. An in-depth evaluation of flow regimes to promote instream ecological health often requires streamflow information obtainable only from a time-series hydrograph. Historically, it has been difficult to estimate daily mean streamflow for an ungaged location. The U.S. Geological Survey (USGS), in cooperation with the Pennsylvania Department of Environmental Protection, Susquehanna River Basin Commission, and The Nature Conservancy, has developed the Baseline Streamflow Estimator (BaSE) to estimate baseline streamflow at a daily time scale for ungaged streams in Pennsylvania using data collected during water years 1960–2008. Baseline streamflow is minimally altered by regulation, diversion, mining, and other anthropogenic activities. Daily mean streamflow is estimated in BaSE using a methodology that equates streamflow as a percentile from a flow duration curve for a particular day at an ungaged location with streamflow as a percentile from the flow duration curve for the same day at a reference streamgage that is considered to be hydrologically similar to the ungaged location. An appropriate reference streamgage is selected using map correlation, in which variogram models are developed that correlate streamflow at one streamgage with streamflows at all other streamgages. The percentiles from a flow duration curve for the ungaged location are converted to streamflow through the use of regression equations. Regression equations used to predict 17 flow-duration exceedance probabilities were developed for Pennsylvania using geographic information system-derived basin characteristics. The standard error of prediction for the regression equations ranged from 11 percent to 92 percent with a mean of 31 percent.
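
    The core percentile-transfer step in BaSE can be sketched as follows: each day's flow at the reference streamgage is converted to an exceedance percentile and then mapped through the ungaged site's regression-derived flow-duration curve. The reference flows and FDC values below are illustrative numbers, not BaSE outputs.

```python
"""Sketch of a flow-duration-curve percentile transfer from a reference
streamgage to an ungaged site; all numbers are illustrative."""
import numpy as np
from scipy.stats import percentileofscore

# daily flows at the reference streamgage (m3/s), illustrative
ref_flows = np.array([12.0, 9.5, 8.0, 7.2, 15.0, 30.0, 22.0, 11.0, 9.0, 8.5])

# regression-estimated flow-duration curve at the ungaged site:
# exceedance probabilities (%) and the corresponding flows (m3/s)
fdc_exceed = np.array([1, 5, 10, 25, 50, 75, 90, 95, 99], dtype=float)
fdc_flow = np.array([40.0, 25.0, 18.0, 10.0, 6.0, 3.5, 2.0, 1.5, 1.0])

def transfer_day(q_ref):
    """Map one day's reference-gage flow to the ungaged site."""
    exceed = 100.0 - percentileofscore(ref_flows, q_ref, kind="mean")
    exceed = np.clip(exceed, fdc_exceed[0], fdc_exceed[-1])
    return np.interp(exceed, fdc_exceed, fdc_flow)

estimated = np.array([transfer_day(q) for q in ref_flows])
print(np.round(estimated, 2))
```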

  3. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformation verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  4. Analysis of subsurface storage and streamflow generation in urban watersheds

    Science.gov (United States)

    Bhaskar, Aditi S.; Welty, Claire

    2015-03-01

    Subsurface storage as a regulator of streamflow was investigated as an explanation for the large proportion of pre-event water observed in urban streams during storm events. We used multiple lines of inquiry to explore the relationship between pre-event water proportion, subsurface storage, and streamflow under storm conditions. First, we used a three-dimensional model of integrated subsurface and surface flow and solute transport to simulate an idealized hillslope to perform model-based chemical hydrograph separation of stormflow. Second, we employed simple dynamical systems analysis to derive the relationship between subsurface storage and streamflow for three Baltimore, Maryland watersheds (3.8-14 km2 in area) along an urban-to-rural gradient. Last, we applied chemical hydrograph separation to high-frequency specific conductance data in nested urban watersheds (˜50% impervious surface cover) in Dead Run, Baltimore County, Maryland. Unlike the importance of antecedent subsurface storage observed in some systems, we found that rainfall depth and not subsurface storage was the primary control on pre-event water proportion in both field observations and hillslope numerical experiments. Field observations showed that antecedent stream base flow did not affect pre-event water proportion or streamflow values under storm conditions. Hillslope model results showed that the relationship between streamflow values under storm conditions and subsurface storage was clockwise hysteretic. The simple dynamical systems approach showed that stream base flow in the most urbanized of three watersheds exhibited the largest sensitivity to changes in storage. This work raises questions about the streamflow generation mechanisms by which pre-event water dominates urban storm hydrographs, and the shifts between mechanisms in rural and urban watersheds.
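
    A two-component chemical hydrograph separation of the kind applied to the high-frequency specific-conductance data in this record can be sketched as below; the end-member conductances and the storm series are illustrative values, not the Dead Run observations.

```python
"""Sketch of a two-component chemical hydrograph separation using specific
conductance; end members and storm series are illustrative."""
import numpy as np

def pre_event_fraction(sc_stream, sc_pre_event, sc_event):
    """Fraction of streamflow that is pre-event ('old') water, by mass balance."""
    return (sc_stream - sc_event) / (sc_pre_event - sc_event)

sc_pre_event = 450.0   # baseflow specific conductance (uS/cm)
sc_event = 60.0        # rainfall / event-water specific conductance (uS/cm)

# specific conductance and discharge observed in the stream through a storm
sc_storm = np.array([440.0, 380.0, 250.0, 180.0, 220.0, 330.0, 420.0])
q_storm = np.array([0.3, 0.9, 2.5, 4.0, 3.1, 1.2, 0.5])   # streamflow, m3/s

f_old = np.clip(pre_event_fraction(sc_storm, sc_pre_event, sc_event), 0, 1)
q_old = f_old * q_storm
print("pre-event fraction:", np.round(f_old, 2))
print("event-water flow:  ", np.round(q_storm - q_old, 2))
```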

  5. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, J.; Whitmore, J.; Kaffine, L.; Blair, N.; Dobos, A. P.

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.

  6. SDG-based Model Validation in Chemical Process Simulation

    Institute of Scientific and Technical Information of China (English)

    张贝克; 许欣; 马昕; 吴重光

    2013-01-01

    Signed direct graph (SDG) theory provides algorithms and methods that can be applied directly to chemical process modeling and analysis to validate simulation models, and is a basis for the development of a software environment that can automate the validation activity. This paper is concentrated on the pretreatment of the model validation. We use the validation scenarios and standard sequences generated by a well-established SDG model to validate the trends fitted from the simulation model. The results are helpful to find potential problems, assess possible bugs in the simulation model and solve the problem effectively. A case study on a simulation model of a boiler is presented to demonstrate the effectiveness of this method.

  7. Drought and climatic change impact on streamflow in small watersheds.

    Science.gov (United States)

    Tigkas, Dimitris; Vangelis, Harris; Tsakiris, George

    2012-12-01

    The paper presents a comprehensive, though simple, methodology for forecasting the annual hydrological drought, based on meteorological drought indications available early in the hydrological year. The meteorological drought of 3, 6 and 9 months is estimated using the reconnaissance drought index (RDI), whereas the annual hydrological drought is represented by the streamflow drought index (SDI). Regression equations are derived between RDI and SDI, forecasting the level of hydrological drought for the entire year in real time. Further, using a wide range of scenarios representing possible climatic changes and drought events of varying severity, nomographs are devised for estimating the annual streamflow change. The Medbasin rainfall-runoff model is used to link meteorological data to streamflow. The latter approach can be useful for developing preparedness plans to combat the consequences of drought and climate change. As a case study, the area of N. Peloponnese (Greece) was selected, incorporating several small river basins.
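
    The regression step linking the early-season meteorological index (RDI) to the annual hydrological index (SDI) can be sketched as follows; the paired index values are synthetic and stand in for the Peloponnese record.

```python
"""Sketch of an RDI-to-SDI regression for real-time hydrological drought
forecasting; index values are synthetic."""
import numpy as np

# paired historical values: 6-month RDI (early season) and annual SDI
rdi_6m = np.array([-1.8, -1.2, -0.6, -0.1, 0.3, 0.8, 1.1, 1.5, -0.9, 0.1])
sdi_annual = np.array([-1.6, -1.0, -0.7, -0.3, 0.2, 0.6, 1.0, 1.3, -0.8, 0.0])

slope, intercept = np.polyfit(rdi_6m, sdi_annual, deg=1)

def forecast_sdi(rdi_now):
    return slope * rdi_now + intercept

print(f"SDI = {slope:.2f} * RDI + {intercept:.2f}")
print(f"forecast for RDI = -1.4: SDI ~ {forecast_sdi(-1.4):.2f}")
```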

  8. A geohydrologic framework for characterizing summer streamflow sensitivity to climate warming in the Pacific Northwest, USA

    Directory of Open Access Journals (Sweden)

    M. Safeeq

    2014-03-01

    Full Text Available Summer streamflows in the Pacific Northwest are largely derived from melting snow and groundwater discharge. As the climate warms, diminishing snowpack and earlier snowmelt will cause reductions in summer streamflow. Most assessments of the impacts of a changing climate on streamflow make use of downscaled temperature and precipitation projections from General Circulation Models (GCMs). Projected climate simulations from these GCMs are often too coarse for planning purposes, as they do not capture smaller scale topographic controls and other important watershed processes. This uncertainty is further amplified when downscaled climate predictions are coupled to macroscale hydrologic models that fail to capture streamflow contributions from deep groundwater. Deep aquifers play an important role in mediating streamflow response to climate change, and groundwater needs to be explicitly incorporated into sensitivity assessments. Here we develop and apply an analytical framework for characterizing summer streamflow sensitivity to a change in the timing and magnitude of recharge in a spatially explicit fashion. Two patterns emerge from this analysis: first, areas with high streamflow sensitivity also have higher summer streamflows as compared to low-sensitivity areas. Second, the level of sensitivity and the spatial extent of highly sensitive areas diminish over time as the summer progresses. Results of this analysis point to a robust, practical, and scalable approach that can help assess risk at the landscape scale, complement the downscaling approach, be applied to any climate scenario of interest, and provide a framework to assist land and water managers in adapting to an uncertain and potentially challenging future.

  9. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
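
    The validation recipe in the two NTCP records above, an L1-penalized (LASSO-type) model checked with double (nested) cross-validation and a permutation test, can be sketched with scikit-learn as below; the random features stand in for dose metrics and the grid of penalty strengths is an assumption.

```python
"""Sketch of nested (double) cross-validation plus a permutation test for an
L1-penalized logistic NTCP-style model; data are random stand-ins."""
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (GridSearchCV, cross_val_score,
                                     permutation_test_score)

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 20))                                 # candidate dose metrics
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 120) > 0).astype(int)

lasso_logit = LogisticRegression(penalty="l1", solver="liblinear", max_iter=1000)
inner = GridSearchCV(lasso_logit, {"C": [0.01, 0.1, 1.0, 10.0]},
                     scoring="roc_auc", cv=5)

# outer loop of the double cross-validation
outer_auc = cross_val_score(inner, X, y, scoring="roc_auc", cv=5)
print("nested-CV AUC: %.2f +/- %.2f" % (outer_auc.mean(), outer_auc.std()))

# permutation test: is the observed performance better than chance?
score, perm_scores, p_value = permutation_test_score(
    inner, X, y, scoring="roc_auc", cv=5, n_permutations=50, random_state=0)
print("permutation-test p-value: %.3f" % p_value)
```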

  11. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...... are continuous or discrete. With both simulated data, and a real data set related to geolocation of seals, we demonstrate both the potential and the limitations of the techniques. Our results fill a need for convenient methods for validating a state space model, or alternatively, rejecting it while indicating...

  12. Streamflow simulation methods for ungauged and poorly gauged watersheds

    Science.gov (United States)

    Loukas, A.; Vasiliades, L.

    2014-07-01

    Rainfall-runoff modelling procedures for ungauged and poorly gauged watersheds are developed in this study. A well-established hydrological model, the University of British Columbia (UBC) watershed model, is selected and applied in five different river basins located in Canada, Cyprus, and Pakistan. Catchments from cold, temperate, continental, and semiarid climate zones are included to demonstrate the procedures developed. Two methodologies for streamflow modelling are proposed and analysed. The first method uses the UBC watershed model with a universal set of parameters for water allocation and flow routing, and precipitation gradients estimated from the available annual precipitation data as well as from regional information on the distribution of orographic precipitation. This method is proposed for watersheds without streamflow gauge data and limited meteorological station data. The second hybrid method proposes the coupling of UBC watershed model with artificial neural networks (ANNs) and is intended for use in poorly gauged watersheds which have limited streamflow measurements. The two proposed methods have been applied to five mountainous watersheds with largely varying climatic, physiographic, and hydrological characteristics. The evaluation of the applied methods is based on the combination of graphical results, statistical evaluation metrics, and normalized goodness-of-fit statistics. The results show that the first method satisfactorily simulates the observed hydrograph assuming that the basins are ungauged. When limited streamflow measurements are available, the coupling of ANNs with the regional, non-calibrated UBC flow model components is considered a successful alternative method to the conventional calibration of a hydrological model based on the evaluation criteria employed for streamflow modelling and flood frequency estimation.

  13. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    Science.gov (United States)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source of human exposure to potentially carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models have been developed to simulate the vapor intrusion process, detailed validation of these models against well-controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.

  14. Prospects and problems for standardizing model validation in systems biology.

    Science.gov (United States)

    Gross, Fridolin; MacLeod, Miles

    2017-10-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary collaboration, model exchange, and be especially relevant for applications close to medical practice. However, even though the production of predictively valid models is considered a central goal, in practice modeling in systems biology employs a variety of model structures and model-building practices. These serve a variety of purposes, many of which are heuristic and do not seem to require strict validation criteria and may even be restricted by them. Moreover, given the current situation in systems biology, implementing a validation standard would face serious technical obstacles mostly due to the quality of available empirical data. We advocate a cautious approach to standardization. However even though rigorous standardization seems premature at this point, raising the issue helps us develop better insights into the practices of systems biology and the technical problems modelers face validating models. Further it allows us to identify certain technical validation issues which hold regardless of modeling context and purpose. Informal guidelines could in fact play a role in the field by helping modelers handle these. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow...

  16. Prospects and problems for standardizing model validation in systems biology

    NARCIS (Netherlands)

    Gross, Fridolin; MacLeod, Miles Alexander James

    2017-01-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary coll

  17. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight into scalability and performance of future deployed networks. Because validated models of key Cisco equipment we

  18. Foundational Issues in Statistical Modeling: Statistical Model Specification and Validation

    Directory of Open Access Journals (Sweden)

    Aris Spanos

    2011-01-01

    Full Text Available Statistical model specification and validation raise crucial foundational problems whose pertinent resolution holds the key to learning from data by securing the reliability of frequentist inference. The paper questions the judiciousness of several current practices, including the theory-driven approach and the Akaike-type model selection procedures, arguing that they often lead to unreliable inferences. This is primarily due to the fact that goodness-of-fit/prediction measures and other substantive and pragmatic criteria are of questionable value when the estimated model is statistically misspecified. Foisting one's favorite model on the data often yields estimated models which are both statistically and substantively misspecified, but one has no way to delineate between the two sources of error and apportion blame. The paper argues that the error statistical approach can address this Duhemian ambiguity by distinguishing between statistical and substantive premises and viewing empirical modeling in a piecemeal way with a view to delineating the various issues more effectively. It is also argued that Hendry's general-to-specific procedure does a much better job in model selection than the theory-driven and Akaike-type procedures, primarily because of its error statistical underpinnings.

  19. Toward Validation of the Diagnostic-Prescriptive Model

    Science.gov (United States)

    Ysseldyke, James E.; Sabatino, David A.

    1973-01-01

    Criticized are recent research efforts to validate the diagnostic prescriptive model of remediating learning disabilities, and proposed is a 6-step psychoeducational model designed to ascertain links between behavioral differences and instructional outcomes. (DB)

  20. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian while universities in the United States of America (USA) were developing ways to obtain credits to those students which was coming with experiences from working life....

  1. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    Science.gov (United States)

    2015-03-01

    Historical Methods The three historical methods of validation are rationalism, empiricism, and positive economics. Rationalism requires that... Empiricism requires every assumption and outcome to be empirically validated. Positive economics requires only that the model's outcome(s) be correct... historical methods of rationalism, empiricism, and positive economics into a multistage process of validation. This validation method consists of (1

  2. DEBRIS FLOWS AND HYPERCONCENTRATED STREAMFLOWS.

    Science.gov (United States)

    Wieczorek, Gerald F.

    1986-01-01

    Examination of recent debris-flow and hyperconcentrated-streamflow events in the western United States reveals (1) the topographic, geologic, hydrologic, and vegetative conditions that affect initiation of debris flows and (2) the wide-ranging climatic conditions that can trigger debris flows. Recognition of these physiographic and climatic conditions has aided development of preliminary methods for hazard evaluation. Recent developments in the application of electronic data gathering, transmitting, and processing systems show potential for real-time hazard warning.

  3. Global separation of plant transpiration from groundwater and streamflow

    Science.gov (United States)

    Jaivime Evaristo; Scott Jasechko; Jeffrey J. McDonnell

    2015-01-01

    Current land surface models assume that groundwater, streamflow and plant transpiration are all sourced and mediated by the same well mixed water reservoir—the soil. However, recent work in Oregon and Mexico has shown evidence of ecohydrological separation, whereby different subsurface compartmentalized pools of water supply either plant transpiration fluxes or the...

  4. Contribution of MODIS Derived Snow Cover Satellite Data into Artificial Neural Network for Streamflow Estimation

    Science.gov (United States)

    Uysal, Gokcen; Arda Sorman, Ali; Sensoy, Aynur

    2014-05-01

    The contribution of snowmelt, and correspondingly of snow observations, is highly important in mountainous basins for modelers who deal with conceptual, physical or soft computing models in terms of effective water resources management. Long-term, continuous archived data are needed for appropriate training and testing of data-driven approaches like artificial neural networks (ANN). Data are scarce at the upper elevations due to the difficulty of installing sufficient automated SNOTEL stations; thus many studies in the literature have focused on rainfall-dominated basins for streamflow estimation. On the other hand, optical satellites can easily detect snow because of its high reflectance. The MODIS (Moderate Resolution Imaging Spectroradiometer) satellite, which has two platforms (Terra and Aqua), provides daily and 8-daily snow images for different time periods since 2000; therefore snow cover data (SCA) may be useful as an input layer for ANN applications. In this study, a multi-layer perceptron (MLP) model is trained and tested with precipitation, temperature, radiation and previous-day discharges as well as MODIS daily SCA data. The weights and biases are optimized with the fast and robust Levenberg-Marquardt backpropagation algorithm. MODIS snow cover images are cleared of cloud coverage using certain filtering techniques. The Upper Euphrates River Basin in the eastern part of Turkey (10,250 km2) is selected as the application area since snowmelt supplies approximately 2/3 of the total annual volume during spring and early summer. Several input models and ANN structures are investigated to see the effect of the contributions, using 10 years of data (2001-2010) for training and validation. The accuracy of the streamflow estimations is checked with statistical criteria (coefficient of determination, Nash-Sutcliffe model efficiency, root mean square error, mean absolute error) and the results seem to improve when SCA data are introduced. Furthermore, a forecast study is

  5. Separating streamflow components to reveal nutrient flowpaths: Toenepi Stream

    Science.gov (United States)

    Stewart, Michael

    2015-04-01

    separation of the streamflow into three components, and the procedure gave results comparable to the modelling study of Woodward et al. (2013). References Stewart, M.K. 2014: New baseflow separation and recession analysis approaches for streamflow. Hydrol. Earth System Sci. Disc. 11, 7089-7131. doi:10.5194/hessd-11-7089-2014 Woodward, S.J.R., Stenger, R., Bidwell, V.J. 2013: Dynamic analysis of stream flow and water chemistry to infer subsurface water and nitrate fluxes in a lowland dairying catchment. J. Hydrol. 505, 299-311.

  6. Validation of Numerical Shallow Water Models for Tidal Lagoons

    Energy Technology Data Exchange (ETDEWEB)

    Eliason, D.; Bourgeois, A.

    1999-11-01

    An analytical solution is presented for the case of a stratified, tidally forced lagoon. This solution, especially its energetics, is useful for the validation of numerical shallow water models under stratified, tidally forced conditions. The utility of the analytical solution for validation is demonstrated for a simple finite difference numerical model. A comparison is presented of the energetics of the numerical and analytical solutions in terms of the convergence of model results to the analytical solution with increasing spatial and temporal resolution.

  7. Using virtual reality to validate system models

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Caudell, T.P.

    1999-12-09

    To date most validation techniques are highly biased towards calculations involving symbolic representations of problems. These calculations are either formal (in the case of consistency and completeness checks), or informal in the case of code inspections. The authors believe that an essential type of evidence of the correctness of the formalization process must be provided by (i.e., must originate from) human-based calculation. They further believe that human calculation can be significantly amplified by shifting from symbolic representations to graphical representations. This paper describes their preliminary efforts in realizing such a representational shift.

  8. Extending Model Checking to Object Process Validation

    NARCIS (Netherlands)

    Rein, van H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent models

  9. Regional changes in streamflow after a megathrust earthquake

    Science.gov (United States)

    Mohr, Christian H.; Manga, Michael; Wang, Chi-Yuen; Korup, Oliver

    2017-01-01

    Moderate to large earthquakes can increase the amount of water feeding stream flows, mobilizing excess water from deep groundwater, shallow groundwater, or the vadose zone. Here we examine the regional pattern of streamflow response to the Maule M8.8 earthquake across Chile's diverse topographic and hydro-climatic gradients. We combine streamflow analyses with groundwater flow modeling and a random forest classifier, and find that, after the earthquake, at least 85 streams had a change in flow. Discharge mostly increased (n = 78) shortly after the earthquake, liberating an excess water volume of >1.1 km3, which is the largest ever reported following an earthquake. Several catchments had increased discharge of >50 mm, locally exceeding seasonal streamflow discharge under undisturbed conditions. Our modeling results favor enhanced vertical permeability induced by dynamic strain as the most probable process explaining the observed changes at the regional scale. Supporting this interpretation, our random forest classification identifies peak ground velocity and elevation extremes as most important for predicting streamflow response. Given the mean recurrence interval of ∼25 yr for >M8.0 earthquakes along the Peru-Chile Trench, our observations highlight the role of earthquakes in the regional water cycle, especially in arid environments.

  10. Cool-Season Moisture Delivery and Multi-Basin Streamflow Anomalies in the Western United States

    Science.gov (United States)

    Malevich, Steven B.

    Widespread droughts can have a significant impact on western United States streamflow, but the causes of these events are not fully understood. This dissertation examines streamflow from multiple western US basins and establishes the robust, leading modes of variability in interannual streamflow throughout the past century. I show that approximately 50% of this variability is associated with spatially widespread streamflow anomalies that are statistically independent from streamflow's response to the El Nino-Southern Oscillation (ENSO). The ENSO teleconnection accounts for approximately 25% of the interannual variability in streamflow across this network. The atmospheric circulation anomalies associated with the most spatially widespread variability involve the Aleutian low and the persistent coastal atmospheric ridge in the Pacific Northwest. I use a watershed segmentation algorithm to explicitly track the position and intensity of these features and compare their variability to the multi-basin streamflow variability. Results show that latitudinal shifts in the coastal atmospheric ridge are more strongly associated with streamflow's north-south dipole response to ENSO variability, while more spatially widespread anomalies in streamflow relate most strongly to seasonal changes in the coastal ridge intensity. This likely reflects persistent coastal-ridge blocking of cool-season precipitation into western US river basins. I utilize the 35 model runs of the Community Earth System Model Large Ensemble (CESMLE) to determine whether the model ensemble simulates the anomalously strong coastal ridges and extreme widespread wintertime precipitation anomalies found in the observational record. Though there is considerable bias in the CESMLE, its runs simulate extremely widespread dry precipitation anomalies with a frequency of approximately one extreme event per century during the historical simulations (1920 - 2005). These extremely widespread dry events

  11. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their credi

  12. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their credi

  13. Cross-validation criteria for SETAR model selection

    NARCIS (Netherlands)

    de Gooijer, J.G.

    2001-01-01

    Three cross-validation criteria, denoted C, C_c, and C_u, are proposed for selecting the orders of a self-exciting threshold autoregressive (SETAR) model when both the delay and the threshold value are unknown. The derivation of C is within a natural cross-validation framework. The criterion C_c is si

  14. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which...... ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano...... in these models remains to be established....

  15. Amendment to Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of WP2 is to establish flow models relating the wind speed at turbines in a farm. Until now, active control of power reference has not been included in these models as only data with standard operation has been available. In this report the first data series with power reference...... excitations from the Thanet farm are used for trying to update some of the models discussed in D2.5. Because of very limited amount of data only simple dynamic transfer function models can be obtained. The three obtained data series are somewhat different. Only the first data set seems to have the front...... turbine in undisturbed flow. For this data set both the multiplicative model and in particular the simple first order transfer function model can predict the down wind wind speed from upwind wind speed and loading....

  16. Gear Windage Modeling Progress - Experimental Validation Status

    Science.gov (United States)

    Kunz, Rob; Handschuh, Robert F.

    2008-01-01

    In the Subsonics Rotary Wing (SRW) Project being funded for propulsion work at NASA Glenn Research Center, performance of the propulsion system is of high importance. In current rotorcraft drive systems many gearing components operate at high rotational speed (pitch line velocity > 24000 ft/min). In our testing of high-speed helical gear trains at NASA Glenn we have found that the work done on the air-oil mist within the gearbox can become a significant part of the power loss of the system. This loss mechanism is referred to as windage. The effort described in this presentation is to try to understand the variables that affect windage and to develop a good experimental database to validate the analytical project being conducted at Penn State University by Dr. Rob Kunz under a NASA SRW NRA. The presentation provides an update to the status of these efforts.

  17. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect inc...

  18. Tree-Ring Based Streamflow Reconstructions of the Yaqui River, MX and Implications for Drought and Water Management Studies

    Science.gov (United States)

    Hoover, K. J.; Ray, A. J.; Lukas, J. J.; Villanueva-Diaz, J.

    2008-05-01

    The Yaqui River is the irrigation source for an economically important agricultural region of Northwest Mexico. Currently, planning and forecasting are based on streamflow gauge data of only about 50 years. Understanding past variations in Yaqui streamflow is important to developing river forecasts and management plans. This presentation describes an effort to develop longer proxy records of streamflow to better understand the region's climate variability and drought history. The result is a 363-year dendrochronology-based reconstruction model of Yaqui River streamflow. The model is based on a correlation between 44 years of Yaqui streamflow data and tree-ring chronologies dating to A.D. 1639. Chronologies are from Bisaloachi (28.66 N, 108.29 W), Cebadilla de Ocampo (28.122 N, 107.95 W) and Mesa de las Guacamayas (30.55 N, 108.62 W) in the state of Chihuahua, MX. The binary model uses a normalized index of annual total tree-ring width (Tree-Ring Index, TRI). The model output is the probability that a given year experienced less-than-median streamflow, a possible indicator of drought. This model correctly predicts 100% of less-than-median streamflow years using a TRI input of precipitation. Total ring width (TRW) is typically associated with winter precipitation (October-June, in this case), which represents less than 40% of annual streamflow in this region, where much of the precipitation and streamflow are related to the North American Monsoon (NAM), typically from July-September. The latewood (LW) growth portion of tree rings may better reflect the NAM precipitation and streamflow, and produce a better reconstruction model. These results show that representation of NAM streamflow is essential for a more accurate streamflow reconstruction model. More tree-ring chronologies from other parts of the basin may increase the signal of natural streamflow variance captured, strengthening the reconstruction model. In particular, analyses of LW correlation with summer
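
    The binary reconstruction model described in this record can be sketched as a logistic regression of a below-median-flow indicator on the tree-ring index; the synthetic overlap series below stands in for the 44-year calibration period and the earlier chronology, and is not the Yaqui data.

```python
"""Sketch of a binary tree-ring reconstruction: logistic regression of a
'below-median streamflow year' indicator on a tree-ring width index (TRI)."""
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

# synthetic 44-year overlap: narrow rings tend to coincide with dry years
tri = rng.normal(1.0, 0.3, 44)
flow = 800 * tri + rng.normal(0, 150, 44)
below_median = (flow < np.median(flow)).astype(int)

model = LogisticRegression().fit(tri.reshape(-1, 1), below_median)

# apply the fit to the pre-instrumental part of the chronology
tri_past = rng.normal(1.0, 0.3, 363 - 44)
p_dry = model.predict_proba(tri_past.reshape(-1, 1))[:, 1]
print("first ten reconstructed P(dry year):", np.round(p_dry[:10], 2))
```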

  19. EMMD-Prony approach for dynamic validation of simulation models

    Institute of Scientific and Technical Information of China (English)

    Ruiyang Bai

    2015-01-01

    Model validation and updating is critical to model credibility growth. In order to assess model credibility quantitatively and locate model error precisely, a new dynamic validation method based on extremum field mean mode decomposition (EMMD) and the Prony method is proposed in this paper. Firstly, complex dynamic responses from models and real systems are processed into stationary components by EMMD. These components always have definite physical meanings which can be the evidence about rough model error location. Secondly, the Prony method is applied to identify the features of each EMMD component. Amplitude similarity, frequency similarity, damping similarity and phase similarity are defined to describe the similarity of dynamic responses. Then quantitative validation metrics are obtained based on the improved entropy weight and energy proportion. Precise model error location is realized based on the physical meanings of these features. The application of this method in aircraft controller design provides evidence about its feasibility and usability.
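
    As an illustration of the Prony step alone (the EMMD decomposition is not reproduced here), the sketch below fits damped exponentials to a synthetic stationary component and recovers amplitude, frequency, damping and phase; it is a generic textbook Prony implementation, not the authors' code.

        # Hedged sketch of the classic Prony method applied to one stationary component.
        import numpy as np

        def prony(x, p, dt):
            """Fit p complex exponentials to a uniformly sampled signal x.
            Returns damping (1/s), frequency (Hz), amplitude and phase per mode."""
            N = len(x)
            # linear prediction: x[n] = -(a1*x[n-1] + ... + ap*x[n-p]) for n = p..N-1
            A = np.column_stack([x[p - j - 1:N - j - 1] for j in range(p)])
            a, *_ = np.linalg.lstsq(A, -x[p:N], rcond=None)
            z = np.roots(np.concatenate(([1.0], a)))            # discrete-time poles
            V = z[None, :] ** np.arange(N)[:, None]             # Vandermonde matrix
            h, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
            return np.log(np.abs(z)) / dt, np.angle(z) / (2 * np.pi * dt), np.abs(h), np.angle(h)

        dt = 0.01
        t = np.arange(0, 2, dt)
        x = np.exp(-0.5 * t) * np.cos(2 * np.pi * 3.0 * t)      # one damped 3 Hz mode
        damping, freq, amp, phase = prony(x, p=2, dt=dt)
        print(freq, damping)   # should recover roughly +/-3 Hz and about -0.5 1/s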

  20. Uncertainty Quantification and Validation for RANS Turbulence Models

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ε, k-ω, and v2-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  1. Validation of a national hydrological model

    Science.gov (United States)

    McMillan, H. K.; Booker, D. J.; Cattoën, C.

    2016-10-01

    Nationwide predictions of flow time-series are valuable for development of policies relating to environmental flows, calculating reliability of supply to water users, or assessing risk of floods or droughts. This breadth of model utility is possible because various hydrological signatures can be derived from simulated flow time-series. However, producing national hydrological simulations can be challenging due to strong environmental diversity across catchments and a lack of data available to aid model parameterisation. A comprehensive and consistent suite of test procedures to quantify spatial and temporal patterns in performance across various parts of the hydrograph is described and applied to quantify the performance of an uncalibrated national rainfall-runoff model of New Zealand. Flow time-series observed at 485 gauging stations were used to calculate Nash-Sutcliffe efficiency and percent bias when simulating between-site differences in daily series, between-year differences in annual series, and between-site differences in hydrological signatures. The procedures were used to assess the benefit of applying a correction to the modelled flow duration curve based on an independent statistical analysis. They were used to aid understanding of climatological, hydrological and model-based causes of differences in predictive performance by assessing multiple hypotheses that describe where and when the model was expected to perform best. As the procedures produce quantitative measures of performance, they provide an objective basis for model assessment that could be applied when comparing observed daily flow series with competing simulated flow series from any region-wide or nationwide hydrological model. Model performance varied in space and time with better scores in larger and medium-wet catchments, and in catchments with smaller seasonal variations. Surprisingly, model performance was not sensitive to aquifer fraction or rain gauge density.
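
    The two scores used above are easy to reproduce; the sketch below computes Nash-Sutcliffe efficiency and percent bias for a pair of observed and simulated daily series (the sign convention for percent bias varies between studies, so the one used here is only an assumption).

        # Hedged sketch: Nash-Sutcliffe efficiency and percent bias for a flow series.
        import numpy as np

        def nash_sutcliffe(obs, sim):
            """NSE = 1 is a perfect fit; NSE = 0 is no better than the observed mean."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def percent_bias(obs, sim):
            """Positive values indicate overestimation under this sign convention."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 100.0 * np.sum(sim - obs) / np.sum(obs)

        obs = [1.2, 3.4, 2.8, 5.1, 4.0, 2.2]   # hypothetical daily flows
        sim = [1.0, 3.9, 2.5, 4.6, 4.4, 2.0]
        print(nash_sutcliffe(obs, sim), percent_bias(obs, sim))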

  2. Analysis of managed aquifer recharge for retiming streamflow in an alluvial river

    Science.gov (United States)

    Ronayne, Michael J.; Roudebush, Jason A.; Stednick, John D.

    2017-01-01

    Maintenance of low flows during dry periods is critical for supporting ecosystem function in many rivers. Managed aquifer recharge is one method that can be used to augment low flows in rivers that are hydraulically connected to an alluvial groundwater system. In this study, we performed numerical modeling to evaluate a managed recharge operation designed to retime streamflow in the South Platte River, northeastern Colorado (USA). Modeling involved the simulation of spatially and temporally variable groundwater-surface water exchange, as well as streamflow routing in the river. Periodic solutions that incorporate seasonality were developed for two scenarios, a natural base case scenario and an active management scenario that included groundwater pumping and managed recharge. A framework was developed to compare the scenarios by analyzing changes in head-dependent inflows and outflows to/from the aquifer, which was used to interpret the simulated impacts on streamflow. The results clearly illustrate a retiming of streamflow. Groundwater pumping near the river during winter months causes a reduction in streamflow during those months. Delivery of the pumped water to recharge ponds, located further from the river, has the intended effect of augmenting streamflow during low-flow summer months. Higher streamflow is not limited to the target time period, however, which highlights an inefficiency of flow augmentation projects that rely on water retention in the subsurface.

  3. Two Topics in Seasonal Streamflow Forecasting: Soil Moisture Initialization Error and Precipitation Downscaling

    Science.gov (United States)

    Koster, Randal; Walker, Greg; Mahanama, Sarith; Reichle, Rolf

    2012-01-01

    Continental-scale offline simulations with a land surface model are used to address two important issues in the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which the downscaling of seasonal precipitation forecasts, if it could be done accurately, would improve streamflow forecasts. The reduction in streamflow forecast skill (with forecasted streamflow measured against observations) associated with adding noise to a soil moisture field is found to be, to first order, proportional to the average reduction in the accuracy of the soil moisture field itself. This result has implications for streamflow forecast improvement under satellite-based soil moisture measurement programs. In the second and more idealized ("perfect model") analysis, precipitation downscaling is found to have an impact on large-scale streamflow forecasts only if two conditions are met: (i) evaporation variance is significant relative to the precipitation variance, and (ii) the subgrid spatial variance of precipitation is adequately large. In the large-scale continental region studied (the conterminous United States), these two conditions are met in only a somewhat limited area.

  4. Estimating natural monthly streamflows in California and the likelihood of anthropogenic modification

    Science.gov (United States)

    Carlisle, Daren M.; Wolock, David M.; Howard, Jeannette K.; Grantham, Theodore E.; Fesenmyer, Kurt; Wieczorek, Michael

    2016-12-12

    Because natural patterns of streamflow are a fundamental property of the health of streams, there is a critical need to quantify the degree to which human activities have modified natural streamflows. A requirement for assessing streamflow modification in a given stream is a reliable estimate of flows expected in the absence of human influences. Although there are many techniques to predict streamflows in specific river basins, there is a lack of approaches for making predictions of natural conditions across large regions and over many decades. In this study conducted by the U.S. Geological Survey, in cooperation with The Nature Conservancy and Trout Unlimited, the primary objective was to develop empirical models that predict natural (that is, unaffected by land use or water management) monthly streamflows from 1950 to 2012 for all stream segments in California. Models were developed using measured streamflow data from the existing network of streams where daily flow monitoring occurs, but where the drainage basins have minimal human influences. Widely available data on monthly weather conditions and the physical attributes of river basins were used as predictor variables. Performance of regional-scale models was comparable to that of published mechanistic models for specific river basins, indicating the models can be reliably used to estimate natural monthly flows in most California streams. A second objective was to develop a model that predicts the likelihood that streams experience modified hydrology. New models were developed to predict modified streamflows at 558 streamflow monitoring sites in California where human activities affect the hydrology, using basin-scale geospatial indicators of land use and water management. Performance of these models was less reliable than that for the natural-flow models, but results indicate the models could be used to provide a simple screening tool for identifying, across the State of California, which streams may be

  5. ENSO modulations on streamflow characteristics

    OpenAIRE

    Yerdelen Cahit; Marti Ali Ihsan; Kahya Ercan

    2011-01-01

    El Niño Southern Oscillation (ENSO) has been linked to climate and hydrologic anomalies throughout the world. This paper presents how ENSO modulates the basic statistical characteristics of streamflow time series that is assumed to be affected by ENSO. For this we first considered hypothetical series that can be obtained from the original series at each station by assuming non-occurrence of El Niño events in the past. Instead those data belonging to El Niño years were simulated by the Radial Base...

  6. Toward a validation process for model based safety analysis

    OpenAIRE

    Adeline, Romain; Cardoso, Janette; Darfeuil, Pierre; Humbert, Sophie; Seguin, Christel

    2010-01-01

    Today, model-based processes are becoming more and more widespread for carrying out the analysis of a system. However, there is no formal testing approach to ensure that the formal model is compliant with the real system. In this paper, we choose to study the AltaRica model. We present a general process to properly construct and validate an AltaRica formal model. The focus is on this validation phase, i.e. verifying the compliance between the model and the real system. For this, the proposed process recommends...

  7. Analysis of Future Streamflow Regimes under Global Change Scenarios in Central Chile for Ecosystem Sustainability

    Science.gov (United States)

    Henriquez Dole, L. E.; Gironas, J. A.; Vicuna, S.

    2015-12-01

    Given the critical role of the streamflow regime for ecosystem sustainability, modeling the long-term effects of climate change and land-use change on streamflow is important to predict possible impacts on stream ecosystems. Because flow duration curves are widely used to characterize the streamflow regime and define indices of ecosystem health, they were used in this study to represent and analyze the stream regime in the Maipo River Basin in Central Chile. The Water Evaluation And Planning (WEAP) model and the Plant Growth Model (PGM) were used to simulate water distribution, consumption in rural areas and stream flows on a weekly basis. Historical data (1990-2014), future land use scenarios (2030/2050) and climate change scenarios were included in the process. Historical data show a declining trend in flows, mainly due to unprecedented climatic conditions, increasing interest among users in future streamflow scenarios. In the future, under an expected decline in water availability coupled with changes in crop water demand, water users will be forced to adapt by changing water allocation rules. Such adaptation actions would in turn affect the streamflow regime. Future scenarios for the streamflow regime show dramatic changes in water availability and temporal distribution. Annual weekly mean flows can decrease by 19% in the worst scenario and increase by 3.3% in the best of them, and variability in streamflow increases by nearly 90% in all scenarios under evaluation. The occurrence of maximum and minimum monthly flows changes, as June instead of July becomes the driest month, and December instead of January becomes the month with maximum flows. Overall, results show that under future scenarios streamflow is affected and altered by water allocation rules to satisfy water demands, and thus decisions will need to consider the streamflow regime (and habitat) in order to be sustainable.

  8. Water resources of the Great Smoky Mountains National Park: Streamflow Statistics

    Data.gov (United States)

    National Park Service, Department of the Interior — A feature class depicting geographic locations where streamflow statistics have been modeled within the Great Smoky Mountains National Park. Locations are expressed...

  9. A global evaluation of streamflow drought characteristics

    Directory of Open Access Journals (Sweden)

    A. K. Fleig

    2005-11-01

    Full Text Available How drought is characterised depends on the region under study, the purpose of the study and the available data. In case of regional applications or global comparison a standardisation of the methodology is preferable. In this study several methods to derive streamflow drought characteristics are evaluated based on their application to daily streamflow series from a wide range of hydrological regimes. Drought deficit characteristics, such as drought duration and deficit volume, are derived with the threshold level method. When it is applied to daily time series an additional pooling procedure is required, and three different pooling procedures are evaluated: the moving average procedure (MA-procedure), the inter-event time method (IT-method), and the sequent peak algorithm (SPA). The MA-procedure proved to be a flexible approach for the different series, and its parameter, the averaging interval, can easily be optimised for each stream. However, it modifies the discharge series and might introduce dependency between drought events. For the IT-method it is more difficult to find an optimal value for its parameter, the length of the excess period, in particular for flashy streams. The SPA can only be recommended for the selection of annual maximum series of deficit characteristics and for very low threshold levels due to the high degree of pooling. Furthermore, a frequency analysis of deficit volume and duration is conducted based on partial duration series of drought events. According to extreme value theory, excesses over a certain limit are Generalized Pareto (GP) distributed. It was found that this model indeed performed better than or equally to other distribution models. In general, the GP-model could be used for streams in all regime types. However, for intermittent streams, zero-flow periods should be treated as censored data. For catchments with frost during the winter season, summer and winter droughts have to be analysed separately.
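
    A minimal sketch of the threshold level method with moving-average pooling as described above; the synthetic series, window length and threshold choice (here the flow exceeded 70% of the time) are assumptions for illustration only.

        # Hedged sketch: drought duration and deficit volume below a threshold,
        # with MA pooling of the daily series. All inputs are synthetic.
        import numpy as np
        import pandas as pd

        def drought_events(q, threshold, ma_window=10):
            """Return one row per pooled drought event (duration, deficit volume)."""
            q_smooth = q.rolling(ma_window, center=True, min_periods=1).mean()
            below = q_smooth < threshold
            event_id = (below != below.shift(fill_value=False)).cumsum()
            rows = []
            for _, grp in q_smooth[below].groupby(event_id[below]):
                rows.append({"start": grp.index[0], "end": grp.index[-1],
                             "duration_days": len(grp),
                             "deficit": float((threshold - grp).sum())})
            return pd.DataFrame(rows)

        dates = pd.date_range("2000-01-01", periods=365, freq="D")
        q = pd.Series(5 + 3 * np.sin(np.linspace(0, 2 * np.pi, 365))
                      + np.random.default_rng(0).normal(0, 0.5, 365), index=dates)
        q70 = q.quantile(0.30)          # flow exceeded 70% of the time
        print(drought_events(q, q70))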

  10. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    [Only report front matter is available for this record: a list-of-figures fragment naming "Tools and instrumentation, direction vernier", "Tools and instrumentation, bracket attached to rail", and "Plan A lock approach, upstream approach" from the numerical model report.]

  11. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
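
    For reference, the two evaluation scores named above reduce to simple counts over the presence/absence predictions; the sketch below computes them for a toy set of observations (the values are illustrative, not the study's bird data).

        # Hedged sketch: sensitivity and specificity from presence/absence predictions.
        import numpy as np

        def sensitivity_specificity(observed, predicted):
            observed, predicted = np.asarray(observed, bool), np.asarray(predicted, bool)
            tp = np.sum(observed & predicted)      # presences correctly predicted
            tn = np.sum(~observed & ~predicted)    # absences correctly predicted
            fn = np.sum(observed & ~predicted)
            fp = np.sum(~observed & predicted)
            return tp / (tp + fn), tn / (tn + fp)

        obs  = [True, True, False, False, True, False]
        pred = [True, False, False, True, True, False]
        print(sensitivity_specificity(obs, pred))   # about (0.67, 0.67)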

  12. Streamflow, Infiltration, and Recharge in Arroyo Hondo, New Mexico

    Science.gov (United States)

    Moore, Stephanie J.

    2007-01-01

    Infiltration events in channels that flow only sporadically produce focused recharge to the Tesuque aquifer in the Española Basin. The current study examined the quantity and timing of streamflow and associated infiltration in Arroyo Hondo, an unregulated mountain-front stream that enters the basin from the western slope of the Sangre de Cristo Mountains. Traditional methods of stream gaging were combined with environmental-tracer based methods to provide the estimates. The study was conducted during a three-year period, October 1999–October 2002. The period was characterized by generally low precipitation and runoff. Summer monsoonal rains produced four brief periods of streamflow in water year 2000, only three of which extended beyond the mountain front, and negligible runoff in subsequent years. The largest peak flow during summer monsoon events was 0.59 cubic meters per second. Snowmelt was the main contributor to annual streamflow. Snowmelt produced more cumulative flow downstream from the mountain front during the study period than summer monsoonal rains. The presence or absence of streamflow downstream of the mountain front was determined by interpretation of streambed thermographs. Infiltration rates were estimated by numerical modeling of transient vertical streambed temperature profiles. Snowmelt extended throughout the instrumented reach during the spring of 2001. Flow was recorded at a station two kilometers downstream from the mountain front for six consecutive days in March. Inverse modeling of this event indicated an average infiltration rate of 1.4 meters per day at this location. For the entire study reach, the estimated total annual volume of infiltration ranged from 17,100 to 246,000 m3 during water years 2000 and 2001. During water year 2002, due to severe drought, streamflow and streambed infiltration in the study reach were both zero.

  13. Using Ensemble Streamflows for Power Marketing at Bonneville Power Administration

    Science.gov (United States)

    Barton, S. B.; Koski, P.

    2014-12-01

    Bonneville Power Administration (BPA) is a federal non-profit agency within the Pacific Northwest responsible for marketing the power generated from 31 federal hydro projects throughout the Columbia River Basin. The basin encompasses parts of five states and portions of British Columbia, Canada. BPA works with provincial entities, federal and state agencies, and tribal members to manage the water resources for a variety of purposes including flood risk management, power generation, fisheries, irrigation, recreation, and navigation. This basin is subject to significant hydrologic variability in terms of seasonal volume and runoff shape from year to year which presents new water management challenges each year. The power generation planning group at BPA includes a team of meteorologists and hydrologists responsible for preparing both short-term (up to three weeks) and mid-term (up to 18 months) weather and streamflow forecasts including ensemble streamflow data. Analysts within the mid-term planning group are responsible for running several different hydrologic models used for planning studies. These models rely on these streamflow ensembles as a primary input. The planning studies are run bi-weekly to help determine the amount of energy available, or energy inventory, for forward marketing (selling or purchasing energy up to a year in advance). These studies are run with the objective of meeting the numerous multi-purpose objectives of the basin under the various streamflow conditions within the ensemble set. In addition to ensemble streamflows, an ensemble of seasonal volume forecasts is also provided for the various water conditions in order to set numerous constraints on the system. After meeting all the various requirements of the system, a probabilistic energy inventory is calculated and used for marketing purposes.

  14. Validation of Model Forecasts of the Ambient Solar Wind

    Science.gov (United States)

    Macneice, P. J.; Hesse, M.; Kuznetsova, M. M.; Rastaetter, L.; Taktakishvili, A.

    2009-01-01

    Independent and automated validation is a vital step in the progression of models from the research community into operational forecasting use. In this paper we describe a program in development at the CCMC to provide just such a comprehensive validation for models of the ambient solar wind in the inner heliosphere. We have built upon previous efforts published in the community, sharpened their definitions, and completed a baseline study. We also provide first results from this program of the comparative performance of the MHD models available at the CCMC against that of the Wang-Sheeley-Arge (WSA) model. An important goal of this effort is to provide a consistent validation to all available models. Clearly exposing the relative strengths and weaknesses of the different models will enable forecasters to craft more reliable ensemble forecasting strategies. Models of the ambient solar wind are developing rapidly as a result of improvements in data supply, numerical techniques, and computing resources. It is anticipated that in the next five to ten years, the MHD based models will supplant semi-empirical potential based models such as the WSA model, as the best available forecast models. We anticipate that this validation effort will track this evolution and so assist policy makers in gauging the value of past and future investment in modeling support.

  15. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  16. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for analysis of hydrological droughts. Using an artificial neural network (ANN) model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF) per year) in the Sacramento River basin could be considered a critical level of drought for water shortages.

  17. Measuring real-time streamflow using emerging technologies: Radar, hydroacoustics, and the probability concept

    Science.gov (United States)

    Fulton, J.; Ostrowski, J.

    2008-01-01

    Forecasting streamflow during extreme hydrologic events such as floods can be problematic. This is particularly true when flow is unsteady, and river forecasts rely on models that require uniform-flow rating curves to route water from one forecast point to another. As a result, alternative methods for measuring streamflow are needed to properly route flood waves and account for inertial and pressure forces in natural channels dominated by nonuniform-flow conditions such as mild water surface slopes, backwater, tributary inflows, and reservoir operations. The objective of the demonstration was to use emerging technologies to measure instantaneous streamflow in open channels at two existing US Geological Survey streamflow-gaging stations in Pennsylvania. Surface-water and instream-point velocities were measured using hand-held radar and hydroacoustics. Streamflow was computed using the probability concept, which requires velocity data from a single vertical containing the maximum instream velocity. The percent difference in streamflow at the Susquehanna River at Bloomsburg, PA ranged from 0% to 8% with an average difference of 4% and standard deviation of 8.81 m3/s. The percent difference in streamflow at Chartiers Creek at Carnegie, PA ranged from 0% to 11% with an average difference of 5% and standard deviation of 0.28 m3/s. New generation equipment is being tested and developed to advance the use of radar-derived surface-water velocity and instantaneous streamflow to facilitate the collection and transmission of real-time streamflow that can be used to parameterize hydraulic routing models.

  18. Estimating current and future streamflow characteristics at ungaged sites, central and eastern Montana, with application to evaluating effects of climate change on fish populations

    Science.gov (United States)

    Sando, Roy; Chase, Katherine J.

    2017-03-23

    A common statistical procedure for estimating streamflow statistics at ungaged locations is to develop a relational model between streamflow and drainage basin characteristics at gaged locations using least squares regression analysis; however, least squares regression methods are parametric and make constraining assumptions about the data distribution. The random forest regression method provides an alternative nonparametric method for estimating streamflow characteristics at ungaged sites and requires that the data meet fewer statistical conditions than least squares regression methods. Random forest regression analysis was used to develop predictive models for 89 streamflow characteristics using Precipitation-Runoff Modeling System simulated streamflow data and drainage basin characteristics at 179 sites in central and eastern Montana. The predictive models were developed from streamflow data simulated for current (baseline, water years 1982–99) conditions and three future periods (water years 2021–38, 2046–63, and 2071–88) under three different climate-change scenarios. These predictive models were then used to predict streamflow characteristics for baseline conditions and three future periods at 1,707 fish sampling sites in central and eastern Montana. The average root mean square error for all predictive models was about 50 percent. When streamflow predictions at 23 fish sampling sites were compared to nearby locations with simulated data, the mean relative percent difference was about 43 percent. When predictions were compared to streamflow data recorded at 21 U.S. Geological Survey streamflow-gaging stations outside of the calibration basins, the average mean absolute percent error was about 73 percent.
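
    A minimal sketch of the random forest regression step described above, predicting one streamflow characteristic from basin attributes; the predictor names, the synthetic data and the cross-validation setup are assumptions, not the USGS implementation.

        # Hedged sketch: random forest regression of a streamflow characteristic
        # on basin characteristics, with cross-validated error. Data are synthetic.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(42)
        n_sites = 179
        X = np.column_stack([
            rng.uniform(10, 5000, n_sites),    # drainage area, km2 (hypothetical)
            rng.uniform(200, 1200, n_sites),   # mean annual precipitation, mm
            rng.uniform(0, 30, n_sites),       # mean basin slope, percent
        ])
        y = 0.002 * X[:, 0] * X[:, 1] / 500 + rng.normal(0, 5, n_sites)  # synthetic target

        rf = RandomForestRegressor(n_estimators=500, random_state=0)
        rmse = -cross_val_score(rf, X, y, cv=5, scoring="neg_root_mean_squared_error").mean()
        rf.fit(X, y)
        print("cross-validated RMSE:", rmse, "importances:", rf.feature_importances_)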

  19. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois;

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  20. Validation of limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring - is a separate validation group required?

    NARCIS (Netherlands)

    Proost, J. H.

    2007-01-01

    Objective: Limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring are usually validated in a separate group of patients, according to published guidelines. The aim of this study is to evaluate the validation of LSM by comparing independent validation with cross-validation us

  1. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation...

  2. Validation of Air Traffic Controller Workload Models

    Science.gov (United States)

    1979-09-01

    ...SAR) tapes during the data reduction phase of the project. Kentron International Limited provided the software support for the project. This included... ETABS) or to revised traffic control procedures. The models also can be used to verify productivity benefits after new configurations have been... collected and processed manually. A preliminary comparison has been made between standard NAS Stage A and ETABS operations at Miami.

  3. Multisite Spatiotemporal Streamflow Simulation - With an Application to Irrigation Water Shortage Risk Assessment

    Directory of Open Access Journals (Sweden)

    Hsin-I Hsieh

    2014-01-01

    Full Text Available Regional water resources management generally requires knowledge of multisite streamflows which exhibit random, yet spatially and temporally correlated, variabilities. The complexity of such correlated randomness makes decision-making for water resources management a difficult task. With presence of uncertainties in space and time, risk-based decision making using stochastic models is sought after. In this study we propose a spatiotemporal stochastic simulation model for multisite streamflow simulation. The model is composed of three components: (1) stochastic simulation of bivariate non-Gaussian distributions, (2) an anisotropic space-time covariance function which characterizes the spatial and temporal variations of multisite ten-day-period (TDP) streamflows, and (3) Monte Carlo spatiotemporal simulation of streamflows. The model was applied to the Chia-Nan Irrigation District in southern Taiwan for a multisite spatiotemporal ten-day-period streamflow simulation. Through a rigorous evaluation, the proposed spatiotemporal model is found capable of preserving not only the marginal distributions but also the spatiotemporal correlation structure of the multisite streamflows. An example application which demonstrates utilization of the proposed model for irrigation water shortage risk assessment is also presented.
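
    The simulation idea in component (3) can be illustrated with a much-reduced sketch: sampling correlated site-by-period values from a separable exponential space-time covariance. The Gaussian margins, site coordinates and covariance parameters below are placeholders, and the original model's bivariate non-Gaussian margins are not reproduced.

        # Hedged sketch: Monte Carlo simulation of multisite, multi-period values
        # from a separable space-time covariance (Gaussian margins for simplicity).
        import numpy as np

        sites = np.array([[0.0, 0.0], [5.0, 2.0], [10.0, 8.0]])   # km, hypothetical
        n_periods = 36                                             # ten-day periods
        range_space, range_time, sill = 20.0, 3.0, 1.0

        d_s = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
        d_t = np.abs(np.arange(n_periods)[:, None] - np.arange(n_periods)[None, :])

        # separable covariance: C = sill * exp(-d_t/range_time) (x) exp(-d_s/range_space)
        cov = sill * np.kron(np.exp(-d_t / range_time), np.exp(-d_s / range_space))

        rng = np.random.default_rng(1)
        z = rng.multivariate_normal(np.zeros(cov.shape[0]), cov, size=1000)
        sims = z.reshape(1000, n_periods, len(sites))   # realizations x periods x sites
        print(sims.shape)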

  4. Experiments for foam model development and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bourdon, Christopher Jay; Cote, Raymond O.; Moffat, Harry K.; Grillet, Anne Mary; Mahoney, James F. (Honeywell Federal Manufacturing and Technologies, Kansas City Plant, Kansas City, MO); Russick, Edward Mark; Adolf, Douglas Brian; Rao, Rekha Ranjana; Thompson, Kyle Richard; Kraynik, Andrew Michael; Castaneda, Jaime N.; Brotherton, Christopher M.; Mondy, Lisa Ann; Gorby, Allen D.

    2008-09-01

    A series of experiments has been performed to allow observation of the foaming process and the collection of temperature, rise rate, and microstructural data. Microfocus video is used in conjunction with particle image velocimetry (PIV) to elucidate the boundary condition at the wall. Rheology, reaction kinetics and density measurements complement the flow visualization. X-ray computed tomography (CT) is used to examine the cured foams to determine density gradients. These data provide input to a continuum level finite element model of the blowing process.

  5. Functional state modelling approach validation for yeast and bacteria cultivations

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-01-01

    In this paper, the functional state modelling approach is validated for modelling of the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely primary product synthesis state, mixed oxidative state and secondary product synthesis state. Parameter identification procedures for different local models are performed using genetic algorithms. The simulation results show high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This fact is a strong structure model verification of the functional state modelling theory not only for a set of yeast cultivations, but also for bacteria cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach. PMID:26740778

  6. Functional state modelling approach validation for yeast and bacteria cultivations.

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-09-03

    In this paper, the functional state modelling approach is validated for modelling of the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely primary product synthesis state, mixed oxidative state and secondary product synthesis state. Parameter identification procedures for different local models are performed using genetic algorithms. The simulation results show high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This fact is a strong structure model verification of the functional state modelling theory not only for a set of yeast cultivations, but also for bacteria cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach.

  7. Monthly streamflow forecasting with auto-regressive integrated moving average

    Science.gov (United States)

    Nasir, Najah; Samsudin, Ruhaidah; Shabri, Ani

    2017-09-01

    Forecasting of streamflow is one of the many ways that can contribute to better decision making for water resource management. The auto-regressive integrated moving average (ARIMA) model was selected in this research for monthly streamflow forecasting with enhancement made by pre-processing the data using singular spectrum analysis (SSA). This study also proposed an extension of the SSA technique to include a step where clustering was performed on the eigenvector pairs before reconstruction of the time series. The monthly streamflow data of Sungai Muda at Jeniang, Sungai Muda at Jambatan Syed Omar and Sungai Ketil at Kuala Pegang was gathered from the Department of Irrigation and Drainage Malaysia. A ratio of 9:1 was used to divide the data into training and testing sets. The ARIMA, SSA-ARIMA and Clustered SSA-ARIMA models were all developed in R software. Results from the proposed model are then compared to a conventional auto-regressive integrated moving average model using the root-mean-square error and mean absolute error values. It was found that the proposed model can outperform the conventional model.
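
    A minimal sketch of the baseline ARIMA step only (the SSA pre-processing and clustering are not reproduced); the model order, the synthetic monthly series and the 9:1 split below are assumptions for illustration.

        # Hedged sketch: fit an ARIMA model to a synthetic monthly flow series,
        # hold out the last 10% and report RMSE and MAE on the test set.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(0)
        t = np.arange(240)                                         # 20 years, monthly
        flow = 50 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, t.size)
        series = pd.Series(flow, index=pd.date_range("1995-01", periods=t.size, freq="MS"))

        split = int(len(series) * 0.9)                             # 9:1 train/test split
        train, test = series[:split], series[split:]

        fit = ARIMA(train, order=(2, 0, 1)).fit()
        forecast = fit.forecast(steps=len(test))

        rmse = np.sqrt(np.mean((forecast.values - test.values) ** 2))
        mae = np.mean(np.abs(forecast.values - test.values))
        print(f"RMSE = {rmse:.2f}, MAE = {mae:.2f}")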

  8. Downscaling of GCM forecasts to streamflow over Scandinavia

    DEFF Research Database (Denmark)

    Nilsson, P.; Uvo, C.B.; Landman, W.A.

    2008-01-01

    A seasonal forecasting technique to produce probabilistic and deterministic streamflow forecasts for 23 basins in Norway and northern Sweden is developed in this work. Large scale circulation and moisture fields, forecasted by the ECHAM4.5 model 4 months in advance, are used to forecast spring flows. The technique includes model output statistics (MOS) based on a non-linear Neural Network (NN) approach. Results show that streamflow forecasts from Global Circulation Model (GCM) predictions for the Scandinavia region are viable, and highest skill values were found for basins located in south-western Norway. The physical interpretation of the forecasting skill is that stations close to the Norwegian coast are directly exposed to prevailing winds from the Atlantic ocean, which constitute the principal source of predictive information from the atmosphere on the seasonal timescale.

  9. Validation of Orthorectified Interferometric Radar Imagery and Digital Elevation Models

    Science.gov (United States)

    Smith Charles M.

    2004-01-01

    This work was performed under NASA's Verification and Validation (V&V) Program as an independent check of data supplied by EarthWatch, Incorporated, through the Earth Science Enterprise Scientific Data Purchase (SDP) Program. This document serves as the basis of reporting results associated with validation of orthorectified interferometric radar imagery and digital elevation models (DEM). This validation covers all datasets provided under the first campaign (Central America & Virginia Beach) plus three earlier missions (Indonesia, Red River, and Denver) for a total of 13 missions.

  10. Validation of a Model of the Domino Effect?

    CERN Document Server

    Larham, Ron

    2008-01-01

    A recent paper proposing a model of the limiting speed of the domino effect is discussed with reference to its need, and the need of models in general, for validation against experimental data. It is shown that the proposed model diverges significantly from experimentally derived speed estimates over a significant range of domino spacings, using data from the existing literature and this author's own measurements. Hence, if its use had had economic importance, applying the model outside its range of validity could have led to losses of one sort or another to its users.

  11. Resampling procedures to validate dendro-auxometric regression models

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available Regression analysis is widely used in several sectors of forest research. The validation of a dendro-auxometric model is a basic step in the building of the model itself. The more a model resists attempts to demonstrate its groundlessness, the more its reliability increases. In recent decades many new theories that exploit the computational speed of modern computers have been formulated. Here we show the results obtained by the application of a bootstrap resampling procedure as a validation tool.

  12. Validation of an Efficient Outdoor Sound Propagation Model Using BEM

    DEFF Research Database (Denmark)

    Quirós-Alpera, S.; Henriquez, Vicente Cutanda; Jacobsen, Finn

    2001-01-01

    An approximate, simple and practical model for prediction of outdoor sound propagation exists based on ray theory, diffraction theory and Fresnel-zone considerations [1]. This model, which can predict sound propagation over non-flat terrain, has been validated for combinations of flat ground, hills and barriers, but it still needs to be validated for configurations that involve combinations of valleys and barriers. In order to do this, a boundary element model has been implemented in MATLAB to serve as a reliable reference.

  13. Validation of a Model for Ice Formation around Finned Tubes

    Directory of Open Access Journals (Sweden)

    Kamal A. R. Ismai

    2016-09-01

    Full Text Available Although phase change materials are an attractive option for thermal storage applications, their main drawback is the slow thermal response during charging and discharging processes due to their low thermal conductivity. The present study validates a model developed by the authors some years ago that uses radial fins to improve the thermal performance of PCM in a horizontal storage system. The developed model for the radial finned tube is based on pure conduction and the enthalpy approach, and was discretized by the finite difference method. Experiments were realized specifically to validate the model and its numerical predictions.

  14. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  15. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  16. Reverse electrodialysis : A validated process model for design and optimization

    NARCIS (Netherlands)

    Veerman, J.; Saakes, M.; Metz, S. J.; Harmsen, G. J.

    2011-01-01

    Reverse electrodialysis (RED) is a technology to generate electricity using the entropy of the mixing of sea and river water. A model is made of the RED process and validated experimentally. The model is used to design and optimize the RED process. It predicts very small differences between counter-

  17. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a network

  18. Measurements for validation of high voltage underground cable modelling

    DEFF Research Database (Denmark)

    Bak, Claus Leth; Gudmundsdottir, Unnur Stella; Wiechowski, Wojciech Tomasz

    2009-01-01

    This paper discusses studies concerning cable modelling for long high voltage AC cable lines. In investigating the possibilities of using long cables instead of overhead lines, the simulation results must be trustworthy. Therefore a model validation is of great importance. This paper describes...

  19. Model validation for karst flow using sandbox experiments

    Science.gov (United States)

    Ye, M.; Pacheco Castro, R. B.; Tao, X.; Zhao, J.

    2015-12-01

    The study of flow in karst is complex due to the high heterogeneity of the porous media. Several approaches have been proposed in the literature to overcome the natural complexity of karst. Some of those methods are the single continuum, the double continuum and the discrete network of conduits coupled with the single continuum. Several mathematical and computing models are available in the literature for each approach. In this study one computer model has been selected for each category to validate its usefulness for modeling flow in karst using a sandbox experiment. The models chosen are: Modflow 2005, Modflow CFPV1 and Modflow CFPV2. A sandbox experiment was implemented in such a way that all the parameters required for each model can be measured. The sandbox experiment was repeated several times under different conditions. The model validation will be carried out by comparing the results of the model simulations with the real data. This model validation will allow us to compare the accuracy of each model and its applicability in karst. We will also be able to evaluate whether the results of the complex models improve substantially on those of the simple models, especially because some models require complex parameters that are difficult to measure in the real world.

  20. Composing, Analyzing and Validating Software Models

    Science.gov (United States)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  1. Validation of a terrestrial food chain model.

    Science.gov (United States)

    Travis, C C; Blaylock, B P

    1992-01-01

    An increasingly important topic in risk assessment is the estimation of human exposure to environmental pollutants through pathways other than inhalation. The Environmental Protection Agency (EPA) has recently developed a computerized methodology (EPA, 1990) to estimate indirect exposure to toxic pollutants from Municipal Waste Combustor emissions. This methodology estimates health risks from exposure to toxic pollutants from the terrestrial food chain (TFC), soil ingestion, drinking water ingestion, fish ingestion, and dermal absorption via soil and water. Of these, one of the most difficult to estimate is exposure through the food chain. This paper estimates the accuracy of the EPA methodology for estimating food chain contamination. To our knowledge, no data exist on measured concentrations of pollutants in food grown around Municipal Waste Incinerators, and few field-scale studies have been performed on the uptake of pollutants in the food chain. Therefore, to evaluate the EPA methodology, we compare actual measurements of background contaminant levels in food with estimates made using EPA's computerized methodology. Background levels of contaminants in air, water, and soil were used as input to the EPA food chain model to predict background levels of contaminants in food. These predicted values were then compared with the measured background contaminant levels. Comparisons were performed for dioxin, pentachlorophenol, polychlorinated biphenyls, benzene, benzo(a)pyrene, mercury, and lead.

  2. Simulation of Streamflow and Selected Water-Quality Constituents through a Model of the Onondaga Lake Basin, Onondaga County, New York - A Guide to Model Application

    Science.gov (United States)

    Coon, William F.

    2008-01-01

    A computer model of hydrologic and water-quality processes of the Onondaga Lake basin in Onondaga County, N.Y., was developed during 2003-07 to assist water-resources managers in making basin-wide management decisions that could affect peak flows and the water quality of tributaries to Onondaga Lake. The model was developed with the Hydrological Simulation Program-Fortran (HSPF) and was designed to allow simulation of proposed or hypothetical land-use changes, best-management practices (BMPs), and instream stormwater-detention basins such that their effects on flows and loads of suspended sediment, orthophosphate, total phosphorus, ammonia, organic nitrogen, and nitrate could be analyzed. Extreme weather conditions, such as intense storms and prolonged droughts, can be simulated through manipulation of the precipitation record. Model results obtained from different scenarios can then be compared and analyzed through an interactive computer program known as Generation and Analysis of Model Simulation Scenarios for Watersheds (GenScn). Background information on HSPF and GenScn is presented to familiarize the user with these two programs. Step-by-step examples are provided on (1) the creation of land-use, BMP, and stormflow-detention scenarios for simulation by the HSPF model, and (2) the analysis of simulation results through GenScn.

  3. On the development and validation of QSAR models.

    Science.gov (United States)

    Gramatica, Paola

    2013-01-01

    The fundamental and more critical steps that are necessary for the development and validation of QSAR models are presented in this chapter as best practices in the field. These procedures are discussed in the context of predictive QSAR modelling that is focused on achieving models of the highest statistical quality and with external predictive power. The most important and most used statistical parameters needed to verify the real performances of QSAR models (of both linear regression and classification) are presented. Special emphasis is placed on the validation of models, both internally and externally, as well as on the need to define model applicability domains, which should be done when models are employed for the prediction of new external compounds.
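
    Internal validation of the kind discussed above is straightforward to reproduce for a toy linear model; the sketch below computes the leave-one-out cross-validated Q^2 on synthetic descriptors (external validation and applicability-domain checks are not shown).

        # Hedged sketch: leave-one-out cross-validated Q^2 for a linear QSAR-style model.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(3)
        X = rng.normal(size=(40, 4))                      # 40 compounds, 4 descriptors
        y = X @ np.array([1.5, -0.7, 0.3, 0.0]) + rng.normal(0, 0.3, 40)

        y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
        q2 = 1 - np.sum((y - y_loo) ** 2) / np.sum((y - y.mean()) ** 2)
        print(f"Q^2 (LOO) = {q2:.3f}")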

  4. Potential effects of climate change on streamflow for seven watersheds in eastern and central Montana

    Science.gov (United States)

    Chase, Katherine J.; Haj, Adel; Regan, R. Steven; Viger, Roland J.

    2016-01-01

    Study region: Eastern and central Montana. Study focus: Fish in Northern Great Plains streams tolerate extreme conditions including heat, cold, floods, and drought; however, changes in streamflow associated with long-term climate change may render some prairie streams uninhabitable for current fish species. To better understand the future hydrology of these prairie streams, the Precipitation-Runoff Modeling System model and output from the RegCM3 Regional Climate model were used to simulate streamflow for seven watersheds in eastern and central Montana, for a baseline period (water years 1982–1999) and three future periods: water years 2021–2038 (2030 period), 2046–2063 (2055 period), and 2071–2088 (2080 period). New hydrological insights for the region: Projected changes in mean annual and mean monthly streamflow vary by the RegCM3 model selected, by watershed, and by future period. Mean annual streamflows for all future periods are projected to increase (11–21%) for two of the four central Montana watersheds: Middle Musselshell River and Cottonwood Creek. Mean annual streamflows for all future periods are projected to decrease (changes of −24 to −75%) for the Redwater River watershed in eastern Montana. Mean annual streamflows are projected to increase slightly (2–15%) for the 2030 period and decrease (changes of −16 to −44%) for the 2080 period for the four remaining watersheds.

  5. Assessing the impact of climate variability and human activities on streamflow variation

    OpenAIRE

    Chang, Jianxia; Zhang, Hongxue; Wang, Yimin; Zhu, Yuelu

    2016-01-01

    Water resources in river systems have been changing under the impact of both climate variability and human activities. Assessing the respective impact on decadal streamflow variation is important for water resource management. By using an elasticity-based method and calibrated TOPMODEL and VIC hydrological models, we quantitatively isolated the relative contributions that human activities and climate variability made to decadal streamflow changes in the Jinghe basin, located...
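
    The elasticity-based idea can be sketched with a nonparametric precipitation elasticity estimate and a simple attribution split; the numbers below are invented for illustration and are not values from the Jinghe study, and the single-driver form ignores potential evapotranspiration.

        # Hedged sketch: elasticity-based separation of climate and human contributions.
        import numpy as np

        # baseline-decade annual precipitation P and streamflow Q (mm), hypothetical
        P = np.array([520, 480, 610, 450, 500, 530, 470, 560, 490, 515], float)
        Q = np.array([110,  90, 160,  70, 100, 120,  85, 140,  95, 105], float)

        # nonparametric elasticity: median of (dQ/Q_mean) / (dP/P_mean)
        eps = np.median(((Q - Q.mean()) / Q.mean()) / ((P - P.mean()) / P.mean()))

        dP = -40.0        # change in mean precipitation between decades (mm), hypothetical
        dQ_total = -30.0  # observed change in mean streamflow (mm), hypothetical

        dQ_climate = eps * (dP / P.mean()) * Q.mean()   # climate-driven part
        dQ_human = dQ_total - dQ_climate                # residual attributed to human activity
        print(f"elasticity = {eps:.2f}, climate share = {100 * dQ_climate / dQ_total:.0f}%")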

  6. Assessing the impact of climate variability and human activity to streamflow variation

    OpenAIRE

    Chang, J.; Zhang, H.; Y. Wang; Zhu, Y.

    2015-01-01

    Water resources in river systems have been changing under the impacts of both climate variability and human activities. Assessing the respective impacts on decadal streamflow variation is important for water resources management. By using an elasticity-based method, calibrated TOPMODEL and VIC hydrologic models, we have quantitatively isolated the relative contributions that human activity and climate variability made to decadal streamflow changes in Jinhe b...

  7. Assessing the impact of climate variability and human activities on streamflow variation

    OpenAIRE

    Chang, J.; Zhang, H.; Y. Wang; Zhu, Y.

    2015-01-01

    Water resources in river systems have been changing under the impact of both climate variability and human activities. Assessing the respective impact on decadal streamflow variation is important for water resource management. By using an elasticity-based method and calibrated TOPMODEL and VIC hydrological models, we quantitatively isolated the relative contributions that human activities and climate variability made to decadal streamflow changes in Jinghe basin, located in ...

  8. Multiscale temporal variability and regional patterns in 555 years of conterminous U.S. streamflow

    Science.gov (United States)

    Ho, Michelle; Lall, Upmanu; Sun, Xun; Cook, Edward R.

    2017-04-01

    The development of paleoclimate streamflow reconstructions in the conterminous United States (CONUS) has provided water resource managers with improved insights into multidecadal and centennial scale variability that cannot be reliably detected using shorter instrumental records. Paleoclimate streamflow reconstructions have largely focused on individual catchments limiting the ability to quantify variability across the CONUS. The Living Blended Drought Atlas (LBDA), a spatially and temporally complete 555 year long paleoclimate record of summer drought across the CONUS, provides an opportunity to reconstruct and characterize streamflow variability at a continental scale. We explore the validity of the first paleoreconstructions of streamflow that span the CONUS informed by the LBDA targeting a set of U.S. Geological Survey streamflow sites. The reconstructions are skillful under cross validation across most of the country, but the variance explained is generally low. Spatial and temporal structures of streamflow variability are analyzed using hierarchical clustering, principal component analysis, and wavelet analyses. Nine spatially coherent clusters are identified. The reconstructions show signals of contemporary droughts such as the Dust Bowl (1930s) and 1950s droughts. Decadal-scale variability was detected in the late 1900s in the western U.S., however, similar modes of temporal variability were rarely present prior to the 1950s. The twentieth century featured longer wet spells and shorter dry spells compared with the preceding 450 years. Streamflows in the Pacific Northwest and Northeast are negatively correlated with the central U.S. suggesting the potential to mitigate some drought impacts by balancing economic activities and insurance pools across these regions during major droughts.

  9. A prediction model for ocular damage - Experimental validation.

    Science.gov (United States)

    Heussner, Nico; Vagos, Márcia; Spitzer, Martin S; Stork, Wilhelm

    2015-08-01

    With the increasing number of laser applications in medicine and technology, accidental as well as intentional exposure of the human eye to laser sources has become a major concern. Therefore, a prediction model for ocular damage (PMOD) is presented within this work and validated for long-term exposure. This model is a combination of a raytracing model with a thermodynamical model of the human eye and an application which determines the thermal damage by implementing the Arrhenius integral. The model is based on our earlier work and is here validated against temperature measurements taken with porcine eye samples. For this validation, three different powers were used: 50 mW, 100 mW and 200 mW, with a spot size of 1.9 mm. Also, the measurements were taken with two different sensing systems, an infrared camera and a fibre optic probe placed within the tissue. The temperatures were measured for up to 60 s and then compared against simulations. The measured temperatures were found to be in good agreement with the values predicted by the PMOD model. To the best of our knowledge, this is the first model which is validated for both short-term and long-term irradiations in terms of temperature and thus demonstrates that temperatures can be accurately predicted within the thermal damage regime. Copyright © 2015 Elsevier Ltd. All rights reserved.
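
    As a rough, hedged illustration of the damage criterion named above (not the PMOD implementation), the sketch below evaluates an Arrhenius damage integral, Omega = ∫ A·exp(-Ea/(R·T(t))) dt, over an assumed temperature history; the rate parameters and the temperature curve are generic placeholders rather than values from the paper.

      import numpy as np

      # Hedged sketch of an Arrhenius thermal-damage integral; the rate constants
      # below are generic placeholders, not the PMOD parameters from the paper.
      R = 8.314           # universal gas constant, J/(mol K)
      A = 3.1e98          # frequency factor, 1/s (placeholder)
      Ea = 6.28e5         # activation energy, J/mol (placeholder)

      t = np.linspace(0.0, 60.0, 601)                # exposure time, s
      T = 310.0 + 15.0 * (1.0 - np.exp(-t / 10.0))   # assumed tissue temperature, K

      omega = np.trapz(A * np.exp(-Ea / (R * T)), t)  # damage integral
      print(f"Omega = {omega:.3f}  (Omega >= 1 is commonly taken as threshold damage)")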

  10. The hypothetical world of CoMFA and model validation

    Energy Technology Data Exchange (ETDEWEB)

    Oprea, T.I. [Los Alamos National Lab., NM (United States)

    1996-12-31

    CoMFA is a technique used to establish the three-dimensional similarity of molecular structures, in relationship to a target property. Because the risk of chance correlation is high, validation is required for all CoMFA models. The following validation steps should be performed: the choice of alignment rules (superimposition and conformer criteria) has to use experimental data when available, or different (alternate) hypotheses; statistical methods (e.g., cross-validation with randomized groups), have to emphasize simplicity, robustness, predictivity and explanatory power. When several CoMFA-QSAR models on similar targets and/or structures are available, qualitative lateral validation can be applied. This meta-analysis for CoMFA models offers a broader perspective on the similarities and differences between compared biological targets, with potential applications in rational drug design [e.g., selectivity, efficacy] and environmental toxicology. Examples that focus on validation of CoMFA models include the following steroid-binding proteins: aromatase, the estrogen and the androgen receptors, a monoclonal antibody against progesterone and two steroid binding globulins.

  11. HYDRORECESSION: A Matlab toolbox for streamflow recession analysis

    Science.gov (United States)

    Arciniega-Esparza, Saúl; Breña-Naranjo, José Agustín; Pedrozo-Acuña, Adrián; Appendini, Christian Mario

    2017-01-01

    Streamflow recession analysis from observed hydrographs allows the extraction of information about the storage-discharge relationship of a catchment and some of its groundwater hydraulic properties. The HYDRORECESSION toolbox, presented in this paper, is a graphical user interface for Matlab developed to analyse streamflow recession curves with the support of different tools. The software extracts hydrograph recession segments with three different methods (Vogel, Brutsaert and Aksoy) that are later analysed with four of the most common models to simulate recession curves (Maillet, Boussinesq, Coutagne and Wittenberg), and it includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error). HYDRORECESSION offers tools to parameterize linear and nonlinear storage-outflow relationships and is useful for regionalization purposes, catchment classification, baseflow separation, hydrological modeling and low-flow prediction. HYDRORECESSION is freely available for non-commercial and academic purposes and is available at Matlab File Exchange (http://www.mathworks.com/matlabcentral/fileexchange/51332-hydroecession).
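
    As a minimal, toolbox-independent illustration of one of the recession models listed above, the Python sketch below fits the Maillet (exponential, linear-reservoir) model Q(t) = Q0·exp(-t/k) to a single recession segment by linear regression on log-transformed discharge; the synthetic segment and parameter values are assumptions used for demonstration only.

      import numpy as np

      # Minimal sketch (independent of HYDRORECESSION): fit Q(t) = Q0 * exp(-t / k)
      # to one extracted recession segment via linear regression on log discharge.
      def fit_maillet(t, q):
          """Return (Q0, k) for an exponential recession; t in days, q in m^3/s."""
          slope, intercept = np.polyfit(t, np.log(q), 1)
          return np.exp(intercept), -1.0 / slope   # Q0 and recession constant k (days)

      # Synthetic recession segment, for illustration only
      t = np.arange(0.0, 15.0)
      q = 12.0 * np.exp(-t / 6.0) * (1.0 + 0.02 * np.random.default_rng(0).normal(size=t.size))
      q0, k = fit_maillet(t, q)
      print(f"Q0 = {q0:.2f} m^3/s, k = {k:.1f} days")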

  12. The Validation of Climate Models: The Development of Essential Practice

    Science.gov (United States)

    Rood, R. B.

    2011-12-01

    It is possible from both a scientific and philosophical perspective to state that climate models cannot be validated. However, with the realization that the scientific investigation of climate change is as much a subject of politics as of science, maintaining this formal notion of "validation" has significant consequences. For example, it relegates the bulk of work of many climate scientists to an exercise of model evaluation that can be construed as ill-posed. Even within the science community this motivates criticism of climate modeling as an exercise of weak scientific practice. Stepping outside of the science community, statements that validation is impossible are used in political arguments to discredit the scientific investigation of climate, to maintain doubt about projections of climate change, and hence, to prohibit the development of public policy to regulate the emissions of greenhouse gases. With the acceptance of the impossibility of validation, scientists often state that the credibility of models can be established through an evaluation process. A robust evaluation process leads to the quantitative description of the modeling system against a standard set of measures. If this process is standardized as institutional practice, then this provides a measure of model performance from one modeling release to the next. It is argued, here, that such a robust and standardized evaluation of climate models can be structured and quantified as "validation." Arguments about the nuanced meaning of validation and evaluation are a subject about which the climate modeling community needs to develop a standard. It does injustice to a body of science-based knowledge to maintain that validation is "impossible." Rather than following such a premise, which immediately devalues the knowledge base, it is more useful to develop a systematic, standardized approach to robust, appropriate validation. This stands to represent the complexity of the Earth's climate and its

  13. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials are selected for the experiment considering their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER by using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort for ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and often problematic, the simulation results from these models are inherently uncertain.

  14. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machinery or other energy-related applications, and have been selected for ETDE. (J.S.)

  15. Nonequilibrium stage modelling of dividing wall columns and experimental validation

    Science.gov (United States)

    Hiller, Christoph; Buck, Christina; Ehlers, Christoph; Fieg, Georg

    2010-11-01

    Dealing with complex process units like dividing wall columns shifts the focus to the determination of suitable modelling approaches. For this purpose, a nonequilibrium stage model is developed. Successful validation is achieved by an experimental investigation of fatty alcohol mixtures under vacuum conditions at pilot scale. The aim is the recovery of high-purity products. The proposed model predicts the product qualities and temperature profiles very well.

  16. Human surrogate models of neuropathic pain: validity and limitations.

    Science.gov (United States)

    Binder, Andreas

    2016-02-01

    Human surrogate models of neuropathic pain in healthy subjects are used to study symptoms, signs, and the hypothesized underlying mechanisms. Although different models are available and different spontaneous and evoked symptoms and signs are inducible, two key questions need to be answered: are human surrogate models conceptually valid, i.e., do they share the sensory phenotype of neuropathic pain states, and are they sufficiently reliable to allow consistent translational research?

  17. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs...... with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement...

  18. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    Science.gov (United States)

    2017-02-01

    O’Daniel, 2016. Blast load simulator experiments for computational model validation – Report 1. ERDC/GSL TR-16-27. Vicksburg, MS: U.S. Army Engineer... ERDC/GSL TR-16-27, Blast Load Simulator Experiments for Computational Model Validation, Report 2, Geotechnical and Struc... Approved for public release; distribution is unlimited. The U.S. Army Engineer Research and Development Center (ERDC) solves the nation’s toughest

  19. Northern Rocky Mountain streamflow records: Global warming trends, human impacts or natural variability?

    Science.gov (United States)

    St. Jacques, Jeannine-Marie; Sauchyn, David J.; Zhao, Yang

    2010-03-01

    The ˜60 year Pacific Decadal Oscillation (PDO) is a major factor controlling streamflow in the northern Rocky Mountains, causing dryness during its positive phase, and wetness during its negative phase. If the PDO’s influence is not incorporated into a trend analysis of streamflows, it can produce detected declines that are actually artifacts of this low-frequency variability. Further difficulties arise from the short length and discontinuity of most gauge records, human impacts, and residual autocorrelation. We analyze southern Alberta and environs instrumental streamflow data, using void-filled datasets from unregulated and regulated gauges and naturalized records, and Generalized Least Squares regression to explicitly model the impacts of the PDO and other climate oscillations. We conclude that streamflows are declining at most gauges due to hydroclimatic changes (probably from global warming) and severe human impacts, which are of the same order of magnitude as the hydroclimate changes, if not greater.
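
    A hedged sketch of the general approach described above (not the authors' exact model): generalized least squares regression of annual streamflow on a time trend and a PDO index with AR(1) residuals, here using statsmodels; the synthetic flow series and the stand-in PDO index are illustrative assumptions.

      import numpy as np
      import statsmodels.api as sm

      # Illustrative GLS-with-AR(1)-errors regression of annual flow on a trend and
      # a PDO index; the data and the sinusoidal "PDO" are synthetic stand-ins.
      rng = np.random.default_rng(0)
      years = np.arange(1912, 2008)
      pdo = np.sin(2 * np.pi * (years - 1912) / 60.0)          # stand-in PDO index
      flow = 100.0 - 0.15 * (years - 1912) - 8.0 * pdo + rng.normal(0, 4, years.size)

      X = sm.add_constant(np.column_stack([years - years[0], pdo]))
      model = sm.GLSAR(flow, X, rho=1)          # AR(1) residual structure
      result = model.iterative_fit(maxiter=5)   # alternate between rho and beta
      print(result.params)                      # [intercept, trend per year, PDO coefficient]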

  20. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge with ABMS is the difficulty of validation and verification. Because of frequent emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, finding appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  1. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge with ABMS is the difficulty of validation and verification. Because of frequent emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, finding appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  2. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    Full Text Available An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic part (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus, pragmatic features and attributes can be determined that could be relevant for the evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  3. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    Science.gov (United States)

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting.

  4. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational-fluid-dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models with high resolution spectral measurements in a gas cell with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example for a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide in the temperature range from 727 °C to 1500 °C and at different concentrations were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the

  5. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    Science.gov (United States)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  6. Quantitative predictions of streamflow variability in the Susquehanna River Basin

    Science.gov (United States)

    Alexander, R.; Boyer, E. W.; Leonard, L. N.; Duffy, C.; Schwarz, G. E.; Smith, R. A.

    2012-12-01

    Hydrologic researchers and water managers have increasingly sought an improved understanding of the major processes that control fluxes of water and solutes across diverse environmental settings and large spatial scales. Regional analyses of observed streamflow data have led to advances in our knowledge of relations among land use, climate, and streamflow, with methodologies ranging from statistical assessments of multiple monitoring sites to the regionalization of the parameters of catchment-scale mechanistic simulation models. However, gaps remain in our understanding of the best ways to transfer the knowledge of hydrologic response and governing processes among locations, including methods for regionalizing streamflow measurements and model predictions. We developed an approach to predict variations in streamflow using the SPARROW (SPAtially Referenced Regression On Watershed attributes) modeling infrastructure, with mechanistic functions, mass conservation constraints, and statistical estimation of regional and sub-regional parameters. We used the model to predict discharge in the Susquehanna River Basin (SRB) under varying hydrological regimes that are representative of contemporary flow conditions. The resulting basin-scale water balance describes mean monthly flows in stream reaches throughout the entire SRB (represented at a 1:100,000 scale using the National Hydrologic Data network), with water supply and demand components that are inclusive of a range of hydrologic, climatic, and cultural properties (e.g., precipitation, evapotranspiration, soil and groundwater storage, runoff, baseflow, water use). We compare alternative models of varying complexity that reflect differences in the number and types of explanatory variables and functional expressions as well as spatial and temporal variability in the model parameters. Statistical estimation of the models reveals the levels of complexity that can be uniquely identified, subject to the information content

  7. System Modeling, Validation, and Design of Shape Controllers for NSTX

    Science.gov (United States)

    Walker, M. L.; Humphreys, D. A.; Eidietis, N. W.; Leuer, J. A.; Welander, A. S.; Kolemen, E.

    2011-10-01

    Modeling of the linearized control response of plasma shape and position has become fairly routine in the last several years. However, such response models rely on the input of accurate values of model parameters such as conductor and diagnostic sensor geometry and conductor resistivity or resistance. Confidence in use of such a model therefore requires that some effort be spent in validating that the model has been correctly constructed. We describe the process of constructing and validating a response model for NSTX plasma shape and position control, and subsequent use of that model for the development of shape and position controllers. The model development, validation, and control design processes are all integrated within a Matlab-based toolset known as TokSys. The control design method described emphasizes use of so-called decoupling control, in which combinations of coil current modifications are designed to modify only one control parameter at a time, without perturbing any other control parameter values. Work supported by US DOE under DE-FG02-99ER54522 and DE-AC02-09CH11466.

  8. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    Science.gov (United States)

    Apostolakis, J.; Asai, M.; Bagulya, A.; Brown, J. M. C.; Burkhardt, H.; Chikuma, N.; Cortes-Giraldo, M. A.; Elles, S.; Grichine, V.; Guatelli, S.; Incerti, S.; Ivanchenko, V. N.; Jacquemier, J.; Kadri, O.; Maire, M.; Pandola, L.; Sawkey, D.; Toshito, T.; Urban, L.; Yamashita, T.

    2015-12-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  9. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    CERN Document Server

    Apostolakis, J; Bagulya, A; Brown, J M C; Burkhardt, H; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Grichine, V; Guatelli, S; Incerti, S; Ivanchenko, V N; Jacquemier, J; Kadri, O; Maire, M; Pandola, L; Sawkey, D; Toshito, T; Urban, L; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed.

  11. Cross-validation model assessment for modular networks

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Model assessment of the stochastic block model is a crucial step in identification of modular structures in networks. Although this has typically been done according to the principle that a parsimonious model with a large marginal likelihood or a short description length should be selected, another principle is that a model with a small prediction error should be selected. We show that the leave-one-out cross-validation estimate of the prediction error can be efficiently obtained using belief propagation for sparse networks. Furthermore, the relations among the objectives for model assessment enable us to determine the exact cause of overfitting.

  12. Model Validation for Shipboard Power Cables Using Scattering Parameters

    Institute of Scientific and Technical Information of China (English)

    Lukas Graber; Diomar Infante; Michael Steurer; William W. Brey

    2011-01-01

    Careful analysis of transients in shipboard power systems is important to achieve long life times of the components in future all-electric ships. In order to accomplish results with high accuracy, it is recommended to validate cable models as they have significant influence on the amplitude and frequency spectrum of voltage transients. The authors propose comparison of model and measurement using scattering parameters. They can be easily obtained from measurement and simulation and deliver broadband information about the accuracy of the model. The measurement can be performed using a vector network analyzer. The process to extract scattering parameters from simulation models is explained in detail. Three different simulation models of a 5 kV XLPE power cable have been validated. The chosen approach delivers an efficient tool to quickly estimate the quality of a model.

  13. An integrated uncertainty and ensemble-based data assimilation approach for improved operational streamflow predictions

    Directory of Open Access Journals (Sweden)

    M. He

    2011-08-01

    Full Text Available The current study proposes an integrated uncertainty and ensemble-based data assimilation framework (ICEA) and evaluates its viability in providing operational streamflow predictions via assimilating snow water equivalent (SWE) data. This step-wise framework applies a parameter uncertainty analysis algorithm (ISURF) to identify the uncertainty structure of sensitive model parameters, which is subsequently formulated into an Ensemble Kalman Filter (EnKF) to generate updated snow states for streamflow prediction. The framework is coupled to the US National Weather Service (NWS) snow and rainfall-runoff models. Its applicability is demonstrated for an operational basin of a western River Forecast Center (RFC) of the NWS. Performance of the framework is evaluated against existing operational baseline (RFC) predictions, the stand-alone ISURF, and the stand-alone EnKF. Results indicate that the ensemble-mean prediction of ICEA considerably outperforms predictions from the other three scenarios investigated, particularly in the context of predicting high flows (top 5th percentile). The ICEA streamflow ensemble predictions capture the variability of the observed streamflow well; however, the ensemble is not wide enough to consistently contain the range of streamflow observations in the study basin. Our findings indicate that the ICEA has the potential to supplement the current operational (deterministic) forecasting method in terms of providing improved single-valued (e.g., ensemble mean) streamflow predictions as well as meaningful ensemble predictions.
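
    For readers unfamiliar with the EnKF step referred to above, the following is a minimal, generic ensemble Kalman filter analysis update sketched in Python; the two-element state, the scalar observation operator and all numbers are illustrative assumptions, not part of the ICEA or NWS implementations.

      import numpy as np

      # Generic (perturbed-observation) EnKF analysis step for a small state vector.
      def enkf_update(ensemble, obs, obs_var, H):
          """ensemble: (n_members, n_states); obs: scalar observation (e.g. SWE);
          H: 1-D array mapping the state vector to the observed quantity."""
          n, _ = ensemble.shape
          x_mean = ensemble.mean(axis=0)
          X = ensemble - x_mean                         # state anomalies
          hx = ensemble @ H                             # predicted observations
          cov_xy = X.T @ (hx - hx.mean()) / (n - 1)     # state-observation covariance
          var_y = np.var(hx, ddof=1) + obs_var
          K = cov_xy / var_y                            # Kalman gain (vector)
          perturbed = obs + np.sqrt(obs_var) * np.random.default_rng(1).normal(size=n)
          return ensemble + np.outer(perturbed - hx, K)

      # Toy example: 50-member ensemble of a 2-element state [SWE (mm), soil moisture]
      ens = np.random.default_rng(2).normal(size=(50, 2)) * [30.0, 0.05] + [150.0, 0.25]
      updated = enkf_update(ens, obs=120.0, obs_var=25.0, H=np.array([1.0, 0.0]))
      print(updated.mean(axis=0))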

  14. An integrated uncertainty and ensemble-based data assimilation approach for improved operational streamflow predictions

    Directory of Open Access Journals (Sweden)

    M. He

    2012-03-01

    Full Text Available The current study proposes an integrated uncertainty and ensemble-based data assimilation framework (ICEA) and evaluates its viability in providing operational streamflow predictions via assimilating snow water equivalent (SWE) data. This step-wise framework applies a parameter uncertainty analysis algorithm (ISURF) to identify the uncertainty structure of sensitive model parameters, which is subsequently formulated into an Ensemble Kalman Filter (EnKF) to generate updated snow states for streamflow prediction. The framework is coupled to the US National Weather Service (NWS) snow and rainfall-runoff models. Its applicability is demonstrated for an operational basin of a western River Forecast Center (RFC) of the NWS. Performance of the framework is evaluated against existing operational baseline (RFC) predictions, the stand-alone ISURF and the stand-alone EnKF. Results indicate that the ensemble-mean prediction of ICEA considerably outperforms predictions from the other three scenarios investigated, particularly in the context of predicting high flows (top 5th percentile). The ICEA streamflow ensemble predictions capture the variability of the observed streamflow well; however, the ensemble is not wide enough to consistently contain the range of streamflow observations in the study basin. Our findings indicate that the ICEA has the potential to supplement the current operational (deterministic) forecasting method in terms of providing improved single-valued (e.g., ensemble mean) streamflow predictions as well as meaningful ensemble predictions.

  15. Soil Moisture Initialization Error and Subgrid Variability of Precipitation in Seasonal Streamflow Forecasting

    Science.gov (United States)

    Koster, Randal D.; Walker, Gregory K.; Mahanama, Sarith P.; Reichle, Rolf H.

    2013-01-01

    Offline simulations over the conterminous United States (CONUS) with a land surface model are used to address two issues relevant to the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which a realistic increase in the spatial resolution of forecasted precipitation would improve streamflow forecasts. The addition of error to a soil moisture initialization field is found to lead to a nearly proportional reduction in streamflow forecast skill. The linearity of the response allows the determination of a lower bound for the increase in streamflow forecast skill achievable through improved soil moisture estimation, e.g., through satellite-based soil moisture measurements. An increase in the resolution of precipitation is found to have an impact on large-scale streamflow forecasts only when evaporation variance is significant relative to the precipitation variance. This condition is met only in the western half of the CONUS domain. Taken together, the two studies demonstrate the utility of a continental-scale land surface modeling system as a tool for addressing the science of hydrological prediction.

  16. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: ► The paper discusses the validation of creep rupture models derived from statistical analysis. ► It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. ► The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. ► The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  17. Drivers of annual to decadal streamflow variability in the lower Colorado River Basin

    Science.gov (United States)

    Lambeth-Beagles, R. S.; Troch, P. A.

    2010-12-01

    The Colorado River is the main water supply to the southwest region. As demand reaches the limit of supply in the southwest it becomes increasingly important to understand the dynamics of streamflow in the Colorado River and in particular the tributaries to the lower Colorado River. Climate change may pose an additional threat to the already-scarce water supply in the southwest. Due to the narrowing margin for error, water managers are keen on extending their ability to predict streamflow volumes on a mid-range to decadal scale. Before a predictive streamflow model can be developed, an understanding of the physical drivers of annual to decadal streamflow variability in the lower Colorado River Basin is needed. This research addresses this need by applying multiple statistical methods to identify trends, patterns and relationships present in streamflow, precipitation and temperature over the past century in four contributing watersheds to the lower Colorado River. The four watersheds selected were the Paria, Little Colorado, Virgin/Muddy, and Bill Williams. Time series data over a common period from 1906-2007 for streamflow, precipitation and temperature were used for the initial analysis. Through statistical analysis the following questions were addressed: 1) are there observable trends and patterns in these variables during the past century and 2) if there are trends or patterns, how are they related to each other? The Mann-Kendall test was used to identify trends in the three variables. Assumptions regarding autocorrelation and persistence in the data were taken into consideration. Kendall’s tau-b test was used to establish association between any found trends in the data. Initial results suggest there are two primary processes occurring. First, statistical analysis reveals significant upward trends in temperatures and downward trends in streamflow. However, there appears to be no trend in precipitation data. These trends in streamflow and temperature speak to
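
    As a hedged sketch of the trend screening described above (not the authors' code), the snippet below runs a Mann-Kendall-style test via Kendall's tau between time and the series and adds a Theil-Sen slope estimate; it omits the pre-whitening that residual autocorrelation or persistence would require, and the synthetic flow series is an assumption.

      import numpy as np
      from scipy.stats import kendalltau

      # Mann-Kendall-style monotonic trend test plus a Theil-Sen slope estimate.
      def mann_kendall(series):
          t = np.arange(series.size)
          tau, p_value = kendalltau(t, series)          # trend test via Kendall's tau
          slopes = [(series[j] - series[i]) / (j - i)
                    for i in range(series.size) for j in range(i + 1, series.size)]
          return tau, p_value, np.median(slopes)        # Theil-Sen (median) slope

      rng = np.random.default_rng(3)
      flow = 50.0 - 0.1 * np.arange(100) + rng.normal(0, 3, 100)   # synthetic annual flows
      tau, p, slope = mann_kendall(flow)
      print(f"tau = {tau:.2f}, p = {p:.3f}, Sen slope = {slope:.3f} per year")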

  18. National Streamflow Information Program: Implementation Status Report

    Science.gov (United States)

    Norris, J. Michael

    2009-01-01

    The U.S. Geological Survey (USGS) operates and maintains a nationwide network of about 7,500 streamgages designed to provide and interpret long-term, accurate, and unbiased streamflow information to meet the multiple needs of many diverse national, regional, state, and local users. The National Streamflow Information Program (NSIP) was initiated in 2003 in response to Congressional and stakeholder concerns about (1) the decrease in the number of operating streamgages, including a disproportionate loss of streamgages with a long period of record; (2) the inability of the USGS to continue operating high-priority streamgages in an environment of reduced funding through partnerships; and (3) the increasing demand for streamflow information due to emerging resource-management issues and new data-delivery capabilities. The NSIP's mission is to provide the streamflow information and understanding required to meet national, regional, state, and local needs. Most of the existing streamgages are funded through partnerships with more than 850 other Federal, state, tribal, and local agencies. Currently, about 90 percent of the streamgages send data to the World Wide Web in near-real time (some information is transmitted within 15 minutes, whereas some lags by about 4 hours). The streamflow information collected at USGS streamgages is used for many purposes: *In water-resource appraisals and allocations - to determine how much water is available and how it is being allocated; *To provide streamflow information required by interstate agreements, compacts, and court decrees; *For engineering design of reservoirs, bridges, roads, culverts, and treatment plants; *For the operation of reservoirs, the operation of locks and dams for navigation purposes, and power production; *To identify changes in streamflow resulting from changes in land use, water use, and climate; *For streamflow forecasting, flood planning, and flood forecasting; *To support water-quality programs by allowing

  19. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  20. Technical Note: Calibration and validation of geophysical observation models

    NARCIS (Netherlands)

    Salama, M.S.; van der Velde, R.; van der Woerd, H.J.; Kromkamp, J.C.; Philippart, C.J.M.; Joseph, A.T.; O'Neill, P.E.; Lang, R.H.; Gish, T.; Werdell, P.J.; Su, Z.

    2012-01-01

    We present a method to calibrate and validate observational models that interrelate remotely sensed energy fluxes to geophysical variables of land and water surfaces. Coincident sets of remote sensing observation of visible and microwave radiations and geophysical data are assembled and subdivided i

  1. Validation of Geant4 hadronic physics models at intermediate energies

    Science.gov (United States)

    Banerjee, Sunanda; Geant4 Hadronic Group

    2010-04-01

    GEANT4 provides a number of physics models at intermediate energies (corresponding to incident momenta in the range 1-20 GeV/c). Recently, these models have been validated with existing data from a number of experiments: (1) inclusive proton and neutron production with a variety of beams (π-, π+, p) at different energies between 1 and 9 GeV/c on a number of nuclear targets (from beryllium to uranium); (2) inclusive pion/kaon/proton production from 14.6 GeV/c proton beams on nuclear targets (from beryllium to gold); (3) inclusive pion production from pion beams between 3-13 GeV/c on a number of nuclear targets (from beryllium to lead). The results of simulation/data comparison for different GEANT4 models are discussed in the context of validating the models and determining their usage in physics lists for high-energy applications. Due to the increasing number of validations becoming available, and the requirement that they be done at regular intervals corresponding to the GEANT4 release schedule, automated methods of validation are being developed.

  2. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance and this in turn requires access to high quality experimental data. Three sets of accurate emp...

  3. Validation of Models : Statistical Techniques and Data Availability

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1999-01-01

    This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability three situations are distinguished (i) no data, (ii) only output data, and (iii) both input and output data. In case (i) - no real

  4. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.

    1978-12-01

    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  5. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of conta

  6. ID Model Construction and Validation: A Multiple Intelligences Case

    Science.gov (United States)

    Tracey, Monica W.; Richey, Rita C.

    2007-01-01

    This is a report of a developmental research study that aimed to construct and validate an instructional design (ID) model that incorporates the theory and practice of multiple intelligences (MI). The study consisted of three phases. In phase one, the theoretical foundations of multiple Intelligences and ID were examined to guide the development…

  7. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  8. Bibliometric Modeling Processes and the Empirical Validity of Lotka's Law.

    Science.gov (United States)

    Nicholls, Paul Travis

    1989-01-01

    Examines the elements involved in fitting a bibliometric model to empirical data, proposes a consistent methodology for applying Lotka's law, and presents the results of an empirical test of the methodology. The results are discussed in terms of the validity of Lotka's law and the suitability of the proposed methodology. (49 references) (CLB)

  10. Paleoflood investigations to improve peak-streamflow regional-regression equations for natural streamflow in eastern Colorado, 2015

    Science.gov (United States)

    Kohn, Michael S.; Stevens, Michael R.; Harden, Tessa M.; Godaire, Jeanne E.; Klinger, Ralph E.; Mommandi, Amanullah

    2016-09-09

    The U.S. Geological Survey (USGS), in cooperation with the Colorado Department of Transportation, developed regional-regression equations for estimating the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance-probability discharge (AEPD) for natural streamflow in eastern Colorado. A total of 188 streamgages, consisting of 6,536 years of record and a mean of approximately 35 years of record per streamgage, were used to develop the peak-streamflow regional-regression equations. The estimated AEPDs for each streamgage were computed using the USGS software program PeakFQ. The AEPDs were determined using systematic data through water year 2013. Based on previous studies conducted in Colorado and neighboring States and on the availability of data, 72 characteristics (57 basin and 15 climatic characteristics) were evaluated as candidate explanatory variables in the regression analysis. Paleoflood and non-exceedance bound ages were established based on reconnaissance-level methods. Multiple lines of evidence were used at each streamgage to arrive at a conclusion (age estimate) to add a higher degree of certainty to reconnaissance-level estimates. Paleoflood or non-exceedance bound evidence was documented at 41 streamgages, and 3 streamgages had previously collected paleoflood data. To determine the peak discharge of a paleoflood or non-exceedance bound, two different hydraulic models were used. The mean standard error of prediction (SEP) for all 8 AEPDs was reduced approximately 25 percent compared to the previous flood-frequency study. For paleoflood data to be effective in reducing the SEP in eastern Colorado, more than 44 of 188 (23 percent) streamgages would need paleoflood data, and that paleoflood data would need to increase the record length by more than 25 years for the 1-percent AEPD. The greatest reduction in SEP for the peak-streamflow regional-regression equations was observed when additional new basin characteristics were included in the peak-streamflow
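
    As an illustration of the general form of such peak-streamflow regional-regression equations (not the study's equations or coefficients), the sketch below fits a log-log regression of a 1-percent AEPD on two basin characteristics by ordinary least squares; the characteristics, coefficients and data are synthetic assumptions.

      import numpy as np

      # Synthetic example of a log-log regional regression, Q ≈ a * A^b * P^c.
      rng = np.random.default_rng(4)
      area = rng.uniform(20, 2000, 60)             # drainage area, km^2
      precip = rng.uniform(300, 600, 60)           # mean annual precipitation, mm
      q100 = 0.5 * area**0.7 * (precip / 500)**1.5 * np.exp(rng.normal(0, 0.3, 60))

      X = np.column_stack([np.ones(60), np.log10(area), np.log10(precip)])
      coef, *_ = np.linalg.lstsq(X, np.log10(q100), rcond=None)
      log_a, b, c = coef
      print(f"Q100 ≈ {10**log_a:.3g} * A^{b:.2f} * P^{c:.2f}")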

  11. Streamflow disaggregation: a nonlinear deterministic approach

    Directory of Open Access Journals (Sweden)

    B. Sivakumar

    2004-01-01

    Full Text Available This study introduces a nonlinear deterministic approach for streamflow disaggregation. According to this approach, the streamflow transformation process from one scale to another is treated as a nonlinear deterministic process, rather than a stochastic process as generally assumed. The approach follows two important steps: (1) reconstruction of the scalar (streamflow) series in a multi-dimensional phase-space for representing the transformation dynamics; and (2) use of a local approximation (nearest neighbor) method for disaggregation. The approach is employed for streamflow disaggregation in the Mississippi River basin, USA. Data of successively doubled resolutions between daily and 16 days (i.e. daily, 2-day, 4-day, 8-day, and 16-day) are studied, and disaggregations are attempted only between successive resolutions (i.e. 2-day to daily, 4-day to 2-day, 8-day to 4-day, and 16-day to 8-day). Comparisons between the disaggregated values and the actual values reveal excellent agreements for all the cases studied, indicating the suitability of the approach for streamflow disaggregation. A further insight into the results reveals that the best results are, in general, achieved for low embedding dimensions (2 or 3) and a small number of neighbors (less than 50), suggesting possible presence of nonlinear determinism in the underlying transformation process. A decrease in accuracy with increasing disaggregation scale is also observed, a possible implication of the existence of a scaling regime in streamflow.
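
    The two steps described above can be sketched compactly; the hedged Python example below builds a delay-coordinate phase-space embedding of a coarse (2-day) series and uses the splitting fractions of the k nearest neighbours to disaggregate one 2-day value into two daily values. The embedding dimension, neighbour count and synthetic data are assumptions, not the paper's settings.

      import numpy as np

      # Delay-coordinate embedding plus nearest-neighbour local approximation,
      # applied to splitting one 2-day flow total into two daily values.
      def embed(series, m, tau=1):
          """Phase-space reconstruction: rows are delay vectors of dimension m."""
          n = series.size - (m - 1) * tau
          return np.column_stack([series[i * tau: i * tau + n] for i in range(m)])

      def disaggregate_2day(coarse_hist, daily_hist, coarse_now, m=3, k=10):
          """Split coarse_now using the mean first-day fraction of its k nearest
          neighbours in the reconstructed phase space of the coarse series."""
          states = embed(coarse_hist, m)                    # row i = coarse[i : i + m]
          target = np.append(coarse_hist[-(m - 1):], coarse_now)
          idx = np.argsort(np.linalg.norm(states - target, axis=1))[:k]
          last = idx + m - 1                                # coarse index matching coarse_now
          frac = (daily_hist[2 * last] / np.maximum(coarse_hist[last], 1e-9)).mean()
          frac = float(np.clip(frac, 0.0, 1.0))
          return coarse_now * frac, coarse_now * (1.0 - frac)

      # Toy data: 200 days of synthetic daily flow and its 2-day aggregation
      daily = np.random.default_rng(5).gamma(2.0, 5.0, 200)
      coarse = daily.reshape(-1, 2).sum(axis=1)
      print(disaggregate_2day(coarse, daily, coarse_now=coarse[-1]))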

  12. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially, when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction...... models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland shows a clear layering. The observed layers from the radar data can be used as an in-situ validation...... correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from...

  13. Validation of a Model for Ice Formation around Finned Tubes

    OpenAIRE

    Kamal A. R. Ismai; Fatima A. M. Lino

    2016-01-01

    Phase change materials are an attractive option for thermal storage applications, but their main drawback is the slow thermal response during charging and discharging processes due to their low thermal conductivity. The present study validates a model developed by the authors some years ago on radial fins as a method to improve the thermal performance of PCM in a horizontal storage system. The developed model for the radial finned tube is based on pure conduction and the enthalpy approach and was di...

  14. Toward metrics and model validation in web-site QEM

    OpenAIRE

    Olsina Santos, Luis Antonio; Pons, Claudia; Rossi, Gustavo Héctor

    2000-01-01

    In this work, a conceptual framework and the associated strategies for metrics and model validation are analyzed regarding website measurement and evaluation. In particular, we have conducted three case studies in different Web domains in order to evaluate and compare the quality of sites. To that end, the quantitative, model-based methodology, the so-called Web-site QEM (Quality Evaluation Methodology), was utilized. In the assessment process of sites, the definition of attributes and measurements...

  15. Validating firn compaction model with remote sensing data

    OpenAIRE

    2011-01-01

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially, when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland ...

  16. Validation of models with constant bias: an applied approach

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2014-06-01

    Objective. This paper presents extensions to the statistical validation method based on the procedure of Freese for cases in which a model shows constant bias (CB) in its predictions, and illustrates the method with data from a new mechanistic model that predicts weight gain in cattle. Materials and methods. The extensions were the hypothesis tests and maximum anticipated error for the alternative approach, and the confidence interval for a quantile of the distribution of errors. Results. The model evaluated showed CB; once the CB is removed, and with a confidence level of 95%, the magnitude of the error does not exceed 0.575 kg. Therefore, the validated model can be used to predict the daily weight gain of cattle, although it will require an adjustment in its structure, based on the presence of CB, to increase the accuracy of its forecasts. Conclusions. The confidence interval for the 1-α quantile of the distribution of errors after correcting the constant bias allows determining the upper limit for the magnitude of the prediction error and using it to evaluate the evolution of the model in forecasting the system. The confidence interval approach to validating a model is more informative than hypothesis tests for the same purpose.
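    The following sketch illustrates, under stated assumptions, the kind of calculation this record describes: remove the constant bias from the prediction errors and bound the magnitude of the (1 - α) error quantile via a chi-square confidence interval on the error variance. It assumes approximately normal bias-corrected errors and is not the authors' exact extension of Freese's procedure; all function names are hypothetical.

```python
import numpy as np
from scipy import stats

def freese_validation_cb(observed, predicted, alpha=0.05):
    """Validation after removing constant bias (CB) from the prediction errors.

    Returns the estimated CB and an upper confidence bound on the magnitude of
    the (1 - alpha) quantile of the bias-corrected errors, assuming the
    corrected errors are approximately normal (a 'maximum anticipated error')."""
    e = np.asarray(predicted, float) - np.asarray(observed, float)
    cb = e.mean()                              # constant bias
    ec = e - cb                                # bias-corrected errors
    n = len(ec)
    s2 = ec.var(ddof=1)                        # error variance after CB removal
    # chi-square upper confidence bound on the error variance
    s2_upper = (n - 1) * s2 / stats.chi2.ppf(alpha, df=n - 1)
    # bound on the (1 - alpha) quantile of |error| under normality
    max_error = stats.norm.ppf(1 - alpha / 2) * np.sqrt(s2_upper)
    return cb, max_error
```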

  17. Bias correcting precipitation forecasts to improve the skill of seasonal streamflow forecasts

    Science.gov (United States)

    Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian

    2016-09-01

    Meteorological centres make sustained efforts to provide seasonal forecasts that are increasingly skilful, which has the potential to benefit streamflow forecasting. Seasonal streamflow forecasts can help to take anticipatory measures for a range of applications, such as water supply or hydropower reservoir operation and drought risk management. This study assesses the skill of seasonal precipitation and streamflow forecasts in France to provide insights into the way bias-correcting precipitation forecasts can improve the skill of streamflow forecasts at extended lead times. We apply eight variants of bias correction approaches to the precipitation forecasts prior to generating the streamflow forecasts. The approaches are based on the linear scaling and the distribution mapping methods. A daily hydrological model is applied at the catchment scale to transform precipitation into streamflow. We then evaluate the skill of raw (without bias correction) and bias-corrected precipitation and streamflow ensemble forecasts in 16 catchments in France. The skill of the ensemble forecasts is assessed in terms of reliability, sharpness, accuracy and overall performance. A reference prediction system, based on historical observed precipitation and catchment initial conditions at the time of forecast (i.e. the ESP method), is used as a benchmark in the computation of the skill. The results show that, in most catchments, raw seasonal precipitation and streamflow forecasts are often more skilful than the conventional ESP method in terms of sharpness. However, they are not significantly better in terms of reliability. Forecast skill is generally improved when applying bias correction. Two bias correction methods show the best performance for the studied catchments, each method being more successful in improving specific attributes of the forecasts: the simple linear scaling of monthly values contributes mainly to increasing forecast sharpness and accuracy, while the empirical distribution mapping
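    The two families of bias correction named in this record can be sketched in a few lines. The versions below (monthly linear scaling and empirical quantile mapping) are generic illustrations, not the eight specific variants evaluated in the study; the function names and the pooling of the climatologies are assumptions.

```python
import numpy as np

def linear_scaling(fcst, obs_clim, fcst_clim):
    """Linear scaling: rescale forecast precipitation so its climatological
    mean matches the observed climatology (applied per month in practice)."""
    factor = np.mean(obs_clim) / max(np.mean(fcst_clim), 1e-9)
    return np.asarray(fcst, float) * factor

def quantile_mapping(fcst, obs_clim, fcst_clim):
    """Empirical distribution (quantile) mapping: replace each forecast value by
    the observed climatological value with the same non-exceedance probability."""
    fcst = np.asarray(fcst, float)
    fcst_sorted = np.sort(np.asarray(fcst_clim, float))
    obs_sorted = np.sort(np.asarray(obs_clim, float))
    # empirical CDF of the forecast climatology evaluated at each forecast value
    p = np.searchsorted(fcst_sorted, fcst, side="right") / len(fcst_sorted)
    p = np.clip(p, 1.0 / len(obs_sorted), 1.0)
    return np.quantile(obs_sorted, p)
```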

  18. Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation

    Science.gov (United States)

    Zhao, T.; Cai, X.; Yang, D.

    2010-12-01

    Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecast as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover
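    A minimal sketch of how MMFE-consistent forecasts can be generated is shown below. It assumes independent, normally distributed forecast updates and works backwards from the realization so that forecasts converge to the observed flows; this is an illustration of the MMFE idea, not the authors' implementation, and the function name and arguments are hypothetical.

```python
import numpy as np

def simulate_mmfe(actual, update_sd, rng=None):
    """Generate a sequence of deterministic forecasts consistent with the
    Martingale Model of Forecast Evolution (MMFE).

    actual    : true flows for periods 0..T-1 (1-D array)
    update_sd : standard deviation of the forecast update revealed at each stage
    Returns F with shape (T+1, T), where F[s, t] is the forecast of period t
    issued at stage s; F[T] equals the realization."""
    rng = rng or np.random.default_rng()
    actual = np.asarray(actual, float)
    T = len(actual)
    F = np.empty((T + 1, T))
    F[T] = actual
    # work backwards: each earlier forecast removes one independent, zero-mean update
    for s in range(T - 1, -1, -1):
        eps = rng.normal(0.0, update_sd[s], size=T)
        eps[:s] = 0.0          # periods already realized at stage s are not revised
        F[s] = F[s + 1] - eps
    return F
```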

  19. Validity of the Bersohn–Zewail model beyond justification

    DEFF Research Database (Denmark)

    Petersen, Jakob; Henriksen, Niels Engholm; Møller, Klaus Braagaard

    2012-01-01

    The absorption of probe pulses in ultrafast pump–probe experiments can be determined from the Bersohn–Zewail (BZ) model. The model relies on classical mechanics to describe the dynamics of the nuclei in the excited electronic state prepared by the ultrashort pump pulse. The BZ model provides...... excellent agreement between the classical trajectory and the average position of the excited state wave packet. By investigating the approximations connecting the nuclear dynamics described by quantum mechanics and the BZ model, we conclude that this agreement goes far beyond the validity of the individual...

  20. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    Energy Technology Data Exchange (ETDEWEB)

    Smith, N. A. S., E-mail: nadia.smith@npl.co.uk; Correia, T. M., E-mail: tatiana.correia@npl.co.uk [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom)]; Rokosz, M. K., E-mail: maciej.rokosz@npl.co.uk [National Physical Laboratory, Hampton Road, TW11 0LW Middlesex (United Kingdom); Department of Materials, Imperial College London, London SW7 2AZ (United Kingdom)]

    2014-07-28

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environment and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from the direct imaging of MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.

  1. August Median Streamflow on Ungaged Streams in Eastern Aroostook County, Maine

    Science.gov (United States)

    Lombard, Pamela J.; Tasker, Gary D.; Nielsen, Martha G.

    2003-01-01

    Methods for estimating August median streamflow were developed for ungaged, unregulated streams in the eastern part of Aroostook County, Maine, with drainage areas from 0.38 to 43 square miles and mean basin elevations from 437 to 1,024 feet. Few long-term, continuous-record streamflow-gaging stations with small drainage areas were available from which to develop the equations; therefore, 24 partial-record gaging stations were established in this investigation. A mathematical technique for estimating a standard low-flow statistic, August median streamflow, at partial-record stations was applied by relating base-flow measurements at these stations to concurrent daily flows at nearby long-term, continuous-record streamflow-gaging stations (index stations). Generalized least-squares regression analysis (GLS) was used to relate estimates of August median streamflow at gaging stations to basin characteristics at these same stations to develop equations that can be applied to estimate August median streamflow on ungaged streams. GLS accounts for varying periods of record at the gaging stations and the cross correlation of concurrent streamflows among gaging stations. Twenty-three partial-record stations and one continuous-record station were used for the final regression equations. The basin characteristics of drainage area and mean basin elevation are used in the calculated regression equation for ungaged streams to estimate August median flow. The equation has an average standard error of prediction from -38 to 62 percent. A one-variable equation uses only drainage area to estimate August median streamflow when less accuracy is acceptable. This equation has an average standard error of prediction from -40 to 67 percent. Model error is larger than sampling error for both equations, indicating that additional basin characteristics could be important to improved estimates of low-flow statistics. Weighted estimates of August median streamflow, which can be used when
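    The regression form described in this record (August median streamflow as a function of drainage area and mean basin elevation) is typically fitted in log space. The sketch below uses ordinary least squares as a stand-in for the generalized least-squares procedure of the report, which additionally weights stations by record length and cross-correlation; the coefficients and names here are illustrative only.

```python
import numpy as np

def fit_low_flow_equation(q_aug_median, drainage_area, basin_elev):
    """Fit log10(Q) = b0 + b1*log10(DA) + b2*log10(ELEV) by ordinary least
    squares (a stand-in for the GLS procedure used in the report)."""
    X = np.column_stack([np.ones(len(q_aug_median)),
                         np.log10(drainage_area),
                         np.log10(basin_elev)])
    beta, *_ = np.linalg.lstsq(X, np.log10(q_aug_median), rcond=None)
    return beta

def estimate_q_aug(beta, drainage_area, basin_elev):
    """Apply the fitted equation to an ungaged site (same units as the fitted data)."""
    logq = beta[0] + beta[1] * np.log10(drainage_area) + beta[2] * np.log10(basin_elev)
    return 10.0 ** logq
```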

  2. Validation of a Model for Teaching Canine Fundoscopy.

    Science.gov (United States)

    Nibblett, Belle Marie D; Pereira, Mary Mauldin; Williamson, Julie A; Sithole, Fortune

    2015-01-01

    A validated teaching model for canine fundoscopic examination was developed to improve Day One fundoscopy skills while at the same time reducing use of teaching dogs. This novel eye model was created from a hollow plastic ball with a cutout for the pupil, a suspended 20-diopter lens, and paint and paper simulation of relevant eye structures. This eye model was mounted on a wooden stand with canine head landmarks useful in performing fundoscopy. Veterinary educators performed fundoscopy using this model and completed a survey to establish face and content validity. Subsequently, veterinary students were randomly assigned to pre-laboratory training with or without the use of this teaching model. After completion of an ophthalmology laboratory on teaching dogs, student outcome was assessed by measuring students' ability to see a symbol inserted on the simulated retina in the model. Students also completed a survey regarding their experience with the model and the laboratory. Overall, veterinary educators agreed that this eye model was well constructed and useful in teaching good fundoscopic technique. Student performance of fundoscopy was not negatively impacted by the use of the model. This novel canine model shows promise as a teaching and assessment tool for fundoscopy.

  3. FDA 2011 process validation guidance: lifecycle compliance model.

    Science.gov (United States)

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  4. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

    The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Currently, additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is in progress. The newly implemented fluid model for a PMR core is based on a subchannel approach, which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, the verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using the HTTR safety test data, such as control rod withdrawal tests and loss-of-forced-convection tests.

  5. Propeller aircraft interior noise model utilization study and validation

    Science.gov (United States)

    Pope, L. D.

    1984-01-01

    Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.

  6. Impact of bushfire and climate variability on streamflow from forested catchments in southeast Australia

    Directory of Open Access Journals (Sweden)

    Y. Zhou

    2013-04-01

    Most of the surface water for natural environmental and human water uses in southeast Australia is sourced from forested catchments located in the higher rainfall areas. Water yield of these catchments is mainly affected by climatic conditions, but it is also greatly affected by vegetation cover change. Bushfires are a major natural disturbance in forested catchments and potentially modify the water yield of the catchments through changes to evapotranspiration (ET), interception and soil moisture storage. This paper quantifies the impacts of bushfire and climate variability on streamflow from three southeast Australian catchments where the Ash Wednesday bushfires occurred in February 1983. The hydrological models used here include AWRA-L, Xinanjiang and GR4J. The three models are first calibrated against streamflow data from the pre-bushfire period and then used to simulate runoff for the post-bushfire period with the calibrated parameters. The difference between the observed and model-simulated runoff for the post-bushfire period provides an estimate of the impact of bushfire on streamflow. The hydrological modelling results for the three catchments indicate that there is a substantial increase in streamflow in the first 15 yr after the 1983 bushfires. The increase in streamflow is attributed to initial decreases in ET and interception resulting from the fires, followed by logging activity. After 15 yr, streamflow dynamics are more heavily influenced by climate effects, although some impact from fire and logging regeneration may still occur. It is shown that hydrological models provide reasonably consistent estimates of forest disturbance and climate impacts on streamflow for the three catchments. The results might be used by forest managers to understand the relationship between forest disturbance and climate variability impacts on water yield in the context of climate change.

  7. Dynamic validation of the Planck/LFI thermal model

    CERN Document Server

    Tomasi, M; Gregorio, A; Colombo, F; Lapolla, M; Terenzi, L; Morgante, G; Bersanelli, M; Butler, R C; Galeotta, S; Mandolesi, N; Maris, M; Mennella, A; Valenziano, L; Zacchei, A; 10.1088/1748-0221/5/01/T01002

    2010-01-01

    The Low Frequency Instrument (LFI) is an array of cryogenically cooled radiometers on board the Planck satellite, designed to measure the temperature and polarization anisotropies of the cosmic microwave background (CMB) at 30, 44 and 70 GHz. The thermal requirements of the LFI, and in particular the stringent limits to acceptable thermal fluctuations in the 20 K focal plane, are a critical element to achieve the instrument scientific performance. Thermal tests were carried out as part of the on-ground calibration campaign at various stages of instrument integration. In this paper we describe the results and analysis of the tests on the LFI flight model (FM) performed at Thales Laboratories in Milan (Italy) during 2006, with the purpose of experimentally sampling the thermal transfer functions and consequently validating the numerical thermal model describing the dynamic response of the LFI focal plane. This model has been used extensively to assess the ability of LFI to achieve its scientific goals: its valid...

  8. Validation of a finite element model of the human metacarpal.

    Science.gov (United States)

    Barker, D S; Netherway, D J; Krishnan, J; Hearn, T C

    2005-03-01

    Implant loosening and mechanical failure of components are frequently reported following metacarpophalangeal (MCP) joint replacement. Studies of the mechanical environment of the MCP implant-bone construct are rare. The objective of this study was to evaluate the predictive ability of a finite element model of the intact second human metacarpal to provide a validated baseline for further mechanical studies. A right index human metacarpal was subjected to torsion and combined axial/bending loading using strain gauge (SG) and 3D finite element (FE) analysis. Four different representations of bone material properties were considered. Regression analyses were performed comparing maximum and minimum principal surface strains taken from the SG and FE models. Regression slopes close to unity and high correlation coefficients were found when the diaphyseal cortical shell was modelled as anisotropic and cancellous bone properties were derived from quantitative computed tomography. The inclusion of anisotropy for cortical bone was strongly influential in producing high model validity whereas variation in methods of assigning stiffness to cancellous bone had only a minor influence. The validated FE model provides a tool for future investigations of current and novel MCP joint prostheses.

  9. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns.

    Directory of Open Access Journals (Sweden)

    Guillaume Chérel

    Models of emergent phenomena are designed to provide an explanation of global-scale phenomena in terms of local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e. making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model's predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic.

  10. Using SST, PDO and SOI for Streamflow Reconstruction

    Science.gov (United States)

    Bukhary, S. S.; Kalra, A.; Ahmad, S.

    2015-12-01

    Recurring droughts in the southwestern U.S., particularly California, have strained the existing water reserves of the region. The frequency, severity and duration of these recurring drought events may not be captured by the available instrumental records. Thus streamflow reconstruction becomes imperative to identify the historic hydroclimatic extremes of a region and assists in developing better water management strategies, vital for the sustainability of water reserves. Tree ring chronologies (TRC) are conventionally used to reconstruct streamflows, since tree rings are representative of climatic information. Studies have shown that sea surface temperature (SST) and the climate indices of the Southern Oscillation Index (SOI) and Pacific Decadal Oscillation (PDO) influence U.S. streamflow volumes. The purpose of this study was to improve the traditional reconstruction methodology by incorporating the oceanic-atmospheric variables of PDO, SOI, and Pacific Ocean SST, along with TRC, as predictors in a step-wise linear regression model. The methodology of singular value decomposition was used to identify teleconnected regions of streamflow and SST. The approach was tested on eleven gage stations in the Sacramento River Basin (SRB) and San Joaquin River Basin (JRB). The reconstructions were successfully generated from 1800-1980, with an overlap period of 1932-1980. Improved results were exhibited when using the predictor variable of SST along with TRC (calibration r2=0.6-0.91) compared to when using TRC in combination with SOI and PDO (calibration r2=0.51-0.78) or when using TRC by itself (calibration r2=0.51-0.86). For future work, this approach can be replicated for other watersheds by using the oceanic-atmospheric climate variables influencing that region.
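    A rough sketch of the predictor construction described here is given below: SVD of the SST-streamflow cross-covariance supplies the expansion coefficients of the teleconnected SST modes, which are then combined with tree-ring chronologies in a regression. Plain least squares stands in for the study's step-wise selection, and all array shapes and function names are assumptions.

```python
import numpy as np

def reconstruction_predictors(sst_anom, flow_anom, tree_rings, n_modes=2):
    """Combine tree-ring chronologies with the expansion coefficients of the
    leading SST modes that covary with streamflow.

    sst_anom  : (years x grid cells) SST anomalies over the calibration period
    flow_anom : (years x gauges) streamflow anomalies
    tree_rings: (years x chronologies) tree-ring predictors"""
    C = sst_anom.T @ flow_anom / (len(flow_anom) - 1)   # cross-covariance (cells x gauges)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    sst_pcs = sst_anom @ U[:, :n_modes]                  # teleconnected SST expansion coefficients
    return np.column_stack([tree_rings, sst_pcs])

def fit_reconstruction(X, flow):
    """Least-squares fit over the calibration overlap (stand-in for step-wise selection)."""
    A = np.column_stack([np.ones(len(flow)), X])
    beta, *_ = np.linalg.lstsq(A, flow, rcond=None)
    return beta
```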

  11. Predicting the ungauged basin: model validation and realism assessment

    Science.gov (United States)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2016-04-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) [1] led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this study [2] we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. We do not present a generic approach that can be transferred to other ungauged catchments, but we aim to show how clever model design and alternative data acquisition can result in a valuable hydrological model for ungauged catchments. [1] Sivapalan, M., Takeuchi, K., Franks, S., Gupta, V., Karambiri, H., Lakshmi, V., et al. (2003). IAHS decade on predictions in ungauged basins (PUB), 2003-2012: shaping an exciting future for the hydrological sciences. Hydrol. Sci. J. 48, 857-880. doi: 10.1623/hysj.48.6.857.51421 [2] van Emmerik, T., Mulder, G., Eilander, D., Piet, M. and Savenije, H. (2015). Predicting the ungauged basin: model validation and realism assessment

  12. A verification and validation process for model-driven engineering

    Science.gov (United States)

    Delmas, R.; Pires, A. F.; Polacsek, T.

    2013-12-01

    Model Driven Engineering practitioners already benefit from many well established verification tools, for Object Constraint Language (OCL), for instance. Recently, constraint satisfaction techniques have been brought to Model-Driven Engineering (MDE) and have shown promising results on model verification tasks. With all these tools, it becomes possible to provide users with formal support from early model design phases to model instantiation phases. In this paper, a selection of such tools and methods is presented, and an attempt is made to define a verification and validation process for model design and instance creation centered on UML (Unified Modeling Language) class diagrams and declarative constraints, and involving the selected tools. The suggested process is illustrated with a simple example.

  13. Has streamflow changed in the Nordic countries?

    Energy Technology Data Exchange (ETDEWEB)

    Hisdal, Hege; Holmqvist, Erik; Jonsdottir, Jona Finndis; Jonsson, Pall; Kuusisto, Esko; Lindstroem, Goeran; Roald, Lars A.

    2010-01-15

    Climate change studies traditionally include elaboration of possible scenarios for the future and attempts to detect a climate change signal in historical data. This study focuses on the latter. A pan-Nordic dataset of more than 160 streamflow records was analysed to detect spatial and temporal changes in streamflow. The Mann-Kendall trend test was applied to study changes in annual and seasonal streamflow as well as floods and droughts for three periods: 1961-2000, 1941-2002 and 1920-2002. The period analysed and the selection of stations influenced the regional patterns found, but the overall picture was that trends towards increased streamflow dominated for annual values and the winter and spring seasons. Trends in summer flow depended strongly on the period analysed, whereas no trend was found for the autumn season. A signal towards earlier snowmelt floods was clear, and a tendency towards more severe summer droughts was found in southern Norway. A qualitative comparison of the findings with available streamflow scenarios for the region showed that the strongest trends found are coherent with changes expected in the scenario period, for example increased winter discharge and earlier snowmelt floods. However, there are also expected changes that are not reflected in the trends, such as the expected increase in autumn discharge in Norway. It can be concluded that the observed temperature increase has clearly affected streamflow in the Nordic countries. These changes correspond well with the estimated consequences of a projected temperature increase. The effect of the observed and projected precipitation increase on streamflow is less clear. (Author)
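    For reference, the Mann-Kendall trend test applied in this study can be computed as in the sketch below (normal approximation, without the tie or serial-correlation corrections that a full analysis would include).

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall test for a monotonic trend in an annual or seasonal
    streamflow series. Returns the S statistic and a two-sided p-value
    (normal approximation, no correction for ties or serial correlation)."""
    x = np.asarray(x, float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - stats.norm.cdf(abs(z)))
    return s, p
```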

  14. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available then a mixed hardening model should be used

  15. Validation of Advanced EM Models for UXO Discrimination

    CERN Document Server

    Weichman, Peter B

    2012-01-01

    The work reported here details basic validation of our advanced physics-based EMI forward and inverse models against data collected by the NRL TEMTADS system. The data was collected under laboratory-type conditions using both artificial spheroidal targets and real UXO. The artificial target models are essentially exact, and enable detailed comparison of theory and data in support of measurement platform characterization and target identification. Real UXO targets cannot be treated exactly, but it is demonstrated that quantitative comparisons of the data with the spheroid models nevertheless aids in extracting key target discrimination information, such as target geometry and hollow target shell thickness.

  16. Trends and sensitivities of low streamflow extremes to discharge timing and magnitude in Pacific Northwest mountain streams

    Science.gov (United States)

    Kormos, Patrick R.; Luce, Charles H.; Wenger, Seth J.; Berghuijs, Wouter R.

    2016-07-01

    Path analyses of historical streamflow data from the Pacific Northwest indicate that the precipitation amount has been the dominant control on the magnitude of low streamflow extremes compared to the air temperature-affected timing of snowmelt runoff. The relative sensitivities of low streamflow to precipitation and temperature changes have important implications for adaptation planning because global circulation models produce relatively robust estimates of air temperature changes but have large uncertainties in projected precipitation amounts in the Pacific Northwest U.S. Quantile regression analyses indicate that low streamflow extremes from the majority of catchments in this study have declined from 1948 to 2013, which may significantly affect terrestrial and aquatic ecosystems, and water resource management. Trends in the 25th percentile of mean annual streamflow have declined and the center of timing has occurred earlier. We quantify the relative influences of total precipitation and air temperature on the annual low streamflow extremes from 42 stream gauges using mean annual streamflow as a proxy for precipitation amount effects and streamflow center of timing as a proxy for temperature effects on low flow metrics, including 7q10 summer (the minimum 7 day flow during summer with a 10 year return period), mean August, mean September, mean summer, 7q10 winter, and mean winter flow metrics. These methods have the benefit of using only readily available streamflow data, which makes our results robust against systematic errors in high elevation distributed precipitation data. Winter low flow metrics are weakly tied to both mean annual streamflow and center of timing.
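    The low-flow statistics and quantile-regression trends used in this record can be illustrated as follows. The sketch computes annual 7-day minimum flows and the trend of a chosen flow quantile with statsmodels' quantile regression; it does not perform the frequency analysis needed for the 10-year return period of the 7q10 statistic, and the function names are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

def annual_7day_min(daily_flow):
    """Annual minimum of the 7-day moving-average flow (the '7q' statistic)
    from a daily pandas Series indexed by date."""
    q7 = daily_flow.rolling(7, center=True).mean()
    return q7.groupby(q7.index.year).min()

def quantile_trend(annual_series, q=0.25):
    """Slope of the q-th flow quantile over time, estimated by quantile regression."""
    years = annual_series.index.values.astype(float)
    X = sm.add_constant(years - years.mean())
    res = sm.QuantReg(annual_series.values, X).fit(q=q)
    return np.asarray(res.params)[1]   # change in the q-th quantile per year
```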

  17. Experimental validation of a solar-chimney power plant model

    Science.gov (United States)

    Fathi, Nima; Wayne, Patrick; Trueba Monje, Ignacio; Vorobieff, Peter

    2016-11-01

    In a solar chimney power plant system (SCPPS), the energy of buoyant hot air is converted to electrical energy. SCPPS includes a collector at ground level covered with a transparent roof. Solar radiation heats the air inside and the ground underneath. There is a tall chimney at the center of the collector, and a turbine located at the base of the chimney. The lack of detailed experimental data for validation is one of the important issues in modeling this type of power plant. We present a small-scale experimental prototype developed to perform validation analysis for modeling and simulation of SCPPS. Detailed velocity measurements are acquired using particle image velocimetry (PIV) at a prescribed Reynolds number. Convection is driven by a temperature-controlled hot plate at the bottom of the prototype. Velocity field data are used to perform validation analysis and measure any mismatch between the experimental results and the CFD data. CFD code verification is also performed to assess the uncertainty of the numerical model with respect to our grid and the applied mathematical model. The dimensionless output power of the prototype is calculated and compared with a recent analytical solution and the experimental results.

  18. Development and validation of a building design waste reduction model.

    Science.gov (United States)

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings.

  19. Calibration of Predictor Models Using Multiple Validation Experiments

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This paper presents a framework for calibrating computational models using data from several and possibly dissimilar validation experiments. The offset between model predictions and observations, which might be caused by measurement noise, model-form uncertainty, and numerical error, drives the process by which uncertainty in the model's parameters is characterized. The resulting description of uncertainty along with the computational model constitute a predictor model. Two types of predictor models are studied: Interval Predictor Models (IPMs) and Random Predictor Models (RPMs). IPMs use sets to characterize uncertainty, whereas RPMs use random vectors. The propagation of a set through a model makes the response an interval-valued function of the state, whereas the propagation of a random vector yields a random process. Optimization-based strategies for calculating both types of predictor models are proposed. Whereas the formulations used to calculate IPMs target solutions leading to the interval-valued function of minimal spread containing all observations, those for RPMs seek to maximize the models' ability to reproduce the distribution of observations. Regarding RPMs, we choose a structure for the random vector (i.e., the assignment of probability to points in the parameter space) solely dependent on the prediction error. As such, the probabilistic description of uncertainty is not a subjective assignment of belief, nor is it expected to asymptotically converge to a fixed value, but instead it casts the model's ability to reproduce the experimental data. This framework enables evaluating the spread and distribution of the predicted response of target applications depending on the same parameters beyond the validation domain.

  20. Finite Element Model and Validation of Nasal Tip Deformation.

    Science.gov (United States)

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F

    2017-03-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach to understanding the mechanics and nuances of nasal tip support and, eventually, nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded with an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded state. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.
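    A sketch of the surface comparison reported here (Hausdorff and mean closest-point distances between the photogrammetry point cloud and the FE prediction) is given below; it assumes modest point counts so the brute-force pairwise distances fit in memory, and it is not the authors' processing pipeline.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def surface_agreement(measured_pts, simulated_pts):
    """Compare a photogrammetry point cloud with FE-predicted surface points
    (both (n, 3) NumPy arrays in the same units, e.g. mm).

    Returns the symmetric Hausdorff distance (worst-case deviation) and the
    mean closest-point distance from measurement to simulation."""
    d_ab = directed_hausdorff(measured_pts, simulated_pts)[0]
    d_ba = directed_hausdorff(simulated_pts, measured_pts)[0]
    # brute-force nearest-neighbour distances; fine for modest point counts
    diffs = measured_pts[:, None, :] - simulated_pts[None, :, :]
    nearest = np.sqrt((diffs ** 2).sum(axis=-1)).min(axis=1)
    return max(d_ab, d_ba), nearest.mean()
```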

  1. Escherichia coli bacteria density in relation to turbidity, streamflow characteristics, and season in the Chattahoochee River near Atlanta, Georgia, October 2000 through September 2008—Description, statistical analysis, and predictive modeling

    Science.gov (United States)

    Lawrence, Stephen J.

    2012-01-01

    Water-based recreation—such as rafting, canoeing, and fishing—is popular among visitors to the Chattahoochee River National Recreation Area (CRNRA) in north Georgia. The CRNRA is a 48-mile reach of the Chattahoochee River upstream from Atlanta, Georgia, managed by the National Park Service (NPS). Historically, high densities of fecal-indicator bacteria have been documented in the Chattahoochee River and its tributaries at levels that commonly exceeded Georgia water-quality standards. In October 2000, the NPS partnered with the U.S. Geological Survey (USGS), State and local agencies, and non-governmental organizations to monitor Escherichia coli bacteria (E. coli) density and develop a system to alert river users when E. coli densities exceeded the U.S. Environmental Protection Agency (USEPA) single-sample beach criterion of 235 colonies (most probable number) per 100 milliliters (MPN/100 mL) of water. This program, called BacteriALERT, monitors E. coli density, turbidity, and water temperature at two sites on the Chattahoochee River upstream from Atlanta, Georgia. This report summarizes E. coli bacteria density and turbidity values in water samples collected between 2000 and 2008 as part of the BacteriALERT program; describes the relations between E. coli density and turbidity, streamflow characteristics, and season; and describes the regression analyses used to develop predictive models that estimate E. coli density in real time at both sampling sites.

  2. The implications of climate change scenario selection for future streamflow projection in the Upper Colorado River Basin

    OpenAIRE

    B. L. Harding; A. W. Wood; Prairie, J. R.

    2012-01-01

    The impact of projected 21st century climate conditions on streamflow in the Upper Colorado River Basin was estimated using a multi-model ensemble approach wherein the downscaled outputs of 112 future climate projections from 16 global climate models (GCMs) were used to drive a macroscale hydrology model. By the middle of the century, the impacts on streamflow range, over the entire ensemble, from a decrease of approximately 30% to an increase of approximately the same magnitude. Although pri...

  3. Validation of the Osteopenia Sheep Model for Orthopaedic Biomaterial Research

    DEFF Research Database (Denmark)

    Ding, Ming; Danielsen, C.C.; Cheng, L.

    2009-01-01

    Validation of the Osteopenia Sheep Model for Orthopaedic Biomaterial Research +1Ding, M; 2Danielsen, CC; 1Cheng, L; 3Bollen, P; 4Schwarz, P; 1Overgaard, S +1Dept of Orthopaedics O, Odense University Hospital, Denmark, 2Dept of Connective Tissue Biology, University of Aarhus, Denmark, 3Biomedicine...... Lab, University of Southern Denmark, 4Dept of Geriatrics, Glostrup University Hospital, Denmark ming.ding@ouh.regionsyddanmark.dk Introduction: Currently, the majority of orthopaedic prosthesis and biomaterial research has been based on investigations in normal animals. In most clinical situations, most...... resemble osteoporosis in humans. This study aimed to validate a glucocorticoid-induced osteopenia sheep model for orthopaedic implant and biomaterial research. We hypothesized that a 7-month GC treatment together with a restricted diet, but without OVX, would induce osteopenia. Materials and Methods: Eighteen...

  4. Seine estuary modelling and AirSWOT measurements validation

    Science.gov (United States)

    Chevalier, Laetitia; Lyard, Florent; Laignel, Benoit

    2013-04-01

    In the context of global climate change, knowing water fluxes and storage, from the global scale to the local scale, is a crucial issue. The future satellite SWOT (Surface Water and Ocean Topography) mission, dedicated to surface water observation, is proposed to meet this challenge. SWOT's main payload will be a Ka-band Radar Interferometer (KaRIn). To validate this new kind of measurement, preparatory airborne campaigns (called AirSWOT) are currently being designed. AirSWOT will carry an interferometer similar to KaRIn: KaSPAR (Ka-band SWOT Phenomenology Airborne Radar). Some campaigns are planned in France in 2014. During these campaigns, the plane will fly over the Seine River basin, especially to observe its estuary, the upstream river main channel (to quantify river-aquifer exchange) and some wetlands. The objective of the present work is to validate the capabilities of AirSWOT and SWOT using a hydrodynamic model of the Seine estuary. In this context, field measurements will be collected by different teams such as GIP (Public Interest Group) Seine Aval, the GPMR (Rouen Seaport), SHOM (Hydrographic and Oceanographic Service of the Navy), the IFREMER (French Research Institute for Sea Exploitation), Mercator-Ocean, LEGOS (Laboratory of Space Study in Geophysics and Oceanography), ADES (Data Access Groundwater) ... . These datasets will be used first to locally validate AirSWOT measurements, and then to improve hydrodynamic simulations (using tidal boundary conditions, river and groundwater inflows ...) for 2D validation of AirSWOT data. This modelling will also be used to estimate the benefit of the future SWOT mission for mid-latitude river hydrology. For this modelling, the TUGOm barotropic model (Toulouse Unstructured Grid Ocean model 2D) is used. Preliminary simulations have been performed by first modelling and then combining two different regions: first the Seine River and its estuarine area, and secondly the English Channel. These two simulations are currently being

  5. A community diagnostic tool for chemistry climate model validation

    Directory of Open Access Journals (Sweden)

    A. Gettelman

    2012-09-01

    This technical note presents an overview of the Chemistry-Climate Model Validation Diagnostic (CCMVal-Diag) tool for model evaluation. The CCMVal-Diag tool is a flexible and extensible open-source package that facilitates the complex evaluation of global models. Models can be compared to other models, ensemble members (simulations with the same model), and/or many types of observations. The initial construction and application is to coupled chemistry-climate models (CCMs) participating in CCMVal, but the evaluation of climate models that submitted output to the Coupled Model Intercomparison Project (CMIP) is also possible. The package has been used to assist with analysis of simulations for the 2010 WMO/UNEP Scientific Ozone Assessment and the SPARC Report on the Evaluation of CCMs. The CCMVal-Diag tool is described and examples of how it functions are presented, along with links to detailed descriptions, instructions and source code. The CCMVal-Diag tool supports model development as well as quantifies model changes, both for different versions of individual models and for different generations of community-wide collections of models used in international assessments. The code allows further extensions by different users for different applications and types, e.g. to other components of the Earth system. User modifications are encouraged and easy to perform with minimum coding.

  6. Certified reduced basis model validation: A frequentistic uncertainty framework

    OpenAIRE

    Patera, A. T.; Huynh, Dinh Bao Phuong; Knezevic, David; Patera, Anthony T.

    2011-01-01

    We introduce a frequentistic validation framework for assessment — acceptance or rejection — of the consistency of a proposed parametrized partial differential equation model with respect to (noisy) experimental data from a physical system. Our method builds upon the Hotelling T² statistical hypothesis test for bias first introduced by Balci and Sargent in 1984 and subsequently extended by McFarland and Mahadevan (2008). Our approach introduces two new elements: a spectral repre...

  7. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases...... across the boundary layer near the trailing edge and to a lesser extent by a smaller boundary layer displacement thickness. ©2010 American Society of Mechanical Engineers...

  8. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is however a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence their quality, plus the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closeness of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
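    Two of the polygon-level checks mentioned in this record (ring closure and planarity) can be sketched as below. The tolerance values are placeholders, not the ones discussed in the paper, and the checks are simplified stand-ins for the CityDoctor implementation.

```python
import numpy as np

def ring_is_closed(ring, tol=1e-6):
    """A valid linear ring must end where it starts (within tolerance)."""
    pts = np.asarray(ring, float)
    return bool(np.linalg.norm(pts[0] - pts[-1]) <= tol)

def ring_is_planar(ring, tol=0.01):
    """Planarity check: every vertex of the (n, 3) ring must lie within `tol`
    (model units, e.g. metres) of the best-fit plane through the vertices."""
    pts = np.asarray(ring, float)
    centered = pts - pts.mean(axis=0)
    # plane normal = right singular vector belonging to the smallest singular value
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    distances = np.abs(centered @ vt[-1])
    return bool(distances.max() <= tol)
```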

  9. A validated approach for modeling collapse of steel structures

    Science.gov (United States)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during their lifetime, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state of the art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are

  10. Full-scale validation of a model of algal productivity.

    Science.gov (United States)

    Béchet, Quentin; Shilton, Andy; Guieysse, Benoit

    2014-12-02

    While modeling algal productivity outdoors is crucial to assess the economic and environmental performance of full-scale cultivation, most of the models hitherto developed for this purpose have not been validated under fully relevant conditions, especially with regard to temperature variations. The objective of this study was to independently validate a model of algal biomass productivity accounting for both light and temperature and constructed using parameters experimentally derived from short-term indoor experiments. To do this, the accuracy of a model developed for Chlorella vulgaris was assessed against data collected from photobioreactors operated outdoors (New Zealand) over different seasons, years, and operating conditions (temperature-control/no temperature-control, batch, and fed-batch regimes). The model accurately predicted experimental productivities under all conditions tested, yielding an overall accuracy of ±8.4% over 148 days of cultivation. For the purpose of assessing the feasibility of full-scale algal cultivation, the use of the productivity model was therefore shown to markedly reduce uncertainty in the cost of biofuel production while also eliminating uncertainties in water demand, a critical element of environmental impact assessments. Simulations at five climatic locations demonstrated that temperature control in outdoor photobioreactors would require tremendous amounts of energy without a considerable increase in algal biomass. Prior assessments neglecting the impact of temperature variations on algal productivity in photobioreactors may therefore be erroneous.
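    As background to the light- and temperature-dependent productivity modeling validated here, the sketch below shows a generic form P = Pmax * f(I) * g(T) with a Monod light response and a cardinal temperature model. The functional forms and parameter values are placeholders and are not those of the cited Chlorella vulgaris model.

```python
import numpy as np

def productivity(I, T, p_max=1.0, k_I=150.0, t_min=5.0, t_opt=30.0, t_max=42.0):
    """Generic areal productivity P = Pmax * f(I) * g(T): Monod-type light
    response and the cardinal temperature model (CTMI). All parameter values
    here are placeholders, not those of the cited study."""
    I = np.asarray(I, float)
    T = np.asarray(T, float)
    f_light = I / (I + k_I)
    num = (T - t_max) * (T - t_min) ** 2
    den = (t_opt - t_min) * ((t_opt - t_min) * (T - t_opt)
                             - (t_opt - t_max) * (t_opt + t_min - 2.0 * T))
    with np.errstate(divide="ignore", invalid="ignore"):
        g_temp = np.where((T > t_min) & (T < t_max), num / den, 0.0)
    return p_max * f_light * np.clip(g_temp, 0.0, 1.0)
```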

  11. On the contribution of groundwater storage to interannual streamflow anomalies in the Colorado River basin

    Directory of Open Access Journals (Sweden)

    E. A. Rosenberg

    2012-11-01

    Full Text Available We assess the significance of groundwater storage for seasonal streamflow forecasts by evaluating its contribution to interannual streamflow anomalies in the 29 tributary sub-basins of the Colorado River. Monthly and annual changes in total basin storage are simulated by two implementations of the Variable Infiltration Capacity (VIC macroscale hydrology model – the standard release of the model, and an alternate version that has been modified to include the SIMple Groundwater Model (SIMGM, which represents an unconfined aquifer underlying the soil column. These estimates are compared to those resulting from basin-scale water balances derived exclusively from observational data and changes in terrestrial water storage from the Gravity Recovery and Climate Experiment (GRACE satellites. Changes in simulated groundwater storage are then compared to those derived via baseflow recession analysis for 72 reference-quality watersheds. Finally, estimates are statistically analyzed for relationships to interannual streamflow anomalies, and predictive capacities are compared across storage terms. We find that both model simulations result in similar estimates of total basin storage change, that these estimates compare favorably with those obtained from basin-scale water balances and GRACE data, and that baseflow recession analyses are consistent with simulated changes in groundwater storage. Statistical analyses reveal essentially no relationship between groundwater storage and interannual streamflow anomalies, suggesting that operational seasonal streamflow forecasts, which do not account for groundwater conditions implicitly or explicitly, are likely not detrimentally affected by this omission in the Colorado River basin.
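
    The baseflow recession analysis used above as an independent check on simulated groundwater storage can be sketched as follows. This is a minimal illustration assuming daily streamflow data and a power-law storage-outflow relation fitted in log space; the event-selection rules of the actual study are not reproduced.

```python
import numpy as np

def recession_analysis(q, min_length=5):
    """Fit a power-law recession -dQ/dt = a * Q**b from daily streamflow.

    q          -- 1-D array of daily streamflow
    min_length -- only use uninterrupted recessions of at least this length
    Returns (a, b) from a least-squares fit in log space.
    """
    dq = np.diff(q)
    receding = dq < 0                      # days with declining flow
    x, y, run = [], [], 0
    for i, rec in enumerate(receding):
        run = run + 1 if rec else 0
        if run >= min_length:
            x.append(np.log(q[i]))         # log Q
            y.append(np.log(-dq[i]))       # log(-dQ/dt)
    b, log_a = np.polyfit(x, y, 1)         # slope = b, intercept = log a
    return np.exp(log_a), b
```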

  13. Regime-shifting streamflow processes: Implications for water supply reservoir operations

    Science.gov (United States)

    Turner, S. W. D.; Galelli, S.

    2016-05-01

    This paper examines the extent to which regime-like behavior in streamflow time series impacts reservoir operating policy performance. We begin by incorporating a regime state variable into a well-established stochastic dynamic programming model. We then simulate and compare optimized release policies—with and without the regime state variable—to understand how regime shifts affect operating performance in terms of meeting water delivery targets. Our optimization approach uses a Hidden Markov Model to partition the streamflow time series into a small number of separate regime states. The streamflow persistence structures associated with each state define separate month-to-month streamflow transition probability matrices for computing penalty cost expectations within the optimization procedure. The algorithm generates a four-dimensional array of release decisions conditioned on the within-year time period, reservoir storage state, inflow class, and underlying regime state. Our computational experiment is executed on 99 distinct, hypothetical water supply reservoirs fashioned from the Australian Bureau of Meteorology's Hydrologic Reference Stations. Results show that regime-like behavior is a major cause of suboptimal operations in water supply reservoirs; conventional techniques for optimal policy design may misguide the operator, particularly in regions susceptible to multiyear drought. Stationary streamflow models that allow for regime-like behavior can be incorporated into traditional stochastic optimization models to enhance the flexibility of operations.
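
    A hidden Markov regime identification of the kind described above can be sketched as follows. The example assumes the Python hmmlearn package and quantile-based inflow classes; it only illustrates how regime labels can condition month-to-month transition matrices and is not the paper's implementation.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

def regime_conditioned_transitions(q_monthly, n_regimes=2, n_classes=5):
    """Label each month with a hidden regime state, then build one
    inflow-class transition probability matrix per regime."""
    x = np.log(np.asarray(q_monthly)).reshape(-1, 1)     # work with log flows
    hmm = GaussianHMM(n_components=n_regimes, n_iter=200, random_state=0)
    hmm.fit(x)
    regimes = hmm.predict(x)                             # most likely regime per month

    # Discretise flows into classes by quantile
    edges = np.quantile(q_monthly, np.linspace(0, 1, n_classes + 1))
    classes = np.digitize(q_monthly, edges[1:-1])        # 0 .. n_classes-1

    mats = np.zeros((n_regimes, n_classes, n_classes))
    for t in range(len(classes) - 1):
        mats[regimes[t], classes[t], classes[t + 1]] += 1
    # Normalise rows to probabilities, leaving empty rows at zero
    row_sums = mats.sum(axis=2, keepdims=True)
    probs = np.divide(mats, row_sums, out=np.zeros_like(mats), where=row_sums > 0)
    return probs, regimes
```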

  14. Validation of a Hertzian contact model with nonlinear damping

    Science.gov (United States)

    Sierakowski, Adam

    2015-11-01

    Due to limited spatial resolution, most disperse particle simulation methods rely on simplified models for incorporating short-range particle interactions. In this presentation, we introduce a contact model that combines the Hertz elastic restoring force with a nonlinear damping force, requiring only material properties and no tunable parameters. We have implemented the model in a resolved-particle flow solver based on the Physalis method, which accurately captures hydrodynamic interactions by analytically enforcing the no-slip condition on the particle surface. We summarize the results of a few numerical studies that suggest the validity of the contact model over a range of particle interaction intensities (i.e., collision Stokes numbers) when compared with experimental data. This work was supported by the National Science Foundation under Grant Number CBET1335965 and the Johns Hopkins University Modeling Complex Systems IGERT program.
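
    A contact force of the general form described above, a Hertzian restoring term plus a nonlinear damping term, can be sketched as follows. The Hunt-Crossley-type damping term and the dissipation coefficient c are illustrative assumptions; the exact damping formulation of the presented model may differ.

```python
import numpy as np

def hertz_stiffness(E1, nu1, E2, nu2, R1, R2):
    """Effective Hertz contact stiffness for two elastic spheres."""
    E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
    R_star = 1.0 / (1.0 / R1 + 1.0 / R2)
    return (4.0 / 3.0) * E_star * np.sqrt(R_star)

def contact_force(delta, delta_dot, k, c):
    """Normal contact force: Hertz restoring term plus a Hunt-Crossley-type
    nonlinear damping term (damping scales with the elastic deformation, so
    the force vanishes smoothly at first and last contact).

    delta     -- overlap (m); force is zero when delta <= 0
    delta_dot -- overlap rate (m/s)
    k         -- Hertz stiffness from hertz_stiffness()
    c         -- dissipation coefficient (illustrative; in the actual model
                 it would follow from material properties only)
    """
    if delta <= 0.0:
        return 0.0
    return k * delta**1.5 + c * k * delta**1.5 * delta_dot
```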

  15. Statistical validation of high-dimensional models of growing networks

    CERN Document Server

    Medo, Matus

    2013-01-01

    The abundance of models of complex networks and the current insufficient validation standards make it difficult to judge which models are strongly supported by data and which are not. We focus here on likelihood maximization methods for models of growing networks with many parameters and compare their performance on artificial and real datasets. While high dimensionality of the parameter space harms the performance of direct likelihood maximization on artificial data, this can be improved by introducing a suitable penalization term. Likelihood maximization on real data shows that the presented approach is able to discriminate among available network models. To make large-scale datasets accessible to this kind of analysis, we propose a subset sampling technique and show that it yields substantial model evidence in a fraction of the time necessary for the analysis of the complete data.

  16. Filament winding cylinders. II - Validation of the process model

    Science.gov (United States)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  17. Model selection, identification and validation in anaerobic digestion: a review.

    Science.gov (United States)

    Donoso-Bravo, Andres; Mailier, Johan; Martin, Cristina; Rodríguez, Jorge; Aceves-Lara, César Arturo; Vande Wouwer, Alain

    2011-11-01

    Anaerobic digestion enables waste (water) treatment and energy production in the form of biogas. The successful implementation of this process has led to increasing interest worldwide. However, anaerobic digestion is a complex biological process, where hundreds of microbial populations are involved, and whose start-up and operation are delicate issues. In order to better understand the process dynamics and to optimize the operating conditions, the availability of dynamic models is of paramount importance. Such models have to be inferred from prior knowledge and experimental data collected from real plants. Modeling and parameter identification are vast subjects, offering a wide range of approaches and methods, which can be difficult to fully grasp for scientists and engineers dedicated to plant operation and improvement. This review article discusses existing modeling frameworks and methodologies for parameter estimation and model validation in the field of anaerobic digestion processes. The point of view is pragmatic, intentionally focusing on simple but efficient methods.

  18. Validation of the WATEQ4 geochemical model for uranium

    Energy Technology Data Exchange (ETDEWEB)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite (UO₂(OH)₂·H₂O), UO₂(OH)₂, and rutherfordine (UO₂CO₃) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions.

  19. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed metabolic clearance of 0.277 L/h per 70 kg, a renal clearance of 0.698 (±0.358) as a fraction of creatinine clearance, and a Vd of 0.312 (±0.076) L/kg of corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise using the existing intensive-care model in clinical practice to avoid
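
    A minimal sketch of how the reported population parameters could be turned into individual predictions and validation statistics is given below. It assumes a one-compartment model with a zero-order infusion and scales metabolic clearance with total body weight; only the population means are taken from the abstract, everything else is illustrative.

```python
import numpy as np

def individual_parameters(crcl_l_per_h, weight_kg, lbm_kg):
    """Individual clearance (L/h) and volume (L) from the reported population
    means: metabolic CL 0.277 L/h per 70 kg, renal CL 0.698 x creatinine
    clearance, Vd 0.312 L/kg corrected lean body mass.  The linear weight
    scaling of metabolic clearance is an assumption for illustration."""
    cl = 0.277 * weight_kg / 70.0 + 0.698 * crcl_l_per_h
    vd = 0.312 * lbm_kg
    return cl, vd

def concentration(t, dose_mg, t_inf_h, cl, vd):
    """One-compartment model, zero-order infusion of duration t_inf_h."""
    k = cl / vd
    rate = dose_mg / t_inf_h
    if t <= t_inf_h:
        return rate / cl * (1 - np.exp(-k * t))
    c_end = rate / cl * (1 - np.exp(-k * t_inf_h))
    return c_end * np.exp(-k * (t - t_inf_h))

def mdpe_mdape(predicted, observed):
    """Median prediction error and median absolute prediction error (%)."""
    pe = 100.0 * (np.asarray(predicted) - np.asarray(observed)) / np.asarray(observed)
    return np.median(pe), np.median(np.abs(pe))
```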

  20. Experimental validation of flexible robot arm modeling and control

    Science.gov (United States)

    Ulsoy, A. Galip

    1989-01-01

    Flexibility is important for high speed, high precision operation of lightweight manipulators. Accurate dynamic modeling of flexible robot arms is needed. Previous work has mostly been based on linear elasticity with prescribed rigid body motions (i.e., no effect of flexible motion on rigid body motion). Little or no experimental validation of dynamic models for flexible arms is available. Experimental results are also limited for flexible arm control. Researchers include the effects of prismatic as well as revolute joints. They investigate the effect of full coupling between the rigid and flexible motions, and of axial shortening, and consider the control of flexible arms using only additional sensors.

  1. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because complete system knowledge, from input voltage to output sound pressure level, is required. ... There are, however, many advantages that could be harvested from such knowledge, such as size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated, and the model is expanded to include closed and vented type enclosures ...

  2. Validation of a Business Model for Cultural Heritage Institutions

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2015-01-01

    Full Text Available The paper proposes a business model for the efficiency optimization of the interaction between all actors involved in the cultural heritage sector, such as galleries, libraries, archives and museums (GLAM). The validation of the business model is the subject of analyses and implementations carried out in a real environment by different cultural institutions. The implementation of virtual exhibitions on mobile devices is described and analyzed as a key factor for increasing the visibility of cultural heritage. New perspectives on the development of virtual exhibitions for mobile devices are considered. A study of the number of visitors to cultural institutions is carried out and ways to increase the number of visitors are described.

  3. Validating a spatially distributed hydrological model with soil morphology data

    Directory of Open Access Journals (Sweden)

    T. Doppler

    2013-10-01

    Full Text Available Spatially distributed hydrological models are popular tools in hydrology and they are claimed to be useful to support management decisions. Despite the high spatial resolution of the computed variables, calibration and validation are often carried out only on discharge time-series at specific locations due to the lack of spatially distributed reference data. Because of this restriction, the predictive power of these models, with regard to predicted spatial patterns, can usually not be judged. An example of spatial predictions in hydrology is the prediction of saturated areas in agricultural catchments. These areas can be important source areas for the transport of agrochemicals to the stream. We set up a spatially distributed model to predict saturated areas in a 1.2 km² catchment in Switzerland with moderate topography. Around 40% of the catchment area is artificially drained. We measured weather data, discharge and groundwater levels in 11 piezometers for 1.5 years. To broaden the spatially distributed data sets that can be used for model calibration and validation, we translated soil morphological data available from soil maps into an estimate of the duration of soil saturation in the soil horizons. We used redox-morphology signs for these estimates. This resulted in a data set with high spatial coverage on which the model predictions were validated. In general, these saturation estimates corresponded well to the measured groundwater levels. We worked with a model that would be applicable for management decisions because of its fast calculation speed and rather low data requirements. We simultaneously calibrated the model to the groundwater levels in the piezometers and to discharge. The model was able to reproduce the general hydrological behavior of the catchment in terms of discharge and absolute groundwater levels. However, the accuracy of the groundwater level predictions was not high enough to be used for the prediction of saturated areas

  4. Reconstructing streamflow variation of the Baker River from tree-rings in Northern Patagonia since 1765

    Science.gov (United States)

    Lara, Antonio; Bahamondez, Alejandra; González-Reyes, Alvaro; Muñoz, Ariel A.; Cuq, Emilio; Ruiz-Gómez, Carolina

    2015-10-01

    Understanding the long-term streamflow variation of large rivers with high economic and social relevance is necessary in order to improve the planning and management of water resources in different regions of the world. The Baker River has the highest mean discharge of those draining both slopes of the Andes south of 20°S, and it is among the six rivers with the highest mean streamflow in the Pacific domain of South America (1100 m³ s⁻¹ at its outlet). It drains an international basin of 29,000 km² shared by Chile and Argentina and has a high ecologic and economic value, including conservation, tourism, recreational fishing, and projected hydropower. This study reconstructs the austral summer to early fall (January-April) streamflow of the Baker River from Nothofagus pumilio tree-rings for the period 1765-2004. Summer streamflow represents 45.2% of the annual discharge. The regression model for the period 1961-2004 explains 54% of the variance of the Baker River streamflow (R²adj = 0.54). The most significant temporal pattern in the record is the sustained decline since the 1980s (τ = -0.633, p = 1.0144 × 10⁻⁵ for the 1985-2004 period), which is unprecedented since 1765. The correlation of the Baker streamflow with the November-April observed Southern Annular Mode (SAM) is significant (1961-2004, r = -0.55, p development of new tree-ring reconstructions to increase the geographic range and to cover the last 1000 or more years using long-lived species (e.g. Fitzroya cupressoides and Pilgerodendron uviferum). Expanding the network and quality of instrumental weather, streamflow and other monitoring stations, as well as the study and modeling of the complex hydrological processes in the Baker basin, is necessary. This should be the basis for planning, policy design and decision making regarding water resources in the Baker basin.

  5. Decomposition of Sources of Errors in Seasonal Streamflow Forecasting over the U.S. Sunbelt

    Science.gov (United States)

    Mazrooei, Amirhossein; Sinah, Tusshar; Sankarasubramanian, A.; Kumar, Sujay V.; Peters-Lidard, Christa D.

    2015-01-01

    Seasonal streamflow forecasts, contingent on climate information, can be utilized to ensure water supply for multiple uses including municipal demand, hydroelectric power generation, and the planning of agricultural operations. However, uncertainties in the streamflow forecasts pose significant challenges in their utilization in real-time operations. In this study, we systematically decompose various sources of errors in developing seasonal streamflow forecasts from two Land Surface Models (LSMs) (Noah3.2 and CLM2), which are forced with downscaled and disaggregated climate forecasts. In particular, the study quantifies the relative contributions of the sources of errors from LSMs, climate forecasts, and downscaling/disaggregation techniques in developing seasonal streamflow forecasts. For this purpose, three-month-ahead seasonal precipitation forecasts from the ECHAM4.5 general circulation model (GCM) were statistically downscaled from 2.8° to 1/8° spatial resolution using principal component regression (PCR) and then temporally disaggregated from monthly to daily time step using a kernel nearest-neighbor (K-NN) approach. For other climatic forcings, excluding precipitation, we considered the North American Land Data Assimilation System version 2 (NLDAS-2) hourly climatology over the years 1979 to 2010. Then the selected LSMs were forced with precipitation forecasts and NLDAS-2 hourly climatology to develop retrospective seasonal streamflow forecasts over a period of 20 years (1991-2010). Finally, the performance of the LSMs in forecasting streamflow under different schemes was analyzed to quantify the relative contribution of various sources of errors in developing seasonal streamflow forecasts. Our results indicate that the dominant source of errors during the winter and fall seasons is the ECHAM4.5 precipitation forecasts, while the temporal disaggregation scheme contributes the largest errors during the summer season.
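
    The principal component regression step of the downscaling chain can be sketched as follows, assuming the scikit-learn package. Grid sizes, the number of retained components, and variable names are placeholders; the K-NN temporal disaggregation that follows in the study is not shown.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def fit_pcr(gcm_precip, obs_precip, n_components=5):
    """Principal component regression: regress fine-scale (observed) monthly
    precipitation on the leading PCs of the coarse GCM precipitation field.

    gcm_precip -- (n_months, n_gcm_cells) coarse-resolution forecast field
    obs_precip -- (n_months,) observed precipitation at one fine-scale cell
    """
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(gcm_precip)       # leading principal components
    reg = LinearRegression().fit(scores, obs_precip)
    return pca, reg

def predict_pcr(pca, reg, gcm_precip_new):
    """Downscaled monthly precipitation for new GCM forecasts."""
    return reg.predict(pca.transform(gcm_precip_new))
```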

  6. Organic acid modeling and model validation: Workshop summary. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.J.; Eilers, J.M.

    1992-08-14

    A workshop was held in Corvallis, Oregon on April 9-10, 1992 at the offices of E&S Environmental Chemistry, Inc. The purpose of this workshop was to initiate research efforts on the project entitled "Incorporation of an organic acid representation into MAGIC (Model of Acidification of Groundwater in Catchments) and testing of the revised model using independent data sources." The workshop was attended by a team of internationally recognized experts in the fields of surface water acid-base chemistry, organic acids, and watershed modeling. The rationale for the proposed research is based on the recent comparison between MAGIC model hindcasts and paleolimnological inferences of historical acidification for a set of 33 statistically selected Adirondack lakes. Agreement between diatom-inferred and MAGIC-hindcast lakewater chemistry in the earlier research had been less than satisfactory. Based on preliminary analyses, it was concluded that incorporation of a reasonable organic acid representation into the version of MAGIC used for hindcasting was the logical next step toward improving model agreement.

  8. Packed bed heat storage: Continuum mechanics model and validation

    Science.gov (United States)

    Knödler, Philipp; Dreißigacker, Volker; Zunft, Stefan

    2016-05-01

    Thermal energy storage (TES) systems are key elements for various types of new power plant concepts. As a possible cost-effective storage inventory option, packed beds of miscellaneous materials come into consideration. However, high technical risks arise due to thermal expansion and shrinking of the packed bed's particles during cyclic thermal operation, possibly leading to material failure. Therefore, suitable tools for designing the heat storage system are mandatory. While particle-discrete models offer detailed simulation results, their computing time is prohibitive for large-scale applications. In contrast, continuous models offer time-efficient simulation results but require effective packed-bed parameters. This work focuses on providing insight into some basic methods and tools for obtaining such parameters and on how they are implemented into a continuum model. In this context, a particle-discrete model as well as a test rig for carrying out uniaxial compression tests (UCT) is introduced. Experimental validation tests indicate good agreement with simulated UCT results. In this process, effective parameters required for a continuous packed bed model were identified and used for continuum simulation. This approach is validated by comparing the simulated results with experimental data from another test rig. The presented method significantly simplifies subsequent design studies.

  9. Multi-site calibration, validation, and sensitivity analysis of the MIKE SHE Model for a large watershed in northern China

    Directory of Open Access Journals (Sweden)

    S. Wang

    2012-12-01

    Full Text Available Model calibration is essential for hydrologic modeling of large watersheds in a heterogeneous mountain environment. Little guidance is available on model calibration protocols for distributed models that aim at capturing the spatial variability of hydrologic processes. This study used the physically based distributed hydrologic model MIKE SHE to contrast a lumped calibration protocol, which used streamflow measured at a single watershed outlet, with a multi-site calibration method, which employed streamflow measurements at three stations within the large Chaohe River basin in northern China. Simulation results showed that the single-site calibrated model was able to sufficiently simulate the hydrographs for two of the three stations (Nash-Sutcliffe coefficient of 0.65–0.75 and correlation coefficient of 0.81–0.87 during the testing period), but the model performed poorly for the third station (Nash-Sutcliffe coefficient of only 0.44). Sensitivity analysis suggested that streamflow in the upstream area of the watershed was dominated by slow groundwater, whilst streamflow in the middle and downstream areas was dominated by relatively quick interflow. Therefore, a multi-site calibration protocol was deemed necessary. Due to the potential errors and uncertainties with respect to the representation of spatial variability, performance measures from the multi-site calibration protocol decreased slightly for two of the three stations, whereas they improved greatly for the third station. We concluded that the multi-site calibration protocol reached a compromise in terms of model performance, reasonably representing the hydrographs of all three stations with Nash-Sutcliffe coefficients ranging from 0.59 to 0.72. The multi-site calibration protocol applied in the analysis generally has advantages over the single-site calibration protocol.
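
    The station-by-station comparison above relies on the Nash-Sutcliffe efficiency; a minimal sketch of computing it for each gauging station is given below (station names and data arrays are placeholders).

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of residual variance to
    the variance of the observations (1 = perfect, 0 = no better than the mean)."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Multi-site evaluation: compute the efficiency separately at each station, e.g.
# stations = {"upstream": (q_obs_up, q_sim_up), "middle": (q_obs_mid, q_sim_mid)}
# scores = {name: nash_sutcliffe(o, s) for name, (o, s) in stations.items()}
```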

  10. Model calibration and validation of an impact test simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, F. M. (François M.); Wilson, A. C. (Amanda C.); Havrilla, G. N. (George N.)

    2001-01-01

    This paper illustrates the methodology being developed at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. The application involves the transmission of a shock wave through an assembly that consists of a steel cylinder and a layer of elastomeric (hyper-foam) material. The assembly is mounted on an impact table to generate the shock wave. The input acceleration and three output accelerations are measured. The main objective of the experiment is to develop a finite element representation of the system capable of reproducing the test data with acceptable accuracy. Foam layers of various thicknesses and several drop heights are considered during impact testing. Each experiment is replicated several times to estimate the experimental variability. Instead of focusing on the calibration of input parameters for a single configuration, the numerical model is validated for its ability to predict the response of three different configurations (various combinations of foam thickness and drop height). Design of Experiments is implemented to perform parametric and statistical variance studies. Surrogate models are developed to replace the computationally expensive numerical simulation. Variables of the finite element model are separated into calibration variables and control variables. The models are calibrated to provide numerical simulations that correctly reproduce the statistical variation of the test configurations. The calibration step also provides inference for the parameters of a high strain-rate dependent material model of the hyper-foam. After calibration, the validity of the numerical simulation is assessed through its ability to predict the response of a fourth test setup.

  11. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10¹ to 10² image descriptors instead of the 10⁵ or 10⁶ pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
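
    The reduction of a full-field map to a handful of descriptors can be sketched with a low-order two-dimensional Fourier decomposition, as below; Zernike moments would be a drop-in alternative. The number of retained modes and the discrepancy measure are illustrative choices, not those of the paper.

```python
import numpy as np

def fourier_descriptors(field, n_modes=10):
    """Reduce a full-field map (2-D array) to the magnitudes of a small block
    of low-order 2-D Fourier coefficients."""
    coeffs = np.fft.fft2(field)
    return np.abs(coeffs[:n_modes, :n_modes]).ravel()

def descriptor_discrepancy(field_exp, field_model, n_modes=10):
    """Relative difference between experimental and model descriptors, a simple
    stand-in for the statistical comparison described in the abstract."""
    d_exp = fourier_descriptors(field_exp, n_modes)
    d_mod = fourier_descriptors(field_model, n_modes)
    return np.linalg.norm(d_exp - d_mod) / np.linalg.norm(d_exp)
```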

  12. Calibration and validation of DRAINMOD to model bioretention hydrology

    Science.gov (United States)

    Brown, R. A.; Skaggs, R. W.; Hunt, W. F.

    2013-04-01

    Previous field studies have shown that the hydrologic performance of bioretention cells varies greatly because of factors such as underlying soil type, physiographic region, drainage configuration, surface storage volume, drainage area to bioretention surface area ratio, and media depth. To more accurately describe bioretention hydrologic response, a long-term hydrologic model that generates a water balance is needed. Some current bioretention models lack the ability to perform long-term simulations and others have never been calibrated against field-monitored bioretention cells with underdrains. All peer-reviewed models lack the ability to simultaneously perform both of the following functions: (1) model an internal water storage (IWS) zone drainage configuration and (2) account for soil-water content using the soil-water characteristic curve. DRAINMOD, a widely accepted agricultural drainage model, was used to simulate the hydrologic response of runoff entering a bioretention cell. The concepts of water movement in bioretention cells are very similar to those of agricultural fields with drainage pipes, so many bioretention design specifications corresponded directly to DRAINMOD inputs. Detailed hydrologic measurements were collected from two bioretention field sites in Nashville and Rocky Mount, North Carolina, to calibrate and test the model. Each field site had two sets of bioretention cells with varying media depths, media types, drainage configurations, underlying soil types, and surface storage volumes. After 12 months, one of these characteristics was altered - surface storage volume at Nashville and IWS zone depth at Rocky Mount. At Nashville, during the second year (post-repair period), the Nash-Sutcliffe coefficients for drainage and exfiltration/evapotranspiration (ET) both exceeded 0.8 during the calibration and validation periods. During the first year (pre-repair period), the Nash-Sutcliffe coefficients for drainage, overflow, and exfiltration

  13. Effects of water-supply reservoirs on streamflow in Massachusetts

    Science.gov (United States)

    Levin, Sara B.

    2016-10-06

    State and local water-resource managers need modeling tools to help them manage and protect water-supply resources for both human consumption and ecological needs. The U.S. Geological Survey, in cooperation with the Massachusetts Department of Environmental Protection, has developed a decision-support tool to estimate the effects of reservoirs on natural streamflow. The Massachusetts Reservoir Simulation Tool is a model that simulates the daily water balance of a reservoir. The reservoir simulation tool provides estimates of daily outflows from reservoirs and compares the frequency, duration, and magnitude of reservoir outflow volumes with estimates of the unaltered streamflow that would occur if no dam were present. This tool will help environmental managers understand the complex interactions and tradeoffs between water withdrawals, reservoir operational practices, and reservoir outflows needed for aquatic habitats. A sensitivity analysis of the daily water balance equation was performed to identify physical and operational features of reservoirs that could have the greatest effect on reservoir outflows. For the purpose of this report, uncontrolled releases of water (spills or spillage) over the reservoir spillway were considered to be a proxy for reservoir outflows directly below the dam. The ratio of average withdrawals to average inflows had the largest effect on spillage patterns, with the highest withdrawals leading to the lowest spillage. The size of the surface area relative to the drainage area of the reservoir also had an effect on spillage; reservoirs with large surface areas have high evaporation rates during the summer, which can contribute to frequent and long periods without spillage, even in the absence of water withdrawals. Other reservoir characteristics, such as variability of inflows, groundwater interactions, and seasonal demand patterns, had low to moderate effects on the frequency, duration, and magnitude of spillage. The
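
    A daily reservoir water balance of the kind performed by the simulation tool can be sketched as follows. Variable names, the ordering of losses, and the treatment of spill as the storage excess above capacity are simplifying assumptions, not the tool's exact accounting.

```python
def simulate_reservoir(inflows, withdrawals, evaporation, capacity, s0):
    """Daily reservoir water balance.  Water above capacity leaves as
    uncontrolled spill (a proxy for outflow below the dam); withdrawals and
    evaporation are limited by available storage.  All volumes share units."""
    storage, spills = s0, []
    for q_in, q_wd, e in zip(inflows, withdrawals, evaporation):
        storage += q_in
        storage -= min(e, storage)            # evaporative loss
        storage -= min(q_wd, storage)         # water-supply withdrawal
        spill = max(0.0, storage - capacity)  # uncontrolled release over spillway
        storage -= spill
        spills.append(spill)
    return spills
```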

  14. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony gives rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's levels of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (McCauley Model, Harvard Model, and the privately-sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin samples (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and to circadian phase. We will calculate the sensitivity and specificity of each model prediction under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.
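
    The planned sensitivity and specificity calculation can be sketched by treating both the model output and the observed PVT lapses as binary outcomes above chosen thresholds; the threshold values below are placeholders.

```python
import numpy as np

def sensitivity_specificity(predicted_score, observed_lapses,
                            score_threshold, lapse_threshold):
    """Binary classification skill of a fatigue-model prediction.

    A period counts as predicted-fatigued when the model score reaches
    score_threshold, and as truly impaired when observed PVT lapses reach
    lapse_threshold (both thresholds are illustrative choices)."""
    pred = np.asarray(predicted_score) >= score_threshold
    true = np.asarray(observed_lapses) >= lapse_threshold
    tp = np.sum(pred & true)
    tn = np.sum(~pred & ~true)
    fp = np.sum(pred & ~true)
    fn = np.sum(~pred & true)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity
```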

  15. Multicomponent aerosol dynamics model UHMA: model development and validation

    Directory of Open Access Journals (Sweden)

    H. Korhonen

    2004-01-01

    Full Text Available A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H₂SO₄-H₂O and ternary H₂SO₄-NH₃-H₂O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3–4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

  17. Multicomponent aerosol dynamics model UHMA: model development and validation

    Science.gov (United States)

    Korhonen, H.; Lehtinen, K. E. J.; Kulmala, M.

    2004-05-01

    A size-segregated aerosol dynamics model UHMA (University of Helsinki Multicomponent Aerosol model) was developed for studies of multicomponent tropospheric aerosol particles. The model includes major aerosol microphysical processes in the atmosphere with a focus on new particle formation and growth; thus it incorporates particle coagulation and multicomponent condensation, applying a revised treatment of condensation flux onto free molecular regime particles and the activation of nanosized clusters by organic vapours (Nano-Köhler theory), as well as recent parameterizations for binary H₂SO₄-H₂O and ternary H₂SO₄-NH₃-H₂O homogeneous nucleation and dry deposition. The representation of particle size distribution can be chosen from three sectional methods: the hybrid method, the moving center method, and the retracking method in which moving sections are retracked to a fixed grid after a certain time interval. All these methods can treat particle emissions and atmospheric transport consistently, and are therefore suitable for use in large scale atmospheric models. In a test simulation against an accurate high resolution solution, all the methods showed reasonable treatment of new particle formation with 20 size sections although the hybrid and the retracking methods suffered from artificial widening of the distribution. The moving center approach, on the other hand, showed extra dents in the particle size distribution and failed to predict the onset of detectable particle formation. In a separate test simulation of an observed nucleation event, the model captured the key qualitative behaviour of the system well. Furthermore, its prediction of the organic volume fraction in newly formed particles, suggesting values as high as 0.5 for 3-4 nm particles and approximately 0.8 for 10 nm particles, agrees with recent indirect composition measurements.

  18. A global evaluation of streamflow drought characteristics

    Directory of Open Access Journals (Sweden)

    A. K. Fleig

    2006-01-01

    Full Text Available How drought is characterised depends on the purpose and region of the study and the available data. In the case of regional applications or global comparisons, a standardisation of the methodology to characterise drought is preferable. In this study the threshold level method in combination with three common pooling procedures is applied to daily streamflow series from a wide range of hydrological regimes. Drought deficit characteristics, such as drought duration and deficit volume, are derived, and the methods are evaluated for their applicability to regional studies. Three dif
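
    The threshold level method referred to above can be sketched as follows, assuming daily streamflow and a fixed threshold such as the flow exceeded 70% of the time (Q70); the pooling procedures compared in the study are not shown.

```python
import numpy as np

def drought_events(q, threshold):
    """Threshold level method: every uninterrupted period with streamflow
    below the threshold is a drought event, characterised by its duration
    (days) and deficit volume (sum of the shortfall below the threshold)."""
    events, duration, deficit = [], 0, 0.0
    for flow in q:
        if flow < threshold:
            duration += 1
            deficit += threshold - flow
        elif duration > 0:
            events.append((duration, deficit))
            duration, deficit = 0, 0.0
    if duration > 0:
        events.append((duration, deficit))
    return events

# A common threshold choice is a flow percentile, e.g. the flow exceeded
# 70% of the time (Q70), i.e. the 30th percentile of the daily flows:
# threshold = np.quantile(daily_flow, 0.30)
```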