WorldWideScience

Sample records for model validation streamflow

  1. STREAMFLOW AND WATER QUALITY REGRESSION MODELING ...

    African Journals Online (AJOL)

    ... downstream Obigbo station show: consistent time-trends in degree of contamination; linear and non-linear relationships for water quality models against total dissolved solids (TDS), total suspended sediment (TSS), chloride, pH and sulphate; and non-linear relationship for streamflow and water quality transport models.

  2. Modeling multisite streamflow dependence with maximum entropy copula

    Science.gov (United States)

    Hao, Z.; Singh, V. P.

    2013-10-01

    Synthetic streamflows at different sites in a river basin are needed for planning, operation, and management of water resources projects. Modeling the temporal and spatial dependence structure of monthly streamflow at different sites is generally required. In this study, the maximum entropy copula method is proposed for multisite monthly streamflow simulation, in which the temporal and spatial dependence structure is imposed as constraints to derive the maximum entropy copula. The monthly streamflows at different sites are then generated by sampling from the conditional distribution. A case study for the generation of monthly streamflow at three sites in the Colorado River basin illustrates the application of the proposed method. Simulated streamflow from the maximum entropy copula is in satisfactory agreement with observed streamflow.
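The copula workflow the abstract describes (characterize marginals, impose spatial dependence, sample, back-transform) can be sketched with a plain Gaussian copula standing in for the paper's maximum entropy copula; function names and parameter choices below are illustrative, not from the paper.

```python
import numpy as np
from math import erf

def spearman_corr(x):
    """Spearman rank correlation matrix of the columns of x."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)
    return np.corrcoef(ranks, rowvar=False)

def simulate_multisite(obs, n_years, seed=0):
    """Generate synthetic flows at several sites with a Gaussian copula:
    sample correlated normals, map to uniforms via the normal CDF, then
    invert the empirical marginal at each site.  obs is (years, sites)."""
    rng = np.random.default_rng(seed)
    n, k = obs.shape
    # convert Spearman rank correlation to the equivalent Gaussian-copula
    # (Pearson) correlation: rho = 2 * sin(pi * r_s / 6)
    rho = 2.0 * np.sin(np.pi * spearman_corr(obs) / 6.0)
    z = rng.multivariate_normal(np.zeros(k), rho, size=n_years)
    u = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))  # normal CDF
    out = np.empty_like(u)
    for j in range(k):            # empirical inverse CDF per site
        out[:, j] = np.quantile(obs[:, j], u[:, j])
    return out
```

Because the back-transform uses the empirical marginal, simulated flows stay within the observed range at each site; a fitted parametric marginal would relax that restriction.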

  3. Free internet datasets for streamflow modelling using SWAT in the Johor river basin, Malaysia

    Science.gov (United States)

    Tan, M. L.

    2014-02-01

Streamflow modelling is a mathematical computational approach that represents the terrestrial hydrological cycle digitally and is used for water resources assessment. However, such modelling endeavours require a large amount of data. Generally, governmental departments produce and maintain these data sets, which can make the data difficult to obtain due to bureaucratic constraints. In some countries, the availability and quality of geospatial and climate datasets remain a critical issue due to factors such as a lack of ground stations, expertise, technology, or financial support, and wartime conditions. To overcome this problem, this research used public domain datasets from the Internet as "input" to a streamflow model. The intention is to simulate daily and monthly streamflow of the Johor River Basin in Malaysia. The model used is the Soil and Water Assessment Tool (SWAT). Free input data, including a digital elevation model (DEM), land use information, and soil and climate data, were used. The model was validated against in-situ streamflow information obtained from the Rantau Panjang station for the year 2006. The coefficient of determination and Nash-Sutcliffe efficiency were 0.35/0.02 for daily simulated streamflow and 0.92/0.21 for monthly simulated streamflow, respectively. The results show that free data can provide a better simulation at a monthly scale than at a daily scale in a tropical region. A sensitivity analysis and calibration procedure should be conducted in order to maximize the "goodness-of-fit" between simulated and observed streamflow. The application of Internet datasets promises acceptable performance of streamflow modelling. This research demonstrates that public domain data are suitable for streamflow modelling in a tropical river basin within acceptable accuracy.
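The two scores reported above (coefficient of determination and Nash-Sutcliffe efficiency) are straightforward to compute; a minimal sketch:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1 is a perfect fit; 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination as the squared Pearson correlation."""
    return float(np.corrcoef(obs, sim)[0, 1] ** 2)
```

Aggregating the daily series to monthly means before scoring reproduces the daily-versus-monthly comparison reported in the abstract.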

  4. Free internet datasets for streamflow modelling using SWAT in the Johor river basin, Malaysia

    International Nuclear Information System (INIS)

    Tan, M L

    2014-01-01

Streamflow modelling is a mathematical computational approach that represents the terrestrial hydrological cycle digitally and is used for water resources assessment. However, such modelling endeavours require a large amount of data. Generally, governmental departments produce and maintain these data sets, which can make the data difficult to obtain due to bureaucratic constraints. In some countries, the availability and quality of geospatial and climate datasets remain a critical issue due to factors such as a lack of ground stations, expertise, technology, or financial support, and wartime conditions. To overcome this problem, this research used public domain datasets from the Internet as "input" to a streamflow model. The intention is to simulate daily and monthly streamflow of the Johor River Basin in Malaysia. The model used is the Soil and Water Assessment Tool (SWAT). Free input data, including a digital elevation model (DEM), land use information, and soil and climate data, were used. The model was validated against in-situ streamflow information obtained from the Rantau Panjang station for the year 2006. The coefficient of determination and Nash-Sutcliffe efficiency were 0.35/0.02 for daily simulated streamflow and 0.92/0.21 for monthly simulated streamflow, respectively. The results show that free data can provide a better simulation at a monthly scale than at a daily scale in a tropical region. A sensitivity analysis and calibration procedure should be conducted in order to maximize the "goodness-of-fit" between simulated and observed streamflow. The application of Internet datasets promises acceptable performance of streamflow modelling. This research demonstrates that public domain data are suitable for streamflow modelling in a tropical river basin within acceptable accuracy.

  5. An Hourly Streamflow Forecasting Model Coupled with an Enforced Learning Strategy

    Directory of Open Access Journals (Sweden)

    Ming-Chang Wu

    2015-10-01

Floods, one of the most significant natural hazards, often result in loss of life and property. Accurate hourly streamflow forecasting is always a key issue in hydrology for flood hazard mitigation. To improve the performance of hourly streamflow forecasting, a methodology concerning the development of neural network (NN)-based models with an enforced learning strategy is proposed in this paper. Firstly, four different NNs, namely the back propagation network (BPN), radial basis function network (RBFN), self-organizing map (SOM), and support vector machine (SVM), are used to construct streamflow forecasting models. Through a cross-validation test, the NN-based models with superior performance in streamflow forecasting are identified. Then, an enforced learning strategy is developed to further improve the performance of the superior NN-based models, i.e., SOM and SVM in this study. Finally, the proposed flow forecasting model is obtained. Actual applications are conducted to demonstrate the potential of the proposed model. Moreover, a comparison between the NN-based models with and without the enforced learning strategy is performed to evaluate the effect of the enforced learning strategy on model performance. The results indicate that the NN-based models with the enforced learning strategy indeed improve the accuracy of hourly streamflow forecasting. Hence, the presented methodology is expected to be helpful for developing improved NN-based streamflow forecasting models.
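The cross-validation screening step described above can be sketched generically; the linear model in the usage test merely stands in for the NN candidates (BPN, RBFN, SOM, SVM), and the function names are illustrative:

```python
import numpy as np

def kfold_rmse(fit, predict, X, y, k=5):
    """k-fold cross-validation RMSE, as used to screen candidate models.
    `fit(X, y)` returns fitted parameters; `predict(X, params)` applies them."""
    idx = np.arange(len(y))
    scores = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)       # all indices not in this fold
        params = fit(X[train], y[train])
        resid = y[fold] - predict(X[fold], params)
        scores.append(np.sqrt(np.mean(resid ** 2)))
    return float(np.mean(scores))
```

Candidates would be ranked by this score and the best one or two retained for the enforced-learning stage.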

  6. Testing and modelling autoregressive conditional heteroskedasticity of streamflow processes

    Directory of Open Access Journals (Sweden)

    W. Wang

    2005-01-01

Conventional streamflow models operate under the assumption of constant variance or season-dependent variances (e.g. ARMA (AutoRegressive Moving Average) models for deseasonalized streamflow series and PARMA (Periodic AutoRegressive Moving Average) models for seasonal streamflow series). However, with the McLeod-Li test and Engle's Lagrange Multiplier test, clear evidence is found for the existence of autoregressive conditional heteroskedasticity (i.e. the ARCH (AutoRegressive Conditional Heteroskedasticity) effect), a nonlinear phenomenon of the variance behaviour, in the residual series from linear models fitted to daily and monthly streamflow processes of the upper Yellow River, China. It is shown that the major cause of the ARCH effect is the seasonal variation in variance of the residual series. However, while the seasonal variation in variance can fully explain the ARCH effect for monthly streamflow, it is only a partial explanation for daily flow. It is also shown that while the periodic autoregressive moving average model is adequate in modelling monthly flows, no model is adequate in modelling daily streamflow processes because none of the conventional time series models takes the seasonal variation in variance, as well as the ARCH effect in the residuals, into account. Therefore, an ARMA-GARCH (Generalized AutoRegressive Conditional Heteroskedasticity) error model is proposed to capture the ARCH effect present in daily streamflow series, as well as to preserve the seasonal variation in variance in the residuals. The ARMA-GARCH error model combines an ARMA model for modelling the mean behaviour and a GARCH model for modelling the variance behaviour of the residuals from the ARMA model. Since the GARCH model is not widely used in statistical hydrology, this work can be a useful addition to the statistical modelling of daily streamflow processes for the hydrological community.
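An ARMA-GARCH error model of the kind proposed above couples a mean recursion with a conditional-variance recursion. A minimal sketch, simulating an AR(1) mean with GARCH(1,1) errors and flagging the ARCH effect via the lag-1 autocorrelation of squared residuals (a crude stand-in for the McLeod-Li test); all parameter values are illustrative:

```python
import numpy as np

def simulate_ar1_garch11(n, phi=0.6, omega=0.1, alpha=0.2, beta=0.7, seed=0):
    """Simulate x_t = phi*x_{t-1} + e_t with GARCH(1,1) errors:
       e_t = sqrt(h_t)*z_t,  h_t = omega + alpha*e_{t-1}^2 + beta*h_{t-1}."""
    rng = np.random.default_rng(seed)
    x, e = np.zeros(n), np.zeros(n)
    h = np.full(n, omega / (1.0 - alpha - beta))   # unconditional variance
    for t in range(1, n):
        h[t] = omega + alpha * e[t - 1] ** 2 + beta * h[t - 1]
        e[t] = np.sqrt(h[t]) * rng.standard_normal()
        x[t] = phi * x[t - 1] + e[t]
    return x, e, h

def lag1_autocorr(y):
    """Lag-1 autocorrelation; applied to e**2 it flags an ARCH effect."""
    a = y - y.mean()
    return float(np.dot(a[:-1], a[1:]) / np.dot(a, a))
```

The residuals e are serially uncorrelated, yet their squares are autocorrelated, which is exactly the ARCH signature the abstract describes in the Yellow River residual series.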

  7. Simulation of daily streamflow for nine river basins in eastern Iowa using the Precipitation-Runoff Modeling System

    Science.gov (United States)

    Haj, Adel E.; Christiansen, Daniel E.; Hutchinson, Kasey J.

    2015-10-14

The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, constructed Precipitation-Runoff Modeling System models to estimate daily streamflow for nine river basins in eastern Iowa that drain into the Mississippi River. The models are part of a suite of methods for estimating daily streamflow at ungaged sites. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of streamflow and general drainage basin hydrology to various combinations of climate and land use. Calibration and validation periods used in each basin mostly were October 1, 2002, through September 30, 2012, but differed depending on the period of record available for daily mean streamflow measurements at U.S. Geological Survey streamflow-gaging stations.

  8. Variational assimilation of streamflow into operational distributed hydrologic models: effect of spatiotemporal adjustment scale

    Science.gov (United States)

    Lee, H.; Seo, D.-J.; Liu, Y.; Koren, V.; McKee, P.; Corby, R.

    2012-01-01

    State updating of distributed rainfall-runoff models via streamflow assimilation is subject to overfitting because large dimensionality of the state space of the model may render the assimilation problem seriously under-determined. To examine the issue in the context of operational hydrology, we carry out a set of real-world experiments in which streamflow data is assimilated into gridded Sacramento Soil Moisture Accounting (SAC-SMA) and kinematic-wave routing models of the US National Weather Service (NWS) Research Distributed Hydrologic Model (RDHM) with the variational data assimilation technique. Study basins include four basins in Oklahoma and five basins in Texas. To assess the sensitivity of data assimilation performance to dimensionality reduction in the control vector, we used nine different spatiotemporal adjustment scales, where state variables are adjusted in a lumped, semi-distributed, or distributed fashion and biases in precipitation and potential evaporation (PE) are adjusted hourly, 6-hourly, or kept time-invariant. For each adjustment scale, three different streamflow assimilation scenarios are explored, where streamflow observations at basin interior points, at the basin outlet, or at both interior points and the outlet are assimilated. The streamflow assimilation experiments with nine different basins show that the optimum spatiotemporal adjustment scale varies from one basin to another and may be different for streamflow analysis and prediction in all of the three streamflow assimilation scenarios. The most preferred adjustment scale for seven out of nine basins is found to be the distributed, hourly scale, despite the fact that several independent validation results at this adjustment scale indicated the occurrence of overfitting. Basins with highly correlated interior and outlet flows tend to be less sensitive to the adjustment scale and could benefit more from streamflow assimilation. 
In comparison to outlet flow assimilation, interior flow

  9. Simulating streamflow and water table depth with a coupled hydrological model

    Directory of Open Access Journals (Sweden)

    Alphonce Chenjerayi Guzha

    2010-09-01

A coupled model integrating MODFLOW and TOPNET, with the models interacting through the exchange of recharge and baseflow and through river-aquifer interactions, was developed and applied to the Big Darby Watershed in Ohio, USA. Calibration and validation results show that there is generally good agreement between measured streamflow and simulated results from the coupled model. At two gauging stations, average goodness of fit (R2), percent bias (PB), and Nash-Sutcliffe efficiency (ENS) values of 0.83, 11.15%, and 0.83, respectively, were obtained for simulation of streamflow during calibration, and values of 0.84, 8.75%, and 0.85, respectively, were obtained for validation. The simulated water table depths yielded average R2 values of 0.77 and 0.76 for calibration and validation, respectively. The good match between measured and simulated streamflows and water table depths demonstrates that the model is capable of adequately simulating streamflows and water table depths in the watershed and of capturing the influence of spatial and temporal variation in recharge.

  10. Streamflow ratings

    Science.gov (United States)

    Holmes, Robert R.; Singh, Vijay P.

    2016-01-01

    Autonomous direct determination of a continuous time series of streamflow is not economically feasible at present (2014). As such, surrogates are used to derive a continuous time series of streamflow. The derivation process entails developing a streamflow rating, which can range from a simple, single-valued relation between stage and streamflow to a fully dynamic one-dimensional model based on hydraulics of the flow.
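A simple, single-valued rating of the kind mentioned above is often a power law Q = C(h − h0)^b fitted in log space; this sketch assumes the stage of zero flow h0 is known, and all names are illustrative:

```python
import numpy as np

def fit_rating(stage, discharge, h0=0.0):
    """Fit Q = C*(h - h0)^b by ordinary least squares in log space;
    h0 is the stage of zero flow, taken as known here."""
    b, log_c = np.polyfit(np.log(stage - h0), np.log(discharge), 1)
    return np.exp(log_c), b

def rating(stage, c, b, h0=0.0):
    """Convert a stage reading to streamflow with the fitted rating."""
    return c * (stage - h0) ** b
```

In practice a rating is fitted to discrete discharge measurements, shifted as the channel changes, and, at the complex end of the spectrum the abstract mentions, replaced by a one-dimensional hydraulic model.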

  11. Assessing the value of variational assimilation of streamflow data into distributed hydrologic models for improved streamflow monitoring and prediction at ungauged and gauged locations in the catchment

    Science.gov (United States)

    Lee, Hak Su; Seo, Dong-Jun; Liu, Yuqiong; McKee, Paul; Corby, Robert

    2010-05-01

State updating of distributed hydrologic models via assimilation of streamflow data is subject to "overfitting" because the large dimensionality of the state space of the model may render the assimilation problem seriously underdetermined. To examine the issue in the context of operational hydrology, we carried out a set of real-world experiments in which we assimilate streamflow data at interior and/or outlet locations into gridded SAC and kinematic-wave routing models of the U.S. National Weather Service (NWS) Research Distributed Hydrologic Model (RDHM). For the experiments, we used nine basins in the southern plains of the U.S. The experiments consist of selectively assimilating streamflow at different gauge locations, outlet and/or interior, and carrying out both dependent and independent validation. To assess the sensitivity of the quality of assimilation-aided streamflow simulation to the reduced dimensionality of the state space, we carried out data assimilation at spatially semi-distributed or lumped scales and by adjusting biases in precipitation and potential evaporation at a 6-hourly or larger scale. In this talk, we present the results and findings.

  12. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most previous studies have considered climate models and scenarios as the major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to the overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Generally, most impact assessment studies are carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in the UGB under the nonstationary model condition is found to reduce in the future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and the GCMs, along with their interactions with emission scenarios, act as dominant sources of uncertainty.
This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them
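The ANOVA segregation of projection uncertainty described above can be sketched for the two-factor case (GCM × emission scenario); adding land use scenarios or model assumptions introduces further factors in the same way. A minimal illustration, not the paper's implementation:

```python
import numpy as np

def anova_partition(proj):
    """Two-way ANOVA split of projection variance into GCM, scenario,
    and interaction/internal-variability components.  proj[i, j] is the
    streamflow projection for GCM i under emission scenario j."""
    grand = proj.mean()
    gcm_eff = proj.mean(axis=1) - grand          # main effect of each GCM
    scen_eff = proj.mean(axis=0) - grand         # main effect of each scenario
    resid = proj - grand - gcm_eff[:, None] - scen_eff[None, :]
    ss = {
        "gcm": proj.shape[1] * np.sum(gcm_eff ** 2),
        "scenario": proj.shape[0] * np.sum(scen_eff ** 2),
        "interaction": np.sum(resid ** 2),
    }
    total = sum(ss.values())
    return {k: v / total for k, v in ss.items()}
```

The returned fractions sum to one, so each entry is the share of total projection variance attributable to that source.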

  13. Validation of streamflow measurements made with M9 and RiverRay acoustic Doppler current profilers

    Science.gov (United States)

    Boldt, Justin A.; Oberg, Kevin A.

    2015-01-01

    The U.S. Geological Survey (USGS) Office of Surface Water (OSW) previously validated the use of Teledyne RD Instruments (TRDI) Rio Grande (in 2007), StreamPro (in 2006), and Broadband (in 1996) acoustic Doppler current profilers (ADCPs) for streamflow (discharge) measurements made by the USGS. Two new ADCPs, the SonTek M9 and the TRDI RiverRay, were first used in the USGS Water Mission Area programs in 2009. Since 2009, the OSW and USGS Water Science Centers (WSCs) have been conducting field measurements as part of their stream-gaging program using these ADCPs. The purpose of this paper is to document the results of USGS OSW analyses for validation of M9 and RiverRay ADCP streamflow measurements. The OSW required each participating WSC to make comparison measurements over the range of operating conditions in which the instruments were used until sufficient measurements were available. The performance of these ADCPs was evaluated for validation and to identify any present and potential problems. Statistical analyses of streamflow measurements indicate that measurements made with the SonTek M9 ADCP using firmware 2.00–3.00 or the TRDI RiverRay ADCP using firmware 44.12–44.15 are unbiased, and therefore, can continue to be used to make streamflow measurements in the USGS stream-gaging program. However, for the M9 ADCP, there are some important issues to be considered in making future measurements. Possible future work may include additional validation of streamflow measurements made with these instruments from other locations in the United States and measurement validation using updated firmware and software.

  14. Inferring Soil Moisture Memory from Streamflow Observations Using a Simple Water Balance Model

    Science.gov (United States)

    Orth, Rene; Koster, Randal Dean; Seneviratne, Sonia I.

    2013-01-01

    Soil moisture is known for its integrative behavior and resulting memory characteristics. Soil moisture anomalies can persist for weeks or even months into the future, making initial soil moisture a potentially important contributor to skill in weather forecasting. A major difficulty when investigating soil moisture and its memory using observations is the sparse availability of long-term measurements and their limited spatial representativeness. In contrast, there is an abundance of long-term streamflow measurements for catchments of various sizes across the world. We investigate in this study whether such streamflow measurements can be used to infer and characterize soil moisture memory in respective catchments. Our approach uses a simple water balance model in which evapotranspiration and runoff ratios are expressed as simple functions of soil moisture; optimized functions for the model are determined using streamflow observations, and the optimized model in turn provides information on soil moisture memory on the catchment scale. The validity of the approach is demonstrated with data from three heavily monitored catchments. The approach is then applied to streamflow data in several small catchments across Switzerland to obtain a spatially distributed description of soil moisture memory and to show how memory varies, for example, with altitude and topography.
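The approach above, a bucket model whose runoff and evapotranspiration ratios are simple functions of soil moisture, with memory read off the autocorrelation of the modeled state, can be sketched as follows; the functional forms and constants are illustrative assumptions, not those of the paper:

```python
import numpy as np

def water_balance(precip, wmax=100.0, q_exp=2.0, et_frac=0.5, w0=50.0):
    """Daily bucket model: the runoff ratio is (w/wmax)**q_exp and ET
    scales with soil moisture w; the constant demand of 3.0 (depth per
    day) and all parameter values are arbitrary illustrative choices."""
    w = w0
    runoff, storage = [], []
    for p in precip:
        q = p * (w / wmax) ** q_exp          # runoff ratio rises with wetness
        et = 3.0 * et_frac * (w / wmax)      # ET scales with soil moisture
        w = min(wmax, max(0.0, w + p - q - et))
        runoff.append(q)
        storage.append(w)
    return np.array(runoff), np.array(storage)

def memory_lag(series, lag):
    """Lag autocorrelation as a simple soil moisture memory measure."""
    a = series - series.mean()
    return float(np.dot(a[:-lag], a[lag:]) / np.dot(a, a))
```

In the paper's setting the runoff and ET functions are optimized against observed streamflow, and the memory of the resulting soil moisture series is what gets mapped across catchments.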

  15. Assimilating uncertain, dynamic and intermittent streamflow observations in hydrological models

    Science.gov (United States)

    Mazzoleni, Maurizio; Alfonso, Leonardo; Chacon-Hurtado, Juan; Solomatine, Dimitri

    2015-09-01

Catastrophic floods cause significant socio-economic losses. Non-structural measures, such as real-time flood forecasting, can potentially reduce flood risk. To this end, data assimilation methods have been used to improve flood forecasts by integrating static ground observations, and in some cases also remote sensing observations, within water models. Current hydrologic and hydraulic research considers the assimilation of observations coming from traditional, static sensors. At the same time, low-cost mobile sensors and mobile communication devices are also becoming increasingly available. The main goal and innovation of this study is to demonstrate the usefulness of assimilating uncertain streamflow observations that are dynamic in space and intermittent in time in the context of two different semi-distributed hydrological model structures. The developed method is applied to the Brue basin, where the dynamic observations are imitated by synthetic observations of discharge. The results of this study show how model structures and sensor locations affect the assimilation of streamflow observations in different ways. In addition, they show how the assimilation of such uncertain observations from dynamic sensors can provide model improvements similar to those from streamflow observations coming from a non-optimal network of static physical sensors. This is a potential application of recent efforts to build citizen observatories of water, which can make citizens an active part of information capturing, evaluation, and communication, simultaneously helping to improve model-based flood forecasting.

  16. The value of model averaging and dynamical climate model predictions for improving statistical seasonal streamflow forecasts over Australia

    Science.gov (United States)

    Pokhrel, Prafulla; Wang, Q. J.; Robertson, David E.

    2013-10-01

Seasonal streamflow forecasts are valuable for planning and allocation of water resources. In Australia, the Bureau of Meteorology employs a statistical method to forecast seasonal streamflows. The method uses predictors that are related to catchment wetness at the start of a forecast period and to climate during the forecast period. For the latter, a predictor is selected among a number of lagged climate indices as candidates to give the "best" model in terms of model performance in cross validation. This study investigates two strategies for further improvement in seasonal streamflow forecasts. The first is to combine, through Bayesian model averaging, multiple candidate models with different lagged climate indices as predictors, to take advantage of the different predictive strengths of the multiple models. The second strategy is to introduce additional candidate models, using rainfall and sea surface temperature predictions from a global climate model as predictors. This is to take advantage of the direct simulations of various dynamic processes. The results show that combining forecasts from multiple statistical models generally yields more skillful forecasts than using only the best model and appears to moderate the worst forecast errors. The use of rainfall predictions from the dynamical climate model marginally improves the streamflow forecasts when viewed over all the study catchments and seasons, but the use of sea surface temperature predictions provides little additional benefit.
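Bayesian model averaging as used above weights each candidate model by how well it performs in cross validation; a minimal sketch assuming Gaussian forecast errors, with weights derived from cross-validation residual likelihoods (names and the weighting scheme are illustrative, not the Bureau's implementation):

```python
import numpy as np

def bma_weights(cv_errors):
    """Weights from cross-validation residuals of each candidate model,
    assuming Gaussian errors: w_k is proportional to the likelihood of
    model k's residuals under its own error variance."""
    loglik = []
    for e in cv_errors:                    # e: residual vector of model k
        s2 = np.mean(e ** 2)
        loglik.append(-0.5 * len(e) * np.log(2 * np.pi * s2)
                      - 0.5 * np.sum(e ** 2) / s2)
    loglik = np.array(loglik)
    w = np.exp(loglik - loglik.max())      # stabilize before normalizing
    return w / w.sum()

def bma_forecast(forecasts, weights):
    """Weighted combination of the candidate models' forecasts."""
    return float(np.dot(weights, forecasts))
```

A model with clearly smaller cross-validation errors dominates the average, while near-ties are blended, which is what moderates the worst single-model errors.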

  17. Simulation of daily streamflows at gaged and ungaged locations within the Cedar River Basin, Iowa, using a Precipitation-Runoff Modeling System model

    Science.gov (United States)

    Christiansen, Daniel E.

    2012-01-01

The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, conducted a study to examine techniques for estimation of daily streamflows using hydrological models and statistical methods. This report focuses on the use of a hydrologic model, the U.S. Geological Survey's Precipitation-Runoff Modeling System, to estimate daily streamflows at gaged and ungaged locations. The Precipitation-Runoff Modeling System is a modular, physically based, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on surface-water runoff and general basin hydrology. The Cedar River Basin was selected to construct a Precipitation-Runoff Modeling System model that simulates the period from January 1, 2000, to December 31, 2010. The calibration period was from January 1, 2000, to December 31, 2004, and the validation periods were from January 1, 2005, to December 31, 2010, and from January 1, 2000, to December 31, 2010. A Geographic Information System tool was used to delineate the Cedar River Basin and subbasins for the Precipitation-Runoff Modeling System model and to derive parameters based on the physical geographical features. Calibration of the Precipitation-Runoff Modeling System model was completed using a U.S. Geological Survey calibration software tool. The main objective of the calibration was to match the daily streamflow simulated by the Precipitation-Runoff Modeling System model with streamflow measured at U.S. Geological Survey streamflow gages. The Cedar River Basin daily streamflow model performed with Nash-Sutcliffe efficiencies ranging from 0.33 to 0.82 during the calibration period and from -0.04 to 0.77 during the validation period. 
The Cedar River Basin model meets the criterion of a Nash-Sutcliffe efficiency greater than 0.50, and is a good fit for streamflow conditions during the calibration period, at all but one location, Austin, Minnesota.

  18. A stepwise model to predict monthly streamflow

    Science.gov (United States)

    Mahmood Al-Juboori, Anas; Guven, Aytac

    2016-12-01

In this study, a stepwise model empowered with genetic programming is developed to predict the monthly flows of the Hurman River in Turkey and the Diyalah and Lesser Zab Rivers in Iraq. The model divides the monthly flow data into twelve intervals representing the number of months in a year. The flow of month t is considered a function of the antecedent month's flow (t - 1) and is predicted by multiplying the antecedent monthly flow by a constant value called K. The optimum value of K is obtained by a stepwise procedure which employs Gene Expression Programming (GEP) and Nonlinear Generalized Reduced Gradient Optimization (NGRGO) as alternatives to the traditional nonlinear regression technique. The coefficient of determination and root mean squared error are used to evaluate the performance of the proposed models. The results of the proposed model are compared with the conventional Markovian and Auto Regressive Integrated Moving Average (ARIMA) models based on observed monthly flow data. The comparison, based on five different statistical measures, shows that the proposed stepwise model performed better than the Markovian and ARIMA models. The R2 values of the proposed model range between 0.81 and 0.92 for the three rivers in this study.
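The core of the stepwise model, Q_t = K_m · Q_{t-1} with one K per calendar month, can be sketched with ordinary least squares standing in for the paper's GEP/NGRGO optimization; the month indexing here is illustrative:

```python
import numpy as np

def fit_monthly_k(flows):
    """Least-squares K_m per calendar month for Q_t = K_m * Q_{t-1}.
    `flows` is a 1-D monthly series; month index t % 12 is illustrative
    (it assumes the series starts at month 0 of some year)."""
    k = np.zeros(12)
    q_prev, q_curr = flows[:-1], flows[1:]
    months = np.arange(1, len(flows)) % 12     # month index of each q_curr
    for m in range(12):
        sel = months == m
        k[m] = np.dot(q_prev[sel], q_curr[sel]) / np.dot(q_prev[sel], q_prev[sel])
    return k

def predict(flows, k):
    """One-step-ahead prediction of months 1..n-1 from their predecessors."""
    months = np.arange(1, len(flows)) % 12
    return k[months] * flows[:-1]
```

The least-squares K is only a baseline; the paper's contribution is searching for better K values (and the stepwise structure) with GEP and NGRGO.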

  19. Streamflow data assimilation in SWAT model using Extended Kalman Filter

    Science.gov (United States)

    Sun, Leqiang; Nistor, Ioan; Seidou, Ousmane

    2015-12-01

    The Extended Kalman Filter (EKF) is coupled with the Soil and Water Assessment Tools (SWAT) model in the streamflow assimilation of the upstream Senegal River in West Africa. Given the large number of distributed variables in SWAT, only the average watershed scale variables are included in the state vector and the Hydrological Response Unit (HRU) scale variables are updated with the a posteriori/a priori ratio of their watershed scale counterparts. The Jacobian matrix is calculated numerically by perturbing the state variables. Both the soil moisture and CN2 are significantly updated in the wet season, yet they have opposite update patterns. A case study for a large flood forecast shows that for up to seven days, the streamflow forecast is moderately improved using the EKF-subsequent open loop scheme but significantly improved with a newly designed quasi-error update scheme. The former has better performances in the flood rising period while the latter has better performances in the recession period. For both schemes, the streamflow forecast is improved more significantly when the lead time is shorter.
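The two-level update described above, a filter update of watershed-scale states followed by rescaling HRU-scale states with the a posteriori/a priori ratio, can be sketched with a plain linear Kalman update standing in for the EKF and its numerically perturbed Jacobian; all names are illustrative:

```python
import numpy as np

def kalman_update(x_prior, P, z, H, R):
    """Standard Kalman update of the watershed-scale state vector x
    with covariance P against observation z (operator H, noise R)."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)
    P_post = (np.eye(len(x_prior)) - K @ H) @ P
    return x_post, P_post

def downscale_to_hru(hru_states, x_prior, x_post):
    """Rescale HRU-scale states by the posterior/prior ratio of their
    watershed-scale counterparts, as described in the abstract.
    hru_states is (n_hru, n_state)."""
    ratio = x_post / x_prior
    return hru_states * ratio[None, :]
```

In the paper the update is extended (the observation operator is the SWAT routing, linearized by perturbing state variables), but the ratio-based downscaling step is exactly this elementwise rescaling.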

  20. Genetic Algorithm Based Framework for Automation of Stochastic Modeling of Multi-Season Streamflows

    Science.gov (United States)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2009-05-01

    Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The various kinds of stochastic models adopted for multi-season streamflow generation in hydrology are: i) parametric models which hypothesize the form of the periodic dependence structure and the distributional form a priori (examples are PAR, PARMA); disaggregation models that aim to preserve the correlation structure at the periodic level and the aggregated annual level; ii) Nonparametric models (examples are bootstrap/kernel based methods), which characterize the laws of chance, describing the stream flow process, without recourse to prior assumptions as to the form or structure of these laws; (k-nearest neighbor (k-NN), matched block bootstrap (MABB)); non-parametric disaggregation model. iii) Hybrid models which blend both parametric and non-parametric models advantageously to model the streamflows effectively. Despite many of these developments that have taken place in the field of stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and the critical drought characteristics has been posing a persistent challenge to the stochastic modeler. This is partly because, usually, the stochastic streamflow model parameters are estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares (LS) estimation) and subsequently the efficacy of the models is being validated based on the accuracy of prediction of the estimates of the water-use characteristics, which requires large number of trial simulations and inspection of many plots and tables. Still accurate prediction of the storage and the critical drought characteristics may not be ensured. In this study a multi-objective optimization framework is proposed to find the optimal hybrid model (blend of a simple parametric model, PAR(1) model and matched block

  1. Ensemble streamflow assimilation with the National Water Model.

    Science.gov (United States)

    Rafieeinasab, A.; McCreight, J. L.; Noh, S.; Seo, D. J.; Gochis, D.

    2017-12-01

    Through case studies of flooding across the US, we compare the performance of the National Water Model (NWM) data assimilation (DA) scheme to that of a newly implemented ensemble Kalman filter approach. The NOAA National Water Model (NWM) is an operational implementation of the community WRF-Hydro modeling system. As of August 2016, the NWM forecasts of distributed hydrologic states and fluxes (including soil moisture, snowpack, ET, and ponded water) over the contiguous United States have been publicly disseminated by the National Center for Environmental Prediction (NCEP). It also provides streamflow forecasts at more than 2.7 million river reaches up to 30 days in advance. The NWM employs a nudging scheme to assimilate more than 6,000 USGS streamflow observations and provide initial conditions for its forecasts. A known problem with nudging is that the forecasts relax quickly back toward the open-loop bias. This has been partially addressed by an experimental bias correction approach, which was found to suffer from phase errors during flooding events. In this work, we present an ensemble streamflow data assimilation approach combining the new channel-only capabilities of the NWM with HydroDART (a coupling of the offline WRF-Hydro model and NCAR's Data Assimilation Research Testbed, DART). Our approach focuses on the single model state of discharge and incorporates error distributions on channel influxes (overland and groundwater) in the assimilation via an ensemble Kalman filter (EnKF). To avoid the filter degeneracy associated with a limited number of ensemble members at large scale, DART's covariance inflation (Anderson, 2009) and localization capabilities are implemented and evaluated. The current NWM data assimilation scheme is compared to preliminary results from the EnKF application for several flooding case studies across the US.
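
    The EnKF analysis step referenced above can be illustrated for a single discharge state. This is a generic perturbed-observation EnKF sketch with made-up numbers, not the HydroDART implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def enkf_update(ensemble, obs, obs_err_std):
    """One EnKF analysis step for a scalar discharge state, using
    perturbed observations."""
    perturbed = obs + rng.normal(0.0, obs_err_std, ensemble.shape)
    p = np.var(ensemble, ddof=1)          # forecast error variance
    k = p / (p + obs_err_std ** 2)        # Kalman gain
    return ensemble + k * (perturbed - ensemble)

forecast = rng.normal(100.0, 10.0, 50)    # ensemble of discharges (m3/s)
analysis = enkf_update(forecast, 120.0, 2.0)
# the analysis mean is pulled from ~100 m3/s toward the 120 m3/s observation
```

    Inflation and localization, which the abstract highlights for avoiding filter degeneracy, would act on the ensemble spread and on the gain, respectively, before this update.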

  2. Reconstruction of missing daily streamflow data using dynamic regression models

    Science.gov (United States)

    Tencaliec, Patricia; Favre, Anne-Catherine; Prieur, Clémentine; Mathevet, Thibault

    2015-12-01

    River discharge is one of the most important quantities in hydrology. It provides fundamental records for water resources management and climate change monitoring. Even short gaps in these records can lead to markedly different analysis outputs, so reconstructing the missing portions of incomplete data sets is an important step for environmental modeling, engineering, and research applications, and it presents a great challenge. The objective of this paper is to introduce an effective technique for reconstructing missing daily discharge data when only daily streamflow records are available. The proposed procedure uses a combination of regression and autoregressive integrated moving average (ARIMA) models, called a dynamic regression model. This model exploits the linear relationship with neighboring, correlated stations and then adjusts the residual term by fitting an ARIMA structure. Application of the model to eight daily streamflow series from the Durance River watershed showed that it yields reliable estimates for the missing data in the time series. Simulation studies were also conducted to evaluate the performance of the procedure.
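
    The two-stage dynamic-regression idea (regress the target station on a correlated neighbor, then give the residuals a time-series structure) can be sketched on synthetic data. The AR(1) residual term here is a minimal stand-in for the paper's full ARIMA fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily flows: a neighbor station and a correlated target station.
neighbor = 50 + 10 * np.sin(np.linspace(0, 20, 400)) + rng.normal(0, 1, 400)
target = 1.8 * neighbor + 5 + rng.normal(0, 2, 400)

# Stage 1: linear regression of the target on the correlated neighbor.
slope, intercept = np.polyfit(neighbor, target, 1)
resid = target - (slope * neighbor + intercept)

# Stage 2: give the residual term an AR(1) structure (lag-1 correlation),
# a minimal stand-in for the full ARIMA fit used in the paper.
phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]

def reconstruct(neighbor_value, prev_resid):
    """Estimate a missing day from the neighbor flow plus the propagated residual."""
    return slope * neighbor_value + intercept + phi * prev_resid

estimate = reconstruct(neighbor[200], resid[199])
```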

  3. Historical Trends in Mean and Extreme Runoff and Streamflow Based on Observations and Climate Models

    Directory of Open Access Journals (Sweden)

    Behzad Asadieh

    2016-05-01

    Full Text Available To understand changes in global mean and extreme streamflow volumes over recent decades, we statistically analyzed runoff and streamflow simulated by the WBM-plus hydrological model using either observation-based meteorological inputs from the WATCH Forcing Data (WFD) or bias-corrected inputs from five global climate models (GCMs) provided by the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP). Results show that the bias-corrected GCM inputs yield very good agreement with the observation-based inputs in the average magnitude of runoff and streamflow. On global average, the observation-based simulated mean runoff and streamflow both decreased by about 1.3% from 1971 to 2001. However, GCM-based simulations yield increasing trends over that period, with an inter-model global average of 1% for mean runoff and 0.9% for mean streamflow. In the GCM-based simulations, relative changes in extreme runoff and extreme streamflow (annual-maximum daily values and annual-maximum seven-day streamflow) are slightly greater than those of mean runoff and streamflow, in terms of global and continental averages. Observation-based simulations show an increasing trend in mean runoff and streamflow for about one-half of the land areas and a decreasing trend for the other half. However, mean and extreme runoff and streamflow based on the GCMs show an increasing trend for approximately two-thirds of the global land area and a decreasing trend for the other one-third. Further work is needed to understand why GCM simulations appear to indicate trends in streamflow that are more positive than those suggested by climate observations, even where, as in ISI-MIP, bias correction has been applied so that their streamflow climatology is realistic.

  4. Artificial Neural Network Models for Long Lead Streamflow Forecasts using Climate Information

    Science.gov (United States)

    Kumar, J.; Devineni, N.

    2007-12-01

    developed between the identified predictors and the predictand. The predictors used are the scores from a principal components analysis (PCA). The models were tested and validated. Feed-forward multi-layer perceptron (MLP) neural networks trained with back-propagation algorithms are employed in the current study. The performance of the ANN forecasts is evaluated using measures such as the correlation coefficient and the root mean square error (RMSE). Preliminary results show that ANNs can efficiently forecast long-lead-time streamflows using climatic predictors.

  5. Multivariate synthetic streamflow generation using a hybrid model based on artificial neural networks

    Directory of Open Access Journals (Sweden)

    J. C. Ochoa-Rivera

    2002-01-01

    Full Text Available A model for multivariate streamflow generation is presented, based on a multilayer feedforward neural network. The structure of the model results from two components: the neural network (NN) deterministic component and a random component which is assumed to be normally distributed. It is from this second component that the model achieves the ability to effectively incorporate the uncertainty associated with hydrological processes, making it valuable as a practical tool for synthetic generation of streamflow series. The NN topology and the corresponding analytical explicit formulation of the model are described in detail. The model is calibrated with a series of monthly inflows to two reservoir sites located in the Tagus River basin (Spain), while validation is performed through estimation of a set of statistics relevant for water resources systems planning and management. Among others, drought and storage statistics are computed and compared for both the synthetic and historical series. The performance of the NN-based model was compared to that of a standard autoregressive AR(2) model. Results show that the NN represents a promising modelling alternative for simulation purposes, with interesting potential in the context of water resources systems management and optimisation. Keywords: neural networks, multilayer perceptron, error backpropagation, hydrological scenario generation, multivariate time series.
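
    The hybrid structure described (a deterministic NN output plus normally distributed noise) can be sketched with a toy one-hidden-layer network. The weights and sizes are illustrative, not calibrated values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def generate_flow(prev_flows, w_hidden, w_out, noise_std):
    """Hybrid generator: a tiny feedforward net gives the deterministic
    part of the next flow; added normal noise supplies the stochastic
    component that carries hydrological uncertainty."""
    hidden = np.tanh(w_hidden @ prev_flows)   # hidden-layer activations
    deterministic = float(w_out @ hidden)     # NN (deterministic) component
    return deterministic + rng.normal(0.0, noise_std)

w_h = rng.normal(size=(3, 2))   # hidden-layer weights (illustrative)
w_o = rng.normal(size=3)        # output weights (illustrative)
q = generate_flow(np.array([0.4, 0.7]), w_h, w_o, noise_std=0.1)
```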

  6. Evaluation of statistical and rainfall-runoff models for predicting historical daily streamflow time series in the Des Moines and Iowa River watersheds

    Science.gov (United States)

    Farmer, William H.; Knight, Rodney R.; Eash, David A.; Hutchinson, Kasey J.; Linhart, S. Mike; Christiansen, Daniel E.; Archfield, Stacey A.; Over, Thomas M.; Kiang, Julie E.

    2015-08-24

    Daily records of streamflow are essential to understanding hydrologic systems and managing the interactions between human and natural systems. Many watersheds and locations lack streamgages to provide accurate and reliable records of daily streamflow. In such ungaged watersheds, statistical tools and rainfall-runoff models are used to estimate daily streamflow. Previous work compared 19 different techniques for predicting daily streamflow records in the southeastern United States. Here, five of the better-performing methods are compared in a different hydroclimatic region of the United States, in Iowa. The methods fall into three classes: (1) drainage-area ratio methods, (2) nonlinear spatial interpolations using flow duration curves, and (3) mechanistic rainfall-runoff models. The first two classes are each applied with nearest-neighbor and map-correlated index streamgages. Using a threefold validation and robust rank-based evaluation, the methods are assessed for overall goodness of fit of the hydrograph of daily streamflow, the ability to reproduce a daily, no-fail storage-yield curve, and the ability to reproduce key streamflow statistics. As in the Southeast study, a nonlinear spatial interpolation of daily streamflow using flow duration curves is found to be a method with the best predictive accuracy. Comparisons with previous work in Iowa show that the accuracy of mechanistic models with at-site calibration is substantially degraded in the ungaged framework.
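
    The first class of methods, the drainage-area ratio, is simple enough to state directly; the classic form uses an exponent of 1, and the numbers below are illustrative:

```python
def drainage_area_ratio(q_index, area_index, area_ungaged, exponent=1.0):
    """Transfer streamflow from an index gage to an ungaged site by scaling
    with the ratio of drainage areas (exponent 1.0 is the classic form)."""
    return q_index * (area_ungaged / area_index) ** exponent

# 120 m3/s at a 600 km2 index gage, estimated at a nested 150 km2 site:
q_est = drainage_area_ratio(120.0, 600.0, 150.0)   # -> 30.0 m3/s
```

    The choice of index gage (nearest-neighbor versus map-correlated) is exactly the variation the study compares.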

  7. From spatially variable streamflow to distributed hydrological models: Analysis of key modeling decisions

    Science.gov (United States)

    Fenicia, Fabrizio; Kavetski, Dmitri; Savenije, Hubert H. G.; Pfister, Laurent

    2016-02-01

    This paper explores the development and application of distributed hydrological models, focusing on the key decisions of how to discretize the landscape, which model structures to use in each landscape element, and how to link model parameters across multiple landscape elements. The case study considers the Attert catchment in Luxembourg—a 300 km2 mesoscale catchment with 10 nested subcatchments that exhibit clearly different streamflow dynamics. The research questions are investigated using conceptual models applied at hydrologic response unit (HRU) scales (1-4 HRUs) at 6-hourly time steps. Multiple model structures are hypothesized and implemented using the SUPERFLEX framework. Following calibration, space/time model transferability is tested using a split-sample approach, with evaluation criteria including streamflow prediction error metrics and hydrological signatures. Our results suggest that: (1) models using geology-based HRUs are more robust and capture the spatial variability of streamflow time series and signatures better than models using topography-based HRUs; this finding supports the hypothesis that, in the Attert, geology exerts a stronger control than topography on streamflow generation, (2) streamflow dynamics of different HRUs can be represented using distinct and remarkably simple model structures, which can be interpreted in terms of the perceived dominant hydrologic processes in each geology type, and (3) the same maximum root zone storage can be used across the three dominant geological units with no loss in model transferability; this finding suggests that the partitioning of water between streamflow and evaporation in the study area is largely independent of geology and can be used to improve model parsimony. The modeling methodology introduced in this study is general and can be used to advance our broader understanding and prediction of hydrological behavior, including the landscape characteristics that control hydrologic response, the

  8. Comparing large-scale hydrological model predictions with observed streamflow in the Pacific Northwest: effects of climate and groundwater

    Science.gov (United States)

    Mohammad Safeeq; Guillaume S. Mauger; Gordon E. Grant; Ivan Arismendi; Alan F. Hamlet; Se-Yeun Lee

    2014-01-01

    Assessing uncertainties in hydrologic models can improve accuracy in predicting future streamflow. Here, simulated streamflows using the Variable Infiltration Capacity (VIC) model at coarse (1/16°) and fine (1/120°) spatial resolutions were evaluated against observed streamflows from 217 watersheds. In...

  9. Climate model assessment of changes in winter-spring streamflow timing over North America

    Science.gov (United States)

    Kam, Jonghun; Knutson, Thomas R.; Milly, Paul C. D.

    2018-01-01

    Over regions where snow-melt runoff substantially contributes to winter-spring streamflows, warming can accelerate snow melt and reduce dry-season streamflows. However, conclusive detection of changes and attribution to anthropogenic forcing is hindered by the brevity of observational records, model uncertainty, and uncertainty concerning internal variability. In this study, detection/attribution of changes in mid-latitude North American winter-spring streamflow timing is examined using nine global climate models under multiple forcing scenarios, and robustness across models, start/end dates for trends, and assumptions about internal variability is evaluated. Marginal evidence for an emerging detectable anthropogenic influence (according to four or five of nine models) is found in the north-central U.S., where winter-spring streamflows have been coming earlier. Weaker indications of detectable anthropogenic influence (three of nine models) are found in the mountainous western U.S./southwestern Canada and in the extreme northeastern U.S./Canadian Maritimes. In the former region, a recent shift toward later streamflows has rendered the full-record trend toward earlier streamflows only marginally significant, with possible implications for previously published climate change detection findings for streamflow timing in this region. In the latter region, no forced model shows as large a shift toward earlier streamflow timing as the detectable observed shift. In other (including warm, snow-free) regions, observed trends are typically not detectable, although in the U.S. central plains we find detectable delays in streamflow, which are inconsistent with forced model experiments.

  10. Bayesian Models for Streamflow and River Network Reconstruction using Tree Rings

    Science.gov (United States)

    Ravindranath, A.; Devineni, N.

    2016-12-01

    Water systems face non-stationary, dynamically shifting risks due to shifting societal conditions and systematic long-term variations in climate manifesting as quasi-periodic behavior on multi-decadal time scales. Water systems are thus vulnerable to long periods of wet or dry hydroclimatic conditions. Streamflow is a major component of water systems and a primary means by which water is transported to serve ecosystem and human needs, so our concern is in understanding streamflow variability. Climate variability and its impacts on water resources are crucial factors affecting streamflow, and multi-scale variability increases risk to water sustainability and systems. Dam operations are necessary for storing the water brought by streamflow while maintaining downstream ecological health. Rules governing dam operations are based on streamflow records that are woefully short compared to the periods of systematic variation present in the climatic factors driving streamflow variability and non-stationarity. We use hierarchical Bayesian regression methods to reconstruct paleo-streamflow records for dams within a basin, using paleoclimate proxies (e.g. tree rings) to guide the reconstructions. The riverine flow network for the entire basin is subsequently modeled hierarchically using feeder stream and tributary flows. This is a starting point in analyzing streamflow variability and risks to water systems, and in developing a scientifically informed dynamic risk management framework for formulating dam operations and water policies to best hedge such risks. We will apply this work to the Missouri and Delaware River Basins (DRB). Preliminary results of streamflow reconstructions for eight dams in the upper DRB, using standard Gaussian regression with regional tree-ring chronologies, give streamflow records that now span two to two and a half centuries; modestly smoothed versions of these reconstructed flows indicate physically justifiable trends in the time series.
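
    The core regression step (fitting instrumental flows against a tree-ring chronology over their overlap period, then hindcasting from the longer ring record) can be sketched with ordinary least squares on synthetic data. The full study uses hierarchical Bayesian regression, which this sketch does not reproduce:

```python
import numpy as np

rng = np.random.default_rng(7)

# Overlap period: 60 years of instrumental flows and a ring-width index.
rings = rng.normal(1.0, 0.2, 60)                 # tree-ring chronology
flows = 400 * rings + rng.normal(0, 20, 60)      # annual flows (synthetic)

# Fit flow = a * ring_index + b over the overlap period ...
a, b = np.polyfit(rings, flows, 1)

# ... then hindcast the pre-instrumental period from the longer ring record.
paleo_rings = rng.normal(1.0, 0.2, 200)          # stands in for older rings
reconstructed = a * paleo_rings + b              # extended flow record
```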

  11. Simulation-optimization framework for multi-site multi-season hybrid stochastic streamflow modeling

    Science.gov (United States)

    Srivastav, Roshan; Srinivasan, K.; Sudheer, K. P.

    2016-11-01

    A simulation-optimization (S-O) framework is developed for the hybrid stochastic modeling of multi-site multi-season streamflows. The multi-objective optimization model formulated is the driver, and the multi-site, multi-season hybrid matched block bootstrap model (MHMABB) is the simulation engine within this framework. The multi-site multi-season simulation model is an extension of the existing single-site multi-season simulation model. A robust and efficient evolutionary search-based technique, namely the non-dominated sorting genetic algorithm (NSGA-II), is employed as the solution technique for the multi-objective optimization within the S-O framework. The objective functions employed are related to the preservation of the multi-site critical deficit run sum, and the constraints introduced concern the hybrid model parameter space and the preservation of certain statistics (such as inter-annual dependence and/or skewness of aggregated annual flows). The efficacy of the proposed S-O framework is brought out through a case example from the Colorado River basin. The proposed multi-site multi-season model AMHMABB (whose parameters are obtained from the proposed S-O framework) preserves the temporal as well as the spatial statistics of the historical flows. The other multi-site deficit run characteristics, namely the number of runs, the maximum run length, the mean run sum and the mean run length, are also well preserved by the AMHMABB model. Overall, the proposed AMHMABB model shows better streamflow modeling performance than the simulation-based SMHMABB model, plausibly due to the significant roles played by: (i) the objective functions related to the preservation of the multi-site critical deficit run sum; (ii) the large hybrid model parameter space available for the evolutionary search; and (iii) the constraint on the preservation of the inter-annual dependence. Split-sample validation results indicate that the AMHMABB model is

  12. An effective streamflow process model for optimal reservoir operation using stochastic dual dynamic programming

    OpenAIRE

    Raso , L.; Malaterre , P.O.; Bader , J.C.

    2017-01-01

    This article presents an innovative streamflow process model for use in reservoir operational rule design in stochastic dual dynamic programming (SDDP). Model features, which can be applied independently, are (1) a multiplicative process model for the forward phase and its linearized version for the backward phase; and (2) a nonuniform time-step length that is inversely proportional to seasonal variability. The advantages are (1) guaranteeing positive streamflow values...

  13. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    Science.gov (United States)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors, such as antecedent streamflows, El Niño-Southern Oscillation indices, and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands, and the Bayesian inference as formulated allows the use of data containing nonconcurrent and missing records. This flexibility and data-handling ability mean that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
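
    The Box-Cox transform underlying the BJP approach can be sketched as follows; the parameter value and flow values are illustrative:

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform, used to bring skewed streamflows toward normality."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def inv_boxcox(y, lam):
    """Back-transform from Box-Cox space to flow space."""
    return np.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

flows = np.array([2.0, 15.0, 80.0, 300.0])   # skewed seasonal flows
z = boxcox(flows, 0.2)                       # roughly normalized values
back = inv_boxcox(z, 0.2)                    # round-trips to the original flows
```

    In the BJP model the transformed flows and predictors are then modeled jointly as multivariate normal, which this sketch stops short of.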

  14. Wavelet-linear genetic programming: A new approach for modeling monthly streamflow

    Science.gov (United States)

    Ravansalar, Masoud; Rajaee, Taher; Kisi, Ozgur

    2017-06-01

    Streamflow is an important and effective factor in stream ecosystems, and its accurate prediction is an essential issue in water resources and environmental engineering systems. A hybrid wavelet-linear genetic programming (WLGP) model, combining a discrete wavelet transform (DWT) with linear genetic programming (LGP), was used in this study to predict monthly streamflow (Q) at two gauging stations, Pataveh and Shahmokhtar, on the Beshar River near Yasuj, Iran. In the proposed WLGP model, wavelet analysis was linked to the LGP model: the original streamflow time series were decomposed into sub-time series comprising wavelet coefficients. The results were compared with single LGP, artificial neural network (ANN), hybrid wavelet-ANN (WANN) and multiple linear regression (MLR) models using commonly applied performance statistics. The Nash coefficients (E) were found to be 0.877 and 0.817 for the WLGP model at the Pataveh and Shahmokhtar stations, respectively. The comparison showed that the WLGP model could significantly increase streamflow prediction accuracy at both stations. Since the WLGP model approximates the peak streamflow values more closely, it could be utilized for one-month-ahead streamflow prediction.
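
    One level of a discrete wavelet transform, which is the decomposition step such hybrid models feed into the data-driven component, can be sketched with the Haar wavelet. The abstract does not specify the mother wavelet used in the study; Haar is chosen here purely for simplicity:

```python
import math

def haar_dwt(series):
    """One level of the Haar discrete wavelet transform: splits a series
    into approximation (low-frequency) and detail (high-frequency)
    coefficient sub-series. Assumes an even-length input."""
    s = math.sqrt(2.0)
    approx = [(series[i] + series[i + 1]) / s for i in range(0, len(series), 2)]
    detail = [(series[i] - series[i + 1]) / s for i in range(0, len(series), 2)]
    return approx, detail

approx, detail = haar_dwt([4.0, 2.0, 6.0, 8.0])
```

    The resulting sub-series, rather than the raw flows, become the inputs to the LGP (or ANN) model in the hybrid scheme.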

  15. Retrospective evaluation of continental-scale streamflow nudging with WRF-Hydro National Water Model V1

    Science.gov (United States)

    McCreight, J. L.; Wu, Y.; Gochis, D.; Rafieeinasab, A.; Dugger, A. L.; Yu, W.; Cosgrove, B.; Cui, Z.; Oubeidillah, A.; Briar, D.

    2016-12-01

    The streamflow (discharge) data assimilation capability in version 1 of the National Water Model (NWM; a WRF-Hydro configuration) is applied and evaluated in a 5-year (2011-2015) retrospective study using NLDAS2 forcing data over CONUS. This talk will describe the NWM V1 operational nudging (continuous-time) streamflow data assimilation approach, its motivation, and its relationship to this retrospective evaluation. Results from this study will provide an analysis-based (not forecast-based) benchmark for streamflow DA in the NWM. The goal of the assimilation is to reduce discharge bias and improve channel initial conditions for discharge forecasting (though forecasts are not considered here). The nudging method assimilates discharge observations at nearly 7,000 USGS gages (at intervals as short as 15 minutes) to produce a (univariate) discharge reanalysis (i.e., discharge is the only variable affected by the assimilation). By withholding 14% of nested gages throughout CONUS in a separate validation run, we evaluate the downstream impact of assimilation at upstream gages. Based on this sample, we estimate the skill of the streamflow reanalysis at ungaged locations and examine factors governing the skill of the assimilation. A comparison of assimilation and open-loop runs is presented. Performance of DA under both high- and low-flow regimes and selected flooding events is examined. A preliminary evaluation of nudging parameter sensitivity and its relationship to flow regime will be presented.
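
    The nudging update itself is a simple relaxation of the modeled discharge toward the observation. The gain and single-value form below are illustrative simplifications, not the NWM's actual parameterization:

```python
def nudge(q_model, q_obs, g=0.5):
    """One nudging update: relax the modeled discharge toward the gage
    observation with gain g in (0, 1]. The real scheme spreads this
    correction over time; here it is a single-step sketch."""
    return q_model + g * (q_obs - q_model)

q = 80.0                  # modeled discharge (m3/s)
for _ in range(3):        # repeated assimilation cycles
    q = nudge(q, 100.0)   # approaches the observed 100 m3/s
```

    After three cycles with g = 0.5 the state moves from 80 to 97.5 m3/s; once observations stop, the state relaxes back toward the model's open-loop behavior, which is the bias problem the abstracts above discuss.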

  16. A watershed modeling approach to streamflow reconstruction from tree-ring records

    International Nuclear Information System (INIS)

    Saito, Laurel; Biondi, Franco; Salas, Jose D; Panorska, Anna K; Kozubowski, Tomasz J

    2008-01-01

    Insight into long-term changes of streamflow is critical for addressing implications of global warming for sustainable water management. To date, dendrohydrologists have employed sophisticated regression techniques to extend runoff records, but this empirical approach cannot directly test the influence of watershed factors that alter streamflow independently of climate. We designed a mechanistic watershed model to calculate streamflows at annual timescales using as few inputs as possible. The model was calibrated for upper reaches of the Walker River, which straddles the boundary between the Sierra Nevada of California and the Great Basin of Nevada. Even though the model incorporated simplified relationships between precipitation and other components of the hydrologic cycle, it predicted water year streamflows with correlations of 0.87 when appropriate precipitation values were used

  17. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    Science.gov (United States)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing component uses a simple moving average filter to diminish the lagged-prediction effect of stand-alone data-driven models. The multigene component tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP, and finally the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using daily streamflow records from a station on Senoz Stream, Turkey. Compared with the efficiency results of stand-alone GP, MGGP, and conventional multiple linear regression prediction models as benchmarks, the proposed Pareto-optimal MA-MGGP model put forward a parsimonious solution of noteworthy practical value. In addition, the approach allows the user to bring human insight into the problem, examine the evolved models, and pick the best-performing programs for further analysis.
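
    The moving-average pre-filter in component (1) can be sketched directly; the window length and values are illustrative:

```python
def moving_average(series, window=3):
    """Simple moving-average pre-filter used to damp the lagged-prediction
    effect before fitting a data-driven model."""
    out = []
    for i in range(len(series) - window + 1):
        out.append(sum(series[i:i + window]) / window)
    return out

smoothed = moving_average([4.0, 8.0, 6.0, 10.0, 2.0], window=3)  # [6.0, 8.0, 6.0]
```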

  18. Evaluation of streamflow forecast for the National Water Model of U.S. National Weather Service

    Science.gov (United States)

    Rafieeinasab, A.; McCreight, J. L.; Dugger, A. L.; Gochis, D.; Karsten, L. R.; Zhang, Y.; Cosgrove, B.; Liu, Y.

    2016-12-01

    The National Water Model (NWM), an implementation of the community WRF-Hydro modeling system, is an operational hydrologic forecasting model for the contiguous United States. The model forecasts distributed hydrologic states and fluxes, including soil moisture, snowpack, ET, and ponded water. In particular, the NWM provides streamflow forecasts at more than 2.7 million river reaches for three forecast ranges: short (15 hr), medium (10 days), and long (30 days). In this study, we verify short- and medium-range streamflow forecasts in the context of the verification of their respective quantitative precipitation forecasts/forcing (QPF), the High Resolution Rapid Refresh (HRRR) and the Global Forecast System (GFS). The streamflow evaluation is performed for the summer of 2016 at more than 6,000 USGS gauges. Both individual forecasts and forecast lead times are examined. Selected case studies of extreme events aim to provide insight into the quality of the NWM streamflow forecasts. A goal of this comparison is to address how much streamflow bias originates from precipitation forcing bias. To this end, precipitation verification is performed over the contributing areas above (and between) assimilated USGS gauge locations. Precipitation verification is based on the aggregated, blended StageIV/StageII data as the "reference truth". We summarize the skill of the streamflow forecasts, their skill relative to the QPF, and make recommendations for improving NWM forecast skill.

  19. Historical Streamflow Series Analysis Applied to Furnas HPP Reservoir Watershed Using the SWAT Model

    Directory of Open Access Journals (Sweden)

    Viviane de Souza Dias

    2018-04-01

    Full Text Available Over the last few years, the operation of the Furnas Hydropower Plant (HPP) reservoir, located in the Grande River Basin, has been threatened by a significant reduction in inflow. In the region, hydrological modelling tools are being used and tested to support decision making and water sustainability. In this study, streamflow was modelled in the area of direct influence of the Furnas HPP reservoir, and the performance of the Soil and Water Assessment Tool (SWAT) model was verified for studies in the region. Sensitivity and uncertainty analyses were undertaken using the Sequential Uncertainty Fitting algorithm (SUFI-2) within the SWAT Calibration and Uncertainty Program (SWAT-CUP). The hydrological modelling, at a monthly scale, presented good results in calibration (NS 0.86), with a slight reduction of the coefficient in the validation period (NS 0.64). The results suggest that this tool can be applied in future hydrological studies in the region, with the caveat that special attention should be given to the historical series used in the calibration and validation of the models. It is important to note that this region has high demands for water resources, primarily for agricultural use, and water demands must also be taken into account in future hydrological simulations. The validation of this methodology makes important contributions to the management of water resources in regions with tropical climates whose climatological and geological conditions resemble those studied here.
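
    The NS values quoted above are Nash-Sutcliffe efficiencies: one minus the ratio of model error variance to observed variance. A minimal sketch with made-up flows:

```python
def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than simply predicting the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

obs = [10.0, 20.0, 30.0, 40.0]
sim = [12.0, 18.0, 31.0, 38.0]
ns = nash_sutcliffe(sim, obs)   # close to 1 for a good fit
```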

  1. Response of streamflow to climate change in a sub-basin of the source region of the Yellow River based on a tank model

    Science.gov (United States)

    Wu, Pan; Wang, Xu-Sheng; Liang, Sihai

    2018-06-01

    Although extensive research has been conducted in the source region of the Yellow River (SRYR) to analyse the influence of climate change on streamflow, few studies concentrate on streamflow of the sub-basin above the Huangheyan station in the SRYR (HSRYR), where a water-retaining dam was built at the outlet in 1999. To improve reservoir regulation strategies, this study analysed streamflow change of the HSRYR at the mesoscale. A tank model (TM) was proposed and calibrated with monthly observed streamflow from 1991 to 1998. In the validation period, despite a simulation deviation during the water-storage and power-generation period, simulated streamflow agrees favourably with observations from 2008 to 2013. The model was further validated against the areas of two lakes inside the basin, obtained from Landsat 5, 7 and 8 datasets from 2000 to 2014, and significant correlations were found between the simulated lake-outlet runoff and the respective lake areas. Ensembled data from 21 Global Climate Models (GCMs) under three emission scenarios (SRA2, SRA1B and SRB1) were then downscaled and used as input to the TM to simulate runoff change for three benchmark periods: 2011-2030 (2020s), 2046-2065 (2050s) and 2080-2099 (2090s). Although temperature increases dramatically, the projections consistently indicate an increasing streamflow trend in the long term. The runoff increase is mainly caused by increasing precipitation and decreasing evaporation. The distribution of water resources is projected to shift from summer-autumn dominant to autumn-winter dominant. The lowest annual runoff will occur in May, caused by earlier snowmelt and increasing evaporation in March. According to these results, winter runoff should be artificially stored through reservoir regulation in the future to prevent zero-flow occurrences in May. This research is helpful for water resources management and provides a better understanding of streamflow change caused by climate change in the future.
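    The tank-model family referenced here represents a catchment as storages that drain through outlets at different heights. As a minimal illustration (not the calibrated HSRYR model; all parameter values below are hypothetical), a single-tank daily water balance can be sketched as:

```python
import numpy as np

def tank_model(precip, evap, k1=0.2, k2=0.05, h1=10.0, s0=50.0):
    """Single-tank runoff sketch: one storage with a side outlet at height h1
    (coefficient k1) and a bottom outlet (coefficient k2); precipitation fills
    the tank, evaporation empties it, and discharge is linear in storage."""
    s = s0
    runoff = []
    for p, e in zip(precip, evap):
        s = max(s + p - e, 0.0)                 # water balance, no negative storage
        q = k1 * max(s - h1, 0.0) + k2 * s      # side-outlet + bottom-outlet discharge
        q = min(q, s)                           # cannot release more than is stored
        s -= q
        runoff.append(q)
    return np.array(runoff)

# A rain pulse followed by dry days produces a recession:
q = tank_model([20.0, 0.0, 0.0], [1.0, 1.0, 1.0])
```

    Real tank models chain several such tanks vertically to separate fast and slow runoff components.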

  2. Short-term streamflow forecasting with global climate change implications A comparative study between genetic programming and neural network models

    Science.gov (United States)

    Makkeasorn, A.; Chang, N. B.; Zhou, X.

    2008-05-01

    Sustainable water resources management is a critically important priority across the globe. While water scarcity limits the uses of water in many ways, floods may also result in property damage and loss of life. To use the limited amount of water more efficiently in a changing world, and to provide adequate lead time for flood warning, these issues have led us to seek advanced techniques for improving short-term streamflow forecasting. This study emphasizes the inclusion of sea surface temperature (SST) in addition to the spatio-temporal rainfall distribution via the Next Generation Radar (NEXRAD), meteorological data via local weather stations, and historical stream data via USGS gage stations to collectively forecast discharges in a semi-arid watershed in south Texas. Two types of artificial intelligence models, genetic programming (GP) and neural network (NN) models, were employed comparatively. Four numerical evaluators were used to assess the validity of a suite of forecasting models. Research findings indicate that the GP-derived streamflow forecasting models were generally favored in the assessment, and that both SST and meteorological data significantly improve forecasting accuracy. Among several scenarios, NEXRAD rainfall data proved most effective for a 3-day forecast, and the SST Gulf-to-Atlantic index showed a larger impact on the streamflow forecasts than the SST Gulf-to-Pacific index. The most forward-looking GP-derived models can even produce a 30-day streamflow forecast with an r-squared of 0.84 and an RMS error of 5.4 in our study.

  3. Simulation of streamflow in the Pleasant, Narraguagus, Sheepscot, and Royal Rivers, Maine, using watershed models

    Science.gov (United States)

    Dudley, Robert W.; Nielsen, Martha G.

    2011-01-01

    The U.S. Geological Survey (USGS) began a study in 2008 to investigate anticipated changes in summer streamflows and stream temperatures in four coastal Maine river basins and the potential effects of those changes on populations of endangered Atlantic salmon. To achieve this purpose, it was necessary to characterize the quantity and timing of streamflow in these rivers by developing and evaluating a distributed-parameter watershed model for a part of each river basin using the USGS Precipitation-Runoff Modeling System (PRMS). The GIS (geographic information system) Weasel, a USGS software application, was used to delineate the four study basins and their many subbasins and to derive parameters for their geographic features. The models were calibrated using a four-step optimization procedure in which model output was evaluated against four datasets for calibrating solar radiation, potential evapotranspiration, annual and seasonal water balances, and daily streamflows. The calibration procedure involved thousands of model runs using the USGS software application Luca (Let us calibrate), which uses the Shuffled Complex Evolution (SCE) global search algorithm to calibrate the model parameters. The calibrated watershed models performed satisfactorily: Nash-Sutcliffe efficiency (NSE) statistic values for the calibration periods ranged from 0.59 to 0.75 (on a scale of negative infinity to 1), and NSE values for the evaluation periods ranged from 0.55 to 0.73. The calibrated watershed models simulate daily streamflow at many locations in each study basin. These models enable natural resources managers to characterize the timing and amount of streamflow in support of a variety of water-resources efforts, including water-quality calculations, assessments of water use, modeling of population dynamics and migration of Atlantic salmon, modeling and assessment of habitat, and simulation of anticipated changes to streamflow and water temperature.
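    The Nash-Sutcliffe efficiency used to judge these calibrations compares the residual variance of the simulation against the variance of the observations. A minimal sketch of the computation:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of residual variance to
    observation variance. 1 is a perfect fit; 0 means the simulation is no
    better than predicting the observed mean; negative values are worse."""
    o = np.asarray(observed, dtype=float)
    s = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((o - s) ** 2) / np.sum((o - np.mean(o)) ** 2)
```

    For example, a simulation that always outputs the observed mean scores exactly 0, which is why NSE values of 0.59-0.75 indicate genuine skill.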

  4. Selective Tree-ring Models: A Novel Method for Reconstructing Streamflow Using Tree Rings

    Science.gov (United States)

    Foard, M. B.; Nelson, A. S.; Harley, G. L.

    2017-12-01

    Surface water is among the most instrumental and vulnerable resources in the Northwest United States (NW). Recent observations show that overall water quantity is declining in streams across the region, while extreme flooding events occur more frequently. Historical streamflow models inform the probabilities of extreme flow events (flood or drought) by describing the frequency and duration of past events. There are numerous examples of tree rings being used to reconstruct streamflow in the NW. These models confirm that tree rings are highly accurate predictors of streamflow; however, many nuances limit their applicability through time and space. For example, most models predict streamflow from hydrologically altered rivers (e.g., dammed, channelized), which may hinder our ability to predict natural prehistoric flow. They also tend to over- or under-predict extreme flow events. Moreover, they often fail to capture the changing relationships between tree growth and streamflow over time and space. To address these limitations, we utilized national tree-ring and streamflow archives to investigate the relationships between the growth of multiple coniferous species and free-flowing streams across the NW using novel species- and site-specific streamflow models - a term we coined "selective tree-ring models." Correlation function analysis and regression modeling were used to evaluate the strengths and directions of the flow-growth relationships. Species with significant relationships in the same direction were identified as strong candidates for selective models. Temporal and spatial patterns of these relationships were examined using running correlations and inverse distance weighting interpolation, respectively. Our early results indicate that (1) species adapted to extreme climates (e.g., hot-dry, cold-wet) exhibit the most consistent relationships across space, (2) these relationships weaken in locations with mild climatic variability, and (3) some

  5. Using a predictive model to evaluate spatiotemporal variability in streamflow permanence across the Pacific Northwest region

    Science.gov (United States)

    Jaeger, K. L.

    2017-12-01

    The U.S. Geological Survey (USGS) has developed the PRObability Of Streamflow PERmanence (PROSPER) model, a GIS-based empirical model that provides predictions of the annual probability of a stream channel having year-round flow (streamflow permanence probability, SPP) for any unregulated and minimally impaired stream channel in the Pacific Northwest (Washington, Oregon, Idaho, western Montana). The model provides annual predictions for 2004-2016 at a 30-m spatial resolution based on monthly or annually updated climatic conditions and static physiographic variables associated with the upstream basin. Prediction locations correspond to the channel network consistent with the National Hydrography Dataset stream grid and are publicly available through the USGS StreamStats platform (https://water.usgs.gov/osw/streamstats/). In snowmelt-driven systems, the most informative predictor variable was mean upstream snow water equivalent on May 1, which highlights the influence of late-spring snow cover in supporting streamflow in mountain river networks. In non-snowmelt-driven systems, the most informative variable was mean annual precipitation. Streamflow permanence probabilities varied across the study area by geography and from year to year. Notably lower SPP values corresponded to the climatically drier subregions of the study area, while higher SPP values were concentrated in coastal and higher-elevation mountain regions. In addition, SPP appeared to trend with average hydroclimatic conditions, which were also geographically coherent. The year-to-year variability lends support to the growing recognition of the spatiotemporal dynamism of streamflow permanence. An analysis of three focus basins located in contrasting geographical and hydroclimatic settings demonstrates differences in the sensitivity of streamflow permanence to antecedent climate conditions as a function of geography. Consequently, the results suggest that the PROSPER model can be a useful tool to evaluate regions of the

  6. Simulating the effects of ground-water withdrawals on streamflow in a precipitation-runoff model

    Science.gov (United States)

    Zarriello, Philip J.; Barlow, P.M.; Duda, P.B.

    2004-01-01

    Precipitation-runoff models are used to assess the effects of water use and management alternatives on streamflow. Often, ground-water withdrawals are a major water-use component affecting streamflow, but the ability of surface-water models to simulate ground-water withdrawals is limited. As part of a Hydrologic Simulation Program-FORTRAN (HSPF) precipitation-runoff model developed to analyze the effect of ground-water and surface-water withdrawals on streamflow in the Ipswich River in northeastern Massachusetts, an analytical technique (STRMDEPL) was developed for calculating the effects of pumped wells on streamflow. STRMDEPL is a FORTRAN program based on two analytical solutions that solve equations for ground-water flow to a well completed in a semi-infinite, homogeneous, and isotropic aquifer in direct hydraulic connection to a fully penetrating stream. One analytical method calculates unimpeded flow at the stream-aquifer boundary, and the other calculates the resistance to flow caused by semipervious streambed and streambank material. The principle of superposition is used with these analytical equations to calculate time-varying streamflow depletions due to daily pumping. The HSPF model can readily incorporate streamflow depletions caused by one or more wells or surface-water withdrawals, or both, as a combined time-varying outflow demand from the affected channel reaches. These demands are stored as a time series in the Watershed Data Management (WDM) file and read into the model as an external source used to specify flow from the first outflow gate in the reach where the withdrawals are located. Although the STRMDEPL program can be run independently of the HSPF model, an extension was developed to run this program within GenScn, a scenario generator and graphical user interface developed for use with the HSPF model. This extension requires that actual pumping rates for each well be stored
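    For context, the classic unimpeded stream-aquifer solution of the kind STRMDEPL builds on is a complementary-error-function depletion fraction (the Glover-type formulation), superposed over step changes in pumping. A sketch with purely illustrative aquifer parameters (distance d, transmissivity T, storativity S; not values from the Ipswich model):

```python
import math

def depletion_fraction(t_days, d=100.0, T=500.0, S=0.2):
    """Glover-type fraction of the pumping rate captured from the stream at
    time t after pumping starts: erfc( sqrt(d^2 * S / (4 * T * t)) ) for a
    fully penetrating stream with no streambed resistance."""
    if t_days <= 0:
        return 0.0
    return math.erfc(math.sqrt(d * d * S / (4.0 * T * t_days)))

def streamflow_depletion(pump_rates, **aquifer):
    """Superpose step changes in the daily pumping schedule (principle of
    superposition): each change contributes its own depletion response."""
    steps, prev = [], 0.0                 # (start_day, rate_change) pairs
    for day, q in enumerate(pump_rates):
        if q != prev:
            steps.append((day, q - prev))
            prev = q
    n = len(pump_rates)
    return [sum(dq * depletion_fraction(t - t0, **aquifer) for t0, dq in steps)
            for t in range(1, n + 1)]
```

    Under constant pumping the depletion rises monotonically toward the pumping rate, which is the behavior the time-series demand in the WDM file encodes for HSPF.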

  7. A comparative analysis of 9 multi-model averaging approaches in hydrological continuous streamflow simulation

    Science.gov (United States)

    Arsenault, Richard; Gatien, Philippe; Renaud, Benoit; Brissette, François; Martel, Jean-Luc

    2015-10-01

    This study aims to test whether a weighted combination of several hydrological models can simulate flows more accurately than the models taken individually. In addition, the project attempts to identify the most efficient model averaging method and the optimal number of models to include in the weighting scheme. To address the first objective, streamflow was simulated using four lumped hydrological models (HSAMI, HMETS, MOHYSE and GR4J-6), each calibrated with three different objective functions on 429 watersheds. The resulting 12 hydrographs (4 models × 3 metrics) were weighted and combined using nine averaging methods: the simple arithmetic mean (SAM), Akaike information criterion averaging (AICA), Bates-Granger averaging (BGA), Bayes information criterion averaging (BICA), Bayesian model averaging (BMA), Granger-Ramanathan averaging variants A, B and C (GRA, GRB and GRC), and averaging by SCE-UA optimization (SCA). The same weights were then applied to the hydrographs in validation mode, and the Nash-Sutcliffe efficiency metric was computed between the averaged and observed hydrographs. Statistical analyses were performed to compare the accuracy of the weighted methods with that of the individual models. A Kruskal-Wallis test and a multi-objective optimization algorithm were then used to identify the most efficient weighting method and the optimal number of models to integrate. Results suggest that the GRA, GRB, GRC and SCA methods perform better than the individual members. Model averages from these four methods were superior to the best individual member in 76% of the cases. Optimal combinations on all watersheds included at least one of each of the four hydrological models, and none of the optimal combinations included all 12 members of the ensemble. The Granger-Ramanathan averaging variant C (GRC) is recommended as the best compromise between accuracy, speed of execution, and simplicity.
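    Two of the simpler weighting ideas can be sketched directly: the simple arithmetic mean assigns equal weights, while an unconstrained least-squares fit of the observations on the member hydrographs yields regression-style weights in the spirit of Granger-Ramanathan variant A (an illustrative reading of the approach, not the paper's code):

```python
import numpy as np

def average_weights(sim, obs, method="sam"):
    """sim: (n_time, n_models) member hydrographs; obs: (n_time,) observations.
    'sam' returns equal weights; 'ls' solves an unconstrained least-squares
    problem for the weights (Granger-Ramanathan-style, no sum-to-one or
    positivity constraints)."""
    sim = np.asarray(sim, dtype=float)
    obs = np.asarray(obs, dtype=float)
    if method == "sam":
        return np.full(sim.shape[1], 1.0 / sim.shape[1])
    w, *_ = np.linalg.lstsq(sim, obs, rcond=None)
    return w

# The combined hydrograph is then simply the weighted sum:  q_avg = sim @ w
```

    The weights are fixed in calibration and reused in validation, mirroring the study's protocol.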

  8. SWAT-based streamflow and embayment modeling of Karst-affected Chapel branch watershed, South Carolina

    Science.gov (United States)

    Devendra Amatya; M. Jha; A.E. Edwards; T.M. Williams; D.R. Hitchcock

    2011-01-01

    SWAT is a GIS-based basin-scale model widely used for the characterization of hydrology and water quality of large, complex watersheds; however, SWAT has not been fully tested in watersheds with karst geomorphology and downstream reservoir-like embayment. In this study, SWAT was applied to test its ability to predict monthly streamflow dynamics for a 1,555 ha karst...

  9. 78 FR 13874 - Watershed Modeling To Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to...

    Science.gov (United States)

    2013-03-01

    ... EPA's policy to include all comments it receives in the public docket without change and to make the... Modeling To Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to Climate Change and Urban... Loads to Climate Change and Urban Development in 20 U.S. Watersheds (EPA/600/R-12/058). EPA also is...

  10. Improving streamflow simulations and forecasting performance of SWAT model by assimilating remotely sensed soil moisture observations

    Science.gov (United States)

    Patil, Amol; Ramsankaran, RAAJ

    2017-12-01

    This article presents a study using EnKF-based assimilation of coarser-scale SMOS soil moisture retrievals to improve the streamflow simulation and forecasting performance of the SWAT model in a large catchment. The study was carried out in the Munneru river catchment, India, which covers about 10,156 km2. A new EnKF-based approach is proposed for improving the inherent vertical coupling of the soil layers of the SWAT hydrological model during soil moisture data assimilation. Evaluation of the vertical error correlation between surface and subsurface layers indicates that the vertical coupling can be improved significantly using an ensemble of soil storages compared with the traditional static-soil-storages EnKF approach. However, the improvements in simulated streamflow are moderate, owing to limitations of the SWAT model in reflecting profile soil moisture updates in surface-runoff computations. Further, the streamflow improvements are more durable when the assimilation system effectively updates the subsurface flow component. Overall, the results indicate that passive microwave-based coarser-scale soil moisture products such as SMOS hold significant potential to improve streamflow estimates when assimilated into large-scale distributed hydrological models operating at a daily time step.
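    The analysis step of a stochastic EnKF, the filter family used here, nudges each ensemble member toward a perturbed observation using a gain estimated from ensemble statistics. A toy one-observation sketch (not the SWAT-SMOS implementation; names and shapes are illustrative):

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator_h, obs_var):
    """Stochastic EnKF analysis step for a single scalar observation.
    ensemble: (n_members, n_state) state members; obs: observed value of
    H @ x; obs_operator_h: (n_state,) linear observation operator;
    obs_var: observation-error variance."""
    rng = np.random.default_rng(0)                 # fixed seed for a repeatable sketch
    X = np.asarray(ensemble, dtype=float)
    H = np.asarray(obs_operator_h, dtype=float)
    hx = X @ H                                     # predicted observations per member
    cov_xy = np.mean((X - X.mean(0)) * (hx - hx.mean())[:, None], axis=0)
    gain = cov_xy / (np.var(hx) + obs_var)         # (n_state,) Kalman gain
    y_pert = obs + rng.normal(0.0, np.sqrt(obs_var), size=len(X))
    return X + gain[None, :] * (y_pert - hx)[:, None]
```

    States uncorrelated with the observation receive zero gain and are left untouched, which is why improving the vertical (surface-to-subsurface) error correlation matters for propagating a surface soil moisture update downward.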

  11. Stochastic model for simulating Souris River Basin precipitation, evapotranspiration, and natural streamflow

    Science.gov (United States)

    Kolars, Kelsey A.; Vecchia, Aldo V.; Ryberg, Karen R.

    2016-02-24

    The Souris River Basin is a 61,000-square-kilometer basin in the Provinces of Saskatchewan and Manitoba and the State of North Dakota. In May and June of 2011, record-setting rains fell in the headwater areas of the basin; emergency spillways of major reservoirs discharged at or near full capacity, and extensive flooding occurred in numerous downstream communities. To determine the probability of future extreme floods and droughts, the U.S. Geological Survey, in cooperation with the North Dakota State Water Commission, developed a stochastic model for simulating Souris River Basin precipitation, evapotranspiration, and natural (unregulated) streamflow. Simulations from the model can be used in future studies to simulate regulated streamflow, to design levees and other structures, and to complete economic cost/benefit analyses. Long-term climatic variability was analyzed using tree-ring chronologies to hindcast precipitation back to the early 1700s and to compare recent wet and dry conditions with earlier extremes. The extended precipitation record was consistent with findings from the Devils Lake and Red River of the North Basins (southeast of the Souris River Basin), supporting the idea that regional climatic patterns have for many centuries consisted of alternating wet and dry climate states. A stochastic climate simulation model for precipitation, temperature, and potential evapotranspiration for the Souris River Basin was developed using recorded meteorological data and extended precipitation records provided through tree-ring analysis. A significant climate transition occurred around 1970, with 1912-69 representing a dry climate state and 1970-2011 representing a wet climate state. Although there were some distinct subpatterns within the basin, the predominant differences between the two states were higher spring through early-fall precipitation and higher spring potential evapotranspiration in the wet state compared with the dry state. A water
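    Alternating persistent wet and dry climate states of the kind described above are often emulated with a simple two-state Markov chain. A toy sketch (the persistence probability is hypothetical, not a value fitted by the Souris River model):

```python
import random

def simulate_climate_states(n_years, p_stay=0.95, start="dry", seed=42):
    """Two-state (wet/dry) Markov chain for the annual climate state.
    p_stay is the illustrative probability of remaining in the current state
    each year; high persistence yields multi-decade wet or dry runs like the
    1912-69 dry and 1970-2011 wet states."""
    rng = random.Random(seed)
    states, state = [], start
    for _ in range(n_years):
        states.append(state)
        if rng.random() > p_stay:               # rare switch between states
            state = "wet" if state == "dry" else "dry"
    return states
```

    In a full stochastic weather generator, the sampled state would then condition the distributions from which precipitation and temperature are drawn.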

  12. Novel approach for streamflow forecasting using a hybrid ANFIS-FFA model

    Science.gov (United States)

    Yaseen, Zaher Mundher; Ebtehaj, Isa; Bonakdari, Hossein; Deo, Ravinesh C.; Danandeh Mehr, Ali; Mohtar, Wan Hanna Melini Wan; Diop, Lamine; El-shafie, Ahmed; Singh, Vijay P.

    2017-11-01

    The present study proposes a new hybrid evolutionary Adaptive Neuro-Fuzzy Inference System (ANFIS) approach for monthly streamflow forecasting. The proposed method is a novel combination of the ANFIS model with the firefly algorithm as an optimizer, yielding a hybrid ANFIS-FFA model. The results of the ANFIS-FFA model are compared with those of the classical ANFIS model, which uses fuzzy c-means (FCM) clustering in the Fuzzy Inference System (FIS) generation. Historical monthly streamflow data for the Pahang River, a major river system in Malaysia characterized by highly stochastic hydrological patterns, are used in the study. Sixteen different input combinations with one to five time-lagged input variables are incorporated into the ANFIS-FFA and ANFIS models to account for the antecedent seasonal variations in the historical streamflow data. The mean absolute error (MAE), root mean square error (RMSE) and correlation coefficient (r) are used to evaluate the forecasting performance of the ANFIS-FFA model. In conjunction with these metrics, the refined Willmott's index (Drefined), Nash-Sutcliffe coefficient (ENS) and Legates-McCabe index (ELM) are also utilized as normalized goodness-of-fit metrics. Comparison of the results reveals that the FFA improves the forecasting accuracy of the hybrid ANFIS-FFA model (r = 1; RMSE = 0.984; MAE = 0.364; ENS = 1; ELM = 0.988; Drefined = 0.994) for monthly streamflow forecasting relative to the traditional ANFIS model (r = 0.998; RMSE = 3.276; MAE = 1.553; ENS = 0.995; ELM = 0.950; Drefined = 0.975). The results also show that the ANFIS-FFA is not only superior to the ANFIS model but also provides a parsimonious modelling framework for streamflow forecasting, requiring a smaller number of input variables to yield comparatively better performance. It is construed that the FFA optimizer can thus surpass the accuracy of the traditional ANFIS model in general.
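    The time-lagged input combinations used by both models can be generated mechanically from the streamflow series. A sketch of building such a lag matrix (the function name and shapes are illustrative):

```python
import numpy as np

def lagged_inputs(series, n_lags):
    """Build a forecasting design matrix of antecedent flows: the row for
    time t holds [q(t-1), ..., q(t-n_lags)] with target q(t), as in
    lag-based input combinations for data-driven streamflow models."""
    q = np.asarray(series, dtype=float)
    X = np.column_stack([q[n_lags - k: len(q) - k] for k in range(1, n_lags + 1)])
    y = q[n_lags:]
    return X, y
```

    Sweeping n_lags from 1 to 5 (and subsets of those columns) reproduces the kind of sixteen-combination input search described in the abstract.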

  13. Watershed-scale modeling of streamflow change in incised montane meadows

    Science.gov (United States)

    Essaid, Hedeff I.; Hill, Barry R.

    2014-01-01

    Land use practices have caused stream channel incision and water table decline in many montane meadows of the Western United States. Incision changes the magnitude and timing of streamflow in water supply source watersheds, a concern to resource managers and downstream water users. The hydrology of montane meadows under natural and incised conditions was investigated using watershed simulation for a range of hydrologic conditions. The results illustrate the interdependence between: watershed and meadow hydrology; bedrock and meadow aquifers; and surface and groundwater flow through the meadow for the modeled scenarios. During the wet season, stream incision resulted in less overland flow and interflow and more meadow recharge causing a net decrease in streamflow and increase in groundwater storage relative to natural meadow conditions. During the dry season, incision resulted in less meadow evapotranspiration and more groundwater discharge to the stream causing a net increase in streamflow and a decrease in groundwater storage relative to natural meadow conditions. In general, for a given meadow setting, the magnitude of change in summer streamflow and long-term change in watershed groundwater storage due to incision will depend on the combined effect of: reduced evapotranspiration in the eroded meadow; induced groundwater recharge; replenishment of dry season groundwater storage depletion in meadow and bedrock aquifers by precipitation during wet years; and groundwater storage depletion that is not replenished by precipitation during wet years.

  14. Use of medium-range numerical weather prediction model output to produce forecasts of streamflow

    Science.gov (United States)

    Clark, M.P.; Hay, L.E.

    2004-01-01

    This paper examines an archive containing over 40 years of 8-day atmospheric forecasts over the contiguous United States from the NCEP reanalysis project to assess the possibilities for using medium-range numerical weather prediction model output for predictions of streamflow. This analysis shows the biases in the NCEP forecasts to be quite extreme. In many regions, systematic precipitation biases exceed 100% of the mean, with temperature biases exceeding 3°C; in some locations, biases are even higher. The accuracy of NCEP precipitation and 2-m maximum temperature forecasts is computed by interpolating the NCEP model output for each forecast day to the location of each station in the NWS cooperative network and computing the correlation with station observations. Results show that the accuracy of the NCEP forecasts is rather low in many areas of the country. Most apparent are the generally low skill in precipitation forecasts (particularly in July) and the low skill in temperature forecasts in the western United States, the eastern seaboard, and the southern tier of states. These results outline a clear need for additional processing of the NCEP Medium-Range Forecast Model (MRF) output before it is used for hydrologic predictions. Techniques of model output statistics (MOS) are used in this paper to downscale the NCEP forecasts to station locations. Forecasted atmospheric variables (e.g., total column precipitable water, 2-m air temperature) are used as predictors in a forward-screening multiple linear regression model to improve forecasts of precipitation and temperature for stations in the National Weather Service cooperative network. This procedure effectively removes the systematic biases in the raw NCEP precipitation and temperature forecasts. MOS guidance also results in substantial improvements in the accuracy of maximum and minimum temperature forecasts throughout the country. For precipitation, forecast improvements were less impressive. MOS guidance increases
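    Forward-screening multiple linear regression, the MOS technique named above, greedily adds at each step the predictor that most reduces the residual sum of squares. An illustrative sketch (not the NCEP/NWS production code):

```python
import numpy as np

def forward_screen(X, y, max_predictors=3):
    """Forward-screening multiple linear regression: starting from an
    intercept-only model, repeatedly add the predictor column whose
    inclusion most reduces the residual sum of squares."""
    n = len(y)
    chosen, design = [], np.ones((n, 1))          # intercept-only starting model
    for _ in range(max_predictors):
        best = None
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            trial = np.hstack([design, X[:, j:j + 1]])
            coef, *_ = np.linalg.lstsq(trial, y, rcond=None)
            rss = np.sum((y - trial @ coef) ** 2)
            if best is None or rss < best[0]:
                best = (rss, j, trial)
        chosen.append(best[1])
        design = best[2]
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return chosen, coef
```

    Operational MOS schemes add a stopping criterion (e.g., minimum variance reduction) rather than a fixed predictor count; that refinement is omitted here for brevity.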

  15. Distributed HUC-based modeling with SUMMA for ensemble streamflow forecasting over large regional domains.

    Science.gov (United States)

    Saharia, M.; Wood, A.; Clark, M. P.; Bennett, A.; Nijssen, B.; Clark, E.; Newman, A. J.

    2017-12-01

    Most operational streamflow forecasting systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow require an experienced human forecaster. But this approach faces challenges surrounding process reproducibility, hindcasting capability, and extension to large domains. The operational hydrologic community is increasingly moving towards 'over-the-loop' (completely automated) large-domain simulations, yet recent developments indicate a widespread lack of community knowledge about the strengths and weaknesses of such systems for forecasting. A realistic representation of land surface hydrologic processes is a critical element for improving forecasts, but it often comes at a substantial cost in forecast-system agility and efficiency. While popular grid-based models support the distributed representation of land surface processes, intermediate-scale Hydrologic Unit Code (HUC)-based modeling could provide a more efficient and process-aligned spatial discretization, reducing the need for tradeoffs between model complexity and critical forecasting requirements such as ensemble methods and comprehensive model calibration. The National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation, and the USACE to implement, assess, and demonstrate real-time, over-the-loop distributed streamflow forecasting for several large western US river basins and regions. In this presentation, we present early results from short- to medium-range hydrologic and streamflow forecasts for the Pacific Northwest (PNW). We employ real-time 1/16th-degree daily ensemble model forcings as well as downscaled Global Ensemble Forecast System (GEFS) meteorological forecasts. These datasets drive an intermediate-scale configuration of the Structure for Unifying Multiple Modeling Alternatives (SUMMA) model, which represents the PNW using over 11,700 HUCs. The system produces not only streamflow forecasts (using the Mizu

  16. Modelling the effects of land use changes on the streamflow of a peri-urban catchment in central Portugal

    Science.gov (United States)

    Hävermark, Saga; Santos Ferreira, Carla Sofia; Kalantari, Zahra; Di Baldassarre, Giuliano

    2016-04-01

    The model was calibrated for the hydrological years 2008 to 2010 and validated for the three following years using streamflow data. The impact of future land use changes was analysed by investigating the effect of the size and location of urban areas within the catchment. Modelling results are expected to support the decision-making process in planning and developing new urban areas.

  17. Assessment of land-use change on streamflow using GIS, remote sensing and a physically-based model, SWAT

    Directory of Open Access Journals (Sweden)

    J. Y. G. Dos Santos

    2014-09-01

    Full Text Available This study aims to assess the impact of land-use changes between the periods 1967-1974 and 1997-2008 on the streamflow of the Tapacurá catchment (northeastern Brazil) using the Soil and Water Assessment Tool (SWAT) model. The results show that the most sensitive parameters were the baseflow, the Manning factor, the time of concentration and the soil evaporation compensation factor, which affect the catchment hydrology. Model calibration and validation were performed on a monthly basis, and the streamflow simulation showed a good level of accuracy for both periods. The R2 and Nash-Sutcliffe efficiency values obtained were, respectively, 0.82 and 0.81 for 1967-1974, and 0.93 and 0.92 for 1997-2008. Evaluation of the SWAT model response to land cover showed that the mean monthly flow during the rainy seasons of 1967-1974 decreased compared with 1997-2008.

  18. Improved Assimilation of Streamflow and Satellite Soil Moisture with the Evolutionary Particle Filter and Geostatistical Modeling

    Science.gov (United States)

    Yan, Hongxiang; Moradkhani, Hamid; Abbaszadeh, Peyman

    2017-04-01

    Assimilation of satellite soil moisture and streamflow data into hydrologic models has received increasing attention over the past few years, and these observations are increasingly used to improve model streamflow and soil moisture predictions. However, the performance of such land data assimilation (DA) systems still suffers from two limitations: (1) satellite data scarcity and quality; and (2) particle weight degeneration. To overcome these two limitations, we propose two solutions in this study. First, a general Gaussian geostatistical approach is proposed to overcome the limited space/time resolution of satellite soil moisture products, improving their accuracy at uncovered/biased grid cells. Second, an evolutionary particle filter approach based on Genetic Algorithm (GA) and Markov Chain Monte Carlo (MCMC), the so-called EPF-MCMC, is developed to further reduce weight degeneration and improve the robustness of the land DA system. This study provides a detailed analysis of the joint and separate assimilation of streamflow and satellite soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, using the recently developed EPF-MCMC and the general Gaussian geostatistical approach. Performance is assessed over several basins in the USA selected from the Model Parameter Estimation Experiment (MOPEX) and located in different climate regions. The results indicate that: (1) the general Gaussian approach can predict soil moisture at uncovered grid cells within the expected satellite data quality threshold; (2) assimilation of satellite soil moisture inferred from the general Gaussian model can significantly improve soil moisture predictions; and (3) in terms of both deterministic and probabilistic measures, the EPF-MCMC achieves better streamflow predictions. These results suggest that the geostatistical model is a helpful tool to aid the remote sensing technique and the EPF-MCMC is a
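    Particle weight degeneration, the second limitation targeted here, is conventionally diagnosed with the effective sample size of the normalized particle weights:

```python
import numpy as np

def effective_sample_size(weights):
    """Effective sample size of a particle set: N_eff = 1 / sum(w_i^2) for
    normalized weights. Values near 1 signal the degeneration that
    resampling (or evolutionary moves like EPF-MCMC) is meant to combat;
    values near the ensemble size indicate healthy weight diversity."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)
```

    A common rule of thumb triggers resampling or rejuvenation when N_eff falls below some fraction (e.g., half) of the particle count.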

  19. Projected Changes to Streamflow Characteristics in Quebec Basins as Simulated by the Canadian Regional Climate Model (CRCM4)

    Science.gov (United States)

    Huziy, O.; Sushama, L.; Khaliq, M.; Lehner, B.; Laprise, R.; Roy, R.

    2011-12-01

    According to the Intergovernmental Panel on Climate Change (IPCC, 2007), an intensification of the global hydrological cycle and an increase in precipitation for some regions around the world, including the northern mid- to high-latitudes, are expected in future climate. This will affect mean and extreme flow characteristics, which need to be assessed for better development of adaptation strategies. Mean and extreme streamflow characteristics for Quebec (north-eastern Canada) basins in current climate, and their projected changes in future climate, are assessed using a 10-member ensemble of current (1970-1999) and future (2041-2070) Canadian RCM (CRCM4) simulations. Validation of streamflow characteristics, performed by comparing modelled values with observations available from the Centre d'expertise hydrique du Quebec (CEHQ), shows that the model captures high flows reasonably well. Results suggest increases in the mean and in 10-year return levels of 1-day high flows, which appear significant for most of the northern basins.

  20. Artificial intelligence based models for stream-flow forecasting: 2000-2015

    Science.gov (United States)

    Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba

    2015-11-01

    The use of Artificial Intelligence (AI) has increased since the middle of the 20th century, as seen in its application to a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on defining the data-driven approaches of AI, the advantages of complementary models, the relevant literature, and possible future applications in modeling and forecasting stream-flow. The review also identifies major challenges and opportunities for prospective research, including a new scheme for modeling the inflow, a novel method for preprocessing time series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.

  1. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modelling heteroscedastic residual errors

    Science.gov (United States)

    David, McInerney; Mark, Thyer; Dmitri, Kavetski; George, Kuczera

    2017-04-01

    This study provides guidance that enables hydrological researchers to produce probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of the residual errors of hydrological models. It is well known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors for higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of them. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of the empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Recommendations are provided for researchers and practitioners seeking robust residual error schemes for practical work.

  2. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modeling heteroscedastic residual errors

    Science.gov (United States)

    McInerney, David; Thyer, Mark; Kavetski, Dmitri; Lerat, Julien; Kuczera, George

    2017-03-01

    Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. This study focuses on approaches for representing error heteroscedasticity with respect to simulated streamflow, i.e., the pattern of larger errors in higher streamflow predictions. We evaluate eight common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter λ) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the United States, and two lumped hydrological models. Performance is quantified using predictive reliability, precision, and volumetric bias metrics. We find the choice of heteroscedastic error modeling approach significantly impacts predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with λ of 0.2 and 0.5, and the log scheme (λ = 0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Paradoxically, calibration of λ is often counterproductive: in perennial catchments, it tends to overfit low flows at the expense of abysmal precision in high flows. The log-sinh transformation is dominated by the simpler Pareto optimal schemes listed above. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
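
A minimal sketch of the Box-Cox residual error scheme evaluated in these two studies, using the Pareto optimal value λ = 0.2. The flow values below are invented for illustration, and a practical application would also handle zero flows (e.g. via the offset parameter):

```python
import math

def box_cox(q, lam, offset=0.0):
    """Box-Cox transform z = ((q + offset)**lam - 1) / lam; lam = 0 is log."""
    if lam == 0.0:
        return math.log(q + offset)
    return ((q + offset) ** lam - 1.0) / lam

def inv_box_cox(z, lam, offset=0.0):
    """Back-transform from z-space to flow space."""
    if lam == 0.0:
        return math.exp(z) - offset
    return (lam * z + 1.0) ** (1.0 / lam) - offset

# Residuals are characterized in transformed space, where their spread is
# roughly constant, rather than in flow space where errors grow with flow.
simulated = [1.0, 5.0, 20.0, 80.0]    # illustrative daily flows
observed = [1.2, 4.1, 24.0, 70.0]
lam = 0.2                             # one of the Pareto optimal values
residuals = [box_cox(o, lam) - box_cox(s, lam)
             for o, s in zip(observed, simulated)]
```

Probabilistic predictions are then generated by sampling residuals in z-space and back-transforming with `inv_box_cox`, which naturally yields wider predictive intervals at high flows.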

  3. Validation of HEDR models

    International Nuclear Information System (INIS)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provides a level of confidence that the HEDR models are valid

  4. Skills of General Circulation and Earth System Models in reproducing streamflow to the ocean: the case of Congo river

    Science.gov (United States)

    Santini, M.; Caporaso, L.

    2017-12-01

    Despite the importance of water resources in the context of climate change, it is still difficult to correctly simulate the freshwater cycle over land with General Circulation and Earth System Models (GCMs and ESMs). Existing efforts from the Climate Model Intercomparison Project 5 (CMIP5) were mainly devoted to the validation of atmospheric variables such as temperature and precipitation, with little attention to discharge. Here we investigate the present-day performance of GCMs and ESMs participating in CMIP5 in simulating the discharge of the Congo River to the sea, thanks to: i) the long-term availability of discharge data for the Kinshasa hydrological station, representative of more than 95% of the water flowing through the whole catchment; and ii) the river's still-low degree of human influence, which enables comparison with the (mostly) natural streamflow simulated within CMIP5. Our findings suggest that most models overestimate the streamflow in terms of seasonal cycle, especially in late winter and spring, while overestimation and variability across models are lower in late summer. Weighted ensemble means are also calculated, based on simulation performance as measured by several metrics, showing some improvement of the results. Although simulated inter-monthly and inter-annual percent anomalies do not appear significantly different from those in observed data, when translated into well-consolidated indicators of drought attributes (frequency, magnitude, timing, duration), usually adopted for more immediate communication to stakeholders and decision makers, such anomalies can be misleading. These inconsistencies produce incorrect assessments for water management planning and infrastructure (e.g. dams or irrigated areas), especially if models are used instead of measurements, as in the case of ungauged basins or basins with insufficient data, as well as when relying on models for future estimates without a preliminary quantification of model biases.

  5. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
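
In the spirit of HBV-Ensemble, a conceptual model of this kind can be reduced to a few coupled stores. The sketch below is a toy single-zone model with invented parameter values, not the toolbox's actual HBV implementation; it shows how a degree-day snow routine, a beta-function soil routine and a linear response reservoir interconnect, and how perturbing one parameter yields an ensemble:

```python
def bucket_model(precip, temp, fc=100.0, beta=2.0, k=0.2, tt=0.0, ddf=3.0):
    """Toy HBV-style model: degree-day snow routine, beta-function soil
    routine, and a single linear reservoir for the streamflow response."""
    snow, soil, store = 0.0, 0.5 * fc, 0.0
    flows = []
    for p, t in zip(precip, temp):
        if t < tt:                          # snow accumulation
            snow, liquid = snow + p, 0.0
        else:                               # degree-day snowmelt
            melt = min(snow, ddf * (t - tt))
            snow, liquid = snow - melt, p + melt
        recharge = liquid * (soil / fc) ** beta   # runoff generation
        soil = min(fc, soil + liquid - recharge)
        store += recharge
        q = k * store                       # linear reservoir outflow
        store -= q
        flows.append(q)
    return flows

# An ensemble: perturb the recession coefficient k to expose uncertainty.
precip = [10, 0, 0, 5, 20, 0, 0]            # invented daily forcing
temp = [-2, -1, 3, 4, 5, 6, 7]
ensemble = [bucket_model(precip, temp, k=kk) for kk in (0.1, 0.2, 0.3)]
```

Plotting the ensemble members against each other is the kind of exercise the toolbox automates: students see directly how parameter uncertainty propagates into streamflow uncertainty.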

  6. A simple model for assessing utilisable streamflow allocations in the ...

    African Journals Online (AJOL)

    -sized water resource systems (without major storage) and displays the results as flow duration curves so that they can be compared with the standard information available for the Ecological Reserve requirements. The model is designed to ...

  7. Using satellite-based rainfall estimates for streamflow modelling: Bagmati Basin

    Science.gov (United States)

    Shrestha, M.S.; Artan, Guleid A.; Bajracharya, S.R.; Sharma, R. R.

    2008-01-01

    In this study, we have described a hydrologic modelling system that uses satellite-based rainfall estimates and weather forecast data for the Bagmati River Basin of Nepal. The hydrologic model described is the US Geological Survey (USGS) Geospatial Stream Flow Model (GeoSFM). The GeoSFM is a spatially semidistributed, physically based hydrologic model. We have used the GeoSFM to estimate the streamflow of the Bagmati Basin at Pandhera Dovan hydrometric station. To determine the hydrologic connectivity, we have used the USGS Hydro1k DEM dataset. The model was forced by daily estimates of rainfall and evapotranspiration derived from weather model data. The rainfall estimates used for the modelling are those produced by the National Oceanic and Atmospheric Administration Climate Prediction Centre and observed at ground rain gauge stations. The model parameters were estimated from globally available soil and land cover datasets – the Digital Soil Map of the World by FAO and the USGS Global Land Cover dataset. The model predicted the daily streamflow at Pandhera Dovan gauging station. The comparison of the simulated and observed flows at Pandhera Dovan showed that the GeoSFM model performed well in simulating the flows of the Bagmati Basin.

  8. Being an honest broker of hydrology: Uncovering, communicating and addressing model error in a climate change streamflow dataset

    Science.gov (United States)

    Chegwidden, O.; Nijssen, B.; Pytlak, E.

    2017-12-01

    Any model simulation has errors, including errors in meteorological data, process understanding, model structure, and model parameters. These errors may express themselves as bias, timing lags, and differences in sensitivity between the model and the physical world. The evaluation and handling of these errors can greatly affect the legitimacy, validity and usefulness of the resulting scientific product. In this presentation we will discuss a case study of handling and communicating model errors during the development of a hydrologic climate change dataset for the Pacific Northwestern United States. The dataset was the result of a four-year collaboration between the University of Washington, Oregon State University, the Bonneville Power Administration, the United States Army Corps of Engineers and the Bureau of Reclamation. Along the way, the partnership facilitated the discovery of multiple systematic errors in the streamflow dataset. Through an iterative review process, some of those errors could be resolved. For the errors that remained, honest communication of the shortcomings promoted the dataset's legitimacy. Thoroughly explaining errors also improved ways in which the dataset would be used in follow-on impact studies. Finally, we will discuss the development of the "streamflow bias-correction" step often applied to climate change datasets that will be used in impact modeling contexts. We will describe the development of a series of bias-correction techniques through close collaboration among universities and stakeholders. Through that process, both universities and stakeholders learned about the others' expectations and workflows. This mutual learning process allowed for the development of methods that accommodated the stakeholders' specific engineering requirements. The iterative revision process also produced a functional and actionable dataset while preserving its scientific merit. 
We will describe how encountering earlier techniques' pitfalls allowed us

  9. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  10. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  11. River Stream-Flow and Zayanderoud Reservoir Operation Modeling Using the Fuzzy Inference System

    Directory of Open Access Journals (Sweden)

    Saeed Jamali

    2007-12-01

    The Zayanderoud basin is located on the central plateau of Iran. As a result of population increase and agricultural and industrial development, water demand in this basin has increased extensively. Given the importance of reservoir operation in water resources management studies, the performance of a fuzzy inference system (FIS) for Zayanderoud reservoir operation is investigated in this paper. The operation model consists of two parts. In the first part, the seasonal river stream-flow is forecasted using a fuzzy rule-based system. The Southern Oscillation Index, rain, snow, and discharge are the inputs of the model and the seasonal river stream-flow is its output. In the second part, the operation model is constructed. The amount of release is first optimized by a nonlinear optimization model and then the rule curves are extracted using the fuzzy inference system. This model operates on an "if-then" principle, where the "if" is a vector of fuzzy premises and the "then" is the fuzzy result. Reservoir storage capacity, inflow, demand, and a year-condition factor are used as premises. Monthly release is taken as the consequence. The Zayanderoud basin is investigated as a case study. Different performance indices such as reliability, resiliency, and vulnerability are calculated. According to the results, the FIS works more effectively than traditional reservoir operation methods such as standard operating policy (SOP) or linear regression.

  12. Multiobjective Optimal Algorithm for Automatic Calibration of Daily Streamflow Forecasting Model

    Directory of Open Access Journals (Sweden)

    Yi Liu

    2016-01-01

    A single objective function cannot describe the characteristics of a complicated hydrologic system. Consequently, multiobjective functions are needed for the calibration of hydrologic models. Multiobjective algorithms based on the theory of nondominance are employed to solve this multiobjective optimization problem. In this paper, a novel multiobjective optimization method based on differential evolution with adaptive Cauchy mutation and chaos searching (MODE-CMCS) is proposed to optimize a daily streamflow forecasting model. In addition, to enhance the diversity of the Pareto solutions, a more precise crowding distance assigner is presented. Furthermore, because the traditional generalized spread metric (SP) is sensitive to the size of the Pareto set, a novel diversity performance metric that is independent of Pareto set size is put forward in this research. The efficacy of the new MODE-CMCS algorithm is compared with that of the nondominated sorting genetic algorithm II (NSGA-II) on a daily streamflow forecasting model based on a support vector machine (SVM). The results verify that the performance of MODE-CMCS is superior to that of NSGA-II for automatic calibration of hydrologic models.
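
The nondominance and crowding-distance concepts underlying both MODE-CMCS and NSGA-II can be sketched as follows. The objective values are invented, and this is a generic NSGA-II-style crowding distance, not the paper's "more precise" assigner:

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective and
    strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the nondominated points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

def crowding_distance(front):
    """NSGA-II-style crowding distance: larger values mark less crowded
    (more diverse) solutions; boundary points get infinite distance."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = (front[order[-1]][k] - front[order[0]][k]) or 1.0
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][k]
                               - front[order[j - 1]][k]) / span
    return dist

# Two calibration objectives to minimize, e.g. volume error and peak error.
points = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
front = pareto_front(points)
```

Here `(3, 4)` and `(5, 5)` are dominated and drop out; crowding distance then ranks the survivors so that selection pressure favors a well-spread front rather than a cluster.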

  13. Improving the performance of streamflow forecasting model using data-preprocessing technique in Dungun River Basin

    Science.gov (United States)

    Khai Tiu, Ervin Shan; Huang, Yuk Feng; Ling, Lloyd

    2018-03-01

    An accurate streamflow forecasting model is important for the development of a flood mitigation plan to ensure sustainable development of a river basin. This study adopted the Variational Mode Decomposition (VMD) data-preprocessing technique to process and denoise rainfall data before feeding it into a Support Vector Machine (SVM) streamflow forecasting model, in order to improve the model's performance. Rainfall data and river water level data for the period 1996-2016 were used for this purpose. Homogeneity tests (Standard Normal Homogeneity Test, Buishand Range Test, Pettitt Test and Von Neumann Ratio Test) and normality tests (Shapiro-Wilk Test, Anderson-Darling Test, Lilliefors Test and Jarque-Bera Test) were carried out on the rainfall series. The data at all stations were found to be homogeneous but not normally distributed. From the recorded rainfall data, it was observed that the Dungun River Basin receives higher monthly rainfall from November to February, during the Northeast Monsoon. Thus, the monthly and seasonal rainfall series of this monsoon were the main focus of this research, as floods usually happen during the Northeast Monsoon period. The water levels predicted by the SVM model were assessed against observed water levels using non-parametric statistical tests (Biased Method, Kendall's Tau B Test and Spearman's Rho Test).

  14. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...

  15. Development of a Precipitation-Runoff Model to Simulate Unregulated Streamflow in the Salmon Creek Basin, Okanogan County, Washington

    Science.gov (United States)

    van Heeswijk, Marijke

    2006-01-01

    Surface water has been diverted from the Salmon Creek Basin for irrigation purposes since the early 1900s, when the Bureau of Reclamation built the Okanogan Project. Spring snowmelt runoff is stored in two reservoirs, Conconully Reservoir and Salmon Lake Reservoir, and gradually released during the growing season. As a result of the out-of-basin streamflow diversions, the lower 4.3 miles of Salmon Creek typically has been a dry creek bed for almost 100 years, except during the spring snowmelt season during years of high runoff. To continue meeting the water needs of irrigators but also leave water in lower Salmon Creek for fish passage and to help restore the natural ecosystem, changes are being considered in how the Okanogan Project is operated. This report documents development of a precipitation-runoff model for the Salmon Creek Basin that can be used to simulate daily unregulated streamflows. The precipitation-runoff model is a component of a Decision Support System (DSS) that includes a water-operations model the Bureau of Reclamation plans to develop to study the water resources of the Salmon Creek Basin. The DSS will be similar to the DSS that the Bureau of Reclamation and the U.S. Geological Survey developed previously for the Yakima River Basin in central southern Washington. The precipitation-runoff model was calibrated for water years 1950-89 and tested for water years 1990-96. The model was used to simulate daily streamflows that were aggregated on a monthly basis and calibrated against historical monthly streamflows for Salmon Creek at Conconully Dam. Additional calibration data were provided by the snowpack water-equivalent record for a SNOTEL station in the basin. Model input time series of daily precipitation and minimum and maximum air temperatures were based on data from climate stations in the study area. Historical records of unregulated streamflow for Salmon Creek at Conconully Dam do not exist for water years 1950-96. Instead, estimates of

  16. Monthly streamflow forecasting based on hidden Markov model and Gaussian Mixture Regression

    Science.gov (United States)

    Liu, Yongqi; Ye, Lei; Qin, Hui; Hong, Xiaofeng; Ye, Jiajun; Yin, Xingli

    2018-06-01

    Reliable streamflow forecasts can be highly valuable for water resources planning and management. In this study, we combined a hidden Markov model (HMM) and Gaussian Mixture Regression (GMR) for probabilistic monthly streamflow forecasting. The HMM is initialized using a kernelized K-medoids clustering method, and the Baum-Welch algorithm is then executed to learn the model parameters. GMR derives a conditional probability distribution for the predictand given covariate information, including the antecedent flow at a local station and two surrounding stations. The performance of HMM-GMR was verified based on the mean square error and continuous ranked probability score skill scores. The reliability of the forecasts was assessed by examining the uniformity of the probability integral transform values. The results show that HMM-GMR obtained reasonably high skill scores and the uncertainty spread was appropriate. Different HMM states were assumed to be different climate conditions, which would lead to different types of observed values. We demonstrated that the HMM-GMR approach can handle multimodal and heteroscedastic data.
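
The core of GMR is the conditional distribution of a joint Gaussian; a full GMR mixes several such components, here weighted by the HMM state probabilities. A single-component sketch with invented statistics, not the paper's fitted model:

```python
def conditional_gaussian(mu_x, mu_y, var_x, var_y, cov_xy, x):
    """For a joint Gaussian over (x, y), the conditional p(y | x) is
    Gaussian with mean  mu_y + cov_xy * (x - mu_x) / var_x  and variance
    var_y - cov_xy**2 / var_x. GMR mixes several such components."""
    mean = mu_y + cov_xy * (x - mu_x) / var_x
    var = var_y - cov_xy ** 2 / var_x
    return mean, var

# Illustrative numbers: forecast next month's flow y from this month's x.
mean, var = conditional_gaussian(mu_x=100.0, mu_y=90.0,
                                 var_x=400.0, var_y=250.0,
                                 cov_xy=280.0, x=120.0)
print(mean, var)  # -> 104.0 54.0
```

The conditional variance (here 54.0, reduced from the marginal 250.0) is what gives the forecast its probabilistic spread, which the study evaluates with the continuous ranked probability score.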

  17. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    Science.gov (United States)

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

    The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both the observed inputs (precipitation and temperature) and the streamflow observations used to calibrate the hydrological model are explicitly accounted for. To achieve this goal, we applied the elevation-distributed HBV model, operating at daily time steps, to a small high-elevation catchment in southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure in which the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day, a random sample of precipitation and temperature inputs was drawn and applied as input to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability of rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to generate one realisation of a whole streamflow time series; thus the rating curve errors lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM-based MCMC routine. Effects of having less information (e.g. missing one streamflow measurement for defining the rating curve or missing one precipitation station
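
The rating curve models whose uncertainty is propagated here typically take the standard power-law form Q = a(h − h0)^b per segment. A one-segment least-squares fit in log-log space, on noise-free synthetic data (so the known parameters are recovered exactly), can be sketched as:

```python
import math

def fit_rating_curve(stages, flows, h0=0.0):
    """Fit Q = a * (h - h0)**b by linear least squares in log-log space."""
    xs = [math.log(h - h0) for h in stages]
    ys = [math.log(q) for q in flows]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = math.exp(ybar - b * xbar)
    return a, b

# Synthetic stage-discharge pairs generated from Q = 2 * h**1.5.
stages = [0.5, 1.0, 2.0, 4.0]
flows = [2 * h ** 1.5 for h in stages]
a, b = fit_rating_curve(stages, flows)
print(round(a, 2), round(b, 2))  # -> 2.0 1.5
```

In the study's Bayesian setting, a and b (and the segment breakpoints) are posterior samples rather than point estimates, so each draw of the curve converts the observed stage series into a different streamflow realisation.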

  18. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational ''tools'' for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to ''validate'' these tools. In the sense of the HEDR Project, ''validation'' is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model

  19. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation

  20. A modelling framework to project future climate change impacts on streamflow variability and extremes in the West River, China

    Directory of Open Access Journals (Sweden)

    Y. Fei

    2014-09-01

    Full Text Available In this study, a hydrological modelling framework was introduced to assess climate change impacts on future river flow in the West River basin, China, especially on streamflow variability and extremes. The modelling framework includes a delta-change method with the quantile-mapping technique to construct future climate forcings on the basis of observed meteorological data and the downscaled climate model outputs. This method is able to retain the signals of extreme weather events, as projected by climate models, in the constructed future forcing scenarios. Fed with the historical and future forcing data, a large-scale hydrologic model (the Variable Infiltration Capacity model, VIC) was executed for streamflow simulations and projections at daily time scales. A bootstrapping resample approach was used as an indirect alternative to test the equality of means, standard deviations and coefficients of variation for the baseline and future streamflow time series, and to assess future changes in flood return levels. The West River basin case study confirms that the introduced modelling framework is an efficient and effective tool for quantifying streamflow variability and extremes in response to future climate change.
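    A binned version of the delta-change method with quantile mapping can be sketched as follows. This is an illustrative Python simplification (the function names, the 10-bin discretization, and the sample data are hypothetical); practical implementations interpolate between empirical quantiles rather than using coarse bins.

```python
def quantile_change_factors(gcm_baseline, gcm_future, n_quantiles=10):
    """Multiplicative change factors per quantile bin, computed from the
    sorted GCM baseline and future distributions."""
    base = sorted(gcm_baseline)
    fut = sorted(gcm_future)
    factors = []
    for q in range(n_quantiles):
        i = int(q * len(base) / n_quantiles)
        j = int(q * len(fut) / n_quantiles)
        factors.append(fut[j] / base[i] if base[i] else 1.0)
    return factors

def apply_delta_change(observed, gcm_baseline, gcm_future, n_quantiles=10):
    """Scale each observed value by the change factor of the quantile bin
    it falls in; extremes thereby keep the GCM-projected change signal."""
    factors = quantile_change_factors(gcm_baseline, gcm_future, n_quantiles)
    base = sorted(gcm_baseline)
    out = []
    for x in observed:
        # rank of x within the baseline distribution -> quantile bin
        rank = sum(1 for b in base if b <= x)
        bin_i = min(n_quantiles - 1, rank * n_quantiles // (len(base) + 1))
        out.append(x * factors[bin_i])
    return out

obs = [1.0, 2.0, 3.0, 10.0]                   # observed precipitation (mm/day)
base = [float(i) for i in range(1, 11)]       # GCM baseline sample
fut = [1.2 * float(i) for i in range(1, 11)]  # uniform +20% future signal
print(apply_delta_change(obs, base, fut))     # every value scaled by ~1.2
```

    With a quantile-dependent change signal (e.g. heavier scaling of the upper quantiles), the same machinery amplifies extremes more than the mean, which is the property the framework relies on.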

  1. Phosphorus Export Model Development in a Terminal Lake Basin using Concentration-Streamflow Relationship

    Science.gov (United States)

    Jeannotte, T.; Mahmood, T. H.; Matheney, R.; Hou, X.

    2017-12-01

    Nutrient export to streams and lakes by anthropogenic activities can lead to eutrophication and degradation of surface water quality. In Devils Lake, ND, the only terminal lake in the Northern Great Plains, algal blooms are of great concern due to the recent increase in streamflow and the consequent rise in phosphorus (P) export from prairie agricultural fields. However, to date, very few studies have explored the concentration (c)-streamflow (q) relationship in the headwater catchments of the Devils Lake basin. A robust watershed-scale quantitative framework would aid in understanding the c-q relationship and in simulating P concentration and load. In this study, we utilize c-q relationships to develop a simple model to estimate phosphorus concentration and export from two headwater catchments of different size (Mauvais Coulee: 1032 km2 and Trib 3: 160 km2) draining to Devils Lake. Our goal is to link the phosphorus export model with a physically based hydrologic model to identify major drivers of phosphorus export. USGS provided the streamflow measurements, and we collected water samples (filtered and unfiltered) three times daily during the spring snowmelt season (March 31-April 12, 2017) at the outlets of both headwater catchments. Our results indicate that most P is dissolved and very little is particulate, suggesting little export of fine-grained sediment from agricultural fields. Our preliminary analyses in the Mauvais Coulee catchment show a chemostatic c-q relationship in the rising limb of the hydrograph, while the recession limb shows a linear and positive c-q relationship. The poor c-q correlation in the rising limb of the hydrograph suggests intense flushing of P by spring snowmelt runoff. Flushing then continues in the recession limb of the hydrograph, but at a more constant rate. The estimated total P load for the Mauvais Coulee basin is 193 kg/km2, consistent with other catchments of similar size across the Red River of the North basin to the east. 
We expect
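    A common way to classify c-q behaviour of the kind described above is the slope of the log-log c-q regression: a slope near zero indicates a chemostatic relationship, while a slope near one indicates concentration varying proportionally with discharge. A small Python sketch with hypothetical rising- and recession-limb samples (not the study's data):

```python
import math

def cq_slope(conc, flow):
    """Ordinary least-squares slope b in log c = a + b log q."""
    x = [math.log(q) for q in flow]
    y = [math.log(c) for c in conc]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Hypothetical rising-limb samples: concentration stays ~flat as q rises
rising_q = [2.0, 4.0, 8.0, 16.0]
rising_c = [0.30, 0.31, 0.29, 0.30]
# Hypothetical recession-limb samples: c falls roughly in step with q
recession_q = [16.0, 8.0, 4.0, 2.0]
recession_c = [0.30, 0.16, 0.08, 0.04]

print(round(cq_slope(rising_c, rising_q), 2))      # near 0: chemostatic
print(round(cq_slope(recession_c, recession_q), 2))  # near 1: proportional
```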

  2. Effect of monthly areal rainfall uncertainty on streamflow simulation

    Science.gov (United States)

    Ndiritu, J. G.; Mkhize, N.

    2017-08-01

    Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located, and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step, and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: (i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and (ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments, while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation, and the Pitman catchment model calibrated using the SCE-UA optimizer was used for streamflow simulation. 100 randomly-initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. By using 4 rain gauges alternately to obtain areal rainfall, the resulting differences in areal rainfall averaged 20% of the mean monthly areal rainfall, and rainfall uncertainty was therefore highly significant. Pitman model simulations achieved coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls, while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases. The streamflow ranges using historic rainfalls averaged 29% of the mean naturalised flow in calibration and validation and the respective average ranges using stochastic
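    The experimental design above, comparing model skill under historic versus stochastic areal rainfall, can be miniaturized as follows. This Python sketch uses a toy linear runoff model as a stand-in for the Pitman model and a uniform +/-20% perturbation as a crude stand-in for the non-parametric rainfall generator; all numbers are illustrative.

```python
import random

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of the sum of
    squared errors to the variance of the observations."""
    mo = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    svar = sum((o - mo) ** 2 for o in obs)
    return 1.0 - sse / svar

def toy_runoff(rain, coeff=0.4):
    """Toy monthly rainfall-runoff stand-in for the Pitman model."""
    return [coeff * r for r in rain]

random.seed(7)
historic = [50.0, 120.0, 80.0, 20.0, 10.0, 60.0]
observed_flow = toy_runoff(historic)

# 100 stochastic areal rainfalls with a uniform +/-20% perturbation,
# mimicking the ~20% areal rainfall uncertainty found in the study
nse_vals = []
for _ in range(100):
    stochastic = [r * random.uniform(0.8, 1.2) for r in historic]
    nse_vals.append(nse(observed_flow, toy_runoff(stochastic)))

print(nse(observed_flow, toy_runoff(historic)))  # 1.0 with historic input
print(min(nse_vals) < 1.0)                       # True: skill degrades
```

    The drop from a perfect score to a distribution of lower NSE values mirrors, in caricature, the study's drop from 0.66/0.64 to 0.59/0.57.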

  3. Simulation of daily streamflow for 12 river basins in western Iowa using the Precipitation-Runoff Modeling System

    Science.gov (United States)

    Christiansen, Daniel E.; Haj, Adel E.; Risley, John C.

    2017-10-24

    The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, constructed Precipitation-Runoff Modeling System models to estimate daily streamflow for 12 river basins in western Iowa that drain into the Missouri River. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of streamflow and general drainage basin hydrology to various combinations of climate and land use. Calibration periods for each basin varied depending on the period of record available for daily mean streamflow measurements at U.S. Geological Survey streamflow-gaging stations. A geographic information system tool was used to delineate each basin and estimate initial values for model parameters based on basin physical and geographical features. A U.S. Geological Survey automatic calibration tool that uses a shuffled complex evolution algorithm was used for initial calibration, and then manual modifications were made to parameter values to complete the calibration of each basin model. The main objective of the calibration was to match daily discharge values of simulated streamflow to measured daily discharge values. The Precipitation-Runoff Modeling System model was calibrated at 42 sites located in the 12 river basins in western Iowa. The accuracy of the simulated daily streamflow values at the 42 calibration sites varied by river and by site. The models were satisfactory at 36 of the sites based on statistical results. Unsatisfactory performance at the six other sites can be attributed to several factors: (1) low flow, no flow, and flashy flow conditions in headwater subbasins having a small drainage area; (2) poor representation of the groundwater and storage components of flow within a basin; (3) lack of accounting for basin withdrawals and water use; and (4) limited availability and accuracy of meteorological input data. The Precipitation-Runoff Modeling System

  4. Model-Based Attribution of High-Resolution Streamflow Trends in Two Alpine Basins of Western Austria

    Directory of Open Access Journals (Sweden)

    Christoph Kormann

    2016-02-01

    Full Text Available Several trend studies have shown that hydrological conditions are changing considerably in the Alpine region. However, the reasons for these changes are only partially understood, and trend analyses alone are not able to shed much light. Hydrological modelling is one possible way to identify the trend drivers, i.e., to attribute the detected streamflow trends, given that the model captures all important processes causing the trends. We modelled the hydrological conditions for two alpine catchments in western Austria (a large, mostly lower-altitude catchment with wide valley plains and a nested high-altitude, glaciated headwater catchment) with the distributed, physically-oriented WaSiM-ETH model, which includes a dynamical glacier module. The model was calibrated in a transient mode, i.e., not only on several standard goodness measures and glacier extents, but also in such a way that the simulated streamflow trends fit the observed ones during the investigation period 1980 to 2007. With this approach, it was possible to separate streamflow components, identify the trends of flow components, and study their relation to trends in atmospheric variables. In addition to trends in annual averages, highly resolved trends for each Julian day were derived, since these proved powerful in an earlier, data-based attribution study. We were able to show that annual and highly resolved trends can be modelled sufficiently well. The results provide a holistic, year-round picture of the drivers of alpine streamflow changes: higher-altitude catchments are strongly affected by earlier firn melt and snowmelt in spring and increased ice melt throughout the ablation season. Changes in lower-altitude areas are mostly caused by earlier and lower snowmelt volumes. All highly resolved trends in streamflow and its components show an explicit similarity to the local temperature trends. Finally, results indicate that evapotranspiration has been increasing in the lower

  5. Assessing the Use of Remote Sensing and a Crop Growth Model to Improve Modeled Streamflow in Central Asia

    Science.gov (United States)

    Richey, A. S.; Richey, J. E.; Tan, A.; Liu, M.; Adam, J. C.; Sokolov, V.

    2015-12-01

    Central Asia presents a perfect case study for understanding the dynamic, and often conflicting, linkages between food, energy, and water in natural systems. The destruction of the Aral Sea is a well-known environmental disaster, largely driven by increased irrigation demand on the rivers that feed the endorheic sea. Continued reliance on these rivers, the Amu Darya and Syr Darya, often places available water resources at odds between hydropower demands upstream and irrigation requirements downstream. A combination of tools is required to understand these linkages and how they may change in the future as a function of climate change and population growth. In addition, the region is geopolitically complex, as the former Soviet basin states develop management strategies to sustainably manage shared resources. This complexity increases the importance of relying upon publicly available information sources and tools. Preliminary work has shown potential for the Variable Infiltration Capacity (VIC) model to recreate the natural water balance in the Amu Darya and Syr Darya basins by comparing results to total terrestrial water storage changes observed from NASA's Gravity Recovery and Climate Experiment (GRACE) satellite mission. Modeled streamflow is well correlated with observed streamflow at upstream gauges prior to the large-scale expansion of irrigation and hydropower. However, current modeled results are unable to capture the human influence of water use on downstream flow. This study examines the utility of a crop simulation model, CropSyst, to represent irrigation demand and of GRACE to improve modeled streamflow estimates in the Amu Darya and Syr Darya basins. Specifically, we determine crop water demand with CropSyst utilizing available data on irrigation schemes and cropping patterns. We determine how this demand can be met either by surface water, modeled by VIC with a reservoir operation scheme, and/or by groundwater derived from GRACE. Finally, we assess how the

  6. Predicting long-term streamflow variability in moist eucalypt forests using forest growth models and a sapwood area index

    Science.gov (United States)

    Jaskierniak, D.; Kuczera, G.; Benyon, R.

    2016-04-01

    A major challenge in surface hydrology involves predicting streamflow in ungauged catchments with heterogeneous vegetation and spatiotemporally varying evapotranspiration (ET) rates. We present a top-down approach for quantifying the influence of broad-scale changes in forest structure on ET and hence streamflow. Across three catchments between 18 and 100 km2 in size and with regenerating Eucalyptus regnans and E. delegatensis forest, we demonstrate how variation in ET can be mapped in space and over time using LiDAR data and commonly available forest inventory data. The model scales plot-level sapwood area (SA) to the catchment level using basal area (BA) and tree stocking density (N) estimates in forest growth models. The SA estimates over a 69 year regeneration period are used in a relationship between SA and vegetation-induced streamflow loss (L) to predict annual streamflow (Q) with annual rainfall (P) estimates. Without calibrating P, BA, N, SA, and L to Q data, we predict annual Q with R2 between 0.68 and 0.75 and Nash-Sutcliffe efficiency (NSE) between 0.44 and 0.48. To remove bias, the model was extended to allow for runoff carry-over into the following year as well as a minor correction for rainfall bias, which produced R2 values between 0.72 and 0.79 and NSE between 0.70 and 0.79. The model under-predicts streamflow during drought periods, as it lacks representation of ecohydrological processes that reduce L through either reduced growth rates or rainfall interception during drought. Refining the relationship between sapwood thickness and forest inventory variables is likely to further improve results.
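    The structure of the top-down model, an annual loss term driven by sapwood area plus a runoff carry-over into the following year, can be sketched as below. Coefficients, units and data are purely illustrative, not the fitted values from the study.

```python
def annual_streamflow(precip, sap_area, a=0.5, b=2.0, carry=0.2):
    """Toy form of the top-down model: annual loss L grows with sapwood
    area SA, streamflow Q = P - L, and a fraction of each year's Q
    carries over into the next year (all coefficients illustrative)."""
    flows = []
    carried = 0.0
    for p, sa in zip(precip, sap_area):
        loss = a * p + b * sa            # vegetation-induced loss
        q = max(0.0, p - loss) + carried
        carried = carry * q              # carry-over into next year
        flows.append(q)
    return flows

# Hypothetical series: constant rainfall, sapwood area declining as the
# regenerating stand self-thins, so streamflow recovers over time
precip = [1200.0] * 5                      # mm/yr
sap = [150.0, 120.0, 100.0, 80.0, 60.0]    # illustrative SA index
flows = annual_streamflow(precip, sap)
print([round(f) for f in flows])   # [300, 420, 484, 537, 587]
print(flows[-1] > flows[0])        # True: less sapwood -> more streamflow
```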

  7. Streamflow characteristics from modelled runoff time series: Importance of calibration criteria selection

    Science.gov (United States)

    Poole, Sandra; Vis, Marc; Knight, Rodney; Seibert, Jan

    2017-01-01

    Ecologically relevant streamflow characteristics (SFCs) of ungauged catchments are often estimated from simulated runoff of hydrologic models that were originally calibrated on gauged catchments. However, SFC estimates of the gauged donor catchments and subsequently the ungauged catchments can be substantially uncertain when models are calibrated using traditional approaches based on optimization of statistical performance metrics (e.g., Nash–Sutcliffe model efficiency). An improved calibration strategy for gauged catchments is therefore crucial to help reduce the uncertainties of estimated SFCs for ungauged catchments. The aim of this study was to improve SFC estimates from modeled runoff time series in gauged catchments by explicitly including one or several SFCs in the calibration process. Different types of objective functions were defined consisting of the Nash–Sutcliffe model efficiency, single SFCs, or combinations thereof. We calibrated a bucket-type runoff model (HBV – Hydrologiska Byråns Vattenavdelning – model) for 25 catchments in the Tennessee River basin and evaluated the proposed calibration approach on 13 ecologically relevant SFCs representing major flow regime components and different flow conditions. While the model generally tended to underestimate the tested SFCs related to mean and high-flow conditions, SFCs related to low flow were generally overestimated. The highest estimation accuracies were achieved by a SFC-specific model calibration. Estimates of SFCs not included in the calibration process were of similar quality when comparing a multi-SFC calibration approach to a traditional model efficiency calibration. For practical applications, this implies that SFCs should preferably be estimated from targeted runoff model calibration, and modeled estimates need to be carefully interpreted.
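    Explicitly including an SFC in the objective function, as proposed above, can be sketched as a score blending the Nash-Sutcliffe efficiency with the relative error of a low-flow SFC (here Q90, the flow exceeded 90% of the time). The weighting, the SFC choice, and the data are illustrative, not those of the HBV study.

```python
def nse(obs, sim):
    """Nash-Sutcliffe model efficiency."""
    mo = sum(obs) / len(obs)
    return 1.0 - (sum((o - s) ** 2 for o, s in zip(obs, sim))
                  / sum((o - mo) ** 2 for o in obs))

def q90(series):
    """Low-flow SFC: flow exceeded 90% of the time (10th percentile)."""
    s = sorted(series)
    return s[int(0.1 * (len(s) - 1))]

def objective(obs, sim, w=0.5):
    """Multi-objective calibration score: NSE blended with the relative
    error of one ecologically relevant SFC (weights illustrative)."""
    sfc_err = abs(q90(sim) - q90(obs)) / q90(obs)
    return w * nse(obs, sim) + (1.0 - w) * (1.0 - sfc_err)

obs = [1.0, 2.0, 5.0, 10.0, 4.0, 2.0, 1.5, 1.2, 0.9, 0.8]
sim_good_lowflow = [1.1, 2.2, 4.0, 8.0, 4.2, 2.1, 1.4, 1.1, 0.9, 0.8]
sim_poor_lowflow = [1.0, 2.0, 5.0, 10.0, 4.0, 2.0, 1.5, 1.2, 1.8, 1.6]
print(round(objective(obs, sim_good_lowflow), 3))
print(round(objective(obs, sim_poor_lowflow), 3))
```

    In this toy example the simulation that reproduces the low flows scores higher overall even though its NSE alone is slightly lower, which is the behaviour an SFC-specific calibration is designed to reward.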

  8. Bayesian inference of uncertainties in precipitation-streamflow modeling in a snow affected catchment

    Science.gov (United States)

    Koskela, J. J.; Croke, B. W. F.; Koivusalo, H.; Jakeman, A. J.; Kokkonen, T.

    2012-11-01

    Bayesian inference is used to study the effect of precipitation and model structural uncertainty on estimates of model parameters and confidence limits of predictive variables in a conceptual rainfall-runoff model in the snow-fed Rudbäck catchment (142 ha) in southern Finland. The IHACRES model is coupled with a simple degree day model to account for snow accumulation and melt. The posterior probability distribution of the model parameters is sampled using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm and the generalized likelihood function. Precipitation uncertainty is taken into account by introducing additional latent variables that were used as multipliers for individual storm events. Results suggest that occasional snow water equivalent (SWE) observations together with daily streamflow observations do not contain enough information to simultaneously identify model parameters, precipitation uncertainty and model structural uncertainty in the Rudbäck catchment. The addition of an autoregressive component to account for model structure error, and of latent variables having uniform priors to account for input uncertainty, led to dubious posterior distributions of model parameters. Thus our hypothesis that informative priors for latent variables could be replaced by additional SWE data could not be confirmed. The model was found to work adequately in 1-day-ahead simulation mode, but the results were poor in the batch simulation mode. This was caused by the interaction of parameters that were used to describe different sources of uncertainty. The findings may hold lessons for other cases where the number of parameters is similarly high relative to the available prior information.
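    The simple degree day component coupled to the rainfall-runoff model can be sketched as follows; the degree-day factor, threshold temperature, and the five-day series are illustrative values, not the calibrated parameters for the Rudbäck catchment.

```python
def degree_day_snow(temps, precips, ddf=3.0, t_thresh=0.0):
    """Degree day snow model: precipitation accumulates as snow water
    equivalent (SWE) below the threshold temperature; above it, melt is
    ddf * (T - threshold), limited by the SWE in storage. Returns the
    effective water input (rain + melt) passed to the runoff model."""
    swe = 0.0
    effective_input = []
    for t, p in zip(temps, precips):
        if t <= t_thresh:
            swe += p          # precipitation falls and stays as snow
            melt, rain = 0.0, 0.0
        else:
            melt = min(swe, ddf * (t - t_thresh))
            swe -= melt
            rain = p
        effective_input.append(rain + melt)
    return effective_input, swe

temps = [-5.0, -2.0, 1.0, 4.0, 6.0]   # daily mean temperature (C)
precs = [10.0, 5.0, 0.0, 0.0, 2.0]    # daily precipitation (mm)
inputs, final_swe = degree_day_snow(temps, precs)
print(inputs)      # [0.0, 0.0, 3.0, 12.0, 2.0]
print(final_swe)   # 0.0: the snowpack has melted out
```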

  9. Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling

    Science.gov (United States)

    Galelli, S.; Castelletti, A.

    2013-07-01

    Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modelling. In this paper, we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modelling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalisation and tendency to overfit of traditional standalone decision trees (e.g. CART); (ii) is computationally efficient; and (iii) allows inference of the relative importance of the input variables, which can help in the ex-post physical interpretation of the model. The Extra-Trees potential is analysed on two real-world case studies, the Marina catchment (Singapore) and the Canning River (Western Australia), representing two different morphoclimatic contexts. The evaluation is performed against other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the input variable ranking provided can be given a physically meaningful interpretation.
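    A totally randomized tree ensemble in the spirit of Extra-Trees can be built with little code: at each node both the split feature and the cut-point are drawn at random, and predictions are averaged over the ensemble. This stdlib-only Python sketch (synthetic lagged-flow data, one random split candidate per node) is illustrative and omits the variable-importance computation mentioned above.

```python
import random
import statistics

def build_tree(X, y, depth=0, max_depth=3, min_leaf=2):
    """One totally randomized tree: the split feature AND cut-point are
    drawn at random (no exhaustive search, unlike CART)."""
    if depth >= max_depth or len(y) < 2 * min_leaf:
        return statistics.mean(y)              # leaf: mean of targets
    feat = random.randrange(len(X[0]))
    values = [row[feat] for row in X]
    lo, hi = min(values), max(values)
    if lo == hi:
        return statistics.mean(y)
    cut = random.uniform(lo, hi)
    left = [i for i, row in enumerate(X) if row[feat] <= cut]
    right = [i for i in range(len(X)) if i not in left]
    if not left or not right:
        return statistics.mean(y)
    return (feat, cut,
            build_tree([X[i] for i in left], [y[i] for i in left],
                       depth + 1, max_depth, min_leaf),
            build_tree([X[i] for i in right], [y[i] for i in right],
                       depth + 1, max_depth, min_leaf))

def predict_tree(node, row):
    while isinstance(node, tuple):
        feat, cut, l, r = node
        node = l if row[feat] <= cut else r
    return node

def predict_ensemble(trees, row):
    return statistics.mean(predict_tree(t, row) for t in trees)

random.seed(42)
# Synthetic lagged-input data: flow(t) depends mostly on flow(t-1)
X = [[q, r] for q in range(10) for r in range(10)]
y = [0.8 * q + 0.1 * r for q, r in X]
trees = [build_tree(X, y) for _ in range(50)]
pred = predict_ensemble(trees, [5.0, 5.0])   # true value is 4.5
print(round(pred, 2))
```

    Averaging over many such fully random trees is what recovers accuracy despite the crude individual splits; the full Extra-Trees algorithm draws several candidate splits per node and keeps the best.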

  10. An initial abstraction and constant loss model, and methods for estimating unit hydrographs, peak streamflows, and flood volumes for urban basins in Missouri

    Science.gov (United States)

    Huizinga, Richard J.

    2014-01-01

    Streamflow data, basin characteristics, and rainfall data from 39 streamflow-gaging stations for urban areas in and adjacent to Missouri were used by the U.S. Geological Survey in cooperation with the Metropolitan Sewer District of St. Louis to develop an initial abstraction and constant loss model (a time-distributed basin-loss model) and a gamma unit hydrograph (GUH) for urban areas in Missouri. Study-specific methods to determine peak streamflow and flood volume for a given rainfall event also were developed.
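    The two ingredients named in the title, an initial abstraction and constant loss model and a gamma unit hydrograph, compose as excess-rainfall computation followed by convolution. A minimal Python sketch with illustrative parameter values (ia, cl, n, k are placeholders, not the regression estimates developed in the study):

```python
import math

def excess_rainfall(rain, ia=10.0, cl=2.0):
    """Initial abstraction / constant loss: no runoff until cumulative
    rainfall exceeds ia (mm); thereafter a constant loss rate cl
    (mm per step) is subtracted."""
    abstracted = 0.0
    excess = []
    for r in rain:
        take = min(r, ia - abstracted)
        abstracted += take
        e = max(0.0, (r - take) - cl) if abstracted >= ia else 0.0
        excess.append(e)
    return excess

def gamma_uh(n=2.5, k=1.5, length=12):
    """Gamma unit hydrograph ordinates u(t) ~ t^(n-1) e^(-t/k),
    normalised to unit volume."""
    ords = [(t ** (n - 1)) * math.exp(-t / k) for t in range(1, length + 1)]
    total = sum(ords)
    return [o / total for o in ords]

def convolve(excess, uh):
    """Direct runoff hydrograph = excess rainfall convolved with the UH."""
    q = [0.0] * (len(excess) + len(uh) - 1)
    for i, e in enumerate(excess):
        for j, u in enumerate(uh):
            q[i + j] += e * u
    return q

rain = [5.0, 20.0, 15.0, 0.0]
ex = excess_rainfall(rain)
hydro = convolve(ex, gamma_uh())
print([round(e, 1) for e in ex])   # [0.0, 13.0, 13.0, 0.0]
print(round(sum(hydro), 1))        # 26.0: excess volume preserved
```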

  11. Modeling streamflow from coupled airborne laser scanning and acoustic Doppler current profiler data

    Science.gov (United States)

    Lam, Norris; Kean, Jason W.; Lyon, Steve

    2016-01-01

    The rating curve enables the translation of water depth into stream discharge through a reference cross-section. This study investigates coupling national scale airborne laser scanning (ALS) and acoustic Doppler current profiler (ADCP) bathymetric survey data for generating stream rating curves. A digital terrain model was defined from these data and applied in a physically based 1-D hydraulic model to generate rating curves for a regularly monitored location in northern Sweden. Analysis of the ALS data showed that overestimation of the streambank elevation could be adjusted with a root mean square error (RMSE) block adjustment using a higher accuracy manual topographic survey. The results of our study demonstrate that the rating curve generated from the vertically corrected ALS data combined with ADCP data had lower errors (RMSE = 0.79 m3/s) than the empirical rating curve (RMSE = 1.13 m3/s) when compared to streamflow measurements. We consider these findings encouraging as hydrometric agencies can potentially leverage national-scale ALS and ADCP instrumentation to reduce the cost and effort required for maintaining and establishing rating curves at gauging station sites similar to the Röån River.
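    For comparison with the hydraulically derived curve, an empirical rating curve of the standard power-law form Q = a(h - h0)^b can be fitted to gaugings by linear regression in log space. A Python sketch on hypothetical stage-discharge pairs (the study itself used a physically based 1-D hydraulic model on the ALS/ADCP terrain):

```python
import math

def fit_rating(stages, discharges, h0=0.0):
    """Fit Q = a * (h - h0)^b by least squares in log space
    (h0, the cease-to-flow stage, is assumed known here)."""
    x = [math.log(h - h0) for h in stages]
    y = [math.log(q) for q in discharges]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = math.exp(my - b * mx)
    return a, b

def rmse(obs, sim):
    return (sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs)) ** 0.5

# Hypothetical gaugings roughly following Q = 2 h^1.5 with small noise
stages = [0.5, 1.0, 1.5, 2.0, 2.5]   # stage h (m)
flows = [0.70, 2.05, 3.60, 5.70, 7.80]  # discharge Q (m3/s)
a, b = fit_rating(stages, flows)
predicted = [a * h ** b for h in stages]
print(round(a, 2), round(b, 2))           # close to the underlying 2, 1.5
print(round(rmse(flows, predicted), 2))   # small residual error
```

    An RMSE computed this way against check gaugings is the same comparison metric the abstract reports (0.79 vs. 1.13 m3/s).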

  12. Daily Streamflow Predictions in an Ungauged Watershed in Northern California Using the Precipitation-Runoff Modeling System (PRMS): Calibration Challenges when nearby Gauged Watersheds are Hydrologically Dissimilar

    Science.gov (United States)

    Dhakal, A. S.; Adera, S.

    2017-12-01

    Accurate daily streamflow prediction in ungauged watersheds with sparse information is challenging. The ability of a hydrologic model calibrated using nearby gauged watersheds to predict streamflow accurately depends on hydrologic similarities between the gauged and ungauged watersheds. This study examines daily streamflow predictions using the Precipitation-Runoff Modeling System (PRMS) for the largely ungauged San Antonio Creek watershed, a 96 km2 sub-watershed of the Alameda Creek watershed in Northern California. The process-based PRMS model is being used to improve the accuracy of recent San Antonio Creek streamflow predictions generated by two empirical methods. Although San Antonio Creek watershed is largely ungauged, daily streamflow data exists for hydrologic years (HY) 1913 - 1930. PRMS was calibrated for HY 1913 - 1930 using streamflow data, modern-day land use and PRISM precipitation distribution, and gauged precipitation and temperature data from a nearby watershed. The PRMS model was then used to generate daily streamflows for HY 1996-2013, during which the watershed was ungauged, and hydrologic responses were compared to two nearby gauged sub-watersheds of Alameda Creek. Finally, the PRMS-predicted daily flows between HY 1996-2013 were compared to the two empirically-predicted streamflow time series: (1) the reservoir mass balance method and (2) correlation of historical streamflows from 80 - 100 years ago between San Antonio Creek and a nearby sub-watershed located in Alameda Creek. While the mass balance approach using reservoir storage and transfers is helpful for estimating inflows to the reservoir, large discrepancies in daily streamflow estimation can arise. Similarly, correlation-based predicted daily flows which rely on a relationship from flows collected 80-100 years ago may not represent current watershed hydrologic conditions. 
This study aims to develop a method of streamflow prediction in the San Antonio Creek watershed by examining PRMS

  13. Identifying influential data points in hydrological model calibration and their impact on streamflow predictions

    Science.gov (United States)

    Wright, David; Thyer, Mark; Westra, Seth

    2015-04-01

    Highly influential data points are those that have a disproportionately large impact on model performance, parameters and predictions. However, in current hydrological modelling practice the relative influence of individual data points on hydrological model calibration is not commonly evaluated. This presentation illustrates and evaluates several influence diagnostics tools that hydrological modellers can use to assess the relative influence of data. The feasibility and importance of including influence detection diagnostics as a standard tool in hydrological model calibration is discussed. Two classes of influence diagnostics are evaluated: (1) computationally demanding numerical "case deletion" diagnostics; and (2) computationally efficient analytical diagnostics, based on Cook's distance. These diagnostics are compared against hydrologically orientated diagnostics that describe changes in the model parameters (measured through the Mahalanobis distance), performance (objective function displacement) and predictions (mean and maximum streamflow). These influence diagnostics are applied to two case studies: a stage/discharge rating curve model, and a conceptual rainfall-runoff model (GR4J). Removing a single data point from the calibration resulted in differences to mean flow predictions of up to 6% for the rating curve model, and differences to mean and maximum flow predictions of up to 10% and 17%, respectively, for the hydrological model. When using the Nash-Sutcliffe efficiency in calibration, the computationally cheaper Cook's distance metrics produce similar results to the case-deletion metrics at a fraction of the computational cost. However, Cook's distance is adapted from linear regression, with inherent assumptions about the data, and is therefore less flexible than case deletion. Influential point detection diagnostics show great potential to improve current hydrological modelling practices by identifying highly influential data points. The findings of this
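    Numerical case-deletion diagnostics of the kind evaluated above amount to refitting the model with each observation left out and recording the change in a prediction summary (here, the mean prediction). A Python sketch on a hypothetical rating-curve-like dataset with one deliberately outlying high-flow point:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def case_deletion_influence(x, y):
    """Refit with each point left out and report the relative change in
    the mean prediction, one of the hydrologically oriented measures."""
    a0, b0 = fit_line(x, y)
    full_mean = sum(a0 + b0 * xi for xi in x) / len(x)
    influence = []
    for i in range(len(x)):
        xs = x[:i] + x[i + 1:]
        ys = y[:i] + y[i + 1:]
        a, b = fit_line(xs, ys)
        mean_i = sum(a + b * xi for xi in x) / len(x)
        influence.append(abs(mean_i - full_mean) / full_mean)
    return influence

# Hypothetical stage/discharge-like data; the last point is an outlier
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 4.0, 6.2, 7.9, 10.1, 18.0]
infl = case_deletion_influence(x, y)
print(infl.index(max(infl)))   # 5: the outlier is the most influential
```

    The analytical Cook's distance gives a comparable ranking for linear models without any refitting, which is what makes it the cheap alternative discussed in the abstract.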

  14. Measurement of Hydrologic Streamflow Metrics and Estimation of Streamflow with Lumped Parameter Models in a Managed Lake System, Sebago Lake, Maine

    Science.gov (United States)

    Reeve, A. S.; Martin, D.; Smith, S. M.

    2013-12-01

    hydraulic structures that are not included in the lumped parameter model. Calibrated watershed models tend to substantially underestimate the highest streamflows while overestimating low flows. An early June 2012 event caused extremely high flows with discharge in the Crooked River (the most significant tributary) peaking at about 85 m3/day. The lumped parameter model dramatically underestimated this important and anomalous event, but provided a reasonable prediction of flows throughout the rest of 2012. Ongoing work includes incorporating hydraulic structures in the lumped parameter model and using the available data to drive the lake water-balance model that has been prepared.

  15. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts always carry uncertainties, which may affect the forecasting results and lead to large variations. Uncertainties must therefore be considered and properly assessed when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties contributed by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. With information theory, how these uncertainties are transported and aggregated through these processes is described.
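    The spectral-analysis step, identifying streamflow periodicity before extending the autocorrelation function, can be illustrated with a naive periodogram. This stdlib Python sketch recovers the 12-month cycle of a synthetic monthly flow series; it is a didactic stand-in, not the entropy-based spectral density estimation of the study:

```python
import math

def periodogram(series):
    """Naive DFT periodogram: power at each integer frequency index k,
    used in spectral analysis to find dominant periodicities."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]
    power = []
    for k in range(1, n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append((re * re + im * im) / n)
    return power

# Synthetic monthly streamflow: base flow plus a clean annual cycle
n = 120  # 10 years of monthly values
flow = [100 + 40 * math.sin(2 * math.pi * t / 12) for t in range(n)]

p = periodogram(flow)
dominant_k = p.index(max(p)) + 1   # frequency index with most power
print(round(n / dominant_k))       # 12: the annual cycle is recovered
```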

  16. An integrated modeling system for estimating glacier and snow melt driven streamflow from remote sensing and earth system data products in the Himalayas

    Science.gov (United States)

    Brown, M. E.; Racoviteanu, A. E.; Tarboton, D. G.; Gupta, A. Sen; Nigro, J.; Policelli, F.; Habib, S.; Tokay, M.; Shrestha, M. S.; Bajracharya, S.; Hummel, P.; Gray, M.; Duda, P.; Zaitchik, B.; Mahat, V.; Artan, G.; Tokar, S.

    2014-11-01

    Quantification of the contribution of the hydrologic components (snow, ice and rain) to river discharge in the Hindu Kush Himalayan (HKH) region is important for decision-making in water sensitive sectors, and for water resources management and flood risk reduction. In this area, monitoring of the glaciers and their melt outflow is challenging due to difficult access, so modeling based on remote sensing offers the potential for providing information to improve water resources management and decision making. This paper describes an integrated modeling system developed using downscaled NASA satellite-based and earth system data products coupled with in-situ hydrologic data to assess the contribution of snow and glaciers to the flows of the rivers in the HKH region. Snow and glacier melt was estimated using the Utah Energy Balance (UEB) model, further enhanced to accommodate glacier ice melt over clean and debris-covered tongues; the meltwater was then input into the USGS Geospatial Stream Flow Model (GeoSFM). The two model components were integrated into the Better Assessment Science Integrating point and Nonpoint Sources (BASINS) modeling framework as a user-friendly open source system and made available to countries in high Asia. Here we present a case study from the Langtang Khola watershed in the monsoon-influenced Nepal Himalaya, used to validate our energy balance approach and to test the applicability of our modeling system. The snow and glacier melt model predicts that for the eight years used for model evaluation (October 2003-September 2010), the total surface water input over the basin was 9.43 m, originating as 62% from glacier melt, 30% from snowmelt and 8% from rainfall. Measured streamflow for those years was 5.02 m, reflecting a runoff coefficient of 0.53. 
    GeoSFM-simulated streamflow was 5.31 m, indicating reasonable correspondence between measured and modeled values and confirming the capability of the integrated system to provide a quantification of
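    The reported water balance can be checked arithmetically from the numbers in the abstract: the melt and rain fractions applied to the 9.43 m total input, and the runoff coefficient as measured streamflow divided by total input.

```python
# Values taken from the abstract (October 2003 - September 2010)
total_input = 9.43   # m of surface water input over the basin
fractions = {"glacier_melt": 0.62, "snowmelt": 0.30, "rainfall": 0.08}

# Depth of water contributed by each component (m)
components = {k: round(f * total_input, 2) for k, f in fractions.items()}

measured_q = 5.02    # m of measured streamflow
runoff_coefficient = measured_q / total_input

print(components)                     # glacier melt dominates at 5.85 m
print(round(runoff_coefficient, 2))   # 0.53, as reported
```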

  17. An Integrated Modeling System for Estimating Glacier and Snow Melt Driven Streamflow from Remote Sensing and Earth System Data Products in the Himalayas

    Science.gov (United States)

    Brown, M. E.; Racoviteanu, A. E.; Tarboton, D. G.; Sen Gupta, A.; Nigro, J.; Policelli, F.; Habib, S.; Tokay, M.; Shrestha, M. S.; Bajracharya, S.

    2014-01-01

    Quantification of the contribution of the hydrologic components (snow, ice and rain) to river discharge in the Hindu Kush Himalayan (HKH) region is important for decision-making in water-sensitive sectors, and for water resources management and flood risk reduction. In this area, monitoring of the glaciers and their melt outflow is challenging due to difficult access, so modeling based on remote sensing offers the potential for providing information to improve water resources management and decision making. This paper describes an integrated modeling system developed using downscaled NASA satellite-based and earth system data products coupled with in-situ hydrologic data to assess the contribution of snow and glaciers to the flows of the rivers in the HKH region. Snow and glacier melt was estimated using the Utah Energy Balance (UEB) model, further enhanced to accommodate glacier ice melt over clean and debris-covered tongues; the meltwater was then input into the USGS Geospatial Stream Flow Model (GeoSFM). The two model components were integrated into the Better Assessment Science Integrating point and Nonpoint Sources (BASINS) modeling framework as a user-friendly open source system and made available to countries in high Asia. Here we present a case study from the Langtang Khola watershed in the monsoon-influenced Nepal Himalaya, used to validate our energy balance approach and to test the applicability of our modeling system. The snow and glacier melt model predicts that for the eight years used for model evaluation (October 2003-September 2010), the total surface water input over the basin was 9.43 m, originating as 62% from glacier melt, 30% from snowmelt and 8% from rainfall. Measured streamflow for those years was 5.02 m, reflecting a runoff coefficient of 0.53. GeoSFM simulated streamflow was 5.31 m, indicating reasonable correspondence between measured and modeled flows and confirming the capability of the integrated system to provide a quantification of the contribution of snow and glacier melt to streamflow.

  18. Efficient multi-scenario Model Predictive Control for water resources management with ensemble streamflow forecasts

    Science.gov (United States)

    Tian, Xin; Negenborn, Rudy R.; van Overloop, Peter-Jules; María Maestre, José; Sadowska, Anna; van de Giesen, Nick

    2017-11-01

    Model Predictive Control (MPC) is one of the most advanced real-time control techniques and has been widely applied to Water Resources Management (WRM). MPC can manage a water system in a holistic manner and has a flexible structure to incorporate specific elements, such as setpoints and constraints. Therefore, MPC has shown its versatile performance in many branches of WRM. Nonetheless, with the in-depth understanding of stochastic hydrology in recent studies, MPC also faces the challenge of how to cope with hydrological uncertainty in its decision-making process. A possible way to embed the uncertainty is to generate an Ensemble Forecast (EF) of hydrological variables, rather than a deterministic one. The combination of MPC and EF results in a more comprehensive approach: Multi-scenario MPC (MS-MPC). In this study, we first assess the model performance of MS-MPC, considering an ensemble streamflow forecast. Notably, computational inefficiency may be a critical obstacle that hinders the applicability of MS-MPC. In fact, with more scenarios taken into account, the computational burden of solving an optimization problem in MS-MPC increases accordingly. To deal with this challenge, we propose the Adaptive Control Resolution (ACR) approach as a computationally efficient scheme that practically reduces the number of control variables in MS-MPC. In brief, the ACR approach uses a mixed-resolution control time step from the near future to the distant future. The ACR-MPC approach is tested on a real-world case study: an integrated flood control and navigation problem in the North Sea Canal of the Netherlands. Such an approach reduces the computation time by 18% or more in our case study. At the same time, the model performance of ACR-MPC remains close to that of conventional MPC.
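    The ACR idea can be sketched in a few lines. The step sizes, horizon length and scenario count below are illustrative placeholders, not the configuration used in the study; the sketch only shows how a mixed-resolution horizon shrinks the number of decision variables relative to a conventional uniform horizon:

```python
def acr_blocks(total_steps, fine_steps, coarse_factor):
    """Mixed-resolution control horizon: one control move per step in the
    near future, then one move per `coarse_factor` steps further out."""
    blocks = [1] * fine_steps
    remaining = total_steps - fine_steps
    while remaining > 0:
        size = min(coarse_factor, remaining)
        blocks.append(size)
        remaining -= size
    return blocks  # block sizes summing to total_steps

# 48-step horizon, 5 forecast scenarios (illustrative numbers).
blocks = acr_blocks(total_steps=48, fine_steps=12, coarse_factor=6)
n_scenarios = 5
conventional_vars = 48 * n_scenarios           # 240 decision variables
acr_vars = len(blocks) * n_scenarios           # 90 decision variables
```

    Fewer decision variables shrink the optimization problem solved at every control step, which is where the computational savings of ACR-MPC come from.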

  19. Remote Sensing-based Methodologies for Snow Model Adjustments in Operational Streamflow Prediction

    Science.gov (United States)

    Bender, S.; Miller, W. P.; Bernard, B.; Stokes, M.; Oaida, C. M.; Painter, T. H.

    2015-12-01

    Water management agencies rely on hydrologic forecasts issued by operational agencies such as NOAA's Colorado Basin River Forecast Center (CBRFC). The CBRFC has partnered with the Jet Propulsion Laboratory (JPL) under funding from NASA to incorporate research-oriented, remotely-sensed snow data into CBRFC operations and to improve the accuracy of CBRFC forecasts. The partnership has yielded valuable analysis of snow surface albedo as represented in JPL's MODIS Dust Radiative Forcing in Snow (MODDRFS) data, across the CBRFC's area of responsibility. When dust layers within a snowpack emerge, reducing the snow surface albedo, the snowmelt rate may accelerate. The CBRFC operational snow model (SNOW17) is a temperature-index model that lacks explicit representation of snowpack surface albedo. CBRFC forecasters monitor MODDRFS data for emerging dust layers and may manually adjust SNOW17 melt rates. A technique was needed for efficient and objective incorporation of the MODDRFS data into SNOW17. Initial development focused on Colorado, where dust-on-snow events frequently occur. CBRFC forecasters used retrospective JPL-CBRFC analysis and developed a quantitative relationship between MODDRFS data and mean areal temperature (MAT) data. The relationship was used to generate adjusted, MODDRFS-informed input for SNOW17. Impacts of the MODDRFS-SNOW17 MAT adjustment method on snowmelt-driven streamflow prediction varied spatially and with characteristics of the dust deposition events. The largest improvements occurred in southwestern Colorado, in years with intense dust deposition events. Application of the method in other regions of Colorado and in "low dust" years resulted in minimal impact. The MODDRFS-SNOW17 MAT technique will be implemented in CBRFC operations in late 2015, prior to spring 2016 runoff. Collaborative investigation of remote sensing-based adjustment methods for the CBRFC operational hydrologic forecasting environment will continue over the next several years.
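    The core of a temperature-index model, and the MAT-adjustment trick, can be caricatured as follows. The melt factor, base temperature, and the size of the dust-driven adjustment are invented for illustration and are not CBRFC or SNOW17 values; SNOW17's actual formulation is considerably more elaborate:

```python
def degree_day_melt(temps_c, melt_factor=2.5, base_temp=0.0, mat_adjust=0.0):
    """Daily melt (mm) from a temperature index; `mat_adjust` warms the
    mean areal temperature input to mimic the faster melt implied by a
    dust-lowered snow surface albedo."""
    return [melt_factor * max(t + mat_adjust - base_temp, 0.0) for t in temps_c]

clean = degree_day_melt([-2.0, 1.0, 4.0])                  # [0.0, 2.5, 10.0]
dusty = degree_day_melt([-2.0, 1.0, 4.0], mat_adjust=1.5)  # [0.0, 6.25, 13.75]
```

    Raising the effective temperature input accelerates melt on above-freezing days while leaving cold days untouched, which is the qualitative behavior the MODDRFS-informed adjustment targets.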

  20. CEREF: A hybrid data-driven model for forecasting annual streamflow from a socio-hydrological system

    Science.gov (United States)

    Zhang, Hongbo; Singh, Vijay P.; Wang, Bin; Yu, Yinghao

    2016-09-01

    Hydrological forecasting is complicated by flow regime alterations in a coupled socio-hydrologic system, encountering increasingly non-stationary, nonlinear and irregular changes, which make decision support difficult for future water resources management. Currently, many hybrid data-driven models, based on the decomposition-prediction-reconstruction principle, have been developed to improve the ability to make predictions of annual streamflow. However, several problems require further investigation, chief among which is that the direction of the trend component decomposed from an annual streamflow series is always difficult to ascertain. In this paper, a hybrid data-driven model was proposed to address this issue, which combined empirical mode decomposition (EMD), radial basis function neural networks (RBFNN), and an external forces (EF) variable, also called the CEREF model. The hybrid model employed EMD for decomposition and RBFNN for intrinsic mode function (IMF) forecasting, and determined future trend component directions by regression with EF as basin water demand representing the social component in the socio-hydrologic system. The Wuding River basin was considered for the case study, and two standard statistical measures, root mean squared error (RMSE) and mean absolute error (MAE), were used to evaluate the performance of the CEREF model and compare it with other models: the autoregressive (AR), RBFNN and EMD-RBFNN. Results indicated that the CEREF model had lower RMSE and MAE statistics, 42.8% and 7.6%, respectively, than did other models, and provided a superior alternative for forecasting annual runoff in the Wuding River basin. Moreover, the CEREF model can enlarge the effective intervals of streamflow forecasting compared to the EMD-RBFNN model by introducing the water demand planned by the government department to improve long-term prediction accuracy. In addition, we considered the high-frequency component, a frequent subject of concern in EMD
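    The decomposition-prediction-reconstruction principle underlying such hybrid models can be sketched as follows. For brevity a simple moving average stands in for EMD's sifting into IMFs and a residue, and the streamflow series is synthetic:

```python
import numpy as np

def decompose(series, window=5):
    """Split a series into a slow trend and a fast residual (a moving
    average stands in here for EMD's sifting into components)."""
    trend = np.convolve(series, np.ones(window) / window, mode="same")
    return trend, series - trend

rng = np.random.default_rng(0)
flow = 100.0 + 0.5 * np.arange(40) + rng.normal(0.0, 5.0, 40)  # synthetic

trend, residual = decompose(flow)
# Reconstruction: the components sum exactly back to the original series.
assert np.allclose(trend + residual, flow)
```

    In a CEREF-style model each fast component would be forecast separately (e.g., with an RBF network), the trend's future direction would be constrained by regression on the external-forces variable, and the component forecasts would be summed to reconstruct the streamflow forecast.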

  1. Modeling Potential Impacts of Climate Change on Streamflow Using Projections of the 5th Assessment Report for the Bernam River Basin, Malaysia

    Directory of Open Access Journals (Sweden)

    Nkululeko Simeon Dlamini

    2017-03-01

    Potential impacts of climate change on the streamflow of the Bernam River Basin in Malaysia are assessed using ten Global Climate Models (GCMs) under three Representative Concentration Pathways (RCP4.5, RCP6.0 and RCP8.5). A graphical user interface was developed that integrates all of the common procedures of assessing climate change impacts, to generate high-resolution climate variables (e.g., rainfall, temperature) at the local scale from large-scale climate models. These are linked in one executable module to generate future climate sequences that can be used as inputs to various models, including hydrological and crop models. The generated outputs were used as inputs to the SWAT hydrological model to simulate the hydrological processes. The evaluation results indicated that the model performed well for the watershed, with monthly R2, Nash–Sutcliffe Efficiency (NSE) and Percent Bias (PBIAS) values of 0.67, 0.62 and −9.4 for the calibration period and 0.62, 0.61 and −4.2 for the validation period, respectively. The multi-model projections show an increase in future temperature (tmax and tmin) in all scenarios, up to an average of 2.5 °C under the worst-case scenario (RCP8.5). Rainfall is also predicted to change, with clear variations between the dry and wet seasons. Streamflow projections also followed the rainfall pattern to a great extent, with a distinct change between the dry and wet seasons, possibly due to the increase in evapotranspiration in the watershed. In principle, the interface can be customized for application to other watersheds by incorporating GCMs' baseline data and their corresponding future data for the particular stations in the new watershed. Methodological limitations of the study are also discussed.
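    One common way such an interface turns GCM output into local future climate sequences is delta-change scaling; the abstract does not state which downscaling method the tool uses, so the sketch below is a generic illustration with invented monthly values:

```python
def delta_change(obs_clim, gcm_base, gcm_future):
    """Scale an observed monthly rainfall climatology (mm) by the GCM's
    relative change (future / baseline), month by month."""
    return [o * (f / b) for o, b, f in zip(obs_clim, gcm_base, gcm_future)]

# Illustrative two-month example: station climatology vs GCM periods.
future = delta_change(obs_clim=[120.0, 80.0],
                      gcm_base=[100.0, 100.0],
                      gcm_future=[110.0, 90.0])
# future ~ [132.0, 72.0]: +10% in the wetter month, -10% in the drier one
```

    The scaled series would then be fed to SWAT (or a crop model) in place of the observed forcing to produce the projected streamflow scenarios.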

  2. Estimating daily time series of streamflow using hydrological model calibrated based on satellite observations of river water surface width: Toward real world applications.

    Science.gov (United States)

    Sun, Wenchao; Ishidaira, Hiroshi; Bastola, Satish; Yu, Jingshan

    2015-05-01

    Lacking observation data for calibration constrains applications of hydrological models to estimate daily time series of streamflow. Recent improvements in remote sensing enable detection of river water-surface width from satellite observations, making possible the tracking of streamflow from space. In this study, a method calibrating hydrological models using river width derived from remote sensing is demonstrated through application to the ungauged Irrawaddy Basin in Myanmar. Generalized likelihood uncertainty estimation (GLUE) is selected as a tool for automatic calibration and uncertainty analysis. Of 50,000 randomly generated parameter sets, 997 are identified as behavioral, based on comparing model simulation with satellite observations. The uncertainty band of streamflow simulation can span most of 10-year average monthly observed streamflow for moderate and high flow conditions. Nash-Sutcliffe efficiency is 95.7% for the simulated streamflow at the 50% quantile. These results indicate that application to the target basin is generally successful. Beyond evaluating the method in a basin lacking streamflow data, difficulties and possible solutions for applications in the real world are addressed to promote future use of the proposed method in more ungauged basins.
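    The GLUE workflow described above (sample many parameter sets, keep the behavioral ones, report quantile bands) looks roughly like this. The one-parameter toy model, the NSE likelihood measure and the 0.8 behavioral threshold are placeholders; the actual study scores simulations against satellite-derived river widths, not flows:

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(rain, k):
    """Stand-in rainfall-runoff model: flow = k * rain."""
    return k * rain

rain = rng.uniform(0.0, 20.0, 100)
observed = toy_model(rain, 0.6) + rng.normal(0.0, 1.0, 100)

def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# GLUE: random parameter sampling, behavioral selection, quantile bands.
candidates = rng.uniform(0.0, 1.0, 5000)
scores = np.array([nse(toy_model(rain, k), observed) for k in candidates])
behavioral = candidates[scores > 0.8]
sims = np.array([toy_model(rain, k) for k in behavioral])
low, median, high = np.quantile(sims, [0.05, 0.5, 0.95], axis=0)
```

    The `low`/`high` quantiles across behavioral simulations form the uncertainty band, and the median corresponds to the 50%-quantile streamflow reported in the abstract.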

  3. Streamflow in the upper Mississippi river basin as simulated by SWAT driven by 20th century contemporary results of global climate models and NARCCAP regional climate models

    Energy Technology Data Exchange (ETDEWEB)

    Takle, Eugene S.; Jha, Manoj; Lu, Er; Arritt, Raymond W.; Gutowski, William J. [Iowa State Univ. Ames, IA (United States)

    2010-06-15

    We use the Soil and Water Assessment Tool (SWAT), driven by observations and by results of climate models, to evaluate hydrological quantities, including streamflow, in the Upper Mississippi River Basin (UMRB) for 1981-2003 in comparison to observed streamflow. Daily meteorological conditions used as input to SWAT are taken from (1) observations at weather stations in the basin, (2) daily meteorological conditions simulated by a collection of regional climate models (RCMs) driven by reanalysis boundary conditions, and (3) daily meteorological conditions simulated by a collection of global climate models (GCMs). Regional models used are those whose data are archived by the North American Regional Climate Change Assessment Program (NARCCAP). Results show that regional models correctly simulate the seasonal cycle of precipitation, temperature, and streamflow within the basin. Regional models also capture interannual extremes represented by the flood of 1993 and the dry conditions of 2000. The ensemble means of both the GCM-driven and RCM-driven simulations by SWAT capture both the timing and amplitude of the seasonal cycle of streamflow, with neither demonstrating significant superiority at the basin level. (orig.)

  4. Intercomparison of Streamflow Simulations between WRF-Hydro and Hydrology Laboratory-Research Distributed Hydrologic Model Frameworks

    Science.gov (United States)

    KIM, J.; Smith, M. B.; Koren, V.; Salas, F.; Cui, Z.; Johnson, D.

    2017-12-01

    The National Oceanic and Atmospheric Administration (NOAA)-National Weather Service (NWS) developed the Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM) framework as an initial step towards spatially distributed modeling at River Forecast Centers (RFCs). Recently, the NOAA/NWS worked with the National Center for Atmospheric Research (NCAR) to implement the National Water Model (NWM) for nationally-consistent water resources prediction. The NWM is based on the WRF-Hydro framework and is run at a 1km spatial resolution and 1-hour time step over the contiguous United States (CONUS) and contributing areas in Canada and Mexico. In this study, we compare streamflow simulations from HL-RDHM and WRF-Hydro to observations from 279 USGS stations. For streamflow simulations, HL-RDHM is run on 4km grids with a temporal resolution of 1 hour for a 5-year period (Water Years 2008-2012), using a priori parameters provided by NOAA-NWS. The WRF-Hydro streamflow simulations for the same time period are extracted from NCAR's 23-year retrospective run of the NWM (version 1.0) over CONUS based on 1km grids. We choose 279 USGS stations, in the domains of six different RFCs, which are relatively less affected by dams or reservoirs. We use the daily average values of simulations and observations for ease of comparison. The main purpose of this research is to evaluate how HL-RDHM and WRF-Hydro perform at USGS gauge stations. We compare daily time-series of observations and both simulations, and calculate the error values using a variety of error functions. Using these plots and error values, we evaluate the performance of the HL-RDHM and WRF-Hydro models. Our results show a mix of model performance across geographic regions.
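    A comparison like this reduces to a handful of error functions evaluated on the daily series at each gauge. The abstract does not name the exact functions used, so the three below (RMSE, NSE and percent bias) are common choices in streamflow evaluation rather than the study's definitive set, and the five-day series is invented:

```python
import numpy as np

def rmse(sim, obs):
    """Root mean squared error (same units as the flows)."""
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than
    predicting the observed mean."""
    return float(1.0 - np.sum((sim - obs) ** 2)
                 / np.sum((obs - np.mean(obs)) ** 2))

def pbias(sim, obs):
    """Percent bias; positive means the simulation overestimates total flow."""
    return float(100.0 * np.sum(sim - obs) / np.sum(obs))

obs = np.array([10.0, 12.0, 8.0, 15.0, 11.0])   # daily observed flow
sim = np.array([9.0, 13.0, 8.5, 14.0, 12.0])    # daily simulated flow
```

    Evaluating each metric per gauge and mapping the results is what reveals the geographic mix of model performance noted in the abstract.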

  5. Monthly hydrometeorological ensemble prediction of streamflow droughts and corresponding drought indices

    Directory of Open Access Journals (Sweden)

    F. Fundel

    2013-01-01

    Streamflow droughts, characterized by low runoff as a consequence of a drought event, affect numerous aspects of life. Economic sectors that are impacted by low streamflow are, e.g., power production, agriculture, tourism, water quality management and shipping. Those sectors could potentially benefit from forecasts of streamflow drought events, even of short events on monthly time scales or below. Numerical hydrometeorological models have increasingly been used to forecast low streamflow and have become the focus of recent research. Here, we consider daily ensemble runoff forecasts for the river Thur, which has its source in the Swiss Alps. We focus on the evaluation of low streamflow and of derived indices such as duration, severity and magnitude, characterizing streamflow droughts up to a lead time of one month.

    The ECMWF VarEPS 5-member ensemble reforecast, which covers 18 yr, is used as forcing for the hydrological model PREVAH. A thorough verification reveals that, compared to probabilistic peak-flow forecasts, which show skill up to a lead time of two weeks, forecasts of streamflow droughts are skilful over the entire forecast range of one month. For forecasts at the lower end of the runoff regime, the quality of the initial state seems to be crucial to achieve a good forecast quality in the longer range. It is shown that the states used in this study to initialize forecasts satisfy this requirement. The produced forecasts of streamflow drought indices, derived from the ensemble forecasts, could be beneficially included in a decision-making process. This is valid for probabilistic forecasts of streamflow drought events falling below a daily varying threshold, based on a quantile derived from a runoff climatology. Although the forecasts have a tendency to overpredict streamflow droughts, it is shown that the relative economic value of the ensemble forecasts reaches up to 60%, in case a forecast user is able to take preventive action based on the forecast.

  6. Monthly hydrometeorological ensemble prediction of streamflow droughts and corresponding drought indices

    Science.gov (United States)

    Fundel, F.; Jörg-Hess, S.; Zappa, M.

    2013-01-01

    Streamflow droughts, characterized by low runoff as a consequence of a drought event, affect numerous aspects of life. Economic sectors that are impacted by low streamflow are, e.g., power production, agriculture, tourism, water quality management and shipping. Those sectors could potentially benefit from forecasts of streamflow drought events, even of short events on monthly time scales or below. Numerical hydrometeorological models have increasingly been used to forecast low streamflow and have become the focus of recent research. Here, we consider daily ensemble runoff forecasts for the river Thur, which has its source in the Swiss Alps. We focus on the evaluation of low streamflow and of derived indices such as duration, severity and magnitude, characterizing streamflow droughts up to a lead time of one month. The ECMWF VarEPS 5-member ensemble reforecast, which covers 18 yr, is used as forcing for the hydrological model PREVAH. A thorough verification reveals that, compared to probabilistic peak-flow forecasts, which show skill up to a lead time of two weeks, forecasts of streamflow droughts are skilful over the entire forecast range of one month. For forecasts at the lower end of the runoff regime, the quality of the initial state seems to be crucial to achieve a good forecast quality in the longer range. It is shown that the states used in this study to initialize forecasts satisfy this requirement. The produced forecasts of streamflow drought indices, derived from the ensemble forecasts, could be beneficially included in a decision-making process. This is valid for probabilistic forecasts of streamflow drought events falling below a daily varying threshold, based on a quantile derived from a runoff climatology. Although the forecasts have a tendency to overpredict streamflow droughts, it is shown that the relative economic value of the ensemble forecasts reaches up to 60%, in case a forecast user is able to take preventive action based on the forecast.
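    The "relative economic value" quoted above comes from a cost-loss analysis. A minimal deterministic version can be written from a forecast contingency table; the counts, cost and loss below are invented, and the study's own analysis is probabilistic, evaluated over ensemble-derived thresholds:

```python
def relative_economic_value(hits, misses, false_alarms, n, cost, loss):
    """Cost-loss relative economic value: 1 means the forecast saves as
    much as a perfect forecast, 0 means no better than climatology."""
    o = (hits + misses) / n                        # event base rate
    e_forecast = ((hits + false_alarms) * cost + misses * loss) / n
    e_climate = min(cost, o * loss)                # best of always/never act
    e_perfect = o * cost
    return (e_climate - e_forecast) / (e_climate - e_perfect)

# Illustrative counts over 100 forecast days; protecting costs 1 unit,
# an unprotected drought event costs 10 units.
value = relative_economic_value(hits=20, misses=5, false_alarms=10,
                                n=100, cost=1.0, loss=10.0)
```

    A user who acts on every drought forecast avoids part of the loss at the price of the false alarms; the value depends strongly on the user's cost/loss ratio, which is why the abstract conditions the 60% figure on the user being able to take preventive action.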

  7. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    Science.gov (United States)

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-11-01

    The Soil and Water Assessment Tool (SWAT) is a well established, distributed, eco-hydrologic model. However, using the case study of an intensively irrigated agricultural watershed, it was shown that none of the model versions is able to appropriately reproduce the total streamflow in such a system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version to correctly simulate the main hydrological processes. Crop yield, total streamflow, total suspended sediment (TSS) losses and phosphorus load calibration and validation were performed using field survey information and water quantity and quality data recorded during 2008 and 2009 in the Del Reguero irrigated watershed in Spain. The goodness of the calibration and validation results was assessed using five statistical measures, including the Nash-Sutcliffe efficiency (NSE). Results indicated that the average annual crop yield and actual evapotranspiration estimations were quite satisfactory. On a monthly basis, the values of NSE were 0.90 (calibration) and 0.80 (validation), indicating that the modified model could reproduce the observed streamflow accurately. The TSS losses were also satisfactorily estimated (NSE = 0.72 and 0.52 for the calibration and validation steps). The monthly temporal patterns and all the statistical parameters indicated that the modified SWAT-IRRIG model adequately predicted the total phosphorus (TP) loading. Therefore, the model could be used to assess the impacts of different best management practices on nonpoint phosphorus losses in irrigated systems.

  8. Validating Animal Models

    Directory of Open Access Journals (Sweden)

    Nina Atanasova

    2015-06-01

    In this paper, I respond to the challenge raised against contemporary experimental neurobiology according to which the field is in a state of crisis because the multiple experimental protocols employed in different laboratories presumably preclude the validity of neurobiological knowledge. I provide an alternative account of experimentation in neurobiology which makes sense of its experimental practices. I argue that maintaining a multiplicity of experimental protocols and strengthening their reliability are well justified, and that they foster rather than preclude the validity of neurobiological knowledge. Thus, their presence indicates thriving rather than crisis of experimental neurobiology.

  9. Application of the geological streamflow and Muskingum Cunge ...

    African Journals Online (AJOL)

    ... of the geological streamflow and Muskingum Cunge models in the Yala River Basin, Kenya. ... can be represented by the application of hydrologic and hydraulic models. ... verification and streamflow routing based on a split record analysis.

  10. Parameterizing sub-surface drainage with geology to improve modeling streamflow responses to climate in data limited environments

    Directory of Open Access Journals (Sweden)

    C. L. Tague

    2013-01-01

    Hydrologic models are one of the core tools used to project how water resources may change under a warming climate. These models are typically applied over a range of scales, from headwater streams to higher order rivers, and for a variety of purposes, such as evaluating changes to aquatic habitat or reservoir operation. Most hydrologic models require streamflow data to calibrate subsurface drainage parameters. In many cases, long-term gage records may not be available for calibration, particularly when assessments are focused on low-order stream reaches. Consequently, hydrologic modeling of climate change impacts is often performed in the absence of sufficient data to fully parameterize these hydrologic models. In this paper, we assess a geologic-based strategy for assigning drainage parameters. We examine the performance of this modeling strategy for the McKenzie River watershed in the US Oregon Cascades, a region where previous work has demonstrated sharp contrasts in hydrology based primarily on geological differences between the High and Western Cascades. Based on calibration and verification using existing streamflow data, we demonstrate that: (1) a set of streams ranging from 1st to 3rd order within the Western Cascade geologic region can share the same drainage parameter set, while (2) streams from the High Cascade geologic region require a different parameter set. Further, we show that a watershed comprised of a mixture of High and Western Cascade geologies can be modeled without additional calibration by transferring parameters from these distinctive High and Western Cascade end-member parameter sets. More generally, we show that by defining a set of end-member parameters that reflect different geologic classes, we can more efficiently apply a hydrologic model over a geologically complex landscape and resolve geo-climatic differences in how different watersheds are likely to respond to simple warming scenarios.

  11. Modelling the effects of climate change on streamflow in a sub-basin of the lower Churchill River

    International Nuclear Information System (INIS)

    Pryse-Phillips, Amy; Snelgrove, Ken

    2010-01-01

    Climate change is likely to affect extreme flows as well as average flows. This is an important consideration for hydroelectric power producers. This paper presented the development of an approach to assess the impact of climate change on seasonal and average annual river flows. The main goal was to investigate how climate change will affect the hydroelectric potential of the Lower Churchill Project using different combinations of emissions scenarios, climate model output and downscaling techniques. The setup and calibration of the numerical hydrological model, WATFLOOD, were performed as preliminary work for the Pinus River basin, selected as the study basin. Downscaled climate data from the North American Regional Climate Change Assessment Program (NARCCAP) for both current and future climate periods were analysed. The calibrated model was used to simulate the current and future period streamflow scenarios. The results showed a 13 percent increase in mean annual flows concentrated in the winter and spring seasons.

  12. Modelling the effects of climate change on streamflow in a sub-basin of the lower Churchill River

    Energy Technology Data Exchange (ETDEWEB)

    Pryse-Phillips, Amy [Hatch Ltd., St John's (Canada)]; Snelgrove, Ken [Memorial University of Newfoundland, St John's (Canada)]

    2010-07-01

    Climate change is likely to affect extreme flows as well as average flows. This is an important consideration for hydroelectric power producers. This paper presented the development of an approach to assess the impact of climate change on seasonal and average annual river flows. The main goal was to investigate how climate change will affect the hydroelectric potential of the Lower Churchill Project using different combinations of emissions scenarios, climate model output and downscaling techniques. The setup and calibration of the numerical hydrological model, WATFLOOD, were performed as preliminary work for the Pinus River basin, selected as the study basin. Downscaled climate data from the North American Regional Climate Change Assessment Program (NARCCAP) for both current and future climate periods were analysed. The calibrated model was used to simulate the current and future period streamflow scenarios. The results showed a 13 percent increase in mean annual flows concentrated in the winter and spring seasons.

  13. Quantifying Streamflow Variations in Ungauged Lake Basins by Integrating Remote Sensing and Water Balance Modelling: A Case Study of the Erdos Larus relictus National Nature Reserve, China

    Directory of Open Access Journals (Sweden)

    Kang Liang

    2017-06-01

    Hydrological predictions in ungauged lakes are one of the most important issues in hydrological sciences. The habitat of the Relict Gull (Larus relictus) in the Erdos Larus relictus National Nature Reserve (ELRNNR) has been seriously endangered by lake shrinkage, yet the hydrological processes in the catchment are poorly understood due to the lack of in-situ observations. Therefore, it is necessary to assess the variation in lake streamflow and its drivers. In this study, we employed remote sensing techniques and an empirical equation to quantify the time series of lake water budgets, and integrated a water balance model and the climate elasticity method to further examine ELRNNR basin streamflow variations from 1974 to 2013. The results show that lake variations went through three phases with significant differences: the rapidly expanding sub-period (1974–1979), the relatively stable sub-period (1980–1999), and the dramatically shrinking sub-period (2000–2013). Both climate variation (expressed by precipitation and evapotranspiration) and human activities were quantified as drivers of streamflow variation, and the driving forces in the three phases had different contributions. As human activities gradually intensified, their contribution to streamflow variation clearly increased, accounting for 22.3% during 1980–1999 and up to 59.2% during 2000–2013. Intensified human interference and climate warming have jointly led to the lake shrinkage since 1999. This study provides a useful reference for quantifying lake streamflow and its drivers in ungauged basins.
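    The climate-versus-human attribution above follows a standard water-balance logic: estimate what post-change flow would have been under the new climate alone (e.g., from climate elasticity), then assign the remainder of the observed change to human activity. A schematic with invented flow numbers, not the study's data:

```python
def attribute_change(q_pre, q_post, q_post_climate_only):
    """Split the observed change in mean flow between climate and human
    drivers, given an estimate (from a water balance model or climate
    elasticity) of post-period flow under climate change alone."""
    total = q_post - q_pre
    climate = q_post_climate_only - q_pre
    human = total - climate
    return {"climate_share": climate / total, "human_share": human / total}

# Invented flows (mm/yr): pre-change 50, observed post-change 30,
# modelled post-change under climate alone 42.
shares = attribute_change(50.0, 30.0, 42.0)   # human_share = 0.6
```

    The shares sum to one by construction; in practice their uncertainty depends heavily on how well the climate-only counterfactual flow can be estimated.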

  14. Exploiting Soil Moisture, Precipitation, and Streamflow Observations to Evaluate Soil Moisture/Runoff Coupling in Land Surface Models

    Science.gov (United States)

    Crow, W. T.; Chen, F.; Reichle, R. H.; Xia, Y.; Liu, Q.

    2018-05-01

    Accurate partitioning of precipitation into infiltration and runoff is a fundamental objective of land surface models tasked with characterizing the surface water and energy balance. Temporal variability in this partitioning is due, in part, to changes in prestorm soil moisture, which determine soil infiltration capacity and unsaturated storage. Utilizing the National Aeronautics and Space Administration Soil Moisture Active Passive Level-4 soil moisture product in combination with streamflow and precipitation observations, we demonstrate that land surface models (LSMs) generally underestimate the strength of the positive rank correlation between prestorm soil moisture and event runoff coefficients (i.e., the fraction of rainfall accumulation volume converted into stormflow runoff during a storm event). Underestimation is largest for LSMs employing an infiltration-excess approach for stormflow runoff generation. More accurate coupling strength is found in LSMs that explicitly represent subsurface stormflow or saturation-excess runoff generation processes.
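    The coupling-strength diagnostic above is a rank correlation between prestorm soil moisture and event runoff coefficients. A self-contained version (ties ignored, which is fine for continuous data) with made-up storm samples:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation as the Pearson correlation of ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

# Prestorm soil moisture (m3/m3) and event runoff coefficient for six
# hypothetical storms.
sm = np.array([0.12, 0.30, 0.22, 0.41, 0.18, 0.35])
rc = np.array([0.05, 0.22, 0.15, 0.38, 0.09, 0.30])
rho = spearman_rho(sm, rc)   # 1.0 here: runoff fraction rises with wetness
```

    Computing this correlation from observed events and from each LSM's simulated events, and comparing the two, is the kind of test under which the models were found to underestimate the coupling strength.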

  15. Validating Dart Model

    Directory of Open Access Journals (Sweden)

    Mazur Jolanta

    2014-12-01

    The primary objective of the study was to quantitatively test the DART model, which, despite being one of the most popular representations of the co-creation concept, was so far studied almost solely with qualitative methods. To this end, the researchers developed a multiple-item measurement scale and employed it in interviewing managers. The statistical evidence for the adequacy of the model was obtained through CFA with AMOS software. The findings suggest that the DART model may not be an accurate representation of co-creation practices in companies. From the data analysis it was evident that the building blocks of DART had too much conceptual overlap to be an effective framework for quantitative analysis. It was also implied that the phenomenon of co-creation is so rich and multifaceted that it may be more adequately captured by a measurement model where co-creation is conceived as a third-level factor with two layers of intermediate latent variables.

  16. Global-scale high-resolution (˜ 1 km) modelling of mean, maximum and minimum annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark; Hendriks, Jan; Beusen, Arthur; Clavreul, Julie; King, Henry; Schipper, Aafke

    2017-04-01

    Quantifying mean, maximum and minimum annual flow (AF) of rivers at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. AF metrics can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict AF metrics based on climate and catchment characteristics. Yet, so far, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. We developed global-scale regression models that quantify mean, maximum and minimum AF as a function of catchment area and catchment-averaged slope, elevation, and mean, maximum and minimum annual precipitation and air temperature. We then used these models to obtain global 30 arc-seconds (˜ 1 km) maps of mean, maximum and minimum AF for each year from 1960 through 2015, based on a newly developed hydrologically conditioned digital elevation model. We calibrated our regression models based on observations of discharge and catchment characteristics from about 4,000 catchments worldwide, ranging from 10⁰ to 10⁶ km2 in size, and validated them against independent measurements as well as the output of a number of process-based global hydrological models (GHMs). The variance explained by our regression models ranged up to 90% and the performance of the models compared well with the performance of existing GHMs. Yet, our AF maps provide a level of spatial detail that cannot yet be achieved by current GHMs.
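Regression models of this kind typically take a log-linear (power-law) form. A minimal sketch of fitting such a model by ordinary least squares, using made-up training data and only two of the predictors named above (catchment area and mean annual precipitation); this is an illustration of the general technique, not the study's actual model:

```python
import numpy as np

# Hypothetical training data: catchment area (km2), mean annual
# precipitation (mm) and mean annual flow (m3/s). The real models also
# use slope, elevation and air temperature.
area = np.array([120.0, 850.0, 3200.0, 15000.0, 96000.0])
precip = np.array([600.0, 900.0, 1100.0, 750.0, 1300.0])
maf = np.array([0.9, 12.0, 65.0, 180.0, 2400.0])

# Fit log10(MAF) = b0 + b1*log10(area) + b2*log10(precip), a power-law form.
X = np.column_stack([np.ones_like(area), np.log10(area), np.log10(precip)])
coef, *_ = np.linalg.lstsq(X, np.log10(maf), rcond=None)

def predict_maf(a, p):
    """Back-transform the log-linear prediction to flow in m3/s."""
    return 10.0 ** (coef[0] + coef[1] * np.log10(a) + coef[2] * np.log10(p))
```

As the abstract cautions, such a fit should only be applied where the inputs fall within the range of values used in calibration.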

  17. Improving Streamflow Simulation in Gaged and Ungaged Areas Using a Multi-Model Synthesis Combined with Remotely-Sensed Data and Estimates of Uncertainty

    Science.gov (United States)

    Lafontaine, J.; Hay, L.

    2015-12-01

    The United States Geological Survey (USGS) has developed a National Hydrologic Model (NHM) to support coordinated, comprehensive and consistent hydrologic model development, and facilitate the application of hydrologic simulations within the conterminous United States (CONUS). More than 1,700 gaged watersheds across the CONUS were modeled to test the feasibility of improving streamflow simulations in gaged and ungaged watersheds by linking statistically- and physically-based hydrologic models with remotely-sensed data products (i.e. - snow water equivalent) and estimates of uncertainty. Initially, the physically-based models were calibrated to measured streamflow data to provide a baseline for comparison. As many stream reaches in the CONUS are either not gaged, or are substantially impacted by water use or flow regulation, ancillary information must be used to determine reasonable parameter estimations for streamflow simulations. In addition, not all ancillary datasets are appropriate for application to all parts of the CONUS (e.g. - snow water equivalent in the southeastern U.S., where snow is a rarity). As it is not expected that any one data product or model simulation will be sufficient for representing hydrologic behavior across the entire CONUS, a systematic evaluation of which data products improve simulations of streamflow for various regions across the CONUS was performed. The resulting portfolio of calibration strategies can be used to guide selection of an appropriate combination of simulated and measured information for model development and calibration at a given location of interest. In addition, these calibration strategies have been developed to be flexible so that new data products or simulated information can be assimilated. 
This analysis provides a foundation to understand how well models work when streamflow data is either not available or is limited and could be used to further inform hydrologic model parameter development for ungaged areas.

  18. Evaluating Impacts of climate and land use changes on streamflow using SWAT and land use models based CESM1-CAM5 Climate scenarios

    Science.gov (United States)

    Lin, Tzu Ping; Lin, Yu Pin; Lien, Wan Yu

    2015-04-01

    Climate change has various levels of impact on hydrological cycles around the world. Climate projections from the general circulation models (GCMs) of the Coupled Model Intercomparison Project Phase 5 (CMIP5) were released for Taiwan in 2014. Because streamflow runs directly into the ocean owing to the steep terrain, and the difference in rainfall between wet and dry seasons is pronounced, allocating water resources reasonably is very challenging in Taiwan, particularly under climate change. The purpose of this study was to evaluate the impacts of climate and land use changes on a small watershed in Taiwan. AR5 GCM output data were adopted and downscaled from monthly to daily weather data to serve as input to the hydrological model, the Soil and Water Assessment Tool (SWAT). The spatially explicit land use change model, the Conversion of Land Use and its Effects at Small regional extent (CLUE-s), was applied to simulate land use scenarios for 2020-2039. The combined climate and land use change scenarios were then used as input to the SWAT model to estimate future streamflows. With increasing precipitation, increasing urban area, and decreasing agricultural and grass land, annual streamflow in most of the twenty-three subbasins also increased. In addition, owing to increasing rainfall in the wet season and decreasing rainfall in the dry season, the difference in streamflow between the wet and dry seasons also increased. These results indicate a more stringent challenge for future water resource management. Therefore, the impacts of climate change and land use change on water resources should be considered in water resource planning for the Datuan river watershed. Keywords: SWAT, GCM, CLUE-s, streamflow, climate change, land use change

  19. Validation through model testing

    International Nuclear Information System (INIS)

    1995-01-01

    Geoval-94 is the third Geoval symposium arranged jointly by the OECD/NEA and the Swedish Nuclear Power Inspectorate. Earlier symposia in this series took place in 1987 and 1990. In many countries, ongoing programmes to site and construct deep geological repositories for high- and intermediate-level nuclear waste are close to realization. A number of studies demonstrate the potential barrier function of the geosphere, but also that there are many unresolved issues. A key to these problems is the possibility of gaining knowledge by testing models against experiments and of increasing confidence in the models used for prediction. The sessions cover conclusions from the INTRAVAL project, experiences from integrated experimental programmes and underground research laboratories, as well as the integration between performance assessment and site characterisation. Technical issues ranging from waste and buffer interactions with the rock to radionuclide migration in different geological media are addressed. (J.S.)

  20. Perpetual Model Validation

    Science.gov (United States)

    2017-03-01

    25]. This inference process is carried out by a tool referred to as Hynger (Hybrid iNvariant GEneratoR), overviewed in Figure 4, which is a MATLAB ...initially on memory access patterns. A monitoring module will check, at runtime, that the observed memory access pattern matches the pattern the software is...necessary. By using the developed approach, a model may be derived from initial tests or simulations, which will then be formally checked at runtime

  1. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    Science.gov (United States)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

    Distributed hydrological models are important tools in water management as they account for the spatial variability of the hydrological data, as well as being able to produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face, i.e. a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte-Carlo simulations, and the region of space containing the parameter sets that were considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
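The Monte-Carlo exploration of a parameter space with a behavioral acceptance criterion can be sketched in a few lines. Below, a toy two-parameter linear model stands in for WASMOD, and a simple RMSE threshold stands in for the study's behavioral criteria (the alpha-shape delimitation is omitted); all data are invented:

```python
import random

random.seed(42)

# Toy stand-in for the rainfall-runoff model: q = a * rain + b.
rain = [5.0, 0.0, 12.0, 3.0, 8.0, 1.0]    # hypothetical forcing
obs = [11.0, 1.2, 24.5, 7.0, 16.8, 3.1]   # hypothetical observed discharge

def rmse(sim, observed):
    return (sum((s - o) ** 2 for s, o in zip(sim, observed)) / len(observed)) ** 0.5

# Monte-Carlo exploration: keep "behavioral" parameter sets whose
# simulated discharge is close enough to the observations.
behavioral = []
for _ in range(5000):
    a = random.uniform(0.0, 4.0)          # uniform priors over the ranges
    b = random.uniform(0.0, 3.0)
    sim = [a * r + b for r in rain]
    if rmse(sim, obs) < 1.0:              # behavioral threshold
        behavioral.append((a, b))
```

The retained `behavioral` set delimits the region of parameter space consistent with the discharge data; adding internal sub-basin discharge as a further constraint shrinks that region, which is the uncertainty reduction the record describes.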

  2. Simulation of the Quantity, Variability, and Timing of Streamflow in the Dennys River Basin, Maine, by Use of a Precipitation-Runoff Watershed Model

    Science.gov (United States)

    Dudley, Robert W.

    2008-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Maine Department of Marine Resources Bureau of Sea Run Fisheries and Habitat, began a study in 2004 to characterize the quantity, variability, and timing of streamflow in the Dennys River. The study included a synoptic summary of historical streamflow data at a long-term streamflow gage, collecting data from an additional four short-term streamflow gages, and the development and evaluation of a distributed-parameter watershed model for the Dennys River Basin. The watershed model used in this investigation was the USGS Precipitation-Runoff Modeling System (PRMS). The Geographic Information System (GIS) Weasel was used to delineate the Dennys River Basin and subbasins and derive parameters for their physical geographic features. Calibration of the models used in this investigation involved a four-step procedure in which model output was evaluated against four calibration data sets using computed objective functions for solar radiation, potential evapotranspiration, annual and seasonal water budgets, and daily streamflows. The calibration procedure involved thousands of model runs and was carried out using the USGS software application Luca (Let us calibrate). Luca uses the Shuffled Complex Evolution (SCE) global search algorithm to calibrate the model parameters. The SCE method reliably produces satisfactory solutions for large, complex optimization problems. The primary calibration effort went into the Dennys main stem watershed model. Calibrated parameter values obtained for the Dennys main stem model were transferred to the Cathance Stream model, and a similar four-step SCE calibration procedure was performed; this effort was undertaken to determine the potential to transfer modeling information to a nearby basin in the same region. 
The calibrated Dennys main stem watershed model performed with Nash-Sutcliffe efficiency (NSE) statistic values for the calibration period and evaluation period of 0.79 and 0
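The Nash-Sutcliffe efficiency used to score these calibrations compares squared simulation errors against the variance of the observations; NSE = 1 means a perfect fit, and NSE below 0 means the model is worse than predicting the observed mean. A self-contained sketch with hypothetical daily flows:

```python
def nash_sutcliffe(sim, obs):
    """NSE = 1 - sum((sim - obs)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

# Hypothetical daily streamflows (m3/s): observed and simulated.
obs = [3.0, 4.5, 10.0, 7.5, 5.0, 4.0]
sim = [2.8, 5.0, 9.0, 8.0, 5.5, 3.6]
nse = nash_sutcliffe(sim, obs)
```

In a calibration loop such as the SCE procedure described above, this value (or an objective built from it) is what the search algorithm maximizes.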

  3. Skilful seasonal forecasts of streamflow over Europe?

    Science.gov (United States)

    Arnal, Louise; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; Prudhomme, Christel; Neumann, Jessica; Krzeminski, Blazej; Pappenberger, Florian

    2018-04-01

    This paper considers whether there is any added value in using seasonal climate forecasts instead of historical meteorological observations for forecasting streamflow on seasonal timescales over Europe. A Europe-wide analysis of the skill of the newly operational EFAS (European Flood Awareness System) seasonal streamflow forecasts (produced by forcing the Lisflood model with the ECMWF System 4 seasonal climate forecasts), benchmarked against the ensemble streamflow prediction (ESP) forecasting approach (produced by forcing the Lisflood model with historical meteorological observations), is undertaken. The results suggest that, on average, the System 4 seasonal climate forecasts improve the streamflow predictability over historical meteorological observations for the first month of lead time only (in terms of hindcast accuracy, sharpness and overall performance). However, the predictability varies in space and time and is greater in winter and autumn. Parts of Europe additionally exhibit a longer predictability, up to 7 months of lead time, for certain months within a season. In terms of hindcast reliability, the EFAS seasonal streamflow hindcasts are on average less skilful than the ESP for all lead times. The results also highlight the potential usefulness of the EFAS seasonal streamflow forecasts for decision-making (measured in terms of the hindcast discrimination for the lower and upper terciles of the simulated streamflow). Although the ESP is the most potentially useful forecasting approach in Europe, the EFAS seasonal streamflow forecasts appear more potentially useful than the ESP in some regions and for certain seasons, especially in winter for almost 40 % of Europe. 
Patterns in the EFAS seasonal streamflow hindcast skill are however not mirrored in the System 4 seasonal climate hindcasts, hinting at the need for a better understanding of the link between hydrological and meteorological variables on seasonal timescales, with the aim of improving climate-model

  4. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case, and it has a residual character, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model, and it can also serve as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: Sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) or with MCSA (Monte-Carlo sensitivity analysis). Search for the optimal domains of the input parameters, for which a procedure based on Monte-Carlo methods and cluster techniques has been developed. Residual analysis, performed in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings in Spain is presented, studying the behaviour of building components in a Test Cell of the LECE at CIEMAT. (Author) 17 refs

  5. Long-range forecasting of intermittent streamflow

    OpenAIRE

    F. F. van Ogtrop; R. W. Vervoort; G. Z. Heller; D. M. Stasinopoulos; R. A. Rigby

    2011-01-01

    Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine th...
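The occurrence component of such an intermittent-flow model can be sketched as a logistic regression fitted by gradient ascent. GAMLSS adds distributional layers for the flow amounts that are omitted here, and the rainfall-index predictor and monthly records below are invented for illustration:

```python
import math

# Hypothetical monthly records: a lagged rainfall index (predictor)
# and whether any flow occurred that month (0/1 target).
rain_index = [0.1, 0.3, 0.2, 0.8, 1.2, 0.05, 0.9, 1.5, 0.4, 1.1, 0.0, 0.7]
flow_occurs = [0, 0, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit P(flow) = sigmoid(w0 + w1 * rain_index) by gradient ascent
# on the log-likelihood of the logistic model.
w0, w1, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    g0 = g1 = 0.0
    for x, y in zip(rain_index, flow_occurs):
        p = sigmoid(w0 + w1 * x)
        g0 += y - p
        g1 += (y - p) * x
    w0 += lr * g0 / len(rain_index)
    w1 += lr * g1 / len(rain_index)

def prob_flow(x):
    """Forecast probability that the stream flows, given the rainfall index."""
    return sigmoid(w0 + w1 * x)
```

Conditional on occurrence, a second model for the (skewed, non-negative) flow magnitude completes the two-stage forecast.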

  7. Verification and validation of models

    International Nuclear Information System (INIS)

    Herbert, A.W.; Hodgkinson, D.P.; Jackson, C.P.; Lever, D.A.; Robinson, P.C.

    1986-12-01

    The numerical accuracy of the computer models for groundwater flow and radionuclide transport that are to be used in repository safety assessment must be tested, and their ability to describe experimental data assessed: they must be verified and validated respectively. Also appropriate ways to use the codes in performance assessments, taking into account uncertainties in present data and future conditions, must be studied. These objectives are being met by participation in international exercises, by developing bench-mark problems, and by analysing experiments. In particular the project has funded participation in the HYDROCOIN project for groundwater flow models, the Natural Analogues Working Group, and the INTRAVAL project for geosphere models. (author)

  8. Streamflow alteration at selected sites in Kansas

    Science.gov (United States)

    Juracek, Kyle E.; Eng, Ken

    2017-06-26

    An understanding of streamflow alteration in response to various disturbances is necessary for the effective management of stream habitat for a variety of species in Kansas. Streamflow alteration can have negative ecological effects. Using a modeling approach, streamflow alteration was assessed for 129 selected U.S. Geological Survey streamgages in the State for which requisite streamflow and basin-characteristic information was available. The assessment involved a comparison of the observed condition from 1980 to 2015 with the predicted expected (least-disturbed) condition for 29 streamflow metrics. The metrics represent various characteristics of streamflow including average flow (annual, monthly) and low and high flow (frequency, duration, magnitude). Streamflow alteration in Kansas was indicated locally, regionally, and statewide. Given the absence of a pronounced trend in annual precipitation in Kansas, a precipitation-related explanation for streamflow alteration was not supported. Thus, the likely explanation for streamflow alteration was human activity. Locally, a flashier flow regime (typified by shorter lag times and more frequent and higher peak discharges) was indicated for three streamgages with urbanized basins that had higher percentages of impervious surfaces than other basins in the State. The combination of localized reservoir effects and regional groundwater pumping from the High Plains aquifer likely was responsible, in part, for diminished conditions indicated for multiple streamflow metrics in western and central Kansas. Statewide, the implementation of agricultural land-management practices to reduce runoff may have been responsible, in part, for a diminished duration and magnitude of high flows. In central and eastern Kansas, implemented agricultural land-management practices may have been partly responsible for an inflated magnitude of low flows at several sites.
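Metrics like those described (average flow, low-flow magnitude, high-flow duration) are straightforward to compute from a daily streamflow series. A sketch with a synthetic one-year record; the series shape and the 1.4x-mean high-flow threshold are arbitrary choices for illustration, not the study's 29 metrics:

```python
# Synthetic one-year daily streamflow record (m3/s): a 5 m3/s baseline,
# a 90-day wet season at 8 m3/s, and a 60-day dry spell at 3 m3/s.
flows = [5.0 + 3.0 * (90 <= d < 180) - 2.0 * (240 <= d < 300) for d in range(365)]

annual_mean = sum(flows) / len(flows)  # average-flow metric

def n_day_low_flow(q, n=7):
    """Minimum n-day moving-average flow (a low-flow magnitude metric)."""
    return min(sum(q[i:i + n]) / n for i in range(len(q) - n + 1))

low7 = n_day_low_flow(flows)

# High-flow duration metric: days above an arbitrary 1.4 x mean threshold.
high_flow_days = sum(1 for q in flows if q > 1.4 * annual_mean)
```

Comparing such observed metrics against their expected (least-disturbed) values, site by site, is the core of the alteration assessment.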

  9. Disentangling the response of streamflow to forest management and climate

    Science.gov (United States)

    Dymond, S.; Miniat, C.; Bladon, K. D.; Keppeler, E.; Caldwell, P. V.

    2016-12-01

    Paired watershed studies have showcased the relationships between forests, management, and streamflow. However, classical analyses of paired-watershed studies have done little to disentangle the effects of management from overarching climatic signals, potentially masking the interaction between management and climate. Such approaches may confound our understanding of how forest management impacts streamflow. Here we use a 50-year record of streamflow and climate data from the Caspar Creek Experimental Watersheds (CCEW), California, USA to separate the effects of forest management and climate on streamflow. CCEW has two treatment watersheds that have been harvested in the past 50 years. We used a nonlinear mixed model to combine the pre-treatment relationship between streamflow and climate and the post-treatment relationship via an interaction between climate and management into one equation. Our results show that precipitation and potential evapotranspiration alone can account for >95% of the variability in pre-treatment streamflow. Including management scenarios into the model explained most of the variability in streamflow (R2 > 0.98). While forest harvesting altered streamflow in both of our modeled watersheds, removing 66% of the vegetation via selection logging using a tractor yarding system over the entire watershed had a more substantial impact on streamflow than clearcutting small portions of a watershed using cable-yarding. These results suggest that forest harvesting may result in differing impacts on streamflow and highlights the need to incorporate climate into streamflow analyses of paired-watershed studies.

  10. PEMFC modeling and experimental validation

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, J.V.C. [Federal University of Parana (UFPR), Curitiba, PR (Brazil). Dept. of Mechanical Engineering], E-mail: jvargas@demec.ufpr.br; Ordonez, J.C.; Martins, L.S. [Florida State University, Tallahassee, FL (United States). Center for Advanced Power Systems], Emails: ordonez@caps.fsu.edu, martins@caps.fsu.edu

    2009-07-01

    In this paper, a simplified and comprehensive PEMFC mathematical model introduced in previous studies is experimentally validated. Numerical results are obtained for an existing set of commercial unit PEM fuel cells. The model accounts for pressure drops in the gas channels, and for temperature gradients with respect to space in the flow direction, that are investigated by direct infrared imaging, showing that even at low current operation such gradients are present in fuel cell operation, and therefore should be considered by a PEMFC model, since large coolant flow rates are limited due to induced high pressure drops in the cooling channels. The computed polarization and power curves are directly compared to the experimentally measured ones with good qualitative and quantitative agreement. The combination of accuracy and low computational time allow for the future utilization of the model as a reliable tool for PEMFC simulation, control, design and optimization purposes. (author)

  11. Can We Use Regression Modeling to Quantify Mean Annual Streamflow at a Global-Scale?

    Science.gov (United States)

    Barbarossa, V.; Huijbregts, M. A. J.; Hendriks, J. A.; Beusen, A.; Clavreul, J.; King, H.; Schipper, A.

    2016-12-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF using observations of discharge and catchment characteristics from 1,885 catchments worldwide, ranging from 2 to 10⁶ km2 in size. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB [van Beek et al., 2011] by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area, mean annual precipitation and air temperature, average slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error values were lower (0.29 - 0.38 compared to 0.49 - 0.57) and the modified index of agreement was higher (0.80 - 0.83 compared to 0.72 - 0.75). Our regression model can be applied globally at any point of the river network, provided that the input parameters are within the range of values employed in the calibration of the model. The performance is reduced for water scarce regions and further research should focus on improving such an aspect for regression-based global hydrological models.

  12. Developing and testing a global-scale regression model to quantify mean annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark A. J.; Hendriks, A. Jan; Beusen, Arthur H. W.; Clavreul, Julie; King, Henry; Schipper, Aafke M.

    2017-01-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF based on a dataset unprecedented in size, using observations of discharge and catchment characteristics from 1885 catchments worldwide, measuring between 2 and 10⁶ km2. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area and catchment averaged mean annual precipitation and air temperature, slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error (RMSE) values were lower (0.29-0.38 compared to 0.49-0.57) and the modified index of agreement (d) was higher (0.80-0.83 compared to 0.72-0.75). Our regression model can be applied globally to estimate MAF at any point of the river network, thus providing a feasible alternative to spatially explicit process-based global hydrological models.
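The two validation statistics reported here can be computed directly. A sketch of RMSE and the modified index of agreement, assuming the absolute-value (Willmott) form of the latter, evaluated on hypothetical log10(MAF) values at five validation catchments:

```python
def rmse(pred, obs):
    """Root-mean-square error of predictions against observations."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def modified_index_of_agreement(pred, obs):
    """Willmott's modified index: d = 1 - sum|O-P| / sum(|P-Obar| + |O-Obar|)."""
    mean_obs = sum(obs) / len(obs)
    num = sum(abs(o - p) for p, o in zip(pred, obs))
    den = sum(abs(p - mean_obs) + abs(o - mean_obs) for p, o in zip(pred, obs))
    return 1.0 - num / den

# Hypothetical log10(MAF) values at five validation catchments.
obs = [0.2, 1.1, 2.4, 0.8, 1.9]
pred = [0.4, 1.0, 2.1, 1.0, 2.0]
r = rmse(pred, obs)
d = modified_index_of_agreement(pred, obs)
```

Lower RMSE and higher d for the regression model than for PCR-GLOBWB is what the abstract reports as the regression model's better predictive performance.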

  13. Application of the Water Erosion Prediction Project (WEPP) Model to simulate streamflow in a PNW forest watershed

    Science.gov (United States)

    A. Srivastava; M. Dobre; E. Bruner; W. J. Elliot; I. S. Miller; J. Q. Wu

    2011-01-01

    Assessment of water yields from watersheds into streams and rivers is critical to managing water supply and supporting aquatic life. Surface runoff typically contributes the most to peak discharge of a hydrograph while subsurface flow dominates the falling limb of hydrograph and baseflow contributes to streamflow from shallow unconfined aquifers primarily during the...

  14. Multivariate power-law models for streamflow prediction in the Mekong Basin

    Directory of Open Access Journals (Sweden)

    Guillaume Lacombe

    2014-11-01

    New hydrological insights for the region: A combination of 3–6 explanatory variables – chosen among annual rainfall, drainage area, perimeter, elevation, slope, drainage density and latitude – is sufficient to predict a range of flow metrics with a prediction R-squared ranging from 84 to 95%. The inclusion of forest or paddy percentage coverage as an additional explanatory variable led to slight improvements in the predictive power of some of the low-flow models (lowest prediction R-squared = 89%). A physical interpretation of the model structure was possible for most of the resulting relationships. Compared to regional regression models developed in other parts of the world, this new set of equations performs reasonably well.

  15. Quantifying Uncertainty in Flood Inundation Mapping Using Streamflow Ensembles and Multiple Hydraulic Modeling Techniques

    Science.gov (United States)

    Hosseiny, S. M. H.; Zarzar, C.; Gomez, M.; Siddique, R.; Smith, V.; Mejia, A.; Demir, I.

    2016-12-01

    The National Water Model (NWM) provides a platform for operationalizing nationwide flood inundation forecasting and mapping. The ability to model flood inundation on a national scale will provide invaluable information to decision makers and local emergency officials. Often, forecast products use deterministic model output to provide a visual representation of a single inundation scenario, which is subject to uncertainty from various sources. While this provides a straightforward representation of the potential inundation, the inherent uncertainty associated with the model output should be considered to optimize this tool for decision making support. The goal of this study is to produce ensembles of future flood inundation conditions (i.e. extent, depth, and velocity) to spatially quantify and visually assess uncertainties associated with the predicted flood inundation maps. The setting for this study is a highly urbanized watershed along Darby Creek in Pennsylvania. A forecasting framework coupling the NWM with multiple hydraulic models was developed to produce a suite of ensembles of future flood inundation predictions. Time-lagged ensembles from the NWM short range forecasts were used to account for uncertainty associated with the hydrologic forecasts. The forecasts from the NWM were input to the iRIC and HEC-RAS two-dimensional software packages, from which water extent, depth, and flow velocity were output. Quantifying the agreement between output ensembles for each forecast grid provided the uncertainty metrics for predicted flood water inundation extent, depth, and flow velocity. For visualization, a series of flood maps that display flood extent, water depth, and flow velocity along with the underlying uncertainty associated with each of the forecasted variables were produced. 
The results from this study demonstrate the potential to incorporate and visualize model uncertainties in flood inundation maps in order to identify the high flood risk zones.
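The per-cell agreement between ensemble members can be summarized as an inundation-probability grid: the fraction of members predicting a wet cell. A minimal sketch with invented binary flood-extent grids (the study's actual grids come from iRIC and HEC-RAS output):

```python
# Hypothetical ensemble: three members, each a small binary
# flood-extent grid (1 = cell predicted inundated, 0 = dry).
members = [
    [[1, 1, 0],
     [1, 0, 0]],
    [[1, 1, 1],
     [1, 1, 0]],
    [[1, 0, 0],
     [1, 0, 0]],
]

rows, cols = len(members[0]), len(members[0][0])
# Per-cell agreement: fraction of ensemble members predicting inundation.
probability = [[sum(m[r][c] for m in members) / len(members) for c in range(cols)]
               for r in range(rows)]
```

Cells where the fraction is near 1 are high-confidence inundation zones; intermediate fractions mark where the flood map is most uncertain, which is what the visualized uncertainty layers convey.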

  16. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.

  17. Hydrologic scales, cloud variability, remote sensing, and models: Implications for forecasting snowmelt and streamflow

    Science.gov (United States)

    Simpson, James J.; Dettinger, M.D.; Gehrke, F.; McIntire, T.J.; Hufford, Gary L.

    2004-01-01

    Accurate prediction of available water supply from snowmelt is needed if the myriad human, environmental, agricultural, and industrial demands for water are to be satisfied, especially given legislatively imposed conditions on its allocation. Robust retrievals of hydrologic basin model variables (e.g., insolation or areal extent of snow cover) provide several advantages over the current operational use of either point measurements or parameterizations to help meet this requirement. Insolation can be provided at hourly time scales (or better if needed during rapid melt events associated with flooding) and at 1-km spatial resolution. These satellite-based retrievals incorporate the effects of highly variable (both in space and time) and unpredictable cloud cover on estimates of insolation. The insolation estimates are further adjusted for the effects of basin topography using a high-resolution digital elevation model prior to model input. Simulations of two Sierra Nevada rivers in the snowmelt seasons of 1998 and 1999 indicate that even the simplest improvements in modeled insolation can improve snowmelt simulations, with 10%-20% reductions in root-mean-square errors. Direct retrieval of the areal extent of snow cover may mitigate the need to rely entirely on internal calculations of this variable, a reliance that can yield large errors that are difficult to correct until long after the season is complete and that often leads to persistent underestimates or overestimates of the volumes of water delivered to operational reservoirs. Agencies responsible for accurately predicting available water resources from the melt of snowpack [e.g., both federal (the National Weather Service River Forecast Centers) and state (the California Department of Water Resources)] can benefit by incorporating concepts developed herein into their operational forecasting procedures. © 2004 American Meteorological Society.
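The topographic adjustment of insolation mentioned above is commonly performed with a cosine-of-incidence correction derived from DEM slope and aspect. The sketch below shows that generic correction; it is an illustration of the standard geometry, not necessarily the authors' exact procedure.

```python
# Sketch: adjusting a horizontal-surface insolation estimate for local slope and
# aspect from a DEM (generic cosine-of-incidence correction for the direct beam).
import math

def incidence_factor(slope, aspect, sun_zenith, sun_azimuth):
    """All angles in radians; returns cos(incidence)/cos(zenith), clipped at 0."""
    cos_i = (math.cos(slope) * math.cos(sun_zenith)
             + math.sin(slope) * math.sin(sun_zenith)
             * math.cos(sun_azimuth - aspect))
    return max(cos_i, 0.0) / math.cos(sun_zenith)

# Flat terrain: factor is 1 (no adjustment)
flat = incidence_factor(0.0, 0.0, math.radians(30), math.radians(180))
# South-facing 20-degree slope with the sun due south at 30-degree zenith:
# the slope faces the sun more directly, so the factor exceeds 1
south = incidence_factor(math.radians(20), math.radians(180),
                         math.radians(30), math.radians(180))
```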

  18. Streamflow Gaging Stations

    Data.gov (United States)

    Department of Homeland Security — This map layer shows selected streamflow gaging stations of the United States, Puerto Rico, and the U.S. Virgin Islands, in 2013. Gaging stations, or gages, measure...

  19. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped; DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop it, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high-pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation

  20. Analysing the Effects of Forest Cover and Irrigation Farm Dams on Streamflows of Water-Scarce Catchments in South Australia through the SWAT Model

    Directory of Open Access Journals (Sweden)

    Hong Hanh Nguyen

    2017-01-01

    Full Text Available To assist water resource managers with future land use planning efforts, the eco-hydrological model Soil and Water Assessment Tool (SWAT) was applied to three catchments in South Australia that experience extreme low flow conditions. Particular land uses and management issues of interest included forest covers, known to affect water yields, and farm dams, known to intercept and change the hydrological dynamics in a catchment. The study achieved a satisfactory daily calibration when irrigation farm dams were incorporated in the model. For the catchment dominated by extreme low flows, a better daily simulation across a range of qualitative and quantitative metrics was gained using the base-flow static threshold optimization technique. Scenario analysis of the effects of forest cover indicated an increase of surface flow and a reduction of base-flow when native eucalyptus lands were replaced by pastures, and vice versa. A decreasing trend was observed for the overall water yield of catchments with more forest plantation due to the higher evapotranspiration (ET) rate and the decline in surface flow. With regard to the effects of irrigation farm dams, assessment on a daily time step suggested that a significant volume of water is stored in these systems, with the water loss rate highest in June and July. On an annual basis, the model indicated that approximately 13.1% to 22.0% of water has been captured by farm dams for irrigation. However, the scenario analysis revealed that the purposes of use of farm dams, rather than their volumetric capacities in the catchment, determined the magnitude of effects on streamflows. Water extracted from farm dams for irrigation of orchards and vineyards is more likely to diminish streamflows than water extracted for other land uses. Outputs from this study suggest that the water use restrictions on farm dams during recent drought periods were an effective tool to minimize impacts on streamflows.

  1. Seasonal forecasts of the SINTEX-F coupled model applied to maize yield and streamflow estimates over north-eastern South Africa

    CSIR Research Space (South Africa)

    Malherbe, J

    2014-07-01

    Full Text Available Meteorological Applications Vol. 21(3). Seasonal forecasts of the SINTEX-F coupled model applied to maize yield and streamflow estimates over north-eastern South Africa. J. Malherbe, W. A. Landman, C. Olivier, H. Sakuma and J.-J. Luo. Affiliations: Institute for Soil, Climate and Water, Agricultural Research Council, Pretoria, South Africa; Council for Scientific and Industrial Research, Natural Resources and the Environment, Pretoria, South Africa; Department of Geography, Geoinformatics and Meteorology...

  2. Long Term Quantification of Climate and Land Cover Change Impacts on Streamflow in an Alpine River Catchment, Northwestern China

    Directory of Open Access Journals (Sweden)

    Zhenliang Yin

    2017-07-01

    Full Text Available Quantifying the long-term impacts of climate and land cover change on streamflow is of great importance for sustainable water resources management in inland river basins. The Soil and Water Assessment Tool (SWAT) model was employed to simulate the streamflow in the upper reaches of the Heihe River Basin, northwestern China, over the last half century. The Sequential Uncertainty Fitting algorithm (SUFI-2) was selected to calibrate and validate the SWAT model. The results showed that both the Nash-Sutcliffe efficiency (NSE) and determination coefficient (R2) were over 0.93 for the calibration and validation periods, and the percent bias (PBIAS) for the two periods was −3.47% and 1.81%, respectively. The precipitation and the average, maximum, and minimum air temperature all showed increasing trends, at 14.87 mm/10 years, 0.30 °C/10 years, 0.27 °C/10 years, and 0.37 °C/10 years, respectively. The runoff coefficient increased from 0.36 (averaged over 1964 to 1988) to 0.39 (averaged over 1989 to 2013). Based on the SWAT simulation, we quantified the contributions of climate and land cover change to streamflow change, indicating that land cover change had a positive impact on river discharge, increasing streamflow by 7.12% during 1964 to 1988, while climate change contributed 14.08% to the streamflow increase over the last 50 years. Meanwhile, the climate change impact intensified after the 2000s. The increase in streamflow was distributed as 64.1% in the cold season (November to the following March) and 35.9% in the warm season (April to October). The results provide some references for dealing with climate and land cover change in an inland river basin for water resource management and planning.
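Contribution splits like the 7.12% / 14.08% figures above are typically obtained by scenario differencing with the calibrated model: re-run it with each factor held at its baseline in turn and difference the simulations. A sketch of that bookkeeping, with invented discharge values, follows.

```python
# Sketch of scenario differencing for attributing a streamflow change to climate
# versus land cover: each argument is a mean simulated streamflow (e.g. m3/s)
# from one model run. All numbers below are invented for illustration.

def attribute(q_base, q_lulcc_only, q_climate_only, q_both):
    """Return total change and each factor's contribution (% of baseline)."""
    total = q_both - q_base
    d_lulcc = q_lulcc_only - q_base       # land cover changed, climate fixed
    d_climate = q_climate_only - q_base   # climate changed, land cover fixed
    return {"total": total,
            "lulcc_pct": 100.0 * d_lulcc / q_base,
            "climate_pct": 100.0 * d_climate / q_base}

res = attribute(q_base=50.0, q_lulcc_only=53.5, q_climate_only=57.0, q_both=60.5)
```

Note that the two single-factor contributions need not sum exactly to the total change; any residual reflects interaction between the factors.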

  3. Application of ANN and fuzzy logic algorithms for streamflow ...

    Indian Academy of Sciences (India)

    The present study focuses on the development of models using ANN and fuzzy logic (FL) algorithms for predicting the streamflow for the catchment of the Savitri River Basin. The input vector to these models comprised daily rainfall, mean daily evaporation, mean daily temperature and lagged streamflow. In the present study, 20 years ...

  4. Drivers influencing streamflow changes in the Upper Turia basin, Spain.

    Science.gov (United States)

    Salmoral, Gloria; Willaarts, Bárbara A; Troch, Peter A; Garrido, Alberto

    2015-01-15

    Many rivers across the world have experienced a significant streamflow reduction over the last decades. Drivers of the observed streamflow changes are multiple, including climate change (CC), land use and land cover changes (LULCC), water transfers and river impoundment. Many of these drivers interact simultaneously, making it difficult to discern the impact of each driver individually. In this study we isolate the effects of LULCC on the observed streamflow reduction in the Upper Turia basin (east Spain) during the period 1973-2008. Regression models of annual streamflow are fitted with climatic variables and also additional time-variant drivers like LULCC. The ecohydrological model SWAT is used to study the magnitude and sign of streamflow change when LULCC occurs. Our results show that LULCC does play a significant role in the water balance, but it is not the main driver underpinning the observed reduction in Turia's streamflow. Increasing mean temperature is the main factor supporting increasing evapotranspiration and streamflow reduction. In fact, LULCC and CC have had offsetting effects on streamflow generation during the study period. While streamflow has been negatively affected by increasing temperature, ongoing LULCC have positively compensated through reduced evapotranspiration rates, thanks mainly to shrubland clearing and forest degradation processes. These findings are valuable for the management of the Turia river basin, as well as a useful approach for determining the weight of LULCC on the hydrological response in other regions. Copyright © 2014 Elsevier B.V. All rights reserved.
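The regression-based attribution described above can be illustrated with a toy ordinary-least-squares fit of annual streamflow on a climate covariate plus a time-variant land-cover covariate. All data are synthetic and the two-predictor solver is written out from the normal equations for self-containment.

```python
# Sketch: fitting annual streamflow against a climate variable and a land-cover
# covariate, in the spirit of the regression models described above (toy data).

def ols2(y, x1, x2):
    """Least squares for y = b0 + b1*x1 + b2*x2 via the 3x3 normal equations."""
    n = len(y)
    cols = [[1.0] * n, list(x1), list(x2)]
    xtx = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
    xty = [sum(a * b for a, b in zip(ci, y)) for ci in cols]
    for k in range(3):                      # Gauss-Jordan elimination
        piv = xtx[k][k]
        xtx[k] = [v / piv for v in xtx[k]]
        xty[k] /= piv
        for i in range(3):
            if i != k:
                f = xtx[i][k]
                xtx[i] = [a - f * b for a, b in zip(xtx[i], xtx[k])]
                xty[i] -= f * xty[k]
    return xty                              # [b0, b1, b2]

# Synthetic record constructed so streamflow = 10 + 2*precip - 5*forest_loss
precip = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
forest_loss = [0.0, 0.1, 0.1, 0.2, 0.3, 0.3]
y = [10.0 + 2.0 * p - 5.0 * f for p, f in zip(precip, forest_loss)]
b0, b1, b2 = ols2(y, precip, forest_loss)
```

The sign and magnitude of `b2` is what such studies inspect to judge whether the land-cover driver adds to or offsets the climate signal.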

  5. An evaluation of the accuracy of modeled and computed streamflow time-series data for the Ohio River at Hannibal Lock and Dam and at a location upstream from Sardis, Ohio

    Science.gov (United States)

    Koltun, G.F.

    2015-01-01

    Between July 2013 and June 2014, the U.S. Geological Survey (USGS) made 10 streamflow measurements on the Ohio River about 1.5 miles (mi) downstream from the Hannibal Lock and Dam (near Hannibal, Ohio) and 11 streamflow measurements near the USGS Sardis gage (station number 03114306) located approximately 2.4 mi upstream from Sardis, Ohio. The measurement results were used to assess the accuracy of modeled or computed instantaneous streamflow time series created and supplied by the USGS, U.S. Army Corps of Engineers (USACE), and National Weather Service (NWS) for the Ohio River at Hannibal Lock and Dam and (or) at the USGS streamgage. Hydraulic or hydrologic models were used to create the modeled time series; index-velocity methods or gate-opening ratings coupled with hydropower operation data were used to create the computed time series. The time step of the various instantaneous streamflow time series ranged from 15 minutes to 24 hours (once-daily values at 12:00 Coordinated Universal Time [UTC]). The 15-minute time-series data, computed by the USGS for the Sardis gage, also were downsampled to 1-hour and 24-hour time steps to permit more direct comparisons with other streamflow time series.
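Downsampling an instantaneous 15-minute series to a coarser time step, as done for the Sardis gage comparisons, can be sketched as follows. Averaging within each hour is one plausible mechanic (the report does not specify the method here), and the data are synthetic.

```python
# Sketch: downsampling a 15-minute streamflow series to a 1-hour time step by
# averaging the four values in each clock hour (synthetic two-hour record).
from collections import defaultdict
from datetime import datetime, timedelta
from statistics import fmean

start = datetime(2014, 6, 1, 0, 0)
series = [(start + timedelta(minutes=15 * i), 100.0 + i) for i in range(8)]

hourly = defaultdict(list)
for t, q in series:
    hourly[t.replace(minute=0, second=0, microsecond=0)].append(q)
hourly_means = {t: fmean(v) for t, v in sorted(hourly.items())}
```

The same grouping with `t.replace(hour=0, ...)` would produce the 24-hour (once-daily) series.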

  6. Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Validation

    Science.gov (United States)

    Sarrazin, Fanny; Pianosi, Francesca; Khorashadi Zadeh, Farkhondeh; Van Griensven, Ann; Wagener, Thorsten

    2015-04-01

    Global Sensitivity Analysis aims to characterize the impact that variations in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). In sampling-based Global Sensitivity Analysis, the sample size has to be chosen carefully in order to obtain reliable sensitivity estimates while spending computational resources efficiently. Furthermore, insensitive parameters are typically identified through the definition of a screening threshold: the theoretical value of their sensitivity index is zero, but in a sampling-based framework they regularly take non-zero values. However, there is little guidance available for these two steps in environmental modelling. The objective of the present study is to support modellers in making appropriate choices, regarding both sample size and screening threshold, so that a robust sensitivity analysis can be implemented. We performed sensitivity analysis for the parameters of three hydrological models with increasing levels of complexity (Hymod, HBV and SWAT), and tested three widely used sensitivity analysis methods (Elementary Effect Test or method of Morris, Regional Sensitivity Analysis, and Variance-Based Sensitivity Analysis). We defined criteria based on a bootstrap approach to assess three different types of convergence: the convergence of the values of the sensitivity indices, of the ranking (the ordering among the parameters) and of the screening (the identification of the insensitive parameters). We investigated the screening threshold through the definition of a validation procedure. The results showed that full convergence of the values of the sensitivity indices is not necessarily needed to rank or to screen the model input factors. Furthermore, typical sample sizes reported in the literature can be well below the sample sizes that actually ensure convergence of ranking and screening.
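A bootstrap check of ranking convergence can be sketched as below. The model function and the squared-correlation sensitivity index are illustrative stand-ins, not the Elementary Effect, Regional, or variance-based indices used in the study; the point is only the resampling logic.

```python
# Sketch: bootstrap assessment of ranking convergence for a sampling-based
# sensitivity index (squared Pearson correlation, a crude stand-in index).
import random

random.seed(1)

def model(x1, x2, x3):
    return 5.0 * x1 + 1.0 * x2 + 0.1 * x3   # factor 1 dominant by construction

N = 400
X = [[random.random() for _ in range(3)] for _ in range(N)]
Y = [model(*row) for row in X]

def corr2(xs, ys):
    """Squared Pearson correlation of one input column with the output."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy * sxy / (sxx * syy)

def ranking(idx):
    """Factor indices ordered from most to least sensitive on subsample idx."""
    ys = [Y[i] for i in idx]
    s = [corr2([X[i][j] for i in idx], ys) for j in range(3)]
    return sorted(range(3), key=lambda j: -s[j])

top_full = ranking(list(range(N)))[0]
# Bootstrap: how often does resampling with replacement reproduce the same
# top-ranked factor? A fraction near 1 indicates the ranking has converged.
boot_top = [ranking([random.randrange(N) for _ in range(N)])[0]
            for _ in range(200)]
stable = boot_top.count(top_full) / len(boot_top)
```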

  7. Streamflow depletion by wells--Understanding and managing the effects of groundwater pumping on streamflow

    Science.gov (United States)

    Barlow, Paul M.; Leake, Stanley A.

    2012-11-02

    Groundwater is an important source of water for many human needs, including public supply, agriculture, and industry. With the development of any natural resource, however, adverse consequences may be associated with its use. One of the primary concerns related to the development of groundwater resources is the effect of groundwater pumping on streamflow. Groundwater and surface-water systems are connected, and groundwater discharge is often a substantial component of the total flow of a stream. Groundwater pumping reduces the amount of groundwater that flows to streams and, in some cases, can draw streamflow into the underlying groundwater system. Streamflow reductions (or depletions) caused by pumping have become an important water-resource management issue because of the negative impacts that reduced flows can have on aquatic ecosystems, the availability of surface water, and the quality and aesthetic value of streams and rivers. Scientific research over the past seven decades has made important contributions to the basic understanding of the processes and factors that affect streamflow depletion by wells. Moreover, advances in methods for simulating groundwater systems with computer models provide powerful tools for estimating the rates, locations, and timing of streamflow depletion in response to groundwater pumping and for evaluating alternative approaches for managing streamflow depletion. The primary objective of this report is to summarize these scientific insights and to describe the various field methods and modeling approaches that can be used to understand and manage streamflow depletion. A secondary objective is to highlight several misconceptions concerning streamflow depletion and to explain why these misconceptions are incorrect.
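Among the simpler estimation tools in this literature is the Glover-Balmer analytical solution for depletion by a single well near a fully penetrating stream in an idealized aquifer; real assessments rely on the numerical groundwater models the report emphasizes, so the sketch below is only the textbook special case.

```python
# Sketch: Glover-Balmer analytical estimate of streamflow depletion by a well.
# Assumes an idealized homogeneous aquifer and a fully penetrating stream.
import math

def depletion_fraction(d, T, S, t):
    """Fraction of the pumping rate supplied by stream depletion.

    d: well-to-stream distance (m), T: transmissivity (m2/day),
    S: storativity (dimensionless), t: time since pumping began (days).
    """
    return math.erfc(math.sqrt(d * d * S / (4.0 * T * t)))

# Depletion grows toward 1 (all pumped water from the stream) as time goes on
early = depletion_fraction(d=500.0, T=500.0, S=0.2, t=10.0)
late = depletion_fraction(d=500.0, T=500.0, S=0.2, t=1000.0)
```

This monotonic growth toward the full pumping rate is the timing behavior the report stresses: depletion can remain small for years and still become substantial long after pumping started.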

  8. Streamflow response to increasing precipitation extremes altered by forest management

    Science.gov (United States)

    Charlene N. Kelly; Kevin J. McGuire; Chelcy Ford Miniat; James M. Vose

    2016-01-01

    Increases in extreme precipitation events of floods and droughts are expected to occur worldwide. The increase in extreme events will result in changes in streamflow that are expected to affect water availability for human consumption and aquatic ecosystem function. We present an analysis that may greatly improve current streamflow models by quantifying the...

  9. Validation of models with multivariate output

    International Nuclear Information System (INIS)

    Rebba, Ramesh; Mahadevan, Sankaran

    2006-01-01

    This paper develops metrics for validating computational models with experimental data, considering uncertainties in both. A computational model may generate multiple response quantities, and the validation experiment might yield corresponding measured values. Alternatively, a single response quantity may be predicted and observed at different spatial and temporal points. Model validation in such cases involves comparison of multiple correlated quantities. Multiple univariate comparisons may give conflicting inferences. Therefore, aggregate validation metrics are developed in this paper. Both classical and Bayesian hypothesis testing are investigated for this purpose, using multivariate analysis. Since commonly used statistical significance tests are based on normality assumptions, appropriate transformations are investigated in the case of non-normal data. The methodology is implemented to validate an empirical model for energy dissipation in lap joints under dynamic loading.
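Why an aggregate metric beats multiple univariate comparisons can be seen with the squared Mahalanobis distance, which accounts for correlation between response quantities; it is a simplified stand-in for the hypothesis-testing metrics developed in the paper.

```python
# Sketch: squared Mahalanobis distance as an aggregate validation metric for a
# 2-D correlated response (closed-form 2x2 covariance inverse for brevity).

def mahalanobis2(pred, obs, cov):
    """Squared Mahalanobis distance between 2-D vectors; cov is 2x2."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    r = [pred[0] - obs[0], pred[1] - obs[1]]
    return (r[0] * (inv[0][0] * r[0] + inv[0][1] * r[1])
            + r[1] * (inv[1][0] * r[0] + inv[1][1] * r[1]))

cov = [[1.0, 0.8], [0.8, 1.0]]            # strongly correlated residuals
d2_corr = mahalanobis2([1.0, 1.0], [0.0, 0.0], cov)   # errors along correlation
d2_anti = mahalanobis2([1.0, -1.0], [0.0, 0.0], cov)  # errors against it
```

Two univariate z-tests would judge both cases identically (each component is one standard deviation off), but the aggregate metric strongly flags the discrepancy that opposes the correlation structure.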

  10. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  11. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach in software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling gives a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  12. Long-range forecasting of intermittent streamflow

    Science.gov (United States)

    van Ogtrop, F. F.; Vervoort, R. W.; Heller, G. Z.; Stasinopoulos, D. M.; Rigby, R. A.

    2011-11-01

    Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probability of flow occurring in any of the systems. We then use the same regression framework in combination with a right-skewed distribution, the Box-Cox t distribution, to model the intensity (depth) of the non-zero streamflows. Time, seasonality and climate indices, describing the Pacific and Indian Ocean sea surface temperatures, are tested as covariates in the GAMLSS model to make probabilistic 6 and 12-month forecasts of the occurrence and intensity of streamflow. The output reveals that in the study region the occurrence and variability of flow is driven by sea surface temperatures and therefore forecasts can be made with some skill.
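The two-part structure described above, an occurrence model plus a skewed intensity model, can be sketched as a generative simulation. The logistic coefficients and the lognormal stand-in for the Box-Cox t distribution below are invented for illustration only.

```python
# Sketch: two-part probabilistic forecast of intermittent streamflow, with a
# logistic occurrence model driven by an SST covariate and a lognormal intensity
# model for non-zero flow depths. All coefficients are hypothetical.
import math
import random

random.seed(42)

def forecast_sample(sst_anomaly, n=10000):
    """Monte Carlo sample of next-season streamflow given an SST covariate."""
    logit = -0.5 + 1.2 * sst_anomaly           # hypothetical occurrence model
    p_flow = 1.0 / (1.0 + math.exp(-logit))
    out = []
    for _ in range(n):
        if random.random() < p_flow:           # flow occurs this season
            out.append(random.lognormvariate(mu=2.0, sigma=0.8))  # depth (mm)
        else:                                  # zero-flow season
            out.append(0.0)
    return p_flow, out

p_wet, sample = forecast_sample(sst_anomaly=1.0)
frac_zero = sum(1 for q in sample if q == 0.0) / len(sample)
```

The resulting sample mixes an atom at zero with a right-skewed positive part, which is exactly the feature that defeats single-distribution streamflow models.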

  13. Long-range forecasting of intermittent streamflow

    Directory of Open Access Journals (Sweden)

    F. F. van Ogtrop

    2011-11-01

    Full Text Available Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probability of flow occurring in any of the systems. We then use the same regression framework in combination with a right-skewed distribution, the Box-Cox t distribution, to model the intensity (depth) of the non-zero streamflows. Time, seasonality and climate indices, describing the Pacific and Indian Ocean sea surface temperatures, are tested as covariates in the GAMLSS model to make probabilistic 6- and 12-month forecasts of the occurrence and intensity of streamflow. The output reveals that in the study region the occurrence and variability of flow is driven by sea surface temperatures and therefore forecasts can be made with some skill.

  14. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing the modeling process step by step and bringing out the need to validate every step of this process. This model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized. 16 refs

  15. Establishing model credibility involves more than validation

    International Nuclear Information System (INIS)

    Kirchner, T.

    1991-01-01

    One widely used definition of validation is the quantitative testing of the performance of a model through the comparison of model predictions to independent sets of observations from the system being simulated. The ability to show that model predictions compare well with observations is often thought to be the most rigorous test that can be used to establish credibility for a model in the scientific community. However, such tests are only part of the process used to establish credibility, and in some cases may be either unnecessary or misleading. Naylor and Finger extended the concept of validation to include the establishment of validity for the postulates embodied in the model and the testing of assumptions used to select postulates for the model. Validity of postulates is established through concurrence by experts in the field of study that the mathematical or conceptual model contains the structural components and mathematical relationships necessary to adequately represent the system with respect to the goals for the model. This extended definition of validation provides for consideration of the structure of the model, not just its performance, in establishing credibility. Evaluation of a simulation model should establish the correctness of the code and the efficacy of the model within its domain of applicability. (24 refs., 6 figs.)

  16. Analysis of streamflow response to land use and land cover changes using satellite data and hydrological modelling: case study of Dinder and Rahad tributaries of the Blue Nile (Ethiopia-Sudan)

    Science.gov (United States)

    Hassaballah, Khalid; Mohamed, Yasir; Uhlenbrook, Stefan; Biro, Khalid

    2017-10-01

    Understanding the land use and land cover changes (LULCCs) and their implications for the surface hydrology of the Dinder and Rahad basins (D&R, approximately 77 504 km2) is vital for the management and utilization of water resources in the basins. Although there are many studies on LULCC in the Blue Nile Basin, specific studies on LULCC in the D&R are still missing. Hence, its impact on streamflow is unknown. The objective of this paper is to understand the LULCC in the Dinder and Rahad and its implications for streamflow response using satellite data and hydrological modelling. The hydrological model was driven by different sets of land use and land cover maps from 1972, 1986, 1998 and 2011. Catchment topography, land cover and soil maps are derived from satellite images and serve to estimate model parameters. Results of LULCC detection between 1972 and 2011 indicate a significant decrease in woodland and an increase in cropland. Woodland decreased from 42 to 14 % and from 35 to 14 % for Dinder and Rahad, respectively. Cropland increased from 14 to 47 % and from 18 to 68 % in Dinder and Rahad, respectively. The model results indicate that streamflow is affected by LULCC in both the Dinder and the Rahad rivers. The effect of LULCC on streamflow is significant during 1986 and 2011. This could be attributed to the severe drought during the mid-1980s and the recent large expansion in cropland.

  17. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  18. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    In this paper, a review is presented of the various methods which ... to make a direct and objective comparison of specific dynamic properties, measured ..... stiffness matrix is available from the analytical model, is that of reducing or condensing.

  19. Validating EHR clinical models using ontology patterns.

    Science.gov (United States)

    Martínez-Costa, Catalina; Schulz, Stefan

    2017-12-01

    Clinical models are artefacts that specify how information is structured in electronic health records (EHRs). However, the makeup of clinical models is not guided by any formal constraint beyond a semantically vague information model. We address this gap by advocating ontology design patterns as a mechanism that makes the semantics of clinical models explicit. This paper demonstrates how ontology design patterns can validate existing clinical models using SHACL. Based on the Clinical Information Modelling Initiative (CIMI), we show how ontology patterns detect both modeling and terminology binding errors in CIMI models. SHACL, a W3C constraint language for the validation of RDF graphs, builds on the concept of "Shape", a description of data in terms of expected cardinalities, datatypes and other restrictions. SHACL, as opposed to OWL, subscribes to the Closed World Assumption (CWA) and is therefore more suitable for the validation of clinical models. We have demonstrated the feasibility of the approach by manually describing the correspondences between six CIMI clinical models represented in RDF and two SHACL ontology design patterns. Using a Java-based SHACL implementation, we found at least eleven modeling and binding errors within these CIMI models. This demonstrates the usefulness of ontology design patterns not only as a modeling tool but also as a tool for validation. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  1. HYDRORECESSION: A toolbox for streamflow recession analysis

    Science.gov (United States)

    Arciniega, S.

    2015-12-01

    Streamflow recession curves are hydrological signatures that allow studying the relationship between groundwater storage and baseflow and/or low flows at the catchment scale. Recent studies have shown that streamflow recession analysis can be quite sensitive to the combination of different models, extraction techniques and parameter estimation methods. In order to better characterize streamflow recession curves, new methodologies combining multiple approaches have been recommended. The HYDRORECESSION toolbox, presented here, is a Matlab graphical user interface developed to analyse streamflow recession time series, with tools for parameterizing linear and nonlinear storage-outflow relationships through four of the most useful recession models (Maillet, Boussinesq, Coutagne and Wittenberg). The toolbox includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error) and three different methods to extract hydrograph recession segments (Vogel, Brutsaert and Aksoy). In addition, the toolbox has a module that separates the baseflow component from the observed hydrograph using the inverse reservoir algorithm. Potential applications of HYDRORECESSION include model parameter analysis, hydrological regionalization and classification, baseflow index estimation, catchment-scale recharge and low-flow modelling, among others. HYDRORECESSION is freely available for non-commercial and academic purposes.
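A nonlinear storage-outflow recession model of the general form -dQ/dt = aQ^b, the family parameterized by such toolboxes, can be fitted by linear regression in log-log space. The sketch below generates a synthetic recession and recovers the parameters; it illustrates the linear-regression fitting route only, not the toolbox's own code.

```python
# Sketch: fitting -dQ/dt = a * Q**b by linear regression in log-log space,
# log(-dQ/dt) = log(a) + b * log(Q), on a synthetic recession limb.
import math

a_true, b_true, dt = 0.05, 1.5, 1.0
q = [10.0]
for _ in range(30):                         # explicit-Euler synthetic recession
    q.append(q[-1] - dt * a_true * q[-1] ** b_true)

# By construction (q[i] - q[i+1]) / dt = a * q[i]**b exactly, so the points lie
# on a straight line in log-log space (real data would scatter around it).
logq = [math.log(q[i]) for i in range(len(q) - 1)]
logdq = [math.log((q[i] - q[i + 1]) / dt) for i in range(len(q) - 1)]

n = len(logq)
mx, my = sum(logq) / n, sum(logdq) / n
b_hat = (sum((x - mx) * (y - my) for x, y in zip(logq, logdq))
         / sum((x - mx) ** 2 for x in logq))
a_hat = math.exp(my - b_hat * mx)
```

Setting b = 1 recovers the linear (Maillet-type) reservoir as a special case.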

  2. Tracer travel time and model validation

    International Nuclear Information System (INIS)

    Tsang, Chin-Fu.

    1988-01-01

    The performance assessment of a nuclear waste repository demands much more than the safety evaluation of civil constructions such as dams, or the resource evaluation of a petroleum or geothermal reservoir. It involves estimating the low probability (low concentration) of radionuclide transport extrapolated thousands of years into the future. Thus the models used to make these estimates need to be carefully validated. A number of recent efforts have been devoted to the study of this problem. Some general comments on model validation were given by Tsang. The present paper discusses some issues of validation with regard to radionuclide transport. 5 refs

  3. Validating the passenger traffic model for Copenhagen

    DEFF Research Database (Denmark)

    Overgård, Christian Hansen; VUK, Goran

    2006-01-01

    The paper presents a comprehensive validation procedure for the passenger traffic model for Copenhagen based on external data from the Danish national travel survey and traffic counts. The model was validated for the years 2000 to 2004, with 2004 being of particular interest because the Copenhagen...... matched the observed traffic better than those of the transit assignment model. With respect to the metro forecasts, the model over-predicts metro passenger flows by 10% to 50%. The wide range of findings from the project resulted in two actions. First, a project was started in January 2005 to upgrade...

  4. Moving Beyond Streamflow Observations: Lessons From A Multi-Objective Calibration Experiment in the Mississippi Basin

    Science.gov (United States)

    Koppa, A.; Gebremichael, M.; Yeh, W. W. G.

    2017-12-01

    Calibrating hydrologic models in large catchments using a sparse network of streamflow gauges adversely affects the spatial and temporal accuracy of other water balance components, which are important for climate-change, land-use and drought studies. This study combines remote sensing data and the concept of Pareto optimality to address the following questions: 1) What is the impact of streamflow (SF) calibration on the spatio-temporal accuracy of evapotranspiration (ET), near-surface soil moisture (SM) and total water storage (TWS)? 2) What is the best combination of fluxes that can be used to calibrate complex hydrological models such that both the accuracy of streamflow and the spatio-temporal accuracy of ET, SM and TWS are preserved? The study area is the Mississippi Basin in the United States (encompassing HUC-2 regions 5, 6, 7, 9, 10 and 11). 2003 and 2004, two climatologically average years, are chosen for calibration and validation of the Noah-MP hydrologic model. Remotely sensed ET data are sourced from GLEAM, SM from ESA-CCI and TWS from GRACE. Single-objective calibration is carried out using the DDS algorithm; for multi-objective calibration, PA-DDS is used. First, the Noah-MP model is calibrated using a single objective function (minimize mean square error) for the outflow from the 6 HUC-2 sub-basins for 2003. Spatial correlograms are used to compare the spatial structure of ET, SM and TWS between the model and the remote sensing data. Spatial maps of RMSE and mean error are used to quantify the impact of calibrating streamflow on the accuracy of ET, SM and TWS estimates. Next, a multi-objective calibration experiment is set up to determine the Pareto-optimal parameter sets (Pareto front) for the following cases: 1) SF and ET, 2) SF and SM, 3) SF and TWS, 4) SF, ET and SM, 5) SF, ET and TWS, 6) SF, SM and TWS, 7) SF, ET, SM and TWS. The best combination of fluxes that provides the optimal trade-off between accurate streamflow and preserving the spatio

  5. BIOMOVS: an international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1988-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (author)

  6. BIOMOVS: An international model validation study

    International Nuclear Information System (INIS)

    Haegg, C.; Johansson, G.

    1987-01-01

    BIOMOVS (BIOspheric MOdel Validation Study) is an international study where models used for describing the distribution of radioactive and nonradioactive trace substances in terrestrial and aquatic environments are compared and tested. The main objectives of the study are to compare and test the accuracy of predictions between such models, explain differences in these predictions, recommend priorities for future research concerning the improvement of the accuracy of model predictions and act as a forum for the exchange of ideas, experience and information. (orig.)

  7. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural and physical sciences, and engineering. There is, however, a wide range of definitions which give rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework to assess the adequacy of any simulation study. (author)

  8. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  9. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.

  10. Spatial patterns of March and September streamflow trends in Pacific Northwest Streams, 1958-2008

    Science.gov (United States)

    Chang, Heejun; Jung, Il-Won; Steele, Madeline; Gannett, Marshall

    2012-01-01

    Summer streamflow is a vital water resource for municipal and domestic water supplies, irrigation, salmonid habitat, recreation, and water-related ecosystem services in the Pacific Northwest (PNW) in the United States. This study detects significant negative trends in September absolute streamflow in a majority of 68 stream-gauging stations located on unregulated streams in the PNW from 1958 to 2008. The proportion of March streamflow to annual streamflow increases in most stations over 1,000 m elevation with a baseflow index of less than 50, while absolute March streamflow does not increase in most stations. The declining trends of September absolute streamflow are strongly associated with seven-day low flow, January–March maximum temperature trends, and the size of the basin (19–7,260 km2), while the increasing trends of the fraction of March streamflow are associated with elevation, April 1 snow water equivalent, March precipitation, center timing of streamflow, and October–December minimum temperature trends. Compared with ordinary least squares (OLS) estimated regression models, spatial error regression and geographically weighted regression (GWR) models effectively remove spatial autocorrelation in residuals. The GWR model results show spatial gradients of local R2 values, with consistently higher local R2 values in the northern Cascades. This finding illustrates that different hydrologic landscape factors, such as geology and seasonal distribution of precipitation, also influence streamflow trends in the PNW. In addition, our spatial analysis model results show that considering various geographic factors helps clarify the dynamics of streamflow trends over a large geographical area, supporting a spatial analysis approach over aspatial OLS-estimated regression models for predicting streamflow trends. Results indicate that transitional rain–snow surface water-dominated basins are likely to have reduced summer streamflow under warming scenarios

  11. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  12. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  13. Calibration parameters used to simulate streamflow from application of the Hydrologic Simulation Program-FORTRAN Model (HSPF) to mountainous basins containing coal mines in West Virginia

    Science.gov (United States)

    Atkins, John T.; Wiley, Jeffrey B.; Paybins, Katherine S.

    2005-01-01

    This report presents the Hydrologic Simulation Program-FORTRAN Model (HSPF) parameters for eight basins in the coal-mining region of West Virginia. The magnitude and characteristics of model parameters from this study will assist users of HSPF in simulating streamflow at other basins in the coal-mining region of West Virginia. The parameter for nominal capacity of the upper-zone storage, UZSN, increased from south to north. The increase in UZSN with the increase in basin latitude could be due to decreasing slopes, decreasing rockiness of the soils, and increasing soil depths from south to north. A special action was given to the parameter for fraction of ground-water inflow that flows to inactive ground water, DEEPFR. The basis for this special action was related to the seasonal movement of the water table and transpiration from trees. The models were most sensitive to DEEPFR and the parameter for interception storage capacity, CEPSC. The models were also fairly sensitive to the parameter for an index representing the infiltration capacity of the soil, INFILT; the parameter for indicating the behavior of the ground-water recession flow, KVARY; the parameter for the basic ground-water recession rate, AGWRC; the parameter for nominal capacity of the upper zone storage, UZSN; the parameter for the interflow inflow, INTFW; the parameter for the interflow recession constant, IRC; and the parameter for lower zone evapotranspiration, LZETP.

  14. Advanced training simulator models. Implementation and validation

    International Nuclear Information System (INIS)

    Borkowsky, Jeffrey; Judd, Jerry; Belblidia, Lotfi; O'farrell, David; Andersen, Peter

    2008-01-01

    Modern training simulators are required to replicate plant data for both thermal-hydraulic and neutronic response. Replication is required such that reactivity manipulation on the simulator properly trains the operator for reactivity manipulation at the plant. This paper discusses advanced models which perform this function in real time using the coupled code system THOR/S3R. This code system models all the fluid systems in detail using an advanced two-phase thermal-hydraulic model. The nuclear core is modeled using an advanced three-dimensional nodal method and cycle-specific nuclear data. These models are configured to run interactively from a graphical instructor station or hardware operation panels. The simulator models are theoretically rigorous and are expected to replicate the physics of the plant. However, to verify replication, the models must be independently assessed. Plant data is the preferred validation method, but plant data is often not available for many important training scenarios. In the absence of data, validation may be obtained by slower-than-real-time transient analysis. This analysis can be performed by coupling a safety analysis code and a core design code. Such a coupling exists between the codes RELAP5 and SIMULATE-3K (S3K). RELAP5/S3K is used to validate the real-time model for several postulated plant events. (author)

  15. Future Climate Change Impacts on Streamflows of Two Main West Africa River Basins: Senegal and Gambia

    Directory of Open Access Journals (Sweden)

    Ansoumana Bodian

    2018-03-01

    This research investigated the effect of climate change on the two main river basins of Senegal in West Africa: the Senegal and Gambia River Basins. We used downscaled projected future rainfall and potential evapotranspiration based on projected temperature from six General Circulation Models (CanESM2, CNRM, CSIRO, HadGEM2-CC, HadGEM2-ES, and MIROC5) and two scenarios (RCP4.5 and RCP8.5) to force the GR4J model. The GR4J model was calibrated and validated using observed daily rainfall, potential evapotranspiration from observed daily temperature, and streamflow data. For the cross-validation, two periods for each river basin were considered: 1961–1982 and 1983–2004 for the Senegal River Basin at Bafing Makana, and 1969–1985 and 1986–2000 for the Gambia River Basin at Mako. Model efficiency is evaluated using a multi-criteria function (Fagg) which aggregates the Nash–Sutcliffe criterion, cumulative volume error, and mean volume error. Alternating periods of simulation for calibration and validation were used. This process allows us to choose the parameters that best reflect the rainfall–runoff relationship. Once the model was calibrated and validated, we simulated streamflow at the Bafing Makana and Mako stations in the near future at a daily scale. The characteristic flow rates were calculated to evaluate their possible evolution under the projected climate scenarios at the 2050 horizon. For the near future (2050 horizon), compared to the 1971–2000 reference period, results showed that for both river basins the multi-model ensemble predicted a decrease of annual streamflow from 8% (Senegal River Basin) to 22% (Gambia River Basin) under the RCP4.5 scenario. Under the RCP8.5 scenario, the decrease is more pronounced: 16% (Senegal River Basin) and 26% (Gambia River Basin). The Gambia River Basin will be more affected by climate change.
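    The efficiency measure above aggregates the Nash–Sutcliffe criterion with volume errors; the exact form of Fagg is not reproduced in the abstract, but its two main ingredients are easy to state. A hedged Python sketch of those components (not the authors' code; the example series are made up):

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the
        simulation is no better than predicting the observed mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def volume_error(obs, sim):
        """Cumulative volume error as a fraction of the observed volume."""
        return float((np.sum(sim) - np.sum(obs)) / np.sum(obs))

    obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # observed daily flows (toy)
    sim = np.array([1.1, 1.9, 3.2, 3.8, 5.1])   # simulated daily flows (toy)
    score = nse(obs, sim)
    ```

    An aggregate criterion in the spirit of Fagg would combine these (and a mean volume error) into one objective, trading fit to dynamics against water-balance closure.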

  16. Validation of a phytoremediation computer model

    Energy Technology Data Exchange (ETDEWEB)

    Corapcioglu, M Y; Sung, K; Rhykerd, R L; Munster, C; Drew, M [Texas A and M Univ., College Station, TX (United States)

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg-1

  17. Computing daily mean streamflow at ungaged locations in Iowa by using the Flow Anywhere and Flow Duration Curve Transfer statistical methods

    Science.gov (United States)

    Linhart, S. Mike; Nania, Jon F.; Sanders, Curtis L.; Archfield, Stacey A.

    2012-01-01

    linear regression method and the daily mean streamflow for the 15th day of every other month. The Flow Duration Curve Transfer method was used to estimate unregulated daily mean streamflow from the physical and climatic characteristics of gaged basins. For the Flow Duration Curve Transfer method, daily mean streamflow quantiles at the ungaged site were estimated with the parameter-based regression model, which results in a continuous daily flow-duration curve (the relation between exceedance probability and streamflow for each day of observed streamflow) at the ungaged site. By the use of a reference streamgage, the Flow Duration Curve Transfer is converted to a time series. Data used in the Flow Duration Curve Transfer method were retrieved for 113 continuous-record streamgages in Iowa and within a 50-mile buffer of Iowa. The final statewide regression equations for Iowa were computed by using a weighted-least-squares multiple linear regression method and were computed for the 0.01-, 0.05-, 0.10-, 0.15-, 0.20-, 0.30-, 0.40-, 0.50-, 0.60-, 0.70-, 0.80-, 0.85-, 0.90-, and 0.95-exceedance probability statistics determined from the daily mean streamflow with a reporting limit set at 0.1 ft3/s. The final statewide regression equation for Iowa computed by using left-censored regression techniques was computed for the 0.99-exceedance probability statistic determined from the daily mean streamflow with a low limit threshold and a reporting limit set at 0.1 ft3/s. For the Flow Anywhere method, results of the validation study conducted by using six streamgages show that differences between the root-mean-square error and the mean absolute error ranged from 1,016 to 138 ft3/s, with the larger value signifying a greater occurrence of outliers between observed and estimated streamflows. Root-mean-square-error values ranged from 1,690 to 237 ft3/s. Values of the percent root-mean-square error ranged from 115 percent to 26.2 percent. 
The logarithm (base 10) streamflow percent root
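    A flow-duration curve, central to the Flow Duration Curve Transfer method above, relates exceedance probability to streamflow. A minimal Python sketch using the Weibull plotting position (illustrative only; the report's regression-based quantile estimation at ungaged sites is not reproduced here):

    ```python
    import numpy as np

    def flow_duration_curve(q):
        """Exceedance probabilities for observed flows, using the
        Weibull plotting position p = rank / (n + 1)."""
        flows = np.sort(np.asarray(q, float))[::-1]            # descending
        probs = np.arange(1, len(flows) + 1) / (len(flows) + 1.0)
        return probs, flows

    def flow_at_exceedance(q, prob):
        """Flow exceeded `prob` of the time, interpolated from the curve."""
        probs, flows = flow_duration_curve(q)
        return float(np.interp(prob, probs, flows))

    q = np.arange(1.0, 100.0)          # toy record of daily mean flows
    q50 = flow_at_exceedance(q, 0.50)  # median-exceedance flow
    ```

    In the transfer method, curves like this at the ungaged site are paired with a reference streamgage's observed sequence of exceedance probabilities to synthesize a daily time series.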

  18. The Global Streamflow Indices and Metadata Archive (GSIM – Part 2: Quality control, time-series indices and homogeneity assessment

    Directory of Open Access Journals (Sweden)

    L. Gudmundsson

    2018-04-01

    This is Part 2 of a two-paper series presenting the Global Streamflow Indices and Metadata Archive (GSIM), which is a collection of daily streamflow observations at more than 30 000 stations around the world. While Part 1 (Do et al., 2018a) describes the data collection process as well as the generation of auxiliary catchment data (e.g. catchment boundary, land cover, mean climate), Part 2 introduces a set of quality-controlled time-series indices representing (i) the water balance, (ii) the seasonal cycle, (iii) low flows and (iv) floods. To this end we first consider the quality of individual daily records using a combination of quality flags from data providers and automated screening methods. Subsequently, streamflow time-series indices are computed at yearly, seasonal and monthly resolution. The paper provides a generalized assessment of the homogeneity of all generated streamflow time-series indices, which can be used to select time series that are suitable for a specific task. The newly generated global set of streamflow time-series indices is made freely available with a digital object identifier at https://doi.pangaea.de/10.1594/PANGAEA.887470 and is expected to foster global freshwater research, by acting as a ground truth for model validation or as a basis for assessing the role of human impacts on the terrestrial water cycle. It is hoped that a renewed interest in streamflow data at the global scale will foster efforts in the systematic assessment of data quality and provide momentum to overcome administrative barriers that lead to inconsistencies in global collections of relevant hydrological observations.

  19. The Global Streamflow Indices and Metadata Archive (GSIM) - Part 2: Quality control, time-series indices and homogeneity assessment

    Science.gov (United States)

    Gudmundsson, Lukas; Do, Hong Xuan; Leonard, Michael; Westra, Seth

    2018-04-01

    This is Part 2 of a two-paper series presenting the Global Streamflow Indices and Metadata Archive (GSIM), which is a collection of daily streamflow observations at more than 30 000 stations around the world. While Part 1 (Do et al., 2018a) describes the data collection process as well as the generation of auxiliary catchment data (e.g. catchment boundary, land cover, mean climate), Part 2 introduces a set of quality-controlled time-series indices representing (i) the water balance, (ii) the seasonal cycle, (iii) low flows and (iv) floods. To this end we first consider the quality of individual daily records using a combination of quality flags from data providers and automated screening methods. Subsequently, streamflow time-series indices are computed at yearly, seasonal and monthly resolution. The paper provides a generalized assessment of the homogeneity of all generated streamflow time-series indices, which can be used to select time series that are suitable for a specific task. The newly generated global set of streamflow time-series indices is made freely available with a digital object identifier at https://doi.pangaea.de/10.1594/PANGAEA.887470 and is expected to foster global freshwater research, by acting as a ground truth for model validation or as a basis for assessing the role of human impacts on the terrestrial water cycle. It is hoped that a renewed interest in streamflow data at the global scale will foster efforts in the systematic assessment of data quality and provide momentum to overcome administrative barriers that lead to inconsistencies in global collections of relevant hydrological observations.
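    Yearly indices of the kind GSIM provides reduce a daily record to one value per year per category (water balance, low flows, floods). A small pandas sketch in that spirit (the index names and definitions below are illustrative, not GSIM's exact specification):

    ```python
    import numpy as np
    import pandas as pd

    def yearly_indices(daily):
        """Reduce a daily streamflow Series to yearly indices: mean flow
        (water balance), 7-day minimum (low flow), daily maximum (flood)."""
        year = daily.index.year
        return pd.DataFrame({
            "MEAN": daily.groupby(year).mean(),
            "MIN7": daily.rolling(7).mean().groupby(year).min(),
            "MAX": daily.groupby(year).max(),
        })

    days = pd.date_range("2000-01-01", "2001-12-31", freq="D")
    flow = pd.Series(2.0 + np.sin(np.arange(len(days)) / 58.0), index=days)
    table = yearly_indices(flow)  # one row per year
    ```

    Homogeneity testing, as in the paper, would then be applied to each yearly index series rather than to the raw daily data.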

  20. Impact of rain gauge quality control and interpolation on streamflow simulation: an application to the Warwick catchment, Australia

    Science.gov (United States)

    Liu, Shulun; Li, Yuan; Pauwels, Valentijn R. N.; Walker, Jeffrey P.

    2017-12-01

    Rain gauges are widely used to obtain temporally continuous point rainfall records, which are then interpolated into spatially continuous data to force hydrological models. However, rainfall measurements and the interpolation procedure are subject to various uncertainties, which can be reduced by applying quality control and selecting appropriate spatial interpolation approaches. Consequently, the integrated impact of rainfall quality control and interpolation on streamflow simulation has attracted increased attention but has not been fully addressed. This study applies a quality control procedure to the hourly rainfall measurements obtained in the Warwick catchment in eastern Australia. The grid-based daily precipitation from the Australian Water Availability Project was used as a reference. The Pearson correlation coefficient between the daily accumulation of gauged rainfall and the reference data was used to eliminate gauges with significant quality issues. Unrealistic outliers were censored based on a comparison between gauged rainfall and the reference. Four interpolation methods were implemented: inverse distance weighting (IDW), nearest neighbors (NN), linear spline (LN), and ordinary kriging (OK). The four methods were first assessed through a cross-validation using the quality-controlled rainfall data. The impacts of the quality control and interpolation on streamflow simulation were then evaluated through a semi-distributed hydrological model. The results showed that the Nash–Sutcliffe model efficiency coefficient (NSE) and bias of the streamflow simulations were significantly improved after quality control. In the cross-validation, the IDW and OK methods produced good interpolated rainfall, while NN gave the worst result. In terms of the impact on hydrological prediction, IDW led to the most consistent streamflow predictions with the observations, according to the validation at five streamflow-gauged locations. The OK method
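    Of the four interpolation methods compared, inverse distance weighting is the simplest to state: the estimate at an ungauged point is a distance-weighted mean of gauge values, with weights 1/d^power. A minimal sketch (the gauge coordinates and rainfall values below are made up for illustration):

    ```python
    import numpy as np

    def idw(gauge_xy, values, target_xy, power=2.0):
        """Inverse distance weighting: weighted mean of gauge values,
        with weights 1 / distance**power to the target point."""
        d = np.linalg.norm(gauge_xy - target_xy, axis=1)
        if np.any(d == 0):                  # target coincides with a gauge
            return float(values[np.argmin(d)])
        w = 1.0 / d ** power
        return float(np.sum(w * values) / np.sum(w))

    gauges = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # gauge coords
    rain = np.array([10.0, 20.0, 30.0])                       # mm at gauges
    estimate = idw(gauges, rain, np.array([0.5, 0.5]))
    ```

    Because IDW weights are positive and sum to one, the estimate is always bounded by the minimum and maximum gauge values, unlike spline-based interpolation which can overshoot.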

  1. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs., 30 refs

  2. Paleoclimate validation of a numerical climate model

    International Nuclear Information System (INIS)

    Schelling, F.J.; Church, H.W.; Zak, B.D.; Thompson, S.L.

    1994-01-01

    An analysis planned to validate regional climate model results for a past climate state at Yucca Mountain, Nevada, against paleoclimate evidence for the period is described. This analysis, which will use the GENESIS model of global climate nested with the RegCM2 regional climate model, is part of a larger study for DOE's Yucca Mountain Site Characterization Project that is evaluating the impacts of long term future climate change on performance of the potential high level nuclear waste repository at Yucca Mountain. The planned analysis and anticipated results are presented

  3. Validation of the STAFF-5 computer model

    International Nuclear Information System (INIS)

    Fletcher, J.F.; Fields, S.R.

    1981-04-01

    STAFF-5 is a dynamic heat-transfer-fluid-flow stress model designed for computerized prediction of the temperature-stress performance of spent LWR fuel assemblies under storage/disposal conditions. Validation of the temperature-calculating abilities of this model was performed by comparing temperature calculations under specified conditions to experimental data from the Engine Maintenance and Disassembly (EMAD) Fuel Temperature Test Facility and to calculations performed by Battelle Pacific Northwest Laboratory (PNL) using the HYDRA-1 model. The comparisons confirmed the ability of STAFF-5 to calculate representative fuel temperatures over a considerable range of conditions, as a first step in the evaluation and prediction of fuel temperature-stress performance.

  4. Methods for estimating drought streamflow probabilities for Virginia streams

    Science.gov (United States)

    Austin, Samuel H.

    2014-01-01

    Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million streamflow daily values collected over the period of record (January 1, 1900 through May 16, 2012) were compiled and analyzed over a minimum 10-year (maximum 112-year) period of record. The analysis yielded the 46,704 equations with statistically significant fit statistics and parameter ranges published in two tables in this report. These model equations produce summer month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations are provided, demonstrating how to use the equations to estimate probable streamflows as much as 8 months in advance.
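    The fitted equations above take the logistic form p = 1 / (1 + exp(−(b0 + b1·Qwinter))), mapping winter streamflow to the probability that summer streamflow falls below a drought threshold. A sketch of evaluating such an equation (the coefficients b0 and b1 below are invented placeholders, not values from the report's tables):

    ```python
    import math

    def drought_flow_probability(winter_flow, b0=4.0, b1=-0.05):
        """Logistic-regression estimate of the probability that summer
        streamflow falls below a drought threshold, given winter flow.
        b0, b1 are illustrative stand-ins for fitted coefficients."""
        z = b0 + b1 * winter_flow
        return 1.0 / (1.0 + math.exp(-z))

    p_dry_winter = drought_flow_probability(50.0)    # low winter flow
    p_wet_winter = drought_flow_probability(200.0)   # high winter flow
    ```

    With a negative slope coefficient, higher winter flows imply lower summer drought probability, matching the report's use of winter flows as 5- to 8-month-ahead predictors.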

  5. Understanding uncertainties in future Colorado River streamflow

    Science.gov (United States)

    Vano, Julie A.; Udall, Bradley; Cayan, Daniel; Overpeck, Jonathan T.; Brekke, Levi D.; Das, Tapash; Hartmann, Holly C.; Hidalgo, Hugo G.; Hoerling, Martin P.; McCabe, Gregory J.; Morino, Kiyomi; Webb, Robert S.; Werner, Kevin; Lettenmaier, Dennis P.

    2014-01-01

    The Colorado River is the primary water source for more than 30 million people in the United States and Mexico. Recent studies that project streamflow changes in the Colorado River all project annual declines, but the magnitudes of the projected decreases range from less than 10% to 45% by the mid-twenty-first century. To understand these differences, we address the questions the management community has raised: Why is there such a wide range of projections of impacts of future climate change on Colorado River streamflow, and how should this uncertainty be interpreted? We identify four major sources of disparity among studies that arise from both methodological and model differences. In order of importance, these are differences in 1) the global climate models (GCMs) and emission scenarios used; 2) the ability of land surface and atmospheric models to simulate properly the high-elevation runoff source areas; 3) the sensitivities of land surface hydrology models to precipitation and temperature changes; and 4) the methods used to statistically downscale GCM scenarios. In accounting for these differences, there is substantial evidence across studies that future Colorado River streamflow will be reduced under the current trajectories of anthropogenic greenhouse gas emissions because of a combination of strong temperature-induced runoff curtailment and reduced annual precipitation. Reconstructions of preinstrumental streamflows provide additional insights; the greatest risk to Colorado River streamflows is a multidecadal drought, like that observed in paleoreconstructions, exacerbated by a steady reduction in flows due to climate change. This could result in decades of sustained streamflows much lower than have been observed in the ~100 years of instrumental record.

  6. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Directory of Open Access Journals (Sweden)

    William H. Farmer

    2017-10-01

    New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighbor index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrent with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. For application, accounting for the degree of regulation at a set of sample sites helped to specify the streamgages required to implement the kriging approaches successfully.
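    The two skill scores quoted above, Nash–Sutcliffe efficiency and Spearman rank correlation, can be computed as in this minimal sketch; the "observed" and kriged nonexceedance probabilities are invented for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; 0.0 is no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical leave-one-out result at one held-out streamgage:
# kriged vs. observed nonexceedance probabilities of daily flow.
observed = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85, 0.95])
kriged   = np.array([0.12, 0.22, 0.43, 0.57, 0.66, 0.88, 0.93])

print(f"NSE      = {nash_sutcliffe(observed, kriged):.2f}")
print(f"Spearman = {spearmanr(observed, kriged).correlation:.2f}")
```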

  7. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and that it is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
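    The hydrostatic balance at the heart of such a model can be sketched as below; the fluid densities, column heights, and bottomhole pressure are assumed round numbers for illustration, not SPR well data.

```python
G = 9.81  # gravitational acceleration, m/s^2

# Assumed, illustrative fluid densities (kg/m^3) and column heights (m)
# from wellhead down to a reference depth -- not actual SPR well data.
columns = [
    ("nitrogen", 180.0, 300.0),   # compressed N2 at the top of the well
    ("crude oil", 850.0, 400.0),
    ("brine", 1200.0, 300.0),
]

def wellhead_pressure(p_bottom_pa, columns):
    """Hydrostatic balance: subtract rho*g*h for each fluid column above
    the reference depth to move from bottomhole to wellhead pressure."""
    p = p_bottom_pa
    for _name, rho, height in columns:
        p -= rho * G * height
    return p

p_bottom = 12.0e6  # Pa, assumed pressure at the reference depth
p_head = wellhead_pressure(p_bottom, columns)
print(f"Predicted wellhead pressure: {p_head / 1e6:.2f} MPa")
```

    A leak moves the fluid interfaces, changing the column heights and hence the predicted wellhead pressure; comparing predicted against measured pressures is what lets a model of this kind separate tight wells from leaking ones.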

  8. Natural analogues and radionuclide transport model validation

    International Nuclear Information System (INIS)

    Lever, D.A.

    1987-08-01

    In this paper, some possible roles for natural analogues are discussed from the point of view of those involved with the development of mathematical models for radionuclide transport and with the use of these models in repository safety assessments. The characteristic features of a safety assessment are outlined in order to address the questions of where natural analogues can be used to improve our understanding of the processes involved and where they can assist in validating the models that are used. Natural analogues have the potential to provide useful information about some critical processes, especially long-term chemical processes and migration rates. There is likely to be considerable uncertainty and ambiguity associated with the interpretation of natural analogues, and thus it is their general features which should be emphasized, and models with appropriate levels of sophistication should be used. Experience gained in modelling the Koongarra uranium deposit in northern Australia is drawn upon. (author)

  9. Predictive validation of an influenza spread model.

    Directory of Open Access Journals (Sweden)

    Ayaz Hyder

    Full Text Available BACKGROUND: Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. METHODS AND FINDINGS: We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability, and depended on the method of forecasting (static or dynamic). CONCLUSIONS: Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability.

  10. Predictive Validation of an Influenza Spread Model

    Science.gov (United States)

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998–1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks earlier with reasonable reliability, and depended on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive ability.

  11. External validation of EPIWIN biodegradation models.

    Science.gov (United States)

    Posthumus, R; Traas, T P; Peijnenburg, W J G M; Hulzebos, E M

    2005-01-01

    The BIOWIN biodegradation models were evaluated for their suitability for regulatory purposes. BIOWIN includes the linear and non-linear BIODEG and MITI models for estimating the probability of rapid aerobic biodegradation and an expert survey model for primary and ultimate biodegradation estimation. Experimental biodegradation data for 110 newly notified substances were compared with the estimations of the different models. The models were applied separately and in combinations to determine which model(s) showed the best performance. The results of this study were compared with the results of other validation studies and other biodegradation models. The BIOWIN models predict not-readily biodegradable substances with high accuracy, in contrast to readily biodegradable ones. In view of the high environmental concern over persistent chemicals, and given the large number of not-readily biodegradable chemicals compared to readily biodegradable ones, a model is preferred that gives a minimum of false positives without a correspondingly high percentage of false negatives. A combination of the BIOWIN models (BIOWIN2 or BIOWIN6) showed the highest predictive value for not-ready biodegradability. However, the highest score for overall predictivity with the lowest percentage of false predictions was achieved by applying BIOWIN3 (pass level 2.75) and BIOWIN6.
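    The false-positive/false-negative trade-off described above reduces to standard confusion-matrix arithmetic, sketched here with invented predictions; the study's own counts for the 110 substances are not reproduced.

```python
# Hypothetical predictions vs. experimental outcomes (True = readily
# biodegradable); invented values, not the study's counts.
predicted = [True, False, False, True, False, False, True, False]
observed  = [True, False, False, False, False, False, True, True]

tp = sum(p and o for p, o in zip(predicted, observed))
fp = sum(p and not o for p, o in zip(predicted, observed))      # the costly error:
tn = sum(not p and not o for p, o in zip(predicted, observed))  # a persistent chemical
fn = sum(not p and o for p, o in zip(predicted, observed))      # labelled degradable

# Accuracy on the not-readily-biodegradable class, which the BIOWIN
# models predict well according to the study:
specificity = tn / (tn + fp)
print(f"TP={tp} FP={fp} TN={tn} FN={fn}, specificity={specificity:.2f}")
```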

  12. Evaluating the APEX model for simulating streamflow and water quality on ten agricultural watersheds in the U.S.

    Science.gov (United States)

    Simulation models are increasingly used to assess water quality constituent losses from agricultural systems. Misuse often gives irrelevant or erroneous answers. The Agricultural Policy Environmental Extender (APEX) model is emerging as one of the premier modeling tools for fields, farms, and agr...

  13. Reconstructing pre-instrumental streamflow in Eastern Australia using a water balance approach

    Science.gov (United States)

    Tozer, C. R.; Kiem, A. S.; Vance, T. R.; Roberts, J. L.; Curran, M. A. J.; Moy, A. D.

    2018-03-01

    Streamflow reconstructions based on paleoclimate proxies provide much longer records than the short instrumental period on which water resource management plans are currently based. In Australia there is a lack of in-situ high-resolution paleoclimate proxy records, but remote proxies with teleconnections to Australian climate have utility in producing streamflow reconstructions. Here we investigate, via a case study for a catchment in eastern Australia, the novel use of an Antarctic ice-core-based rainfall reconstruction within a Budyko framework to reconstruct ∼1000 years of annual streamflow. The resulting streamflow reconstruction captures interannual to decadal variability in the instrumental streamflow, validating both the use of the ice-core rainfall proxy record and the Budyko-framework method. In the preinstrumental era the streamflow reconstruction shows longer wet and dry epochs, and periods of streamflow variability higher than observed in the instrumental era. Importantly, for both the instrumental record and preinstrumental reconstructions, the wet (dry) epochs in the rainfall record are shorter (longer) in the streamflow record, and this non-linearity must be considered when inferring hydroclimatic risk or historical water availability directly from rainfall proxy records alone. These insights provide a better understanding of present infrastructure vulnerability in the context of past climate variability for eastern Australia. The streamflow reconstruction presented here also provides a better understanding of the range of hydroclimatic variability possible, and therefore represents a more realistic baseline on which to quantify the potential impacts of anthropogenic climate change on water security.
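    A Budyko-framework water balance of the kind described can be sketched with Fu's equation; the omega parameter and the rainfall/PET values below are assumed for illustration and would in practice be calibrated against instrumental streamflow.

```python
import numpy as np

def fu_streamflow(p_mm, pet_mm, omega=2.6):
    """Budyko-type annual water balance using Fu's equation:
        E/P = 1 + PET/P - (1 + (PET/P)**omega)**(1/omega)
    with streamflow as the residual Q = P - E. The catchment parameter
    omega is an assumed value here; in practice it is calibrated against
    instrumental streamflow."""
    p, pet = np.asarray(p_mm, float), np.asarray(pet_mm, float)
    e = p * (1.0 + pet / p - (1.0 + (pet / p) ** omega) ** (1.0 / omega))
    return p - e

# Reconstructed annual rainfall (e.g. from an ice-core proxy) with an
# assumed constant PET -- purely illustrative values, in mm/yr.
rainfall = np.array([650.0, 820.0, 500.0, 910.0, 700.0])
pet = 1400.0
print(np.round(fu_streamflow(rainfall, pet), 1))
```

    Because runoff responds non-linearly to rainfall in this framework, wet and dry epochs in the rainfall series translate into amplified wet/dry behavior in the streamflow series, consistent with the non-linearity the abstract highlights.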

  14. Validation of a phytoremediation computer model

    International Nuclear Information System (INIS)

    Corapcioglu, M.Y.; Sung, K.; Rhykerd, R.L.; Munster, C.; Drew, M.

    1999-01-01

    The use of plants to stimulate remediation of contaminated soil is an effective, low-cost cleanup method which can be applied to many different sites. A phytoremediation computer model has been developed to simulate how recalcitrant hydrocarbons interact with plant roots in unsaturated soil. A study was conducted to provide data to validate and calibrate the model. During the study, lysimeters were constructed and filled with soil contaminated with 10 mg kg⁻¹ TNT, PBB and chrysene. Vegetated and unvegetated treatments were conducted in triplicate to obtain data regarding contaminant concentrations in the soil, plant roots, root distribution, microbial activity, plant water use and soil moisture. When given the parameters of time and depth, the model successfully predicted contaminant concentrations under actual field conditions. Other model parameters are currently being evaluated. 15 refs., 2 figs.

  15. Improved large-scale hydrological modelling through the assimilation of streamflow and downscaled satellite soil moisture observations

    NARCIS (Netherlands)

    Lopez, Patricia Lopez; Wanders, Niko; Schellekens, Jaap; Renzullo, Luigi J.; Sutanudjaja, Edwin H.; Bierkens, Marc F. P.

    2016-01-01

    The coarse spatial resolution of global hydrological models (typically > 0.25°) limits their ability to resolve key water balance processes for many river basins and thus compromises their suitability for water resources management, especially when compared to locally tuned river models. A possible

  16. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  17. Natural streamflow simulation for two largest river basins in Poland: a baseline for identification of flow alterations

    Science.gov (United States)

    Piniewski, Mikołaj

    2016-05-01

    The objective of this study was to apply a previously developed large-scale and high-resolution SWAT model of the Vistula and the Odra basins, calibrated with a focus on natural flow simulation, in order to assess the impact of three different dam reservoirs on streamflow using the Indicators of Hydrologic Alteration (IHA). A tailored spatial calibration approach was designed, in which calibration was focused on a large set of relatively small non-nested sub-catchments with semi-natural flow regimes. These were classified into calibration clusters based on flow statistics similarity. After calibration and validation, which gave overall positive results, the calibrated parameter values were transferred to the remaining part of the basins using an approach based on hydrological similarity of donor and target catchments. The calibrated model was applied in three case studies with the purpose of assessing the effect of dam reservoirs (Włocławek, Siemianówka and Czorsztyn Reservoirs) on streamflow alteration. Both the assessment based on gauged streamflow (Before-After design) and the one based on simulated natural streamflow showed large alterations in selected flow statistics related to magnitude, duration, high and low flow pulses and rate of change. Benefits of using a large-scale and high-resolution hydrological model for the assessment of streamflow alteration include: (1) providing an alternative or complementary approach to the classical Before-After designs, (2) isolating the climate variability effect from the dam (or any other source of alteration) effect, and (3) providing a practical tool that can be applied at a range of spatial scales over a large area, such as a country, in a uniform way. Thus, the presented approach can be applied for designing more natural flow regimes, which is crucial for river and floodplain ecosystem restoration in the context of the European Union's policy on environmental flows.

  18. Natural streamflow simulation for two largest river basins in Poland: a baseline for identification of flow alterations

    Directory of Open Access Journals (Sweden)

    M. Piniewski

    2016-05-01

    Full Text Available The objective of this study was to apply a previously developed large-scale and high-resolution SWAT model of the Vistula and the Odra basins, calibrated with a focus on natural flow simulation, in order to assess the impact of three different dam reservoirs on streamflow using the Indicators of Hydrologic Alteration (IHA). A tailored spatial calibration approach was designed, in which calibration was focused on a large set of relatively small non-nested sub-catchments with semi-natural flow regimes. These were classified into calibration clusters based on flow statistics similarity. After calibration and validation, which gave overall positive results, the calibrated parameter values were transferred to the remaining part of the basins using an approach based on hydrological similarity of donor and target catchments. The calibrated model was applied in three case studies with the purpose of assessing the effect of dam reservoirs (Włocławek, Siemianówka and Czorsztyn Reservoirs) on streamflow alteration. Both the assessment based on gauged streamflow (Before-After design) and the one based on simulated natural streamflow showed large alterations in selected flow statistics related to magnitude, duration, high and low flow pulses and rate of change. Benefits of using a large-scale and high-resolution hydrological model for the assessment of streamflow alteration include: (1) providing an alternative or complementary approach to the classical Before-After designs, (2) isolating the climate variability effect from the dam (or any other source of alteration) effect, and (3) providing a practical tool that can be applied at a range of spatial scales over a large area, such as a country, in a uniform way. Thus, the presented approach can be applied for designing more natural flow regimes, which is crucial for river and floodplain ecosystem restoration in the context of the European Union's policy on environmental flows.
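    An IHA-style Before-After comparison of a single flow statistic reduces to a simple calculation; the flow values below are invented for illustration, not Vistula or Odra data.

```python
import numpy as np

def percent_alteration(pre, post):
    """Percent change in the median of one flow statistic between the
    pre-impact period (or simulated natural flow) and the post-impact
    period; the full IHA suite applies this to several dozen statistics."""
    m_pre, m_post = np.median(pre), np.median(post)
    return 100.0 * (m_post - m_pre) / m_pre

# Hypothetical annual 7-day minimum flows (m^3/s) before and after
# reservoir construction.
pre_dam  = np.array([12.0, 9.5, 14.2, 10.8, 11.3, 13.1])
post_dam = np.array([16.4, 15.0, 17.8, 14.9, 16.1, 15.5])

print(f"7-day minimum flow altered by {percent_alteration(pre_dam, post_dam):+.1f}%")
```

    Replacing the pre-impact record with a simulated natural streamflow series, as done in the study, gives the same calculation while isolating the dam effect from climate variability.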

  19. Impact of LUCC on streamflow based on the SWAT model over the Wei River basin on the Loess Plateau in China

    Directory of Open Access Journals (Sweden)

    H. Wang

    2017-04-01

    impact on both soil flow and baseflow by compensating for reduced surface runoff, which leads to a slight increase in the streamflow in the Wei River with the mixed landscapes on the Loess Plateau that include earth–rock mountain area.

  20. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

    Thacker, B.H.; Doebling, S.W.; Hemez, F.M.; Anderson, M.C.; Pepin, J.E.; Rodriguez, E.A.

    2004-01-01

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program is currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  1. A retrospective streamflow ensemble forecast for an extreme hydrologic event: a case study of Hurricane Irene on the Hudson River basin

    Science.gov (United States)

    Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie

    2016-07-01

    This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional-scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin (~36,000 km²) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were forced into HEC-HMS to generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. The uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts of short-term streamflow forecasting at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
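    The ensemble median, spread, and peak timing discussed above can be extracted from a member matrix as in this sketch; the synthetic 21-member hydrographs are invented stand-ins for the GEFS/R-driven HEC-HMS runs.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 21-member ensemble of hourly discharge (m^3/s) over a
# 96 h window around an event peak -- synthetic stand-ins for the
# GEFS/R-driven HEC-HMS runs.
hours = np.arange(96)
base = 800.0 * np.exp(-0.5 * ((hours - 60) / 10.0) ** 2) + 100.0
members = base * rng.lognormal(mean=0.0, sigma=0.15, size=(21, 96))

median = np.median(members, axis=0)  # central forecast
spread = np.percentile(members, 90, axis=0) - np.percentile(members, 10, axis=0)
peak_hours = hours[np.argmax(members, axis=1)]  # per-member time of peak

print(f"median peak: {median.max():.0f} m^3/s at hour {hours[np.argmax(median)]}")
print(f"peak-timing range across members: hours {peak_hours.min()}-{peak_hours.max()}")
print(f"largest 10-90% spread: {spread.max():.0f} m^3/s")
```

    Comparing each member against a flood threshold turns this matrix directly into an exceedance-probability diagram of the kind the paper describes.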

  2. A validated physical model of greenhouse climate

    International Nuclear Information System (INIS)

    Bot, G.P.A.

    1989-01-01

    In the greenhouse model the instantaneous environmental crop growth factors are calculated as output, together with the physical behaviour of the crop. The boundary conditions for this model are the outside weather conditions; other inputs are the physical characteristics of the crop, of the greenhouse and of the control system. The greenhouse model is based on the energy, water vapour and CO2 balances of the crop-greenhouse system. While the emphasis is on the dynamic behaviour of the greenhouse for implementation in continuous optimization, the state variables temperature, water vapour pressure and carbon dioxide concentration in the relevant greenhouse parts (crop, air, soil and cover) are calculated from the balances over these parts. To do this properly, the physical exchange processes between the system parts have to be quantified first. Therefore the greenhouse model is constructed from submodels describing these processes: a. Radiation transmission model for the modification of the outside to the inside global radiation. b. Ventilation model to describe the ventilation exchange between greenhouse and outside air. c. The description of the exchange of energy and mass between the crop and the greenhouse air. d. Calculation of the thermal radiation exchange between the various greenhouse parts. e. Quantification of the convective exchange processes between the greenhouse air and, respectively, the cover, the heating pipes and the soil surface, and between the cover and the outside air. f. Determination of the heat conduction in the soil. The various submodels are validated first and then the complete greenhouse model is verified.
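    A minimal lumped version of the air-temperature balance (solar gain through the cover, cover losses, and ventilation exchange) can be sketched as below; all coefficients are assumed round numbers for illustration, not values from the validated model.

```python
RHO_CP = 1200.0  # J/(m^3 K), volumetric heat capacity of air
HEIGHT = 3.0     # m, mean greenhouse height (air volume per m^2 of floor)
U_COVER = 6.0    # W/(m^2 K), overall cover heat-loss coefficient (assumed)
TAU = 0.7        # cover transmissivity for global radiation (assumed)

def step_air_temp(t_air, t_out, global_rad, vent_rate, dt=60.0):
    """One explicit Euler step of the air energy balance per m^2 of floor:
    solar gain through the cover, loss through the cover, and ventilation
    exchange with outside air (vent_rate in m^3 air per m^2 floor per s)."""
    gain = TAU * global_rad
    loss_cover = U_COVER * (t_air - t_out)
    loss_vent = RHO_CP * vent_rate * (t_air - t_out)
    return t_air + dt * (gain - loss_cover - loss_vent) / (RHO_CP * HEIGHT)

t = 18.0
for _ in range(60):  # one hour in 60 s steps
    t = step_air_temp(t, t_out=10.0, global_rad=400.0, vent_rate=0.01)
print(f"greenhouse air temperature after 1 h: {t:.1f} degC")
```

    The full model couples several such balances (energy, water vapour, CO2) across crop, air, soil and cover, with the submodels a-f supplying the exchange coefficients that are fixed constants here.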

  3. Hydrologic functioning of the deep Critical Zone and contributions to streamflow in a high elevation catchment: testing of multiple conceptual models

    Science.gov (United States)

    Dwivedi, R.; Meixner, T.; McIntosh, J. C.; Ferre, T. P. A.; Eastoe, C. J.; Minor, R. L.; Barron-Gafford, G.; Chorover, J.

    2017-12-01

    The composition of natural mountainous waters exerts important control over the water quality available to downstream users. Furthermore, the geochemical constituents of stream water in mountainous catchments represent the result of the spatial and temporal evolution of critical zone structure and processes. A key problem is that high elevation catchments involve rugged terrain and are subject to extreme climate and landscape gradients; therefore, high-density or high-spatial-resolution hydro-geochemical observations are rare. Despite such difficulties, the Santa Catalina Mountains Critical Zone Observatory (SCM-CZO), Tucson, AZ, generates long-term hydrogeochemical data for understanding not only hydrological processes and their seasonal characters, but also the geochemical impacts of such processes on streamflow chemical composition. Using existing instrumentation and hydrogeochemical observations from the last 9+ years (2009 through 2016 and early 2017), we employed a multi-tracer approach along with principal component analysis to identify water sources and their seasonal character. We used our results to inform hydrological process understanding (flow paths, residence times, and water sources) for our study site. Our results indicate that soil water is the largest contributor to streamflow, which is ephemeral in nature. Although a 3-dimensional mixing space involving precipitation, soil water, interflow, and deep groundwater end-members could explain most of the streamflow chemistry, geochemical complexity was observed to grow with catchment storage. In terms of processes and their seasonal character, we found soil water and interflow were the primary end-member contributors to streamflow in all seasons. Deep groundwater only contributes to streamflow at high catchment storage conditions, but it provides major ions such as Na, Mg, and Ca that are lacking in other water types.
In this way, our results indicate that any future efforts aimed
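    The end-member mixing idea behind such a multi-tracer analysis can be sketched as a small linear system; the tracer concentrations below are invented for illustration, not SCM-CZO data.

```python
import numpy as np

# Hypothetical end-member concentrations of two conservative tracers
# (columns: Na, Mg in mg/L) -- invented values, not SCM-CZO data.
endmembers = np.array([
    [ 2.0, 1.0],   # soil water
    [ 5.0, 4.0],   # interflow
    [20.0, 6.0],   # deep groundwater
])
stream = np.array([6.5, 2.9])  # observed stream concentrations

# Two tracer mass balances plus the constraint that fractions sum to 1.
A = np.vstack([endmembers.T, np.ones(3)])
b = np.append(stream, 1.0)
fractions = np.linalg.solve(A, b)
print(np.round(fractions, 2))  # [soil water, interflow, deep groundwater]
```

    With more tracers than end-members the system becomes overdetermined and is solved by least squares, which is where principal component analysis helps define the dimensionality of the mixing space.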

  4. Validated predictive modelling of the environmental resistome.

    Science.gov (United States)

    Amos, Gregory C A; Gozzard, Emma; Carter, Charlotte E; Mead, Andrew; Bowes, Mike J; Hawkey, Peter M; Zhang, Lihong; Singer, Andrew C; Gaze, William H; Wellington, Elizabeth M H

    2015-06-01

    Multi-drug-resistant bacteria pose a significant threat to public health. The role of the environment in the overall rise in antibiotic-resistant infections and risk to humans is largely unknown. This study aimed to evaluate drivers of antibiotic-resistance levels across the River Thames catchment, model key biotic, spatial and chemical variables and produce predictive models for future risk assessment. Sediment samples from 13 sites across the River Thames basin were taken at four time points across 2011 and 2012. Samples were analysed for class 1 integron prevalence and enumeration of third-generation cephalosporin-resistant bacteria. Class 1 integron prevalence was validated as a molecular marker of antibiotic resistance; levels of resistance showed significant geospatial and temporal variation. The main explanatory variables of resistance levels at each sample site were the number, proximity, size and type of surrounding wastewater-treatment plants. Model 1 revealed treatment plants accounted for 49.5% of the variance in resistance levels. Other contributing factors were extent of different surrounding land cover types (for example, Neutral Grassland), temporal patterns and prior rainfall; when modelling all variables the resulting model (Model 2) could explain 82.9% of variations in resistance levels in the whole catchment. Chemical analyses correlated with key indicators of treatment plant effluent and a model (Model 3) was generated based on water quality parameters (contaminant and macro- and micro-nutrient levels). Model 2 was beta tested on independent sites and explained over 78% of the variation in integron prevalence showing a significant predictive ability. We believe all models in this study are highly useful tools for informing and prioritising mitigation strategies to reduce the environmental resistome.
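The "beta test" of Model 2 on independent sites amounts to fitting a regression on training sites and reporting the variance explained (R²) on held-out sites. A minimal sketch with synthetic data follows; the predictors and coefficients are invented for illustration and are not River Thames measurements.

```python
import numpy as np

# Fit a linear model on training sites, then report out-of-sample R^2 on
# independent sites. All data here are synthetic.
rng = np.random.default_rng(7)
X_train = rng.normal(size=(40, 3))          # e.g. WWTP load, land cover, rainfall
beta_true = np.array([2.0, -1.0, 0.5])      # invented "true" effects
y_train = X_train @ beta_true + rng.normal(scale=0.5, size=40)

# Ordinary least squares with an intercept column.
beta_hat, *_ = np.linalg.lstsq(np.c_[np.ones(40), X_train], y_train, rcond=None)

X_test = rng.normal(size=(15, 3))           # independent validation sites
y_test = X_test @ beta_true + rng.normal(scale=0.5, size=15)
y_pred = np.c_[np.ones(15), X_test] @ beta_hat
r2 = 1 - np.sum((y_test - y_pred) ** 2) / np.sum((y_test - y_test.mean()) ** 2)
print(f"out-of-sample R^2 = {r2:.2f}")
```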

  5. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the application of security policy models, we are concerned with the implementation of a certain policy, taking into consideration the standards required, through attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy, in an information technology environment, is a condition sine qua non for the strategic management process of any organization. Regarding this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer. The Chief Security Officer, in doing so, strives to keep technologies up to date in order to meet all-inclusive business continuity planning policies. Therefore, for such a policy to be effective, it has to be entirely embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo. This sampling was based on the representativeness of such industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of the security policy, namely: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. Analysis of the Cronbach's alpha structure of these security items attests not only that the capacity of those industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.
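The Cronbach's alpha reliability analysis referred to above follows a standard formula, α = k/(k−1) · (1 − Σ var_item / var_total). A minimal sketch with invented Likert-style item scores (not the study's survey data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative responses for 4 security-policy items (invented, 1-5 scale).
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 3, 2],
    [4, 4, 5, 5],
])
alpha = cronbach_alpha(scores)
print(f"Cronbach's alpha = {alpha:.3f}")
```

Higher alpha indicates that the items within a dimension vary together, i.e. they measure the same underlying construct.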

  6. Polarographic validation of chemical speciation models

    International Nuclear Information System (INIS)

    Duffield, J.R.; Jarratt, J.A.

    2001-01-01

    It is well established that the chemical speciation of an element in a given matrix, or system of matrices, is of fundamental importance in controlling the transport behaviour of the element. Therefore, to accurately understand and predict the transport of elements and compounds in the environment it is a requirement that both the identities and concentrations of trace element physico-chemical forms can be ascertained. These twin requirements present the analytical scientist with considerable challenges given the labile equilibria, the range of time scales (from nanoseconds to years) and the range of concentrations (ultra-trace to macro) that may be involved. As a result of this analytical variability, chemical equilibrium modelling has become recognised as an important predictive tool in chemical speciation analysis. However, this technique requires firm underpinning by the use of complementary experimental techniques for the validation of the predictions made. The work reported here has been undertaken with the primary aim of investigating possible methodologies that can be used for the validation of chemical speciation models. However, in approaching this aim, direct chemical speciation analyses have been made in their own right. Results will be reported and analysed for the iron(II)/iron(III)-citrate proton system (pH 2 to 10; total [Fe] = 3 mmol dm⁻³; total [citrate³⁻] = 10 mmol dm⁻³) in which equilibrium constants have been determined using glass electrode potentiometry, speciation is predicted using the PHREEQE computer code, and validation of predictions is achieved by determination of iron complexation and redox state with associated concentrations. (authors)

  7. Sensitivity of streamflow to climate change in California

    Science.gov (United States)

    Grantham, T.; Carlisle, D.; Wolock, D.; McCabe, G. J.; Wieczorek, M.; Howard, J.

    2015-12-01

    Trends of decreasing snowpack and increasing risk of drought are looming challenges for California water resource management. Increasing vulnerability of the state's natural water supplies threatens California's social-economic vitality and the health of its freshwater ecosystems. Despite growing awareness of potential climate change impacts, robust management adaptation has been hindered by substantial uncertainty in future climate predictions for the region. Down-scaled global climate model (GCM) projections uniformly suggest future warming of the region, but projections are highly variable with respect to the direction and magnitude of change in regional precipitation. Here we examine the sensitivity of California surface water supplies to climate variation independently of GCMs. We use a statistical approach to construct predictive models of monthly streamflow based on historical climate and river basin features. We then propagate an ensemble of synthetic climate simulations through the models to assess potential streamflow responses to changes in temperature and precipitation in different months and regions of the state. We also consider the range of streamflow change predicted by bias-corrected downscaled GCMs. Our results indicate that the streamflow in the xeric and coastal mountain regions of California is more sensitive to changes in precipitation than temperature, whereas streamflow in the interior mountain region responds strongly to changes in both temperature and precipitation. Mean climate projections for 2025-2075 from GCM ensembles are highly variable, indicating streamflow changes of -50% to +150% relative to baseline (1980-2010) for most months and regions. 
By quantifying the sensitivity of streamflow to climate change, rather than attempting to predict future hydrologic conditions based on uncertain GCM projections, these results should be more informative to water managers seeking to assess, and potentially reduce, the vulnerability of surface
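The sensitivity analysis described above, which propagates synthetic temperature and precipitation perturbations through a statistical streamflow model, can be sketched as follows. The model form and coefficients here are assumptions for illustration only, not the fitted California models.

```python
import numpy as np

# Toy monthly streamflow model: log-linear in precipitation and temperature.
# The functional form and coefficients are invented for illustration.
def monthly_flow(precip_mm: float, temp_c: float) -> float:
    return np.exp(0.9 * np.log(precip_mm) - 0.05 * temp_c + 1.0)

baseline = monthly_flow(100.0, 10.0)

# Sensitivity surface: fractional change in flow across a grid of perturbations.
dp = np.linspace(-0.3, 0.3, 7)   # -30%..+30% precipitation change
dt = np.linspace(0.0, 4.0, 5)    # 0..+4 degC warming
change = np.array([[monthly_flow(100.0 * (1 + p), 10.0 + t) / baseline - 1.0
                    for p in dp] for t in dt])
print("flow change at +4C, +30% P:", f"{100 * change[-1, -1]:+.1f}%")
```

Reading across the grid shows whether a region's flow responds more strongly to the precipitation axis or the temperature axis, which is the sensitivity contrast the study draws between coastal and interior mountain regions.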

  8. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. However, these methods can only predict the most probable faulty sensors, and their predictions are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).

  9. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and to show that the model is fit for its intended purpose. In other words: Does the model predict transport in fractured rock adequately to be used in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport are made to cover the associated uncertainties. Thus, flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to model flow and radionuclide transport in the near field and far field of a deep repository while also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far-field code FARF31 is kept relatively simple and calculates transport using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among its advantages are that it is a fast, simple and robust code, which enables the handling of many realisations with wide spreads in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual-porosity continuum approach, where it is assumed that the rock can be divided into two distinct domains with different types of porosity

  10. Modelling and validation of electromechanical shock absorbers

    Science.gov (United States)

    Tonoli, Andrea; Amati, Nicola; Girardello Detoni, Joaquim; Galluzzi, Renato; Gasparin, Enrico

    2013-08-01

    Electromechanical vehicle suspension systems represent a promising substitute to conventional hydraulic solutions. However, the design of electromechanical devices that are able to supply high damping forces without exceeding geometric dimension and mass constraints is a difficult task. All these challenges meet in off-road vehicle suspension systems, where the power density of the dampers is a crucial parameter. In this context, the present paper outlines a particular shock absorber configuration where a suitable electric machine and a transmission mechanism are utilised to meet off-road vehicle requirements. A dynamic model is used to represent the device. Subsequently, experimental tests are performed on an actual prototype to verify the functionality of the damper and validate the proposed model.

  11. Atmospheric corrosion: statistical validation of models

    International Nuclear Information System (INIS)

    Diaz, V.; Martinez-Luaces, V.; Guineo-Cobs, G.

    2003-01-01

    In this paper we discuss two different methods for the validation of regression models, applied to corrosion data. One of them is based on the correlation coefficient and the other is the statistical lack-of-fit test. Both methods are used here to analyse the fit of the bilogarithmic model used to predict corrosion for very-low-carbon steel substrates in rural and urban-industrial atmospheres in Uruguay. Results for parameters A and n of the bilogarithmic model are reported. For this purpose, all repeated values were used instead of the usual average values. Modelling is carried out using experimental data corresponding to steel substrates under the same initial meteorological conditions (in fact, they were placed in the rack at the same time). Results of the correlation coefficient are compared with the lack-of-fit test applied at two different significance levels (α=0.01 and α=0.05). Unexpected differences between them are explained and, finally, it is possible to conclude, at least for the studied atmospheres, that the bilogarithmic model does not properly fit the experimental data. (Author) 18 refs
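The bilogarithmic model C = A·tⁿ is linear in log-log space, so A and n can be estimated by ordinary least squares on the logged data, with the correlation coefficient computed alongside. A sketch with synthetic exposure data (the "true" A and n are invented, not the Uruguayan measurements):

```python
import numpy as np

# Bilogarithmic atmospheric-corrosion model: C = A * t**n.
# Synthetic exposure data (years, corrosion loss) for illustration only.
t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
C = 12.0 * t**0.55 * np.exp(np.random.default_rng(0).normal(0, 0.02, t.size))

log_t, log_C = np.log(t), np.log(C)
n, log_A = np.polyfit(log_t, log_C, 1)   # slope = n, intercept = log(A)
A = np.exp(log_A)
r = np.corrcoef(log_t, log_C)[0, 1]      # correlation coefficient in log space
print(f"A = {A:.2f}, n = {n:.3f}, r = {r:.4f}")
```

A high r alone does not establish adequacy of the model, which is why the paper pairs it with the lack-of-fit test based on replicated observations.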

  12. Streamflow impacts of biofuel policy-driven landscape change.

    Directory of Open Access Journals (Sweden)

    Sami Khanal

    Full Text Available Likely changes in precipitation (P and potential evapotranspiration (PET resulting from policy-driven expansion of bioenergy crops in the United States are shown to create significant changes in streamflow volumes and increase water stress in the High Plains. Regional climate simulations for current and biofuel cropping system scenarios are evaluated using the same atmospheric forcing data over the period 1979-2004 using the Weather Research Forecast (WRF model coupled to the NOAH land surface model. PET is projected to increase under the biofuel crop production scenario. The magnitude of the mean annual increase in PET is larger than the inter-annual variability of change in PET, indicating that PET increase is a forced response to the biofuel cropping system land use. Across the conterminous U.S., the change in mean streamflow volume under the biofuel scenario is estimated to range from negative 56% to positive 20% relative to a business-as-usual baseline scenario. In Kansas and Oklahoma, annual streamflow volume is reduced by an average of 20%, and this reduction in streamflow volume is due primarily to increased PET. Predicted increase in mean annual P under the biofuel crop production scenario is lower than its inter-annual variability, indicating that additional simulations would be necessary to determine conclusively whether predicted change in P is a response to biofuel crop production. Although estimated changes in streamflow volume include the influence of P change, sensitivity results show that PET change is the significantly dominant factor causing streamflow change. Higher PET and lower streamflow due to biofuel feedstock production are likely to increase water stress in the High Plains. When pursuing sustainable biofuels policy, decision-makers should consider the impacts of feedstock production on water scarcity.

  13. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. An SDG (Signed Directed Graph) and qualitative-trend-based multiple-scale validation method is therefore proposed. First, the SDG model is built and qualitative trends are added to the model. Then, complete testing scenarios are produced by positive inference. Multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.

  14. A Self-Calibrating Runoff and Streamflow Remote Sensing Model for Ungauged Basins Using Open-Access Earth Observation Data

    Directory of Open Access Journals (Sweden)

    Ate Poortinga

    2017-01-01

    Full Text Available Due to increasing pressures on water resources, there is a need to monitor regional water resource availability in a spatially and temporally explicit manner. However, for many parts of the world, there is insufficient data to quantify stream flow or ground water infiltration rates. We present the results of a pixel-based water balance formulation to partition rainfall into evapotranspiration, surface water runoff and potential ground water infiltration. The method leverages remote sensing derived estimates of precipitation, evapotranspiration, soil moisture, Leaf Area Index, and a single F coefficient to distinguish between runoff and storage changes. The study produced significant correlations between the remote sensing method and field based measurements of river flow in two Vietnamese river basins. For the Ca basin, we found R2 values ranging from 0.88–0.97 and Nash–Sutcliffe efficiency (NSE values varying between 0.44–0.88. The R2 for the Red River varied between 0.87–0.93 and NSE values between 0.61 and 0.79. Based on these findings, we conclude that the method allows for a fast and cost-effective way to map water resource availability in basins with no gauges or monitoring infrastructure, without the need for application of sophisticated hydrological models or resource-intensive data.
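The pixel-based partitioning with a single F coefficient can be sketched as below. The exact water balance formulation in the paper may differ; this sketch assumes that surface runoff and potential infiltration split the surplus P − ET − ΔS.

```python
import numpy as np

# Per-pixel water balance sketch: partition the surplus (P - ET - dS) into
# surface runoff and potential ground water infiltration with a single
# F coefficient. An assumed form, not the paper's exact formulation.
P  = np.array([[120.0,  80.0], [ 60.0, 150.0]])   # precipitation (mm/month)
ET = np.array([[ 70.0,  65.0], [ 55.0,  90.0]])   # evapotranspiration (mm/month)
dS = np.array([[ 10.0,   5.0], [  0.0,  20.0]])   # soil-moisture storage change
F  = 0.6                                          # runoff fraction of the surplus

surplus = np.clip(P - ET - dS, 0.0, None)         # no negative surplus
runoff = F * surplus
infiltration = (1.0 - F) * surplus
print("runoff (mm):\n", runoff)
```

Summing `runoff` over the pixels of a basin and comparing against gauge records is the kind of check behind the reported R² and NSE scores for the Ca and Red River basins.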

  15. Unit testing, model validation, and biological simulation.

    Science.gov (United States)

    Sarma, Gopal P; Jacobs, Travis W; Watts, Mark D; Ghayoomie, S Vahid; Larson, Stephen D; Gerkin, Richard C

    2016-01-01

    The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
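The distinction drawn above between an ordinary unit test and a model validation test can be illustrated with Python's `unittest`. The toy model and tolerance band below are invented examples, not part of the OpenWorm test suite.

```python
import unittest

def simulate_membrane_potential(stimulus_na: float) -> float:
    """Toy stand-in for a neuron model: resting potential shifted by stimulus."""
    return -65.0 + 0.5 * stimulus_na

class TestUnits(unittest.TestCase):
    def test_resting_potential(self):
        # Unit test: exact behavior of the code with no stimulus.
        self.assertEqual(simulate_membrane_potential(0.0), -65.0)

class TestModelValidation(unittest.TestCase):
    def test_depolarization_within_observed_range(self):
        # Model validation test: output must fall within an empirically
        # plausible range (a tolerance band, not exact equality).
        v = simulate_membrane_potential(20.0)
        self.assertTrue(-60.0 <= v <= -50.0)

if __name__ == "__main__":
    unittest.main(argv=["model-validation-tests"], exit=False)
```

The unit test pins down the software's exact behavior; the validation test encodes a scientific expectation and fails only when the model leaves the empirically observed range.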

  16. Downscaling of GCM forecasts to streamflow over Scandinavia

    DEFF Research Database (Denmark)

    Nilsson, P.; Uvo, C.B.; Landman, W.A.

    2008-01-01

    A seasonal forecasting technique to produce probabilistic and deterministic streamflow forecasts for 23 basins in Norway and northern Sweden is developed in this work. Large-scale circulation and moisture fields, forecasted by the ECHAM4.5 model 4 months in advance, are used to forecast spring flows. The technique includes model output statistics (MOS) based on a non-linear Neural Network (NN) approach. Results show that streamflow forecasts from Global Circulation Model (GCM) predictions for the Scandinavia region are viable, and the highest skill values were found for basins located in south...

  17. MAAP4 model and validation status

    International Nuclear Information System (INIS)

    Plys, M.G.; Paik, C.Y.; Henry, R.E.; Wu, Chunder; Suh, K.Y.; Sung Jin Lee; McCartney, M.A.; Wang, Zhe

    1993-01-01

    The MAAP 4 code for integrated severe accident analysis is intended to be used for Level 1 and Level 2 probabilistic safety assessment and severe accident management evaluations for current and advanced light water reactors. MAAP 4 can be used to determine which accidents lead to fuel damage and which are successfully terminated before or after fuel damage (a Level 1 application). It can also be used to determine which sequences result in fission product release to the environment and provide the time history of such releases (a Level 2 application). The MAAP 4 thermal-hydraulic and fission product models and their validation are discussed here. This code is the newest version of MAAP offered by the Electric Power Research Institute (EPRI), and it contains substantial mechanistic improvements over its predecessor, MAAP 3.0B

  18. Seasonal Prediction of Taiwan's Streamflow Using Teleconnection Patterns

    Science.gov (United States)

    Chen, Chia-Jeng; Lee, Tsung-Yu

    2017-04-01

    Seasonal streamflow as an integrated response to complex hydro-climatic processes can be subject to activity of prevailing weather systems potentially modulated by large-scale climate oscillations (e.g., El Niño-Southern Oscillation, ENSO). To develop a seamless seasonal forecasting system in Taiwan, this study assesses how significant Taiwan's precipitation and streamflow in different seasons correlate with selected teleconnection patterns. Long-term precipitation and streamflow data in three major precipitation seasons, namely the spring rains (February to April), Mei-Yu (May and June), and typhoon (July to September) seasons, are derived at 28 upstream and 13 downstream catchments in Taiwan. The three seasons depict a complete wet period of Taiwan as well as many regions bearing similar climatic conditions in East Asia. Lagged correlation analysis is then performed to investigate how the precipitation and streamflow data correlate with predominant teleconnection indices at varied lead times. Teleconnection indices are selected only if they show certain linkage with weather systems and activity in the three seasons based on previous literature. For instance, the ENSO and Quasi-Biennial Oscillation, proven to influence East Asian climate across seasons and summer typhoon activity, respectively, are included in the list of climate indices for correlation analysis. Significant correlations found between Taiwan's precipitation and streamflow and teleconnection indices are further examined by a climate regime shift (CRS) test to identify any abrupt changes in the correlations. The understanding of existing CRS is useful for informing the forecasting system of the changes in the predictor-predictand relationship. To evaluate prediction skill in the three seasons and skill differences between precipitation and streamflow, hindcasting experiments of precipitation and streamflow are conducted using stepwise linear regression models. 
Discussion and suggestions for coping
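The lagged correlation analysis described above, which correlates a teleconnection index with later-season streamflow, can be sketched with synthetic series; the index, lag structure, and coefficients below are invented for illustration.

```python
import numpy as np

# Synthetic seasonal series: a climate index (e.g., an ENSO-like index) and
# streamflow that depends on the index one season earlier.
rng = np.random.default_rng(1)
index = rng.normal(size=200)                        # standardized climate index
flow = 0.7 * np.roll(index, 1) + rng.normal(scale=0.5, size=200)

def lagged_corr(x, y, lag):
    """Pearson correlation of x leading y by `lag` steps."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

for lag in range(4):
    print(f"lead {lag}: r = {lagged_corr(index, flow, lag):+.2f}")
```

A peak at a nonzero lead time is what makes the index usable as a predictor in the stepwise regression hindcasts; a climate regime shift test would then check whether that correlation is stable over sub-periods.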

  19. A Linear Dynamical Systems Approach to Streamflow Reconstruction Reveals History of Regime Shifts in Northern Thailand

    Science.gov (United States)

    Nguyen, Hung T. T.; Galelli, Stefano

    2018-03-01

    Catchment dynamics is not often modeled in streamflow reconstruction studies; yet, the streamflow generation process depends on both catchment state and climatic inputs. To explicitly account for this interaction, we contribute a linear dynamic model, in which streamflow is a function of both catchment state (i.e., wet/dry) and paleoclimatic proxies. The model is learned using a novel variant of the Expectation-Maximization algorithm, and it is used with a paleo drought record—the Monsoon Asia Drought Atlas—to reconstruct 406 years of streamflow for the Ping River (northern Thailand). Results for the instrumental period show that the dynamic model has higher accuracy than conventional linear regression; all performance scores improve by 45-497%. Furthermore, the reconstructed trajectory of the state variable provides valuable insights about the catchment history—e.g., regime-like behavior—thereby complementing the information contained in the reconstructed streamflow time series. The proposed technique can replace linear regression, since it only requires information on streamflow and climatic proxies (e.g., tree-rings, drought indices); furthermore, it is capable of readily generating stochastic streamflow replicates. With a marginal increase in computational requirements, the dynamic model brings more desirable features and value to streamflow reconstructions.
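A linear dynamical system of the kind described above pairs a latent catchment state with noisy streamflow observations. The sketch below implements only the forward (Kalman filtering) pass for a scalar state with assumed parameters; the paper's EM-based learning and paleoclimate proxy inputs are not reproduced here.

```python
import numpy as np

# Scalar linear dynamical system: latent catchment state x_t evolves linearly;
# streamflow y_t is a noisy linear observation of it. Parameters are assumed,
# not learned via EM as in the paper.
A, C = 0.8, 1.0          # state transition and observation coefficients
Q, R = 0.1, 0.5          # process and observation noise variances

rng = np.random.default_rng(3)
T = 100
x = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x[t] = A * x[t - 1] + rng.normal(scale=np.sqrt(Q))
    y[t] = C * x[t] + rng.normal(scale=np.sqrt(R))

# Forward (filtering) pass: recover the state from noisy observations.
mu, P = 0.0, 1.0
est = np.zeros(T)
for t in range(T):
    mu_pred, P_pred = A * mu, A * A * P + Q        # predict
    K = P_pred * C / (C * C * P_pred + R)          # Kalman gain
    mu = mu_pred + K * (y[t] - C * mu_pred)        # update
    P = (1 - K * C) * P_pred
    est[t] = mu

rmse_filter = np.sqrt(np.mean((est - x) ** 2))
rmse_raw = np.sqrt(np.mean((y - x) ** 2))
print(f"filter RMSE {rmse_filter:.3f} vs raw-observation RMSE {rmse_raw:.3f}")
```

The filtered state trajectory is the analogue of the reconstructed wet/dry catchment history the paper reports alongside the streamflow estimates.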

  20. Validation of A Global Hydrological Model

    Science.gov (United States)

    Doell, P.; Lehner, B.; Kaspar, F.; Vassolo, S.

    due to the precipitation measurement errors. Even though the explicit modeling of wetlands and lakes leads to a much improved modeling of both the vertical water balance and the lateral transport of water, not enough information is included in WGHM to accurately capture the hydrology of these water bodies. Certainly, the reliability of model results is highest at the locations at which WGHM was calibrated. The validation indicates that reliability for cells inside calibrated basins is satisfactory if the basin is relatively homogeneous. Analyses of the few available stations outside of calibrated basins indicate a reasonably high model reliability, particularly in humid regions.

  1. Developing a model for validation and prediction of bank customer ...

    African Journals Online (AJOL)

    Credit risk is the most important risk faced by banks. The bank's main approaches to reducing credit risk are correct validation using the final status and the validation model parameters. High levels of bank reserves and lost or outstanding bank facilities indicate the lack of appropriate validation models in the banking network.

  2. A proposed best practice model validation framework for banks

    Directory of Open Access Journals (Sweden)

    Pieter J. (Riaan) de Jongh

    2017-06-01

    Full Text Available Background: With the increasing use of complex quantitative models in applications throughout the financial world, model risk has become a major concern. The credit crisis of 2008–2009 provoked added concern about the use of models in finance. Measuring and managing model risk has subsequently come under scrutiny from regulators, supervisors, banks and other financial institutions. Regulatory guidance indicates that meticulous monitoring of all phases of model development and implementation is required to mitigate this risk. Considerable resources must be mobilised for this purpose. The exercise must embrace model development, assembly, implementation, validation and effective governance. Setting: Model validation practices are generally patchy, disparate and sometimes contradictory, and although the Basel Accord and some regulatory authorities have attempted to establish guiding principles, no definite set of global standards exists. Aim: Assessing the available literature for the best validation practices. Methods: This comprehensive literature study provided a background to the complexities of effective model management and focussed on model validation as a component of model risk management. Results: We propose a coherent ‘best practice’ framework for model validation. Scorecard tools are also presented to evaluate if the proposed best practice model validation framework has been adequately assembled and implemented. Conclusion: The proposed best practice model validation framework is designed to assist firms in the construction of an effective, robust and fully compliant model validation programme and comprises three principal elements: model validation governance, policy and process.

  3. Aerosol modelling and validation during ESCOMPTE 2001

    Science.gov (United States)

    Cousin, F.; Liousse, C.; Cachier, H.; Bessagnet, B.; Guillaume, B.; Rosset, R.

    The ESCOMPTE 2001 programme (Atmospheric Research. 69(3-4) (2004) 241) has resulted in an exhaustive set of dynamical, radiative, gas and aerosol observations (surface and aircraft measurements). A previous paper (Atmospheric Research. (2004) in press) has dealt with dynamics and gas-phase chemistry. The present paper is an extension to aerosol formation, transport and evolution. To account for important loadings of primary and secondary aerosols and their transformation processes in the ESCOMPTE domain, the ORganic and Inorganic Spectral Aerosol Module (ORISAM) (Atmospheric Environment. 35 (2001) 4751) was implemented on-line in the air-quality Meso-NH-C model. Additional developments have been introduced in ORISAM to improve the comparison between simulations and experimental surface and aircraft field data. This paper discusses this comparison for a simulation performed on one selected day, 24 June 2001, during the Intensive Observation Period IOP2b. Our work relies on BC and OCp emission inventories specifically developed for ESCOMPTE. This study confirms the need for a fine-resolution aerosol inventory with spectral chemical speciation. BC levels are satisfactorily reproduced, thus validating our emission inventory and its processing through Meso-NH-C. However, comparisons for reactive species generally denote an underestimation of concentrations. Organic aerosol levels are rather well simulated, though with a trend towards underestimation in the afternoon. Inorganic aerosol species are underestimated for several reasons, some of which have been identified. For sulphates, primary emissions were introduced. Improvement was also obtained for modelled nitrate and ammonium levels after introducing heterogeneous chemistry. However, the absence of terrigenous particle modelling is probably a major cause of the nitrate and ammonium underestimations. Particle numbers and size distributions are well reproduced, but only in the submicrometer range. 
Our work points out

  4. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means of accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. 
This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  5. Geochemistry Model Validation Report: Material Degradation and Release Model

    Energy Technology Data Exchange (ETDEWEB)

    H. Stockman

    2001-09-28

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17).

  6. Geochemistry Model Validation Report: Material Degradation and Release Model

    International Nuclear Information System (INIS)

    Stockman, H.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the Material Degradation and Release (MDR) model that predicts degradation and release of radionuclides from a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. This AMR is prepared according to ''Technical Work Plan for: Waste Package Design Description for LA'' (Ref. 17). The intended use of the MDR model is to estimate the long-term geochemical behavior of waste packages (WPs) containing U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The model is intended to predict (1) the extent to which criticality control material, such as gadolinium (Gd), will remain in the WP after corrosion of the initial WP, (2) the extent to which fissile Pu and uranium (U) will be carried out of the degraded WP by infiltrating water, and (3) the chemical composition and amounts of minerals and other solids left in the WP. The results of the model are intended for use in criticality calculations. The scope of the model validation report is to (1) describe the MDR model, and (2) compare the modeling results with experimental studies. A test case based on a degrading Pu-ceramic WP is provided to help explain the model. This model does not directly feed the assessment of system performance. The output from this model is used by several other models, such as the configuration generator, criticality, and criticality consequence models, prior to the evaluation of system performance. This document has been prepared according to AP-3.10Q, ''Analyses and Models'' (Ref. 2), and prepared in accordance with the technical work plan (Ref. 17)

  7. The effects of changing land cover on streamflow simulation in Puerto Rico

    Science.gov (United States)

    A.E. Van Beusekom; L.E. Hay; R.J. Viger; W.A. Gould; J.A. Collazo; A. Henareh Khalyani

    2014-01-01

    This study quantitatively explores whether land cover changes have a substantive impact on simulated streamflow within the tropical island setting of Puerto Rico. The Precipitation Runoff Modeling System (PRMS) was used to compare streamflow simulations based on five static parameterizations of land cover with those based on dynamically varying parameters derived from...

  8. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van 't Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage

  9. Spatiotemporal patterns of precipitation inferred from streamflow observations across the Sierra Nevada mountain range

    Science.gov (United States)

    Henn, Brian; Clark, Martyn P.; Kavetski, Dmitri; Newman, Andrew J.; Hughes, Mimi; McGurk, Bruce; Lundquist, Jessica D.

    2018-01-01

    Given uncertainty in precipitation gauge-based gridded datasets over complex terrain, we use multiple streamflow observations as an additional source of information about precipitation, in order to identify spatial and temporal differences between a gridded precipitation dataset and precipitation inferred from streamflow. We test whether gridded datasets capture across-crest and regional spatial patterns of variability, as well as year-to-year variability and trends in precipitation, in comparison to precipitation inferred from streamflow. We use a Bayesian model calibration routine with multiple lumped hydrologic model structures to infer the most likely basin-mean, water-year total precipitation for 56 basins with long-term (>30 year) streamflow records in the Sierra Nevada mountain range of California. We compare basin-mean precipitation derived from this approach with basin-mean precipitation from a precipitation gauge-based, 1/16° gridded dataset that has been used to simulate and evaluate trends in Western United States streamflow and snowpack over the 20th century. We find that the long-term average spatial patterns differ: in particular, there is less precipitation in the gridded dataset in higher-elevation basins whose aspect faces prevailing cool-season winds, as compared to precipitation inferred from streamflow. In a few years and basins, there is less gridded precipitation than there is observed streamflow. Lower-elevation, southern, and east-of-crest basins show better agreement between gridded and inferred precipitation. Implied actual evapotranspiration (calculated as precipitation minus streamflow) then also varies between the streamflow-based estimates and the gridded dataset. Absolute uncertainty in precipitation inferred from streamflow is substantial, but the signal of basin-to-basin and year-to-year differences are likely more robust. 
The findings suggest that considering streamflow when spatially distributing precipitation in complex terrain
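    The implied actual evapotranspiration in this record is simply the water-balance residual P minus Q. A minimal sketch of that check (basin names and all values below are illustrative, not the study's data):

```python
# Hypothetical basin-mean water-year totals in mm; values are illustrative only.
basins = {
    "upper_windward": {"p_inferred": 1450.0, "p_gridded": 1220.0, "q_observed": 980.0},
    "east_of_crest": {"p_inferred": 610.0, "p_gridded": 590.0, "q_observed": 230.0},
}

def implied_aet(p_mm, q_mm):
    """Implied actual evapotranspiration as the water-balance residual P - Q."""
    return p_mm - q_mm

for name, b in basins.items():
    aet_from_inferred = implied_aet(b["p_inferred"], b["q_observed"])
    aet_from_gridded = implied_aet(b["p_gridded"], b["q_observed"])
    # A gridded dataset that underestimates P also underestimates implied AET;
    # a negative residual flags years where gridded P is below observed streamflow.
    print(name, round(aet_from_inferred), round(aet_from_gridded))
```

    The gap between the two AET columns mirrors the gap between gridded and streamflow-inferred precipitation, which is why the record reports both.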

  10. Validation of the community radiative transfer model

    International Nuclear Information System (INIS)

    Ding Shouguo; Yang Ping; Weng Fuzhong; Liu Quanhua; Han Yong; Delst, Paul van; Li Jun; Baum, Bryan

    2011-01-01

    To validate the Community Radiative Transfer Model (CRTM) developed by the U.S. Joint Center for Satellite Data Assimilation (JCSDA), the discrete ordinate radiative transfer (DISORT) model and the line-by-line radiative transfer model (LBLRTM) are combined in order to provide a reference benchmark. Compared with the benchmark, the CRTM appears quite accurate for both clear sky and ice cloud radiance simulations, with RMS errors below 0.2 K, except for clouds with small ice particles. In a computer CPU run time comparison, the CRTM is faster than DISORT by approximately two orders of magnitude. Using the operational MODIS cloud products and the European Center for Medium-range Weather Forecasting (ECMWF) atmospheric profiles as input, the CRTM is employed to simulate the Atmospheric Infrared Sounder (AIRS) radiances. The CRTM simulations are shown to be in reasonably close agreement with the AIRS measurements (the discrepancies are within 2 K in terms of brightness temperature difference). Furthermore, the impact of uncertainties in the input cloud properties and atmospheric profiles on the CRTM simulations has been assessed. The CRTM-based brightness temperatures (BTs) at the top of the atmosphere (TOA), for both thin and thick clouds, are highly sensitive to uncertainties in atmospheric temperature and cloud top pressure. However, for an optically thick cloud, the CRTM-based BTs are not sensitive to the uncertainties of cloud optical thickness, effective particle size, and atmospheric humidity profiles. Conversely, for an optically thin cloud, the uncertainties of the CRTM-based TOA BTs resulting from effective particle size and optical thickness are not negligible.

  11. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. [DRAFT] DETC2015-46982, Development of a Conservative Model Validation Approach for Reliable... ...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the... 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  12. Validation of ecological state space models using the Laplace approximation

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Albertsen, Christoffer Moesgaard; Berg, Casper Willestofte

    2017-01-01

    Many statistical models in ecology follow the state space paradigm. For such models, the important step of model validation rarely receives as much attention as estimation or hypothesis testing, perhaps due to lack of available algorithms and software. Model validation is often based on a naive...... for estimation in general mixed effects models. Implementing one-step predictions in the R package Template Model Builder, we demonstrate that it is possible to perform model validation with little effort, even if the ecological model is multivariate, has non-linear dynamics, and whether observations...... useful directions in which the model could be improved....

  13. Validation of models in an imaging infrared simulation

    CSIR Research Space (South Africa)

    Willers, C

    2007-10-01

    Full Text Available [Figure residue: a model-validation cycle diagram showing three processes transforming information between the entities Reality/Problem Entity, Conceptual Model, and Computerized Model, via Analysis and Modelling, Computer Implementation, and Simulation and Experimentation, with the links labelled Model Validation, Model Verification, and Model Qualification. "Substantiation that a..."] J.C. Refsgaard, Modelling Guidelines - terminology and guiding principles, Advances in Water Resources, Vol. 27, No. 1, January 2004, pp. 71-82, Elsevier. [5] N. Oreskes et al., Verification, Validation, and Confirmation of Numerical Models in the Earth Sciences, Science, Vol. 263, Number...

  14. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    Full Text Available This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector, corresponding to the identified principal directions, of the validation system is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.
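    The coordinate-distance metric in this record can be sketched once the principal directions and interval bounds are in hand. The directions, bounds, and parameter-change vector below are toy values, and the SVD extraction of principal components is assumed to have been done already:

```python
# Sketch of the coordinate-distance metric for interval-model validation.
# Assumes the principal directions of parameter change were already extracted
# (e.g., via an SVD of the parameter-variation matrix); all numbers are toy values.

def coordinates(delta_params, principal_dirs):
    """Project a parameter-change vector onto orthonormal principal directions."""
    return [sum(d * p for d, p in zip(delta_params, direction))
            for direction in principal_dirs]

def coordinate_distance(coords, intervals):
    """Euclidean distance from the coordinate vector to the interval model:
    zero inside the intervals, otherwise distance to the nearest bounds."""
    total = 0.0
    for c, (lo, hi) in zip(coords, intervals):
        if c < lo:
            total += (lo - c) ** 2
        elif c > hi:
            total += (c - hi) ** 2
    return total ** 0.5

dirs = [[1.0, 0.0], [0.0, 1.0]]          # toy principal directions
intervals = [(-0.5, 0.5), (-0.2, 0.2)]   # toy interval-model bounds
coords = coordinates([0.8, 0.1], dirs)   # validation system's coordinates
print(coordinate_distance(coords, intervals))  # exceeds only the first bound, by ~0.3
```

    A distance of zero means the validation system falls inside the identified interval model; larger distances quantify how far outside it lies.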

  15. In Brief: Online database for instantaneous streamflow data

    Science.gov (United States)

    Showstack, Randy

    2007-11-01

    Access to U.S. Geological Survey (USGS) historical instantaneous streamflow discharge data, dating from around 1990, is now available online through the Instantaneous Data Archive (IDA), the USGS announced on 14 November. In this new system, users can find streamflow information reported at the time intervals at which it is collected, typically 15-minute to hourly intervals. Although instantaneous data have been available for many years, they were not accessible through the Internet. Robert Hirsch, USGS Associate Director of Water, said, "A user-friendly archive of historical instantaneous streamflow data is important to many different users for such things as floodplain mapping, flood modeling, and estimating pollutant transport." The site currently has about 1.5 billion instantaneous data values from 5500 stream gages in 26 states. The number of states and stream gages with data will continue to increase, according to the USGS. For more information, visit the Web site: http://ida.water.usgs.gov/ida/.

  16. Regression models of discharge and mean velocity associated with near-median streamflow conditions in Texas: utility of the U.S. Geological Survey discharge measurement database

    Science.gov (United States)

    Asquith, William H.

    2014-01-01

    A database containing more than 16,300 discharge values and ancillary hydraulic attributes was assembled from summaries of discharge measurement records for 391 USGS streamflow-gauging stations (streamgauges) in Texas. Each discharge is between the 40th- and 60th-percentile daily mean streamflow as determined by period-of-record, streamgauge-specific, flow-duration curves. Each discharge therefore is assumed to represent a discharge measurement made for near-median streamflow conditions, and such conditions are conceptualized as representative of midrange to baseflow conditions in much of the state. The hydraulic attributes of each discharge measurement included concomitant cross-section flow area, water-surface top width, and reported mean velocity. Two regression equations are presented: (1) an expression for discharge and (2) an expression for mean velocity, both as functions of selected hydraulic attributes and watershed characteristics. Specifically, the discharge equation uses cross-sectional area, water-surface top width, contributing drainage area of the watershed, and mean annual precipitation of the location; the equation has an adjusted R-squared of approximately 0.95 and residual standard error of approximately 0.23 base-10 logarithm (cubic meters per second). The mean velocity equation uses discharge, water-surface top width, contributing drainage area, and mean annual precipitation; the equation has an adjusted R-squared of approximately 0.50 and residual standard error of approximately 0.087 third root (meters per second). Residual plots from both equations indicate that reliable estimates of discharge and mean velocity at ungauged stream sites are possible. Further, the relation between contributing drainage area and main-channel slope (a measure of whole-watershed slope) is depicted to aid analyst judgment of equation applicability for ungauged sites. Example applications and computations are provided and discussed within a real-world, discharge
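    The discharge equation described here is linear in base-10 logarithms of the hydraulic and watershed attributes and is back-transformed to cubic meters per second. A sketch of that form with placeholder coefficients (the published regression coefficients are not reproduced here):

```python
import math

# Illustrative form of a log10-linear discharge regression, back-transformed to
# m^3/s. The coefficients below are placeholders, NOT the published values.
COEF = {"intercept": -2.0, "area": 0.9, "width": 0.2, "cda": 0.05, "map": 0.4}

def estimate_discharge(xs_area_m2, top_width_m, cda_km2, map_mm):
    """Estimate near-median discharge from cross-sectional area, water-surface
    top width, contributing drainage area, and mean annual precipitation."""
    log_q = (COEF["intercept"]
             + COEF["area"] * math.log10(xs_area_m2)
             + COEF["width"] * math.log10(top_width_m)
             + COEF["cda"] * math.log10(cda_km2)
             + COEF["map"] * math.log10(map_mm))
    return 10.0 ** log_q

q = estimate_discharge(xs_area_m2=12.0, top_width_m=15.0, cda_km2=850.0, map_mm=900.0)
```

    Note that the reported residual standard error of about 0.23 log10 units implies a multiplicative uncertainty of roughly 10**0.23 on any back-transformed estimate.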

  17. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.
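    The annualized prediction error used as the acceptance criterion here (below 3% for fixed tilt, below 8% for one-axis tracking) is a relative error over summed generation. A sketch with illustrative numbers:

```python
# Sketch of the annualized prediction error used to compare SAM predictions
# with measured production; the kWh figures below are illustrative.
def annualized_error_pct(modeled_kwh, measured_kwh):
    """Percent error of total modeled generation relative to total measured."""
    return 100.0 * (sum(modeled_kwh) - sum(measured_kwh)) / sum(measured_kwh)

modeled = [9800.0, 11200.0, 12100.0, 10400.0]    # quarterly modeled kWh
measured = [10000.0, 11000.0, 12500.0, 10200.0]  # quarterly measured kWh
err = annualized_error_pct(modeled, measured)
# Per this report: |err| < 3% for fixed tilt, < 8% for one-axis tracking.
print(f"{err:+.2f}%")
```

    Aggregating to annual totals is deliberate: it lets seasonal biases (e.g., snow cover in winter) partially cancel, which is why the report examines them separately.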

  18. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  19. A comprehensive model for piezoceramic actuators: modelling, validation and application

    International Nuclear Information System (INIS)

    Quant, Mario; Elizalde, Hugo; Flores, Abiud; Ramírez, Ricardo; Orta, Pedro; Song, Gangbing

    2009-01-01

    This paper presents a comprehensive model for piezoceramic actuators (PAs), which accounts for hysteresis, non-linear electric field and dynamic effects. The hysteresis model is based on the widely used general Maxwell slip model, while an enhanced electro-mechanical non-linear model replaces the linear constitutive equations commonly used. Further on, a linear second order model compensates the frequency response of the actuator. Each individual model is fully characterized from experimental data yielded by a specific PA, then incorporated into a comprehensive 'direct' model able to determine the output strain based on the applied input voltage, fully compensating the aforementioned effects, where the term 'direct' represents an electrical-to-mechanical operating path. The 'direct' model was implemented in a Matlab/Simulink environment and successfully validated via experimental results, exhibiting higher accuracy and simplicity than many published models. This simplicity would allow a straightforward inclusion of other behaviour such as creep, ageing, material non-linearity, etc, if such parameters are important for a particular application. Based on the same formulation, two other models are also presented: the first is an 'alternate' model intended to operate within a force-controlled scheme (instead of a displacement/position control), thus able to capture the complex mechanical interactions occurring between a PA and its host structure. The second development is an 'inverse' model, able to operate within an open-loop control scheme, that is, yielding a 'linearized' PA behaviour. The performance of the developed models is demonstrated via a numerical sample case simulated in Matlab/Simulink, consisting of a PA coupled to a simple mechanical system, aimed at shifting the natural frequency of the latter

  20. Response of streamflow to projected climate change scenarios in an ...

    Indian Academy of Sciences (India)

    A snowmelt run-off model (SRM) based on the degree-day approach has been employed to evaluate the change in snow-cover depletion and corresponding streamflow under different projected climatic scenarios for an eastern Himalayan catchment in India. Nuranang catchment located at Tawang district of Arunachal Pradesh ...

  1. An Approach to Flooding Inundation Combining the Streamflow Prediction Tool (SPT) and Downscaled Soil Moisture

    Science.gov (United States)

    Cotterman, K. A.; Follum, M. L.; Pradhan, N. R.; Niemann, J. D.

    2017-12-01

    Flooding impacts numerous aspects of society, from localized flash floods to continental-scale flood events. Many numerical flood models focus solely on riverine flooding, with some capable of capturing both localized and continental-scale flood events. However, these models neglect flooding away from channels that are related to excessive ponding, typically found in areas with flat terrain and poorly draining soils. In order to obtain a holistic view of flooding, we combine flood results from the Streamflow Prediction Tool (SPT), a riverine flood model, with soil moisture downscaling techniques to determine if a better representation of flooding is obtained. This allows for a more holistic understanding of potential flood prone areas, increasing the opportunity for more accurate warnings and evacuations during flooding conditions. Thirty-five years of near-global historical streamflow is reconstructed with continental-scale flow routing of runoff from global land surface models. Elevation data was also obtained worldwide, to establish a relationship between topographic attributes and soil moisture patterns. Derived soil moisture data is validated against observed soil moisture, increasing confidence in the ability to accurately capture soil moisture patterns. Potential flooding situations can be examined worldwide, with this study focusing on the United States, Central America, and the Philippines.

  2. seawaveQ: an R package providing a model and utilities for analyzing trends in chemical concentrations in streams with a seasonal wave (seawave) and adjustment for streamflow (Q) and other ancillary variables

    Science.gov (United States)

    Ryberg, Karen R.; Vecchia, Aldo V.

    2013-01-01

    The seawaveQ R package fits a parametric regression model (seawaveQ) to pesticide concentration data from streamwater samples to assess variability and trends. The model incorporates the strong seasonality and high degree of censoring common in pesticide data and users can incorporate numerous ancillary variables, such as streamflow anomalies. The model is fitted to pesticide data using maximum likelihood methods for censored data and is robust in terms of pesticide, stream location, and degree of censoring of the concentration data. This R package standardizes this methodology for trend analysis, documents the code, and provides help and tutorial information, as well as providing additional utility functions for plotting pesticide and other chemical concentration data.

  3. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  4. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  5. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
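    A permutation test of model performance, as recommended in this record, compares the observed performance statistic against its distribution under shuffled outcome labels. A minimal standard-library sketch using AUC as the statistic (toy data; the paper's LASSO NTCP fits are not reproduced here):

```python
import random

# Minimal permutation test of classifier performance (illustrative toy data).
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank identity."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def permutation_p_value(scores, labels, n_perm=2000, seed=1):
    """Fraction of label shuffles whose AUC meets or beats the observed AUC."""
    rng = random.Random(seed)
    observed = auc(scores, labels)
    shuffled = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if auc(scores, shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps p > 0

scores = [0.9, 0.8, 0.6, 0.35, 0.2, 0.1]   # model outputs (e.g., predicted NTCP)
labels = [1, 1, 1, 0, 0, 0]                # observed complications
p = permutation_p_value(scores, labels)
```

    A small p indicates the apparent performance is unlikely under the null of no score-outcome association; the paper pairs this with repeated double cross-validation to expose model instability.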

  6. Validation of mentorship model for newly qualified professional ...

    African Journals Online (AJOL)

    Newly qualified professional nurses (NQPNs) allocated to community health care services require the use of a validated model to practice independently. Validation was done to adapt the model and to assess whether it is understood and could be implemented by NQPNs and mentors employed in community health care services.

  7. Validation and Adaptation of Router and Switch Models

    NARCIS (Netherlands)

    Boltjes, B.; Fernandez Diaz, I.; Kock, B.A.; Langeveld, R.J.G.M.; Schoenmaker, G.

    2003-01-01

    This paper describes validating OPNET models of key devices for the next generation IP-based tactical network of the Royal Netherlands Army (RNLA). The task of TNO-FEL is to provide insight into scalability and performance of future deployed networks. Because validated models of key Cisco equipment

  8. Geologic and climatic controls on streamflow generation processes in a complex eogenetic karst basin

    Science.gov (United States)

    Vibhava, F.; Graham, W. D.; Maxwell, R. M.

    2012-12-01

    Streamflow at any given location and time is representative of surface and subsurface contributions from various sources. The ability to fully identify the factors controlling these contributions is key to successfully understanding the transport of contaminants through the system. In this study we developed a fully integrated 3D surface water-groundwater-land surface model, PARFLOW, to evaluate geologic and climatic controls on streamflow generation processes in a complex eogenetic karst basin in North Central Florida. In addition to traditional model evaluation criterion, such as comparing field observations to model simulated streamflow and groundwater elevations, we quantitatively evaluated the model's predictions of surface-groundwater interactions over space and time using a suite of binary end-member mixing models that were developed using observed specific conductivity differences among surface and groundwater sources throughout the domain. Analysis of model predictions showed that geologic heterogeneity exerts a strong control on both streamflow generation processes and land atmospheric fluxes in this watershed. In the upper basin, where the karst aquifer is overlain by a thick confining layer, approximately 92% of streamflow is "young" event flow, produced by near stream rainfall. Throughout the upper basin the confining layer produces a persistent high surficial water table which results in high evapotranspiration, low groundwater recharge and thus negligible "inter-event" streamflow. In the lower basin, where the karst aquifer is unconfined, deeper water tables result in less evapotranspiration. Thus, over 80% of the streamflow is "old" subsurface flow produced by diffuse infiltration through the epikarst throughout the lower basin, and all surface contributions to streamflow originate in the upper confined basin. Climatic variability provides a secondary control on surface-subsurface and land-atmosphere fluxes, producing significant seasonal and
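    The binary end-member mixing models used here follow from a two-component mass balance on specific conductivity. A sketch with illustrative tracer values (not the study's data):

```python
# Two-component (binary) end-member mixing on specific conductivity (SC):
# sc_stream = f * sc_event + (1 - f) * sc_old. Tracer values are illustrative.
def event_fraction(sc_stream, sc_event, sc_old):
    """Fraction of streamflow from the 'young' event-water end-member."""
    return (sc_old - sc_stream) / (sc_old - sc_event)

# Upper (confined) basin: stream SC near low-SC rainfall -> event-dominated.
f_upper = event_fraction(sc_stream=60.0, sc_event=40.0, sc_old=280.0)
# Lower (unconfined karst) basin: stream SC near groundwater -> old-water-dominated.
f_lower = event_fraction(sc_stream=255.0, sc_event=40.0, sc_old=280.0)
print(round(f_upper, 2), round(f_lower, 2))
```

    The separation only works when the two end-member conductivities are well distinguished, which is the case between dilute rainfall and mineralized karst groundwater.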

  9. Cost model validation: a technical and cultural approach

    Science.gov (United States)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  10. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  11. IVIM: modeling, experimental validation and application to animal models

    International Nuclear Information System (INIS)

    Fournet, Gabrielle

    2016-01-01

    This PhD thesis is centered on the study of the IVIM ('Intravoxel Incoherent Motion') MRI sequence. This sequence allows for the study of the blood microvasculature such as the capillaries, arterioles and venules. To be sensitive only to moving groups of spins, diffusion gradients are added before and after the 180 degrees pulse of a spin echo (SE) sequence. The signal component corresponding to spins diffusing in the tissue can be separated from the one related to spins travelling in the blood vessels, which is called the IVIM signal. These two components are weighted by f_IVIM, which represents the volume fraction of blood inside the tissue. The IVIM signal is usually modelled by a mono-exponential (ME) function and characterized by a pseudo-diffusion coefficient, D*. We propose instead a bi-exponential IVIM model consisting of a slow pool, characterized by F_slow and D*_slow, corresponding to the capillaries as in the ME model, and a fast pool, characterized by F_fast and D*_fast, related to larger vessels such as medium-size arterioles and venules. This model was validated experimentally and more information was retrieved by comparing the experimental signals to a dictionary of simulated IVIM signals. The influence of the pulse sequence, the repetition time and the diffusion encoding time was also studied. Finally, the IVIM sequence was applied to the study of an animal model of Alzheimer's disease. (author)

  12. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    OpenAIRE

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-01-01

    The Soil and Water Assessment Tool (SWAT) is a well-established, distributed, eco-hydrologic model. However, using the study case of an intensively irrigated agricultural watershed, it was shown that none of the model versions is able to appropriately reproduce the total streamflow in such a system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version to correctly simulate the main hydrological processes. Crop yield, total streamfl...

  13. The concept of validation of numerical models for consequence analysis

    International Nuclear Information System (INIS)

    Borg, Audun; Paulsen Husted, Bjarne; Njå, Ove

    2014-01-01

    Numerical models such as computational fluid dynamics (CFD) models are increasingly used in life safety studies and other types of analyses to calculate the effects of fire and explosions. The validity of these models is usually established by benchmark testing. This is done to quantitatively measure the agreement between the predictions provided by the model and the real world represented by observations in experiments. This approach assumes that all variables in the real world relevant for the specific study are adequately measured in the experiments and in the predictions made by the model. In this paper the various definitions of validation for CFD models used for hazard prediction are investigated to assess their implication for consequence analysis in a design phase. In other words, how is uncertainty in the prediction of future events reflected in the validation process? The sources of uncertainty are viewed from the perspective of the safety engineer. An example of the use of a CFD model is included to illustrate the assumptions the analyst must make and how these affect the prediction made by the model. The assessments presented in this paper are based on a review of standards and best practice guides for CFD modeling and the documentation from two existing CFD programs. Our main thrust has been to assess how validation work is performed and communicated in practice. We conclude that the concept of validation adopted for numerical models is adequate in terms of model performance. However, it does not address the main sources of uncertainty from the perspective of the safety engineer. Uncertainty in the input quantities describing future events, which are determined by the model user, outweighs the inaccuracies in the model as reported in validation studies. - Highlights: • Examine the basic concept of validation applied to models for consequence analysis. • Review standards and guides for validation of numerical models. • Comparison of the validation

  14. Models for Validation of Prior Learning (VPL)

    DEFF Research Database (Denmark)

    Ehlers, Søren

    The national policies for the education/training of adults are in the 21st century highly influenced by proposals formulated and promoted by the European Union (EU) as well as other transnational players, and this shift in policy making has consequences. One is that ideas which in the past...... would have been categorized as utopian can become realpolitik. Validation of Prior Learning (VPL) was in Europe mainly regarded as utopian, while universities in the United States of America (USA) were developing ways to award credit to students coming with experience from working life....

  15. The effects of changing land cover on streamflow simulation in Puerto Rico

    Science.gov (United States)

    Van Beusekom, Ashley; Hay, Lauren E.; Viger, Roland; Gould, William A.; Collazo, Jaime; Henareh Khalyani, Azad

    2014-01-01

    This study quantitatively explores whether land cover changes have a substantive impact on simulated streamflow within the tropical island setting of Puerto Rico. The Precipitation Runoff Modeling System (PRMS) was used to compare streamflow simulations based on five static parameterizations of land cover with those based on dynamically varying parameters derived from four land cover scenes for the period 1953-2012. The PRMS simulations based on static land cover illustrated consistent differences in simulated streamflow across the island. It was determined that the scale of the analysis makes a difference: large regions with localized areas that have undergone dramatic land cover change may show negligible difference in total streamflow, but streamflow simulations using dynamic land cover parameters for a highly altered subwatershed clearly demonstrate the effects of changing land cover on simulated streamflow. Incorporating dynamic parameterization in these highly altered watersheds can reduce the predictive uncertainty in simulations of streamflow using PRMS. Hydrologic models that do not consider the projected changes in land cover may be inadequate for water resource management planning for future conditions.

  16. Multi-site Stochastic Simulation of Daily Streamflow with Markov Chain and KNN Algorithm

    Science.gov (United States)

    Mathai, J.; Mujumdar, P.

    2017-12-01

    A key focus of this study is to develop a method that is physically consistent with the hydrologic processes and can capture short-term characteristics of the daily hydrograph as well as the correlation of streamflow in the temporal and spatial domains. In complex water resource systems, flow fluctuations at small time intervals require that discretisation be done at small time scales such as daily scales. Also, simultaneous generation of synthetic flows at different sites in the same basin is required. We propose a method to equip water managers with a streamflow generator within a stochastic streamflow simulation framework. The motivation for the proposed method is to generate sequences that extend beyond the variability represented in the historical record of streamflow time series. The method has two steps: in step 1, daily flow is generated independently at each station by a two-state Markov chain, with rising-limb increments randomly sampled from a Gamma distribution and the falling limb modelled as an exponential recession; in step 2, the streamflow generated in step 1 is input to a nonparametric K-nearest neighbor (KNN) time series bootstrap resampler. The KNN model, being data driven, does not require assumptions on the dependence structure of the time series. A major limitation of KNN-based streamflow generators is that they do not produce new values, but merely reshuffle the historical data to generate realistic streamflow sequences. However, daily flow generated using the Markov chain approach is capable of generating a rich variety of streamflow sequences. Furthermore, the rising and falling limbs of the daily hydrograph represent different physical processes, and hence they need to be modelled individually. Thus, our method combines the strengths of the two approaches. We show the utility of the method and the improvement over the traditional KNN by simulating daily streamflow sequences at 7 locations in the Godavari River basin in India.
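
Step 1 of the two-step method can be sketched as follows (step 2, the KNN bootstrap, would then resample such sequences); all parameter values here are illustrative, not the paper's:

```python
import random

def generate_daily_flow(n_days, p_rise_after_rise=0.5, p_rise_after_fall=0.25,
                        gamma_shape=2.0, gamma_scale=5.0, recession_k=0.9,
                        q0=10.0, seed=42):
    """Sketch of the two-state (rising/falling) Markov-chain daily flow
    generator: rising-limb increments are Gamma-distributed and the falling
    limb follows an exponential recession Q_t = k * Q_{t-1}."""
    rng = random.Random(seed)
    q = [q0]
    rising = False
    for _ in range(n_days - 1):
        # Markov transition: probability of rising depends on current state
        p_rise = p_rise_after_rise if rising else p_rise_after_fall
        rising = rng.random() < p_rise
        if rising:
            q.append(q[-1] + rng.gammavariate(gamma_shape, gamma_scale))
        else:
            q.append(recession_k * q[-1])
    return q

flows = generate_daily_flow(365)
print(len(flows), all(f > 0 for f in flows))  # → 365 True
```

Because the Gamma increments are drawn continuously, the generated hydrographs contain values never seen in the historical record, which is exactly the property the subsequent KNN resampling step lacks on its own.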

  17. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    Directory of Open Access Journals (Sweden)

    Daan Nieboer

    Full Text Available External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: (1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and (2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
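
The c-statistic that both the permutation test and the benchmark values revolve around is simply the concordance probability, which can be computed directly; the toy data below are illustrative:

```python
def c_statistic(y_true, y_score):
    """Concordance (c-statistic / AUC): the probability that a randomly
    chosen event receives a higher predicted score than a randomly chosen
    non-event; ties count as half-concordant."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# toy development-set predictions: 1 = event, 0 = non-event
print(c_statistic([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # → 0.75
```

At external validation, the same statistic is computed on the new population and compared against a benchmark value that reflects the expected drop due to case-mix differences alone.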

  18. Analytical flow duration curves for summer streamflow in Switzerland

    Science.gov (United States)

    Santos, Ana Clara; Portela, Maria Manuela; Rinaldo, Andrea; Schaefli, Bettina

    2018-04-01

    This paper proposes a systematic assessment of the performance of an analytical modeling framework for streamflow probability distributions for a set of 25 Swiss catchments. These catchments show a wide range of hydroclimatic regimes, including snow-influenced streamflows. The model parameters are calculated from a spatially averaged gridded daily precipitation data set and from observed daily discharge time series, both in a forward estimation mode (direct parameter calculation from observed data) and in an inverse estimation mode (maximum likelihood estimation). The performance of the linear and the nonlinear model versions is assessed in terms of reproducing observed flow duration curves and their natural variability. Overall, the nonlinear model version outperforms the linear model for all regimes, but the linear model shows a notable performance increase with catchment elevation. More importantly, the obtained results demonstrate that the analytical model performs well for summer discharge for all analyzed streamflow regimes, ranging from rainfall-driven regimes with summer low flow to snow and glacier regimes with summer high flow. These results suggest that the model's encoding of discharge-generating events based on stochastic soil moisture dynamics is more flexible than previously thought. As shown in this paper, the presence of snowmelt or ice melt is accommodated by a relative increase in the discharge-generating frequency, a key parameter of the model. Explicit quantification of this frequency increase as a function of mean catchment meteorological conditions is left for future research.
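
The observed flow duration curves that the analytical model is benchmarked against are, in their empirical form, just sorted discharges paired with exceedance probabilities; a minimal sketch using the Weibull plotting position (the discharge data are illustrative):

```python
def flow_duration_curve(flows):
    """Empirical flow duration curve: discharges sorted in descending
    order, each paired with its exceedance probability via the Weibull
    plotting position p_i = i / (n + 1)."""
    q_sorted = sorted(flows, reverse=True)
    n = len(q_sorted)
    return [((i + 1) / (n + 1), q) for i, q in enumerate(q_sorted)]

# ten days of illustrative summer discharge (m^3/s)
fdc = flow_duration_curve([5, 12, 7, 30, 9, 15, 3, 22, 11, 8])
print(fdc[0], fdc[-1])  # highest flow is exceeded least often
```

An analytical framework like the one assessed here replaces this empirical curve with a closed-form streamflow distribution whose parameters come from precipitation statistics and catchment properties.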

  19. Variance analysis of forecasted streamflow maxima in a wet temperate climate

    Science.gov (United States)

    Al Aamery, Nabil; Fox, James F.; Snyder, Mark; Chandramouli, Chandra V.

    2018-05-01

    Coupling global climate models, hydrologic models and extreme value analysis provides a method to forecast streamflow maxima; however, the elusive variance structure of the results hinders confidence in application. Directly correcting the bias of forecasts using the relative change between forecast and control simulations has been shown to marginalize hydrologic uncertainty, reduce model bias, and remove systematic variance when predicting mean monthly and mean annual streamflow, prompting our investigation for streamflow maxima. We assess the variance structure of streamflow maxima using realizations of emission scenario, global climate model type and project phase, downscaling methods, bias correction, extreme value methods, and hydrologic model inputs and parameterization. Results show that the relative change of streamflow maxima was not dependent on systematic variance from the annual maxima versus peak-over-threshold method applied, albeit we stress that researchers should strictly adhere to rules from extreme value theory when applying the peak-over-threshold method. Regardless of which method is applied, extreme value model fitting does add variance to the projection, and the variance is an increasing function of the return period. Unlike the relative change of mean streamflow, results show that the variance of the maxima's relative change was dependent on all climate model factors tested as well as hydrologic model inputs and calibration. Ensemble projections forecast an increase of streamflow maxima for 2050 with pronounced forecast standard error, including an increase of +30(±21), +38(±34) and +51(±85)% for 2, 20 and 100 year streamflow events for the wet temperate region studied. The variance of maxima projections was dominated by climate model factors and extreme value analyses.
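
The annual-maxima branch of such an extreme value analysis can be illustrated with a Gumbel (EV1) fit by the method of moments, one common simplified choice; both the distribution choice and the data below are illustrative, not the study's:

```python
import math
import statistics

def gumbel_return_level(annual_maxima, return_period):
    """Fit a Gumbel (EV1) distribution to annual maxima by the method of
    moments and return the T-year event magnitude."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi      # scale parameter
    mu = mean - 0.5772 * beta                # location (Euler-Mascheroni constant)
    p_non_exceed = 1.0 - 1.0 / return_period
    return mu - beta * math.log(-math.log(p_non_exceed))

# illustrative record of annual peak flows (m^3/s)
maxima = [120, 95, 140, 180, 110, 160, 130, 150, 100, 170]
q100 = gumbel_return_level(maxima, 100)
print(q100 > max(maxima))  # → True: the 100-yr event exceeds the record max
```

Propagating an ensemble of climate realizations through such a fit, rather than a single series, is what exposes the return-period-dependent variance the abstract describes.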

  20. Experimental Validation of Flow Force Models for Fast Switching Valves

    DEFF Research Database (Denmark)

    Bender, Niels Christian; Pedersen, Henrik Clemmensen; Nørgård, Christian

    2017-01-01

    This paper comprises a detailed study of the forces acting on a Fast Switching Valve (FSV) plunger. The objective is to investigate to what extent different models are valid to be used for design purposes. These models depend on the geometry of the moving plunger and the properties of the surrounding … to compare and validate different models, where an effort is directed towards capturing the fluid squeeze effect just before material-on-material contact. The test data is compared with simulation data relying solely on analytic formulations. The general dynamics of the plunger is validated...

  1. Validation of elk resource selection models with spatially independent data

    Science.gov (United States)

    Priscilla K. Coe; Bruce K. Johnson; Michael J. Wisdom; John G. Cook; Marty Vavra; Ryan M. Nielson

    2011-01-01

    Knowledge of how landscape features affect wildlife resource use is essential for informed management. Resource selection functions often are used to make and validate predictions about landscape use; however, resource selection functions are rarely validated with data from landscapes independent of those from which the models were built. This problem has severely...

  2. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  3. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  4. Amendment to Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    The purpose of WP2 is to establish flow models relating the wind speed at turbines in a farm. Until now, active control of power reference has not been included in these models as only data with standard operation has been available. In this report the first data series with power reference excitation … turbine in undisturbed flow. For this data set both the multiplicative model and in particular the simple first-order transfer function model can predict the downwind wind speed from upwind wind speed and loading.

  5. Low Streamflow Forecasting using Minimum Relative Entropy

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation such that the relative entropy of the underlying process is minimized, allowing the time series to be forecasted. Different prior estimates, such as uniform, exponential and Gaussian assumptions, are used to estimate the spectral density depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectrum of the low streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.

  6. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for analysis of hydrological droughts. Using an artificial neural network (ANN) model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF) per year) in the Sacramento River basin could be considered a critical level of drought for water shortages.
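
The bivariate return period calculation behind such a copula analysis can be sketched for the common "both variables exceed their thresholds" (AND) case with a Gumbel-Hougaard copula; the marginal quantiles and dependence parameter below are illustrative, not the study's fitted values:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence
    (theta = 1 is independence)."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1 / theta)))

def joint_and_return_period(u, v, theta):
    """Return period of 'both variables exceed their u- and v-quantiles'
    (AND case), assuming one drought event per year on average."""
    p_joint_exceed = 1 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / p_joint_exceed

# severity and duration both at their 95th percentile, moderate dependence
T = joint_and_return_period(0.95, 0.95, theta=2.0)
print(round(T, 1))  # → 33.3 (years)
```

Note that positive dependence shortens the joint return period relative to the independent case (which here would be 400 years), which is why drought severity and duration must be modelled jointly.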

  7. A validated physical model of greenhouse climate.

    NARCIS (Netherlands)

    Bot, G.P.A.

    1989-01-01

    In the greenhouse model the momentaneous environmental crop growth factors are calculated as output, together with the physical behaviour of the crop. The boundary conditions for this model are the outside weather conditions; other inputs are the physical characteristics of the crop, of the

  8. Propagation of soil moisture memory to streamflow and evapotranspiration in Europe

    Science.gov (United States)

    Orth, R.; Seneviratne, S. I.

    2013-10-01

    As a key variable of the land-climate system soil moisture is a main driver of streamflow and evapotranspiration under certain conditions. Soil moisture furthermore exhibits outstanding memory (persistence) characteristics. Many studies also report distinct low frequency variations for streamflow, which are likely related to soil moisture memory. Using data from over 100 near-natural catchments located across Europe, we investigate in this study the connection between soil moisture memory and the respective memory of streamflow and evapotranspiration on different time scales. For this purpose we use a simple water balance model in which dependencies of runoff (normalised by precipitation) and evapotranspiration (normalised by radiation) on soil moisture are fitted using streamflow observations. The model therefore allows us to compute the memory characteristics of soil moisture, streamflow and evapotranspiration on the catchment scale. We find considerable memory in soil moisture and streamflow in many parts of the continent, and evapotranspiration also displays some memory at monthly time scale in some catchments. We show that the memory of streamflow and evapotranspiration jointly depend on soil moisture memory and on the strength of the coupling of streamflow and evapotranspiration to soil moisture. Furthermore, we find that the coupling strengths of streamflow and evapotranspiration to soil moisture depend on the shape of the fitted dependencies and on the variance of the meteorological forcing. To better interpret the magnitude of the respective memories across Europe, we finally provide a new perspective on hydrological memory by relating it to the mean duration required to recover from anomalies exceeding a certain threshold.
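
Memory (persistence) in soil moisture and streamflow series is typically quantified through lag autocorrelation, which decays more slowly the stronger the memory; a minimal sketch on a synthetic AR(1) anomaly series (illustrative, not catchment data):

```python
import random

def lag_autocorrelation(series, lag):
    """Lag-k autocorrelation, the usual statistic behind the 'memory'
    (persistence) of soil moisture or streamflow anomalies."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t + lag] - mean)
              for t in range(n - lag))
    return cov / var

# synthetic monthly anomaly series with strong persistence (phi = 0.8)
rng = random.Random(0)
x = [0.0]
for _ in range(999):
    x.append(0.8 * x[-1] + rng.gauss(0, 1))

r1, r6 = lag_autocorrelation(x, 1), lag_autocorrelation(x, 6)
print(r1 > r6)  # → True: persistence decays with increasing lag
```

In the catchment analysis described above, the corresponding statistic is computed for soil moisture, streamflow and evapotranspiration separately, and the decay rates are then compared across time scales.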

  9. Quantifying streamflow change caused by forest disturbance at a large spatial scale: A single watershed study

    Science.gov (United States)

    Wei, Xiaohua; Zhang, Mingfang

    2010-12-01

    Climatic variability and forest disturbance are commonly recognized as two major drivers influencing streamflow change in large-scale forested watersheds. The greatest challenge in evaluating quantitative hydrological effects of forest disturbance is the removal of the climatic effect on hydrology. In this paper, a method was designed to quantify the respective contributions of large-scale forest disturbance and climatic variability on streamflow using the Willow River watershed (2860 km2) located in the central part of British Columbia, Canada. Long-term (>50 years) data on hydrology, climate, and timber harvesting history represented by equivalent clear-cutting area (ECA) were available to discern climatic and forestry influences on streamflow in three steps. First, effective precipitation, an integrated climatic index, was generated by subtracting evapotranspiration from precipitation. Second, modified double mass curves were developed by plotting accumulated annual streamflow against annual effective precipitation, which presented a much clearer picture of the cumulative effects of forest disturbance on streamflow following removal of climatic influence. The average annual streamflow changes that were attributed to forest disturbances and climatic variability were then estimated to be +58.7 and -72.4 mm, respectively. The positive (increasing) and negative (decreasing) values in streamflow change indicated opposite change directions, which suggest an offsetting effect between forest disturbance and climatic variability in the study watershed. Finally, a multivariate Autoregressive Integrated Moving Average (ARIMA) model was generated to establish quantitative relationships between accumulated annual streamflow deviation attributed to forest disturbances and annual ECA. The model was then used to project streamflow change under various timber harvesting scenarios. The methodology can be effectively applied to any large-scale single watershed where long-term data (>50
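
The modified double mass curve of the second step can be sketched as follows: cumulative streamflow is plotted against cumulative effective precipitation, and a change in slope after the disturbance year signals a non-climatic (e.g. forest disturbance) effect. The data below are synthetic and purely illustrative:

```python
from itertools import accumulate

def double_mass_slopes(effective_precip, streamflow, break_year_idx):
    """Slopes of cumulative streamflow vs. cumulative effective
    precipitation before and after a known disturbance year; a slope
    change indicates a non-climatic effect on streamflow."""
    cp = list(accumulate(effective_precip))
    cq = list(accumulate(streamflow))

    def slope(i0, i1):  # least-squares slope through the segment
        n = i1 - i0
        xs, ys = cp[i0:i1], cq[i0:i1]
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    return slope(0, break_year_idx), slope(break_year_idx, len(cp))

# 20 synthetic years: streamflow jumps after harvesting in year 10
precip = [500] * 20
flow = [250] * 10 + [300] * 10
s_before, s_after = double_mass_slopes(precip, flow, 10)
print(round(s_before, 2), round(s_after, 2))  # → 0.5 0.6
```

The difference between the two slopes, accumulated over the post-disturbance years, is the streamflow deviation that the ARIMA model then relates to the annual ECA.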

  10. Propagation of soil moisture memory to streamflow and evapotranspiration in Europe

    Directory of Open Access Journals (Sweden)

    R. Orth

    2013-10-01

    Full Text Available As a key variable of the land-climate system soil moisture is a main driver of streamflow and evapotranspiration under certain conditions. Soil moisture furthermore exhibits outstanding memory (persistence) characteristics. Many studies also report distinct low frequency variations for streamflow, which are likely related to soil moisture memory. Using data from over 100 near-natural catchments located across Europe, we investigate in this study the connection between soil moisture memory and the respective memory of streamflow and evapotranspiration on different time scales. For this purpose we use a simple water balance model in which dependencies of runoff (normalised by precipitation) and evapotranspiration (normalised by radiation) on soil moisture are fitted using streamflow observations. The model therefore allows us to compute the memory characteristics of soil moisture, streamflow and evapotranspiration on the catchment scale. We find considerable memory in soil moisture and streamflow in many parts of the continent, and evapotranspiration also displays some memory at monthly time scale in some catchments. We show that the memory of streamflow and evapotranspiration jointly depend on soil moisture memory and on the strength of the coupling of streamflow and evapotranspiration to soil moisture. Furthermore, we find that the coupling strengths of streamflow and evapotranspiration to soil moisture depend on the shape of the fitted dependencies and on the variance of the meteorological forcing. To better interpret the magnitude of the respective memories across Europe, we finally provide a new perspective on hydrological memory by relating it to the mean duration required to recover from anomalies exceeding a certain threshold.

  11. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented; a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented including error band based, multivariate, sum of squares of residuals, and optimization methods. After completion of the tutorial, a survey of statistical model validation literature is presented and recommendations for future work are made
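
The Monte Carlo propagation approach applied to the damped spring-mass example can be sketched as follows; the nominal values and uncertainties are illustrative, not those used in the tutorial:

```python
import math
import random

def damped_frequency(m, k, c):
    """Damped natural frequency (rad/s) of a spring-mass-damper system."""
    wn = math.sqrt(k / m)
    zeta = c / (2 * math.sqrt(k * m))
    return wn * math.sqrt(1 - zeta ** 2)

def monte_carlo_freq(n=10000, seed=1):
    """Monte Carlo propagation of uncertainty: sample uncertain stiffness
    and damping, collect the distribution of the predicted damped
    frequency."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        k = rng.gauss(100.0, 5.0)   # stiffness (N/m), ~5% uncertainty
        c = rng.gauss(2.0, 0.2)     # damping (N*s/m), ~10% uncertainty
        samples.append(damped_frequency(1.0, k, c))
    mean = sum(samples) / n
    std = math.sqrt(sum((s - mean) ** 2 for s in samples) / (n - 1))
    return mean, std

mean, std = monte_carlo_freq()
print(round(mean, 2), round(std, 2))
```

The resulting prediction band is what a statistical validation step would then compare against the scatter of experimental observations.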

  12. Validity of microgravity simulation models on earth

    DEFF Research Database (Denmark)

    Regnard, J; Heer, M; Drummer, C

    2001-01-01

    Many studies have used water immersion and head-down bed rest as experimental models to simulate responses to microgravity. However, some data collected during space missions are at variance or in contrast with observations collected from experimental models. These discrepancies could reflect incomplete knowledge of the characteristics inherent to each model. During water immersion, the hydrostatic pressure lowers the peripheral vascular capacity and causes increased thoracic blood volume and high vascular perfusion. In turn, these changes lead to high urinary flow, low vasomotor tone, and a high

  13. Verification and Validation of Tropospheric Model/Database

    National Research Council Canada - National Science Library

    Junho, choi

    1998-01-01

    A verification and validation of tropospheric models and databases has been performed based on ray tracing algorithm, statistical analysis, test on real time system operation, and other technical evaluation process...

  14. Base Flow Model Validation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  15. Validating predictions from climate envelope models.

    Directory of Open Access Journals (Sweden)

    James I Watling

    Full Text Available Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species' distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967-1971 (t1) and evaluated using occurrence data from 1998-2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on

  16. Validating predictions from climate envelope models

    Science.gov (United States)

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.
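    The sensitivity and specificity measures used above reduce to simple confusion-matrix ratios over presence/absence records. A minimal sketch, assuming hypothetical occurrence data (none of the values below are from the study):

    ```python
    def sensitivity_specificity(observed, predicted):
        """Sensitivity = fraction of presences correctly classified;
        specificity = fraction of absences correctly classified."""
        tp = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 1)
        fn = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 0)
        tn = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 0)
        fp = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 1)
        return tp / (tp + fn), tn / (tn + fp)

    # hypothetical t2 survey occurrences (1 = presence) vs. model forecast
    obs  = [1, 1, 1, 0, 0, 0, 1, 0]
    pred = [1, 1, 0, 0, 1, 0, 1, 0]
    sens, spec = sensitivity_specificity(obs, pred)
    ```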

  17. Preliminary validation of a Monte Carlo model for IMRT fields

    International Nuclear Information System (INIS)

    Wright, Tracy; Lye, Jessica; Mohammadi, Mohammad

    2011-01-01

    Full text: A Monte Carlo model of an Elekta linac, validated for medium to large (10-30 cm) symmetric fields, has been investigated for small, irregular and asymmetric fields suitable for IMRT treatments. The model has been validated with field segments using radiochromic film in solid water. The modelled positions of the multileaf collimator (MLC) leaves have been validated using EBT film. In the model, electrons with a narrow energy spectrum are incident on the target and all components of the linac head are included. The MLC is modelled using the EGSnrc MLCE component module. For the validation, a number of single complex IMRT segments with dimensions of approximately 1-8 cm were delivered to film in solid water (see Fig. 1). The same segments were modelled using EGSnrc by adjusting the MLC leaf positions in the model validated for 10 cm symmetric fields. Dose distributions along the centre of each MLC leaf as determined by both methods were compared. A picket fence test was also performed to confirm the MLC leaf positions. 95% of the points in the modelled dose distribution along the leaf axis agree with the film measurement to within 1%/1 mm for dose difference and distance to agreement. Areas of most deviation occur in the penumbra region. A system has been developed to calculate the MLC leaf positions in the model for any planned field size.
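    The 1%/1 mm criterion combines a dose-difference test at each point with a distance-to-agreement (DTA) search. A minimal 1D sketch (the tolerances follow the abstract; the function name and profile values are illustrative, not from the paper):

    ```python
    import numpy as np

    def pass_fraction(x, measured, modeled, dose_tol=0.01, dta_mm=1.0):
        """Fraction of points passing either the dose-difference test
        (within dose_tol of the maximum measured dose) or the DTA test
        (a matching modeled dose exists within dta_mm)."""
        x = np.asarray(x, float)
        measured = np.asarray(measured, float)
        modeled = np.asarray(modeled, float)
        tol = dose_tol * measured.max()
        passed = 0
        for i in range(len(measured)):
            # dose-difference criterion at the same position
            if abs(modeled[i] - measured[i]) <= tol:
                passed += 1
                continue
            # distance to agreement: nearest position where the modeled
            # profile reaches the measured dose value
            close = np.abs(modeled - measured[i]) <= tol
            if close.any() and np.min(np.abs(x[close] - x[i])) <= dta_mm:
                passed += 1
        return passed / len(measured)
    ```

    The abstract's 95% figure corresponds to `pass_fraction` evaluated along each leaf axis.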

  18. Validation of the simulator neutronics model

    International Nuclear Information System (INIS)

    Gregory, M.V.

    1984-01-01

    The neutronics model in the SRP reactor training simulator computes the variation with time of the neutron population in the reactor core. The power output of a reactor is directly proportional to the neutron population, thus in a very real sense the neutronics model determines the response of the simulator. The geometrical complexity of the reactor control system in SRP reactors requires the neutronics model to provide a detailed, 3D representation of the reactor core. Existing simulator technology does not allow such a detailed representation to run in real-time in a minicomputer environment, thus an entirely different approach to the problem was required. A prompt jump method has been developed in answer to this need.

  19. Monthly streamflow forecasting with auto-regressive integrated moving average

    Science.gov (United States)

    Nasir, Najah; Samsudin, Ruhaidah; Shabri, Ani

    2017-09-01

    Forecasting of streamflow is one of the many ways that can contribute to better decision making for water resource management. The auto-regressive integrated moving average (ARIMA) model was selected in this research for monthly streamflow forecasting with enhancement made by pre-processing the data using singular spectrum analysis (SSA). This study also proposed an extension of the SSA technique to include a step where clustering was performed on the eigenvector pairs before reconstruction of the time series. The monthly streamflow data of Sungai Muda at Jeniang, Sungai Muda at Jambatan Syed Omar and Sungai Ketil at Kuala Pegang was gathered from the Department of Irrigation and Drainage Malaysia. A ratio of 9:1 was used to divide the data into training and testing sets. The ARIMA, SSA-ARIMA and Clustered SSA-ARIMA models were all developed in R software. Results from the proposed model are then compared to a conventional auto-regressive integrated moving average model using the root-mean-square error and mean absolute error values. It was found that the proposed model can outperform the conventional model.
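    The ARIMA family models a value as a linear function of its own past. A minimal sketch of that autoregressive core, with the 9:1 train/test split and the RMSE/MAE scores used above; a plain least-squares AR fit stands in here for the full SSA-ARIMA pipeline, which is not reproduced, and the flow series is synthetic:

    ```python
    import numpy as np

    def fit_ar(series, p=2):
        """Least-squares fit of x_t = c + a1*x_(t-1) + ... + ap*x_(t-p)."""
        series = np.asarray(series, float)
        n = len(series)
        X = np.ones((n - p, p + 1))
        for k in range(1, p + 1):
            X[:, k] = series[p - k:n - k]      # column k holds x_(t-k)
        coef, *_ = np.linalg.lstsq(X, series[p:], rcond=None)
        return coef                             # [c, a1, ..., ap]

    def forecast_ar(history, coef, steps):
        """Iterated multi-step forecast from the end of `history`."""
        p = len(coef) - 1
        h = list(history)
        for _ in range(steps):
            h.append(coef[0] + sum(coef[k] * h[-k] for k in range(1, p + 1)))
        return h[len(history):]

    def rmse(a, b):
        return float(np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2)))

    def mae(a, b):
        return float(np.mean(np.abs(np.asarray(a) - np.asarray(b))))

    # hypothetical monthly flows; 9:1 split as in the study
    flow = [np.sin(0.5 * t) + 2.0 for t in range(120)]
    split = int(0.9 * len(flow))
    train, test = flow[:split], flow[split:]
    pred = forecast_ar(train, fit_ar(train, p=2), len(test))
    scores = rmse(test, pred), mae(test, pred)
    ```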

  20. Context discovery using attenuated Bloom codes: model description and validation

    NARCIS (Netherlands)

    Liu, F.; Heijenk, Geert

    A novel approach to performing context discovery in ad-hoc networks based on the use of attenuated Bloom filters is proposed in this report. In order to investigate the performance of this approach, a model has been developed. This document describes the model and its validation. The model has been
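    An attenuated Bloom filter is an array of Bloom filters in which layer i summarizes content reachable in i hops, and a filter received from a neighbour is shifted down one layer before merging. A minimal sketch, assuming illustrative class, method, and parameter names (none are from the report):

    ```python
    import hashlib

    class AttenuatedBloomFilter:
        """Array of `depth` Bloom filters; layer i describes services
        reachable in i hops. Each layer is held as a bit mask."""
        def __init__(self, depth=3, bits=64, hashes=2):
            self.depth, self.bits, self.hashes = depth, bits, hashes
            self.layers = [0] * depth

        def _positions(self, name):
            # k independent hash positions derived from seeded SHA-256
            for i in range(self.hashes):
                h = hashlib.sha256(f"{i}:{name}".encode()).digest()
                yield int.from_bytes(h[:4], "big") % self.bits

        def add_local(self, name):
            """Advertise a locally offered service in layer 0."""
            for pos in self._positions(name):
                self.layers[0] |= 1 << pos

        def merge_from_neighbour(self, other):
            """Incorporate a neighbour's filter, attenuated by one hop."""
            for i in range(1, self.depth):
                self.layers[i] |= other.layers[i - 1]

        def query(self, name):
            """Smallest hop count at which `name` may be reachable, or None."""
            mask = 0
            for pos in self._positions(name):
                mask |= 1 << pos
            for i, layer in enumerate(self.layers):
                if layer & mask == mask:
                    return i
            return None
    ```

    As with any Bloom filter, a `query` hit is probabilistic (false positives are possible), which is one reason performance modelling of the discovery protocol is of interest.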

  1. Traffic modelling validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Tongeren, R. van; Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2007-01-01

    This paper presents a microscopic traffic model for the validation of advanced driver assistance systems. This model describes single-lane traffic and is calibrated with data from a field operational test. To illustrate the use of the model, a Monte Carlo simulation of single-lane traffic scenarios

  2. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, have been presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  3. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  4. Ensuring the Validity of the Micro Foundation in DSGE Models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller

    The presence of i) stochastic trends, ii) deterministic trends, and/or iii) stochastic volatility in DSGE models may imply that the agents' objective functions attain infinite values. We say that such models do not have a valid micro foundation. The paper derives sufficient conditions which ensure that the objective functions of the households and the firms are finite even when various trends and stochastic volatility are included in a standard DSGE model. Based on these conditions we test the validity of the micro foundation in six DSGE models from the literature. The models of Justiniano & Primiceri (American Economic Review, forthcoming) and Fernández-Villaverde & Rubio-Ramírez (Review of Economic Studies, 2007) do not satisfy these sufficient conditions, or any other known set of conditions ensuring finite values for the objective functions. Thus, the validity of the micro foundation...

  5. What Do They Have in Common? Drivers of Streamflow Spatial Correlation and Prediction of Flow Regimes in Ungauged Locations

    Science.gov (United States)

    Betterle, A.; Radny, D.; Schirmer, M.; Botter, G.

    2017-12-01

    The spatial correlation of daily streamflows represents a statistical index encapsulating the similarity between hydrographs at two arbitrary catchment outlets. In this work, a process-based analytical framework is utilized to investigate the hydrological drivers of streamflow spatial correlation through an extensive application to 78 pairs of stream gauges belonging to 13 unregulated catchments in the eastern United States. The analysis provides insight into how the observed heterogeneity of the physical processes that control flow dynamics ultimately affects streamflow correlation and spatial patterns of flow regimes. Despite the variability of recession properties across the study catchments, the impact of heterogeneous drainage rates on the streamflow spatial correlation is overwhelmed by the spatial variability of the frequency and intensity of effective rainfall events. Overall, model performances are satisfactory, with root mean square errors between modeled and observed streamflow spatial correlation below 10% in most cases. We also propose a method for estimating streamflow correlation in the absence of discharge data, which proves useful for predicting streamflow regimes in ungauged areas. The method consists of setting a minimum threshold on the modeled flow correlation to identify hydrologically similar sites. Catchment outlets that are most correlated (ρ>0.9) are found to be characterized by analogous streamflow distributions across a broad range of flow regimes.
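    The streamflow spatial correlation at the core of the analysis is the Pearson correlation between daily discharge series at two outlets, with the ρ > 0.9 threshold used to flag hydrologically similar sites. A minimal sketch (function names and series values are hypothetical):

    ```python
    import numpy as np

    def flow_correlation(q1, q2):
        """Pearson correlation of daily streamflow series at two outlets."""
        q1 = np.asarray(q1, float)
        q2 = np.asarray(q2, float)
        return float(np.corrcoef(q1, q2)[0, 1])

    def hydrologically_similar(q1, q2, threshold=0.9):
        """Threshold rule from the paper: outlets with rho > 0.9
        are treated as hydrologically similar."""
        return flow_correlation(q1, q2) > threshold
    ```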

  6. Effect of Tree-to-Shrub Type Conversion in Lower Montane Forests of the Sierra Nevada (USA) on Streamflow.

    Science.gov (United States)

    Bart, Ryan R; Tague, Christina L; Moritz, Max A

    2016-01-01

    Higher global temperatures and increased levels of disturbance are contributing to greater tree mortality in many forest ecosystems. These same drivers can also limit forest regeneration, leading to vegetation type conversion. For the Sierra Nevada of California, little is known about how type conversion may affect streamflow, a critical source of water supply for urban, agricultural and environmental purposes. In this paper, we examined the effects of tree-to-shrub type conversion, in combination with climate change, on streamflow in two lower montane forest watersheds in the Sierra Nevada. A spatially distributed ecohydrologic model was used to simulate changes in streamflow, evaporation, and transpiration following type conversion, with an explicit focus on the role of vegetation size and aspect. Model results indicated that streamflow may show negligible change or small decreases following type conversion when the difference between tree and shrub leaf areas is small, partly due to the higher stomatal conductivity and the deep rooting depth of shrubs. In contrast, streamflow may increase when post-conversion shrubs have a small leaf area relative to trees. Model estimates also suggested that vegetation change could have a greater impact on streamflow magnitude than the direct hydrologic impacts of increased temperatures. Temperature increases, however, may have a greater impact on streamflow timing. Tree-to-shrub type conversion increased streamflow only marginally during dry years, underscoring the importance of accounting for changes in vegetation communities to accurately characterize future hydrologic regimes for the Sierra Nevada.

  7. Functional Validation of Heteromeric Kainate Receptor Models.

    Science.gov (United States)

    Paramo, Teresa; Brown, Patricia M G E; Musgaard, Maria; Bowie, Derek; Biggin, Philip C

    2017-11-21

    Kainate receptors require the presence of external ions for gating. Most work thus far has been performed on homomeric GluK2 but, in vivo, kainate receptors are likely heterotetramers. Agonists bind to the ligand-binding domain (LBD) which is arranged as a dimer of dimers as exemplified in homomeric structures, but no high-resolution structure currently exists of heteromeric kainate receptors. In a full-length heterotetramer, the LBDs could potentially be arranged either as a GluK2 homomer alongside a GluK5 homomer or as two GluK2/K5 heterodimers. We have constructed models of the LBD dimers based on the GluK2 LBD crystal structures and investigated their stability with molecular dynamics simulations. We have then used the models to make predictions about the functional behavior of the full-length GluK2/K5 receptor, which we confirmed via electrophysiological recordings. A key prediction and observation is that lithium ions bind to the dimer interface of GluK2/K5 heteromers and slow their desensitization. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  8. Validation of limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring - is a separate validation group required?

    NARCIS (Netherlands)

    Proost, J. H.

    Objective: Limited sampling models (LSM) for estimating AUC in therapeutic drug monitoring are usually validated in a separate group of patients, according to published guidelines. The aim of this study is to evaluate the validation of LSM by comparing independent validation with cross-validation

  9. Validation of nuclear models used in space radiation shielding applications

    International Nuclear Information System (INIS)

    Norman, Ryan B.; Blattnig, Steve R.

    2013-01-01

    A program of verification and validation has been undertaken to assess the applicability of models to space radiation shielding applications and to track progress as these models are developed over time. In this work, simple validation metrics applicable to testing both model accuracy and consistency with experimental data are developed. The developed metrics treat experimental measurement uncertainty as an interval and are therefore applicable to cases in which epistemic uncertainty dominates the experimental data. To demonstrate the applicability of the metrics, nuclear physics models used by NASA for space radiation shielding applications are compared to an experimental database consisting of over 3600 experimental cross sections. A cumulative uncertainty metric is applied to the question of overall model accuracy, while a metric based on the median uncertainty is used to analyze the models from the perspective of model development by examining subsets of the model parameter space.
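    The interval treatment of experimental uncertainty can be sketched as follows: a model value inside the measurement interval contributes zero discrepancy, one outside contributes its distance to the nearest endpoint, and mean and median summaries mirror the cumulative and median metrics described above. Function names and sample values are illustrative, not NASA's implementation:

    ```python
    import numpy as np

    def interval_discrepancy(model, data, half_width):
        """Per-point discrepancy: zero when the model value lies inside
        [data - half_width, data + half_width], otherwise the distance to
        the nearest interval endpoint, relative to the measured value."""
        model, data, half_width = (np.asarray(a, float) for a in (model, data, half_width))
        d = np.maximum(np.abs(model - data) - half_width, 0.0)
        return d / data

    def cumulative_metric(model, data, half_width):
        """Overall accuracy: mean relative discrepancy over the database."""
        return float(np.mean(interval_discrepancy(model, data, half_width)))

    def median_metric(model, data, half_width):
        """Typical-case accuracy: median relative discrepancy, less
        sensitive to a few badly modeled cross sections."""
        return float(np.median(interval_discrepancy(model, data, half_width)))
    ```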

  10. Data Pre-Analysis and Ensemble of Various Artificial Neural Networks for Monthly Streamflow Forecasting

    Directory of Open Access Journals (Sweden)

    Jianzhong Zhou

    2018-05-01

    Full Text Available This paper introduces three artificial neural network (ANN) architectures for monthly streamflow forecasting: a radial basis function network, an extreme learning machine, and the Elman network. Three ensemble techniques, a simple average ensemble, a weighted average ensemble, and an ANN-based ensemble, were used to combine the outputs of the individual ANN models. The objective was to highlight the performance of the general regression neural network-based ensemble technique (GNE) through an improvement of monthly streamflow forecasting accuracy. Before the construction of an ANN model, data preanalysis techniques, such as the empirical wavelet transform (EWT), were exploited to eliminate the oscillations of the streamflow series. Additionally, the theory of chaos phase space reconstruction was used to select the most relevant and important input variables for forecasting. The proposed GNE ensemble model was applied to mean monthly streamflow observations from the Wudongde hydrological station in the Jinsha River Basin, China. Comparisons and analysis in this study demonstrated that, according to chaos theory, the denoised streamflow time series was less disordered and unsystematic than the original time series. Thus, EWT can be adopted as an effective data preanalysis technique for the prediction of monthly streamflow. Concurrently, the GNE performed better when compared with other ensemble techniques.
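    The simple and weighted average ensembles can be sketched as follows. The inverse-RMSE weighting is one common choice, assumed here for illustration; the GNE combiner itself is a trained neural network and is not reproduced:

    ```python
    import numpy as np

    def simple_average(forecasts):
        """Equal-weight mean of the member forecasts (members x time)."""
        return np.mean(forecasts, axis=0)

    def weighted_average(forecasts, validation_errors):
        """Weights proportional to the inverse validation RMSE of each
        member, so more accurate members count more."""
        w = 1.0 / np.asarray(validation_errors, float)
        w /= w.sum()
        return np.average(forecasts, axis=0, weights=w)
    ```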

  11. Composing, Analyzing and Validating Software Models

    Science.gov (United States)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system, however they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  12. Model validation and calibration based on component functions of model output

    International Nuclear Information System (INIS)

    Wu, Danqing; Lu, Zhenzhou; Wang, Yanping; Cheng, Lei

    2015-01-01

    The target in this work is to validate the component functions of model output between physical observations and the computational model with the area metric. Based on the theory of high dimensional model representations (HDMR) of independent input variables, conditional expectations are component functions of model output, and the conditional expectations reflect partial information of model output. Therefore, the model validation of conditional expectations quantifies the discrepancy between the partial information of the computational model output and that of the observations. Then a calibration of the conditional expectations is carried out to reduce the value of the model validation metric. After that, the model validation metric of the model output is recalculated with the calibrated model parameters, and the result shows that a reduction of the discrepancy in the conditional expectations can help decrease the difference in model output. At last, several examples are employed to demonstrate the rationality and necessity of the methodology in the case of both a single validation site and multiple validation sites. - Highlights: • A validation metric of conditional expectations of model output is proposed. • HDMR explains the relationship between conditional expectations and model output. • An improved approach of parameter calibration updates the computational models. • Validation and calibration are applied at a single site and at multiple sites. • Validation and calibration show superiority over existing methods
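    The area metric referred to above measures the area between the empirical CDFs of model output and observations. A minimal sketch (function name and sample values are illustrative):

    ```python
    import numpy as np

    def area_metric(model_samples, observations):
        """Area between the empirical CDFs of the two samples,
        integrated over their pooled support."""
        m = np.sort(np.asarray(model_samples, float))
        o = np.sort(np.asarray(observations, float))
        grid = np.union1d(m, o)
        cdf_m = np.searchsorted(m, grid, side="right") / len(m)
        cdf_o = np.searchsorted(o, grid, side="right") / len(o)
        # both CDFs are constant on each interval [grid[i], grid[i+1])
        return float(np.sum(np.abs(cdf_m - cdf_o)[:-1] * np.diff(grid)))
    ```

    For two identical samples the metric is zero; shifting one distribution by a constant yields an area equal to that shift.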

  13. Estimation of average annual streamflows and power potentials for Alaska and Hawaii

    Energy Technology Data Exchange (ETDEWEB)

    Verdin, Kristine L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Idaho National Engineering and Environmental Lab. (INEEL)

    2004-05-01

    This paper describes the work done to develop average annual streamflow estimates and power potential for the states of Alaska and Hawaii. The Elevation Derivatives for National Applications (EDNA) database was used, along with climatic datasets, to develop flow and power estimates for every stream reach in the EDNA database. Estimates of average annual streamflows were derived using state-specific regression equations, which were functions of average annual precipitation, precipitation intensity, drainage area, and other elevation-derived parameters. Power potential was calculated through the use of the average annual streamflow and the hydraulic head of each reach, which is calculated from the EDNA digital elevation model. In all, estimates of streamflow and power potential were calculated for over 170,000 stream segments in the Alaskan and Hawaiian datasets.
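    Hydraulic power potential follows from average annual streamflow Q and hydraulic head H as P = ρgQH. A minimal sketch of that calculation; the EDNA regression workflow itself is not reproduced, and the efficiency factor is an illustrative addition:

    ```python
    RHO = 1000.0   # density of water, kg/m^3
    G = 9.81       # gravitational acceleration, m/s^2

    def power_potential_kw(flow_m3s, head_m, efficiency=1.0):
        """Theoretical hydraulic power of a stream reach in kW:
        P = rho * g * Q * H (times an optional conversion efficiency)."""
        return RHO * G * flow_m3s * head_m * efficiency / 1000.0
    ```

    For example, a reach with an average annual flow of 10 m3/s and 5 m of head carries about 490 kW of theoretical potential.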

  14. Potential effects of climate change on streamflow for seven watersheds in eastern and central Montana

    Directory of Open Access Journals (Sweden)

    Katherine J. Chase

    2016-09-01

    New hydrological insights for the region: Projected changes in mean annual and mean monthly streamflow vary by the RegCM3 model selected, by watershed, and by future period. Mean annual streamflows for all future periods are projected to increase (11–21%) for two of the four central Montana watersheds: Middle Musselshell River and Cottonwood Creek. Mean annual streamflows for all future periods are projected to decrease (changes of −24 to −75%) for the Redwater River watershed in eastern Montana. Mean annual streamflows are projected to increase slightly (2–15%) for the 2030 period and decrease (changes of −16 to −44%) for the 2080 period for the four remaining watersheds.

  15. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    Full Text Available The hydrological decade on Predictions in Ungauged Basins (PUB led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of model outcome has not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  16. Making Validated Educational Models Central in Preschool Standards.

    Science.gov (United States)

    Schweinhart, Lawrence J.

    This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…

  17. Validation of ASTEC core degradation and containment models

    International Nuclear Information System (INIS)

    Kruse, Philipp; Brähler, Thimo; Koch, Marco K.

    2014-01-01

    Ruhr-Universitaet Bochum performed, in a German-funded project, validation of the in-vessel and containment models of the integral code ASTEC V2, jointly developed by IRSN (France) and GRS (Germany). In this paper selected results of this validation are presented. In the in-vessel part, the main point of interest was the validation of the code capability concerning cladding oxidation and hydrogen generation. The ASTEC calculations of the QUENCH experiments QUENCH-03 and QUENCH-11 show satisfactory results, despite some necessary adjustments in the input deck. Furthermore, the oxidation models based on the Cathcart–Pawel and Urbanic–Heidrick correlations are not suitable for higher temperatures, while the ASTEC model BEST-FIT, based on the Prater–Courtright approach at high temperature, gives sufficiently reliable results. One part of the containment model validation was the assessment of three hydrogen combustion models of ASTEC against the experiment BMC Ix9. The simulation results of these models differ from each other, and therefore the quality of the simulations depends on the characteristics of each model. Accordingly, the CPA FRONT model, which requires the simplest input parameters, provides the best agreement with the experimental data.

  18. Validation of a multi-objective, predictive urban traffic model

    NARCIS (Netherlands)

    Wilmink, I.R.; Haak, P. van den; Woldeab, Z.; Vreeswijk, J.

    2013-01-01

    This paper describes the results of the verification and validation of the ecoStrategic Model, which was developed, implemented and tested in the eCoMove project. The model uses real-time and historical traffic information to determine the current, predicted and desired state of traffic in a

  19. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  20. Validating a Technology Enhanced Student-Centered Learning Model

    Science.gov (United States)

    Kang, Myunghee; Hahn, Jungsun; Chung, Warren

    2015-01-01

    The Technology Enhanced Student Centered Learning (TESCL) Model in this study presents the core factors that ensure the quality of learning in a technology-supported environment. Although the model was conceptually constructed using a student-centered learning framework and drawing upon previous studies, it should be validated through real-world…

  1. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    There are, however, many advantages that could be harvested from such knowledge, like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures...

  2. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  3. Validation and comparison of dispersion models of RTARC DSS

    International Nuclear Information System (INIS)

    Duran, J.; Pospisil, M.

    2004-01-01

    RTARC DSS (Real Time Accident Release Consequences - Decision Support System) is a computer code developed at the VUJE Trnava, Inc. (Stubna, M. et al, 1993). The code calculations include atmospheric transport and diffusion, dose assessment, evaluation and displaying of the affected zones, evaluation of the early health effects, concentration and dose rate time dependence in the selected sites etc. The simulation of the protective measures (sheltering, iodine administration) is involved. The aim of this paper is to present the process of validation of the RTARC dispersion models. RTARC includes models for calculations of release for very short (Method Monte Carlo - MEMOC), short (Gaussian Straight-Line Model) and long distances (Puff Trajectory Model - PTM). Validation of the code RTARC was performed using the comparisons and experiments summarized in Table 1 (experiment or comparison - distance - model): wind tunnel experiments (Universitaet der Bundeswehr, Muenchen) - area of the NPP - Method Monte Carlo; INEL (Idaho National Engineering Laboratory) multi-tracer atmospheric experiment - short/medium distances - Gaussian model and PTM; Model Validation Kit - short distances - Gaussian model; STEP II.b 'Realistic Case Studies' - long distances - PTM; ENSEMBLE comparison - long distances - PTM (orig.)

  4. Model Validation and Verification of Data Mining from the ...

    African Journals Online (AJOL)

    Michael Horsfall

    In this paper, we seek to present a hybrid method for Model Validation and Verification of Data Mining from the ... This model generally states the numerical value of knowledge .... procedures found in the field of software engineering should be ...

  5. Streamflow conditions along Soldier Creek, Northeast Kansas

    Science.gov (United States)

    Juracek, Kyle E.

    2017-11-14

    The availability of adequate water to meet the present (2017) and future needs of humans, fish, and wildlife is a fundamental issue for the Prairie Band Potawatomi Nation in northeast Kansas. Because Soldier Creek flows through the Prairie Band Potawatomi Nation Reservation, it is an important tribal resource. An understanding of historical Soldier Creek streamflow conditions is required for the effective management of tribal water resources, including drought contingency planning. Historical data for six selected U.S. Geological Survey (USGS) streamgages along Soldier Creek were used in an assessment of streamflow characteristics and trends by Juracek (2017). Streamflow data for the period of record at each streamgage were used to compute annual mean streamflow, annual mean base flow, mean monthly flow, annual peak flow, and annual minimum flow. Results of the assessment are summarized in this fact sheet.
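    The streamflow characteristics computed from each streamgage record (annual mean, mean monthly, annual peak, and annual minimum flows) reduce to simple groupings over a daily record. A minimal sketch; base-flow separation is omitted here, and the dates and discharge values are hypothetical, not Soldier Creek data:

    ```python
    from collections import defaultdict
    from datetime import date

    def flow_statistics(daily):
        """daily: dict mapping datetime.date -> mean daily discharge.
        Returns annual mean, mean monthly, annual peak, and annual
        minimum flows grouped from the daily record."""
        by_year, by_month = defaultdict(list), defaultdict(list)
        for d, q in daily.items():
            by_year[d.year].append(q)
            by_month[d.month].append(q)
        return {
            "annual_mean": {y: sum(v) / len(v) for y, v in by_year.items()},
            "monthly_mean": {m: sum(v) / len(v) for m, v in by_month.items()},
            "annual_peak": {y: max(v) for y, v in by_year.items()},
            "annual_min": {y: min(v) for y, v in by_year.items()},
        }
    ```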

  6. Spatial Correlation Of Streamflows: An Analytical Approach

    Science.gov (United States)

    Betterle, A.; Schirmer, M.; Botter, G.

    2016-12-01

    The interwoven space and time variability of climate and landscape properties results in a complex and non-linear hydrological response in streamflow dynamics. Understanding how the meteorological and morphological characteristics of catchments affect the similarity/dissimilarity of streamflow time series at their outlets represents a scientific challenge with applications in water resources management, ecological studies and regionalization approaches aimed at predicting streamflow in ungauged areas. In this study, we establish an analytical approach to estimate the spatial correlation of daily streamflows at two arbitrary locations within a given hydrologic district or river basin at seasonal and annual time scales. The method is based on a stochastic description of the coupled streamflow dynamics at the outlets of two catchments. The framework aims to express the correlation of daily streamflows at two locations along a river network as a function of a limited number of physical parameters characterizing the main underlying hydrological drivers, including climate conditions, precipitation regime and catchment drainage rates. The proposed method portrays how heterogeneity of climate and landscape features affects the spatial variability of flow regimes along river systems. In particular, we show that the frequency and intensity of synchronous effective rainfall events in the relevant contributing catchments are the main drivers of the spatial correlation of daily discharge, whereas only pronounced differences in the drainage rates of the two basins bear a significant effect on the streamflow correlation. The topological arrangement of the two outlets also influences the underlying streamflow correlation, as we show that nested catchments tend to maximize the spatial correlation of flow regimes.
The application of the method to a set of catchments in the South-Eastern US suggests the potential of the proposed tool for the characterization of spatial connections of flow regimes in the

  7. Streamflow predictions under climate scenarios in the Boulder Creek Watershed at Orodell

    Science.gov (United States)

    Zhang, Q.; Williams, M. W.; Livneh, B.

    2016-12-01

    Mountainous areas have complex geological features and climatic variability, which limit our ability to simulate and predict hydrologic processes, especially in the face of a changing climate. Hydrologic models can improve our understanding of land surface water and energy budgets in these regions. In this study, a distributed, physically based hydrologic model is applied to the Boulder Creek Watershed, USA to study streamflow conditions under future climatic scenarios. Model parameters were adjusted using observed streamflow data at 1/16th degree resolution, with an NSE value of 0.69. The results from CMIP5 models can give a general range of streamflow conditions under different climatic scenarios. Two scenarios were applied, RCP 4.5 and RCP 8.5. RCP 8.5 has higher emission concentrations than RCP 4.5, though the difference is not very significant over the period of study. Using paired t-tests and Mann-Whitney tests at specific grid cells to compare modeled and observed climate data, four CMIP5 models were chosen to predict streamflow from 2010 to 2025. Of the four models, two predicted increased precipitation, while the other two predicted decreased precipitation, and all four predicted increased minimum and maximum temperature in RCP 4.5. Average streamflow decreased by 2%-14%, while maximum SWE varies from -7% to +210% from 2010 to 2025, relative to 2006 to 2010. In RCP 8.5, three models predicted increased precipitation, while one predicted decreased precipitation, and all four predicted increased maximum and minimum temperature. Apart from that one model, the other three predicted increased average streamflow by 3.5%-32%, which results from the larger increase in precipitation. Maximum SWE varies by 6%-55% higher than that from 2006 to 2010. 
This study shows that average daily maximum and minimum temperature will increase toward 2025 from different climate models, while average streamflow will decrease in RCP 4
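
The abstract reports an NSE of 0.69; the Nash-Sutcliffe efficiency used to judge such calibrations can be sketched as follows (the observed/simulated flows below are hypothetical):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
    no better than always predicting the observed mean, negative is worse."""
    mean_obs = sum(observed) / len(observed)
    sq_err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sq_dev = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sq_err / sq_dev

# Hypothetical observed and simulated daily flows
obs = [3.1, 4.5, 8.2, 6.0, 2.9, 3.3]
sim = [2.8, 4.9, 7.5, 6.4, 3.2, 3.0]
nse = nash_sutcliffe(obs, sim)
```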

  8. Validation of heat transfer models for gap cooling

    International Nuclear Information System (INIS)

    Okano, Yukimitsu; Nagae, Takashi; Murase, Michio

    2004-01-01

    For severe accident assessment of a light water reactor, models of heat transfer in a narrow annular gap between overheated core debris and a reactor pressure vessel are important for evaluating vessel integrity and accident management. The authors developed and improved the models of heat transfer. However, validation was not sufficient for applicability of the gap heat flux correlation to the debris cooling in the vessel lower head and applicability of the local boiling heat flux correlations to the high-pressure conditions. Therefore, in this paper, we evaluated the validity of the heat transfer models and correlations by analyses for ALPHA and LAVA experiments where molten aluminum oxide (Al2O3) at about 2700 K was poured into the high pressure water pool in a small-scale simulated vessel lower head. In the heating process of the vessel wall, the calculated heating rate and peak temperature agreed well with the measured values, and the validity of the heat transfer models and gap heat flux correlation was confirmed. In the cooling process of the vessel wall, the calculated cooling rate was compared with the measured value, and the validity of the nucleate boiling heat flux correlation was confirmed. The peak temperatures of the vessel wall in ALPHA and LAVA experiments were lower than the temperature at the minimum heat flux point between film boiling and transition boiling, so the minimum heat flux correlation could not be validated. (author)

  9. Drivers of annual to decadal streamflow variability in the lower Colorado River Basin

    Science.gov (United States)

    Lambeth-Beagles, R. S.; Troch, P. A.

    2010-12-01

    The Colorado River is the main water supply to the southwest region. As demand reaches the limit of supply in the southwest it becomes increasingly important to understand the dynamics of streamflow in the Colorado River and in particular the tributaries to the lower Colorado River. Climate change may pose an additional threat to the already-scarce water supply in the southwest. Due to the narrowing margin for error, water managers are keen on extending their ability to predict streamflow volumes on a mid-range to decadal scale. Before a predictive streamflow model can be developed, an understanding of the physical drivers of annual to decadal streamflow variability in the lower Colorado River Basin is needed. This research addresses this need by applying multiple statistical methods to identify trends, patterns and relationships present in streamflow, precipitation and temperature over the past century in four contributing watersheds to the lower Colorado River. The four watersheds selected were the Paria, Little Colorado, Virgin/Muddy, and Bill Williams. Time series data over a common period from 1906-2007 for streamflow, precipitation and temperature were used for the initial analysis. Through statistical analysis the following questions were addressed: 1) are there observable trends and patterns in these variables during the past century and 2) if there are trends or patterns, how are they related to each other? The Mann-Kendall test was used to identify trends in the three variables. Assumptions regarding autocorrelation and persistence in the data were taken into consideration. Kendall’s tau-b test was used to establish association between any found trends in the data. Initial results suggest there are two primary processes occurring. First, statistical analysis reveals significant upward trends in temperatures and downward trends in streamflow. However, there appears to be no trend in precipitation data. These trends in streamflow and temperature speak to
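
The Mann-Kendall test used above to screen for trends can be sketched in plain Python; this version omits the tie correction in the variance and the autocorrelation/persistence adjustments the study takes into account.

```python
import math

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test; returns (S, Z, p).

    Simplified sketch: no tie correction, no pre-whitening for serial
    correlation (the study explicitly accounts for persistence)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p-value
    return s, z, p

s, z, p = mann_kendall(list(range(20)))  # strictly increasing series
```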

  10. Developing rural palliative care: validating a conceptual model.

    Science.gov (United States)

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

    The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to: lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in 7 rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model as the model was able to theoretically predict and explain the experiences of the 7 rural communities that participated in the study. New emerging themes from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  11. Streamflow disaggregation: a nonlinear deterministic approach

    Directory of Open Access Journals (Sweden)

    B. Sivakumar

    2004-01-01

    This study introduces a nonlinear deterministic approach for streamflow disaggregation. According to this approach, the streamflow transformation process from one scale to another is treated as a nonlinear deterministic process, rather than a stochastic process as generally assumed. The approach follows two important steps: (1) reconstruction of the scalar (streamflow) series in a multi-dimensional phase space for representing the transformation dynamics; and (2) use of a local approximation (nearest neighbor) method for disaggregation. The approach is employed for streamflow disaggregation in the Mississippi River basin, USA. Data of successively doubled resolutions between daily and 16 days (i.e. daily, 2-day, 4-day, 8-day, and 16-day) are studied, and disaggregations are attempted only between successive resolutions (i.e. 2-day to daily, 4-day to 2-day, 8-day to 4-day, and 16-day to 8-day). Comparisons between the disaggregated values and the actual values reveal excellent agreement for all the cases studied, indicating the suitability of the approach for streamflow disaggregation. A further insight into the results reveals that the best results are, in general, achieved for low embedding dimensions (2 or 3) and small numbers of neighbors (less than 50), suggesting the possible presence of nonlinear determinism in the underlying transformation process. A decrease in accuracy with increasing disaggregation scale is also observed, a possible implication of the existence of a scaling regime in streamflow.
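
The nearest-neighbor step can be illustrated for the 2-day-to-daily case. The sketch below uses a one-dimensional embedding for brevity (the study embeds in a multi-dimensional phase space); the calibration records are hypothetical.

```python
def knn_disaggregate(total, calib, k=3):
    """Split a 2-day streamflow total into two daily values by averaging
    the day-1 fractions of the k calibration records whose 2-day totals
    are nearest to the value being disaggregated.

    calib: list of (2-day total, (day-1 flow, day-2 flow)) records."""
    nearest = sorted(calib, key=lambda rec: abs(rec[0] - total))[:k]
    day1_frac = sum(d1 / t for t, (d1, _) in nearest) / len(nearest)
    return total * day1_frac, total * (1.0 - day1_frac)

# Hypothetical calibration set: (2-day total, (day 1, day 2))
calib = [(10.0, (5.0, 5.0)), (20.0, (12.0, 8.0)), (8.0, (4.0, 4.0))]
d1, d2 = knn_disaggregate(12.0, calib)
```

By construction the two daily values sum back to the coarse total, which is the mass-conservation property a disaggregation scheme must preserve.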

  12. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment in consideration of their chemical and physical properties. Molten salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry to purify impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort at ER process optimization is made by using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and oftentimes problematic, the simulation results from these models are inherently uncertain.

  13. The turbulent viscosity models and their experimental validation; Les modeles de viscosite turbulente et leur validation experimentale

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    This workshop on turbulent viscosity models and their experimental validation was organized by the 'convection' section of the French society of thermal engineers. Of the 9 papers presented during this workshop, 8 deal with the modeling of turbulent flows inside combustion chambers, turbo-machineries or in other energy-related applications, and have been selected for ETDE. (J.S.)

  14. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; it estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, which show how the method can be used: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by their quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. Field validation of the contaminant transport model, FEMA

    International Nuclear Information System (INIS)

    Wong, K.-F.V.

    1986-01-01

    The work describes the validation with field data of a finite element model of material transport through aquifers (FEMA). Field data from the Idaho Chemical Processing Plant, Idaho, USA and from the 58th Street landfill in Miami, Florida, USA are used. In both cases the model was first calibrated and then integrated over a span of eight years to check on the predictive capability of the model. Both predictive runs gave results that matched well with available data. (author)

  16. Results from the Savannah River Laboratory model validation workshop

    International Nuclear Information System (INIS)

    Pepper, D.W.

    1981-01-01

    To evaluate existing and newly developed air pollution models used in DOE-funded laboratories, the Savannah River Laboratory sponsored a model validation workshop. The workshop used Kr-85 measurements and meteorology data obtained at SRL during 1975 to 1977. Individual laboratories used models to perform calculations for daily, weekly, monthly or annual test periods. Cumulative integrated air concentrations were reported at each grid point and at each of the eight sampler locations.

  17. A Conjunction Method of Wavelet Transform-Particle Swarm Optimization-Support Vector Machine for Streamflow Forecasting

    Directory of Open Access Journals (Sweden)

    Fanping Zhang

    2014-01-01

    Streamflow forecasting has an important role in water resource management and reservoir operation. The support vector machine (SVM) is an appropriate and suitable method for streamflow prediction due to its versatility, robustness, and effectiveness. In this study, a wavelet transform particle swarm optimization support vector machine (WT-PSO-SVM) model is proposed and applied for streamflow time series prediction. Firstly, the streamflow time series were decomposed into various details (Ds) and an approximation (A3) at three resolution levels (2¹, 2², 2³) using the Daubechies (db3) discrete wavelet. Correlation coefficients between each D sub-time series and the original monthly streamflow time series were calculated. D components with high correlation coefficients (D3) were added to the approximation (A3) as the input values of the SVM model. Secondly, PSO was employed to select the optimal parameters C, ε, and σ of the SVM model. Finally, the WT-PSO-SVM models were trained and tested on the monthly streamflow time series of Tangnaihai Station, located on the upper Yellow River, from January 1956 to December 2008. The test results indicate that the WT-PSO-SVM approach provides a superior alternative to the single SVM model for forecasting monthly streamflow in situations where models of the internal structure of the watershed cannot be formulated.

  18. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs...... with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement...

  19. Validation of spectral gas radiation models under oxyfuel conditions

    Energy Technology Data Exchange (ETDEWEB)

    Becher, Johann Valentin

    2013-05-15

    Combustion of hydrocarbon fuels with pure oxygen results in a different flue gas composition than combustion with air. Standard computational-fluid-dynamics (CFD) spectral gas radiation models for air combustion are therefore out of their validity range in oxyfuel combustion. This thesis provides a common spectral basis for the validation of new spectral models. A literature review about fundamental gas radiation theory, spectral modeling and experimental methods provides the reader with a basic understanding of the topic. In the first results section, this thesis validates detailed spectral models with high-resolution spectral measurements in a gas cell with the aim of recommending one model as the best benchmark model. In the second results section, spectral measurements from a turbulent natural gas flame - as an example of a technical combustion process - are compared to simulated spectra based on measured gas atmospheres. The third results section compares simplified spectral models to the benchmark model recommended in the first results section and gives a ranking of the proposed models based on their accuracy. A concluding section gives recommendations for the selection and further development of simplified spectral radiation models. Gas cell transmissivity spectra in the spectral range of 2.4-5.4 µm of water vapor and carbon dioxide, in the temperature range from 727 °C to 1500 °C and at different concentrations, were compared in the first results section at a nominal resolution of 32 cm⁻¹ to line-by-line models from different databases, two statistical-narrow-band models and the exponential-wide-band model. The two statistical-narrow-band models EM2C and RADCAL showed good agreement, with a maximal band transmissivity deviation of 3%. The exponential-wide-band model showed a deviation of 6%. The new line-by-line database HITEMP2010 had the lowest band transmissivity deviation of 2.2% and was therefore recommended as a reference model for the

  20. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    One of the most interesting aspects of modelling and simulation study is to describe real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas; for instance, geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  2. Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation

    Science.gov (United States)

    Zhao, T.; Cai, X.; Yang, D.

    2010-12-01

    Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecast as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover
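
The Martingale Model of Forecast Evolution borrowed from supply chain management can be sketched as follows; the Gaussian update increments and the sigma schedule below are illustrative assumptions, not the paper's calibration.

```python
import random

def mmfe_forecast_path(actual, sigmas, rng):
    """One simulated forecast path under the Martingale Model of Forecast
    Evolution: the forecast of a fixed target flow is revised each period
    by an independent zero-mean update, so the forecast error at lead L is
    the sum of the L updates not yet revealed; at lead 0 the flow is observed."""
    updates = [rng.gauss(0.0, s) for s in sigmas]
    path = [actual + sum(updates[:lead]) for lead in range(len(sigmas), 0, -1)]
    path.append(actual)  # lead 0: perfect information
    return path

rng = random.Random(42)
path = mmfe_forecast_path(100.0, sigmas=[4.0, 3.0, 2.0, 1.0], rng=rng)
```

Averaged over many such paths, the forecast-error variance shrinks as the lead time shrinks, which is the uncertainty-evolution property the study feeds into the DP/SDP/SOP operation models.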

  3. Validation of Computer Models for Homeland Security Purposes

    International Nuclear Information System (INIS)

    Schweppe, John E.; Ely, James; Kouzes, Richard T.; McConn, Ronald J.; Pagh, Richard T.; Robinson, Sean M.; Siciliano, Edward R.; Borgardt, James D.; Bender, Sarah E.; Earnhart, Alison H.

    2005-01-01

    At Pacific Northwest National Laboratory, we are developing computer models of radiation portal monitors for screening vehicles and cargo. Detailed models of the radiation detection equipment, vehicles, cargo containers, cargos, and radioactive sources have been created. These are used to determine the optimal configuration of detectors and the best alarm algorithms for the detection of items of interest while minimizing nuisance alarms due to the presence of legitimate radioactive material in the commerce stream. Most of the modeling is done with the Monte Carlo code MCNP to describe the transport of gammas and neutrons from extended sources through large, irregularly shaped absorbers to large detectors. A fundamental prerequisite is the validation of the computational models against field measurements. We describe the first step of this validation process, the comparison of the models to measurements with bare static sources

  4. Thermal hydraulic model validation for HOR mixed core fuel management

    International Nuclear Information System (INIS)

    Gibcus, H.P.M.; Vries, J.W. de; Leege, P.F.A. de

    1997-01-01

    A thermal-hydraulic core management model has been developed for the Hoger Onderwijsreactor (HOR), a 2 MW pool-type university research reactor. The model was adopted for safety analysis purposes in the framework of HEU/LEU core conversion studies. It is applied in the thermal-hydraulic computer code SHORT (Steady-state HOR Thermal-hydraulics), which is presently in use for designing core configurations and for in-core fuel management. An elaborate measurement program was performed to establish the core hydraulic characteristics for a variety of conditions. The hydraulic data were obtained with a dummy fuel element with special equipment allowing, among other things, direct measurement of the true core flow rate. Using these data, the thermal-hydraulic model was validated experimentally. The model, experimental tests, and model validation are discussed. (author)

  5. Validation of the newborn larynx modeling with aerodynamical experimental data.

    Science.gov (United States)

    Nicollas, R; Giordano, J; Garrel, R; Medale, M; Caminat, P; Giovanni, A; Ouaknine, M; Triglia, J M

    2009-06-01

    Many authors have studied the modeling of the adult larynx, but the mechanisms of newborn voice production have very rarely been investigated. After validating a numerical model with acoustic data, studies were performed on larynges of human fetuses in order to validate this model with aerodynamical experiments. Anatomical measurements were performed and a simplified numerical model was built using Fluent(R) with the vocal folds in phonatory position. The results obtained are in good agreement with those obtained by laser Doppler velocimetry (LDV) and high-frame-rate particle image velocimetry (HFR-PIV) on an experimental bench with excised human fetus larynges. It appears that computing with first-cry physiological parameters leads to a model which is close to that observed in experiments with real organs.

  6. Predictability of soil moisture and streamflow on subseasonal timescales: A case study

    Science.gov (United States)

    Orth, Rene; Seneviratne, Sonia I.

    2013-10-01

    Hydrological forecasts constitute an important tool in water resource management, especially in the case of impending extreme events. This study investigates the potential predictability of soil moisture and streamflow in Switzerland using a conceptual model including a simple water balance representation and a snow module. Our results show that simulated soil moisture and streamflow are more predictable (as indicated by significantly improved performance compared to climatology) up to lead times of approximately 1 week and 2-3 days, respectively, when using initial soil moisture information and climatological atmospheric forcing. When initial snow information and seasonal weather forecasts are also used as forcing, the predictable lead time doubles in the case of soil moisture and triples for streamflow. The skill contributions of the additional information vary with altitude: at low altitudes the precipitation forecast is most important, whereas in mountainous areas the temperature forecast and the initial snow information are the most valuable contributors. We find furthermore that the soil moisture and streamflow forecast skills increase with increasing initial soil moisture anomalies. Comparing the respective value of realistic initial conditions and state-of-the-art forcing forecasts, we show that the former are generally more important for soil moisture forecasts, whereas the latter are more valuable for streamflow forecasts. To relate the derived predictabilities to the respective soil moisture and streamflow memories investigated in other publications, we additionally illustrate the similarity between the concepts of memory and predictability as measures of persistence in the last part of this study.
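
The memory/persistence concept invoked in the last sentence is commonly quantified via lagged autocorrelation; a minimal sketch, using a synthetic red-noise (AR(1)) series as a stand-in for a soil-moisture anomaly record:

```python
import random

def lag_autocorrelation(x, lag):
    """Sample autocorrelation of series x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

# Synthetic AR(1) anomaly series: the higher the lag-1 autocorrelation,
# the longer the memory, and hence the longer the lead time over which
# initial conditions carry forecast skill.
rng = random.Random(1)
x = [0.0]
for _ in range(499):
    x.append(0.9 * x[-1] + rng.gauss(0.0, 1.0))
r1 = lag_autocorrelation(x, 1)
```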

  7. Deducing Climatic Elasticity to Assess Projected Climate Change Impacts on Streamflow Change across China

    Science.gov (United States)

    Liu, Jianyu; Zhang, Qiang; Zhang, Yongqiang; Chen, Xi; Li, Jianfeng; Aryal, Santosh K.

    2017-10-01

    Climatic elasticity has been widely applied to assess streamflow responses to climate change. To fully assess the impacts of climate under global warming on streamflow and to reduce the error and uncertainty from various control variables, we develop a four-parameter (precipitation, catchment characteristics n, and maximum and minimum temperatures) climatic elasticity method named PnT, based on the widely used Budyko framework and the simplified Makkink equation. We use this method to carry out the first comprehensive evaluation of the streamflow response to potential climate change for 372 widely spread catchments in China. The PnT climatic elasticity was first evaluated for the period 1980-2000, and then used to evaluate the streamflow response to climate change based on 12 global climate models under the Representative Concentration Pathway 2.6 (RCP2.6) and RCP8.5 emission scenarios. The results show that (1) the PnT climatic elasticity method is reliable; (2) streamflow is projected to increase in more than 60% of the selected catchments, with mean increments of 9% and 15.4% under RCP2.6 and RCP8.5, respectively; and (3) uncertainties in the projected streamflow are considerable in several regions, such as the Pearl River and Yellow River, with more than 40% of the selected catchments showing inconsistent change directions. Our results can help Chinese policy makers to manage and plan water resources more effectively, and the PnT climatic elasticity method can be applied to other parts of the world.
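
    The idea behind Budyko-based elasticity can be sketched generically. The snippet below uses the Choudhury-Yang form of the Budyko curve with a catchment parameter n, and evaluates the precipitation elasticity of streamflow, (dQ/dP)(P/Q), by a finite difference; it is an illustrative stand-in, not the paper's four-parameter PnT formulation, and the numerical values are invented:

```python
def budyko_evaporation(P, E0, n):
    """Mean annual actual evaporation from the Choudhury-Yang form of the
    Budyko curve: E = P*E0 / (P**n + E0**n)**(1/n)."""
    return P * E0 / (P**n + E0**n) ** (1.0 / n)

def precipitation_elasticity(P, E0, n, dP=1e-3):
    """Elasticity of streamflow to precipitation, (dQ/dP)*(P/Q),
    with Q = P - E, evaluated by a central difference."""
    Q = lambda p: p - budyko_evaporation(p, E0, n)
    dQdP = (Q(P + dP) - Q(P - dP)) / (2.0 * dP)
    return dQdP * P / Q(P)

# Illustrative values (mm/yr); the catchment parameter n is site-specific.
print(precipitation_elasticity(P=800.0, E0=1000.0, n=2.0))
```

    Elasticities above 1, as produced here, mean a 1% precipitation change is amplified into a larger percentage streamflow change.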

  8. Relative contributions of transient and steady state infiltration during ephemeral streamflow

    Science.gov (United States)

    Blasch, Kyle W.; Ferré, Ty P.A.; Hoffmann, John P.; Fleming, John B.

    2006-01-01

    Simulations of infiltration during three ephemeral streamflow events in a coarse-grained alluvial channel overlying a less permeable basin-fill layer were conducted to determine the relative contribution of transient infiltration at the onset of streamflow to cumulative infiltration for the event. Water content, temperature, and piezometric measurements from 2.5-m vertical profiles within the alluvial sediments were used to constrain a variably saturated water flow and heat transport model. Simulated and measured transient infiltration rates at the onset of streamflow were about two to three orders of magnitude greater than steady state infiltration rates. The duration of simulated transient infiltration ranged from 1.8 to 20 hours, compared with steady state flow periods of 231 to 307 hours. Cumulative infiltration during the transient period represented 10 to 26% of the total cumulative infiltration, with an average contribution of approximately 18%. Cumulative infiltration error for the simulated streamflow events ranged from 9 to 25%. Cumulative infiltration error for typical streamflow events of about 8 hours in duration is about 90%. This analysis indicates that when estimating total cumulative infiltration in coarse-grained ephemeral stream channels, consideration of the transient infiltration at the onset of streamflow will improve predictions of the total volume of infiltration that may become groundwater recharge.

  9. Validation of the dynamic model for a pressurized water reactor

    International Nuclear Information System (INIS)

    Zwingelstein, Gilles.

    1979-01-01

    Dynamic model validation is a necessary procedure to assure that the developed empirical or physical models satisfactorily represent the dynamic behavior of the actual plant during normal or abnormal transients. For small transients, physical models are described which represent the isolated core, the isolated steam generator, and the overall pressurized water reactor. Using data collected during the step power changes that occurred during the startup procedures, comparisons of simulated and actual transients are given at 30% and 100% of full power. The agreement between the transients derived from the model and those recorded on the plant indicates that the developed models are well suited for functional or control studies.

  10. Development and validation of a mass casualty conceptual model.

    Science.gov (United States)

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.
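
    The consensus criteria described above are mechanical enough to express in code. The sketch below checks one 7-point Likert item against the stated thresholds; the cutoff defining "agreement" (ratings of 6 or above) and the per-expert interpretation of the stability criterion are assumptions for illustration, not details given in the abstract:

```python
import numpy as np

def delphi_consensus(ratings, agree_at=6, iqr_max=1.0, agree_min=0.70):
    """Consensus for one item: interquartile range of at most one scale
    point AND at least 70% agreement (here taken as ratings >= agree_at;
    that cutoff is an assumption)."""
    ratings = np.asarray(ratings, dtype=float)
    iqr = np.percentile(ratings, 75) - np.percentile(ratings, 25)
    agreement = np.mean(ratings >= agree_at)
    return bool(iqr <= iqr_max and agreement >= agree_min)

def stable_between_rounds(prev, curr, max_shift=0.15):
    """Stability: fewer than 15% of experts changed their rating between
    rounds (one plausible reading of the distribution-change criterion)."""
    prev, curr = np.asarray(prev), np.asarray(curr)
    return bool(np.mean(prev != curr) < max_shift)

print(delphi_consensus([6, 6, 7, 6, 5, 7, 6, 6]))
```

    Rounds would continue until every retained indicator satisfies both checks.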

  11. Verification and Validation of FAARR Model and Data Envelopment Analysis Models for United States Army Recruiting

    National Research Council Canada - National Science Library

    Piskator, Gene

    1998-01-01

    ...) model and to develop a Data Envelopment Analysis (DEA) modeling strategy. First, the FAARR model was verified using a simulation of a known production function and validated using sensitivity analysis and ex-post forecasts...

  12. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

    The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities by calculations of appropriate experiments done by external users different from the code developers. During the development process of ASTEC CPA, bugs and unintended side effects may occur, which leads to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented in order to ensure that the initially obtained validation results remain valid with succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but applying two different code versions. For every test-case the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed to be characteristic for the test-case under consideration. In the case of post-calculations of experiments, a comparison to experimental data is also carried out. Three validation cases from the regression testing procedure are presented within this paper. The very good post-calculation of the HDR E11.1 experiment shows the high quality of the thermal-hydraulics modelling in ASTEC CPA. Aerosol behaviour is validated on the BMC VANAM M3 experiment, and the results also show very good agreement with experimental data. Finally, iodine behaviour is checked in the validation test-case of the THAI IOD-11 experiment. Within this test-case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing.
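
    The core of such a regression check, comparing characteristic physical parameters between two code versions run on the same input deck, can be sketched as follows. The parameter names, values, and the 2% tolerance are illustrative placeholders, not GRS acceptance criteria:

```python
def regression_check(reference, candidate, rel_tol=0.02):
    """Compare characteristic physical parameters (e.g. peak pressure,
    aerosol mass) computed by two code versions on the same input deck.
    Returns the names of parameters whose relative deviation exceeds
    rel_tol, or which are missing from the candidate results."""
    failures = []
    for name, ref in reference.items():
        new = candidate.get(name)
        if new is None or abs(new - ref) > rel_tol * abs(ref):
            failures.append(name)
    return failures

# Hypothetical characteristic values from two versions on one test-case.
v1 = {"peak_pressure_bar": 2.41, "max_temperature_K": 403.0}
v2 = {"peak_pressure_bar": 2.42, "max_temperature_K": 421.0}
print(regression_check(v1, v2))
```

    A non-empty failure list flags exactly the situation the paper describes: a succeeding code version silently drifting away from the validated baseline.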

  13. Validation of computational model ALDERSON/EGSnrc for chest radiography

    International Nuclear Information System (INIS)

    Muniz, Bianca C.; Santos, André L. dos; Menezes, Claudio J.M.

    2017-01-01

    To perform dose studies in situations of exposure to radiation without exposing individuals, numerical dosimetry uses Computational Exposure Models (ECMs). Composed essentially of a radioactive source simulator algorithm, a voxel phantom representing the human anatomy, and a Monte Carlo code, ECMs must be validated to determine the reliability of the representation of the physical arrangement. The objective of this work is to validate the ALDERSON/EGSnrc ECM through comparisons between experimental measurements obtained with an ionization chamber and virtual simulations using the Monte Carlo method to determine the ratio of the input and output radiation dose. Preliminary results of these comparisons showed that the ECM reproduced the results of the experimental measurements performed with the physical phantom with a relative error of less than 10%, validating the use of this model for simulations of chest radiographs and estimates of radiation doses in tissues of the irradiated structures.

  14. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal

    2011-04-01

    As the number of simulation experiments increases, the necessity for validation and verification of these models demands special attention on the part of simulation practitioners. Analysis of the current scientific literature shows that the operational validation descriptions presented in many papers do not agree on the importance assigned to this process or on the techniques applied, whether subjective or objective. With the aim of orienting professionals, researchers, and students in simulation, this article elaborates a practical guide through the compilation of statistical techniques for the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated using two study objects, which represent two manufacturing cells, one from the automobile industry and the other from a Brazilian tech company. For each application, the guide identified distinct steps, due to the different aspects that characterize the analyzed distributions.

  15. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    International Nuclear Information System (INIS)

    Apostolakis, J; Burkhardt, H; Ivanchenko, V N; Asai, M; Bagulya, A; Grichine, V; Brown, J M C; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Jacquemier, J; Guatelli, S; Incerti, S; Kadri, O; Maire, M; Urban, L; Pandola, L; Sawkey, D; Toshito, T; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0 that are planned to be used for production for run-2 at LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed. (paper)

  16. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  17. Multiphysics modelling and experimental validation of high concentration photovoltaic modules

    International Nuclear Information System (INIS)

    Theristis, Marios; Fernández, Eduardo F.; Sumner, Mike; O'Donovan, Tadhg S.

    2017-01-01

    Highlights: • A multiphysics modelling approach for concentrating photovoltaics was developed. • An experimental campaign was conducted to validate the models. • The experimental results were in good agreement with the models. • The multiphysics modelling allows the concentrator’s optimisation. - Abstract: High concentration photovoltaics, equipped with high efficiency multijunction solar cells, have great potential in achieving cost-effective and clean electricity generation at utility scale. Such systems are more complex compared to conventional photovoltaics because of the multiphysics effect that is present. Modelling the power output of such systems is therefore crucial for their further market penetration. Following this line, a multiphysics modelling procedure for high concentration photovoltaics is presented in this work. It combines an open source spectral model, a single diode electrical model and a three-dimensional finite element thermal model. In order to validate the models and the multiphysics modelling procedure against actual data, an outdoor experimental campaign was conducted in Albuquerque, New Mexico using a high concentration photovoltaic monomodule that is thoroughly described in terms of its geometry and materials. The experimental results were in good agreement (within 2.7%) with the predicted maximum power point. This multiphysics approach is relatively more complex when compared to empirical models, but besides the overall performance prediction it can also provide better understanding of the physics involved in the conversion of solar irradiance into electricity. It can therefore be used for the design and optimisation of high concentration photovoltaic modules.
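
    Of the three coupled models, the single-diode electrical model is the most self-contained to sketch. The implicit diode equation must be solved numerically for the current; here by bisection. All parameter values below are illustrative placeholders, not fitted values from the paper's monomodule:

```python
import math

def diode_current(V, Iph, I0=1e-10, n=3.5, Rs=0.01, Rsh=1e4, Vt=0.02585):
    """Solve the implicit single-diode equation
       I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
    for I by bisection; f(I) below is monotone decreasing in I."""
    f = lambda I: (Iph - I0 * (math.exp((V + I * Rs) / (n * Vt)) - 1)
                   - (V + I * Rs) / Rsh - I)
    lo, hi = -Iph, 2 * Iph
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Near short circuit the current approaches the photocurrent.
print(diode_current(V=0.0, Iph=5.0))
```

    In the full multiphysics chain, Iph would come from the spectral model and the cell temperature entering Vt from the finite element thermal model.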

  18. Monthly streamflow forecasting using continuous wavelet and multi-gene genetic programming combination

    Science.gov (United States)

    Hadi, Sinan Jasim; Tombul, Mustafa

    2018-06-01

    Streamflow is an essential component of the hydrologic cycle at the regional and global scale and the main source of fresh water supply. It is highly associated with natural disasters, such as droughts and floods. Therefore, accurate streamflow forecasting is essential. Forecasting streamflow in general and monthly streamflow in particular is a complex process that cannot be handled by data-driven models (DDMs) alone and requires pre-processing. Wavelet transformation is a pre-processing technique; however, application of continuous wavelet transformation (CWT) produces many scales that cause deterioration in the performance of any DDM because of the high number of redundant variables. This study proposes multigene genetic programming (MGGP) as a selection tool. After the CWT analysis, it selects important scales to be imposed into the artificial neural network (ANN). A basin located in the southeast of Turkey is selected as a case study to prove the forecasting ability of the proposed model. One-month-ahead downstream flow is used as output, and downstream flow, upstream flow, rainfall, temperature, and potential evapotranspiration with associated lags are used as inputs. Before modeling, wavelet coherence transformation (WCT) analysis was conducted to analyze the relationship between the variables in the time-frequency domain. Several combinations were developed to investigate the effect of the variables on streamflow forecasting. The results indicated a high localized correlation between the streamflow and the other variables, especially the upstream flow. In the models of the standalone layout, where the data were entered into ANN and MGGP without CWT, the performance was poor. In the best-scale layout, where the best scale of the CWT, identified as the highest correlated scale, is chosen and entered into ANN and MGGP, the performance increased slightly. Using the proposed model, the performance improved dramatically, particularly in forecasting the peak values, because of the inclusion
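
    The scale-selection step can be illustrated with a numpy-only CWT. The sketch below uses a Ricker (Mexican-hat) wavelet as a stand-in for the wavelets typically used in hydrological studies, and ranks scales by plain correlation with the one-month-ahead target; in the paper this selection role is played by MGGP, so treat this as a simplified proxy:

```python
import numpy as np

def ricker(points, scale):
    """Ricker (Mexican-hat) wavelet sampled at `points` positions."""
    t = np.arange(points) - (points - 1) / 2.0
    a = 2.0 / (np.sqrt(3.0 * scale) * np.pi ** 0.25)
    return a * (1.0 - (t / scale) ** 2) * np.exp(-0.5 * (t / scale) ** 2)

def cwt(signal, scales):
    """Continuous wavelet transform by direct convolution; one row per scale."""
    return np.array([
        np.convolve(signal, ricker(min(10 * s, len(signal)), s), mode="same")
        for s in scales
    ])

def best_scales(coeffs, target, k=3):
    """Rank scales by absolute correlation of their coefficients with the
    forecast target and keep the top k (a simple proxy for MGGP selection)."""
    corr = [abs(np.corrcoef(row, target)[0, 1]) for row in coeffs]
    return list(np.argsort(corr)[::-1][:k])
```

    Feeding only the retained rows of `coeffs` to the ANN is what keeps the redundant scales from degrading its performance.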

  19. On the sensitivity of annual streamflow to air temperature

    Science.gov (United States)

    Milly, Paul C.D.; Kam, Jonghun; Dunne, Krista A.

    2018-01-01

    Although interannual streamflow variability is primarily a result of precipitation variability, temperature also plays a role. The relative weakness of the temperature effect at the annual time scale hinders understanding, but may belie substantial importance on climatic time scales. Here we develop and evaluate a simple theory relating variations of streamflow and evapotranspiration (E) to those of precipitation (P) and temperature. The theory is based on extensions of the Budyko water-balance hypothesis, the Priestley-Taylor theory for potential evapotranspiration (Ep), and a linear model of interannual basin storage. The theory implies that temperature affects streamflow by modifying evapotranspiration through a Clausius-Clapeyron-like relation and through the sensitivity of net radiation to temperature. We apply and test (1) a previously introduced "strong" extension of the Budyko hypothesis, which requires that the function linking temporal variations of the evapotranspiration ratio (E/P) and the index of dryness (Ep/P) at an annual time scale is identical to that linking interbasin variations of the corresponding long-term means, and (2) a "weak" extension, which requires only that the annual evapotranspiration ratio depends uniquely on the annual index of dryness, and that the form of that dependence need not be known a priori nor be identical across basins. In application of the weak extension, the readily observed sensitivity of streamflow to precipitation contains crucial information about the sensitivity to potential evapotranspiration and, thence, to temperature. Implementation of the strong extension is problematic, whereas the weak extension appears to capture essential controls of the temperature effect efficiently.
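
    The Clausius-Clapeyron-like temperature pathway enters through the Priestley-Taylor potential evapotranspiration. A standard sketch of that relation (FAO-style saturation vapour pressure slope; the coefficient values are conventional defaults, not the paper's calibration):

```python
import math

def priestley_taylor_pet(T_c, Rn, alpha=1.26, gamma=0.066, lam=2.45):
    """Priestley-Taylor potential evapotranspiration in mm/day.
    T_c: air temperature (deg C); Rn: net radiation (MJ m-2 day-1).
    The slope of the saturation vapour pressure curve (delta) supplies
    the Clausius-Clapeyron-like temperature dependence."""
    es = 0.6108 * math.exp(17.27 * T_c / (T_c + 237.3))   # kPa
    delta = 4098.0 * es / (T_c + 237.3) ** 2              # kPa per deg C
    return alpha * (delta / (delta + gamma)) * Rn / lam

# Warming raises Ep through the delta/(delta+gamma) term:
print(priestley_taylor_pet(20.0, 12.0), priestley_taylor_pet(22.0, 12.0))
```

    Because Ep enters the Budyko index of dryness (Ep/P), this monotone increase with temperature is what ultimately translates into a streamflow sensitivity.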

  20. Improving Perovskite Solar Cells: Insights From a Validated Device Model

    NARCIS (Netherlands)

    Sherkar, Tejas S.; Momblona, Cristina; Gil-Escrig, Lidon; Bolink, Henk J.; Koster, L. Jan Anton

    2017-01-01

    To improve the efficiency of existing perovskite solar cells (PSCs), a detailed understanding of the underlying device physics during their operation is essential. Here, a device model has been developed and validated that describes the operation of PSCs and quantitatively explains the role of

  1. Model validation studies of solar systems, Phase III. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lantz, L.J.; Winn, C.B.

    1978-12-01

    Results obtained from a validation study of the TRNSYS, SIMSHAC, and SOLCOST solar system simulation and design programs are presented. Also included are comparisons between the FCHART and SOLCOST solar system design programs and some changes that were made to the SOLCOST program. Finally, results obtained from the analysis of several solar radiation models are presented. Separate abstracts were prepared for ten papers.

  2. Validation of a tuber blight (Phytophthora infestans) prediction model

    Science.gov (United States)

    Potato tuber blight caused by Phytophthora infestans accounts for significant losses in storage. There is limited published quantitative data on predicting tuber blight. We validated a tuber blight prediction model developed in New York with cultivars Allegany, NY 101, and Katahdin using independent...

  3. Validating soil phosphorus routines in the SWAT model

    Science.gov (United States)

    Phosphorus transfer from agricultural soils to surface waters is an important environmental issue. Commonly used models like SWAT have not always been updated to reflect improved understanding of soil P transformations and transfer to runoff. Our objective was to validate the ability of the P routin...

  4. Temporal validation for landsat-based volume estimation model

    Science.gov (United States)

    Renaldo J. Arroyo; Emily B. Schultz; Thomas G. Matney; David L. Evans; Zhaofei Fan

    2015-01-01

    Satellite imagery can potentially reduce the costs and time associated with ground-based forest inventories; however, for satellite imagery to provide reliable forest inventory data, it must produce consistent results from one time period to the next. The objective of this study was to temporally validate a Landsat-based volume estimation model in a four county study...

  5. Multiphysics software and the challenge to validating physical models

    International Nuclear Information System (INIS)

    Luxat, J.C.

    2008-01-01

    This paper discusses multiphysics software and the validation of physical models in the nuclear industry. The major challenge is to convert a general-purpose software package into a robust application-specific solution. This requires greater knowledge of the underlying solution techniques and of the limitations of the packages. Good user interfaces and neat graphics do not compensate for any deficiencies.

  6. Technical Note: Calibration and validation of geophysical observation models

    NARCIS (Netherlands)

    Salama, M.S.; van der Velde, R.; van der Woerd, H.J.; Kromkamp, J.C.; Philippart, C.J.M.; Joseph, A.T.; O'Neill, P.E.; Lang, R.H.; Gish, T.; Werdell, P.J.; Su, Z.

    2012-01-01

    We present a method to calibrate and validate observational models that interrelate remotely sensed energy fluxes to geophysical variables of land and water surfaces. Coincident sets of remote sensing observation of visible and microwave radiations and geophysical data are assembled and subdivided

  7. Has streamflow changed in the Nordic countries?

    Energy Technology Data Exchange (ETDEWEB)

    Hisdal, Hege; Holmqvist, Erik; Jonsdottir, Jona Finndis; Jonsson, Pall; Kuusisto, Esko; Lindstroem, Goeran; Roald, Lars A.

    2010-01-15

    Climate change studies traditionally include elaboration of possible scenarios for the future and attempts to detect a climate change signal in historical data. This study focuses on the latter. A pan-Nordic dataset of more than 160 streamflow records was analysed to detect spatial and temporal changes in streamflow. The Mann-Kendall trend test was applied to study changes in annual and seasonal streamflow as well as floods and droughts for three periods: 1961-2000, 1941-2002 and 1920-2002. The period analysed and the selection of stations influenced the regional patterns found, but the overall picture was that trends towards increased streamflow were dominant for annual values and the winter and spring seasons. Trends in summer flow depended highly on the period analysed, whereas no trend was found for the autumn season. A signal towards earlier snowmelt floods was clear, and a tendency towards more severe summer droughts was found in southern Norway. A qualitative comparison of the findings with available streamflow scenarios for the region showed that the strongest trends found are coherent with changes expected in the scenario period, for example increased winter discharge and earlier snowmelt floods. However, there are also expected changes that are not reflected in the trends, such as the expected increase in autumn discharge in Norway. It can be concluded that the observed temperature increase has clearly affected streamflow in the Nordic countries. These changes correspond well with the estimated consequences of a projected temperature increase. The effect of the observed and projected precipitation increase on streamflow is less clear.
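
    The Mann-Kendall test used here is simple enough to state in full. The sketch below computes the standard Z statistic for a series without ties; operational use (as in the study) would add tie corrections and, often, corrections for serial correlation:

```python
import math

def mann_kendall_z(x):
    """Mann-Kendall trend test statistic Z for a series without ties.
    |Z| > 1.96 indicates a trend at the 5% significance level."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # no-ties variance of S
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0
```

    Applied to each station's annual or seasonal series, positive significant Z values correspond to the increasing-streamflow trends reported above.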

  8. Validating firn compaction model with remote sensing data

    DEFF Research Database (Denmark)

    Simonsen, S. B.; Stenseng, Lars; Sørensen, Louise Sandberg

    A comprehensive understanding of firn processes is of utmost importance when estimating present and future changes of the Greenland Ice Sheet. Especially when remote sensing altimetry is used to assess the state of ice sheets and their contribution to global sea level rise, firn compaction… models have been shown to be a key component. Now, remote sensing data can also be used to validate the firn models. Radar penetrating the upper part of the firn column in the interior part of Greenland shows a clear layering. The observed layers from the radar data can be used as an in-situ validation… correction relative to the changes in the elevation of the surface observed with remote sensing altimetry? What model time resolution is necessary to resolve the observed layering? What model refinements are necessary to give better estimates of the surface mass balance of the Greenland ice sheet from…

  9. Selection, calibration, and validation of models of tumor growth.

    Science.gov (United States)

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach is presented that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine whether the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory

  10. Escherichia coli bacteria density in relation to turbidity, streamflow characteristics, and season in the Chattahoochee River near Atlanta, Georgia, October 2000 through September 2008—Description, statistical analysis, and predictive modeling

    Science.gov (United States)

    Lawrence, Stephen J.

    2012-01-01

    Water-based recreation—such as rafting, canoeing, and fishing—is popular among visitors to the Chattahoochee River National Recreation Area (CRNRA) in north Georgia. The CRNRA is a 48-mile reach of the Chattahoochee River upstream from Atlanta, Georgia, managed by the National Park Service (NPS). Historically, high densities of fecal-indicator bacteria have been documented in the Chattahoochee River and its tributaries at levels that commonly exceeded Georgia water-quality standards. In October 2000, the NPS partnered with the U.S. Geological Survey (USGS), State and local agencies, and non-governmental organizations to monitor Escherichia coli bacteria (E. coli) density and develop a system to alert river users when E. coli densities exceeded the U.S. Environmental Protection Agency (USEPA) single-sample beach criterion of 235 colonies (most probable number) per 100 milliliters (MPN/100 mL) of water. This program, called BacteriALERT, monitors E. coli density, turbidity, and water temperature at two sites on the Chattahoochee River upstream from Atlanta, Georgia. This report summarizes E. coli bacteria density and turbidity values in water samples collected between 2000 and 2008 as part of the BacteriALERT program; describes the relations between E. coli density and turbidity, streamflow characteristics, and season; and describes the regression analyses used to develop predictive models that estimate E. coli density in real time at both sampling sites.
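
    Such predictive models commonly take log10(E. coli) to be linear in log10(turbidity) and are fit by ordinary least squares. The sketch below trains on synthetic data with invented coefficients (it is not the BacteriALERT model) and flags exceedance of the 235 MPN/100 mL single-sample criterion cited above:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic training data: log10(E. coli) roughly linear in log10(turbidity).
# The slope, intercept, and noise level are invented for illustration only.
log_turb = rng.uniform(0.3, 2.5, 150)
log_ecoli = 0.9 * log_turb + 1.2 + rng.normal(0.0, 0.25, 150)

slope, intercept = np.polyfit(log_turb, log_ecoli, 1)

def predict_ecoli(turbidity_ntu):
    """Estimated E. coli density (MPN/100 mL) from turbidity (NTU)."""
    return 10 ** (slope * np.log10(turbidity_ntu) + intercept)

# Flag exceedance of the USEPA 235 MPN/100 mL single-sample criterion.
print(predict_ecoli(50.0), bool(predict_ecoli(50.0) > 235))
```

    In a real-time alert system, the same prediction would simply be evaluated against the criterion each time a new turbidity reading arrives.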

  11. Long-term variation analysis of a tropical river's annual streamflow regime over a 50-year period

    Science.gov (United States)

    Seyam, Mohammed; Othman, Faridah

    2015-07-01

    Studying the long-term changes of streamflow is an important tool for enhancing water resource and river system planning, design, and management. The aim of this work is to identify the long-term variations in annual streamflow regime over a 50-year period from 1961 to 2010 in the Selangor River, which is one of the main tropical rivers in Malaysia. Initially, the data underwent preliminary independence, normality, and homogeneity testing using the Pearson correlation coefficient and the Shapiro-Wilk and Pettitt's tests, respectively. The work includes a study and analysis of the changes through nine variables describing the annual streamflow and variations in the yearly duration of high and low streamflows. The analyses were conducted on two time scales: yearly and sub-periodic. The sub-periods were obtained by segmenting the 50 years into seven sub-periods by two techniques, namely the change-point test and the direct method. Even though the analysis revealed nearly negligible changes in mean annual flow over the study period, the maximum annual flow generally increased while the minimum annual flow significantly decreased with respect to time. It was also observed that the variables describing the dispersion in streamflow continually increased with respect to time. An obvious increase was detected in the yearly duration of the danger level of streamflow, a slight increase was noted in the yearly duration of the warning and alert levels, and a slight decrease in the yearly duration of low streamflow was found. The detected changes confirm the existence of long-term changes in the annual streamflow regime, which increase the probability of floods and droughts occurring in the future. In light of these results, attention should be drawn to developing water resource management and flood protection plans in order to avert the harmful effects potentially resulting from the expected changes in the annual streamflow regime.
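
    Pettitt's test, used above for homogeneity and change-point detection, has a compact closed form. A direct (if naive, O(n^3)) sketch returning the most probable change point and its approximate significance level:

```python
import math

def pettitt(x):
    """Pettitt's change-point test. Returns the most probable change point
    (last index of the first segment) and the approximate two-sided
    significance level p = 2*exp(-6*K^2 / (n^3 + n^2))."""
    n = len(x)
    # U_t = sum over i <= t, j > t of sign(x[j] - x[i])
    u = []
    for t in range(n - 1):
        u_t = sum(
            (x[j] > x[i]) - (x[j] < x[i])
            for i in range(t + 1)
            for j in range(t + 1, n)
        )
        u.append(u_t)
    k = max(abs(v) for v in u)
    t_star = max(range(n - 1), key=lambda t: abs(u[t]))
    p = 2.0 * math.exp(-6.0 * k * k / (n ** 3 + n ** 2))
    return t_star, p

# A series with an abrupt shift in its mean:
series = [0.0] * 20 + [5.0] * 20
print(pettitt(series))
```

    A small p locates a statistically significant break, which is how the 50-year record can be segmented into homogeneous sub-periods.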

  12. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  13. Verification and Validation of Heat Transfer Model of AGREE Code

    Energy Technology Data Exchange (ETDEWEB)

    Tak, N. I. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Seker, V.; Drzewiecki, T. J.; Downar, T. J. [Department of Nuclear Engineering and Radiological Sciences, Univ. of Michigan, Michigan (United States); Kelly, J. M. [US Nuclear Regulatory Commission, Washington (United States)

    2013-05-15

The AGREE code was originally developed as a multi-physics simulation code to perform design and safety analysis of Pebble Bed Reactors (PBR). Additional capability for the analysis of Prismatic Modular Reactor (PMR) cores is currently under development. The newly implemented fluid model for a PMR core is based on a subchannel approach which has been widely used in the analyses of light water reactor (LWR) cores. A hexagonal fuel (or graphite) block is discretized into triangular prism nodes having effective conductivities. Then, a meso-scale heat transfer model is applied to the unit cell geometry of a prismatic fuel block. Both unit cell geometries of multi-hole and pin-in-hole types of prismatic fuel blocks are considered in AGREE. The main objective of this work is to verify and validate the heat transfer model newly implemented for a PMR core in the AGREE code. The measured data from the HENDEL experiment were used for the validation of the heat transfer model for a pin-in-hole fuel block. However, the HENDEL tests were limited to steady-state conditions of pin-in-hole fuel blocks, and no experimental data are available regarding heat transfer in multi-hole fuel blocks. Therefore, numerical benchmarks using conceptual problems are considered to verify the heat transfer model of AGREE for multi-hole fuel blocks as well as transient conditions. The CORONA and GAMMA+ codes were used to compare the numerical results. In this work, a verification and validation study was performed for the heat transfer model of the AGREE code using the HENDEL experiment and the numerical benchmarks of selected conceptual problems. The results of the present work show that the heat transfer model of AGREE is accurate and reliable for prismatic fuel blocks. Further validation of AGREE is in progress for a whole-reactor problem using HTTR safety test data such as control rod withdrawal tests and loss-of-forced-convection tests.

  14. Validation of the Colorado Retinopathy of Prematurity Screening Model.

    Science.gov (United States)

    McCourt, Emily A; Ying, Gui-Shuang; Lynch, Anne M; Palestine, Alan G; Wagner, Brandie D; Wymore, Erica; Tomlinson, Lauren A; Binenbaum, Gil

    2018-04-01

The Colorado Retinopathy of Prematurity (CO-ROP) model uses birth weight, gestational age, and weight gain at the first month of life (WG-28) to predict risk of severe retinopathy of prematurity (ROP). In previous validation studies, the model performed very well, predicting virtually all cases of severe ROP and potentially reducing the number of infants who need ROP examinations, warranting validation in a larger, more diverse population. To validate the performance of the CO-ROP model in a large multicenter cohort. This study is a secondary analysis of data from the Postnatal Growth and Retinopathy of Prematurity (G-ROP) Study, a retrospective multicenter cohort study conducted in 29 hospitals in the United States and Canada between January 2006 and June 2012 of 6351 premature infants who received ROP examinations. Sensitivity and specificity for severe (early treatment of ROP [ETROP] type 1 or 2) ROP, and reduction in infants receiving examinations. The CO-ROP model was applied to the infants in the G-ROP data set with all 3 data points (infants would have received examinations if they met all 3 criteria: birth weight, gestational age, and WG-28 within the model's screening thresholds) in this large validation cohort. The model requires all 3 criteria to be met to signal a need for examinations, but some infants with a birth weight or gestational age above the thresholds developed severe ROP. Most of these infants who were not detected by the CO-ROP model had obvious deviation in expected weight trajectories or nonphysiologic weight gain. These findings suggest that the CO-ROP model needs to be revised before considering implementation into clinical practice.
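The screening metrics reported in such validation studies (sensitivity, specificity, and the reduction in examinations) reduce to simple count arithmetic. The sketch below uses hypothetical counts, not the G-ROP results:

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and the fraction of infants spared
    an examination (model-negative infants would not be examined)."""
    sensitivity = tp / (tp + fn)          # detected / all severe-ROP cases
    specificity = tn / (tn + fp)          # negatives / all non-severe cases
    exam_reduction = (tn + fn) / (tp + fn + tn + fp)
    return sensitivity, specificity, exam_reduction

# hypothetical counts for illustration only
sens, spec, reduction = screening_metrics(tp=90, fn=10, tn=300, fp=600)
```

For a screening model, a false negative (`fn`) is a missed case of severe ROP, which is why the study emphasizes sensitivity over specificity.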

  15. In ecoregions across western USA streamflow increases during post-wildfire recovery

    Science.gov (United States)

    Wine, Michael L.; Cadol, Daniel; Makhnin, Oleg

    2018-01-01

Continued growth of the human population on Earth will increase pressure on already stressed terrestrial water resources required for drinking water, agriculture, and industry. This stress demands improved understanding of critical controls on water resource availability, particularly in water-limited regions. Mechanistic predictions of future water resource availability are needed because non-stationary conditions exist in the form of changing climatic conditions, land management paradigms, and ecological disturbance regimes. While historically ecological disturbances have been small and could be neglected relative to climatic effects, evidence is accumulating that ecological disturbances, particularly wildfire, can increase regional water availability. However, wildfire hydrologic impacts are typically estimated locally and at small spatial scales, via disparate measurement methods and analysis techniques, and outside the context of climate change projections. Consequently, the relative importance of climate change driven versus wildfire driven impacts on streamflow remains unknown across the western USA. Here we show that considering wildfire in modeling streamflow significantly improves model predictions. Mixed effects modeling attributed 2%-14% of long-term annual streamflow to wildfire effects. The importance of this wildfire-linked streamflow relative to predicted climate change-induced streamflow reductions ranged from 20%-370% of the streamflow decrease predicted to occur by 2050. The rate of post-wildfire vegetation recovery and the proportion of watershed area burned controlled the wildfire effect. Our results demonstrate that in large areas of the western USA affected by wildfire, regional predictions of future water availability are subject to greater structural uncertainty than previously thought. These results suggest that future streamflows may be underestimated in areas affected by increased prevalence of hydrologically relevant ecological disturbances.

  16. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  17. Effect of Tree-to-Shrub Type Conversion in Lower Montane Forests of the Sierra Nevada (USA) on Streamflow.

    Directory of Open Access Journals (Sweden)

    Ryan R Bart

Higher global temperatures and increased levels of disturbance are contributing to greater tree mortality in many forest ecosystems. These same drivers can also limit forest regeneration, leading to vegetation type conversion. For the Sierra Nevada of California, little is known about how type conversion may affect streamflow, a critical source of water supply for urban, agriculture and environmental purposes. In this paper, we examined the effects of tree-to-shrub type conversion, in combination with climate change, on streamflow in two lower montane forest watersheds in the Sierra Nevada. A spatially distributed ecohydrologic model was used to simulate changes in streamflow, evaporation, and transpiration following type conversion, with an explicit focus on the role of vegetation size and aspect. Model results indicated that streamflow may show negligible change or small decreases following type conversion when the difference between tree and shrub leaf areas is small, partly due to the higher stomatal conductivity and the deep rooting depth of shrubs. In contrast, streamflow may increase when post-conversion shrubs have a small leaf area relative to trees. Model estimates also suggested that vegetation change could have a greater impact on streamflow magnitude than the direct hydrologic impacts of increased temperatures. Temperature increases, however, may have a greater impact on streamflow timing. Tree-to-shrub type conversion increased streamflow only marginally during dry years (annual precipitation < 800 mm), with most streamflow change observed during wetter years. These modeling results underscore the importance of accounting for changes in vegetation communities to accurately characterize future hydrologic regimes for the Sierra Nevada.

  18. Effect of Tree-to-Shrub Type Conversion in Lower Montane Forests of the Sierra Nevada (USA) on Streamflow

    Science.gov (United States)

    Tague, Christina L.; Moritz, Max A.

    2016-01-01

    Higher global temperatures and increased levels of disturbance are contributing to greater tree mortality in many forest ecosystems. These same drivers can also limit forest regeneration, leading to vegetation type conversion. For the Sierra Nevada of California, little is known about how type conversion may affect streamflow, a critical source of water supply for urban, agriculture and environmental purposes. In this paper, we examined the effects of tree-to-shrub type conversion, in combination with climate change, on streamflow in two lower montane forest watersheds in the Sierra Nevada. A spatially distributed ecohydrologic model was used to simulate changes in streamflow, evaporation, and transpiration following type conversion, with an explicit focus on the role of vegetation size and aspect. Model results indicated that streamflow may show negligible change or small decreases following type conversion when the difference between tree and shrub leaf areas is small, partly due to the higher stomatal conductivity and the deep rooting depth of shrubs. In contrast, streamflow may increase when post-conversion shrubs have a small leaf area relative to trees. Model estimates also suggested that vegetation change could have a greater impact on streamflow magnitude than the direct hydrologic impacts of increased temperatures. Temperature increases, however, may have a greater impact on streamflow timing. Tree-to-shrub type conversion increased streamflow only marginally during dry years (annual precipitation < 800 mm), with most streamflow change observed during wetter years. These modeling results underscore the importance of accounting for changes in vegetation communities to accurately characterize future hydrologic regimes for the Sierra Nevada. PMID:27575592

  19. Comparative calculations and validation studies with atmospheric dispersion models

    International Nuclear Information System (INIS)

    Paesler-Sauer, J.

    1986-11-01

This report presents the results of an intercomparison of different mesoscale dispersion models and measured data from tracer experiments. The types of models taking part in the intercomparison are Gaussian-type, numerical Eulerian, and Lagrangian dispersion models. They are suited for the calculation of the atmospheric transport of radionuclides released from a nuclear installation. For the model intercomparison, artificial meteorological situations were defined and corresponding arithmetical problems were formulated. For the purpose of model validation, real dispersion situations from tracer experiments were used as input data for model calculations; in these cases calculated and measured time-integrated concentrations close to the ground are compared. Finally, an evaluation of the models concerning their efficiency in solving the problems is carried out with the aid of objective methods. (orig./HP) [de

  20. On the contribution of groundwater storage to interannual streamflow anomalies in the Colorado River basin

    Directory of Open Access Journals (Sweden)

    E. A. Rosenberg

    2013-04-01

We assess the significance of groundwater storage for seasonal streamflow forecasts by evaluating its contribution to interannual streamflow anomalies in the 29 tributary sub-basins of the Colorado River. Monthly and annual changes in total basin storage are simulated by two implementations of the Variable Infiltration Capacity (VIC) macroscale hydrology model – the standard release of the model, and an alternate version that has been modified to include the SIMple Groundwater Model (SIMGM), which represents an unconfined aquifer underlying the soil column. These estimates are compared to those resulting from basin-scale water balances derived exclusively from observational data and changes in terrestrial water storage from the Gravity Recovery and Climate Experiment (GRACE) satellites. Changes in simulated groundwater storage are then compared to those derived via baseflow recession analysis for 72 reference-quality watersheds. Finally, estimates are statistically analyzed for relationships to interannual streamflow anomalies, and predictive capacities are compared across storage terms. We find that both model simulations result in similar estimates of total basin storage change, that these estimates compare favorably with those obtained from basin-scale water balances and GRACE data, and that baseflow recession analyses are consistent with simulated changes in groundwater storage. Statistical analyses reveal essentially no relationship between groundwater storage and interannual streamflow anomalies, suggesting that operational seasonal streamflow forecasts, which do not account for groundwater conditions implicitly or explicitly, are likely not detrimentally affected by this omission in the Colorado River basin.
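The baseflow recession analysis mentioned above is commonly based on a linear-reservoir assumption, Q_t = Q_0·k^t, under which groundwater storage is proportional to discharge. A minimal sketch using a synthetic recession limb rather than Colorado River data:

```python
import math

def recession_constant(q):
    """Least-squares slope of ln(Q_t) versus t over a dry-weather
    recession limb; returns k in the model Q_t = Q_0 * k**t."""
    n = len(q)
    y = [math.log(v) for v in q]
    tbar = (n - 1) / 2.0
    ybar = sum(y) / n
    num = sum((t - tbar) * (yt - ybar) for t, yt in enumerate(y))
    den = sum((t - tbar) ** 2 for t in range(n))
    return math.exp(num / den)

# synthetic 30-day recession with a true k of 0.95
q = [100.0 * 0.95 ** t for t in range(30)]
k = recession_constant(q)
# linear-reservoir storage implied by the current discharge: S = Q / (-ln k)
storage = q[-1] / (-math.log(k))
```

In practice the recession limbs would be extracted from observed hydrographs, and the resulting storage estimates compared against simulated groundwater storage, as the study does for its 72 reference watersheds.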

  1. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available, then a mixed hardening model should be used.

  2. Validated TRNSYS Model for Solar Assisted Space Heating System

    International Nuclear Information System (INIS)

    Abdalla, Nedal

    2014-01-01

The present study involves a validated TRNSYS model for a solar assisted space heating system as applied to a residential building in Jordan, using the new detailed radiation models of TRNSYS 17.1 and the geometric building model Trnsys3d for the Google SketchUp 3D drawing program. The annual heating load for a building (Solar House) located at the Royal Scientific Society (RSS) in Jordan is estimated under the climatological conditions of Amman. The aim of this paper is to compare the measured thermal performance of the Solar House with that modeled using TRNSYS. The results showed that the annual measured space heating load for the building was 6,188 kWh while the heating load for the modeled building was 6,391 kWh. Moreover, the measured solar fraction for the solar system was 50% while the modeled solar fraction was 55%. A comparison of modeled and measured data resulted in percentage mean absolute errors for solar energy for space heating, auxiliary heating and solar fraction of 13%, 7% and 10%, respectively. The validated model will be useful for long-term performance simulation under different weather and operating conditions. (author)

  3. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available, then a mixed hardening model should be used.

  4. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    NARCIS (Netherlands)

    P.C. Austin (Peter); D. van Klaveren (David); Y. Vergouwe (Yvonne); D. Nieboer (Daan); D.S. Lee (Douglas); E.W. Steyerberg (Ewout)

    2016-01-01

    textabstractObjective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We

  5. Validation of an O-18 leaf water enrichment model

    Energy Technology Data Exchange (ETDEWEB)

    Jaeggi, M.; Saurer, M.; Siegwolf, R.

    2002-03-01

The seasonal trend in {delta}{sup 18}O{sub ol} in leaf organic matter of spruce needles of mature trees could be modelled for two years. The seasonality was mainly explained by the {delta}{sup 18}O of top-soil water, whereas between-year differences were due to variation in air humidity. Application of a third year's data set improved the correlation between modelled and measured {delta}{sup 18}O{sub ol} and thus validated our extended Dongmann model. (author)

  6. Validation study of safety assessment model for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Munakata, Masahiro; Takeda, Seiji; Kimura, Hideo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

The JAERI-AECL collaboration research program has been conducted to validate groundwater flow and radionuclide transport models for safety assessment. JAERI has developed a geostatistical model for radionuclide transport through heterogeneous geological media and verified it using the results of field tracer tests. The simulated tracer plumes explain the experimental tracer plumes favorably. A regional groundwater flow and transport model using site-scale parameters obtained from the tracer tests has been verified by comparing simulation results with observations of natural environmental tracers. (author)

  7. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    Science.gov (United States)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of the newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows applying a sequential solution approach to solve the governing equations separately and implicitly. Through its application to the numerical experiment using a wettability alteration model and comparisons with existing chemical compositional model's numerical results, the new model has proven to be practical, reliable and stable.

  8. Validation of Slosh Modeling Approach Using STAR-CCM+

    Science.gov (United States)

    Benson, David J.; Ng, Wanyi

    2018-01-01

Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft, or there may be an unexpected loss of science observation time due to higher slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh. STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right cylinder tank and a right cylinder with a single ring baffle.

  9. Advances in the spatially distributed ages-w model: parallel computation, java connection framework (JCF) integration, and streamflow/nitrogen dynamics assessment

    Science.gov (United States)

    AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic and water quality (H/WQ) simulation components under the Java Connection Framework (JCF) and the Object Modeling System (OMS) environmental modeling framework. AgES-W is implicitly scala...

  10. Interaction between stream temperature, streamflow, and groundwater exchanges in alpine streams

    Science.gov (United States)

    Constantz, James E.

    1998-01-01

    Four alpine streams were monitored to continuously collect stream temperature and streamflow for periods ranging from a week to a year. In a small stream in the Colorado Rockies, diurnal variations in both stream temperature and streamflow were significantly greater in losing reaches than in gaining reaches, with minimum streamflow losses occurring early in the day and maximum losses occurring early in the evening. Using measured stream temperature changes, diurnal streambed infiltration rates were predicted to increase as much as 35% during the day (based on a heat and water transport groundwater model), while the measured increase in streamflow loss was 40%. For two large streams in the Sierra Nevada Mountains, annual stream temperature variations ranged from 0° to 25°C. In summer months, diurnal stream temperature variations were 30–40% of annual stream temperature variations, owing to reduced streamflows and increased atmospheric heating. Previous reports document that one Sierra stream site generally gains groundwater during low flows, while the second Sierra stream site may lose water during low flows. For August the diurnal streamflow variation was 11% at the gaining stream site and 30% at the losing stream site. On the basis of measured diurnal stream temperature variations, streambed infiltration rates were predicted to vary diurnally as much as 20% at the losing stream site. Analysis of results suggests that evapotranspiration losses determined diurnal streamflow variations in the gaining reaches, while in the losing reaches, evapotranspiration losses were compounded by diurnal variations in streambed infiltration. Diurnal variations in stream temperature were reduced in the gaining reaches as a result of discharging groundwater of relatively constant temperature. For the Sierra sites, comparison of results with those from a small tributary demonstrated that stream temperature patterns were useful in delineating discharges of bank storage following

  11. A global evaluation of streamflow drought characteristics

    Directory of Open Access Journals (Sweden)

    A. K. Fleig

    2006-01-01

How drought is characterised depends on the purpose and region of the study and the available data. In the case of regional applications or global comparison, a standardisation of the methodology to characterise drought is preferable. In this study the threshold level method in combination with three common pooling procedures is applied to daily streamflow series from a wide range of hydrological regimes. Drought deficit characteristics, such as drought duration and deficit volume, are derived, and the methods are evaluated for their applicability for regional studies. Three different pooling procedures are evaluated: the moving-average procedure (MA-procedure), the inter-event time method (IT-method), and the sequent peak algorithm (SPA). The MA-procedure proved to be a flexible approach for the different series, and its parameter, the averaging interval, can easily be optimised for each stream. However, it modifies the discharge series and might introduce dependency between drought events. For the IT-method it is more difficult to find an optimal value for its parameter, the length of the excess period, in particular for flashy streams. The SPA can only be recommended as a pooling procedure for the selection of annual maximum series of deficit characteristics and for very low threshold levels, to ensure that events occurring shortly after major events are recognized. Furthermore, a frequency analysis of deficit volume and duration is conducted based on partial duration series of drought events. According to extreme value theory, excesses over a certain limit are Generalized Pareto (GP) distributed. It was found that this model indeed performed better than or equally to other distribution models. In general, the GP-model could be used for streams of all regime types. However, for intermittent streams, zero-flow periods should be treated as censored data. For catchments with frost during the winter season, summer and winter droughts have to be analysed separately.
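The threshold level method with moving-average (MA) pooling described above can be sketched in a few lines; the discharge series and threshold below are illustrative, not from the study's data set:

```python
def moving_average(q, window):
    """Centered moving average used for MA pooling (window odd);
    shorter windows are used at the series edges."""
    half = window // 2
    return [sum(q[max(0, i - half):i + half + 1]) /
            len(q[max(0, i - half):i + half + 1])
            for i in range(len(q))]

def drought_events(q, threshold):
    """Contiguous below-threshold periods as tuples of
    (start index, duration, deficit volume)."""
    events, start, deficit = [], None, 0.0
    for i, v in enumerate(q):
        if v < threshold:
            if start is None:
                start, deficit = i, 0.0
            deficit += threshold - v
        elif start is not None:
            events.append((start, i - start, deficit))
            start = None
    if start is not None:
        events.append((start, len(q) - start, deficit))
    return events

flow = [5, 5, 1, 1, 5, 1, 5, 5]
raw = drought_events(flow, threshold=3)                # two separate events
pooled = drought_events(moving_average(flow, 3), 3)    # MA pooling merges them
```

Smoothing before thresholding merges mutually dependent events separated by a brief excess period, which is exactly the dependency concern the abstract raises about the MA-procedure.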

  12. Cross validation for the classical model of structured expert judgment

    International Nuclear Information System (INIS)

    Colson, Abigail R.; Cooke, Roger M.

    2017-01-01

    We update the 2008 TU Delft structured expert judgment database with data from 33 professionally contracted Classical Model studies conducted between 2006 and March 2015 to evaluate its performance relative to other expert aggregation models. We briefly review alternative mathematical aggregation schemes, including harmonic weighting, before focusing on linear pooling of expert judgments with equal weights and performance-based weights. Performance weighting outperforms equal weighting in all but 1 of the 33 studies in-sample. True out-of-sample validation is rarely possible for Classical Model studies, and cross validation techniques that split calibration questions into a training and test set are used instead. Performance weighting incurs an “out-of-sample penalty” and its statistical accuracy out-of-sample is lower than that of equal weighting. However, as a function of training set size, the statistical accuracy of performance-based combinations reaches 75% of the equal weight value when the training set includes 80% of calibration variables. At this point the training set is sufficiently powerful to resolve differences in individual expert performance. The information of performance-based combinations is double that of equal weighting when the training set is at least 50% of the set of calibration variables. Previous out-of-sample validation work used a Total Out-of-Sample Validity Index based on all splits of the calibration questions into training and test subsets, which is expensive to compute and includes small training sets of dubious value. As an alternative, we propose an Out-of-Sample Validity Index based on averaging the product of statistical accuracy and information over all training sets sized at 80% of the calibration set. Performance weighting outperforms equal weighting on this Out-of-Sample Validity Index in 26 of the 33 post-2006 studies; the probability of 26 or more successes on 33 trials if there were no difference between performance
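The linear pooling discussed above, with either equal or performance-based weights, is a weighted average of the experts' probability assessments. A minimal sketch with two hypothetical experts assessing a binary event (the assessments and weights are invented for illustration):

```python
def linear_pool(expert_probs, weights=None):
    """Linear opinion pool: weighted average of the experts'
    probability assessments (equal weights if none are given)."""
    n = len(expert_probs)
    if weights is None:
        weights = [1.0 / n] * n
    total = sum(weights)
    weights = [w / total for w in weights]  # normalize to sum to 1
    return [sum(w * p[k] for w, p in zip(weights, expert_probs))
            for k in range(len(expert_probs[0]))]

# equal weighting versus a 3:1 performance-based weighting
equal = linear_pool([[0.2, 0.8], [0.6, 0.4]])
perf = linear_pool([[0.2, 0.8], [0.6, 0.4]], weights=[3.0, 1.0])
```

In the Classical Model the weights would come from each expert's calibration and information scores on seed questions; the cross-validation issue in the abstract is about how robust those scores are when the set of calibration questions is split.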

  13. Spatiotemporal impacts of LULC changes on hydrology from the perspective of runoff generation mechanism using SWAT model with evolving parameters

    Science.gov (United States)

    Li, Y.; Chang, J.; Luo, L.

    2017-12-01

    It is of great importance for water resources management to model the true hydrological processes under a changing environment, and to systematically investigate the interactions among LULC change, streamflow variation and changes in the runoff generation process. This is especially so where underlying surfaces have changed significantly, as in the Wei River Basin (WRB), where the subsurface hydrology is strongly influenced by human activities. We therefore proposed the idea of evolving parameters in a hydrological model (SWAT) to reflect the changes in the physical environment under different LULC conditions. With these evolving parameters, the spatiotemporal impacts of LULC changes on streamflow were quantified, and qualitative analysis was conducted to further explore how LULC changes affect streamflow from the perspective of the runoff generation mechanism. Results indicate the following: 1) evolving-parameter calibration is not only effective but necessary to ensure the validity of the model when dealing with significant human-induced changes in underlying surfaces; 2) compared to the baseline period, streamflow in wet seasons increased in the 1990s but decreased in the 2000s, while at yearly and dry-season scales streamflow decreased in both decades; 3) the expansion of cropland is the major contributor to the reduction of the surface-water component, causing the decline in streamflow at yearly and dry-season scales, while, compared to the 1990s, the expansion of woodland in the midstream and of grassland downstream are the main stressors that increased the soil-water component, leading to the further decline of streamflow in the 2000s.
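A hedged sketch of the kind of validity check such a calibration relies on: the Nash-Sutcliffe efficiency (NSE), a standard streamflow goodness-of-fit measure (1 is a perfect fit, values at or below 0 mean the model is no better than the observed mean). The flow series and the contrast between a fixed and an evolving parameter set below are invented for illustration, not the study's data:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE(sim) / SSE(mean of obs)."""
    mo = sum(obs) / len(obs)
    return 1 - sum((o - s) ** 2 for o, s in zip(obs, sim)) / sum((o - mo) ** 2 for o in obs)

# Hypothetical monthly streamflow (m^3/s) after a land-use shift:
obs        = [52, 48, 61, 40, 33, 25, 30, 44]
sim_fixed  = [60, 55, 70, 50, 45, 38, 42, 55]   # one parameter set for the whole period
sim_evolve = [54, 47, 63, 41, 35, 27, 29, 46]   # parameters updated per LULC condition
print(nse(obs, sim_fixed), nse(obs, sim_evolve))
```

A fixed parameter set calibrated before the land-use change systematically over-predicts here, which is the behavior evolving-parameter calibration is meant to correct.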

  14. ADMS-AIRPORT: MODEL INTER-COMPARISONS AND MODEL VALIDATION

    OpenAIRE

    Carruthers, David; McHugh, Christine; Church, Stephanie; Jackson, Mark; Williams, Matt; Price, Catheryn; Lad, Chetan

    2008-01-01

    Abstract: The functionality of ADMS-Airport and details of its use in the Model Inter-comparison Study of the Project for the Sustainable Development of Heathrow Airport (PSDH) have previously been presented, Carruthers et al (2007). A distinguishing feature is the treatment of jet engine emissions as moving jet sources rather than averaging these emissions into volume sources as is the case in some other models. In this presentation two further studies are presented which each contribu...

  15. Approaches to Validation of Models for Low Gravity Fluid Behavior

    Science.gov (United States)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models to predict low-gravity fluid behavior. It reviews the literature of low-gravity fluid behavior as a starting point for developing a baseline set of test cases. It examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues seem to be that: most of the data are described by empirical correlations rather than fundamental relations; detailed measurements of the flow field have not been made; free-surface shapes are observed, but through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, while the zero-gravity time available has been only seconds.

  16. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone

  17. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    Science.gov (United States)

    2015-03-01

    domains. Major model functions include: • Ground combat: light and heavy forces. • Air-mobile forces. • Future forces. • Fixed-wing and rotary-wing... Constraints: • Study must be completed no later than 31 December 2014. • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System... and SQL backend, as well as any open application programming interface (API). • Allows data transparency and data-driven navigation through the model

  18. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models

    Science.gov (United States)

    van der Wijk, Lars; Proost, Johannes H.; Sinha, Bhanu; Touw, Daan J.

    2017-01-01

    Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to avoid
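The MDPE and MDAPE criteria used above are simply medians of relative prediction errors. The sketch below assumes the common convention of (predicted - observed)/observed expressed as a percentage, and the serum levels are hypothetical, not the study's data:

```python
from statistics import median

def mdpe(obs, pred):
    """Median prediction error (%): the bias of the model."""
    return median((p - o) / o * 100 for o, p in zip(obs, pred))

def mdape(obs, pred):
    """Median absolute prediction error (%): the precision of the model."""
    return median(abs(p - o) / o * 100 for o, p in zip(obs, pred))

# Hypothetical measured vs. model-predicted gentamicin serum levels (mg/L):
measured  = [8.1, 6.4, 10.2, 7.5, 9.0]
predicted = [7.9, 6.6, 9.8, 7.4, 9.3]
print(mdpe(measured, predicted), mdape(measured, predicted))
```

Using medians rather than means makes both criteria robust to a single badly predicted level, which matters with the small validation cohorts typical of such studies.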

  19. Pharmacokinetic modeling of gentamicin in treatment of infective endocarditis: Model development and validation of existing models.

    Directory of Open Access Journals (Sweden)

    Anna Gomes

    Full Text Available Gentamicin shows large variations in half-life and volume of distribution (Vd) within and between individuals. Thus, monitoring and accurately predicting serum levels are required to optimize effectiveness and minimize toxicity. Currently, two population pharmacokinetic models are applied for predicting gentamicin doses in adults. For endocarditis patients the optimal model is unknown. We aimed at: 1) creating an optimal model for endocarditis patients; and 2) assessing whether the endocarditis and existing models can accurately predict serum levels. We performed a retrospective observational two-cohort study: one cohort to parameterize the endocarditis model by iterative two-stage Bayesian analysis, and a second cohort to validate and compare all three models. The Akaike Information Criterion and the weighted sum of squares of the residuals divided by the degrees of freedom were used to select the endocarditis model. Median Prediction Error (MDPE) and Median Absolute Prediction Error (MDAPE) were used to test all models with the validation dataset. We built the endocarditis model based on data from the modeling cohort (65 patients) with a fixed 0.277 L/h/70kg metabolic clearance, 0.698 (±0.358) renal clearance as fraction of creatinine clearance, and Vd 0.312 (±0.076) L/kg corrected lean body mass. External validation with data from 14 validation cohort patients showed a similar predictive power of the endocarditis model (MDPE -1.77%, MDAPE 4.68%) as compared to the intensive-care (MDPE -1.33%, MDAPE 4.37%) and standard (MDPE -0.90%, MDAPE 4.82%) models. All models acceptably predicted pharmacokinetic parameters for gentamicin in endocarditis patients. However, these patients appear to have an increased Vd, similar to intensive care patients. Vd mainly determines the height of peak serum levels, which in turn correlate with bactericidal activity. In order to maintain simplicity, we advise to use the existing intensive-care model in clinical practice to

  20. In-Drift Microbial Communities Model Validation Calculations

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-09-24

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System (EBS) Department process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  1. In-Drift Microbial Communities Model Validation Calculation

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Jolley

    2001-10-31

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System (EBS) Department process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  2. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    D.M. Jolley

    2001-12-18

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System (EBS) Department process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data.

  3. In-Drift Microbial Communities Model Validation Calculations

    International Nuclear Information System (INIS)

    Jolley, D.M.

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System (EBS) Department process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data

  4. IN-DRIFT MICROBIAL COMMUNITIES MODEL VALIDATION CALCULATIONS

    International Nuclear Information System (INIS)

    D.M. Jolley

    2001-01-01

    The objective and scope of this calculation is to create the appropriate parameter input for MING 1.0 (CSCI 30018 V1.0, CRWMS M&O 1998b) that will allow the testing of the results from the MING software code against both scientific measurements of microbial populations at the site and in the laboratory, and against natural analogs to the site. This set of calculations provides results that will be used in model validation for the ''In-Drift Microbial Communities'' model (CRWMS M&O 2000), which is part of the Engineered Barrier System (EBS) Department process modeling effort that will eventually feed future Total System Performance Assessment (TSPA) models. This calculation is being produced to replace MING model validation output that is affected by the supersession of DTN MO9909SPAMING1.003 by its replacement DTN MO0106SPAIDM01.034, so that the calculations currently found in the ''In-Drift Microbial Communities'' AMR (CRWMS M&O 2000) will be brought up to date. This set of calculations replaces the calculations contained in sections 6.7.2, 6.7.3 and Attachment I of CRWMS M&O (2000). As all of these calculations are created explicitly for model validation, the data qualification status of all inputs can be considered corroborative in accordance with AP-3.15Q. This work activity has been evaluated in accordance with the AP-2.21 procedure, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities'', and is subject to QA controls (BSC 2001). The calculation is developed in accordance with the AP-3.12 procedure, Calculations, and prepared in accordance with the ''Technical Work Plan For EBS Department Modeling FY 01 Work Activities'' (BSC 2001), which includes controls for the management of electronic data

  5. Monte Carlo Modelling of Mammograms : Development and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Spyrou, G; Panayiotakis, G [University of Patras, School of Medicine, Medical Physics Department, 265 00 Patras (Greece); Bakas, A [Technological Educational Institution of Athens, Department of Radiography, 122 10 Athens (Greece); Tzanakos, G [University of Athens, Department of Physics, Division of Nuclear and Particle Physics, 157 71 Athens (Greece)

    1999-12-31

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors) 16 refs, 4 figs
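A minimal sketch of the Monte Carlo idea behind such a simulation: sample each photon's free path and compare it with the optical depth through a slab phantom, with or without a denser inhomogeneity along the ray. The attenuation coefficients and geometry below are illustrative assumptions, not values from this package:

```python
import math
import random

random.seed(1)

# Hypothetical linear attenuation coefficients at mammographic energies (cm^-1):
MU_TISSUE = 0.80
MU_LESION = 1.10

def transmitted(n_photons, thickness_cm, lesion_cm=0.0):
    """Count photons crossing a compressed-breast slab, with an optional
    lesion of higher attenuation replacing part of the tissue path."""
    hits = 0
    tau = MU_TISSUE * (thickness_cm - lesion_cm) + MU_LESION * lesion_cm
    for _ in range(n_photons):
        # Sample the photon's free path from an exponential distribution;
        # the photon gets through if the path exceeds the optical depth.
        if -math.log(1.0 - random.random()) > tau:
            hits += 1
    return hits

n = 50000
a = transmitted(n, 4.0)                  # homogeneous tissue path
b = transmitted(n, 4.0, lesion_cm=0.5)   # ray passing through an inhomogeneity
print(a, b)
```

The contrast between the two counts is what renders an inhomogeneity visible in the simulated mammogram; the stochastic counts converge to the Beer-Lambert values exp(-tau) as the number of photons grows.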

  6. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...

  7. Trailing Edge Noise Model Validation and Application to Airfoil Optimization

    DEFF Research Database (Denmark)

    Bertagnolio, Franck; Aagaard Madsen, Helge; Bak, Christian

    2010-01-01

    The aim of this article is twofold. First, an existing trailing edge noise model is validated by comparing with airfoil surface pressure fluctuations and far field sound pressure levels measured in three different experiments. The agreement is satisfactory in one case but poor in two other cases...... across the boundary layer near the trailing edge and to a lesser extent by a smaller boundary layer displacement thickness. ©2010 American Society of Mechanical Engineers...

  8. Experimental Validation of a Permeability Model for Enrichment Membranes

    International Nuclear Information System (INIS)

    Orellano, Pablo; Brasnarof, Daniel; Florido Pablo

    2003-01-01

    An experimental loop with a real-scale diffuser, in a single enrichment-stage configuration, was operated with air at different process conditions in order to characterize the membrane permeability. Using these experimental data, an analytical model based on geometry and morphology was validated. It is concluded that a new set of independent measurements, i.e. of enrichment, is necessary in order to fully characterize diffusers, because their internal parameters are not uniquely determined by permeability data alone

  9. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Eve...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  10. Monte Carlo Modelling of Mammograms : Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)

  11. Simulation of groundwater conditions and streamflow depletion to evaluate water availability in a Freeport, Maine, watershed

    Science.gov (United States)

    Nielsen, Martha G.; Locke, Daniel B.

    2012-01-01

    In order to evaluate water availability in the State of Maine, the U.S. Geological Survey (USGS) and the Maine Geological Survey began a cooperative investigation to provide the first rigorous evaluation of watersheds deemed "at risk" because of the combination of instream flow requirements and proportionally large water withdrawals. The study area for this investigation includes the Harvey and Merrill Brook watersheds and the Freeport aquifer in the towns of Freeport, Pownal, and Yarmouth, Maine. A numerical groundwater- flow model was used to evaluate groundwater withdrawals, groundwater-surface-water interactions, and the effect of water-management practices on streamflow. The water budget illustrates the effect that groundwater withdrawals have on streamflow and the movement of water within the system. Streamflow measurements were made following standard USGS techniques, from May through September 2009 at one site in the Merrill Brook watershed and four sites in the Harvey Brook watershed. A record-extension technique was applied to estimate long-term monthly streamflows at each of the five sites. The conceptual model of the groundwater system consists of a deep, confined aquifer (the Freeport aquifer) in a buried valley that trends through the middle of the study area, covered by a discontinuous confining unit, and topped by a thin upper saturated zone that is a mixture of sandy units, till, and weathered clay. Harvey and Merrill Brooks flow southward through the study area, and receive groundwater discharge from the upper saturated zone and from the deep aquifer through previously unknown discontinuities in the confining unit. The Freeport aquifer gets most of its recharge from local seepage around the edges of the confining unit, the remainder is received as inflow from the north within the buried valley. Groundwater withdrawals from the Freeport aquifer in the study area were obtained from the local water utility and estimated for other categories. 
Overall

  12. Groundwater Pumping and Streamflow in the Yuba Basin, Sacramento Valley, California

    Science.gov (United States)

    Moss, D. R.; Fogg, G. E.; Wallender, W. W.

    2011-12-01

    Water transfers during drought in California's Sacramento Valley can lead to increased groundwater pumping, with as yet unknown effects on stream baseflow. Two existing groundwater models of the greater Sacramento Valley, together with localized monitoring of groundwater level fluctuations adjacent to the Bear, Feather, and Yuba Rivers, indicate cause-and-effect relations between the pumping and streamflow. The models are the Central Valley Hydrologic Model (CVHM) developed by the U.S. Geological Survey and C2VSIM developed by the Department of Water Resources. Using two models which have similar complexity and data but differing approaches to the agricultural water boundary condition illuminates both the water budget and its uncertainty. Water budget and flux data for localized areas can be obtained from the models, allowing parameters such as precipitation, irrigation recharge, and streamflow to be compared to pumping on different temporal scales. Continuous groundwater level measurements at nested, near-stream piezometers show seasonal variations in streamflow and groundwater levels as well as the timing and magnitude of recharge and pumping. Preliminary results indicate that during years with relatively wet conditions 65 - 70% of the surface recharge for the groundwater system comes from irrigation and precipitation and 30 - 35% comes from streamflow losses. The models further indicate that during years with relatively dry conditions, 55 - 60% of the surface recharge for the groundwater system comes from irrigation and precipitation while 40 - 45% comes from streamflow losses. The models integrate irrigation water demand, surface-water and groundwater supply, and deep percolation to produce values for irrigation pumping. Groundwater extractions during the growing season, approximately between April and October, increase by almost 200%. The seasonal effects of increased pumping are not readily evident in stream stage measurements. However, during dry time

  13. Cross-Validation of Aerobic Capacity Prediction Models in Adolescents.

    Science.gov (United States)

    Burns, Ryan Donald; Hannon, James C; Brusseau, Timothy A; Eisenman, Patricia A; Saint-Maurice, Pedro F; Welk, Greg J; Mahar, Matthew T

    2015-08-01

    Cardiorespiratory endurance is a component of health-related fitness. FITNESSGRAM recommends the Progressive Aerobic Cardiovascular Endurance Run (PACER) or One-Mile Run/Walk (1MRW) to assess cardiorespiratory endurance by estimating VO2 Peak. No research has cross-validated prediction models from both PACER and 1MRW, including the New PACER Model and PACER-Mile Equivalent (PACER-MEQ), using current standards. The purpose of this study was to cross-validate prediction models from PACER and 1MRW against measured VO2 Peak in adolescents. Cardiorespiratory endurance data were collected on 90 adolescents aged 13-16 years (Mean = 14.7 ± 1.3 years; 32 girls, 52 boys) who completed the PACER and 1MRW in addition to a laboratory maximal treadmill test to measure VO2 Peak. Multiple correlations among various models with measured VO2 Peak were considered moderately strong (R = 0.74-0.78), and prediction error (RMSE) ranged from 5.95 to 8.27 ml·kg⁻¹·min⁻¹. Criterion-referenced agreement into FITNESSGRAM's Healthy Fitness Zones was considered fair-to-good among models (Kappa = 0.31-0.62; Agreement = 75.5-89.9%; F = 0.08-0.65). In conclusion, prediction models demonstrated moderately strong linear relationships with measured VO2 Peak, fair prediction error, and fair-to-good criterion-referenced agreement with measured VO2 Peak into FITNESSGRAM's Healthy Fitness Zones.
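The validity statistics reported above come from textbook formulas; RMSE and Pearson's R are computed below on hypothetical measured vs. predicted VO2 Peak values (not the study's data), as a sketch of the cross-validation step:

```python
import math

def rmse(y, yhat):
    """Root mean square error between measured and predicted values."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def pearson_r(y, yhat):
    """Pearson correlation between measured and predicted values."""
    my, mh = sum(y) / len(y), sum(yhat) / len(yhat)
    num = sum((a - my) * (b - mh) for a, b in zip(y, yhat))
    den = math.sqrt(sum((a - my) ** 2 for a in y) * sum((b - mh) ** 2 for b in yhat))
    return num / den

# Hypothetical measured vs. PACER-predicted VO2 Peak (ml/kg/min) for six adolescents:
measured  = [42.1, 38.5, 51.0, 45.3, 36.8, 48.2]
predicted = [40.0, 41.2, 47.5, 44.0, 39.9, 45.1]
print(round(rmse(measured, predicted), 2), round(pearson_r(measured, predicted), 2))
```

Correlation alone can mask systematic bias, which is why studies like this one report prediction error (RMSE) and criterion-referenced zone agreement alongside R.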

  14. Recent validation studies for two NRPB environmental transfer models

    International Nuclear Information System (INIS)

    Brown, J.; Simmonds, J.R.

    1991-01-01

    The National Radiological Protection Board (NRPB) developed a dynamic model for the transfer of radionuclides through terrestrial food chains some years ago. This model, now called FARMLAND, predicts both instantaneous and time integrals of concentration of radionuclides in a variety of foods. The model can be used to assess the consequences of both accidental and routine releases of radioactivity to the environment; and results can be obtained as a function of time. A number of validation studies have been carried out on FARMLAND. In these the model predictions have been compared with a variety of sets of environmental measurement data. Some of these studies will be outlined in the paper. A model to predict external radiation exposure from radioactivity deposited on different surfaces in the environment has also been developed at NRPB. This model, called EXPURT (EXPosure from Urban Radionuclide Transfer), can be used to predict radiation doses as a function of time following deposition in a variety of environments, ranging from rural to inner-city areas. This paper outlines validation studies and future extensions to be carried out on EXPURT. (12 refs., 4 figs.)

  15. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data have a far wider range of aspects which influence their quality, and the idea of quality itself is application-dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. watertightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closedness of the bounding linear ring and planarity.
On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
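    The planarity check described above can be sketched in a few lines: fit a least-squares plane to the polygon's vertices and test every vertex against a distance tolerance. This is an illustrative sketch, not CityDoctor code; the function name, tolerance value, and test polygons are hypothetical.

```python
import numpy as np

def is_planar(vertices, tol=0.01):
    """Fit a total-least-squares plane through the vertices (via SVD) and
    report whether every vertex lies within `tol` of that plane."""
    pts = np.asarray(vertices, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The plane normal is the right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    return bool(np.abs(centered @ normal).max() <= tol)

flat = [(0, 0, 0), (4, 0, 0), (4, 2, 0), (0, 2, 0)]        # a planar facade
warped = [(0, 0, 0), (4, 0, 0), (4, 2, 0.1), (0, 2, 0)]    # one corner lifted 10 cm
```

    With a 1 cm tolerance the flat rectangle passes and the warped one fails; in practice the tolerance must reflect the accuracy of the source data, as the abstract emphasizes.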

  16. Validation of the WATEQ4 geochemical model for uranium

    International Nuclear Information System (INIS)

    Krupka, K.M.; Jenne, E.A.; Deutsch, W.J.

    1983-09-01

    As part of the Geochemical Modeling and Nuclide/Rock/Groundwater Interactions Studies Program, a study was conducted to partially validate the WATEQ4 aqueous speciation-solubility geochemical model for uranium. The solubility controls determined with the WATEQ4 geochemical model were in excellent agreement with those laboratory studies in which the solids schoepite [UO₂(OH)₂·H₂O], UO₂(OH)₂, and rutherfordine (UO₂CO₃) were identified as actual solubility controls for uranium. The results of modeling solution analyses from laboratory studies of uranyl phosphate solids, however, identified possible errors in the characterization of solids in the original solubility experiments. As part of this study, significant deficiencies in the WATEQ4 thermodynamic data base for uranium solutes and solids were corrected. Revisions included recalculation of selected uranium reactions. Additionally, thermodynamic data for the hydroxyl complexes of U(VI), including anionic U(VI) species, were evaluated (to the extent permitted by the available data). Vanadium reactions were also added to the thermodynamic data base because uranium-vanadium solids can exist in natural ground-water systems. This study is only a partial validation of the WATEQ4 geochemical model because the available laboratory solubility studies do not cover the range of solid phases, alkaline pH values, and concentrations of inorganic complexing ligands needed to evaluate the potential solubility of uranium in ground waters associated with various proposed nuclear waste repositories. Further validation of this or other geochemical models for uranium will require careful determinations of uraninite solubility over the pH range of 7 to 10 under highly reducing conditions and of uranyl hydroxide and phosphate solubilities over the pH range of 7 to 10 under oxygenated conditions
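    The solubility-control comparison above rests on the saturation index that speciation-solubility codes such as WATEQ4 compute: SI = log10(IAP/Ksp), where IAP is the ion activity product of the dissolution reaction. A minimal sketch (the activity values below are hypothetical, not taken from the study):

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP / Ksp): ~0 at equilibrium with the solid,
    > 0 supersaturated (solid should precipitate), < 0 undersaturated."""
    return math.log10(iap / ksp)

# Hypothetical ion activity products against a solid with Ksp = 1e-8
si_equilibrium = saturation_index(1.0e-8, 1.0e-8)   # water controlled by the solid
si_oversat = saturation_index(1.0e-6, 1.0e-8)       # solid should precipitate
```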

  17. Investigation of the complexity of streamflow fluctuations in a large heterogeneous lake catchment in China

    Science.gov (United States)

    Ye, Xuchun; Xu, Chong-Yu; Li, Xianghu; Zhang, Qi

    2018-05-01

    The occurrence of floods and droughts is highly correlated with the temporal fluctuations of streamflow series; understanding these fluctuations is essential for improved modeling and statistical prediction of extreme changes in river basins. In this study, the complexity of daily streamflow fluctuations was investigated using multifractal detrended fluctuation analysis (MF-DFA) in a large heterogeneous lake basin, the Poyang Lake basin in China, and the potential impacts of human activities were also explored. Major results indicate that the multifractality of streamflow fluctuations shows significant regional characteristics. In the study catchment, all the daily streamflow series present a strong long-range correlation, with Hurst exponents greater than 0.8. The q-order Hurst exponent h(q) of all the hydrostations can be characterized well by only two parameters: a (0.354 ≤ a ≤ 0.384) and b (0.627 ≤ b ≤ 0.677), with no pronounced differences. Singularity spectrum analysis showed that small fluctuations play a dominant role in all daily streamflow series. Our research also revealed that both the correlation properties and the broad probability density function (PDF) of hydrological series can be responsible for the multifractality of streamflow series, which depends on watershed area. In addition, we emphasized the relationship between watershed area and the estimated multifractal parameters, such as the Hurst exponent and the fitted parameters a and b from the q-order Hurst exponent h(q). However, the relationship between the width of the singularity spectrum (Δα) and watershed area is not clear. Further investigation revealed that increasing forest coverage and reservoir storage can effectively enhance the persistence of daily streamflow, decrease the hydrological complexity of large fluctuations, and increase the small fluctuations.
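    The h(2) case of the q-order Hurst exponent used above can be estimated with ordinary first-order detrended fluctuation analysis: build the cumulative profile, detrend it in windows of increasing size, and read the exponent off the log-log slope of the fluctuation function. A simplified single-q sketch on synthetic data (the MF-DFA in the study generalizes this over many q orders):

```python
import numpy as np

def dfa_exponent(x, scales):
    """First-order DFA: returns the scaling exponent h, i.e. the slope of
    log F(s) versus log s (this is h(2) in the MF-DFA framework)."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())
    logF, logS = [], []
    for s in scales:
        n = len(profile) // s
        segs = profile[:n * s].reshape(n, s)
        t = np.arange(s)
        # Remove a linear trend from every segment, then pool residual variance
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        F = np.sqrt(np.mean(np.square(resid)))
        logF.append(np.log(F)); logS.append(np.log(s))
    return np.polyfit(logS, logF, 1)[0]

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)       # uncorrelated noise: h ~ 0.5
persistent = np.cumsum(white)           # strongly persistent signal: h ~ 1.5
scales = [16, 32, 64, 128, 256]
```

    Daily streamflow with h > 0.8, as reported above, sits between these two reference cases, i.e. well into the persistent regime.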

  18. Validation of fracture flow models in the Stripa project

    International Nuclear Information System (INIS)

    Herbert, A.; Dershowitz, W.; Long, J.; Hodgkinson, D.

    1991-01-01

    One of the objectives of Phase III of the Stripa Project is to develop and evaluate approaches for the prediction of groundwater flow and nuclide transport in a specific unexplored volume of the Stripa granite and make a comparison with data from field measurements. During the first stage of the project, a prediction of inflow to the D-holes, an array of six parallel closely spaced 100m boreholes, was made based on data from six other boreholes. These data included fracture geometry, stress, single-borehole geophysical logging, crosshole and reflection radar and seismic tomograms, head monitoring and single-hole packer test measurements. Maps of fracture traces on the drift walls have also been made. The D-holes are located along a future Validation Drift which will be excavated. The water inflow to the D-holes has been measured in an experiment called the Simulated Drift Experiment. The paper reviews the Simulated Drift Experiment validation exercise. Following a discussion of the approach to validation, the characterization data and its preliminary interpretation are summarised and commented upon. This work has proved that it is feasible to carry through all the complex and interconnected tasks associated with the gathering and interpretation of characterization data, the development and application of complex models, and the comparison with measured inflows. This exercise has provided detailed feedback to the experimental and theoretical work required for measurements and predictions of flow into the Validation Drift. Computer codes used: CHANGE, FRACMAN, MAFIC, NAPSAC and TRINET. 2 figs., 2 tabs., 19 refs

  19. CrowdWater - Can people observe what models need?

    Science.gov (United States)

    van Meerveld, I. H. J.; Seibert, J.; Vis, M.; Etter, S.; Strobl, B.

    2017-12-01

    CrowdWater (www.crowdwater.ch) is a citizen science project that explores the usefulness of crowd-sourced data for hydrological model calibration and prediction. Hydrological models are usually calibrated based on observed streamflow data, but it is likely easier for people to estimate relative stream water levels, such as the water level above or below a rock, than streamflow. Relative stream water levels may, therefore, be a more suitable variable for citizen science projects than streamflow. In order to test this assumption, we held surveys near seven different-sized rivers in Switzerland and asked more than 450 volunteers to estimate the water level class based on a picture with a virtual staff gauge. The results show that people can generally estimate the relative water level well, although there were also a few outliers. We also asked the volunteers to estimate streamflow based on the stick method. The median estimated streamflow was close to the observed streamflow, but the spread in the streamflow estimates was large and there were very large outliers, suggesting that crowd-based streamflow data are highly uncertain. In order to determine the potential value of water level class data for model calibration, we converted streamflow time series for 100 catchments in the US to stream level class time series and used these to calibrate the HBV model. The model was then validated using the streamflow data. The results of this modeling exercise show that stream level class data are useful for constraining a simple runoff model. Time series of only two stream level classes, e.g. above or below a rock in the stream, were already informative, especially when the class boundary was chosen towards the highest stream levels. There was hardly any improvement in model performance when more than five water level classes were used.
This suggests that if crowd-sourced stream level observations are available for otherwise ungauged catchments, these data can be used to constrain
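    The conversion used in the modeling exercise above can be sketched directly: map a flow (or stage) series onto ordinal classes at fixed boundaries, then score a simulation by how often it lands in the observed class. The series, boundary choice, and scoring rule below are illustrative stand-ins, not the HBV calibration objective used in the study.

```python
import numpy as np

def to_level_classes(flow, boundaries):
    """Map a streamflow (or stage) series onto ordinal level classes;
    boundaries=[b] gives two classes: below / above a 'rock'."""
    return np.searchsorted(np.sort(boundaries), flow)

def class_score(obs_classes, sim_classes):
    """Share of time steps where the simulation falls in the observed class --
    a simple stand-in for the limited information such data carries."""
    return float(np.mean(obs_classes == sim_classes))

obs = np.array([1.0, 2.0, 5.0, 9.0, 3.0, 0.5])
sim = np.array([1.2, 1.8, 6.0, 8.0, 2.0, 0.7])
cut = [np.quantile(obs, 0.8)]   # boundary chosen towards the highest levels
oc = to_level_classes(obs, cut)
sc = to_level_classes(sim, cut)
```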

  20. Multi-site calibration, validation, and sensitivity analysis of the MIKE SHE Model for a large watershed in northern China

    Directory of Open Access Journals (Sweden)

    S. Wang

    2012-12-01

    Full Text Available Model calibration is essential for hydrologic modeling of large watersheds in a heterogeneous mountain environment. Little guidance is available for model calibration protocols for distributed models that aim at capturing the spatial variability of hydrologic processes. This study used the physically-based distributed hydrologic model, MIKE SHE, to contrast a lumped calibration protocol that used streamflow measured at one single watershed outlet with a multi-site calibration method which employed streamflow measurements at three stations within the large Chaohe River basin in northern China. Simulation results showed that the single-site calibrated model was able to sufficiently simulate the hydrographs for two of the three stations (Nash-Sutcliffe coefficient of 0.65–0.75, and correlation coefficient 0.81–0.87) during the testing period, but the model performed poorly for the third station (Nash-Sutcliffe coefficient only 0.44). Sensitivity analysis suggested that streamflow of the upstream area of the watershed was dominated by slow groundwater, whilst streamflow of the middle- and downstream areas was dominated by relatively quick interflow. Therefore, a multi-site calibration protocol was deemed necessary. Due to the potential errors and uncertainties with respect to the representation of spatial variability, performance measures from the multi-site calibration protocol slightly decreased for two of the three stations, whereas they improved greatly for the third station. We concluded that the multi-site calibration protocol reached a compromise in terms of model performance for the three stations, reasonably representing the hydrographs of all three stations with Nash-Sutcliffe coefficients ranging from 0.59–0.72. The multi-site calibration protocol generally has advantages over the single-site calibration protocol.
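    The Nash-Sutcliffe coefficient reported for each station compares model error against the variance of the observations; it is standard enough to state in full (the short series below are made up for illustration):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 = perfect fit; 0 = no better than
    always predicting the observed mean; negative = worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([3.0, 5.0, 9.0, 6.0, 4.0])
mean_only = np.full_like(obs, obs.mean())   # the NSE = 0 reference model
```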

  1. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    Full Text Available It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and/or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and/or observed hot-spots, and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions of strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis, as well as from analytical and numerical models, by treating the strain distributions as images. The result of the decomposition is 10¹ to 10² image descriptors instead of the 10⁵ or 10⁶ pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
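    The compression step the abstract describes (10⁵–10⁶ pixels down to 10¹–10² descriptors) can be illustrated with a Fourier decomposition; Zernike moments would follow the same pattern. The strain fields below are synthetic, and keeping the 8×8 block of low-frequency magnitudes is an arbitrary stand-in for the descriptor set used in the paper.

```python
import numpy as np

def descriptors(field, k=8):
    """Compress a full-field map into a short descriptor vector by keeping
    the k x k lowest-frequency Fourier magnitudes (a stand-in for the
    Zernike-moment decomposition discussed in the abstract)."""
    spec = np.abs(np.fft.fft2(field))
    return spec[:k, :k].ravel()

# Two synthetic strain maps: the "model" is the "experiment" plus a small artefact
y, x = np.mgrid[0:64, 0:64]
experiment = np.sin(x / 10.0) * np.cos(y / 14.0)
model = experiment + 0.01 * np.cos(x / 5.0)

# 64 descriptors summarise 4096 pixels; compare them statistically
corr = np.corrcoef(descriptors(experiment), descriptors(model))[0, 1]
```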

  2. Validating and Verifying Biomathematical Models of Human Fatigue

    Science.gov (United States)

    Martinez, Siera Brooke; Quintero, Luis Ortiz; Flynn-Evans, Erin

    2015-01-01

    Airline pilots experience acute and chronic sleep deprivation, sleep inertia, and circadian desynchrony due to the need to schedule flight operations around the clock. This sleep loss and circadian desynchrony give rise to cognitive impairments, reduced vigilance and inconsistent performance. Several biomathematical models, based principally on patterns observed in circadian rhythms and homeostatic drive, have been developed to predict a pilot's levels of fatigue or alertness. These models allow the Federal Aviation Administration (FAA) and commercial airlines to make decisions about pilot capabilities and flight schedules. Although these models have been validated in a laboratory setting, they have not been thoroughly tested in operational environments where uncontrolled factors, such as environmental sleep disrupters, caffeine use and napping, may impact actual pilot alertness and performance. We will compare the predictions of three prominent biomathematical fatigue models (the McCauley Model, the Harvard Model, and the privately-sold SAFTE-FAST Model) to actual measures of alertness and performance. We collected sleep logs, movement and light recordings, psychomotor vigilance task (PVT) data, and urinary melatonin (a marker of circadian phase) from 44 pilots in a short-haul commercial airline over one month. We will statistically compare the model predictions to lapses on the PVT and circadian phase. We will calculate the sensitivity and specificity of each model's predictions under different scheduling conditions. Our findings will aid operational decision-makers in determining the reliability of each model under real-world scheduling situations.

  3. Validation of High Displacement Piezoelectric Actuator Finite Element Models

    Science.gov (United States)

    Taleghani, B. K.

    2000-01-01

    The paper presents the results obtained by using the NASTRAN® and ANSYS® finite element codes to predict doming of THUNDER piezoelectric actuators during the manufacturing process and subsequent straining due to an applied input voltage. To effectively use such devices in engineering applications, modeling and characterization are essential. Length, width, dome height, and thickness are important parameters for users of such devices. Therefore, finite element models were used to assess the effects of these parameters. NASTRAN® and ANSYS® used different methods for modeling piezoelectric effects. In NASTRAN®, a thermal analogy was used to represent voltage at nodes as equivalent temperatures, while ANSYS® processed the voltage directly using piezoelectric finite elements. The results of the finite element models were validated against experimental results.

  4. Calibration and validation of a general infiltration model

    Science.gov (United States)

    Mishra, Surendra Kumar; Ranjan Kumar, Shashi; Singh, Vijay P.

    1999-08-01

    A general infiltration model proposed by Singh and Yu (1990) was calibrated and validated using a split sampling approach for 191 sets of infiltration data observed in the states of Minnesota and Georgia in the USA. Of the five model parameters, fc (the final infiltration rate), So (the available storage space) and exponent n were found to be more predictable than the other two parameters: m (exponent) and a (proportionality factor). A critical examination of the general model revealed that it is related to the Soil Conservation Service (1956) curve number (SCS-CN) method and its parameter So is equivalent to the potential maximum retention of the SCS-CN method and is, in turn, found to be a function of soil sorptivity and hydraulic conductivity. The general model was found to describe infiltration rate with time varying curve number.
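    The SCS-CN method that the general model is shown to relate to is compact enough to state directly: the potential maximum retention S follows from the curve number, an initial abstraction Ia = 0.2S is subtracted, and the remaining rainfall partitions into runoff. A sketch in SI units (the storm depths and CN value below are illustrative):

```python
def scs_cn_runoff(p_mm, cn):
    """SCS curve number method: potential retention S (mm) from CN,
    initial abstraction Ia = 0.2 S, direct runoff depth Q in mm."""
    s = 25400.0 / cn - 254.0       # S in mm for CN on the 0-100 scale
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0                 # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# CN = 100 is a fully impervious surface: all rainfall becomes runoff
q_paved = scs_cn_runoff(50.0, 100)
# CN = 70: part of the 50 mm storm infiltrates or is abstracted
q_pervious = scs_cn_runoff(50.0, 70)
```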

  5. Condensation of steam in horizontal pipes: model development and validation

    International Nuclear Information System (INIS)

    Szijarto, R.

    2015-01-01

    This thesis, submitted to the Swiss Federal Institute of Technology (ETH) in Zurich, presents the development and validation of a model for the condensation of steam in horizontal pipes. Condensation models were introduced and developed particularly for application in the emergency cooling system of a Gen-III+ boiling water reactor. Such an emergency cooling system consists of slightly inclined horizontal pipes immersed in a cold water tank. The pipes are connected to the reactor pressure vessel and are responsible for a fast depressurization of the reactor core in the case of an accident. Condensation in horizontal pipes was investigated with both a one-dimensional system code (RELAP5) and three-dimensional computational fluid dynamics software (ANSYS FLUENT). The performance of the RELAP5 code was not sufficient for transient condensation processes; therefore, a mechanistic model was developed and implemented. Four models were tested on the LAOKOON facility, which analysed direct contact condensation in a horizontal duct

  6. Streamflow effects on spawning, rearing, and outmigration of fall-run chinook salmon (Oncorhynchus tshawytscha) predicted by a spatial and individual-based model

    International Nuclear Information System (INIS)

    Jager, H.I.; Sale, M.J.; Cardwell, H.E.; Deangelis, D.L.; Bevelhimer, M.J.; Coutant, C.C.

    1994-01-01

    The threat posed to Pacific salmon by competing water demands is a great concern to regulators of the hydropower industry. Finding the balance between fish resource and economic objectives depends on our ability to quantify flow effects on salmon production. Because field experiments are impractical, simulation models are needed to predict the effects of minimum flows on chinook salmon during their freshwater residence. We have developed a model to simulate the survival and development of eggs and alevins in redds and the growth, survival, and movement of juvenile chinook in response to local stream conditions (flow, temperature, chinook and predator density). Model results suggest that smolt production during dry years can be increased by raising spring minimum flows

  7. Development of a modular streamflow model to quantify runoff contributions from different land uses in tropical urban environments using Genetic Programming

    Science.gov (United States)

    Meshgi, Ali; Schmitter, Petra; Chui, Ting Fong May; Babovic, Vladan

    2015-06-01

    The decrease of pervious areas during urbanization has severely altered the hydrological cycle, diminishing infiltration and therefore sub-surface flows during rainfall events, and further increasing peak discharges in urban drainage infrastructure. Designing appropriate water-sensitive infrastructure that reduces peak discharges requires a better understanding of land-use-specific contributions towards surface and sub-surface processes. However, to date, such understanding in tropical urban environments is still limited. Moreover, the rainfall-runoff process in tropical urban systems exhibits a high degree of non-linearity and heterogeneity. Therefore, this study used Genetic Programming to establish a physically interpretable modular model consisting of two sub-models: (i) a baseflow module and (ii) a quick flow module, to simulate the two hydrograph flow components. The relationship between the input variables in the model (i.e. meteorological data and catchment initial conditions) and its overall structure can be explained in terms of catchment hydrological processes. Therefore, the model is a partial greying of what is often a black-box approach in catchment modelling. The model was further generalized to the sub-catchments of the main catchment, extending the potential for more widespread applications. Subsequently, this study used the modular model to predict both flow components for events as well as time series, and applied optimization techniques to estimate the contributions of various land uses (i.e. impervious, steep grassland, grassland on mild slope, mixed grasses and trees, and relatively natural vegetation) towards baseflow and quickflow in tropical urban systems. The sub-catchment containing the highest portion of impervious surfaces (40% of the area) contributed the least towards the baseflow (6.3%) while the sub-catchment covered with 87% of relatively natural vegetation contributed the most (34.9%). The results from the quickflow

  8. Development and Validation of a 3-Dimensional CFB Furnace Model

    Science.gov (United States)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory- and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of 3-dimensional process modeling, which provides a chain of knowledge that is utilized as feedback for phenomenon research. Knowledge gathered by model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to modeling of combustion and formation of char and volatiles of various fuel types in CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window for characterization of fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace, and are used together with lateral temperature profiles at the bed and upper parts of the furnace for determination of solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed due to the lower furnace design and the mixing characteristics of fuel and combustion air, which affect the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  9. Insights on surface-water/groundwater exchange in the upper Floridan aquifer, north-central Florida (USA), from streamflow data and numerical modeling

    Science.gov (United States)

    Sutton, James E.; Screaton, Elizabeth J.; Martin, Jonathan B.

    2015-03-01

    Surface-water/groundwater exchange impacts water quality and budgets. In karst aquifers, these exchanges also play an important role in dissolution. Five years of river discharge data were analyzed and a transient groundwater flow model was developed to evaluate large-scale temporal and spatial variations of exchange between an 80-km stretch of the Suwannee River in north-central Florida (USA) and the karstic upper Floridan aquifer. The one-layer transient groundwater flow model was calibrated using groundwater levels from 59 monitoring wells, and fluxes were compared to the exchange calculated from discharge data. Both the numerical modeling and the discharge analysis suggest that the Suwannee River loses water under both low- and high-stage conditions. River losses appear greatest at the inside of a large meander, and the former river water may continue across the meander within the aquifer rather than return to the river. In addition, the numerical model calibration reveals that aquifer transmissivity is elevated within this large meander, which is consistent with enhanced dissolution due to river losses. The results show the importance of temporal and spatial variations in head gradients to exchange between streams and karst aquifers and dissolution of the aquifers.

  10. Effects of streamflow diversion on a fish population: combining empirical data and individual-based models in a site-specific evaluation

    Science.gov (United States)

    Bret C. Harvey; Jason L. White; Rodney J. Nakamoto; Steven F. Railsback

    2014-01-01

    Resource managers commonly face the need to evaluate the ecological consequences of specific water diversions of small streams. We addressed this need by conducting 4 years of biophysical monitoring of stream reaches above and below a diversion and applying two individual-based models of salmonid fish that simulated different levels of behavioral complexity. The...

  11. Proceedings of the first SRL model validation workshop

    International Nuclear Information System (INIS)

    Buckner, M.R.

    1981-10-01

    The Clean Air Act and its amendments have added importance to knowing the accuracy of mathematical models used to assess the transport and diffusion of environmental pollutants. These models are the link between air quality standards and emissions. To test the accuracy of a number of these models, a Model Validation Workshop was held. The meteorological, source-term, and Kr-85 concentration data bases for emissions from the separations areas of the Savannah River Plant during 1975 through 1977 were used to compare calculations from various atmospheric dispersion models. The results of statistical evaluation of the models show a degradation in the ability to predict pollutant concentrations as the time span over which the calculations are made is reduced. Forecasts for annual time periods were reasonably accurate. Weighted-average squared correlation coefficients (R²) were 0.74 for annual, 0.28 for monthly, 0.21 for weekly, and 0.18 for twice-daily predictions. Model performance varied within each of these four categories; however, the results indicate that the more complex, three-dimensional models provide only marginal increases in accuracy. The increased cost of running these codes is not warranted for long-term releases or for conditions of relatively simple terrain and meteorology. The overriding factor in calculational accuracy is the accurate description of the wind field. Further improvement of the numerical accuracy of the complex models is not nearly as important as accurate calculation of the meteorological transport conditions

  12. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    Science.gov (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied across the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency performance of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of the current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
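    The Seebeck relation underlying the STEG is simple to state: an open-circuit voltage V = nSΔT develops across n thermocouples, and maximum power is delivered into a matched resistive load. A sketch with hypothetical module parameters (not values from the study):

```python
def teg_power(seebeck_v_per_k, dt, r_internal, r_load, n_couples=1):
    """Open-circuit voltage from the Seebeck effect, V = n * S * dT,
    and the power delivered to a resistive load in watts."""
    v_oc = n_couples * seebeck_v_per_k * dt
    i = v_oc / (r_internal + r_load)
    return i * i * r_load

# Hypothetical module: 50 couples, S = 200 uV/K, dT = 100 K, 2 ohm internal
p_matched = teg_power(200e-6, 100.0, 2.0, 2.0, n_couples=50)  # matched load
p_off = teg_power(200e-6, 100.0, 2.0, 8.0, n_couples=50)      # mismatched load
```

    A matched load (r_load equal to r_internal) maximises the delivered power, which is one of the trade-offs a STEG optimisation must respect.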

  13. The importance of warm season warming to western U.S. streamflow changes

    Science.gov (United States)

    Das, T.; Pierce, D.W.; Cayan, D.R.; Vano, J.A.; Lettenmaier, D.P.

    2011-01-01

    Warm season climate warming will be a key driver of annual streamflow changes in four major river basins of the western U.S., as shown by hydrological model simulations using fixed precipitation and idealized seasonal temperature changes based on climate projections with SRES A2 forcing. Warm season (April-September) warming reduces streamflow throughout the year; streamflow declines both immediately and in the subsequent cool season. Cool season (October-March) warming, by contrast, increases streamflow immediately, partially compensating for streamflow reductions during the subsequent warm season. A uniform warm season warming of 3°C drives a wide range of annual flow declines across the basins: 13.3%, 7.2%, 1.8%, and 3.6% in the Colorado, Columbia, Northern and Southern Sierra basins, respectively. The same warming applied during the cool season gives annual declines of only 3.5%, 1.7%, 2.1%, and 3.1%, respectively. Copyright 2011 by the American Geophysical Union.

  14. One-day-ahead streamflow forecasting via super-ensembles of several neural network architectures based on the Multi-Level Diversity Model

    Science.gov (United States)

    Brochero, Darwin; Hajji, Islem; Pina, Jasson; Plana, Queralt; Sylvain, Jean-Daniel; Vergeynst, Jenna; Anctil, Francois

    2015-04-01

    Theories about generalization error with ensembles are mainly based on the diversity concept, which promotes resorting to many members of different properties to support mutually agreeable decisions. Kuncheva (2004) proposed the Multi-Level Diversity Model (MLDM) to promote diversity in model ensembles, combining different data subsets, input subsets, models, and parameters, and including a combiner level in order to optimize the final ensemble. This work tests the hypothesis about the minimisation of the generalization error with ensembles of Neural Network (NN) structures. We used the MLDM to evaluate two different scenarios: (i) ensembles from a same NN architecture, and (ii) a super-ensemble built by a combination of sub-ensembles of many NN architectures. The time series used correspond to the 12 basins of the MOdel Parameter Estimation eXperiment (MOPEX) project that were used by Duan et al. (2006) and Vos (2013) as benchmark. Six architectures are evaluated: FeedForward NN (FFNN) trained with the Levenberg-Marquardt algorithm (Hagan et al., 1996), FFNN trained with SCE (Duan et al., 1993), Recurrent NN trained with a complex method (Weins et al., 2008), Dynamic NARX NN (Leontaritis and Billings, 1985), Echo State Network (ESN), and leaky-integrator neuron (L-ESN) (Lukosevicius and Jaeger, 2009). Each architecture separately performs an Input Variable Selection (IVS) according to a forward stepwise selection (Anctil et al., 2009) using mean square error as the objective function. Post-processing by Predictor Stepwise Selection (PSS) of the super-ensemble has been done following the method proposed by Brochero et al. (2011). IVS results showed that lagged streamflow, lagged precipitation, and the Standardized Precipitation Index (SPI) (McKee et al., 1993) were the most relevant variables. They were respectively selected as one of the first three variables in 66, 45, and 28 of the 72 scenarios. A relationship between aridity index (Arora, 2002) and NN
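    Forward stepwise input variable selection, as cited above (Anctil et al., 2009), can be sketched with a linear surrogate: greedily add whichever candidate input most reduces mean-square error. The synthetic predictors below stand in for lagged streamflow, lagged precipitation and SPI; the NN itself is replaced by least squares to keep the sketch self-contained.

```python
import numpy as np

def forward_ivs(X, y, max_inputs=3):
    """Greedy forward stepwise IVS: at each step add the candidate column
    that most reduces least-squares MSE (a linear stand-in for the NN)."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_inputs:
        def mse_with(j):
            A = np.column_stack([X[:, selected + [j]], np.ones(len(y))])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            return float(np.mean((A @ beta - y) ** 2))
        best = min(remaining, key=mse_with)
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))                      # 5 candidate inputs
y = 2.0 * X[:, 3] + 0.5 * X[:, 0] + 0.05 * rng.standard_normal(200)
sel = forward_ivs(X, y)
```

    The selection recovers the informative columns in order of importance (column 3, then column 0), mirroring how lagged streamflow dominated the IVS in the abstract.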

  15. Contaminant transport model validation: The Oak Ridge Reservation

    International Nuclear Information System (INIS)

    Lee, R.R.; Ketelle, R.H.

    1988-09-01

    In the complex geologic setting on the Oak Ridge Reservation, hydraulic conductivity is anisotropic and flow is strongly influenced by an extensive and largely discontinuous fracture network. Difficulties in describing and modeling the aquifer system prompted a study to obtain aquifer property data for use in a groundwater flow model validation experiment. Characterization studies included an extensive suite of aquifer tests within a 600-square-meter area to obtain aquifer property values describing the flow field in detail. Following aquifer testing, a groundwater tracer test was performed under ambient conditions to verify the aquifer analysis. Tracer migration data in the near-field were used in model calibration to predict tracer arrival time and concentration in the far-field. Despite the extensive aquifer testing, initial modeling inaccurately predicted the tracer migration direction. Initial tracer migration rates were consistent with those predicted by the model; however, changing environmental conditions resulted in an unanticipated decline in tracer movement. Evaluating the predictive accuracy of groundwater flow and contaminant transport models on the Oak Ridge Reservation depends on defining the required resolution, followed by field testing and model grid definition at compatible scales. The use of tracer tests, both as a characterization method and to verify model results, provides the highest level of resolution of groundwater flow characteristics. 3 refs., 4 figs

  16. Can longer forest harvest intervals increase summer streamflow for salmon recovery?

    Science.gov (United States)

    The Mashel Streamflow Modeling Project in the Mashel River Basin, Washington, is using a watershed-scale ecohydrological model to assess whether longer forest harvest intervals can remediate summer low flow conditions that have contributed to sharply reduced runs of spawning Chin...

  17. Applications of the PyTOPKAPI model to ungauged catchments

    African Journals Online (AJOL)

    Many catchments in developing countries are poorly gauged/totally ungauged which hinders water resource ... INTRODUCTION ... focusing on the output of model calibration and validation. The last .... The improved PyTOPKAPI model is coded in Python ...... computer program for estimating streamflow statistics for ungaged.

  18. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including an assessment of whether the validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it suggests a large difference in cohort characteristics, meaning we would be validating the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohorts for one of the three tested models [Area under the Receiver Operating Curve (AUC) of the cohort differences model: 0.85], signaling that this validation leans towards transferability. Two of the three models had a lower AUC in validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
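The cohort-differences idea (fit a classifier to distinguish training from validation patients and read its AUC as a measure of cohort difference) can be sketched with a minimal linear score. The nearest-centroid projection below is an illustrative stand-in for whatever classifier the authors used, and the patient feature vectors are synthetic:

```python
def auc(pos_scores, neg_scores):
    # Mann-Whitney form of the AUC: probability that a randomly chosen
    # "positive" score outranks a "negative" one; ties count one half.
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos_scores for q in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def cohort_difference_auc(train, valid):
    # Score each patient by projecting onto the difference of cohort means,
    # then ask how well that score separates validation from training patients.
    n_t, n_v, d = len(train), len(valid), len(train[0])
    mu_t = [sum(x[j] for x in train) / n_t for j in range(d)]
    mu_v = [sum(x[j] for x in valid) / n_v for j in range(d)]
    w = [v - t for t, v in zip(mu_t, mu_v)]
    score = lambda x: sum(wi * xi for wi, xi in zip(w, x))
    return auc([score(x) for x in valid], [score(x) for x in train])
```

An AUC near 0.5 means the cohorts are hard to tell apart, so the validation probes reproducibility; an AUC near 1, as with the 0.85 reported above, means it probes transferability.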

  19. MOLECULAR VALIDATED MODEL FOR ADSORPTION OF PROTONATED DYE ON LDH

    Directory of Open Access Journals (Sweden)

    B. M. Braga

    Full Text Available Abstract Hydrotalcite-like compounds are anionic clays of scientific and technological interest for their use as ion exchange materials, catalysts and modified electrodes. Surface phenomena are important for all these applications. Although conventional analytical methods have enabled progress in understanding the behavior of anionic clays in solution, an evaluation at the atomic scale of the dynamics of their ionic interactions has never been performed. Molecular simulation has become an extremely useful tool to provide this perspective. Our purpose is to validate a simplified model for the adsorption of 5-benzoyl-4-hydroxy-2-methoxy-benzenesulfonic acid (MBSA, a prototype molecule of anionic dyes, onto a hydrotalcite surface. Monte Carlo simulations were performed in the canonical ensemble with MBSA ions and a pore model of hydrotalcite using UFF and ClayFF force fields. The proposed molecular model has allowed us to reproduce experimental data of atomic force microscopy. Influences of protonation during the adsorption process are also presented.

  20. Validation of the replica trick for simple models

    Science.gov (United States)

    Shinzato, Takashi

    2018-04-01

    We discuss replica analytic continuation using several simple models in order to mathematically establish the validity of replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamic limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models and examine the properties of replica analytic continuation. Based on the positive results for these models, we propose that replica analytic continuation is a robust procedure in replica analysis.
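For completeness, the replica trick discussed here rests on the standard identity (stated in common notation, not necessarily the paper's):

```latex
\mathbb{E}[\ln Z] \;=\; \lim_{n \to 0} \frac{\mathbb{E}[Z^n] - 1}{n}
\;=\; \lim_{n \to 0} \frac{\ln \mathbb{E}[Z^n]}{n},
```

where $\mathbb{E}[Z^n]$ is evaluated for integer $n$ and then analytically continued to real $n \to 0$; the validity of that continuation is exactly what the paper examines.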

  1. Experimental Validation of a Dynamic Model for Lightweight Robots

    Directory of Open Access Journals (Sweden)

    Alessandro Gasparetto

    2013-03-01

    Full Text Available Nowadays, one of the main topics in robotics research is dynamic performance improvement by means of a lightening of the overall system structure. Effective motion and control of these lightweight robotic systems requires suitable motion planning and control processes. To this end, model-based approaches can be adopted by exploiting accurate dynamic models that take into account the inertial and elastic terms that are usually neglected in a heavy rigid-link configuration. In this paper, an effective method for modelling spatial lightweight industrial robots based on an Equivalent Rigid Link System approach is considered from an experimental validation perspective. A dynamic simulator implementing the formulation is used and an experimental test-bench is set up. Experimental tests are carried out with a benchmark L-shape mechanism.

  2. Analytical thermal model validation for Cassini radioisotope thermoelectric generator

    International Nuclear Information System (INIS)

    Lin, E.I.

    1997-01-01

    The Saturn-bound Cassini spacecraft is designed to rely, without precedent, on the waste heat from its three radioisotope thermoelectric generators (RTGs) to warm the propulsion module subsystem, and the RTG end dome temperature is a key determining factor in the amount of waste heat delivered. A previously validated SINDA thermal model of the RTG was the sole guide to understanding its complex thermal behavior, but displayed large discrepancies against some initial thermal development test data. A careful revalidation effort led to significant modifications and adjustments of the model, which resulted in a doubling of the radiative heat transfer from the heat source support assemblies to the end domes and brought the end dome and flange temperature predictions to within 2 C of the pertinent test data. The increased inboard end dome temperature has a considerable impact on thermal control of the spacecraft central body. The validation process offers an example of physically-driven analytical model calibration with test data from not only an electrical simulator but also a nuclear-fueled flight unit, and has established the end dome temperatures of a flight RTG where no in-flight or ground-test data existed before

  3. Lessons learned from recent geomagnetic disturbance model validation activities

    Science.gov (United States)

    Pulkkinen, A. A.; Welling, D. T.

    2017-12-01

    Due to concerns pertaining to geomagnetically induced current impact on ground-based infrastructure, there has been significantly elevated interest in applying models for local geomagnetic disturbance or "delta-B" predictions. Correspondingly, there has been an elevated need for testing the quality of the delta-B predictions generated by modern empirical and physics-based models. To address this need, community-wide activities were launched under the GEM Challenge framework, and one culmination of the activities was the validation and selection of models that were transitioned into operations at NOAA SWPC. The community-wide delta-B action is continued under the CCMC-facilitated International Forum for Space Weather Capabilities Assessment and its "Ground Magnetic Perturbations: dBdt, delta-B, GICs, FACs" working group. The new delta-B working group builds on past experience and expands the collaboration to cover the entire international space weather community. In this paper, we discuss the key lessons learned from the past delta-B validation exercises and lay out the path forward for building on that experience under the new delta-B working group.

  4. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Directory of Open Access Journals (Sweden)

    Shankarjee Krishnamoorthi

    Full Text Available We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  5. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  6. High Turbidity Solis Clear Sky Model: Development and Validation

    Directory of Open Access Journals (Sweden)

    Pierre Ineichen

    2018-03-01

    Full Text Available The Solis clear sky model is a spectral scheme based on radiative transfer calculations and the Lambert–Beer relation. Its broadband version is a simplified, fast analytical version; it is limited to broadband aerosol optical depths lower than 0.45, which is a weakness when applied in countries with very high turbidity such as China or India. In order to extend the use of the original simplified version of the model to high turbidity values, we developed a new version of the broadband Solis model based on radiative transfer calculations, valid for turbidity values up to 7, for the three components (global, beam, and diffuse) and for the four aerosol types defined by Shettle and Fenn. Validation against low-turbidity data acquired in Geneva shows slightly better results than the previous version. For data acquired at sites with higher turbidity, the bias stays within ±4% for the beam and global irradiances, and the standard deviation is around 5% for clean and stable conditions and around 12% for questionable data and variable sky conditions.
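The Lambert–Beer relation at the core of Solis can be illustrated as follows. This is the generic attenuation law with a simple plane-parallel air mass, a stated simplification; it is not the fitted broadband Solis parameterization:

```python
import math

def beam_irradiance(i0, tau, elevation_deg):
    # Lambert-Beer attenuation of direct-normal irradiance:
    # DNI = I0 * exp(-tau * m), with plane-parallel air mass m = 1 / sin(h).
    m = 1.0 / math.sin(math.radians(elevation_deg))
    return i0 * math.exp(-tau * m)
```

Higher aerosol optical depth tau or lower solar elevation attenuates the beam more strongly, which is why the high-turbidity regime stresses the simplified model.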

  7. Validating modeled turbulent heat fluxes across large freshwater surfaces

    Science.gov (United States)

    Lofgren, B. M.; Fujisaki-Manome, A.; Gronewold, A.; Anderson, E. J.; Fitzpatrick, L.; Blanken, P.; Spence, C.; Lenters, J. D.; Xiao, C.; Charusambot, U.

    2017-12-01

    Turbulent fluxes of latent and sensible heat are important physical processes that influence the energy and water budgets of the Great Lakes. Validation and improvement of bulk flux algorithms to simulate these turbulent heat fluxes are critical for accurate prediction of hydrodynamics, water levels, weather, and climate over the region. Here we consider five heat flux algorithms from several model systems: the Finite-Volume Community Ocean Model (FVCOM), the Weather Research and Forecasting model, and the Large Lake Thermodynamics Model, which are used in research and operational environments and concentrate on different aspects of the Great Lakes' physical system but interface at the lake surface. The heat flux algorithms were isolated from each model and driven by meteorological data from over-lake stations in the Great Lakes Evaporation Network. The simulation results were compared with eddy covariance flux measurements at the same stations. All models show the capacity to capture the seasonal cycle of the turbulent heat fluxes. Overall, the Coupled Ocean Atmosphere Response Experiment (COARE) algorithm in FVCOM has the best agreement with the eddy covariance measurements. Simulations with the other four algorithms are improved overall by updating the parameterization of the roughness length scales for temperature and humidity. Agreement between modelled and observed fluxes varied notably with the geographical location of the stations. For example, at the Long Point station in Lake Erie, observed fluxes are likely influenced by the upwind land surface, which the simulations do not take into account, and the agreement there is therefore generally worse.
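Generic bulk formulas of the kind these algorithms refine can be sketched as follows. The transfer coefficients, air density, and other constants are illustrative round numbers; operational algorithms such as COARE make the coefficients stability-dependent through the roughness lengths mentioned above:

```python
def sensible_heat_flux(u, t_sfc, t_air, ch=1.3e-3, rho=1.2, cp=1004.0):
    # Bulk formula: H = rho * cp * C_H * U * (Ts - Ta)  [W m^-2]
    return rho * cp * ch * u * (t_sfc - t_air)

def latent_heat_flux(u, q_sfc, q_air, ce=1.3e-3, rho=1.2, lv=2.5e6):
    # Bulk formula: LE = rho * Lv * C_E * U * (qs - qa)  [W m^-2]
    return rho * lv * ce * u * (q_sfc - q_air)
```

Both fluxes scale linearly with wind speed and with the surface-air temperature or humidity contrast, so errors in the transfer coefficients C_H and C_E propagate directly into the simulated lake energy budget.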

  8. Seismic behaviour of PWR fuel assemblies model and its validation

    International Nuclear Information System (INIS)

    Queval, J.C.; Gantenbein, F.; Brochard, D.; Benjedidia, A.

    1991-01-01

    The validity of models simulating the seismic behaviour of PWR cores can only be demonstrated exactly by seismic testing on groups of fuel assemblies. Shake table seismic tests of rows of assembly mock-ups, conducted by the CEA in conjunction with FRAMATOME, are presented in reference /1/. This paper addresses the initial comparisons between model and test results for a row of five assemblies in air. Two models are used: a model with a single beam per assembly, used regularly in accident analyses and described in reference /2/, and a more refined 2-beam per assembly model, geared mainly towards interpretation of the test results. The 2-beam model is discussed first, together with the parametric studies used to characterize it and the study of the assembly row for a period limited to 2 seconds and for different excitation levels. For the 1-beam model used in applications, the row is studied over the total test time, i.e. twenty seconds, which covers the average duration of core seismic behaviour studies, and for a peak excitation acceleration of 0.4 g, which corresponds to the SSE level of the reference spectrum

  9. Discrete fracture modelling for the Stripa tracer validation experiment predictions

    International Nuclear Information System (INIS)

    Dershowitz, W.; Wallmann, P.

    1992-02-01

    Groundwater flow and transport through three-dimensional networks of discrete fractures was modeled to predict the recovery of tracer from tracer injection experiments conducted during phase 3 of the Stripa site characterization and validation project. Predictions were made on the basis of an updated version of the site-scale discrete fracture conceptual model used for flow predictions and preliminary transport modelling. In this model, individual fractures were treated as stochastic features described by probability distributions of geometric and hydrologic properties. Fractures were divided into three populations: fractures in fracture zones near the drift, non-fracture-zone fractures within 31 m of the drift, and fractures in fracture zones over 31 meters from the drift axis. Fractures outside fracture zones were not modelled beyond 31 meters from the drift axis. Transport predictions were produced using the FracMan discrete fracture modelling package for each of five tracer experiments. Output was produced in the seven formats specified by the Stripa task force on fracture flow modelling. (au)

  10. Numerical simulation and experimental validation of aircraft ground deicing model

    Directory of Open Access Journals (Sweden)

    Bin Chen

    2016-05-01

    Full Text Available Aircraft ground deicing plays an important role in guaranteeing aircraft safety. In practice, most airports generally use as much deicing fluid as possible to remove the ice, which wastes deicing fluid and pollutes the environment. Therefore, a model of aircraft ground deicing should be built to establish a foundation for subsequent research, such as optimization of deicing fluid consumption. In this article, the heat balance of the deicing process is described, and a dynamic model of the process is derived from an analysis of the deicing mechanism. In the dynamic model, the surface temperature of the deicing fluid and the ice thickness are regarded as the state parameters, while the fluid flow rate, the initial temperature, and the injection time of the deicing fluid are treated as control parameters. Ignoring the heat exchange between the deicing fluid and the environment yields a simplified model. The rationality of the simplified model is verified by numerical simulation, and the impacts of the flow rate, the initial temperature, and the injection time on the deicing process are investigated. To verify the model, a semi-physical experimental system was established, consisting of a constant low-temperature test chamber, an ice simulation system, a deicing-fluid heating and spraying system, a simulated wing, test sensors, and a computer measurement and control system. The actual test data verify the validity of the dynamic model and the accuracy of the simulation analysis.

  11. A physical framework for evaluating net effects of wet meadow restoration on late summer streamflow

    Science.gov (United States)

    Grant, G.; Nash, C.; Selker, J. S.; Lewis, S.; Noël, P.

    2017-12-01

    Restoration of degraded wet meadows that develop on upland valley floors is intended to achieve a range of ecological benefits. A widely cited benefit is the potential for meadow restoration to augment late-season streamflow; however, there has been little field data demonstrating increased summer flows following restoration. Instead, the hydrologic consequences of restoration have typically been explored using coupled groundwater and surface water flow models at instrumented sites. The expected magnitude and direction of change provided by models has, however, been inconclusive. Here, we assess the streamflow benefit that can be obtained by wet meadow restoration using a parsimonious, physically-based approach. We use a one-dimensional linearized Boussinesq equation with a superimposed solution for changes in storage due to groundwater upwelling, and explicitly calculate evapotranspiration using the White method. The model accurately predicts water table elevations from field data in the Middle Fork John Day watershed in Oregon, USA. The full solution shows that while raising channel beds can increase total water storage via increases in water table elevation in upland valley bottoms, the contributions of both lateral and longitudinal drainage from restored floodplains to late summer streamflow are undetectably small, while losses in streamflow due to greater transpiration, lower hydraulic gradients, and less drainable pore volume are substantial. Although late-summer streamflow increases should not be expected as a direct result of wet meadow restoration, these approaches offer benefits for improving the quality and health of riparian and meadow vegetation that would warrant considering such measures, even at the cost of increased water demand and reduced streamflow.
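For reference, the White method used above estimates groundwater evapotranspiration from diurnal water-table fluctuations. In its standard textbook form (which may differ in detail from the authors' implementation):

```latex
ET_g = S_y \left( 24\,r \pm \Delta s \right),
```

where $S_y$ is the specific yield, $r$ is the hourly rate of water-table rise between midnight and 4 a.m. (taken as the recovery rate), and $\Delta s$ is the net change in water-table stage over the day.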

  12. Validation of Symptom Validity Tests Using a "Child-model" of Adult Cognitive Impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P. E. J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children's cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  13. Validation of symptom validity tests using a "child-model" of adult cognitive impairments

    NARCIS (Netherlands)

    Rienstra, A.; Spaan, P.E.J.; Schmand, B.

    2010-01-01

    Validation studies of symptom validity tests (SVTs) in children are uncommon. However, since children’s cognitive abilities are not yet fully developed, their performance may provide additional support for the validity of these measures in adult populations. Four SVTs, the Test of Memory Malingering

  14. Technical note: Combining quantile forecasts and predictive distributions of streamflows

    Science.gov (United States)

    Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano

    2017-11-01

    The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this wealth of information. In particular, the use of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it even more complicated for decision makers to sift out the relevant information. In this study, multiple sources of streamflow forecast information will be aggregated based on several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, the non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistics (EMOS) technique, and a novel method called Beta-transformed linear pooling (BLP) will be applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, with about 5 years of forecast data, will be compared and the differences between the raw and optimally combined forecasts will be highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
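The two verification scores used for the comparison have simple empirical forms. The sketch below gives the standard ensemble estimator of the CRPS and the pinball (quantile) loss behind the QS; it is generic, not the authors' exact implementation:

```python
def crps_ensemble(members, obs):
    # Empirical CRPS for an m-member ensemble:
    # E|X - y| - 0.5 * E|X - X'|  (lower is better).
    m = len(members)
    spread = sum(abs(a - b) for a in members for b in members)
    return sum(abs(x - obs) for x in members) / m - spread / (2 * m * m)

def quantile_score(q, tau, obs):
    # Pinball loss for a forecast q of the tau-quantile (lower is better).
    return (tau - (obs < q)) * (obs - q)
```

For a degenerate (deterministic) ensemble the CRPS reduces to the absolute error, which is what makes it a fair yardstick between deterministic and probabilistic forecasts.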

  15. Smooth particle hydrodynamic modeling and validation for impact bird substitution

    Science.gov (United States)

    Babu, Arun; Prasad, Ganesh

    2018-04-01

    Bird strike events occur incidentally and can at times be fatal for airframe structures. The Federal Aviation Regulations (FAR) and similar standards mandate that aircraft be shown to withstand various levels of bird-strike damage. The subject of this paper is the numerical modeling of a soft-body geometry that realistically substitutes for an actual bird in simulations of bird strikes on target structures. A numerical model that reproduces actual bird behavior through impact is much desired for making use of state-of-the-art computational facilities in simulating bird strike events. The validity of simulations depicting bird hits is largely dependent on the correctness of the bird model. In an impact, a set of complex, coupled dynamic interactions exists between the target and the impactor. To simplify the problem, the impactor response needs to be decoupled from that of the target, which can be done by modeling the target as non-compliant. The bird is assumed to behave as a fluid during impact, since the stresses generated in the bird body are significantly higher than its yield stress; hydrodynamic theory is therefore well suited to describing the problem. The impactor flows steadily over the target for most of the event. The impact starts with an initial shock and falls into a radial release-shock regime; subsequently a steady flow is established in the bird body, and this phase continues until the whole length of the bird body is turned around. The initial shock pressure and the steady-state pressure are ideal variables for comparing and validating the bird model. Spatial discretization of the bird is done using the Smooth Particle Hydrodynamic (SPH) approach. This Discrete Element Model (DEM) offers significant advantages over other contemporary approaches. Thermodynamic state variable relations are established using a polynomial equation of state (EOS). ANSYS AUTODYN is used to perform the explicit dynamic simulation of the impact event. Validation of the shock and steady
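The two validation variables named above have closed forms in classical (Wilbeck-style) soft-body impact analysis. The sketch below uses water-like equation-of-state constants and a 100 m/s impact as stated assumptions; none of these values come from this paper:

```python
def hugoniot_pressure(rho0, c0, k, u):
    # Initial shock (Hugoniot) pressure P_H = rho0 * Us * u, with the linear
    # shock-velocity relation Us = c0 + k * u.
    return rho0 * (c0 + k * u) * u

def stagnation_pressure(rho0, u):
    # Steady-flow stagnation pressure P_s = 0.5 * rho0 * u^2 (Bernoulli).
    return 0.5 * rho0 * u * u

# Illustrative, water-like values: density 950 kg/m^3, sound speed 1480 m/s,
# shock-slope k = 2, impact velocity 100 m/s.
p_shock = hugoniot_pressure(950.0, 1480.0, 2.0, 100.0)
p_steady = stagnation_pressure(950.0, 100.0)
```

At this velocity the initial shock pressure greatly exceeds the steady stagnation pressure, which is why the shock phase dominates the early loading of the target.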

  16. Validation of kinetic modeling of progesterone release from polymeric membranes

    Directory of Open Access Journals (Sweden)

    Analia Irma Romero

    2018-01-01

    Full Text Available Abstract Mathematical modeling of drug release systems is fundamental in the development and optimization of these systems, since it allows prediction of drug release rates and elucidation of the physical transport mechanisms involved. In this paper we validate a novel mathematical model that describes progesterone (Prg) controlled release from poly-3-hydroxybutyric acid (PHB) membranes. A statistical analysis was conducted to compare the fit of our model with six different models, and the Akaike information criterion (AIC) was used to find the equation with the best fit. A simple relation between mass and drug release rate was found, which allows predicting the effect of Prg loads on the release behavior. Our proposed model had the minimum AIC value, and therefore it was the one that statistically best fitted the experimental data obtained for all the Prg loads tested. Furthermore, the initial release rate was calculated, the interface mass transfer coefficient was thereby estimated, and the equilibrium distribution constant of Prg between the PHB and the release medium was also determined. These results lead us to conclude that our proposed model best fits the experimental data and can be successfully used to describe Prg release from PHB membranes.
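AIC-based selection among candidate release laws can be sketched as follows. The two classical one-parameter laws here (zero-order and Higuchi) and the data in the usage example are illustrative stand-ins for the authors' six models and their Prg measurements:

```python
import math

def fit_release(times, released, basis):
    # One-parameter least-squares fit Q(t) = k * basis(t); returns (k, RSS).
    f = [basis(t) for t in times]
    k = sum(fi * yi for fi, yi in zip(f, released)) / sum(fi * fi for fi in f)
    rss = sum((yi - k * fi) ** 2 for fi, yi in zip(f, released))
    return k, rss

def aic(n, rss, n_params):
    # Least-squares AIC (up to an additive constant): n * ln(RSS / n) + 2 * p.
    return n * math.log(rss / n) + 2 * n_params

MODELS = {
    "zero-order": lambda t: t,             # Q = k0 * t
    "Higuchi":    lambda t: math.sqrt(t),  # Q = kH * sqrt(t)
}

def best_model(times, released):
    # Fit each candidate law and return the name with the minimum AIC.
    scores = {}
    for name, basis in MODELS.items():
        _, rss = fit_release(times, released, basis)
        scores[name] = aic(len(times), rss, 1)
    return min(scores, key=scores.get)
```

Because both candidates have one parameter, the AIC comparison here reduces to comparing residual sums of squares; with unequal parameter counts the 2p penalty starts to matter.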

  17. MT3DMS: Model use, calibration, and validation

    Science.gov (United States)

    Zheng, C.; Hill, Mary C.; Cao, G.; Ma, R.

    2012-01-01

    MT3DMS is a three-dimensional multi-species solute transport model for solving advection, dispersion, and chemical reactions of contaminants in saturated groundwater flow systems. MT3DMS interfaces directly with the U.S. Geological Survey finite-difference groundwater flow model MODFLOW for the flow solution and supports the hydrologic and discretization features of MODFLOW. MT3DMS contains multiple transport solution techniques in one code, which can often be important, including in model calibration. Since its first release in 1990 as MT3D for single-species mass transport modeling, MT3DMS has been widely used in research projects and practical field applications. This article provides a brief introduction to MT3DMS and presents recommendations about calibration and validation procedures for field applications of MT3DMS. The examples presented suggest the need to consider alternative processes as models are calibrated and suggest opportunities and difficulties associated with using groundwater age in transport model calibration.

  18. Non-Linear Slosh Damping Model Development and Validation

    Science.gov (United States)

    Yang, H. Q.; West, Jeff

    2015-01-01

    Propellant tank slosh dynamics are typically represented by a mechanical model of spring mass damper. This mechanical model is then included in the equation of motion of the entire vehicle for Guidance, Navigation and Control (GN&C) analysis. For a partially-filled smooth wall propellant tank, the critical damping based on classical empirical correlation is as low as 0.05%. Due to this low value of damping, propellant slosh is potential sources of disturbance critical to the stability of launch and space vehicles. It is postulated that the commonly quoted slosh damping is valid only under the linear regime where the slosh amplitude is small. With the increase of slosh amplitude, the critical damping value should also increase. If this nonlinearity can be verified and validated, the slosh stability margin can be significantly improved, and the level of conservatism maintained in the GN&C analysis can be lessened. The purpose of this study is to explore and to quantify the dependence of slosh damping with slosh amplitude. Accurately predicting the extremely low damping value of a smooth wall tank is very challenging for any Computational Fluid Dynamics (CFD) tool. One must resolve thin boundary layers near the wall and limit numerical damping to minimum. This computational study demonstrates that with proper grid resolution, CFD can indeed accurately predict the low damping physics from smooth walls under the linear regime. Comparisons of extracted damping values with experimental data for different tank sizes show very good agreements. Numerical simulations confirm that slosh damping is indeed a function of slosh amplitude. When slosh amplitude is low, the damping ratio is essentially constant, which is consistent with the empirical correlation. Once the amplitude reaches a critical value, the damping ratio becomes a linearly increasing function of the slosh amplitude. A follow-on experiment validated the developed nonlinear damping relationship. 
This discovery can
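The reported amplitude dependence (a constant damping ratio below a critical amplitude, then linear growth) can be sketched as a simple piecewise function. Every numerical value below is a hypothetical placeholder, not a result from the study:

```python
def damping_ratio(amplitude, zeta_linear=0.0005, a_crit=0.01, slope=0.05):
    """Piecewise slosh damping: constant in the linear (small-amplitude)
    regime, then linearly increasing with amplitude above a critical value.
    All parameter values are hypothetical placeholders."""
    if amplitude <= a_crit:
        return zeta_linear
    return zeta_linear + slope * (amplitude - a_crit)
```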

  19. Climate change streamflow scenarios designed for critical period water resources planning studies

    Science.gov (United States)

    Hamlet, A. F.; Snover, A. K.; Lettenmaier, D. P.

    2003-04-01

Long-range water planning in the United States is usually conducted by individual water management agencies using a critical period planning exercise based on a particular period of the observed streamflow record and a suite of internally-developed simulation tools representing the water system. In the context of planning for climate change, such an approach is flawed in that it assumes that the future climate will be like the historic record. Although more sophisticated planning methods will probably be required as time goes on, a short-term strategy for incorporating climate uncertainty into long-range water planning as soon as possible is to create alternate inputs to existing planning methods that account for climate uncertainty as it affects both supply and demand. We describe a straightforward technique for constructing streamflow scenarios based on the historic record that include the broad-based effects of changed regional climate simulated by several global climate models (GCMs). The streamflow scenarios are based on hydrologic simulations driven by historic climate data perturbed according to regional climate signals from four GCMs using the simple "delta" method. Further data processing then removes systematic hydrologic model bias using a quantile-based bias correction scheme, and lastly, the effects of random errors in the raw hydrologic simulations are removed. These techniques produce streamflow scenarios that are consistent in time and space with the historic streamflow record while incorporating fundamental changes in temperature and precipitation from the GCM scenarios. Planning model simulations based on these climate change streamflow scenarios can therefore be compared directly to planning model simulations based on the historic record of streamflows to help planners understand the potential impacts of climate uncertainty. 
The methods are currently being tested and refined in two large-scale planning exercises currently being conducted in the
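A minimal sketch of the quantile-based bias correction idea, assuming a simple empirical quantile mapping rather than the authors' exact scheme: a simulated value is located at its quantile in the simulated climatology, then replaced by the same quantile of the observed climatology.

```python
import numpy as np

def quantile_map(sim_value, sim_clim, obs_clim):
    """Map a simulated value to its empirical quantile in the simulated
    climatology, then read off the same quantile of the observed
    climatology. A minimal quantile-mapping bias correction sketch."""
    q = (np.asarray(sim_clim) <= sim_value).mean()
    return float(np.quantile(np.asarray(obs_clim), min(q, 1.0)))
```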

  20. Modeling and validation of existing VAV system components

    Energy Technology Data Exchange (ETDEWEB)

    Nassif, N.; Kajl, S.; Sabourin, R. [Ecole de Technologie Superieure, Montreal, PQ (Canada)

    2004-07-01

The optimization of supervisory control strategies and local-loop controllers can improve the performance of HVAC (heating, ventilating, air-conditioning) systems. In this study, component models of the fan, the damper, and the cooling coil were developed and validated against monitored data from an existing variable air volume (VAV) system installed at Montreal's Ecole de Technologie Superieure. The measured variables that influence energy use in the individual HVAC models included: (1) outdoor and return air temperature and relative humidity, (2) supply air and water temperatures, (3) zone airflow rates, (4) supply duct, fan outlet, and mixing plenum static pressures, (5) fan speed, and (6) minimum and principal damper and cooling and heating coil valve positions. Additional variables that were considered, but not measured, were: (1) fan and outdoor airflow rate, (2) inlet and outlet cooling coil relative humidity, and (3) liquid flow rate through the heating or cooling coils. The paper demonstrates the challenges of the validation process when monitored data of existing VAV systems are used. 7 refs., 11 figs.
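A fan component model of the kind mentioned is often built on the classical affinity laws; this generic sketch illustrates the idea and is not the validated model from the study:

```python
def fan_model(speed_ratio, nominal_flow, nominal_pressure, nominal_power):
    """Fan affinity laws: airflow scales linearly with the speed ratio,
    static pressure with its square, and shaft power with its cube."""
    return (nominal_flow * speed_ratio,
            nominal_pressure * speed_ratio ** 2,
            nominal_power * speed_ratio ** 3)
```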

  1. Streamflow responses in Chile to megathrust earthquakes in the 20th and 21st centuries

    Science.gov (United States)

    Mohr, Christian; Manga, Michael; Wang, Chi-yuen; Korup, Oliver

    2016-04-01

Both coseismic static stress and dynamic stresses associated with seismic waves may cause responses in hydrological systems. Such responses include changes in water level, hydrochemistry and streamflow discharge. Earthquake effects on hydrological systems provide a means to study the interaction between stress changes and regional hydrology, which is otherwise rarely possible. Chile is a country of frequent and large earthquakes and thus provides abundant opportunities to study such interactions and processes. We analyze streamflow responses in Chile to several megathrust earthquakes, including the 1943 Mw 8.1 Coquimbo, 1950 Mw 8.2 Antofagasta, 1960 Mw 9.5 Valdivia, 1985 Mw 8.0 Valparaiso, 1995 Mw 8.0 Antofagasta, 2010 Mw 8.8 Maule, and 2014 Mw 8.2 Iquique earthquakes. We use data from 716 stream gauges distributed from the Altiplano in the north to Tierra del Fuego in the south. This network covers the Andes mountain ranges, the central valley, the Coastal Mountain ranges and (mainly in the more southern parts) the coastal flats. We combine empirical magnitude-distance relationships, machine learning tools, and process-based modeling to characterize responses. We first assess the streamflow anomalies and relate these to topographical, hydro-climatic, geological and earthquake-related (volumetric and dynamic strain) factors using various classifiers. We then apply 1D groundwater flow modeling to selected catchments in order to test competing hypotheses for the origin of streamflow changes. We show that the co-seismic responses of streamflow mostly involved increasing discharges. We conclude that enhanced vertical permeability can explain most streamflow responses at the regional scale. The total excess water released by a single event, the Maule earthquake, was up to 1 km³. Against the background of megathrust earthquakes frequently hitting Chile, the amount of water released by earthquakes is substantial, particularly for the arid northern

  2. Numerical Simulation of Hydrogen Combustion: Global Reaction Model and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yun [School of Energy and Power Engineering, Xi’an Jiaotong University, Xi’an (China); Department of Mechanical, Aerospace and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy, NY (United States); Liu, Yinhe, E-mail: yinheliu@mail.xjtu.edu.cn [School of Energy and Power Engineering, Xi’an Jiaotong University, Xi’an (China)

    2017-11-20

Due to the complexity of modeling the combustion process in nuclear power plants, global mechanisms are preferred for numerical simulation. To quickly perform highly resolved simulations of large-scale hydrogen combustion with limited processing resources, a method based on thermal theory was developed to obtain the kinetic parameters of a global reaction mechanism for hydrogen–air combustion over a wide range of conditions. The calculated kinetic parameters at lower hydrogen concentrations (C{sub hydrogen} < 20%) were validated against results obtained from experimental measurements in a container and combustion test facility. In addition, the numerical data from the global mechanism (C{sub hydrogen} > 20%) were compared with results from a detailed mechanism. Good agreement between the model predictions and the experimental data was achieved, and the comparison between simulation results from the detailed and global reaction mechanisms shows that the calculated global mechanism has excellent predictive capability for a wide range of hydrogen–air mixtures.
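A one-step global reaction rate of the kind described takes the Arrhenius form w = A·[H2]^a·[O2]^b·exp(-Ea/(R·T)); the pre-exponential factor, activation energy, and reaction orders below are hypothetical placeholders, not the calculated kinetic parameters from the paper:

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol*K)

def global_rate(conc_h2, conc_o2, temperature, A=1.0e12, Ea=1.2e5, a=1.0, b=0.5):
    """One-step global reaction rate w = A*[H2]^a*[O2]^b*exp(-Ea/(R*T)).
    A, Ea, and the reaction orders are hypothetical placeholders."""
    return A * conc_h2 ** a * conc_o2 ** b * math.exp(-Ea / (R_GAS * temperature))
```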

  3. Experimental validation of models for Plasma Focus devices

    International Nuclear Information System (INIS)

    Rodriguez Palomino, Luis; Gonzalez, Jose; Clausse, Alejandro

    2003-01-01

Plasma Focus (PF) devices are thermonuclear pulsators that produce short pulses of radiation (X-rays, charged particles and neutrons). Since the pioneering devices of Filippov and Mather, they have been used to study plasma properties. Nowadays, interest in PF devices is focused on technological applications, related to the use of these devices as pulsed neutron sources. On the numerical side, the inter-institutional PLADEMA (PLAsmas DEnsos MAgnetizados) network is developing three models, each one useful at a different engineering stage of Plasma Focus design. One of the main objectives of this work is a comparative study of the influence of the different parameters involved in each model. To validate the results, several experimental measurements under different geometries and initial conditions were performed. (author)

  4. Numerical Simulation of Hydrogen Combustion: Global Reaction Model and Validation

    International Nuclear Information System (INIS)

    Zhang, Yun; Liu, Yinhe

    2017-01-01

Due to the complexity of modeling the combustion process in nuclear power plants, global mechanisms are preferred for numerical simulation. To quickly perform highly resolved simulations of large-scale hydrogen combustion with limited processing resources, a method based on thermal theory was developed to obtain the kinetic parameters of a global reaction mechanism for hydrogen–air combustion over a wide range of conditions. The calculated kinetic parameters at lower hydrogen concentrations (C hydrogen < 20%) were validated against results obtained from experimental measurements in a container and combustion test facility. In addition, the numerical data from the global mechanism (C hydrogen > 20%) were compared with results from a detailed mechanism. Good agreement between the model predictions and the experimental data was achieved, and the comparison between simulation results from the detailed and global reaction mechanisms shows that the calculated global mechanism has excellent predictive capability for a wide range of hydrogen–air mixtures.

  5. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation, and guidelines for agent system development. Although the AOM methodology is claimed to be capable of modeling complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are few case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecological studies, and hence further validates AOM in a qualitative manner.

  6. Merging Satellite Precipitation Products for Improved Streamflow Simulations

    Science.gov (United States)

    Maggioni, V.; Massari, C.; Barbetta, S.; Camici, S.; Brocca, L.

    2017-12-01

statistics, as well as bias reduction and correlation coefficient, with the Bayesian approach being superior to the other methods. A case study in the Tiber river basin is also presented to discuss the performance of forcing a hydrological model with the merged satellite precipitation product to simulate streamflow time series.
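One common way to merge several precipitation estimates in a Bayesian spirit is inverse-error-variance weighting; this is a generic sketch of that idea, not necessarily the authors' merging scheme:

```python
def bayesian_merge(estimates, error_variances):
    """Inverse-error-variance weighted average of precipitation estimates:
    products with lower error variance receive proportionally more weight."""
    weights = [1.0 / v for v in error_variances]
    total = sum(weights)
    return sum(w * x for w, x in zip(weights, estimates)) / total
```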

  7. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan

    2014-01-01

Full Text Available Background. Evidence rankings do not consider equally internal (IV), external (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the EV/MV quality of research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce and provide information that will guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on equal footing, will better guide clinical decision making.

  8. Experimental validation of solid rocket motor damping models

    Science.gov (United States)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2017-12-01

In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second aim of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient—complex Young's modulus—to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired by the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe
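The conversion from structural (hysteretic) damping to equivalent viscous damping is commonly done by matching the energy dissipated per cycle at a single reference frequency, c_eq = eta·k/omega; this is a generic single-degree-of-freedom sketch of that match, not one of the paper's specific matrix-level methods:

```python
import math

def equivalent_viscous_damping(stiffness, eta, freq_hz):
    """Equivalent viscous damping constant c_eq = eta*k/omega, matching the
    energy dissipated per cycle by structural (hysteretic) damping at one
    reference frequency only (the usual caveat of this conversion)."""
    omega = 2.0 * math.pi * freq_hz
    return eta * stiffness / omega
```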

  9. Design-validation of a hand exoskeleton using musculoskeletal modeling.

    Science.gov (United States)

    Hansen, Clint; Gosselin, Florian; Ben Mansour, Khalil; Devos, Pierre; Marin, Frederic

    2018-04-01

Exoskeletons are progressively reaching homes and workplaces, allowing interaction with virtual environments, remote control of robots, or assisting human operators in carrying heavy loads. Their design is however still a challenge as these robots, being mechanically linked to the operators who wear them, have to meet ergonomic constraints besides usual robotic requirements in terms of workspace, speed, or efforts. They have in particular to fit the anthropometry and mobility of their users. This traditionally results in numerous prototypes which are progressively fitted to each individual person. In this paper, we propose instead to validate the design of a hand exoskeleton in a fully digital environment, without the need for a physical prototype. The purpose of this study is thus to examine whether finger kinematics are altered when using a given hand exoskeleton. Therefore, user specific musculoskeletal models were created and driven by a motion capture system to evaluate the fingers' joint kinematics when performing two industrial related tasks. The kinematic chain of the exoskeleton was added to the musculoskeletal models and its compliance with the hand movements was evaluated. Our results show that the proposed exoskeleton design does not influence fingers' joint angles, the coefficient of determination between the model with and without exoskeleton being consistently high (mean R² = 0.93) and the nRMSE consistently low (mean nRMSE = 5.42°). These results are promising and this approach combining musculoskeletal and robotic modeling driven by motion capture data could be a key factor in the ergonomics validation of the design of orthotic devices and exoskeletons prior to manufacturing. Copyright © 2017 Elsevier Ltd. All rights reserved.
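The agreement metrics quoted above can be computed as follows, assuming the standard definitions of the coefficient of determination and range-normalized RMSE (the abstract does not spell out its exact normalization):

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination between measured and predicted series."""
    y = np.asarray(y, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

def nrmse(y, y_hat):
    """Root-mean-square error normalized by the observed range."""
    y = np.asarray(y, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    rmse = np.sqrt(np.mean((y - y_hat) ** 2))
    return rmse / (np.max(y) - np.min(y))
```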

  10. Experimental validation of solid rocket motor damping models

    Science.gov (United States)

    Riso, Cristina; Fransen, Sebastiaan; Mastroddi, Franco; Coppotelli, Giuliano; Trequattrini, Francesco; De Vivo, Alessio

    2018-06-01

In design and certification of spacecraft, payload/launcher coupled load analyses are performed to simulate the satellite dynamic environment. To obtain accurate predictions, the system damping properties must be properly taken into account in the finite element model used for coupled load analysis. This is typically done using a structural damping characterization in the frequency domain, which is not applicable in the time domain. Therefore, the structural damping matrix of the system must be converted into an equivalent viscous damping matrix when a transient coupled load analysis is performed. This paper focuses on the validation of equivalent viscous damping methods for dynamically condensed finite element models via correlation with experimental data for a realistic structure representative of a slender launch vehicle with solid rocket motors. A second aim of the paper is to investigate how to conveniently choose a single combination of Young's modulus and structural damping coefficient—complex Young's modulus—to approximate the viscoelastic behavior of a solid propellant material in the frequency band of interest for coupled load analysis. A scaled-down test article inspired by the Z9-ignition Vega launcher configuration is designed, manufactured, and experimentally tested to obtain data for validation of the equivalent viscous damping methods. The Z9-like component of the test article is filled with a viscoelastic material representative of the Z9 solid propellant that is also preliminarily tested to investigate the dependency of the complex Young's modulus on the excitation frequency and provide data for the test article finite element model. Experimental results from seismic and shock tests performed on the test configuration are correlated with numerical results from frequency and time domain analyses carried out on its dynamically condensed finite element model to assess the applicability of different equivalent viscous damping methods to describe

  11. An intercomparison of approaches for improving operational seasonal streamflow forecasts

    Science.gov (United States)

    Mendoza, Pablo A.; Wood, Andrew W.; Clark, Elizabeth; Rothwell, Eric; Clark, Martyn P.; Nijssen, Bart; Brekke, Levi D.; Arnold, Jeffrey R.

    2017-07-01

    For much of the last century, forecasting centers around the world have offered seasonal streamflow predictions to support water management. Recent work suggests that the two major avenues to advance seasonal predictability are improvements in the estimation of initial hydrologic conditions (IHCs) and the incorporation of climate information. This study investigates the marginal benefits of a variety of methods using IHCs and/or climate information, focusing on seasonal water supply forecasts (WSFs) in five case study watersheds located in the US Pacific Northwest region. We specify two benchmark methods that mimic standard operational approaches - statistical regression against IHCs and model-based ensemble streamflow prediction (ESP) - and then systematically intercompare WSFs across a range of lead times. Additional methods include (i) statistical techniques using climate information either from standard indices or from climate reanalysis variables and (ii) several hybrid/hierarchical approaches harnessing both land surface and climate predictability. In basins where atmospheric teleconnection signals are strong, and when watershed predictability is low, climate information alone provides considerable improvements. For those basins showing weak teleconnections, custom predictors from reanalysis fields were more effective in forecast skill than standard climate indices. ESP predictions tended to have high correlation skill but greater bias compared to other methods, and climate predictors failed to substantially improve these deficiencies within a trace weighting framework. Lower complexity techniques were competitive with more complex methods, and the hierarchical expert regression approach introduced here (hierarchical ensemble streamflow prediction - HESP) provided a robust alternative for skillful and reliable water supply forecasts at all initialization times. Three key findings from this effort are (1) objective approaches supporting methodologically
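The statistical-regression benchmark can be sketched as a simple linear regression of seasonal runoff volume on an IHC predictor; the predictor choice here (e.g., April 1 snow water equivalent) is an assumption for illustration:

```python
import numpy as np

def wsf_regression(ihc_hist, volume_hist, ihc_now):
    """Benchmark statistical water supply forecast: ordinary least-squares
    regression of historical seasonal runoff volume on an initial
    hydrologic condition (IHC) predictor, applied to the current year."""
    slope, intercept = np.polyfit(ihc_hist, volume_hist, 1)
    return slope * ihc_now + intercept
```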

  12. An intercomparison of approaches for improving operational seasonal streamflow forecasts

    Directory of Open Access Journals (Sweden)

    P. A. Mendoza

    2017-07-01

Full Text Available For much of the last century, forecasting centers around the world have offered seasonal streamflow predictions to support water management. Recent work suggests that the two major avenues to advance seasonal predictability are improvements in the estimation of initial hydrologic conditions (IHCs) and the incorporation of climate information. This study investigates the marginal benefits of a variety of methods using IHCs and/or climate information, focusing on seasonal water supply forecasts (WSFs) in five case study watersheds located in the US Pacific Northwest region. We specify two benchmark methods that mimic standard operational approaches – statistical regression against IHCs and model-based ensemble streamflow prediction (ESP) – and then systematically intercompare WSFs across a range of lead times. Additional methods include (i) statistical techniques using climate information either from standard indices or from climate reanalysis variables and (ii) several hybrid/hierarchical approaches harnessing both land surface and climate predictability. In basins where atmospheric teleconnection signals are strong, and when watershed predictability is low, climate information alone provides considerable improvements. For those basins showing weak teleconnections, custom predictors from reanalysis fields were more effective in forecast skill than standard climate indices. ESP predictions tended to have high correlation skill but greater bias compared to other methods, and climate predictors failed to substantially improve these deficiencies within a trace weighting framework. Lower complexity techniques were competitive with more complex methods, and the hierarchical expert regression approach introduced here (hierarchical ensemble streamflow prediction – HESP) provided a robust alternative for skillful and reliable water supply forecasts at all initialization times. Three key findings from this effort are (1) objective approaches supporting

  13. Effects of water-supply reservoirs on streamflow in Massachusetts

    Science.gov (United States)

    Levin, Sara B.

    2016-10-06

    State and local water-resource managers need modeling tools to help them manage and protect water-supply resources for both human consumption and ecological needs. The U.S. Geological Survey, in cooperation with the Massachusetts Department of Environmental Protection, has developed a decision-support tool to estimate the effects of reservoirs on natural streamflow. The Massachusetts Reservoir Simulation Tool is a model that simulates the daily water balance of a reservoir. The reservoir simulation tool provides estimates of daily outflows from reservoirs and compares the frequency, duration, and magnitude of the volume of outflows from reservoirs with estimates of the unaltered streamflow that would occur if no dam were present. This tool will help environmental managers understand the complex interactions and tradeoffs between water withdrawals, reservoir operational practices, and reservoir outflows needed for aquatic habitats.A sensitivity analysis of the daily water balance equation was performed to identify physical and operational features of reservoirs that could have the greatest effect on reservoir outflows. For the purpose of this report, uncontrolled releases of water (spills or spillage) over the reservoir spillway were considered to be a proxy for reservoir outflows directly below the dam. The ratio of average withdrawals to the average inflows had the largest effect on spillage patterns, with the highest withdrawals leading to the lowest spillage. The size of the surface area relative to the drainage area of the reservoir also had an effect on spillage; reservoirs with large surface areas have high evaporation rates during the summer, which can contribute to frequent and long periods without spillage, even in the absence of water withdrawals. Other reservoir characteristics, such as variability of inflows, groundwater interactions, and seasonal demand patterns, had low to moderate effects on the frequency, duration, and magnitude of spillage. The
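The daily reservoir water balance underlying the simulation tool can be sketched as follows, assuming a simple storage update with uncontrolled spill above capacity (a simplification of the actual tool):

```python
def simulate_reservoir(inflows, withdrawals, evaporation, capacity, storage0):
    """Minimal daily reservoir water balance: update storage with inflow,
    withdrawal, and evaporation, then spill any volume above capacity.
    All volumes are in consistent units."""
    storage = storage0
    spills = []
    for q_in, q_out, evap in zip(inflows, withdrawals, evaporation):
        storage = max(storage + q_in - q_out - evap, 0.0)
        spill = max(storage - capacity, 0.0)
        storage -= spill
        spills.append(spill)
    return storage, spills
```

Consistent with the sensitivity analysis, raising the withdrawal series in this sketch directly reduces the simulated spill volumes.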

  14. Validation of a Global Hydrodynamic Flood Inundation Model

    Science.gov (United States)

    Bates, P. D.; Smith, A.; Sampson, C. C.; Alfieri, L.; Neal, J. C.

    2014-12-01

    In this work we present first validation results for a hyper-resolution global flood inundation model. We use a true hydrodynamic model (LISFLOOD-FP) to simulate flood inundation at 1km resolution globally and then use downscaling algorithms to determine flood extent and depth at 90m spatial resolution. Terrain data are taken from a custom version of the SRTM data set that has been processed specifically for hydrodynamic modelling. Return periods of flood flows along the entire global river network are determined using: (1) empirical relationships between catchment characteristics and index flood magnitude in different hydroclimatic zones derived from global runoff data; and (2) an index flood growth curve, also empirically derived. Bankful return period flow is then used to set channel width and depth, and flood defence impacts are modelled using empirical relationships between GDP, urbanization and defence standard of protection. The results of these simulations are global flood hazard maps for a number of different return period events from 1 in 5 to 1 in 1000 years. We compare these predictions to flood hazard maps developed by national government agencies in the UK and Germany using similar methods but employing detailed local data, and to observed flood extent at a number of sites including St. Louis, USA and Bangkok in Thailand. Results show that global flood hazard models can have considerable skill given careful treatment to overcome errors in the publicly available data that are used as their input.
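The index-flood approach described (an index flood magnitude scaled by an empirical growth curve) can be sketched with a hypothetical Gumbel-type growth curve; the distribution choice and coefficient of variation are assumptions, not details from the abstract:

```python
import math

EULER_GAMMA = 0.5772156649

def gumbel_growth_factor(return_period, cv=0.35):
    """Ratio of the T-year flood to the index (mean annual) flood for a
    Gumbel distribution with coefficient of variation cv (hypothetical)."""
    y = -math.log(-math.log(1.0 - 1.0 / return_period))  # Gumbel reduced variate
    return 1.0 + cv * (y - EULER_GAMMA) * math.sqrt(6.0) / math.pi

def return_period_flow(index_flood, return_period, cv=0.35):
    """Flood quantile from the index-flood method: index flood times the
    regional growth factor for the requested return period."""
    return index_flood * gumbel_growth_factor(return_period, cv)
```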

  15. Validation of Storm Water Management Model Storm Control Measures Modules

    Science.gov (United States)

    Simon, M. A.; Platz, M. C.

    2017-12-01

EPA's Storm Water Management Model (SWMM) is a computational code heavily relied upon by industry for the simulation of wastewater and stormwater infrastructure performance. Many municipalities are relying on SWMM results to design multi-billion-dollar, multi-decade infrastructure upgrades. Since the 1970s, EPA and others have developed five major releases, the most recent ones containing storm control measures modules for green infrastructure. The main objective of this study was to quantify the accuracy with which SWMM v5.1.10 simulates the hydrologic activity of previously monitored low impact developments. Model performance was evaluated with a mathematical comparison of outflow hydrographs and total outflow volumes, using empirical data and a multi-event, multi-objective calibration method. The calibration methodology utilized PEST++ Version 3, a parameter estimation tool, which aided in the selection of unmeasured hydrologic parameters. From the validation study and sensitivity analysis, several model improvements were identified to advance SWMM LID Module performance for permeable pavements, infiltration units and green roofs, and these were performed and reported herein. Overall, it was determined that SWMM can successfully simulate low impact development controls given accurate model confirmation, parameter measurement, and model calibration.
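Hydrograph agreement in calibration studies like this one is commonly scored with the Nash-Sutcliffe efficiency; the abstract does not name its exact metrics, so this is a generic sketch:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect match between simulated
    and observed hydrographs; 0 means no better than the observed mean."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)
```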

  16. How Hydroclimate Influences the Effectiveness of Particle Filter Data Assimilation of Streamflow in Initializing Short- to Medium-range Streamflow Forecasts

    Science.gov (United States)

    Clark, E.; Wood, A.; Nijssen, B.; Clark, M. P.

    2017-12-01

Short- to medium-range (1- to 7-day) streamflow forecasts are important for flood control operations and in issuing potentially life-saving flood warnings. In the U.S., the National Weather Service River Forecast Centers (RFCs) issue such forecasts in real time, depending heavily on a manual data assimilation (DA) approach. Forecasters adjust model inputs, states, parameters and outputs based on experience and consideration of a range of supporting real-time information. Achieving high-quality forecasts from new automated, centralized forecast systems will depend critically on the adequacy of automated DA approaches to make analogous corrections to the forecasting system. Such approaches would further enable systematic evaluation of real-time flood forecasting methods and strategies. Toward this goal, we have implemented a real-time Sequential Importance Resampling particle filter (SIR-PF) approach to assimilate observed streamflow into simulated initial hydrologic conditions (states) for initializing ensemble flood forecasts. Assimilating streamflow alone in SIR-PF improves simulated streamflow and soil moisture during the model spin-up period prior to a forecast, with consequent benefits for forecasts. Nevertheless, it only consistently limits error in simulated snow water equivalent during the snowmelt season and in basins where precipitation falls primarily as snow. We examine how the simulated initial conditions with and without SIR-PF propagate into 1- to 7-day ensemble streamflow forecasts. Forecasts are evaluated in terms of reliability and skill over the period 2005-2015. The focus of this analysis is on how interactions between hydroclimate and SIR-PF performance impact forecast skill. To this end, we examine forecasts for 5 hydroclimatically diverse basins in the western U.S. Some of these basins receive most of their precipitation as snow, others as rain. 
Some freeze throughout the mid-winter while others experience significant mid-winter melt
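One SIR particle filter update of the kind described can be sketched for a scalar state observed directly as streamflow; the Gaussian observation-error model is an assumption for illustration:

```python
import numpy as np

def sir_update(particles, weights, obs, obs_sigma, rng):
    """One Sequential Importance Resampling (SIR) step: reweight particles
    by the Gaussian likelihood of the streamflow observation, resample in
    proportion to the new weights, and reset to uniform weights."""
    particles = np.asarray(particles, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w * np.exp(-0.5 * ((obs - particles) / obs_sigma) ** 2)
    w = w / w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```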

  17. NAIRAS aircraft radiation model development, dose climatology, and initial validation

    Science.gov (United States)

    Mertens, Christopher J.; Meier, Matthias M.; Brown, Steven; Norman, Ryan B.; Xu, Xiaojing

    2013-10-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model